Managing Complexity: Exploring the Cockpit of a 1960s F-5 Fighter Jet

The other day, I visited the Western Museum of Flight with my friend, Ed. It’s a tiny, volunteer-run museum next to Zamperini Field in Torrance, California, and it boasts several original prototypes of some iconic fighter jets, which I enjoyed seeing. But the surprising highlight of the visit for me was sitting in the cockpit of an F-5.

I was largely indifferent to the F-5 at first. It’s an older jet (first deployed in the early 1960s), and it was mostly an export and training plane. However, it was the only plane whose cockpit we were allowed to enter, and I had never sat in the cockpit of any fighter jet before.

My first reaction was surprise at how comfortable it was in there. Much better than my office chair! (I need to get a new office chair.)

My second reaction was overwhelm. Take a look at this instrument panel:

Here’s a more dynamic view:

That’s a whole lot of dials and buttons and levers to track, all while flying at the speed of sound and dogfighting with other fighters. I felt awe and appreciation for the pilots, who were somehow able to monitor all of this complexity in real time.

After I got over my initial overwhelm, I took a closer look. To my surprise, everything seemed to make sense. Dials and buttons were clearly labeled. Color-coding helped me quickly figure out which buttons I should avoid. The buttons and switches felt good when I pressed and flipped them — not enough resistance to be difficult, but enough to feel solid and high-quality. The design doesn’t hide the complexity, but it makes it manageable, even enjoyable. Look more closely at the weapons panel on the lower left:

Notice the diagrams and descriptions. Notice the spacing — dense, but comfortable.

When you think about it, of course the inside is well designed. A jet is a high-performance machine, and the pilot’s life literally depends on their ability to process massive amounts of complexity in real time. Still, I found the design inspiring. I wish all of my dashboards were designed this well.

Here’s a more zoomed-out look at what it’s like to sit in the cockpit, along with some additional commentary:

Somatic Computing

A few months ago, I sat down for a meeting with Kristin Cobble, Gwen Gordon, and Rebecca Petzel. We were all on our laptops, pulling up notes and sending off last-minute emails. As the meeting was about to start, we all closed our laptops with a deep breath, looked at each other for a moment, then burst out laughing.

We had all noticed the same thing at the same time, and we had communicated that to each other nonverbally. Closing our laptops had completely shifted the energy of the room.

Space matters. It affects our bodies and their relationship to the things around them, and there is wisdom and energy inherent in that.

Most of our interfaces with digital tools were not invented with our bodies in mind. They were created to serve machines, and for the most part, they haven’t evolved much. We’re starting to see that shift now, with mobile devices, with touch, and with gestural interfaces. When our technology starts to augment our body’s natural wisdom, we are going to see a huge shift in its power.

My friend, Anselm Hook, gave a great talk at AR Devcamp last year, where he explored the notion of developing software somatically (what he’s calling the “Slow Code movement”):

The technology we need to build these sorts of interfaces exists today, and I think we’ll see an inflection point soon. But that inflection point won’t come from enterprise-driven markets, despite the opportunity that these interfaces hold. Those markets simply aren’t big enough. It will come from games.

This is no great insight, but I think that many of us are remarkably blind to it, and it’s worth reminding ourselves of this over and over again. It’s no accident that Kinect — one of the most remarkable and successful research-driven products for personal computing — came out of Microsoft’s Xbox 360 group and not one of its other divisions.