Somatic Computing

A few months ago, I sat down for a meeting with Kristin Cobble, Gwen Gordon, and Rebecca Petzel. We were all on our laptops, pulling up notes and sending off last-minute emails. As the meeting was about to start, we all closed our laptops with a deep breath, looked at each other for a moment, then burst out laughing.

We had all noticed the same thing at the same time, and we had communicated that to each other nonverbally. Closing our laptops had completely shifted the energy of the room.

Space matters. It affects our bodies and their relationship to the things around them, and there is wisdom and energy inherent in that.

Most of our interfaces with digital tools were not invented with our bodies in mind. They were created to serve machines, and for the most part, they haven’t evolved much. We’re starting to see that shift now, with mobile devices, with touch, and with gestural interfaces. When our technology starts to augment our body’s natural wisdom, we are going to see a huge shift in its power.

My friend Anselm Hook gave a great talk at AR Devcamp last year, where he explored the notion of developing software somatically (what he’s calling the “Slow Code movement”).

The technology we need to build these sorts of interfaces exists today, and I think we’ll see an inflection point soon. But that inflection point won’t come from enterprise-driven markets, despite the opportunity these interfaces hold. Those markets simply aren’t big enough. It will come because of games.

This is no great insight, but I think many of us are remarkably blind to it, and it’s worth reminding ourselves of it over and over again. It’s no accident that Kinect — one of the most remarkable and successful research-driven products in personal computing — came out of Microsoft’s Xbox 360 group and not one of its other divisions.