Darkest Hour: Nuanced Historical Flick or Meme-ish Schlock?

Darkest Hour is about the World War II events that took place in Great Britain from May 10 through June 4, 1940: Neville Chamberlain’s resignation as prime minister, Winston Churchill’s ascension, the German invasion of France (including the siege at Calais), and the miraculous evacuation of over 330,000 British troops at Dunkirk, all culminating in Churchill’s decision not to enter into peace talks with Hitler, which led to the Battle of Britain.

I enjoyed the movie. It was entertaining, well crafted, and beautifully acted, and I think the choice to focus on that single, eventful month was an excellent one. Gary Oldman is physically the exact opposite of Churchill, yet he absolutely disappears into the role. I also thought the movie was nuanced in important ways, right up until the end (where it devolves into a schlocky mess).

In particular, it highlighted Churchill’s spotty track record prior to his ascension, his politically precarious position, and the strategic, emotional, and moral complexity around decisions that will unavoidably cost lives. It offered a little taste of coalition politics, which I find especially fascinating these days as I wonder about the future of the two-party system in the U.S. I also felt more empathetic to the strongly differing viewpoints of the time, especially to Chamberlain and Viscount Halifax. It’s so much easier to judge when you have the benefit of hindsight.

I wouldn’t say the ending ruined the whole movie for me (a surprising sentiment, given that I knew what was going to happen), but I didn’t like it. It lost its nuance. First, there was an incredibly grotesque, entirely fictional scene where Churchill decides to take the subway to Westminster Station so that he can mix with the people. It’s a cheap, gimmicky device made even worse by the inclusion of the one person of color in the entire movie.

Then, all semblance of nuance disappeared, and the film became a will-of-the-people, fight-until-the-end propaganda piece. Which, in some ways, was accurate: Churchill’s strengths were his way with words and his ability to inspire. Earlier in the film, they did touch on the difficult balancing act between looking disastrous reality in the face and maintaining hope, which I appreciated.

I get why the movie ended the way it did. It would have been too complicated, for example, to try to explain that the Battle of Britain, while heroic and extraordinary, would likely have been futile had Japan not attacked Pearl Harbor and the U.S. not entered the war.

I’m okay with the ending. I just didn’t like it. And I especially hated the fact that the movie ended with this inspirational “Churchill” quote:

Success is not final, failure is not fatal: it is the courage to continue that counts.

Here’s what the International Churchill Society has to say about this quote on a page entitled, “Quotes falsely attributed to Winston Churchill”:

We can find no attribution for either one of these and you will find that they are broadly attributed to Winston Churchill. They are found nowhere in his canon, however. An almost equal number of sources found online credit these sayings to Abraham Lincoln — but we have found none that provides any attribution in the Lincoln Archives.

Misattributed quotes are a pet peeve of mine. I get why they might have created that idiotic subway scene. But why end the movie Internet-meme style? It was lazy and unnecessary, and it summed up my feelings about the ending of this otherwise solid movie.

George Washington’s Warning About Political Parties

From George Washington’s farewell address at the close of his second term:

I have already intimated to you the danger of parties in the State, with particular reference to the founding of them on geographical discriminations. Let me now take a more comprehensive view, and warn you in the most solemn manner against the baneful effects of the spirit of party generally.

This spirit, unfortunately, is inseparable from our nature, having its root in the strongest passions of the human mind. It exists under different shapes in all governments, more or less stifled, controlled, or repressed; but, in those of the popular form, it is seen in its greatest rankness, and is truly their worst enemy.

The alternate domination of one faction over another, sharpened by the spirit of revenge, natural to party dissension, which in different ages and countries has perpetrated the most horrid enormities, is itself a frightful despotism. But this leads at length to a more formal and permanent despotism. The disorders and miseries which result gradually incline the minds of men to seek security and repose in the absolute power of an individual; and sooner or later the chief of some prevailing faction, more able or more fortunate than his competitors, turns this disposition to the purposes of his own elevation, on the ruins of public liberty.

Without looking forward to an extremity of this kind (which nevertheless ought not to be entirely out of sight), the common and continual mischiefs of the spirit of party are sufficient to make it the interest and duty of a wise people to discourage and restrain it.

It serves always to distract the public councils and enfeeble the public administration. It agitates the community with ill-founded jealousies and false alarms, kindles the animosity of one part against another, foments occasionally riot and insurrection. It opens the door to foreign influence and corruption, which finds a facilitated access to the government itself through the channels of party passions. Thus the policy and the will of one country are subjected to the policy and will of another.

There is an opinion that parties in free countries are useful checks upon the administration of the government and serve to keep alive the spirit of liberty. This within certain limits is probably true; and in governments of a monarchical cast, patriotism may look with indulgence, if not with favor, upon the spirit of party. But in those of the popular character, in governments purely elective, it is a spirit not to be encouraged. From their natural tendency, it is certain there will always be enough of that spirit for every salutary purpose. And there being constant danger of excess, the effort ought to be by force of public opinion, to mitigate and assuage it. A fire not to be quenched, it demands a uniform vigilance to prevent its bursting into a flame, lest, instead of warming, it should consume.

Misogyny and the Curious Case of Ada Lovelace

In honor of International Women’s Day, I’d like to honor two women, and I’d like to share a strange story about misogyny. The first is Ada Lovelace, Lord Byron’s remarkable daughter, who in 1843 published the very first computer program. Because I have many friends and colleagues in technology, I often see Ada’s name crop up in various social media channels on this day. I also occasionally get surprised emails from friends who start reading about her work, see references to some guy named “Eugene Eric Kim,” and wonder if that person could possibly be me.

Yup, it’s me. In 1999, I co-authored a piece about Ada in Scientific American with the brilliant Betty Alexandra Toole, who is the second person I’d like to honor. I am exceptionally proud of the piece, because Betty and I cleared up several myths about Ada in it. I also found a bug in Ada’s program, and I was the first person to identify and publish it, 156 years after the fact!

I studied the history of science in college, and while I was primarily interested in understanding scientific revolutions, I was irresistibly drawn to the history of computers and computer science. Shortly after graduating, I came across Betty’s extensive and definitive scholarship on Ada and emailed her about it. Betty was warm and delightful, and it turned out she was friends with my colleague, the inimitable Michael Swaine, and his partner, Nancy Groth, and that she lived in the Bay Area. We became friends, and we talked often about her work.

One of the curious things I learned from Betty was how many people seemed to want to discredit Ada’s contributions to computer science. I found it interesting and weird, but I didn’t delve too deeply into it myself. I more or less trusted Betty’s scholarship, but I still reserved the right to be skeptical. If other historians were discounting Ada’s contributions, there was probably a reason for it, right?

Then, in 1998, Betty forced me to confront my skepticism. She wanted to write a popular article about Ada’s work, and she asked me if I would co-author it with her. I said I would, but that I needed to look at the data myself first. Not only did Betty agree, it turned out this had been her plan all along. She had copies of Ada’s original notes, and she knew that I could understand the technical parts.

So I spent a few months carefully reading through Ada’s notes, double-checking her math, and also reading the scholarly work that disputed her contributions. I concluded that those claims were baseless.

Ada first met Charles Babbage (the inventor of the first computer) when she was 17. She published the first computer program when she was 27. I was 23 when I first read her notes, so I tried to remember what I was like when I first started learning algebra, then I tried to adjust my expectations based on the state of education at the time. All of that was largely unnecessary. Ada was a competent mathematician and a skillful learner. At minimum, she would have been in the same advanced classes as me in high school, and she likely would have been ahead of me. Moreover, she would have argued with me often and put me in my place. She was clearly proud, independent, and stubborn.

But she wasn’t a mathematical prodigy, either, or a technical genius like Babbage. Her genius was in recognizing the brilliance and potential of Babbage’s invention, not just from a pie-in-the-sky perspective (Babbage himself was skilled in spinning yarns), but from a down-and-dirty, here’s-how-it-actually-works perspective. She was both a visionary and a hacker, a century ahead of her time.

Here’s what Betty and I wrote in our article:

We cannot know for certain the extent to which Babbage helped Ada on the Bernoulli numbers program. She was certainly capable of writing the program herself given the proper formula; this is clear from her depth of understanding regarding the process of programming and from her improvements on Babbage’s programming notation. Additionally, letters between Babbage and Ada at the time seem to indicate that Babbage’s contributions were limited to the mathematical formula and that Ada created the program herself. While she was working on the program, Ada wrote to Babbage, “I have worked incessantly, & most successfully, all day. You will admire the Table & Diagram extremely. They have been made out with extreme care, & all the indices most minutely & scrupulously attended to.”

The importance of Ada’s choosing to write this program cannot be overstated. Babbage had written several small programs for the Analytical Engine in his notebook in 1836 and 1837, but none of them approached the complexity of the Bernoulli numbers program. Because of her earlier tutelage, Ada was at least familiar with the numbers’ properties. It is possible that Ada recognized that a Bernoulli numbers program would nicely demonstrate some of the Analytical Engine’s key features, such as conditional branching. Also, because Menabrea had alluded to the Bernoulli numbers in his article, Ada’s program tied in nicely with her translation of Menabrea.
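
For the technically curious, here is roughly what the computation at the heart of that program looks like in modern terms. This is a minimal Python sketch of the standard Bernoulli-number recurrence, offered purely as my own illustration; Ada’s actual program was based on a formula supplied by Babbage and was expressed as a table of engine operations, nothing like the code below:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return exact Bernoulli numbers B_0..B_n, using the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1 (so B_1 = -1/2)."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        # Accumulate the known terms of the recurrence...
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        # ...and solve for the newest Bernoulli number.
        B.append(-acc / (m + 1))
    return B

for i, b in enumerate(bernoulli(8)):
    print(f"B_{i} = {b}")  # B_0 = 1, B_1 = -1/2, B_2 = 1/6, ...
```

Even this toy version needs iteration and an accumulating sum, which hints at why the Bernoulli numbers were such a good showcase for the Analytical Engine’s looping and conditional branching.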

What strikes me, almost 20 years after I wrote that article with Betty, is how often people continue to question Ada’s contributions. Why would they do so? The scholarship is actually not extensive. Only a few historians have actually looked at the source material, and the same ones (including Betty and me) are cited over and over again. There is no smoking gun suggesting she is not responsible for her own work, and the evidence in her favor is considerable. Why is there any doubt? More importantly, why is the doubt so often propagated?

I have two hypotheses. The first is that people are lazy. Most people never bother looking at the historical pieces, such as the one I cowrote with Betty, much less the source material.

Second, most people are biased (consciously or not) against women. If Ada had been a man, and if the historical record were exactly the same, I have no doubt that far fewer people would be questioning Ada’s contributions. Remember, I fell into this category too. When Betty first told me about the controversy, even though I trusted her, I assumed the source data would be far less conclusive than it was. Truthfully, if I weren’t already friends with Betty, I probably would have just assumed that the skeptics were correct.

Betty, of course, understood all of this far more viscerally than I did. We talked about this extensively back then, and while I thought I understood, my understanding is far deeper and more nuanced today. I know that my mere presence on the byline of the article we co-wrote (and for which Betty deserves far more credit) automatically boosted the credibility of our work, and that this had nothing to do with my reputation in the field (which was, and deservedly continues to be, nonexistent).

It’s sobering, but I think there’s something heartening about these current times, despite all of the very real challenges we continue to face. People are far more conscious of unconscious bias (whether it’s about gender, race, sexuality, height, weight, age, fashion, whatever) than ever before. If we can acknowledge it, we can also do something about it.

So here’s to Ada Lovelace, badass and worthy hero to women and men everywhere! And here’s to my friend, Betty Alexandra Toole! Happy International Women’s Day!

Richard Feynman on Explaining Things Clearly

My dad is a physicist who idolized Richard Feynman, which meant that, growing up, I idolized Feynman as well. One thing about Feynman that made a huge impression on me was how simply and clearly he explained things. Feynman was a genius; if he could explain physics clearly to anyone, then no one had any excuse to put on airs.

The Long Now Foundation recently republished one of my favorite essays about Feynman, Danny Hillis’s “Richard Feynman and the Connection Machine.” (Hat tip to William Barnhill.) It not only shares many great anecdotes about Feynman, it also serves as an oral history of the early days at Hillis’s groundbreaking parallel computing company, Thinking Machines.

The whole essay is fantastic, but I like this excerpt about Feynman’s explaining prowess in particular:

In the meantime, we were having a lot of trouble explaining to people what we were doing with cellular automata. Eyes tended to glaze over when we started talking about state transition diagrams and finite state machines. Finally Feynman told us to explain it like this,

“We have noticed in nature that the behavior of a fluid depends very little on the nature of the individual particles in that fluid. For example, the flow of sand is very similar to the flow of water or the flow of a pile of ball bearings. We have therefore taken advantage of this fact to invent a type of imaginary particle that is especially simple for us to simulate. This particle is a perfect ball bearing that can move at a single speed in one of six directions. The flow of these particles on a large enough scale is very similar to the flow of natural fluids.”

This was a typical Richard Feynman explanation. On the one hand, it infuriated the experts who had worked on the problem because it neglected to even mention all of the clever problems that they had solved. On the other hand, it delighted the listeners since they could walk away from it with a real understanding of the phenomenon and how it was connected to physical reality.

We tried to take advantage of Richard’s talent for clarity by getting him to critique the technical presentations that we made in our product introductions. Before the commercial announcement of the Connection Machine CM-1 and all of our future products, Richard would give a sentence-by-sentence critique of the planned presentation. “Don’t say ‘reflected acoustic wave.’ Say [echo].” Or, “Forget all that ‘local minima’ stuff. Just say there’s a bubble caught in the crystal and you have to shake it out.” Nothing made him angrier than making something simple sound complicated.

But what Richard hated, or at least pretended to hate, was being asked to give advice. So why were people always asking him for it? Because even when Richard didn’t understand, he always seemed to understand better than the rest of us. And whatever he understood, he could make others understand as well. Richard made people feel like a child does, when a grown-up first treats him as an adult. He was never afraid of telling the truth, and however foolish your question was, he never made you feel like a fool.
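
As an aside, the imaginary particles Feynman describes are what we’d now call a lattice gas cellular automaton; the six-direction version is known as the FHP model. To make the stream-and-collide idea concrete, here is a tiny sketch of the even simpler four-direction (HPP-style) variant on a square grid. It’s purely my own toy illustration, not Thinking Machines code:

```python
import numpy as np

# Toy HPP lattice gas: the four-direction cousin of the six-direction
# model Feynman describes. Each site holds at most one particle per
# direction (N, E, S, W), stored as four boolean grids.
H, W = 64, 64
rng = np.random.default_rng(0)
n, e, s, w = (rng.random((H, W)) < 0.2 for _ in range(4))

def step(n, e, s, w):
    # Collide: exactly two particles meeting head-on scatter 90 degrees.
    ns = n & s & ~e & ~w  # head-on north-south pairs
    ew = e & w & ~n & ~s  # head-on east-west pairs
    n, s = (n & ~ns) | ew, (s & ~ns) | ew
    e, w = (e & ~ew) | ns, (w & ~ew) | ns
    # Stream: every particle moves one site in its direction
    # (np.roll gives periodic boundaries).
    return (np.roll(n, -1, axis=0), np.roll(e, 1, axis=1),
            np.roll(s, 1, axis=0), np.roll(w, -1, axis=1))

for _ in range(100):
    n, e, s, w = step(n, e, s, w)
# Collisions and streaming both conserve particles.
print("total particles:", int(n.sum() + e.sum() + s.sum() + w.sum()))
```

Feynman’s six directions turn out to matter: a hexagonal lattice has enough rotational symmetry that the large-scale flow behaves like a real, isotropic fluid, which the four-direction square lattice does not.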

Connecting Past and Present

Doug Engelbart’s passing earlier this summer has me reflecting a lot on our friendship, the work we did together, the influence he had on my life and my career, and also larger issues around connecting past and present. Talking with my former HyperScope teammates reminded me of how much fun we had together. I’m proud of what we did, but I’m even prouder of how we did it.

We worked in two ways that were particularly innovative. First, we were an open project. Anyone could participate in any of our meetings, including the face-to-face ones. We even provided food.

Several people (but not Doug) were nervous when I first proposed doing this. They were concerned about people disrupting the process. I wasn’t worried about disruption, because we had a strong core team, and I knew how to facilitate an open process. In particular, “open” does not mean everyone is equal. As I explained to everyone who joined us, all were welcome to speak up, but we had a task we needed to complete, and I reserved the right to direct the conversation and kick people out if they didn’t respect the ground rules.

I had to assert that right a few times, but otherwise, the openness overwhelmingly improved the project. A number of people made really valuable contributions, and Craig Latta in particular came out of nowhere to become an important part of our core team.

Second, I proactively invited members of Doug’s original lab to join us. I wanted the team to learn everything we could from the people who had done it before us, and I also wanted these folks to understand how much we valued them. The whole team agreed that this was one of the best parts of the experience. Not only did we learn a ton, but we felt like an extended family.

I’m constantly amazed how rarely this occurs. People don’t think to reach out to their forebears, and they miss out on a huge learning opportunity. The HyperScope project was an unusual situation (although Silicon Valley is rife with these opportunities, given how compressed the history of our industry is), but these opportunities also apply to projects where people have recently left.

I think that most of the time, it simply doesn’t occur to people. Sometimes, there are tricky politics and emotions involved. Often, it’s because we don’t value the hard-earned knowledge of those who came before us. This is why so many companies lay people off without taking into account the loss of institutional knowledge.

Regardless of the reasons, I think it’s unfortunate that connecting the old with the new is such a rarity, and I’d love to see this shift.