Why Are We Afraid of Data?

My friend, Gbenga Ajilore, is an economics professor. Last month, he gave a great talk at AlterConf in Chicago entitled, “How can open data help facilitate police reform?” It concisely explains how data helps us overcome anecdotal bias.

I was particularly struck by his point about how we need police buy-in for this data to be truly useful, and I was left with a bit of despair. Why is buy-in about the importance of data so hard? This should be common sense, right?

Clearly, it’s not. Earlier this year, I expressed some disbelief about how, in professional sports, where there are hundreds of millions of dollars riding on outcomes, there is still strong resistance to data and analytics.

On the one hand, it’s incredible that this is still an issue in professional sports, 14 years after Moneyball was first published and several championships were won by analytics-driven franchises (including two “cursed” franchises, the Boston Red Sox and the Chicago Cubs, both led by data nerd Theo Epstein).

On the other hand, it’s a vivid reminder of how hard habits and groupthink are to break, even in a field where the incentives to be smarter than everyone else come in the form of hundreds of millions of dollars. If it’s this hard to shift mindsets in professional sports, I don’t even want to imagine how long it might take in journalism. It’s definitely helping me recalibrate my perspective about the mindsets I’m trying to shift in my own field.

The first time I started to understand the many social forces that cause us to resist data was right after college, when I worked as an editor at a technology magazine. One of my most memorable meetings was with a vendor that made a tool that analyzed source code to surface bugs. All software developers know that debugging is incredibly hard and time-consuming. Their tool easily and automatically identified tons and tons of bugs, just by feeding it your source code.

“This is one of the best demos I’ve ever seen!” I exclaimed to the vendor reps. “Why isn’t everyone knocking on your door to buy this?”

The two glanced at each other, then shrugged their shoulders. “Actually,” one explained, “we are having a lot of trouble selling this. When people see this demo, they are horrified, because they realize how buggy their code is, and they don’t have the time or resources to fix it. They would rather that nobody know.”

Super Bowl Collaboration Lessons: Data, Preparation, and Accountability

The New England Patriots won their fourth Super Bowl last night, beating the Seattle Seahawks 28-24 in a thrilling contest and a crazy finish. I normally root against all Boston sports teams, but I made an exception this year, because Tom Brady is of my generation, and I can’t be mad when an old guy wins.

(If you’re not interested in football, skip ahead to my takeaways.)

For those of you who didn’t see the game, the ending was exciting and… surprising. New England was up by four points with about two minutes left, when Seahawks quarterback, Russell Wilson, threw a long pass to unheralded receiver Jermaine Kearse. The Patriots’ undrafted rookie cornerback, Malcolm Butler, made an unbelievable play on the ball that should have effectively ended the game.

Except that miraculously, the ball hit Kearse’s leg as he was falling on the ground, and Kearse somehow managed to catch the ball on his back. The Seahawks had a first down five yards away from the end zone, and, with the best power running back in the game, Marshawn Lynch, they seemed destined to steal this game from the Patriots. Sure enough, Lynch got the ball on the next play and rumbled his way to the one-yard line with a minute to spare.

To understand what happened next, you have to understand how clock management works in the NFL. The Seahawks had one minute and three chances to move the ball one yard and win the game. The clock would run down continuously unless one of two things happened: the Seahawks threw an incomplete pass or one of the teams called a timeout. (Technically, they could also stop the clock by running out of bounds, but that was an unlikely scenario given their field position.) The Seahawks only had one timeout remaining, meaning that they could stop the clock with it once. The only other way they could stop the clock was to throw an incomplete pass.

At the same time, the Patriots had a very hard decision to make. If the Seahawks scored, then the Patriots would need enough time to score as well. With a minute left, they had a chance. Any less, and they were as good as done.

To summarize, the Seahawks’ top priority was to score. Their second priority was to do so with as little time as possible left on the clock, so that the Patriots didn’t have a chance to answer. The Patriots either had to try to stop the Seahawks from scoring, or — if they believed the odds of stopping them were slim — they needed to let the Seahawks score as quickly as possible, so that they had a chance to answer.

That was the complicated version of the situation, at least. The joy of football is that it is both wildly complex and brutally simple. The simple version of the situation was this: One yard, three chances. Give ball to big, strong man known affectionately as “Beast mode.” Watch him rumble into end zone. Celebrate victory.

That was the version that most people — myself included — saw, so when the Seahawks decided to try to pass the ball, most everybody was shocked. The reason you don’t pass in that situation is that you risk throwing an interception. That’s exactly what happened. Malcolm Butler, the unlucky victim of Kearse’s lucky catch two plays before, anticipated the play, intercepted the ball, and preserved the Patriots’ victory.

The media — both regular and social — predictably erupted. How could they throw the ball on that play?!

(Okay, takeaways start here.)

I took three things away from watching the game and that play in particular.

First, I had assumed — like most of America — that Pete Carroll, Seattle’s head coach, had made a bungling error. But when he explained his reasoning afterward, there seemed to be at least an ounce of good sense behind his call.

With about 30 seconds remaining, in the worst case scenario, Carroll needed to stop the clock at least twice to give his team three chances to score. They only had one timeout. So, they would try a pass play first. If they scored, then they would very likely win the game, because the Patriots would not have enough time to score. If they threw an incomplete pass, the clock would stop, and the Seahawks would have two more chances to try to score, both times likely on the ground.

Still, it seemed like a net bad decision. Handing the ball off to Lynch seemed like a lower-risk move with a higher probability of success in that situation.

It turns out that when you look at the actual numbers, it’s not clear that this is the case. (Hat tip to Bob Blakley for the pointer to the article.) In fact, the data suggests that Patriots head coach, Bill Belichick, made the more egregious error in not calling a timeout and stopping the clock. It may have even behooved him to let the Seahawks score, a strategy he employed in the Super Bowl three years earlier. Or, maybe by not calling timeout, Belichick was demonstrating a mastery of game theory.

The bottom line is that the actual data did not support most people’s intuitions. Both coaches — two of the best in the game — had clearly done their homework, regardless of whether their decisions were right or wrong.

Second, regardless of whether the decision was good or bad, I loved how multiple people on the Seahawks — Carroll, Wilson, and offensive coordinator, Darrell Bevell — insisted that they were solely responsible for the decision. There was no finger-pointing, just a lot of leaders holding themselves accountable. Clearly, there is a very healthy team dynamic on the Seahawks.

Third, I loved Malcolm Butler’s post-game interview, not just for the raw emotion he displayed, but for what he actually said. He made an unbelievable play, which he attributed to his preparation. He recognized the formation from his film study, and he executed. All too often, we see remarkable plays like Butler’s and attribute them to incredible athleticism or talent, when in reality, practice and preparation are responsible.

2013 Self-Care Review

Here are my self-care “results” from 2013. Quick review:

  • My self-care dashboard is a way of tracking personal practices to keep me healthy, high-performing, and sane.
  • Practices tracked: Playing basketball, working out (other than basketball, including one-hour or longer walks), taking non-weekend play days, not checking work email after weeknight dinners, not checking work email on weekends.
  • Every time I do one of the above, I get a point. I total up my points once a week. There is a theoretical maximum of 25 weekly points, but that’s completely unrealistic. My “ideal” steady state is 10 points per week.

I started tracking in 2012 and found that my four-week rolling average was the best way of tracking my current state of self-care. For example, if I did most of my practices regularly for three weeks, then had a week where I only did one or two, my four-week rolling average would still be relatively high. Conversely, if I had several bad weeks in a row, then had one week where I hit my “ideal” number of 10, I’d still have a relatively low four-week rolling average.
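
For anyone curious about the arithmetic, here’s a rough sketch of how the four-week rolling average works. The weekly totals and the little helper function below are made up for illustration only; my actual tracking lives in a spreadsheet.

```python
# Rough sketch of a four-week rolling average (illustrative only;
# the real tracking happens in a spreadsheet). Weekly totals are made up.

weekly_points = [8, 9, 10, 2, 1, 1, 10]  # one self-care total per week

def rolling_average(points, window=4):
    """Average the most recent `window` weeks at each point in time."""
    averages = []
    for week in range(len(points)):
        start = max(0, week - window + 1)
        recent = points[start:week + 1]
        averages.append(sum(recent) / len(recent))
    return averages

print(rolling_average(weekly_points))
# [8.0, 8.5, 9.0, 7.25, 5.5, 3.5, 3.5]
# Three strong weeks followed by one weak week still averages 7.25,
# while three weak weeks followed by one "ideal" week averages only 3.5.
```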

Here are my 2013 rolling averages (in blue) as compared to 2012:

Self-Care Rolling Average 2012-2013

Quick analysis:

  • Overall, I did a much better job of taking care of myself in 2013 than in 2012. My rolling average never hit zero in 2013, and it hit 9 at its peak — double my peak in 2012.
  • My numbers were up in four of the five categories this year; the exception was working out, which saw a slight downturn. However, I played twice as much basketball this year as last, and I’m in far better shape right now than I was a year ago.
  • The biggest increases were due to me turning off work email. From April through August, I wasn’t working in the traditional sense, which explains the huge improvement. In August, I “officially” launched Changemaker Bootcamp. Later in the month, I started a project with the Garfield Foundation. My rolling average actually dipped below 2012 levels in that time period, again largely due to email. In October, I recognized the trend and adjusted, which led to my numbers going back up to peak 2012 levels, but not to peak 2013 levels.
  • The graphs are cyclical in both 2012 and 2013, and there seem to be some peaks and valleys in common. Some of it is coincidental. In April and May of both 2012 and 2013, I traveled quite a bit — for work in 2012 (where the dip was more unpleasant), for play in 2013 (where the dip was largely incidental).
  • I think my line will always be cyclical, given my personality and lifestyle. I’m good with that. However, I’d like a smaller amplitude, and I’d like the overall average to be closer to 10 than it is right now.

Bottom line: I took much better care of myself this year than last, but I have a ways to go before I hit the right balance. 2014 will be a good test as to whether I’m making systemic improvements, or whether my gains this year were more a reflection of me taking a break. I feel good about it being the former, but we’ll see. Either way, I’m happy about how 2013 went.

You can learn more about how I do my tracking at Faster Than 20. If you’re using my spreadsheet for your own tracking, please let me know how it’s going! If you’d like help getting set up, drop me an email or leave a comment below.

Two Collaboration Insights from Last Weekend’s Football Games

It’s football season! Today, I came across two gems about this past weekend’s games that spoke to me about collaboration in general.

Data and Impact

In reaming Tennessee Titans coach, Mike Munchak, Bill Barnwell wrote:

Mike Munchak put us all through a lot of field goals for no real reason, and in doing so, he illuminated the difference between meaningful game management and the illusion of impact.

In this one biting sentence, Barnwell is offering commentary on how we evaluate coaches. When you kick a field goal, you’re putting points on the board, and so it may look like you’re getting results. But the real question is whether kicking a field goal was the highest-impact move you could have made at the time. The sports analytics movement has shown that some of our most common practices are often the worst moves we can make, even though they give us the illusion of progress.

Sound familiar?
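
To make the point concrete, here’s a toy expected-points comparison. The probabilities below are invented for illustration — they’re not from Barnwell’s piece, and a real analysis would also weigh field position, time remaining, and score:

```python
# Toy expected-points comparison for a fourth-and-goal decision.
# All probabilities are hypothetical, for illustration only.

def expected_points(success_probability, points_if_successful):
    """Expected points for a single decision, ignoring everything else."""
    return success_probability * points_if_successful

field_goal = expected_points(0.95, 3)  # nearly automatic, but only 3 points
go_for_it = expected_points(0.45, 7)   # riskier, but worth a touchdown plus extra point

print(f"Field goal attempt: {field_goal:.2f} expected points")  # 2.85
print(f"Going for it:       {go_for_it:.2f} expected points")   # 3.15
# The field goal almost always "gets results," yet with these numbers
# it is still the lower-impact choice.
```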

Fundamentals

The Seattle Seahawks have the league’s stingiest defense, and they completely dismantled the San Francisco 49ers’ powerful offense this past weekend. How did they do it? Gregg Easterbrook wrote:

The short version of the success of the Seahawks’ defense is good players who hustle, communicate with each other and wrap-up tackle. Contemporary NFL defenses are so plagued by players’ desire for spectacular plays that make “SportsCenter” that blown coverages and missed assignments have become de rigueur. Seattle’s defense almost never has a broken play. And those lads can tackle! Seattle misses fewer tackles than any NFL defense. Lots of wrap-up tackles where the runner gains an extra yard are better than a few spectacular hits for a loss, plus frequent missed tackles. Seattle defenders understand this.

Hustle, communicate, execute. It sounds so basic, it’s almost a cliche.

A big reason I developed Changemaker Bootcamp was my realization that getting really, really good at the basics could have a much bigger impact than inventing new tools or processes. Unfortunately, the vast majority of attention and focus seems to be on the latter rather than the former. We see lots of stories from sports and other fields about what a mistake this can be.

Photo by Philip Robertson. CC BY 2.0.

The Mainstreaming of Analytics

John Hollinger, a long-time ESPN.com columnist and inventor of the Player Efficiency Rating (PER) for evaluating basketball players, is joining the Memphis Grizzlies front office as its Vice President of Basketball Operations.

This is wacky on a number of levels. First, it represents the ongoing rise of the numbers geek in sports, a movement pioneered by Bill James almost 40 years ago, given an identity a decade ago in Michael Lewis’s book, Moneyball, and granted official acceptance in the NBA five years ago, when the Houston Rockets named Daryl Morey its General Manager. Want to run a professional sports team? These days, an MIT degree seems to give you a better chance than spending years in the business.

Second, Hollinger spent over a decade sharing his thinking and his tools for all to see. Now, all his competition needs to do to understand his thinking is to Google him. Tom Ziller writes:

The major difference between Hollinger and, say, Morey is that we all know Hollinger’s theories. We know his positions, and we’ve learned from his work…. Will his canon hurt his ability to make moves? We can lay out exactly which players he likes based on his public formulas and his writings. Other GMs will know which Memphis players he’ll sell low on. You can anticipate his draft choices if you’re picking behind him. If you’ve got a high-production, low-minutes undersized power forward, you know you can goose the price on him because history indicates that Hollinger values him quite seriously.

This is all a gross simplification: Hollinger’s oeuvre is filled with nuance. He doesn’t rank players solely by PER, and in fact he probably has some adjustments to his myriad metrics up his sleeve. He’s not going to be nearly as predictable as a decision-maker as anyone would be as a writer. The stakes are different, the realities of action are different. But no decision-maker in the NBA has had this much of their brain exposed to the world. Morey isn’t shy, but that big Michael Lewis spread on Shane Battier was as far as we ever got into the GM’s gears. Zarren is notoriously careful about what he says. He might be the only GM or assistant GM in the league more secretive than Petrie.

It’s interesting to consider the implications for the Big Data movement in business (on which Moneyball had a much greater influence than most would probably admit). Business is not a zero-sum game like professional sports, so there’s more room for nuance and many positive examples of openness and transparency. Still, for all those who believe that openness and competition do not have to be at odds with each other, this will be fascinating to watch.

Ziller also makes a wonderful point about the importance of communicating meaning from analysis:

In the end, what Hollinger’s hire means is that the ability to do the hard analysis is important, but so is translating that to a language the people on the court can understand. That’s always been a wonderful Hollinger strength: making quant analysis accessible without dumbing it down. Even someone as brilliant as Morey, who has a team of quants, can’t always achieve that.

I’m reminded of a tale from Rick Adelman’s days in Houston. Morey’s team would deliver lengthy scouting reports to the team and coaching staff well before a game. It’d have player tendencies, shooting charts, instructions on match-up advantages — everything you could ask for to prep for a game. And out of all of the coaches and all of the players only two — Shane Battier and Chuck Hayes — would devour the reports. The rest (Adelman included) would leaf through, pretend to care and go play ball. That story might be an exaggeration on the part of the person who told it, but even if that’s the case, it shows how important accessibility is. You can build the world’s greatest performance model. And if you can’t explain what it means to the people using it, it’s worthless.