Emergent Learning Forum Gathering on Social Networks

I’ve been following Alex Gault’s blog for several months now. It is an outstanding source of articles and information on collaboration and Knowledge Management. Earlier this week, I learned that Alex had organized the program for last Tuesday’s Emergent Learning Forum meeting, “Social Networking, Relationship Capital and Expertise Management.” Both the speaker list (Spoke Software’s Andy Halliday, Intel’s Anita Lo, and Tacit Knowledge Systems’ David Gilmour) and the opportunity to meet yet another blogger in person were too much for me to resist.    (140)

Alex kicked off the meeting with an excellent introduction to Social Networks, providing some background material (see Social Networks) and a concise overview of the current marketplace.    (141)

Andy Halliday followed by talking a bit about Spoke Software, although the scope of his talk was much more general. Spoke’s tool identifies social networks within the company (including people’s contacts outside of the company) by analyzing outbound email, and then acts as a referral broker. Andy emphasized individuals’ ability to control their profiles and protect their data. Afterwards, I asked Andy whether they had identified a threshold size an organization must reach before it can benefit from such a tool. He answered 1,000 people, but added that Spoke’s extended search capabilities (for contacts outside of the company) increased the tool’s utility for smaller companies.    (142)
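To make the brokering idea concrete, here is a minimal sketch of how a tool in this vein might work; the data shapes and function names are my own invention, not Spoke’s. It derives a contact graph from outbound mail logs, then finds the shortest chain of acquaintances from an employee to a target contact.

    from collections import deque

    def contact_graph(mail_log):
        """mail_log: iterable of (sender, recipient) pairs from outbound email."""
        graph = {}
        for sender, recipient in mail_log:
            graph.setdefault(sender, set()).add(recipient)
            graph.setdefault(recipient, set()).add(sender)
        return graph

    def referral_chain(graph, start, target):
        """Breadth-first search for the shortest referral path."""
        seen, queue = {start}, deque([[start]])
        while queue:
            path = queue.popleft()
            for contact in graph.get(path[-1], ()):
                if contact == target:
                    return path + [contact]
                if contact not in seen:
                    seen.add(contact)
                    queue.append(path + [contact])
        return None  # no chain of acquaintances reaches the target

Each hop in the returned chain is someone who can vouch for the next introduction, which is presumably where the referral-brokering value lies.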

Spoke is marketing the tool to salespeople, but it is clearly cognizant of the wider opportunities. Andy cited a few, including search results based on your social network (essentially Brian Lincoln’s Collab:GrassRootsPeerReview idea) and a tool for sorting your inbox (including spam filtering). Spoke hosts a free online version of its tool called the Spoke Network.    (143)

Anita Lo, Intel’s Productivity Program Manager, gave a remote presentation on Intel’s recently deployed expert locator service. There were some technical difficulties and the talk was cut short before Anita could talk in depth about the system, but a few points caught my ear. Intel conducted an internal survey to identify its most salient knowledge management need, and expert location was the top priority. The result was a system called People Yellow Pages, based on a tool that they purchased but that Anita did not identify. The system seemed to depend on people keeping their profiles updated, as well as on an overall taxonomy for categorization, which was managed by a librarian and validated periodically via surveys. This approach is in stark contrast with Spoke Software’s and Tacit Knowledge Systems’, so it would have been nice to discuss how well it worked and whether Intel had evaluated any tools that automatically built and maintained people’s profiles.    (144)

(Seb Paquet posted some comments on Anita’s talk as well. Interestingly, Seb watched the talk remotely from his perch in Canada, and he may have been able to follow the talk more clearly than those in attendance!)    (145)

David Gilmour closed out the morning’s talks. I wrote about David’s excellent Harvard Business Review article before. His talk mirrored that article in some ways — an impassioned belief in the knowledge brokering model, coordinating collaboration rather than trying to force people to work together — but he provided much more detail in several areas. In particular, he spent much time discussing his company’s emphasis on individual privacy (as had Andy earlier) and noted that Tacit Knowledge Systems had several patents on ways to protect privacy, one indication of the value the company places on it.    (146)

David observed that individuals liked to use the tool to see their own profiles, which are automatically constructed based on their behavior (email, documents, Web, etc.). On the business end, he noted that pharmaceutical companies (which make up many of his clients) needed no persuasion regarding the ROI of his approach; the only question was whether or not the tool worked as advertised.    (147)

Jay Cross, the CEO of Emergent Learning Forum, posted some notes on the day’s talks. He also mentioned Alex’s other blog, the Collaboration Cafe, which I have added to my aggregator as well.    (148)

Huberman on Communities of Practice and Forecasting

Bernardo Huberman, the director of the Information Dynamics Lab at Hewlett-Packard, gave a talk at Stanford on January 8 entitled, “Information Dynamics in the Networked World.” Huberman covered two somewhat disparate topics: Automatically discovering Communities Of Practice by analyzing email Social Networks, and a general forecasting technique based on markets.    (TX)

The first part of his talk was a summary of his recent papers. I’ve alluded to this work here many times, mostly in reference to Josh Tyler and SHOCK. The premise of the work is similar to that posed by David Gilmour in his recent Harvard Business Review article.    (TY)

Huberman described a technique for discovering clusters of social networks within an organization by analyzing email. The key innovation is an algorithm that discovers these clusters in linear time. The algorithm, inspired by circuit analysis, treats edges between nodes as resistors. By solving Kirchhoff’s equations (which yields a “voltage” value for each node), the algorithm determines to which cluster a node belongs.    (TZ)
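Here is a rough sketch of the voltage idea as I understood it (my own reconstruction, not Huberman’s code): fix two “poles” at voltages 1 and 0, relax Kirchhoff’s equations, under which each interior node’s voltage is the average of its neighbors’, and split the graph at 0.5. A handful of relaxation passes over the graph is what makes the method roughly linear time.

    def voltage_clusters(graph, pole_a, pole_b, rounds=100):
        """graph: adjacency dict {node: [neighbors]}; the poles are two nodes
        believed to sit in different clusters, held at voltages 1.0 and 0.0."""
        v = {node: 0.5 for node in graph}
        v[pole_a], v[pole_b] = 1.0, 0.0
        for _ in range(rounds):
            for node in graph:
                if node in (pole_a, pole_b):
                    continue
                neighbors = graph[node]
                # Kirchhoff's equations with unit resistors: each node's
                # voltage is the average of its neighbors' voltages.
                v[node] = sum(v[n] for n in neighbors) / len(neighbors)
        high = {n for n in graph if v[n] > 0.5}
        return high, set(graph) - high

    # Two triangles joined by a single edge split cleanly:
    g = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d"],
         "d": ["c", "e", "f"], "e": ["d", "f"], "f": ["d", "e"]}
    print(voltage_clusters(g, "a", "f"))  # ({'a', 'b', 'c'}, {'d', 'e', 'f'})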

The second part of Huberman’s talk was enthralling: Predicting the near-term future using markets. This is not a new idea. A very topical example of a similar effort is the Iowa Electronic Markets for predicting the outcome of presidential elections.    (U0)

The methodology Huberman described (developed by Kay-Yut Chen and Leslie Fine) aggregates the predictions of a small group of individuals. It works in two stages. The first provides behavioral information about the participants, specifically their risk aversion. Huberman remarked that risk aversion is like a fingerprint; an individual’s level of risk aversion is generally constant. In the second stage, participants make predictions using small amounts of real money. The bets are anonymous. Those predictions are adjusted based on risk aversion levels, then aggregated.    (U1)
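Huberman didn’t spell out the adjustment formula, but one family of rules consistent with his description is a risk-weighted geometric mean of the individual reports. A toy sketch, with the beta exponents standing in for whatever the first stage actually measures:

    import math

    def aggregate(reports, betas):
        """reports: list of {outcome: probability} dicts, one per bettor;
        betas: per-bettor exponents derived from stage-one risk measurements."""
        outcomes = reports[0].keys()
        raw = {o: math.prod(r[o] ** b for r, b in zip(reports, betas))
               for o in outcomes}
        total = sum(raw.values())
        return {o: raw[o] / total for o in outcomes}

    reports = [{"red": 0.6, "blue": 0.4}, {"red": 0.3, "blue": 0.7}]
    print(aggregate(reports, betas=[1.2, 0.8]))  # risk-tolerant bettor weighted up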

Huberman’s group set up a toy experiment to test the methodology. There were several marbles in an urn, and each marble was one of ten different colors. Participants were allowed to observe random draws from the urn, and then were asked to bet on the likelihood that a certain color would be drawn. In other words, they were guessing the breakdown of colors in the urn.    (U2)

Although some individuals did a good job of figuring out the general shape of the distribution curve, none came close to predicting the actual distribution. However, the aggregated prediction was almost perfect.    (U3)

The group then tried the methodology on a real-life scenario: predicting HP’s quarterly revenues. They identified a set of managers who were involved in the official forecast, and asked them to make bets. The official forecast was significantly higher than the actual numbers for that quarter. The aggregated prediction, however, was right on target. Huberman noted that the anonymity of the bets was probably the reason for the discrepancy.    (U4)

The discussion afterwards was lively. Not surprisingly, someone asked about the Policy Analysis Market, the much maligned (and subsequently axed) brainchild of John Poindexter’s Information Awareness Office. Huberman responded that the proposal was flawed, suggesting (not surprisingly) that this technique would have been the right way to implement the market. The proposal was also poorly framed and, given the vagaries of politics, will most likely never be reconsidered in any form for the foreseeable future.    (U5)

I see many applications for this technique, and I wonder whether the Institute for the Future and similar organizations have explored it.    (U6)

As an aside, I love how multidisciplinary the work at HP’s Information Dynamics Lab is. I’ve been following the lab for about a year now, and am consistently impressed by the quality and scope of the research there.    (U7)

Knowledge Management as Information Brokering

David Gilmour, CEO of Tacit Knowledge Systems, wrote an excellent (and short) essay in the October issue of Harvard Business Review entitled, “How to Fix Knowledge Management.” The gist of the article:    (P3)

The problem is that most organized corporate information sharing is based on a failed paradigm: publishing. In the publishing model, someone collects information from employees, organizes it, advertises its availability, and sits back to see what happens. But because employees quickly create vast amounts of information, attempts to fully capture it are frustrated every time. Even the most organized efforts collect just a fraction of what people know, and by the time this limited knowledge is published, it’s often obsolete. The expensive process is time consuming, and it doesn’t scale well. (16)    (P4)

Gilmour’s solution:    (P5)

Instead of squelching people’s natural desire to control information, companies should exploit it. They should stop trying to extract knowledge from employees; they should instead leave knowledge where it is and create opportunities for sharing by making knowledge easy for others to find. This requires a shift away from knowledge management based on a publishing model, and a focus on collaboration management based on a brokering model. (17)    (P6)

Tacit Knowledge Systems’ system does this by scanning all of the email and other documents on a corporate network and building profiles of individuals based on this behavior. The system can then alert people to other individuals with similar interests, brokering an introduction between them. If you think there are potential privacy problems here, you’re not alone. Josh Tyler’s SHOCK works in a similar way, but distributes control of the profile to the individual; see his paper, “SHOCK: Communicating with Computational Messages and Automatic Private Profiles.”    (P7)
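For illustration only (this is not Tacit’s algorithm, and SHOCK’s differs as well), the matchmaking step might look something like this: build a keyword profile per person from the text they already produce, then surface pairs whose profiles overlap enough to justify brokering an introduction.

    from collections import Counter
    import math

    def profile(texts):
        """Build a term-frequency profile from a person's email and documents."""
        words = Counter()
        for text in texts:
            words.update(w.lower().strip(".,;:") for w in text.split() if len(w) > 3)
        return words

    def similarity(p, q):
        """Cosine similarity between two term-frequency profiles."""
        dot = sum(p[w] * q[w] for w in set(p) & set(q))
        norm = math.sqrt(sum(c * c for c in p.values())) * \
               math.sqrt(sum(c * c for c in q.values()))
        return dot / norm if norm else 0.0

    def candidates(profiles, threshold=0.3):
        """Propose introductions for pairs whose profiles cross a threshold."""
        names = sorted(profiles)
        return [(a, b) for i, a in enumerate(names) for b in names[i + 1:]
                if similarity(profiles[a], profiles[b]) >= threshold]

Note that in this naive version the profiles live in a central index, which is exactly the privacy concern; SHOCK’s answer is to leave the profile under the individual’s control.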

IT as Commodity and its Contribution to Productivity

There were two interesting articles about IT and productivity in the Harvard Business Review this past year: Nicholas Carr’s “IT Doesn’t Matter” (May) and Diana Farrell’s “The Real New Economy” (October).    (OK)

Carr’s title is a bit misleading. It’s not that IT no longer matters at all; it’s that IT is less important (for most companies — a subtle, but important disclaimer) from a strategic standpoint, because it has become a commodity. Carr writes:    (OL)

What makes a resource truly strategic — what gives it the capacity to be the basis for a sustained competitive advantage — is not ubiquity but scarcity. You only gain an edge over rivals by having or doing something that they can’t have or do. By now, the core functions of IT — data storage, data processing, and data transport — have become available and affordable to all. Their very power and presence have begun to transform them from potentially strategic resources into commodity factors of production. They are becoming costs of doing business that must be paid by all but provide distinction to none. (42)    (OM)

IT, according to Carr, is infrastructural technology, and like the infrastructural technologies of the past (e.g., railroads, the power grid), once it’s built out, the potential competitive advantages for individual companies go away. Just as most companies don’t develop strategies centered on their use of electricity, neither should they invest considerable resources in developing strategies centered on IT.    (ON)

Based on this argument, Carr proposes three “new rules for IT management”:    (OO)

  • Spend less    (OP)
  • Follow, don’t lead    (OQ)
  • Focus on vulnerabilities, not opportunities    (OR)

He closes his article by saying:    (OS)

IT management should, frankly, become boring. The key to success, for the vast majority of companies, is no longer to seek advantage aggressively but to manage costs and risks meticulously. If, like many executives, you’ve begun to take a more defensive posture toward IT in the last two years, spending more frugally and thinking more pragmatically, you’re already on the right course. The challenge will be to maintain that discipline when the business cycle strengthens and the chorus of hype about IT’s strategic value rises anew. (49)    (OT)

The “vast majority of companies” is the only disclaimer in the entire article, but it’s enough to appease me somewhat. Carr is absolutely right: most companies are not in a position to leverage IT in innovative and strategic ways, because they lack the in-house expertise. This is evident in how most companies overspend on IT — for example, on upgrading PCs and software too aggressively.    (OU)

However, while many things we associate with IT have indeed become commoditized, I still don’t think it’s right to call IT as a whole a commodity. Perhaps the problem is with the breadth of the term; we need something more concrete in scope.    (OV)

This is somewhat evident in Farrell’s article, where she addresses the famous Productivity Paradox. In the past 10 years, national productivity numbers have increased significantly, and yet, upon closer examination of the data, Farrell did not find a significant correlation with investment in IT. According to Farrell, a combination of innovations in technology and business practices is responsible for the productivity increase. These productivity increases intensify competition, which then starts the cycle of innovation anew.    (OW)

Farrell concedes Carr’s point about the rapid diffusion of IT eroding individual competitive advantage. However, she also notes that coupling IT with more unique capabilities restores that advantage.    (OX)

Farrell concludes her article by identifying three common practices in companies that have successfully invested in IT:    (OY)

  • They targeted investments at productivity levers that mattered most for their industries and themselves.    (OZ)
  • They carefully thought through the sequence and timing of investments.    (P0)
  • They didn’t pursue IT in isolation; instead, they developed managerial innovations in tandem with technological ones.    (P1)

This last point is crucial. IT is inherently different from other infrastructural technologies in that its potential for coevolution is enormous and still largely unexplored.    (P2)