Visualizing Wiki Life Cycles

On the first day of WikiSym in Denmark last August, I spotted Alex Schroeder before the workshop began and went over to say hello. Pleasantries naturally evolved into a discussion about Purple Numbers. (Yes, I’ve got problems.) Alex suggested that unique node identifiers were more trouble than they were worth, because in practice, the nodes you wanted to link to were static. Me being me, my response was, “Let’s look at the numbers.” Alex being Alex, he went off and did the measurements right away for Community Wiki, and he did some follow-up measurements based on further discussions after the conference.    (LSP)

As it turned out, the numbers didn’t tell us anything useful, but our discussions firmly implanted some ideas in my head about Wiki decay rates — the time it takes for information in a Wiki page to stop being useful.    (LSQ)

I had toyed with this concept before. A few years ago, I came up with the idea of changing the background color of a page to correspond to the age of the page. A stale page would be yellowed; an active page would be bright white. I had originally envisioned basing the color on the number of edits. However, I realized this past week that I was mixing my metaphors. A few studies have indicated a strong correlation between frequent edits and content quality, so it makes sense to indicate edit frequency ambiently. However, just because content has not been edited recently does not mean the information itself is stale. You need to account for how often the page is accessed as well.    (LSR)
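To make this concrete, here is a rough sketch of how that kind of ambient coloring might be computed. It is an illustration only: the half-life constants, the 40/60 blend of edit and access recency, and the particular shade of yellow are placeholders I made up, not the metric Kirsten implemented (mentioned below) or anyone's shipping code.

    # Illustrative sketch only: the half-lives, the 40/60 blend, and the target
    # "yellowed" color are made-up placeholders, not any real Wiki's metric.

    WHITE = (255, 255, 255)
    YELLOWED = (245, 233, 183)

    def staleness(days_since_edit, days_since_access,
                  edit_half_life=90.0, access_half_life=30.0):
        """Return 0.0 (fresh) through 1.0 (stale), blending edit and access recency."""
        edit_fresh = 0.5 ** (days_since_edit / edit_half_life)
        access_fresh = 0.5 ** (days_since_access / access_half_life)
        # Weight access recency more heavily: an unread page goes stale even if
        # nobody has needed to touch the text recently.
        return 1.0 - (0.4 * edit_fresh + 0.6 * access_fresh)

    def background_color(days_since_edit, days_since_access):
        s = staleness(days_since_edit, days_since_access)
        r, g, b = (round(w + s * (y - w)) for w, y in zip(WHITE, YELLOWED))
        return f"#{r:02x}{g:02x}{b:02x}"

    print(background_color(1, 1))      # near-white: recently edited and read
    print(background_color(365, 180))  # yellowed: stale on both counts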

(At the Wikithon last week, Kirsten Jones implemented the page coloring idea. She came up with a metric that combined edits and accesses, which she will hopefully document on the Wiki soon! It’s cool, and it should be easy to deploy and study. Ingy dot Net suggested that the page should become moldy, a suggestion I fully endorse.)    (LSS)

This past Sunday, I had brunch with the Socialtext Bloomington Boys. Naturally, pleasantries evolved into Matthew and me continuing along our Wiki Analytics track, this time with help from Shawn Devlin and Matt Liggett. We broke Wiki behavior into a number of different archetypes, then brainstormed ways to visually represent the behavior of each of these types. We came up with this:    (LST)

https://farm1.static.flickr.com/149/388587151_3f730b0a5c_m.jpg    (LSU)

The x-axis represents time. The blue line is accesses; the green line is edits. Edits are normalized (edits per view) so that, under normal circumstances, the green line will always be below the blue one (because users will usually access a page before editing it). The exception is when software is interacting with the Wiki more than people are. The whole graph should cover a representative time-slice of that Wiki’s lifespan.    (LSV)
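If it helps to see the two curves as computations, here is a minimal sketch. The event-log format, a list of (day, page, kind) tuples, is an assumption for illustration, not any real Wiki's logging schema.

    from collections import Counter
    from datetime import date

    # Hypothetical event log of (day, page, kind) tuples, kind being "access"
    # or "edit". The blue line is raw accesses per week; the green line is
    # edits normalized per view, so it sits below the blue one whenever people
    # read more than they write.

    def weekly_series(events, start, weeks):
        accesses, edits = Counter(), Counter()
        for day, _page, kind in events:
            week = (day - start).days // 7
            if not 0 <= week < weeks:
                continue
            if kind == "access":
                accesses[week] += 1
            else:
                edits[week] += 1
        access_curve = [accesses[w] for w in range(weeks)]
        edit_curve = [edits[w] / accesses[w] if accesses[w] else 0.0
                      for w in range(weeks)]
        return access_curve, edit_curve

    events = [
        (date(2007, 1, 1), "FrontPage", "access"),
        (date(2007, 1, 1), "FrontPage", "edit"),
        (date(2007, 1, 9), "FrontPage", "access"),
    ]
    print(weekly_series(events, date(2007, 1, 1), weeks=2))  # ([1, 1], [1.0, 0.0])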

The red line indicates the median “death” rate of Wiki pages. After much haggling, we decided that the way to measure page death was to determine the amount of time it takes for a page to reach some zero-level of accesses. We’ll need to look at actual data to see what the baseline should be and whether this is a useful measurement.    (LSW)
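Here is one way that death measurement could be sketched in code. The zero-hit baseline and the median across pages come straight from the paragraph above; everything else (the per-page daily access lists, the handling of still-alive pages) is an assumption for illustration.

    import statistics

    def days_until_death(daily_accesses, baseline=0):
        """daily_accesses[i] = hits on day i after the page was created.
        Return the first day from which accesses never rise above the baseline
        again, or None if the page is still being read at the end of the log."""
        death = None
        for day, hits in enumerate(daily_accesses):
            if hits > baseline:
                death = None        # the page came back to life
            elif death is None:
                death = day
        return death

    def median_death(pages):
        deaths = [d for d in (days_until_death(p) for p in pages) if d is not None]
        return statistics.median(deaths) if deaths else None

    pages = [
        [5, 3, 2] + [0] * 60,            # dies quickly
        [4] * 40 + [1] * 20 + [0] * 30,  # lingers, then dies
        [2] * 90,                        # still alive
    ]
    print(median_death(pages))  # 31.5 days: the red line for this toy sample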

The red line helps distinguish between archetypes that may have the same access/edit ratio and curve. For example, on the upper left, you see idealized Wiki behavior. The number of edits is close to the number of accesses, and both are relatively constant across the entire Wiki over time. Because it’s a healthy Wiki, you’ve got a healthy page death rate.    (LSX)

On the upper right, you see a Wiki that is used for process support. A good example of this is a Wiki used to support a software development process. At the beginning of the process, people might be capturing user stories and requirements. Later in the process, they might be capturing bugs. Once a cycle is complete, those pages rapidly become stale as the team creates new pages to support a new cycle. The death line in this case is much shorter than it is for the idealized Wiki.    (LSY)

Again, one use of the Wiki isn’t better than the other. They’re both good in that they’re both augmenting human processes. The purpose of the visualization is to help identify the archetypes so that you can adjust your facilitation practices and tools to best support these behaviors.    (LSZ)

This is all theory at this point. We need to crunch some real data. I’d love to see others take these ideas and run with them as well.    (LT0)

Socialtext 2.0 Released

Congratulations to Ross Mayfield, Peter Kaminski, Adina Levin, and all the excellent folks at Socialtext for the release of Socialtext 2.0. Even bigger props for slipping in “Purple Consulting” in the screencast. I’ve been cranking so hard over the past six months, I didn’t have a chance to congratulate them on their Open Source release last July, so now I get to combine my commentary here. (In fact, I’m sitting on a bunch of Wiki-related posts right now that I need to push out; a lot of really cool stuff has been happening.) That’s good, because I have plenty to say.    (L72)

Socialtext 2.0 is an important release for three reasons. First, it doesn’t just look good, it’s highly usable. Adina and Pete deserve big-time credit for this. They’ve spent months painstakingly experimenting with and testing the design. More importantly, they haven’t just focused on making it easy to use; they’ve also agonized over how to accommodate expert usage.    (L73)

Have they succeeded? I think the personal home base concept is great. I love that Backlinks are visible on the page and get lots of love. I love their new Recent Changes interface (and I hope to see a Tag Cloud view of the all-pages index in the next release). I hate that a Recent Changes link is not on every Wiki page. Both Pete and Adina are well aware of this beef, and I’m also well aware of their reasons for not including it. Testing and user observation will tell which approach is better.    (L74)

Second, Socialtext 2.0 has a really cool REST interface. Chris Dent has been boasting about it for months, but I didn’t look at it myself until Kirsten Jones walked me through it last week. (Her WikiWednesday presentation from earlier this month is online.) It really is cool, and it’s also useful. Congrats to Chris, Kirsten, Matthew O’Connor, and Matt Liggett for their excellent work!    (L75)

What’s great about this API is that it could very well serve as a standard URI scheme for all Wikis. This would obviate the need for a separate SOAP or Atom API. You just have a regular Web app, and you get the API behavior for free.    (L76)
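As a sketch of what that looks like from a client’s point of view: a plain HTTP GET on the same URI a person would browse, with an Accept header choosing the representation, is the whole API. The host and paths below are made-up placeholders, not the actual Socialtext endpoints.

    import json
    import urllib.parse
    import urllib.request

    # Illustration only: wiki.example.com and the /data/... paths are
    # hypothetical placeholders, not Socialtext's real URI scheme.
    BASE = "https://wiki.example.com/data/workspaces/my-workspace"

    def get_page(page_name, accept="application/json"):
        req = urllib.request.Request(
            f"{BASE}/pages/{urllib.parse.quote(page_name)}",
            headers={"Accept": accept},
        )
        with urllib.request.urlopen(req) as resp:
            body = resp.read().decode("utf-8")
        return json.loads(body) if accept == "application/json" else body

    # page = get_page("Front Page")               # structured data for software
    # html = get_page("Front Page", "text/html")  # same URI, rendered for people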

For example, Alex Schroeder’s currently going through the same process that Chris went through a year ago with Atom and OddMuse. Implementing a REST API like this one would be an easier way around that problem.    (L77)

(This is also a great opportunity for me to mention WikiOhana again, which gained great traction at WikiSym last month and which now has a lively Wiki of its own. PBWiki recently announced its own Wiki API, which is a good thing. We are all part of the same Wiki family. Socialtext and PBWiki need to talk about how their two efforts can work together. That’s the WikiOhana Way.)    (L78)

The third important thing about Socialtext 2.0 is that it’s Open Source. (Big props to Jonas Luster and Andy Lester for finally making this happen.) Here’s the thing. I think the announcement a few months back was overblown by a lot of blogosphere hype. The reality of all corporate Open Source releases is that — in and of themselves — they’re mostly meaningless. Mostly, but not completely. The fact that Socialtext 2.0 is Open Source means that other Wiki implementations can benefit from the great work that the Socialtext developers have done, from the APIs to the user interface. That makes for a healthier ecosystem, which is good for everybody.    (L79)

That said, the reason the actual open sourcing of Socialtext 2.0 (or of any proprietary software project) is mostly meaningless is that the license is a critical but tiny part of what makes Open Source software interesting and important. The big part is the community and collaborative process, and a lot of things besides an open license are required to make that successful.    (L7A)

Before Socialtext went Open Source, I spent many hours talking to a bunch of people there about the impending release. I wanted to know how committed they were to making this a truly open and collaborative software project, because I felt the potential impact on the Wiki community was enormous. The answer I got was complex. The fact that everyone was willing to talk to me with no strings attached, in and of itself, demonstrated a commitment to openness, and I’m still grateful for that. The code itself will be a short-term bottleneck, as it needs a lot of work before outside developers will find it compelling. I also think the licensing terms are weaker than they need to be, although I also understand the outside pressures that make it so.    (L7B)

In short, I think the spirit is strong within Socialtext to fully realize the potential of this Open Source project, but there are also roadblocks. Hopefully, external pressures won’t squash that spirit. If Socialtext ever fulfills its potential as an Open Source company, it will not only help the ecosystem, but it will also tremendously benefit Socialtext as a business.    (L7C)

Purple Numbers and WIKIWYG

For a while, it was looking like I was going to break another personal blogging record last month, then things got so busy I had zero time to blog whatsoever. That means I’m in catch up mode again, so as usual, I’ll post in reverse chronological order (which in the blogosphere is really reverse reverse chronological order).    (JYN)

Yesterday, I spent the afternoon at Socialtext, where they were having an all-hands meeting. They graciously invited me to participate in the Open Space segment of their gathering, which meant quality time with Chris Dent and a rare opportunity to evangelize the Church Of Purple together. Of course, Chris has been spreading the Purple religion at Socialtext for a while now, so it wasn’t as much about evangelism as it was about next steps.    (JYO)

As I’ve mentioned many times before, browser-based WYSIWYG editors are an exciting development because they allow us to make Purple Numbers transparent in the authoring process. Right now, when you edit a PurpleWiki page, you see the node ID tags (e.g. {nid 123}). This is impossible to get around with the default browser text-editing widget. However, with a WYSIWYG editor, you can hide the Purple Numbers while still maintaining their associations with a node behind the scenes.    (JYP)
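To illustrate the round trip (this is a toy, not PurpleWiki’s or WIKIWYG’s actual code): the {nid ...} markers get tucked into id attributes in the HTML handed to the editor, so authors never see them, and they are restored when the edited HTML comes back.

    import re

    NID_IN_TEXT = re.compile(r"^(?P<body>.*?)\s*\{nid (?P<nid>\w+)\}\s*$")
    NID_IN_HTML = re.compile(r'<p id="nid-(?P<nid>\w+)">(?P<body>.*)</p>')

    def to_editable_html(wiki_text):
        """One paragraph per line; hide each trailing {nid ...} in an id attribute."""
        out = []
        for line in wiki_text.splitlines():
            m = NID_IN_TEXT.match(line)
            if m:
                out.append(f'<p id="nid-{m.group("nid")}">{m.group("body")}</p>')
            else:
                out.append(f"<p>{line}</p>")
        return "\n".join(out)

    def from_editable_html(html):
        """Put the {nid ...} markers back so node identity survives the edit."""
        out = []
        for line in html.splitlines():
            m = NID_IN_HTML.fullmatch(line)
            if m:
                out.append(f'{m.group("body")} {{nid {m.group("nid")}}}')
            else:
                out.append(re.sub(r"</?p>", "", line))
        return "\n".join(out)

    source = "Purple Numbers are stable addresses for paragraphs. {nid 123}"
    html = to_editable_html(source)
    print(html)                      # the author sees no {nid 123} in the editor
    print(from_editable_html(html))  # round-trips back to the {nid 123} form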

That’s the theory, anyway. In particular, I’ve been excited about WIKIWYG ever since Ross Mayfield showed me an early prototype last August. I had a personal bias, since Chris Dent and Matt Liggett helped write it, as did Casey West and the inimitable Brian Ingerson, whom I finally met last weekend at Tag Camp.    (JYQ)

Yesterday afternoon, we discussed Purple Numbers and WIKIWYG, and it was good. Then in the evening, Ingy and I spent a few hours trying to get WIKIWYG integrated into PurpleWiki.    (JYR)

We didn’t quite make it. Our biggest roadblock was a bug we discovered in Mozilla’s design mode that we can’t do much about. (My days of statically typed languages are well behind me.) But, we got something somewhat working, and I learned a heckuvalot. You can play with our semi-working demo.    (JYS)

WIKIWYG seems well-architected and is easy to customize. If you have relatively standard Wiki editing requirements, I highly encourage you to play with it. PurpleWiki has some special formatting funkiness (mainly due to the Purple Numbers), but we were able to get around this fairly easily. (That was possible largely thanks to PurpleWiki’s model of parsing to an intermediate data structure, then using view drivers to serialize. I wish more Wiki engines did this. I know Magnus Manske is thinking about doing this for MediaWiki, and I think Janne Jalkanen is already doing it with JSPWiki.)    (JYT)
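For anyone who hasn’t seen the pattern, here is a minimal sketch of what “parse to an intermediate data structure, then serialize through view drivers” means. These classes are illustrative stand-ins, not PurpleWiki’s actual tree or drivers.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Node:
        kind: str                       # e.g. "doc", "h1", "p"
        text: str = ""
        nid: Optional[str] = None       # the Purple Number, if the node has one
        children: List["Node"] = field(default_factory=list)

    def parse(wiki_text):
        """Tiny parser: '= Heading =' becomes an h1, anything else a paragraph."""
        root = Node("doc")
        for line in filter(None, wiki_text.splitlines()):
            if line.startswith("= ") and line.endswith(" ="):
                root.children.append(Node("h1", line[2:-2]))
            else:
                root.children.append(Node("p", line))
        return root

    def html_view(node):
        if node.kind == "doc":
            return "\n".join(html_view(c) for c in node.children)
        anchor = f' id="nid-{node.nid}"' if node.nid else ""
        return f"<{node.kind}{anchor}>{node.text}</{node.kind}>"

    def text_view(node):
        if node.kind == "doc":
            return "\n".join(text_view(c) for c in node.children)
        return f"= {node.text} =" if node.kind == "h1" else node.text

    tree = parse("= Wiki Analytics =\nParse once, render many ways.")
    print(html_view(tree))  # one driver emits HTML...
    print(text_view(tree))  # ...another emits WikiText from the same tree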

The Mozilla bug annoyed me, because it’s a show-stopper in some ways, and there’s not much I can do about it. I didn’t realize it, but all of the JavaScript WYSIWYG widgets actually switch to the browser’s “design mode” in order to handle WYSIWYG editing. As with many HTML editors, design mode does not handle structure cleanly, and you end up getting weird artifacts such as spurious break tags. Our problem was that we serialize node ID information as id attributes in the HTML tags. However, Firefox does not maintain those attributes correctly when you move content around.    (JYU)

I’ll report the bug (if folks have suggestions as to the best way to bring this to the right people’s attention, let me know), but it also puts the kibosh on my hopes for WIKIWYG and Purple Numbers. Even if the bug is fixed in the next version of Firefox, we’re still at the mercy of all the folks using older versions, as well as Internet Explorer and Safari, which have their own problems with design mode.    (JYV)

Chris and I discussed one workaround that I’m still pondering: render the Purple Number and have users be responsible for maintaining the association with the nodes. That’s the status quo, except users are doing it in WikiText rather than in WYSIWYG. Doing it in WYSIWYG certainly lowers the bar, and it’s probably the next best thing for us to do.    (JYW)