Socialtext 2.0 Released

Congratulations to Ross Mayfield, Peter Kaminski, Adina Levin, and all the excellent folks at Socialtext for the release of Socialtext 2.0. Even bigger props for slipping “Purple Consulting” into the screencast. I’ve been cranking so hard over the past six months that I didn’t have a chance to congratulate them on their Open Source release last July, so now I get to combine my commentary here. (In fact, I’m sitting on a bunch of Wiki-related posts right now that I need to push out; a lot of really cool stuff has been happening.) That’s good, because I have plenty to say.    (L72)

Socialtext 2.0 is an important release for three reasons. First, it doesn’t just look good; it’s highly usable. Adina and Pete deserve big-time credit for this. They’ve spent months painstakingly experimenting with and testing the design. More importantly, they haven’t just focused on making it easy to use; they’ve also agonized over how to accommodate expert usage.    (L73)

Have they succeeded? I think the personal home base concept is great. I love the fact that Backlinks are visible on the page and get lots of love. I love their new Recent Changes interface (and I hope to see a Tag Cloud view of the all-pages index in the next release). I hate the fact that a Recent Changes link is not on every Wiki page. Both Pete and Adina are well aware of this beef, and I’m also well aware of their reason for not including it. Testing and user observation will tell which approach is better.    (L74)

Second, Socialtext 2.0 has a really cool REST interface. Chris Dent has been boasting about it for months, but I didn’t look at it myself until Kirsten Jones walked me through it last week. (Her WikiWednesday presentation from earlier this month is online.) It really is cool, and it’s also useful. Congrats to Chris, Kirsten, Matthew O’Connor, and Matt Liggett for their excellent work!    (L75)

What’s great about this API is that it could very well serve as a standard URI scheme for all Wikis. This would obviate the need for a separate SOAP or Atom API. You just have a regular Web app, and you get the API behavior for free.    (L76)
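To make that concrete, here’s a minimal sketch of what wiki access over plain HTTP looks like. The base URL, workspace name, and media type below are illustrative assumptions rather than Socialtext’s documented interface (real deployments also need authentication); the point is simply that reading and writing a page are just GET and PUT against the same URI a browser uses.

```python
# A hypothetical sketch of wiki access over plain HTTP, in the spirit of a
# standard REST URI scheme for Wikis. The URL, workspace, and media type
# are illustrative assumptions; real deployments also need authentication.
import urllib.request

PAGE = "http://wiki.example.com/data/workspaces/demo/pages/HomePage"

# Read the page as wikitext rather than rendered HTML.
req = urllib.request.Request(PAGE, headers={"Accept": "text/x.socialtext-wiki"})
print(urllib.request.urlopen(req).read().decode("utf-8"))

# Save an edit by PUTting new wikitext back to the same URI.
req = urllib.request.Request(
    PAGE,
    data="Hello from the REST interface!".encode("utf-8"),
    headers={"Content-Type": "text/x.socialtext-wiki"},
    method="PUT",
)
urllib.request.urlopen(req)
```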

For example, Alex Schroeder’s currently going through the same process that Chris went through a year ago with Atom and OddMuse. An easier way around this problem would be to implement these REST APIs.    (L77)

(This is also a great opportunity for me to mention WikiOhana again, which gained great traction at WikiSym last month and which now has a lively Wiki of its own. PBWiki recently announced its own Wiki API, which is a good thing. We are all part of the same Wiki family. Socialtext and PBWiki need to talk about how their two efforts can work together. That’s the WikiOhana Way.)    (L78)

The third important thing about Socialtext 2.0 is that it’s Open Source. (Big props to Jonas Luster and Andy Lester for finally making this happen.) Here’s the thing. I think the announcement a few months back was overblown by a lot of blogosphere hype. The reality of all corporate Open Source releases is that — in and of themselves — they’re mostly meaningless. Mostly, but not completely. The fact that Socialtext 2.0 is Open Source means that other Wiki implementations can benefit from the great work that the Socialtext developers have done, from the APIs to the user interface. That makes for a healthier ecosystem, which is good for everybody.    (L79)

That said, the reason the actual open sourcing of Socialtext 2.0 (and any proprietary software project) is mostly meaningless is that the license is a critical but tiny part of what makes Open Source software interesting and important. The big part is the community and the collaborative process, and a lot of things besides an open license are required to make those successful.    (L7A)

Before Socialtext went Open Source, I spent many hours talking to a bunch of people there about the impending release. I wanted to know how committed they were to making this a truly open and collaborative software project, because I felt the potential impact on the Wiki community was enormous. The answer I got was complex. The fact that everyone was willing to talk to me with no strings attached, in and of itself, demonstrated a commitment to openness, and I’m still grateful for that. The code itself will be a short-term bottleneck, as it needs a lot of work before outside developers will find it compelling. I also think the licensing terms are weaker than they need to be, although I also understand the outside pressures that make it so.    (L7B)

In short, I think the spirit is strong within Socialtext to fully realize the potential of this Open Source project, but there are also roadblocks. Hopefully, external pressures won’t squash that spirit. If Socialtext ever fulfills its potential as an Open Source company, it will not only help the ecosystem, but it will also tremendously benefit Socialtext as a business.    (L7C)

Fighting WikiSpam: Eaton and Shared Blacklists

WikiSym 2005 was awesome. Massive props to Dirk Riehle and the program committee for throwing an outstanding event and drawing tons of great, great people. With Wikimania last August and WikiSym this past week, the Wiki community is really starting to gel. And it’s about time. Can you believe Wikis are 10 years old?    (JXD)

Now the bad news: I walked away with some action items. How do I get myself into these messes?!    (JXE)

The first action item can be traced back to an ad hoc meeting that happened at Wikimania regarding WikiSpam. On August 6, a group of Wiki developers — me (PurpleWiki), Alex Schroeder (OddMuse), Brion Vibber (MediaWiki), Thomas Waldmann (MoinMoin), Sven Dowideit (TWiki), Janne Jalkanen (JSPWiki) — along with John Breslin and Jochen Topf, got together to discuss ways we could collaborate on fighting WikiSpam. Our goal was to identify the simplest possible first step and not to get mired in process discussions.    (JXF)

Since all of us were already maintaining URL blacklists, we decided to merge them and host the merged list as a SourceForge project. We agreed on a standard format (which I’ll document and post soon), and we agreed to send our respective lists to Alex, who already has scripts to slice, dice, and merge them.    (JXG)
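To give a flavor of the merge step (these are not Alex’s actual scripts), here’s a toy sketch, assuming the common convention of one regex pattern per line with blank lines and # comments:

```python
# A toy sketch of merging URL blacklists; assumes one regex per line,
# ignoring blank lines and # comments. Not Alex's actual scripts.
import sys

def merge_blacklists(paths):
    patterns = set()
    for path in paths:
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith("#"):
                    patterns.add(line)   # set membership dedupes for free
    return sorted(patterns)

if __name__ == "__main__":
    # Usage: python merge.py purplewiki.txt oddmuse.txt ... > merged.txt
    for pattern in merge_blacklists(sys.argv[1:]):
        print(pattern)
```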

One of my action items, then, was to create the SourceForge project. I did that immediately, but for some reason, the project was rejected. Thus began a month-long go-around with SourceForge support, where I tried to discover why they had rejected the proposal. In the end, the project was approved, and I never got an answer as to why it had been rejected in the first place. By that point, I was mired in other work, so I never followed up.    (JXH)

WikiSym was the kick in the butt I needed to follow up. On Sunday, Sunir Shah hosted an antispam workshop, which about 40 people attended. First, Sunir reviewed existing techniques (many of which are listed at MeatBall:WikiSpam). Then we broke out into groups.    (JXI)

In my breakout, I described what we had agreed on at Wikimania. Then Peter Kaminski described a very cute idea he had for making it easy to fight WikiSpam. In a nutshell, Peter suggested we write a simple drop-in replacement CGI wrapper that would filter a POST payload for spam and call the real CGI script (be it a Wiki, a blog, or anything else) only if the payload were spam-free. Such a wrapper would let users install spam protection for any CGI script without writing a single line of code and without doing any complex configuration. It wouldn’t require any special access to your web server, since it would just be another CGI script. And you could easily add other spam-fighting measures, such as throttling and IP blacklists.    (JXJ)
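To make the mechanics concrete, here’s a minimal sketch of that kind of wrapper. This is not Eaton’s actual code (Eaton is Public Domain, so go read the real thing); the paths, the blacklist format, and the Python implementation are all assumptions for illustration.

```python
#!/usr/bin/env python3
# A minimal sketch of the wrapper idea, not Eaton itself: the paths and
# blacklist format here are illustrative assumptions.
import os
import re
import subprocess
import sys

REAL_CGI = "/usr/lib/cgi-bin/wiki.cgi"    # hypothetical wrapped script
BLACKLIST = "/etc/spam/blacklist.txt"     # hypothetical file, one regex per line

def load_patterns(path):
    """Read blacklist regexes, skipping blank lines and # comments."""
    with open(path) as f:
        return [re.compile(line.strip())
                for line in f
                if line.strip() and not line.startswith("#")]

def main():
    length = int(os.environ.get("CONTENT_LENGTH") or 0)
    payload = sys.stdin.buffer.read(length) if length else b""
    body = payload.decode("utf-8", errors="replace")

    if any(p.search(body) for p in load_patterns(BLACKLIST)):
        # Spam: answer the request ourselves; the real script never runs.
        sys.stdout.write("Status: 403 Forbidden\r\n")
        sys.stdout.write("Content-Type: text/plain\r\n\r\n")
        sys.stdout.write("Rejected as spam.\n")
        return

    # Clean: hand off to the real CGI script, replaying the POST body on
    # its stdin; its headers and output flow straight back to the server.
    subprocess.run([REAL_CGI], input=payload, env=os.environ.copy())

if __name__ == "__main__":
    main()
```

The appeal of the design is that the wrapper speaks plain CGI on both sides, so the web server can’t tell it from the real script, and the real script can’t tell it from the web server.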

I thought it was a brilliant idea. So Peter and I sat down afterwards and whipped it up. Took about an hour. It’s called Eaton, it works, and it’s Public Domain. Peter Kaminski has already blogged about it, and there’s some important commentary there from Jay Allen, the creator of MT-Blacklist.    (JXK)

It’s a proof of concept, and it won’t scale. It can and should be improved, and I’d encourage folks to do so. Nevertheless, it’s pretty cool. Bravo to Peter for a very clever idea.    (JXL)

By the way, the first person to figure out the origins of the name “Eaton” wins a cookie.    (JXM)