How might a [distributed model of trust] system work? …The open follower model (a la Twitter) is likely to be the dominant social motif of most web apps, … the algorithm for calculating trust should be completely open, or completely closed…I agree with Newmark that it would be useful — and potentially radical — to have a trusted trust framework that is not controlled by a market player — like Google or Yahoo — or any government. Newmark suggests there might be a role for government in such a system, but in a public-private setting, where checks and balances involve non-governmental groups.
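To make the "completely open" option concrete, here is a minimal, hypothetical sketch of what an open trust algorithm over a follower graph might look like: a follow is treated as a small endorsement, and trust propagates PageRank-style so that anyone can inspect and recompute every score. The function name, weights, and graph are all invented for illustration; no real system described in this post works this way.

```python
def trust_scores(followers, iterations=20, damping=0.85):
    """Compute a PageRank-style trust score for each user.

    `followers` maps each user to the set of users who follow them;
    a follow is treated as a small endorsement of trust.
    """
    users = set(followers)
    for fs in followers.values():
        users |= set(fs)
    n = len(users)
    # Everyone starts with equal trust.
    scores = {u: 1.0 / n for u in users}
    # Invert the follower map: who does each user follow?
    following = {u: [v for v, fs in followers.items() if u in fs] for u in users}
    for _ in range(iterations):
        new = {}
        for u in users:
            # Trust flows in from each follower, split evenly across
            # everyone that follower endorses.
            incoming = sum(
                scores[f] / max(len(following[f]), 1)
                for f in followers.get(u, ())
            )
            new[u] = (1 - damping) / n + damping * incoming
        scores = new
    return scores

# Example: carol is followed by both alice and bob, so she ends up most trusted.
graph = {"alice": {"bob"}, "bob": {"alice"}, "carol": {"alice", "bob"}}
scores = trust_scores(graph)
assert scores["carol"] == max(scores.values())
```

The point of the sketch is the openness, not the particular formula: because the inputs (the follower graph) and the algorithm are public, any third party — not just Google, Yahoo, or a government — can verify the scores independently.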
Howard Owens, along with Ingram, Alex Howard and others picked up the discussion today on Twitter. Unfortunately, they didn’t use a hashtag, so there’s no easy way to point to their discussion (here’s a link to the Owens/Ingram exchanges). Luckily, a vacationing Ingram took time away from shuffleboard and bocce to author a 1,000-word summary of the day’s exchanges with the post Anonymous Comments: Are They Good or Evil? (I guess neither he nor Howard is a basketball fan.)
In a nutshell, Howard said that anonymous comments were an abomination (I’m paraphrasing somewhat) and were in fact unethical, since commenters on a news site had a “fundamental right” to know the identity of the other people commenting. I tried to make a number of points, including the fact that anonymity is a red herring, and that the more important thing in encouraging a strong and healthy community conversation is standards of behaviour, regardless of anonymity. …It is virtually impossible to actually verify someone’s identity online…I believe that one of the principles of running a media site is that you should open up interaction to as many people as possible….Persistent (and quasi-verified) identity agents like Facebook Connect and OpenID can help with some of the problems that online comments have …
The series of exchanges reminded me of a series of conversations I had last weekend in Austin. Without betraying any confidences, some awesome people are thinking creatively about ways to help us understand, share and improve our online identities and reputations. Hopefully, some news organizations, besides Owens’ Batavian, are thinking along the same lines.
[Update 1] I inadvertently left out the other piece that kicked off these reputation conversations this weekend, Sarah Lacy’s interview with John Temple, the editor of the Pierre Omidyar local news project that everyone’s talking about.
For a site that intends to be very community oriented, there was one big shocker: Peer will not have comments. “(Comments) descend into racism, hate, ugliness and reflect badly on news organizations that have them,” said Temple. Why? Because people do not have to show their faces when they comment so there’s no sense of responsibility, he argued. “We think anonymity is a huge problem when it comes to comments,” he said.
[Update 2] Mathew pointed to this post by Steve Buttry, who wants it both ways.
In Washington, we have lots of government workers or workers for government contractors or nonprofit associations who might be actually barred or strongly inhibited from commenting publicly on some issues. I wonder if we can have it both ways. How would it work to provide an incentive for people submitting to some form of verified identity or registering through Facebook Connect (not verified, but Facebook is a place where most people identify themselves accurately)?
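Buttry's "both ways" idea can be sketched as a simple tiered policy: commenters choose an identity level, and the site rewards the more-identified tiers (a badge, a ranking boost, no pre-moderation queue) without banning anonymity outright. The tier names and perks below are my invention for illustration, not anything Buttry or any site has actually specified.

```python
from dataclasses import dataclass

# Hypothetical identity tiers and the incentives attached to each.
TIERS = {
    "anonymous":        {"rank_boost": 0.0, "badge": None,       "premoderated": True},
    "facebook_connect": {"rank_boost": 0.5, "badge": "FB",       "premoderated": False},
    "verified":         {"rank_boost": 1.0, "badge": "Verified", "premoderated": False},
}

@dataclass
class Comment:
    author: str
    tier: str
    votes: int = 0

def display_rank(comment):
    """Higher-ranked comments float to the top; identity earns a boost."""
    perks = TIERS.get(comment.tier, TIERS["anonymous"])
    return comment.votes + perks["rank_boost"]

def needs_premoderation(comment):
    """Anonymous comments wait in a moderation queue; identified ones post immediately."""
    return TIERS.get(comment.tier, TIERS["anonymous"])["premoderated"]

anon = Comment("throwaway123", "anonymous", votes=3)
named = Comment("jane_doe", "verified", votes=3)
assert display_rank(named) > display_rank(anon)
assert needs_premoderation(anon) and not needs_premoderation(named)
```

The design choice matters for Buttry's Washington example: the government contractor who can't comment under her own name still gets to participate, she just forgoes the perks that identified commenters earn.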