Our goal is to combine both approaches—to run a moderation service that tries to provide a baseline and to also have an open ecosystem where anyone who wants to innovate can come in and start building. I think this is particularly useful in cases where information is fast-moving and there's specialized knowledge. There are organizations out there already in the business of fact-checking, or of figuring out whether a verified account is actually a politician. They can start annotating and putting that information into the network, and we can build off that collective intelligence.
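To make the idea concrete, here is a minimal sketch, in TypeScript, of what such a third-party annotation might look like. The field names are illustrative assumptions loosely modeled on AT Protocol-style labels, not an exact schema.

```typescript
// A hypothetical shape for a third-party moderation label. Any service can
// publish labels like this; apps subscribe to the labelers they trust and
// decide how each label value affects what users see.
interface ModerationLabel {
  src: string;   // identifier (e.g. a DID) of the labeler, such as a fact-checking org
  uri: string;   // the post or account being annotated
  val: string;   // the label value, e.g. "misleading" or "verified-politician"
  neg?: boolean; // true if this label retracts an earlier one
  cts: string;   // creation timestamp, ISO 8601
}

// Example: an independent fact-checker annotating a post.
const label: ModerationLabel = {
  src: "did:example:factcheck-org",
  uri: "at://did:example:alice/app.bsky.feed.post/3k2abc",
  val: "misleading",
  cts: new Date().toISOString(),
};
```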
Recently there was a very high-profile incident on X where deepfake porn of Taylor Swift started spreading and the platform was slow to clamp down on it. What’s your approach to moderating deepfakes?
From the start we’ve been using some AI-detection services—image-labeling services—but this is an area where there’s a lot of innovation, and we’ve been evaluating alternatives.
This is also where a third-party labeling system could really come into play. We can move faster as an open collective of people—Taylor Swift has lots of fans who could help proactively identify content like this.
What are the benefits of federation—where a social network is decentralized, consisting of a bunch of independent servers instead of one central hub—for the casual internet user?
The goals here are to give developers the freedom to build and users the right to leave. The ability for people to host their own data means that users always have alternatives, and that their experience doesn’t have to come just from us. For example, a user might want to try a wholly different app or experience, or move to a parallel social network.
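As a rough illustration of the "right to leave," here is a hypothetical sketch in TypeScript. The resolver URL and response shape are invented for this example, but they show the general pattern: an account resolves to whichever server currently hosts its data, so apps follow the user rather than the other way around.

```typescript
// Hypothetical identity resolution in a federated network. If the user
// migrates to a new host, the identity record is updated and every app
// picks up the new host automatically.
async function resolveDataHost(handle: string): Promise<string> {
  // Assumed endpoint for illustration only; not a real service.
  const res = await fetch(`https://resolver.example/identity/${handle}`);
  const { serviceEndpoint } = (await res.json()) as { serviceEndpoint: string };
  return serviceEndpoint; // e.g. "https://pds.some-other-host.example"
}

async function fetchPosts(handle: string) {
  const host = await resolveDataHost(handle);
  // The same request works no matter which server the user has moved to.
  return fetch(`${host}/users/${handle}/posts`).then((r) => r.json());
}
```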
If someone were to use your protocol and build, say, a Taylor Swift deepfake porn community, is there anything you could do to stop that?
With the open web model, someone can always put their own website on the internet, but it doesn’t have to be indexed. We also play a role in surfacing and indexing content. For the really bad stuff out there, we try to make sure it never gets shown, by de-promoting it and not connecting to it.
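Here is a hypothetical sketch of that indexing policy in TypeScript. The label values and demotion factor are assumptions, but they illustrate the difference between refusing to index content at all and merely de-promoting it in ranked feeds.

```typescript
// Assumed label categories for illustration.
const NEVER_INDEX = new Set(["csam", "nonconsensual-nudity"]);
const DEMOTE = new Set(["spam", "graphic-violence"]);

interface IndexedPost {
  uri: string;
  labels: string[];
  rank: number;
}

function applyModeration(post: IndexedPost): IndexedPost | null {
  // The worst content is dropped from the index entirely.
  if (post.labels.some((l) => NEVER_INDEX.has(l))) return null;
  // Borderline content stays reachable but is heavily de-promoted in feeds.
  if (post.labels.some((l) => DEMOTE.has(l))) {
    return { ...post, rank: post.rank * 0.1 };
  }
  return post;
}
```

The design choice here is that hosting and indexing are separable: content can exist on someone’s own server without the indexer ever surfacing it.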
Can you explain your business model?
We really think that money follows value. There’s been skepticism that this whole model of social networking can work; people are still wondering what it even is. So, first of all, we’re trying to prove that this ecosystem has value to users and developers, and that it can kick off an era of open innovation.
From there, we’re going to monetize while following our values. Early on, Twitter was very open and everyone built on it. But then at some point they shut that down, right? They turned into much more of a platform, and less of something that looked like a protocol.
Our whole approach is getting back to protocols, not platforms, and there are certain guarantees that we’ve built into the protocol. It’s locked open. Once we have proven out this approach, I think there are lots of ways that money is going to flow through the ecosystem. We’re going to start exploring some of those models this year.