Community Rules as Documents

[Image: Some White Dudes Signing the U.S. Constitution]

Facebook’s Community Standards aren’t versioned, and appear to be engineered to resist the idea of versioning.


Yesterday I attended the Content Moderation at Scale meeting held in Washington DC, which had representatives of large social media platforms like Facebook, Twitter, Google, Wikipedia, Twitch, Vimeo and Trip Advisor talking about their processes for content moderation. These companies were joined by representatives from the Department of Homeland Security and Europol. This was an east coast version of a similar event held at Santa Clara University earlier this year, called Content Moderation and Removal at Scale. But if you look at the speaker list you can see it wasn’t exactly getting the band back together. It’s also interesting to note that bringing the event to DC brought in some new sponsors, including the Cato Institute, the Charles Koch Institute and Craig Newmark Philanthropies. Yes, quite a diverse (and powerful) set of actors are interested in this topic.

As you can imagine a lot of stuff happened, some of which you can see reflected in the Twitter stream, and the slides and videos are online for the moment. Unfortunately there was very little opportunity for question and answer with the audience after the presentations from the various companies, and by my recollection none of the participants asked questions of each other. So even though it had a light, breezy feel to it, this was a tightly controlled event, and in many ways it had to be, to get these people up on stage to talk about this very sensitive topic at all.

As I said, there was very little Q&A with the audience, but there was an online form people could use to submit questions to the moderator. Perhaps I wasn’t paying enough attention, but I didn’t hear many questions coming from this system, and there was very little time for discussion after the various presentations anyway. I submitted one question, which was not asked, and which is the topic of the remainder of this post.

One of the highlights of the day for me was Tal Niv of GitHub talking about how they use GitHub itself to make their policies available.

https://twitter.com/edsu/status/993517767340036097

This means that the policies are versioned, with a clear history of how they have changed and when. In addition, anyone with a (free) GitHub account can create issues that ask questions, or fork the policy and send a pull request with a suggested change. Now it appears that only 7 people have actually contributed to the repository, but it is the principle that matters here. For example, we can see that the repository has largely been shepherded by Joseph Mazzella, who is Product Counsel at GitHub. We can see what he changed, and his comments provide some context about why. I think this level of transparency is important for community building.
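One nice side effect of keeping policies in a public git repository is that the change history is available programmatically too. Here’s a minimal sketch using Python and GitHub’s commits API; the file path is my assumption about where the terms of service live in the site-policy repository, so adjust it to whatever policy interests you.

    import requests

    # List the most recent commits that touched a policy file in GitHub's
    # public site-policy repository. The file path below is an assumption
    # for illustration -- point it at whichever policy you care about.
    url = "https://api.github.com/repos/github/site-policy/commits"
    params = {"path": "Policies/github-terms-of-service.md", "per_page": 5}

    resp = requests.get(url, params=params)
    resp.raise_for_status()

    for commit in resp.json():
        info = commit["commit"]
        print(info["author"]["date"], "-", info["message"].splitlines()[0])

Anyone can run this without authentication, which is exactly the point: the history of the rules is as public as the rules themselves.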

Community was an essential and oft-repeated concept that participants used to talk about how their moderation guidelines work. Specifically, Facebook’s recently released Community Standards were held up by Facebook representatives Peter Stern and Kaitlin Sullivan. They described a “mini-legislature” that put these standards together, standards which really operate as the law of Facebook. Is it possible to think of 2 billion people as a community? Maybe? It’s certainly different than thinking of Vimeo as a community, which has 2 million users and lets you pay for the right to host your content with them, instead of profiting by selling your data to third parties. But I digress.

My question for the transparency panel was: has Facebook considered using a process like GitHub’s for making changes to their community guidelines clear? At the moment there doesn’t appear to be a change log for the guidelines, so it’s hard to understand how (or if) they have changed. There doesn’t appear to be a way to ask questions about the document, or to suggest changes. Facebook already has a GitHub account, so this wouldn’t be asking them to use something they weren’t using already. This level of transparency (at least for the changes) seems important if Facebook is taking their community guidelines seriously.

But the situation is actually worse.

The community guidelines themselves are broken up into different sections, which appear to live at different URLs. In itself this isn’t really a problem. But these aren’t separate URLs for separate documents. The community standards are a small JavaScript application, which dynamically fetches the content from Facebook’s servers when you click on the sections. If you want to see this for yourself, open up the Developer Tools in your browser and watch the network pane while clicking around. This means that it’s really quite difficult to snapshot a version of the community standards. For example, the Internet Archive is currently not able to render them; they just appear as an empty shell.
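You can see the same thing from a script. Here’s a quick sketch of the kind of check I mean: fetch the page with a plain HTTP client that doesn’t execute JavaScript, and look for a phrase that is clearly visible in the browser. The URL is where the standards lived when I looked, and the phrase is just an illustrative assumption.

    import requests

    # Fetch the Community Standards page without executing any JavaScript.
    # The URL and the sample phrase below are assumptions for illustration.
    url = "https://www.facebook.com/communitystandards/"
    html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}).text

    # The section text is fetched dynamically by the JavaScript application,
    # so a phrase that is plainly visible in the browser never appears in
    # the HTML that a simple GET returns.
    print("credible violence" in html.lower())  # expect: False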

Fortunately Webrecorder does seem to be able to capture the site, since it lets you interactively click the section links, and uses your browser to run the JavaScript application. But this won’t really work in a scenario where the standards are automatically captured every day, even if you have all the URLs for each section.

In addition, for whatever reason, it looks like a specialized HTTP library like requests_html, which uses a headless browser to execute JavaScript before rendering the DOM’s content, is unable to retrieve the content. I discovered all this because I was considering pointing an instance of diffengine at the Community Rules, but quickly realized it wouldn’t work without a significant amount of custom coding. Maybe I’ll still do it, if nobody else has tackled it yet.
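For the curious, here is roughly the shape of what I tried with requests_html; again the URL is an assumption, and in my tests the rendered page still came back without the section text.

    from requests_html import HTMLSession

    # The URL is an assumption: it's where the standards lived when I looked.
    url = "https://www.facebook.com/communitystandards/"

    session = HTMLSession()
    r = session.get(url)

    # render() runs the page in a headless Chromium instance (downloaded on
    # first use) so the JavaScript executes before the DOM is parsed.
    r.html.render()

    # Even after rendering, the policy text still doesn't show up here.
    body = r.html.find("body", first=True)
    print(body.text[:500])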

I might be reading between the lines a little too much, but it appears that Facebook’s choice of deployment for their Community Rules greatly inhibits the level of transparency and accountability to which they can be held. Not only is there no real way to track how these rules are changing, they appear to be engineered in such a way as to prevent them from easily being tracked. This is in stark contrast to organizations like GitHub, Tumblr (thanks Tarleton) and Wikimedia (MediaWiki embraces version history), who want to share detailed information about how their policies are changing.


It seems to me that version histories for these community documents are a necessary foundation for building online communities. I hope that future Content Moderation at Scale events (and I do hope there will be more) can address this issue. Version control of documents seems like a natural fit for a room full of lawyers and legal nerds, right? Also, another bit of unsolicited advice: maybe lose the cute session where everyone pretends to get their hands dirty doing content moderation, and use the additional time for actual, unmoderated Q&A.


Originally published at inkdroid.org on May 8, 2018.
