Editorial

Memo to Facebook: A Platform Isn't a Community

By Carrie Melissa Jones
Tech companies have been confusing the term “platform” with the term “community” for some time now. And that's causing a lot of problems.

What do data security and scalable online moderation have in common? Both are plaguing Facebook right now, and both mark a profound milestone in the evolution of the Information Age.

For many years, tech companies and consumers alike have been confusing the term “platform” with the term “community.” “Platform,” like its more techie counterpart, “network,” has become a catch-all term for any mechanism that lets members exchange things with one another. But a platform does not a community make. The confusion between these words seems like an innocent enough mistake, but it is in fact at the root of the questions plaguing large platforms today. It is why responsibility for users’ trust and safety falls to some of the lowest-paid and most underappreciated workers (moderators), and it is why safety will never be achieved at giant platform scale.

When we treat platforms as though they are communities, the question we are left to ask is: How do we keep everyone safe? At this scale, that is impossible, at least in today’s world. We need to ask a better question.

Community professionals know this. You build one community at a time. That’s because creating a lasting community means building safety and belonging among groups of people with a shared sense of identity, and not everyone can be safe with everyone else. In fact, the first thing a community professional does when creating a community from scratch, or optimizing an existing one, is to understand which common identities exist across the member base and build value for those people. The idea of any of the large platforms doing this wholesale is preposterous at best. The immediate reaction would be: we need to break this apart into smaller communities. And that is exactly the root we need to return to. So how do we do it?

The first step is to acknowledge the differences between platforms and communities and decide which path to take. Today, I propose the best path is to reorganize the giants once and for all.

What Is the Key Difference Between Communities and Platforms?

Communities have two key characteristics: members have a shared sense of identity and participate in shared experiences. I don’t know about you, but I’ve never felt a shared sense of identity with someone just because we’ve both used Facebook or Instagram or watched videos on YouTube. In Facebook’s case, it’s impossible for me to be in community with 2 billion people at a time.

But none of the large platforms that have come to define the first 20 years of the 21st century want to admit they’re not building community, and many have no clue how to do so. However, there are people all around the world who build community professionally, study it, and could tell you that strong identity and community self-governance are key to creating community resilience and long-term safety.

If you were to hand the problem of moderating a network this large to a community strategist, the first thing they would do is organize around organizational goals, and then around member goals. Where they would get stuck is in thinking about members: there is no one monolithic Facebook member because there is no one monolithic Facebook community.

Instead, what is happening on these platforms is a lot like what is happening in the blockchain “community” space. A lot of networks are cropping up, but no trust is in place. Trust in a network must be created through community building. Only then have you built something more than a network of people exchanging things. You’ve built something strong enough to save lives.



What’s Next?

Back in February, a group of lawyers, academics, and content moderation specialists gathered on the Santa Clara University campus for the first Content Moderation and Removal at Scale conference. There, policymakers from large companies such as Facebook, Airbnb, Pinterest and Patreon discussed moderation at giant platform scale. Dave Willner, head of community policy at Airbnb and former head of content policy at Facebook, spoke about the difficulty of such a notion. Paraphrased in a tweet by Emma Llansó, director of the Center for Democracy and Technology’s Free Expression Project, he said: "There’s no 'average' Facebook user, [you] can’t base policy on what The Community wants [because] 'The Community didn’t want any one thing.'"

This is why we need to stop creating policy for our platforms, or thinking about them, as a single community in the first place. It would be like one worldwide government trying to set policy for individuals from Topeka, Kansas to Nairobi, Kenya. What works in one community does not work in all.

So why are we still building policies and platforms governed by one main body, that lack self-determination, and that propose solutions without actually incorporating insight from the communities that give them power?

Historically, companies have stayed “above” moderation in order to reduce liability. I believe the same thinking leads company leaders not to ask questions and not to probe further into what is being done with user data. "It’s a network, and the network will do what it does" is the rationale behind many moderation and company decisions. And it’s hurting all of us.

The law has made it possible, even preferable, for these companies to stand back and watch as the network damages users’ lives. Section 230 of the Communications Decency Act of 1996 (a common name for Title V of the Telecommunications Act of 1996) has long dictated how these platforms’ moderation strategies work. Section 230 states explicitly that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” That is, the platforms are not responsible for what content creators publish, unless (and this is a major caveat) they are "aware" of it. So for years now, technology platforms have avoided vigilant content moderation in order to limit their liability.

To Live Up to the 'Community' Name, We Need a New Model

Imagine a community where no leader is held liable for the actions of its members, where atrocities and harassment run rampant and people stay silent in the face of injustice. Wait, we know what that looks like. It's time the platform creators let go of some of their power and create a new model, breaking apart these giant platforms into smaller, self-governing bodies. Just as the U.S. federal government does not try to regulate every municipality, these platforms should think about breaking into smaller pieces, so they may actually have some hope of forming lasting communities.

In last week’s apology to the American people in a full-page ad in The New York Times, Facebook CEO Mark Zuckerberg signed off with: “Thank you for believing in this community.”

What community is he talking about? I look up from my Facebook newsfeed, alone in my office, scratch my head and wonder.


About the Author

Carrie Melissa Jones

Carrie Melissa Jones is a community leader, entrepreneur and researcher who has been involved with online community leadership since the early 2000s. She is currently the CEO of Gather Community Consulting.

Main image: Alessio Jacona