Frances Haugen: Facebook Whistleblower


She was the source of the damning "Facebook Files." Here's how she thinks DAOs and blockchains could fix the company no one likes.

You don't like Facebook. Democrats don't like Facebook. Republicans don't like Facebook. It may be the one thing most people in the United States agree on.

But the "Facebook problem" is almost certainly worse than you think.

That's the message of Frances Haugen, the former Facebook product manager who disclosed more than 20,000 pages of internal documents in September 2021, shedding light on the company's darkest corners. The documents formed the basis of The Wall Street Journal's "The Facebook Files" series, which found that the company, whose corporate name is now Meta (FB), "knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands."

This article is part of Road to Consensus, a series that features speakers and the big ideas they will be discussing at Consensus 2022, CoinDesk's festival of the year, June 9-12 in Austin, Texas.

Just some of the damning findings: Facebook's algorithms knowingly made the site "angrier" to boost engagement; Facebook deliberately tried to recruit preteen users (despite its minimum age of 13); and Facebook knew Instagram was toxic for teenage girls.

Haugen then testified before the U.S. Congress, where she charged that Facebook's products "harm children, stoke division and weaken our democracy." She lives in Puerto Rico and describes herself as a "champion of accountability and transparency in social media." She recently wrote an essay for The New York Times praising Europe's Digital Services Act, which she says will "lift the curtain for the first time on the algorithms that decide what we see and when we see it in our feeds." She's excited about the law but knows it's just a start.

One possible solution?

Blockchain.

On a recent Zoom call, Haugen proposed a startling thought experiment: What if Facebook had been founded as a decentralized autonomous organization? "If there had been a DAO that regulated Facebook - if it had been owned by the users - I don't think we would have said, 'Hey, keep putting random [stuff] into our accounts that we didn't ask for,'" Haugen says.

She suspects that a "distributed" social network - one that's truly composed of our friends and family - might offer a better way forward. Haugen also explains why the problem is more global than you think, what she envisions as a solution (hint: it's not good news for CEO Mark Zuckerberg) and why Elon Musk's purchase of Twitter (TWTR) could be a win for social media.

It's now been more than seven months since the "Facebook Files." What do you think is the biggest issue today?

I would say it's algorithmic ranking. In March 2021, Nick Clegg [former deputy prime minister of the U.K. and vice president of global affairs at Meta] published - oh, bless him - an essay titled "You and the Algorithm: It Takes Two to Tango." I strongly recommend that you read it. It is a true work of art.

He says [and I paraphrase], "Hey, you guys keep blaming us for what you see on Facebook, but let's be honest. You chose your friends, you chose your interests. It takes two to tango. Watch where you point your finger, because four fingers are pointing back at you."

I take it you don't find that particularly convincing!

Talk about victim blaming. He said this knowing that Facebook researchers had run the same experiment over and over: they set up blank accounts and followed moderate interests. In the case of Instagram, the accounts followed pages about healthy eating. And let's face it, we could all eat a little healthier.

Or a lot healthier, in my case.

And all [the researchers] did was click on the first five or 10 things every day and follow all the hashtags that were suggested. Within two to three weeks, the accounts were being actively served pro-anorexia and self-harm content. There was no "tango" between two parties. It was just the escalation of engagement-based ranking.

Can you elaborate on why that's so important?

I'll give you a little example. I was interviewed about two weeks ago by a journalist who had just had a baby and had created a new account for his baby. It's a very cute baby. Only cute baby photos are posted on this account. There is one photo per day. The baby has no friends other than other cute babies, right?

To be honest, that sounds like a pretty great account.

All they post are cute baby photos. And yet, 10% of his feed is of children who are suffering. They're kids in the hospital with tubes coming out of them. They're severely deformed children. Dying children. How in the world do you go from a cute baby to a mutilated child?

Because all the algorithm knows is that there is such a thing as babies, and that some baby content gets higher engagement than other content. Even though he hasn't commented on or liked any of those photos, he's probably lingering on them.

That's pretty darn disturbing.

Think about what that does in other contexts, right? With teenagers, it drives them to starve or kill themselves. With adults, it drives people toward the extremes. When they ran the experiment from a center-left starting point, the account was pushed toward "Let's kill the Republicans." From a center-right starting point, it was pushed toward white genocide. And that's not a time horizon of a month. That's a time horizon of a week.

It's terrifying.

Think of what that does to society. And this is where it gets really scary. This is the reason I get up every single day. The version of Facebook that we interact with in the United States is the cleanest, purest version of Facebook in the world.

In 2020, 87% of Facebook's spending on combating misinformation went to English-language content, even though only 9% of users speak English. Most people don't realize that there are at least a billion people in the world - if not 2 billion - for whom the Internet is synonymous with Facebook.

2 billion?

Facebook has gone into their countries and said, through Free Basics, "Hey, if you use Facebook, your data is free. If you use anything on the open web, you have to pay for the data yourself."

So imagine what that market dynamic does when it pushes everyone onto Facebook. Now you have situations where countries are very unstable. The most fragile countries in the world are often linguistically diverse - they speak smaller languages - and Facebook's business model can't support safety systems in those languages.

If we focus on censorship instead of focusing on product safety, we're basically leaving behind the people in the most vulnerable places in the world. And those people have no way to leave Facebook, even if they disagree with it.

What role do you think blockchain plays in all of this, as a potential solution?

What excites me the most is the question: [What if] there had been a DAO in 2008 that governed Facebook? To be clear, the problem with Facebook is not your family or your friends.

Facebook has run experiments that are all about giving you more of the content you've actually opted into - content from people you're actually friends with, pages you've actually followed, groups you've actually joined. When [it] does that, there's less hate speech, less nudity, less violence. They just say, "Hey, let's trust your judgment and give you more of what you're asking for." It's not rocket science.

But Facebook has had to get you to consume more content every single quarter since 2008, and family and friends have let it down. Facebook needed them to produce more and more content, and when they didn't, it started doing all these weird little hacks.

So how does a DAO fit into this picture?

If there had been a DAO regulating Facebook - if it had been owned by the users - I don't think we would have said, "Hey, keep putting some [crap] in our accounts that we didn't want." We would still have something like the Facebook of 2008. So I'm cautiously optimistic about exploring different business models.

Also, I think it would be easier to run a version of Facebook that is about our family and friends. Family and friends are not the problem. A system of amplification that uses algorithms to direct our attention - that's the problem.


If you can make it decentralized and build a system that's very similar to Facebook, but only for your own family and friends, that would be much safer.

What do you think needs to happen to solve the Facebook problem?

I think it's at least partly a governance issue. One of Facebook's fundamental problems is that it won't acknowledge its power. It can't acknowledge responsibility. Take, for example, the situation where high school kids are suffering broken bones because kids are fighting in order to post the videos on Instagram. Think about that for a moment. What would it take for Instagram to delete those accounts? Why doesn't it shut them down?

They can't acknowledge responsibility. Unless there's a major leadership change ... I spoke at a risk conference yesterday, and the CEO of the trade organization said, "It's not about having checklists. It's not about making sure someone goes through a form. It's about creating a culture of accountability." And Facebook fundamentally lacks such a culture of accountability.

What do you hope will happen?

I hope the [U.S. Securities and Exchange Commission] acts. One of the things we will ask them to do is to require Mark [Zuckerberg] to sell some of his shares. That would allow shareholders to intervene. So that's my big hope. Who knows if it will happen or not?

And I think the fact that the DSA [Digital Services Act] passed means that we will be able to develop solutions.

Can you give us an example?

I'll give you one that works in any language. Should you have to click on a link to share it?

It makes so much sense.

With Twitter you have to, with Facebook you don't. And it reduces misinformation by 10% to 15%.

Speaking of Twitter, you once suggested that a private Twitter - owned by [Tesla (TSLA) CEO Elon] Musk - might actually be safer. Why is that?

Remember when I talked about how if we had a DAO for Facebook, we probably wouldn't have gotten a bunch of [crap] fed into our accounts? A lot more would have stayed with our family and friends. There are a lot of non-content-based solutions.

What do you mean by that?

It means focusing on product safety, not on magical AI [artificial intelligence] that takes content down [as a form of censorship]. But you can only do that if you're willing to sacrifice small slices of profit and a small number of users. One reason I'm rooting for Elon is that he was the first to say publicly that we're going to turn off the bots.

One thing that most people don't realize - and this might be interesting to this particular [crypto] audience - is that when we talk about dollars, we have very detailed accounting laws, right? So when you want to say I have a dollar or I have a liability, [there are] very specific rules about when you have to recognize those things.

We don't have similar rules for what counts as a person, but companies' valuations are incredibly dependent on the number of users they say they have. I've talked to people who run the biggest CAPTCHA services in the world, and they say there are sites where 90% of the reported users are bots.

Damn.

That's right. And these sites are deliberately choosing very lax CAPTCHA settings because they want bigger numbers, but the biggest threat to us is bots. And Elon said, "We're going to use the fact that we don't have to report user numbers anymore to clear the air."

Let's close on a personal note. What you did was incredibly brave. If you don't mind sharing, what has the aftermath been like for you?

You know, it's interesting. People imagine the gnashing of teeth, the drama, and all those things. But I had a remarkably uneventful transition.

I'm surprised and delighted to hear that.

I guess I get interviewed a lot more often. But no one recognizes me in Puerto Rico, which is great. Even online, it's crazy. I think I've gotten maybe 15 to 20 mean things sent to my DMs. So if you want to be the first ... [Both laugh.]

And what I find remarkable is that women in the public eye [often] get sexually harassed really badly. I don't even get sexually harassed, which is shocking to me as someone who has worked at four social networks. So I can't really complain. It's been pretty relaxed.

I'm crossing my fingers that it stays that way. Thanks for doing this, and I'll see you on our new DAO version of Facebook!