Frances Haugen, a former employee, testified before the United States (US) Senate on the impact of Facebook’s policies and technologies, with case studies and internal research from the company confirming what has long been suspected: Facebook and its products create and exacerbate divisions, hatred and harm in today’s society.

“I’m here today because I believe Facebook’s products harm children, fuel division and weaken our democracy,” she said in her opening statement to lawmakers. “These problems are solvable. A safer, more enjoyable social media that respects freedom of expression is possible. But there is one thing that I hope everyone takes away from these disclosures: Facebook can change, but clearly isn’t going to do it on its own.”

There were several takeaways from Haugen’s testimony. Three stand out.

Facebook’s engagement-based algorithms and its financial incentives are at the heart of the damage it causes and must be regulated; the company’s trove of data must be opened to outside researchers; and there is unanimity among US lawmakers that regulation is needed.

Haugen is right

What strengthens Haugen’s position are the internal communications and research about Facebook that she leaked.

So far, evidence of harm has been seen in the results. For example, Senator Richard Blumenthal said his team created a fake Instagram account identifying as a 13-year-old girl, and after following “easily found” accounts on extreme diets, the social media service began recommending accounts that promote self-harm.

Now, a document leaked by Haugen shows that Facebook itself has reached similar conclusions about how its products drive people to extreme content. “Carol’s Journey to QAnon,” a case study cited in the documents, showed that a test user created by a Facebook researcher was exposed to polarizing content “within a day” after following conservative pages (Fox News, Donald Trump, etc.). The company’s recommendations algorithm “started including conspiracy recommendations after just two days,” and it took less than a week for a QAnon recommendation to appear.

The company’s own research weakens Facebook’s defense that the divisions that manifest on its products are merely a reflection of real-world factors.

Haugen adds that the company has dragged its feet in solving these technological challenges because it prioritizes growth over security. The company has denied this characterization, again suggesting that the security threats are due to divisions and polarizations that persist offline.

So is her solution

The former Facebook employee told lawmakers they need to step in to regulate the company. “If we had proper oversight, or if we reformed [Section] 230 to hold Facebook accountable for the consequences of their intentional ranking decisions, I think they would get rid of engagement-based ranking,” Haugen said. “Because it exposes teens to more anorexia content, it separates families and in places like Ethiopia, it literally stokes ethnic violence.”

How does an algorithm do this? Engagement-based ranking means that the part of the computer code that decides which posts you see on your screen is based on a prediction of what you are most likely to engage with — click, pause to read, or re-share.
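The idea can be illustrated with a minimal sketch. Everything below — the feature names, the weights, the scoring formula — is a hypothetical illustration of engagement-based ranking in general, not a description of Facebook’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    p_click: float    # predicted probability the user clicks
    p_dwell: float    # predicted probability the user pauses to read
    p_reshare: float  # predicted probability the user re-shares

def engagement_score(post: Post) -> float:
    """Combine predicted interactions into a single score.
    The weights are invented for illustration; re-shares are weighted
    highest because they spread content to new audiences."""
    return 1.0 * post.p_click + 0.5 * post.p_dwell + 3.0 * post.p_reshare

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order the feed so posts predicted to generate the most
    engagement appear first -- regardless of what the content is."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("calm-news", p_click=0.10, p_dwell=0.40, p_reshare=0.01),
    Post("outrage-bait", p_click=0.30, p_dwell=0.20, p_reshare=0.15),
])
print([p.post_id for p in feed])  # the provocative post ranks first
```

The sketch makes the critics’ point concrete: a ranker optimizing only predicted engagement will surface the more provocative post first whenever provocation drives clicks and re-shares, with no term in the formula accounting for harm.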

In essence, she suggests carving out an exception in a US law — Section 230 of the Communications Decency Act — to set aside the immunity companies enjoy for harm caused by their algorithms.

The suggestion is significant because Section 230, enacted in 1996, has come under intense scrutiny since online discourse began to translate into greater harm offline. Put simply, Section 230 states that companies are not liable for what their users post, even if they make good-faith editorial decisions.

But the solution is not easy

And this is where the challenge lies. Democrats and Republicans have both attacked Section 230 — Democrats say the law gives companies a free pass when they are slow to act on hateful content, while Republicans say companies crack down on free speech using the provision as a shield.

This fight largely concerns how content — user posts — is moderated. That is very different from the algorithms which Haugen says need more scrutiny.

An attempt to solve a technically complex problem could easily conflate the two, especially if lawmakers cannot set aside their partisan considerations.

As it stands, lawmakers have not shown sufficient awareness of the nuances involved. For example, Senator Blumenthal pushed an Instagram representative to pledge to “end finsta” — seemingly oblivious to what the term meant. A finsta is a secondary, pseudonymous account that young people create — and pseudonymity is a crucial tool for privacy.

Clear awareness, however, is only one of the challenges ahead in technology regulation.

Facebook, the leaked documents show, relies on a wealth of data about deep social interactions. This makes the second of the takeaways from Haugen’s testimony cited above important — independent researchers must have access to that data in order to identify what could be a large number of unforeseen issues that need to be kept in mind when designing any regulatory framework. This in itself will be difficult, as it involves sensitive user data and proprietary company information.

Another challenge will be to ensure that any regulation adding a compliance burden doesn’t end up entrenching Big Tech. Start-ups, which could challenge their dominance, are likely to struggle to bear the cost of compliance the way some of the world’s wealthiest companies can.

During Haugen’s testimony last week, it was clear that lawmakers have a strong desire to intervene. The tide could indeed be turning against Facebook. But it remains to be seen how far that tide will reach, or whether it will swell into an overwhelming tsunami.

In Perspective looks at current issues in technology and science, the visible and invisible factors at play, and their implications for the future

Opinions expressed are personal
