Facebook hurts its users because that’s where its profits are

Thinking about Facebook and what to do with it means grappling with two conflicting sets of facts. The first is that Facebook is an extremely useful platform for communication, news, and commerce that billions of people around the world rely on. The other is that Facebook is a highly addictive for-profit entity that exploits and manipulates human psychology to make money, with disastrous results for the rest of society.

The company is back in the news after a week of hell, thanks to explosive revelations from a former employee who grew disillusioned and leaked a trove of internal documents to the public. Through an ongoing Wall Street Journal series based on the leak, a 60 Minutes interview, and an appearance before Congress, whistleblower Frances Haugen's case boils down to this: Facebook is well aware of the various misdeeds and dangers of its platforms, but has failed to rectify them because doing so would conflict with the continued growth of its business and profits.

One report found that the company's own researchers had determined that Instagram, which Facebook owns, had a psychologically damaging effect on teenage girls, even as the company publicly denied it and worked on a version of Instagram for children under thirteen. Another found that the company's 2018 redesign of its algorithm to promote "meaningful social interactions," or MSI, instead boosted posts and content built on outrage, social division, violence, and bullshit. Others show that Facebook deliberately targeted children and looked for ways to hook them on the product early, and that it dragged its feet on removing posts it knew came from drug cartels and human traffickers.

A number of solutions have been offered to fix Facebook's problems, such as using antitrust law to break it up, amending Section 230 to allow tech companies to be sued over material posted on their platforms, or, as Haugen suggested to Congress, demanding more transparency from the company and creating a regulatory oversight body. But much of what the documents reveal adds weight to the argument that the company, and other de facto monopolies like it, should be treated as a public utility or even owned by the state.

What the documents clearly indicate is that, as Haugen told Congress, when Facebook encounters a conflict between its profits and people's safety, it "has systematically resolved those conflicts in favor of their own profits." Facebook knows its platforms are bad for kids, but in order to keep growing, it needs to hook those kids into its user base as future adults, and it needs them to draw their parents into the fold. "They are a valuable but untapped audience," says a 2020 internal paper on tweens, with the company studying them, imagining new products to capture them, and discussing the idea of "playdates as a lever for growth."

Facebook understands that boosting MSI can fuel division, vitriol, and all kinds of unhealthy behavior among its users, but not doing so means less engagement and, therefore, potentially less profit. When an employee suggested tackling misinformation and anger by removing the priority the algorithm gives to content re-shared through long chains of users, Mark Zuckerberg, she wrote, would not do so if "there was an important trade-off with the MSI impact." When researchers suggested the company change its algorithm so that it no longer sent users down ever deeper, more extreme rabbit holes, such as an interest in healthy recipes quickly leading to pro-anorexia content, senior executives ignored them for fear of limiting user engagement.

“A lot of the changes I’m talking about won’t make Facebook an unprofitable business,” Haugen told Congress this week. “It just won’t be a ridiculously profitable business like it is today.”

Much like the companies that drive sales by making devices that are destined to break down and stop working after a few years, it’s Facebook’s thirst for growth and bigger profits that explains its reluctance to act responsibly. It would seem obvious to remove these incentives from the equation, especially with such platforms assuming the status of “natural monopolies” like railways and telecommunications.

If a company is state-owned, or simply a tightly regulated utility, it does not need to operate under the capitalist logic of growth and relentless profit-seeking that has fueled these problems, nor does it need to survive if its user base no longer needs or cares about it. The fact that the platform is going out of fashion among young people and is used mainly by people over thirty might be a problem for Mark Zuckerberg, Facebook's private owner, but it is not much of a problem for a public service that a government reluctantly nationalized because of how much its users have come to depend on it. In fact, it sounds like a ready-made solution for a platform that, for most of us, is addictive and unhealthy at best.

If younger generations don't care about Facebook's survival, why should we force them to think otherwise? If people are happier when they are persuaded to unfollow everything and empty their news feeds, why should we fight them, as Facebook recently did with the creator of a tool that let them do exactly that?

Of course, there are plenty of practical issues that would have to be ironed out. For one thing, Facebook may be an American company, but its utility-like services are provided all over the world. So there are real questions about what a public or regulated Facebook would look like, questions like "Whose public?" or "Regulated by whom?"

Likewise, strict oversight and democratic control would have to be built into any such agenda, lest the exploitation and manipulation carried out by the platform simply be transferred from the private sector to the government. (Remember, though, that through their surveillance programs and cyber operations, Washington and other governments already use platforms like Facebook to collect and store data on the world's users and to manipulate the information they see, all without taking any part of them into public ownership.)

But if the exact solution is not yet clear, what is clear is that the current state of affairs is untenable. Beyond the issues highlighted by the Haugen leak, we have long known that social media platforms and other technological innovations are mentally unhealthy for us, having been deliberately engineered to be so addictive that the software engineers and tech moguls responsible avoid using their own creations. There may be a way to keep social media and its most useful features in our lives while getting rid of its more malignant ones; or it may all turn out to be a mistake, fundamentally incompatible with how the human brain works. But to find out, we need to at least try something different.

Unfortunately, this is not the solution that the Wall Street Journal series, most of Congress, and other media outlets seem to be pointing toward in reaction to the leak. Not surprisingly, the news has sparked calls for more intense "content moderation," that is, censorship, by these tech companies as a way to prevent the spread of all kinds of disinformation or to stop the platforms from "enabling dangerous social movements," as Haugen accused them of doing.

Ironically, this is despite the fact that the documents themselves show the folly of censorship as a solution to these problems. The very first story in the Journal's series tells how Facebook created a "whitelist" of tens of thousands of high-profile accounts, shielding them from enforcement for posting the sort of thing that would get other users censored, suspended, or permanently banned. All the while, the company's censors have come down hard on lower-level users, removing completely innocuous posts or posts whose message they misinterpreted, including those of Arabic-language media outlets and activists during the Israeli crackdown on Palestinians earlier this year. These platforms have repeatedly shown that they cannot be trusted to moderate content accurately and responsibly, as documented this week in a Human Rights Watch report on Facebook's removal of Palestinian content.

This response is in line with a long-standing trend of what Olivier Jutel has called technological reductionism: the belief that tech companies and their products are not only harmful and unhealthy for us, but responsible for every bad thing that has happened over the past few years. In deciding how to deal with this problem (and choosing censorship as the way forward), we risk overlooking the broader political, economic, and social factors driving the turmoil of our world today, attributing it instead to the almost mystical power of social networks. Was Facebook really singularly responsible for the January 6 Capitol riot? Or was it just one of many useful tools that let participants organize for the event, participants motivated by a combination of economic dislocation and lies about the election widely disseminated by elites and mainstream media?

Haugen told the Journal that her motivation for coming forward was watching a liberal friend of hers get carried away by sinister delusions she described as "a mixture of occultism and white nationalism" after he spent "more and more time reading online forums" (not Facebook, oddly enough), a descent that ultimately ended the friendship. Yet people regularly encounter or consume propaganda, let alone merely use social media and the internet, without going down a similar path. Sadly, we never learn what underlying factors caused Haugen's friend to be sucked into this miasma of lies, or what later led him to renounce those beliefs. Disinformation has always been rife in the world; finding answers to these questions would help us understand why it seems particularly powerful right now.

Avoiding mass censorship efforts does not mean we are powerless to do anything. There are clear changes that could be made to Facebook's algorithms, design, core mission, and resources that would bring it closer to the genuine public service it claims to be rather than the nihilistic, profit-driven juggernaut it actually is, and none of them would threaten our right to speak freely or interfere with our ability to stay in touch with loved ones, organize events, or use the other genuinely useful features of these platforms. Who knows, we might even want to log off from time to time.


