
Facebook struggles with the features that define social media

SAN FRANCISCO – In 2019, a group of Facebook researchers began a new study to analyze one of the fundamental characteristics of the social network: the “like” button.

They studied what people would do if Facebook removed the distinctive thumbs-up icon and other emoji reactions from posts on Instagram, its photo-sharing app, according to company documents. The researchers found that the buttons had at times caused “stress and anxiety” in younger Instagram users, especially when their posts didn’t get enough likes from friends.

The results were mixed, however. When the like button was hidden, users interacted less with posts and ads. Hiding it did not alleviate teens’ social anxiety, and young users did not share more photos, as the company had expected they would.

Mark Zuckerberg, Facebook’s chief executive, and other managers debated hiding the like button from more Instagram users, according to the documents. In the end, a larger test was rolled out in a limited capacity to “build a positive press narrative” around Instagram.

The research on the like button was an example of how Facebook has questioned the essential features of social networking. As the company has faced crisis after crisis over misinformation, privacy and hate speech, a central question has been whether the basic way the platform works is to blame — in essence, whether the features that have made Facebook Facebook are the problem.

In addition to the like button, Facebook has scrutinized its share button, which lets users instantly broadcast content posted by other people; its groups feature, which is used to form digital communities; and other tools that shape how more than 3.5 billion people behave and interact online. The research, laid out in thousands of pages of internal documents, underscores how the company has repeatedly grappled with what it has created.

What the researchers discovered was often far from positive. Time and again, they found that people misused key features or that those features amplified toxic content, among other effects. In an internal memo from August 2019, several researchers said that Facebook’s “core product mechanics” — that is, the basics of how the product worked — were what allowed misinformation and hate speech to flourish on the site.

“The mechanics of our platform are not neutral,” they concluded.

The documents — which include slide presentations, internal discussion threads, charts, memos and more — do not show what actions Facebook took after receiving the findings. In recent years, the company has tweaked some features, making it easier for people to hide posts they don’t want to see and turning off political group recommendations to reduce the spread of misinformation.
