Facebook Explores Ranking Some Key Content Categories Differently

Facebook has been exploring how and whether to rank some important categories of content differently, such as news, politics, and health, to make it easier to find posts it deems valuable and informative.

Last year, Facebook tested the concept around COVID-19 and U.S. voting information.

The platform created information hubs with links and resources from official sources, and promoted these at the top of people’s News Feeds.

More than 600 million people clicked through to credible sources of information about COVID-19 via Facebook and Instagram, and an estimated 4.5 million Americans were helped to register to vote.

The news was buried deep in a post by Facebook Vice President of Global Affairs Nick Clegg on Medium earlier this week in which he blasts The Atlantic for comparing Facebook to a Doomsday Machine, “a device with the sole purpose of destroying all human life.”

Clegg wrote that the relationship between the user and an algorithm is not as transparent as Facebook would like, and that this needs to change to give users a better understanding of how the ranking algorithms work and why they make particular decisions.

“You should be able to talk back to the algorithm and consciously adjust or ignore the predictions it makes — to alter your personal algorithm in the cold light of day, through breathing spaces built into the design of the platform,” Clegg wrote.

Earlier this week, Facebook did launch a new tool that lets people control who can comment on their public posts, giving them more say over what they share to the News Feed. It marks a change in the company's thinking.

While some may believe the algorithms control the user rather than the user controlling the algorithms, the social media platform gained visitors during the year of the pandemic.

In the fourth quarter of 2020, Facebook supported nearly 2.8 billion monthly active users, up from 2.6 billion in the first quarter of that year, according to Statista.

During the last reported quarter, the company stated that 3.14 billion people used at least one of its products each month, including Facebook, WhatsApp, Instagram, and Messenger.

Less than 45% of Gen Z users on Snapchat, 38% on TikTok, 39% on Facebook, and 31% on Twitter believe they will spend daily time on these platforms in the next five years.

The data was gleaned from Sweety High, an integrated media company tailored to Gen Z.

The survey was fielded between February 24 and March 5, with 8,805 participants between the ages of 13 and 24.

Clegg also referenced the Netflix documentary "The Social Dilemma," as well as the book "The Age of Surveillance Capitalism," written by Harvard social psychologist Shoshana Zuboff.

“In each of these dystopian depictions, people are portrayed as powerless victims, robbed of their free will,” he writes. “Humans have become the playthings of manipulative algorithmic systems. But is this really true? Have the machines really taken over?”

Facebook’s personalized News Feed is built by the user’s own choices and actions. It is made up primarily of content from the friends and family the user has chosen, along with the Pages they follow and the Groups they have joined. Ranking is the process of using algorithms to order that content.

Companies like Facebook must be open about how the relationship between users and the platform’s major algorithms really works, and they need to give users more control over the types of content and ads they see.

Every piece of content, including posts from friends, Pages followed, and Groups joined, whether the user ultimately sees it or not, goes through the ranking process.

Thousands of signals are assessed for each post, such as who posted it, when it was posted, whether it’s a photo, video, or link, how popular it is on the platform, and the type of device the user is on.

The algorithm uses these signals to predict how likely each post is to be relevant and meaningful to the user, as well as how likely the person is to “like” it or find that viewing it was worth their time.
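As a rough illustration only, and not Facebook's actual system, the kind of ranking described here can be sketched as a scoring step followed by a sort: each candidate post carries a few signals, a weighted combination of those signals produces a predicted relevance score, and the feed is ordered by that score. All signal names and weights below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    content_type: str       # "photo", "video", or "link"
    popularity: float       # e.g., normalized engagement on the platform
    author_affinity: float  # how often the viewer interacts with this author
    hours_old: float

# Hypothetical hand-set weights; a real system learns thousands of signals.
WEIGHTS = {
    "author_affinity": 3.0,
    "popularity": 1.5,
    "recency": 2.0,
    "content_type": {"photo": 0.5, "video": 0.8, "link": 0.2},
}

def predicted_relevance(post: Post) -> float:
    """Combine a handful of signals into a single relevance score."""
    recency = 1.0 / (1.0 + post.hours_old)  # newer posts score higher
    return (
        WEIGHTS["author_affinity"] * post.author_affinity
        + WEIGHTS["popularity"] * post.popularity
        + WEIGHTS["recency"] * recency
        + WEIGHTS["content_type"].get(post.content_type, 0.0)
    )

def rank_feed(candidates: list[Post]) -> list[Post]:
    """Order candidate posts by predicted relevance, highest first."""
    return sorted(candidates, key=predicted_relevance, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("close_friend", "photo", popularity=0.2, author_affinity=0.9, hours_old=2),
        Post("news_page", "link", popularity=0.8, author_affinity=0.3, hours_old=1),
        Post("group_member", "video", popularity=0.5, author_affinity=0.1, hours_old=12),
    ])
    for p in feed:
        print(f"{p.author}: {predicted_relevance(p):.2f}")
```

In practice the predictions come from machine-learned models rather than fixed weights; the sketch only shows the shape of the process the article describes.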

The goal is to make sure the user sees what they find most meaningful — not to keep them glued to their smartphone for hours on end.

“It’s not in Facebook’s interest — financially or reputationally — to continually turn up the temperature and push users towards ever more extreme content,” he wrote, likening the ranking process to a spam filter in an email inbox.
