Facebook is adjusting its News Feed algorithm again. The company announced on Thursday that it is testing new controls that let users reduce how often unwanted content appears in favor of what interests them.
So far, testing is limited to a small sample of users, who can choose to see content from specific friends, groups, and pages they like more often. They can also choose to see less of the content they don't want.
New post display
The test is currently available only in the Facebook app for English-speaking users. It adds three categories to the menu for managing what appears in the News Feed: Friends and Family, Groups, and Pages and Public Figures.
As part of the test, users can use drop-down menus to choose whether to keep the amount of posts shown from each category at a "normal" level, or to increase or reduce it according to their preferences.
Users in the test can also indicate topics they are interested in, or topics they would rather not see. Facebook said in a blog post that the test will initially cover a "small percentage of people" worldwide and will gradually expand over the next few weeks.
Other changes on Facebook
This is not the first time Facebook has modified how posts are displayed to favor specific content. An update launched earlier this year gave users more control over their feed through the "Favorites" and "Recent" filters.
According to The New York Times, Facebook also changed its algorithm last year to show more content from credible publishers such as The New York Times, CNN, and NPR, and less from sources that spread false information.
In addition, Meta, which owns Facebook and several other apps, will make existing controls easier to access, including Favorites, Snooze, Unfollow, and Reconnect.
News for advertisers
Companies and brands stand to gain greater control over the types of topics their ads appear alongside. This is part of an expansion of topic exclusion controls, which are currently available only to a limited number of advertisers running English-language ads.
The new controls will allow advertisers to prevent their ads from appearing alongside topics such as "News and Politics," "Crime and Tragedy," and "Debated Social Issues." For example, if advertisers choose to distance themselves from controversial topics, their ads will not be shown to people who are interested in those topics.
Facebook's algorithms are notorious for amplifying controversial content and dangerous misinformation. As a result, Facebook and its parent company Meta are under increasing pressure from regulators to be more transparent in this regard.
While Congress considers measures that would give users more control over what they see and remove at least some of the ambiguity around how the algorithms work, Facebook likely hopes it still has enough time to regulate itself.