Facebook and Instagram explain their content recommendation guidelines and point out what types of posts are banned
For the first time, Facebook and Instagram have decided to shed some light on the guidelines they follow when recommending content to their respective user communities. It is a perennially controversial topic, largely because of the opacity surrounding both social networks' suggestion systems.
“We make personalized recommendations to people who use our services to help them discover new communities and content,” they explain from both Facebook and Instagram. The key is: how?
Facebook and Instagram have chosen to explain what will never be recommended, rather than detail how suggested content is chosen
According to Guy Rosen, a Facebook executive, personalized suggestions are based "on the content in which you have expressed interest and the actions you take in our applications". He explained this in a post, published and then deleted in recent hours, that Engadget had picked up.
What is not recommended on Facebook and Instagram
Both the page explaining Facebook's recommendations and the one explaining Instagram's focus on what will not be suggested, rather than on the details of the content that actually gets circulated among users.
On both platforms, the following will not be recommended: content about self-harm, content that may depict violence, sexually explicit or suggestive content such as "photos of people in see-through clothing", content that promotes the use of certain regulated products such as medicines, and content from an account or entity deemed "not recommendable" because of the kind of posts it makes. All of this content is allowed on Facebook and Instagram, but under their rules it is not eligible for recommendation.
Both Facebook and Instagram, they explain, will never recommend sexually explicit or suggestive content such as "photos of people in see-through clothing".
In Facebook's case, the explanation of the basic standards used for recommendations is slightly more specific than Instagram's, and it notes that they build on the strategy the company has used since 2016 to deal with problematic content: remove, reduce and inform.
The plan involves removing content that violates its community rules, such as the fraudulent interactions that occurred on the page of the Ministry of Health of the Government of Spain; reducing the spread of problematic content that does not directly violate its rules, such as disinformation about the coronavirus; and informing users with additional context, such as fact checks, that helps them better judge the kind of content they consume.
Although they do not delve into how the algorithms work or how they are configured, these guidelines, now revealed for the first time, can serve as a reference when judging what Facebook and Instagram do and do not do with the content they promote.