It's likely that FB uses user responses to posts to decide how to prioritize them. However, several patents describe systems that analyze pictures (using computer vision) and make decisions about their importance before they are published.
That's a major missing hypothesis that seriously undermines your conclusion.
Why do you need me to share data at all? Just create brand-new accounts and test what they see in a more controlled manner. Then you'll be able to test what's recommended to the same accounts over time while building any specific profiles yourself.
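A rough sketch of what that controlled experiment could look like. Note this is entirely hypothetical: there is no public Instagram API for reading recommendations, so `fetch_recommendations` below is a placeholder stub returning simulated data; a real study would drive instrumented browser sessions behind that function.

```python
# Hypothetical sketch of the sock-puppet experiment proposed above:
# several brand-new accounts with no history, identical (empty) behavior,
# logging which content categories each is recommended over a month.
import random
from collections import Counter

def fetch_recommendations(account_id: str, day: int, k: int = 10) -> list[str]:
    """Placeholder stub: returns k content categories shown to an account.

    In a real experiment this would scrape the account's actual feed via
    an instrumented browser session. The categories and weights here are
    invented purely so the sketch runs.
    """
    random.seed(hash((account_id, day)) % 2**32)
    return random.choices(["nudity", "food", "travel", "cars"],
                          weights=[30, 25, 25, 20], k=k)

accounts = [f"fresh_{i}" for i in range(5)]    # brand-new profiles
log: dict[str, Counter] = {a: Counter() for a in accounts}

for day in range(30):                          # observe feeds over 30 days
    for a in accounts:
        log[a].update(fetch_recommendations(a, day))

# With identical empty profiles, category shares should be comparable
# across accounts; systematic drift between accounts would indicate
# per-account profiling rather than a global ranking bias.
for a in accounts:
    total = sum(log[a].values())
    shares = {c: round(n / total, 2) for c, n in log[a].most_common()}
    print(a, shares)
```

The point of the design is that you control the profiles from day one, so any skew you observe can't be blamed on the donated accounts' prior behavior.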
Of course it is. I'd even say that what we found is probably similar to the problem of offensive suggestions in search engines. A minority of Instagram users treat the platform as a free source of soft porn; their behavior is picked up by ML systems and amplified, and pictures of nudity are pushed to all users, in a vicious cycle. Just like search engines spread far-right conspiracies by suggesting them to millions of users after a few thousand searched for them.
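The vicious cycle is easy to reproduce in a toy model. This is not Instagram's actual ranking system and every number below is invented; it just shows how a naive engagement-maximizing feedback loop can let a 10% minority's preferences dominate the feed shown to everyone.

```python
# Toy model: a feed ranks two content types purely by observed engagement
# and shows the winning type more to ALL users next round. A small minority
# engages heavily with "racy" posts; everyone else is indifferent between
# the two types. All parameters are invented for illustration.
import random

random.seed(0)

N_USERS = 1000
MINORITY_FRAC = 0.10     # fraction of users who strongly prefer racy posts
ROUNDS = 15

def engagement_prob(is_minority: bool, kind: str) -> float:
    if kind == "racy":
        return 0.95 if is_minority else 0.20
    return 0.20          # baseline engagement with neutral posts

share_racy = 0.10        # initial share of racy posts in everyone's feed
for _ in range(ROUNDS):
    clicks = {"racy": 0, "neutral": 0}
    shows = {"racy": 1, "neutral": 1}   # start at 1 to avoid div-by-zero
    for u in range(N_USERS):
        is_minority = u < N_USERS * MINORITY_FRAC
        kind = "racy" if random.random() < share_racy else "neutral"
        shows[kind] += 1
        if random.random() < engagement_prob(is_minority, kind):
            clicks[kind] += 1
    # Naive engagement-maximizing update: boost whatever gets clicked more,
    # for the whole user base, not just for the users who clicked.
    rate = {k: clicks[k] / shows[k] for k in clicks}
    share_racy = min(0.95, share_racy * rate["racy"] / max(rate["neutral"], 1e-9))

print(f"share of racy posts shown to everyone: {share_racy:.2f}")
```

Even though 90% of the simulated users have no preference for racy content, its share of the feed climbs from 10% to the cap, because the minority's clicks raise its aggregate engagement rate and the ranker generalizes that signal to everyone.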
Unless you don't want to, but are obliged to use IG because your business targets a demographic on which IG has a monopoly (the 15-25 demographic in the EU).
Nothing worse than owning a car repair shop and having to force your mechanics to wear bikinis for Instagram ads. Imagine having that conversation with a bunch of short-tempered hairy dudes.
Where's the suggestion that it applies to ads? The only coverage is two unsourced instances of the ad creative review algorithm falsely flagging the content.
How confident are you that Instagram's algorithm isn't just optimizing for that particular user, and that most Instagram users simply engage with scantily clad photos more than other photos?
Look at the data file, which is linked in the article and contains the suggestions. It's very likely that at least some of them are automatically generated from content found on the web.