Hacker News | n_kb's comments


It's likely that FB uses user responses to posts to decide how to prioritize them. However, several patents describe systems that analyze pictures with computer vision and decide on their importance before they are published.


Remember kids: just because a megacorp has a patent on something doesn't mean they are using that approach anywhere in production.


I think it's called "subtlety" but I'm no native speaker.


From your profile I gather that you are affiliated with the source, so I'd really be interested in what you mean by this.


We didn't have enough data to test these hypotheses. If more people contribute their data, we'll be able to test that: https://algorithmwatch.org/en/instagram-algorithm/


That's a major missing hypothesis that seriously undermines your conclusion.

Why do you need me to share data at all? Just create brand new accounts and test what they see in a more controlled manner. Then you'll be able to test what's recommended from the same accounts over time while building any specific profiles yourself.


That's against FB's and IG's terms of service, so as an investigative outlet, they can't really do that and then promote the results.




Of course it is. I'd even say that what we found is probably similar to the issue of offensive suggestions by search engines. A minority of Instagram users treat the platform as a free source of soft-porn images; their behavior is probably picked up by ML systems and amplified, and pictures featuring nudity are pushed to all users, in a vicious cycle. Just like search engines spread far-right conspiracies by suggesting them to millions of users after only a few thousand searched for them.
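The amplification cycle described above can be illustrated with a toy model. This is a sketch of the general feedback dynamic, not Instagram's actual ranking system; the function name, parameters, and numbers are all invented for illustration: a minority's strong engagement with one content type is read by the ranker as demand, which raises exposure for everyone, which produces more engagement, and so on.

```python
def simulate_feedback_loop(initial_exposure=0.05, boost=0.5, rounds=20):
    """Toy model of engagement-driven amplification.

    initial_exposure: starting fraction of the feed showing the content type
    boost: extra engagement the content type receives per unit of exposure,
           driven by the minority who actively seek it out
    rounds: number of ranking/feedback iterations to simulate
    """
    exposure = initial_exposure
    history = []
    for _ in range(rounds):
        # The minority engages disproportionately with this content type.
        engagement = exposure * (1 + boost)
        # The ranker interprets engagement as demand and raises exposure,
        # capped at the whole feed.
        exposure = min(1.0, engagement)
        history.append(exposure)
    return history

h = simulate_feedback_loop()
```

With these made-up numbers, exposure grows geometrically each round until the content type saturates the feed, even though the initial interest came from a small minority.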


But unlike "offensive suggestions", there is nothing wrong with showing skin.


Unless you don't want to, but you are obliged to use IG because your business focuses on a demographic on which IG has a monopoly (the 15–25 demographic in the EU).


That's where models, brand ambassadors, and influencers come in. It's not like lifestyle businesses just found out that sex sells.


I'm sure there are worse jobs than using Instagram.


There's a difference between using Instagram and having to wear a bikini on Instagram even if you don't want to.


Nothing worse than owning a car repair shop and having to force your mechanics to wear a bikini for Instagram ads. Imagine having to have that conversation with a bunch of short-tempered hairy dudes.


Nobody is "obliged" to use it.

You can also buy ads if you want to show your posts to people, like any other business does.


To be fair, the linked article does suggest that this effect occurs for ads as well as organic posts.


Where's the suggestion that it applies to ads? The only coverage is two unsourced instances of the ad creative review algorithm falsely flagging the content.


How confident are you that Instagram’s algorithm isn’t just optimizing for that particular user? And that it’s just that most users of Instagram seem to engage with scantily clad photos more than other photos?


You might be surprised.


Look at the data file, which is linked to in the article and contains the suggestions. It's very likely that at least some of them are automatically generated from content found on the web.

