Instagram isn’t exactly forthcoming about how it classifies “sensitive content” or what constitutes it. Last year, when Instagram introduced the Sensitive Content Control, the company defined sensitive content as “posts that don’t necessarily break our rules but could potentially be upsetting to some people — such as posts that may be sexually suggestive or violent.”
Over the next several weeks, Instagram will roll the changes out to all users, adding content controls for search, Reels, hashtag pages, “accounts you could follow,” and recommended posts in the feed.
Rather than letting users mute specific content topics, Instagram’s controls offer three options: one that shows less of that bucket of content, the default setting, and one that shows more sensitive content. Users under the age of 18 cannot select the last option.
According to a Help Center post that explains the sorts of content and accounts Instagram considers sensitive, some categories of content and accounts are allowed on the platform because they don’t violate its Community Guidelines, but they may not be eligible for recommendations. The Sensitive Content Control can be adjusted at any time to show more or less of the content that, in Instagram’s words, “impedes our ability to foster a safe community.” According to Instagram, these categories are:
- Content that may depict violence, such as people fighting. (We remove graphically violent content.)
- Content that may be sexually explicit or suggestive, such as pictures of people in see-through clothing. (We remove content that contains adult nudity or sexual activity.)
- Content that promotes the use of certain regulated products, such as tobacco or vaping products, adult products and services, or pharmaceutical drugs. (We remove content that attempts to sell or trade most regulated goods.)
- Content that may promote or depict cosmetic procedures.
- Content that may be attempting to sell products or services based on health-related claims, such as promoting a supplement to help a person lose weight.
In the graphics accompanying its blog post, Instagram emphasizes that “some people don’t want to see content about topics like drugs or firearms.” Instagram’s lack of transparency in defining sensitive content, and its choice not to give users more granular content management, is concerning — especially given its decision to lump sex and violence together as “sensitive.”
From our vantage point, it seems counterintuitive to assume that a user who dislikes seeing posts about weight loss schemes and diet culture would also dislike photos of people in see-through clothing. The end result is a tool that doesn’t give people a useful way to avoid content they’d rather not see in Instagram’s algorithmic surfaces, while nudging them to dial down a vaguely defined bucket of “adult” material.