Instagram has said on several occasions that it is keen to keep its young users safe on the app. To that end, anyone who joins below the age of 16 will, from now on, be placed into a new, more restrictive setting.
This change will be applied by default and includes a stricter content filter for anyone below that age. The company is also recommending that teens already on the app switch to the same setting for their own protection.
The change was confirmed this week in an updated blog post from the social media platform.
Instagram renamed its existing content settings earlier this summer, introducing the subcategories Less, Standard, and More. These let users limit content that could be controversial or age-sensitive.
Common examples include violence, sexually explicit material, health, and cosmetic surgery, among other themes. Only users aged 18 and above are allowed access to the More category.
As one would expect, the latter is the most open, least filtered version of the app available to the public.
Accordingly, anyone below the age of 16 will have the Less subcategory enabled by default. That will significantly change what appears across the app, including Reels, the Feed, Search, and Recommendations.
The app will also go a step further by prompting a new settings checkup. Through it, users can limit who they share content with and what types of content are involved.
There will be restrictions on who is allowed to send them Direct Messages and what content types their followers can see. Users will also be given a heads-up about a new feature that limits time spent on the app, usually delivered in the form of an update prompt.
A company spokesperson said that one common type of content that won't be visible is political themes and violent protests, as the company feels young users don't need that exposure at this stage.
It was also noted that the content settings only apply to accounts that teenagers aren't currently following.
We feel the change is a substantial and welcome initiative, especially after so many lawsuits and studies have highlighted Instagram's negative effects on young minds.
The company says it has been working hard to give users the best protection possible, and it sees this as the best way to do so.
Meanwhile, mental health experts have warned parents not to grow complacent or rely too heavily on such settings, because parental controls can only go so far. They are not the ultimate solution, and a little supervision is always wise when a child is using the app.
Read next: Instagram May Want Reels To Be The Next Big Thing But Data Shows It Makes Up Only 22% Of The App’s Content