Instagram users can now worry a little less about offensive captions. The Facebook-owned company has trained an AI system to detect captions that may be offensive, with the aim of giving users a chance to pause and reconsider what they are about to say. Announcing the new feature on its blog, Instagram said it would be rolled out to select countries soon. This comes as welcome news, especially for people who have had to deal with cyberbullying before.
Over the last couple of years, cyberbullying has become rampant on social media platforms including Facebook, Instagram and YouTube. Any Instagram user with access to the tool who types an offensive caption will instantly receive a prompt noting that it resembles captions flagged for bullying. The tool then gives the user the option to edit the caption before publishing it. Beyond that, the system educates users on what is not allowed on Instagram and warns them when their account is at risk of breaking the rules.
At the start of the year, Instagram launched a similar feature that notifies users when a comment they are about to post on someone else's photo may be considered offensive. In its blog post, Instagram wrote, “Results have been promising and we’ve found that these types of nudges can encourage people to reconsider their words when given a chance.” The new AI system is expected to deliver similar results, thus reducing cyberbullying on Instagram.
Notifying users whenever a caption is offensive should also make it easier to reach individuals who promote images of self-harm. The latest move is seen as a way for Instagram to become more attentive to its users. If you do not yet have access to the new feature, you will need to wait until it is rolled out to your country of residence. Luckily, the rollout is already underway in some parts of the world, so it is only a matter of time before it reaches you.