New Instagram AI Feature Targeted at Online Bullying

Instagram now warns people before they leave mean comments.
 
The company announced a new AI feature on July 8 that warns users, as they’re typing a comment, if the content might be offensive, giving commenters a chance to reconsider before posting. Instagram also plans to test a feature called “restrict,” which lets users limit people who leave hurtful comments without those people’s knowledge.
 
Instagram has long been under pressure to address widespread bullying on the platform. In 2015, 12-year-old Kennis Cady died by suicide, and her mother said she believes bullying on Instagram contributed to her daughter’s poor mental state before her death.
 
An Instagram spokesperson told The Atlantic in September 2018 that “there is no place for bullying” on the platform.
 
“Any form of online abuse on Instagram runs completely counter to the culture we’re invested in…Everyone should feel safe…sharing their lives through photos and videos.”
 
According to a 2017 survey by a British anti-bullying organization, 42% of cyberbullying victims aged 12 to 20 reported being bullied on Instagram.