Instagram will soon let users be the judges of what is considered "offensive content" on their accounts.
(CCM) — On Monday, Instagram announced that it would be stepping up its anti-harassment protocol by giving users the option to control what is displayed in their comments. Instagram already had guidelines in place for what it considers offensive material, which can be removed from the platform at the company's discretion. Those rules will still apply, but this move marks the first time the photo-sharing platform will let users themselves decide what they consider abusive content. According to The Washington Post, the update would allow users to remove offensive comments from their own accounts on a post-by-post basis, or to disable comments on their uploads entirely.
"Our goal is to make Instagram a friendly, fun and, most importantly, safe place for self-expression," said Instagram's Head of Public Policy, Nicky Jackson Colaco, in a statement. Colaco specified that the feature would first be rolled out to "high-volume" accounts, with expansion to average-volume accounts to follow. "As we learn, we look forward to improving the comments experience for our broader community." Comment moderation capabilities are expected to reach the general public in the coming months.
Image: © Bloomua - Shutterstock.com