Twitter is rolling out prompts that ask users to review replies that are “potentially harmful or offensive” before sending them, the social media company said Wednesday in a blog post.
The prompts give users the option to edit or delete the reply, or to send it as is.
The rollout follows tests Twitter began last year of a feature that encouraged users to pause before hitting send on such replies.
During those tests, Twitter said, 34 percent of people who received the prompt revised their initial reply or decided not to send it at all. And after being prompted once, people composed 11 percent fewer offensive replies on average going forward, according to Twitter.
Twitter said it has updated the feature since the initial test, including adjusting when and how it sends the reminders.
For example, the tool will consider the “nature of the relationship” between the original tweet’s author and the person who is replying.
“If two accounts follow and reply to each other often, there’s a higher likelihood that they have a better understanding of preferred tone of communication,” Twitter said.
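Twitter has not published details of how this works. Purely as illustration, the logic it describes could look something like the following sketch, in which a potentially offensive reply is held to a higher bar for prompting when the two accounts follow and frequently reply to each other. All names, fields, and thresholds here are hypothetical assumptions, not Twitter’s actual implementation.

```python
# Hypothetical sketch of a relationship-aware prompting heuristic.
# Names, thresholds, and data fields are illustrative assumptions,
# not Twitter's actual system.

from dataclasses import dataclass


@dataclass
class RelationshipSignals:
    mutual_follow: bool      # do the two accounts follow each other?
    recent_reply_count: int  # replies exchanged over a recent window


def should_prompt(offense_score: float,
                  signals: RelationshipSignals,
                  base_threshold: float = 0.7) -> bool:
    """Decide whether to show a "review your reply" prompt.

    A reply flagged as potentially offensive is less likely to trigger
    a prompt when the two accounts interact often, on the theory that
    they better understand each other's preferred tone.
    """
    threshold = base_threshold
    if signals.mutual_follow and signals.recent_reply_count >= 5:
        # Frequent mutual interaction: require a stronger offense
        # signal before interrupting the user with a prompt.
        threshold += 0.2
    return offense_score >= threshold


# Example: a borderline reply between two accounts that talk often
signals = RelationshipSignals(mutual_follow=True, recent_reply_count=12)
print(should_prompt(offense_score=0.75, signals=signals))  # False: no prompt
```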
The platform also made adjustments to its technology to “better account for situations in which language may be reclaimed by underrepresented communities and used in non-harmful ways.”
Twitter said it’s launching the prompts on iOS and Android devices, starting with accounts that have enabled English-language settings.
The feature has also been updated to make it easier for people to tell Twitter whether they found the prompt helpful or relevant.