Twitter is testing content warnings for individual tweets

Twitter is testing a new feature that lets users add specific content warnings to individual photos and videos they share in tweets. The platform confirmed that the feature will be available to some users during testing.

The platform already offers a way to add content warnings, but only as an account-wide setting that applies to every tweet.

In other words, every photo or video you post carries a content warning, whether or not it actually contains sensitive material.

With the feature now being tested, you can add a warning to an individual tweet and assign a specific category to that warning.

As shown in a video Twitter posted, when editing a photo or video you can add a content warning by tapping the flag icon in the lower-right corner of the toolbar.

On the next screen, you can choose a category for the warning, such as "nudity," "violence," or "sensitive." Once the tweet is posted, the photo or video appears blurred, with a content warning explaining which category was selected. Users who want to view the content can tap through the warning.

As is currently the case, if you post sensitive content without flagging it yourself, the platform relies on user reports to determine whether your content should carry a warning.

The new option would replace the current all-or-nothing setting, which applies content warnings to every tweet an account posts.

Alongside the content-warning experiment, Twitter has also announced that it is redesigning its reporting process to be more human-centered.

Instead of asking users which rule a tweet violated, the new process lets users describe in their own words what happened, and uses that description to identify the specific violation.

"People use the platform to discuss what's going on in the world, which sometimes means sharing disturbing or sensitive content," the company said. We're testing an option for some of you to add a one-time alert to photos and videos you post to help those who might need an alert.
