New Delhi, Dec 4: To encourage respectful conversations, YouTube is launching a new feature that will warn users when their comment may be offensive to others, giving them a chance to reflect before posting. From the reminder, the commenter can either post the comment as is or take a few extra moments to edit it first. The notification will appear before a comment that YouTube's AI-based systems deem potentially offensive is published.

Johanna Wright, Vice President of Product Management at YouTube, said that to help creators better manage comments and connect with their audience, the company will also test a new filter in YouTube Studio for potentially inappropriate and hurtful comments that have been automatically held for review.

"So that creators don't ever need to read them if they don't want to. We'll also be streamlining the comment moderation tools to make this process even easier for creators," Wright said in a blog post on Thursday.

"We'll then look closely at how content from different communities is treated in our search, discovery and monetisation systems. We'll also be looking for possible patterns of hate, harassment, and discrimination that may affect some communities more than others," she added. "In the last quarter, of the more than 1.8 million channels we terminated for violating our policies, more than 54,000 terminations were for hate speech," the company said.

Starting in 2021, YouTube will ask creators, on a voluntary basis, to share their gender, sexual orientation, race and ethnicity with the platform. YouTube also revealed that since early 2019, it has increased the number of daily hate speech comment removals by a factor of 46.

(The above story first appeared on LatestLY on Dec 04, 2020 01:29 PM IST. For more news and updates on politics, world, sports, entertainment and lifestyle, log on to our website latestly.com).