It claimed that it had been acting against content that glorified extremism and racism since 2017. YouTube had faced problems that year, when major advertisers began pulling their ads in March after a report stated the ads were appearing alongside videos containing racist, sexist, extremist and anti-Semitic content.
In a blog post on Wednesday, YouTube said: "Today, we're taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status."
Over the last few days, Carlos Maza, a gay journalist, has complained that videos attacking him were posted on the site and not removed, despite his well-documented reports to YouTube.
The post also said the site would act "to limit recommendations of borderline content and harmful misinformation, such as videos promoting a phony miracle cure for a serious illness, or claiming the earth is flat".
Channels that violate these and other rules would be "suspended from the YouTube Partner program, meaning they can’t run ads on their channel or use other monetisation features like Super Chat".