YouTube has tightened its harassment policy in response to allegations that it allows racist and homophobic abuse on its platform. Now it has to actually enforce the new rules.

What happened? On Wednesday, YouTube announced an update to its harassment policy: material that “maliciously insulted or demeaned others because of their race, gender or sexual orientation” will now be removed. The update also bans “veiled or implied” threats and “language suggesting physical violence may occur.”

Why the change: The update comes about six months after YouTube came under fire for refusing to ban Steven Crowder, a right-wing personality who had used racist and homophobic language against a Vox journalist on his channel. At the time, YouTube said Crowder’s words did not breach its policies; this update looks to be a response to the backlash that followed.

Okay, so how will it do this? With a sprinkling of AI, but mostly with thousands of newly hired human moderators who will watch videos and flag problematic content. YouTube’s track record of actually enforcing its own policies, however, is not great.

And haven’t there been issues with using moderators? Yep. Earlier this year, an investigation by The Verge detailed severe mental-health problems among Facebook’s contract moderators; another, by the Washington Post, found YouTube moderators suffering similarly. Reddit has tried to lighten the load on human moderators with its rule-based “AutoModerator” bots, but keyword matching is a blunt instrument, and flagged content still requires human review.
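To see why such filters can only triage content rather than make final calls, here is a minimal sketch of rule-based flagging in Python. The patterns and function names are hypothetical illustrations, not Reddit’s actual AutoModerator (which subreddit moderators configure in YAML):

```python
import re

# Hypothetical keyword rules for illustration only; real systems use
# much larger, community-specific rule sets.
RULES = [
    re.compile(r"\bbuy followers\b", re.IGNORECASE),  # spam
    re.compile(r"\bgo die\b", re.IGNORECASE),         # harassment
]

def flag_for_review(post_text: str) -> bool:
    """Return True if any rule matches, queueing the post for a human moderator."""
    return any(rule.search(post_text) for rule in RULES)

print(flag_for_review("Totally legit: buy followers here!"))  # True  -> review queue
print(flag_for_review("Have a nice day"))                     # False -> published
```

Pattern matching like this has no sense of context: it misses a politely worded “veiled or implied” threat and can flag innocuous posts, which is why matches typically go into a human review queue rather than triggering automatic removal.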

The other YouTube policy hiccup affects kids: In September, YouTube reached a $170 million settlement with the Federal Trade Commission for illegally collecting and using data on what children were watching, a violation of the Children’s Online Privacy Protection Act. YouTube was required to create a labeling system for children’s videos; if a video is aimed at kids, its creator can no longer make money from personalized ads on it or target viewers on the basis of their watch history.

But what exactly is kids’ content? That’s what YouTube and a ton of creators, worried they will lose income, want to know, especially for content like unboxing videos or animations that seem aimed at kids but also have adult appeal. On Wednesday, YouTube sent a letter to the FTC asking for clarity, calling the rules “complex.” Expect a lot of legal tussling and not a lot of clarity in the months to come.