Because some people aren't interested in their children learning the truth. For example, half of Americans apparently don't want schools to talk about the ongoing impacts of slavery and racism. https://www.edweek.org/teaching-learning/half-of-americans-dont-think-schools-should-teach-about-racisms-impact-today/2022/02
I could easily find more examples of laws and parents demanding that children not be taught the truth. And the truth is exactly what children should be learning in school, even if it makes them uncomfortable at times. The world is what it is, and denying that and burying your head in the sand won't help your child succeed in life when they grow up and fail to grasp how the world works because YOU wanted to hide it from them.