You're delusional if you think Hollywood is suddenly going to become less White. They want the optics of being "woke," but they still want to make money, and they know who their main audience is. And it's sad that there's a sentiment of White people being "forced" to watch movies with minorities in them. God forbid White people pay attention to something that doesn't reinforce their belief that they're the center of the universe.