Hollywood’s Woke Era Is Over. Now It’s Turning the Culture War Into Camp.
- Posted on January 21, 2026
- Movies
- By The New York Times
The industry seemed penned in by our political debates — until it started channeling them into wild caricatures and frothy drama.