
17 Must-See Movies That Show America As It Actually Is

Photo: Courtesy of the Sundance Institute.
When you think of a movie about America, you might picture a western, a biopic of a historical figure, or a film straining to be patriotic. But there are as many ways to show American life on film as there are ways to live life as an American. And in recent years, more and more movies that reflect the country's actual diversity have been seen and appreciated.
It took a long time to get to this point, and there's still a long way to go. In fact, one of the movies in the following slideshow is a documentary about exactly that: how poorly film has represented transgender people.
In 2020, Hollywood is still coming to terms with the fact that American movies don't have to be about white, straight, middle-class, cisgender people, and learning that audiences are interested in stories that aren't always about that same group. Poor people are Americans. Non-white people are Americans. Gay people are Americans. Trans people are Americans.
It's a big country, full of very different people. From horror to comedy to drama, the films in this slideshow either show a part of America that isn't often seen or comment on America as the troubled, complicated, diverse country it is.