9 Incredible Movies About Florida You Need To See

You can learn a lot about the culture and history of a place from the art it inspires. Florida is often depicted as a sunny place for shady people, but plenty of films paint it in a more positive light as well. Either way, you can't deny that our state has some fascinating stories to tell.

How many of these Florida films have you seen? What’s your favorite movie about the Sunshine State?
