9 Incredible Movies About Florida You Need To See

You can learn a lot about the culture and history of a place from the art it inspires. Florida is often depicted as a sunny place for shady people, but plenty of films paint it in a more positive light as well. Either way, you can't deny that our state has some fascinating stories to tell.
How many of these Florida films have you seen? What’s your favorite movie about the Sunshine State?