I’m aware that, at the moment, Florida is deemed unsafe to travel to, but what is generally the worst state to live in? Factors such as education, religious extremism, crime, cost of food, healthcare, and access to other resources are relevant. Will “Deep South” states remain among the worst places to live due to traditionalism, or are more progressive states bound to grow worse?
Louisiana and Mississippi, no doubt. Florida is just shit; it has no redeeming qualities and everything is expensive, so it’s pretty bad too
Florida is expensive?? Damn. Next time you come up to the Great White North, make sure to bring a couple extra bucks with you.
It’s not nearly as bad as Miami or most of the rest of Florida lol