Quote:
Originally Posted by manaboutown
What still throws me is many folks in Florida referring to the West Coast mean the West Coast of Florida on the Gulf of Mexico, not the states such as California bordering the Pacific Ocean.
It’s definitely weird and definitely a real thing. I was born and raised here, and until I moved to the actual West Coast, it seemed normal to say.

Now I say the west coast of Florida, or the Gulf coast.