Quote:
Originally Posted by Ele201
Yes you are right. Certain groups want to change the American culture, our endeared products and brands that have been around a long time. They are doing no harm to anyone. Yet some want to “cancel” out anything they don’t like.
"the American culture?" Seriously? Do you really want to go there? The American culture exists because undocumented immigrants showed up from the other side of the planet, invaded the continent, and attempted genocide on the natives living here. Once they wiped out most of the tribes and took their females to OWN and rape, they then tore up the native lands and gutted their sacred graveyards and sanctuaries to build buildings. Of course they couldn't build the entire country themselves, so at some point early on, they imported slaves from Africa to do most of the hard work, including raping THEIR females and selling the offspring to other landowners.
That is the "American culture."
Which, thankfully, has evolved over the past couple of centuries. You might prefer to continue murdering Native Americans, raping the women, and forcing Black men to plow your fields at the end of a whip, but most Americans have a different culture now.
It's probably in your best interest to catch up.