God forbid! While we all know that what we know as America today originated from the Britain of old, the idea of the “colony” influencing the traditions of the “mainland” is simply unthinkable! However, if various commentators are to be believed, it might just be that American culture is fast creeping up on British culture.
Take, for example, what is going to happen to the Antiquarius Centre on King’s Road. This place is where you can find all sorts of antiques – from cufflinks to photographs. In the very near future, however, the Centre will be demolished to make way for Anthropologie, an American chain, which will open a fashion emporium on the site. How anyone can even think about exchanging something as culturally and artistically rich as the Centre for a “personality-less chain,” I don’t know!
There are other signs of our society being influenced by American culture. The stereotypical Englishman’s suit and hat have been replaced by sneakers and blue jeans. The stereotype of bad English teeth is vanishing thanks to dental procedures that leave them straight and pearly white. (Now this one, I really don’t mind.) Psychologists and psychiatrists are fast turning to medication for their patients. Fast food like KFC is becoming more and more popular. American urban music is finding its way into the hearts of the youth. The list goes on and on and on…
But does this really mean that Britain is being Americanised? And if so, is it necessarily all a bad thing?