Who or what is the Western world? I've read tons of posts that put China vs. the Western world, or something of a lesser nature. Others are more along the lines of "the West sucks" and blah blah blah, but everybody seems to think of the Western world in different ways.

***

During the early 16th century, explorers and conquerors like Christopher Columbus and several others claimed new continents on behalf of the Western nations. Up until the 19th century, Europeans settled new lands, and thus the term "Western" came to encompass nations and former colonies such as the United States, Canada, Australia, New Zealand, etc., populated mostly by European-descended Caucasians. Japan in 1955 (immediately after its occupation by the US) would be considered by most to be part of the West, while Japan in 1750 would not. Similarly, North America in 1850 would be considered part of the West, while it would not be in 1450, or even 1500, before substantial colonization had occurred. African history can speak of Western influences by a group of small countries that lie to its north.

Oftentimes use of the term "the West" was motivated by racist attitudes towards Slavic Europeans, in that the term did not encompass them whereas "Europe" does. In the Near East or Middle East (both terms defined relative to Europe, in the west), the distinction between Eastern and Western Europe is of less importance, so countries that we might speak of as part of Eastern Europe, e.g. Russia, are counted as Western when speaking about the general culture of Europe and Christianity. But the line between East and West doesn't move any further east, even when contrasted with China.

Currently, the Asian countries Japan and South Korea are sometimes considered part of the West and sometimes not. Latin and South American countries are sometimes considered part of the West and sometimes not. Mainland China, the remainder of the Middle East, India, and Russia are generally not considered part of the West. One should distinguish "Western society" from the socio-economic term "First World": for example, South America is sometimes mentioned as a Western society, but much of it is poor.

The term "the North" has in many contexts replaced the earlier usage of "the West", particularly in the critical sense. It is a little more coherent, because there is an absolute geographical definition of "northern countries", and this distinction statistically happens to capture most wealthy countries (and many wealthy regions within countries). More typically, the term "the West" carries a pejorative meaning, used simply to delineate the wealthy and dominant societies from the poorer ones, those who believe they are subjugated economically, militarily, and otherwise by deliberate restraints placed on them by the wealthier societies. "The West" then becomes simply a term meaning "wealthy, colonial, Europe-descended (or allied) societies".

***