Who or what is the Western World?
genesis  Updated: 2004-07-13 10:02

I've read tons of posts that pit China against the Western World, or something along those lines. Others just amount to "the West sucks" and so on, but everyone seems to mean something different by "the Western world."

Here's an excerpt from an encyclopedia on the meaning of "Western World":

***

During the late 15th and early 16th centuries, explorers and conquerors such as Christopher Columbus and others conquered new lands on behalf of Western nations. Up until the 19th century, Europeans continued to settle new territories, and the term "Western" thus came to encompass nations and former colonies such as the United States, Canada, Australia, and New Zealand, populated mostly by people of European descent.

Japan in 1955 (shortly after the end of its occupation by the US) would be considered by most to be part of the West, while Japan in 1750 would not. Similarly, North America in 1850 would be considered part of the West, while it would not have been in 1450 or even 1500, before substantial colonization had occurred.

Cold War
During the Cold War, a new definition emerged. The Earth was divided into three "worlds": the first, second, and third. The first world comprised NATO members and other nations aligned with the United States. The second world comprised the Eastern bloc nations in the Communist sphere of influence, such as the Soviet Union and the People's Republic of China. The third world comprised the nations aligned with neither. Hence, "the Western world" became a synonym for the first world.

Post-Cold War
After the end of the Cold War, the phrase "second world" fell into disuse, and "first world" came to refer to the democratic, capitalist, wealthy, industrialized, developed nations, characteristics shared by most nations aligned with the US. The "third world" came to refer to the poor, unindustrialized, developing nations. In other words, "Western" is not so much a geographical term as a cultural and economic one; therefore:

African history can speak of Western influences by a group of small countries that lie to its north.

Australia can be considered a Westernized country located in the East.

International companies founded in America may be considered foreign influences in Europe, but are described as Western when their presence is felt (and sometimes criticized) in the Orient.

Nowadays, people differ in their definitions of the West, and different definitions overlap only partly. There are certainly non-Western developed nations, not all Western countries are members of NATO, etc.

Oftentimes, use of the term "the West" was motivated by racist attitudes towards Slavic Europeans, in that the term did not encompass them, whereas "Europe" does.

In the Near East or Middle East (both terms defined relative to Europe, which lies to their west), the distinction between Eastern and Western Europe is of less importance, so countries we might speak of as part of Eastern Europe, e.g. Russia, are counted as Western when speaking of the general culture of Europe and Christianity. But the line between East and West does not move any further east, even when the contrast is with China.

Current
Depending on context, the Western countries may be restricted to the founding members of NATO, the European Union (EU), and Switzerland. A broader definition might extend to Australia, New Zealand, and sometimes Israel.

The Asian countries Japan and South Korea are sometimes considered part of the West and sometimes not.

Latin American countries are sometimes considered part of the West and sometimes not. Mainland China, the rest of the Middle East, India, and Russia are generally not considered part of the West.

One should distinguish "Western society" from the socio-economic term "first world": South America, for example, is sometimes described as a Western society even though much of it is poor.

The term "the North" has in many contexts replaced the earlier use of "the West", particularly in the critical sense. It is somewhat more coherent, because there is an absolute geographical definition of "northern countries", and that distinction happens to capture most wealthy countries (and many wealthy regions within countries).

More typically, the term "the West" carries a pejorative meaning: it simply describes and delineates the wealthy, dominant societies from the poorer ones, those which believe they are subjugated economically, militarily, and otherwise by deliberate restraints placed on them by the wealthier nations. "The West" then becomes simply a term meaning "wealthy, colonial, European-descended (or allied) societies."

***
Anyway, those who wish to demean the Western world should at least know who or what they are talking about.

The above content represents the view of the author only.
 