What is Imperialism?

Imperialism is the practice by which a stronger, more established country imposes policies on a weaker country that enable the former to extend its influence there. Such influence can be cultural, political, military, or economic, among others. The end goal of imperialism is almost always to increase the stronger country's power, influence, and wealth.

The term derives from the Latin word 'imperium', meaning 'command' or 'supreme power'. Imperialism was originally used to describe how England and Japan operated overseas in the mid-19th century. At the time, imperialism was bound up with industrialization and the drive to obtain cheaper raw materials (including slave labour) from around the world to benefit the 'empire'. Nationalism also played a role, in the sense that an empire was seen as more powerful the more territories it had subjugated. Rudyard Kipling's phrase 'the white man's burden' captured the perceived 'duty' of the Western world to 'civilize' native peoples it viewed as savages and subject them to Western culture, thus justifying the extension of its influence.

Today, imperialism remains a widely used term, one that often goes hand in hand with capitalism. Stronger nations' influence over weaker nations' economic policies, militaries, and politics has been described as imperialism when that influence serves to unfairly and exclusively benefit the country imposing it.
