The West

What is the West? The "West" we talk so much about in history includes Europe and the United States. Europe lies at the western end of the Eurasian landmass, west of Asia, which is why it came to be called the West. But the term really refers to countries that have followed similar historical paths: European nations were ruled by monarchs and later by democracies, and as they industrialized, they rose to become superpowers that, for a time, dominated the world. The West, I think, also includes the United States. Why? Well, the United States developed along much the same lines as European nations such as Britain and France. It too joined the imperialist wave of the 19th century, and it proved itself a match for the European powers during WWI and WWII. So basically, because of its role and influence, the US is also part of the West.