The Major Consequences of World War I on United States Society

World War I had major political, economic, and social impacts on both European and American society. Initially, the United States' ideology of isolationism meant the country remained secluded and saw no need to participate in the war (Dunn, 2013). President Wilson felt that the country had nothing to do with the conflict, but attacks by German submarines on United States shipping, together with Germany's encouragement of Mexico to attack the United States, left the country with no option but to enter the war. The war then went on to shape the country's economy, making it a world economic superpower.

Although the United States lost thousands of soldiers to influenza and combat during the war, the conflict brought many economic changes (MacMillan, 2014). The war helped end the economic downturn that had gripped the country, creating employment opportunities for unemployed American citizens, including new openings for Black Americans and women. The post-war women's movement empowered women economically and changed their status in United States society, and minority groups began to receive greater recognition and economic livelihood in America.

World War I weakened established monarchies such as those of Austria-Hungary and Germany, and the upheaval disrupted productivity as industries diverted their output toward the war effort (MacMillan, 2014). Europe's economic dominance was severely eroded, and power shifted to the United States, which became the world's largest creditor. The war also boosted American manufacturing and other industries, rapidly converting the country's economic strength into military and diplomatic power. Before World War I, Britain had the world's largest navy; after it, the United States emerged as the world's leading manufacturer, holding the largest gold reserves to back its dollar. World War I therefore brought enormous economic change, strengthening the American economy and the lives of the entire population.

In addition, the war created new perceptions among American citizens regarding leadership and government control. More than fifty thousand American soldiers died in the war, while many others were wounded (Salvante, 2013). Soldiers who returned home also unknowingly carried the Spanish flu with them; the pandemic is estimated to have killed tens of millions of people worldwide. In America, the loss of its soldiers led to the development of new ideas that shaped the country's sensibilities, and President Wilson promoted self-determination and the spread of democracy to make the world a better place to live.

Ultimately, World War I changed the role of the United States in the world (Salvante, 2013). The country discarded its initial ideology of isolation, changed its foreign policy, and participated in shaping world democracy. From the position of world superpower attained through the war, the United States began to participate in and express concern for world peace and respect for humanity. Americans had initially felt they were safe and immune to external attack; the German submarine attacks ended that isolationism and led to the country's participation in the war and in the creation of democratic societies worldwide. Moreover, American citizens began to realize that they needed to be more unified in order to stand against external aggression.
