How did WW1 change American society?

World War One changed American life socially, politically, and economically. The war had a massive impact on almost every aspect of society, particularly on women, workers, and minorities.

American officials to Wilson: The right to travel through a war zone on belligerent ships isn’t worth dying for.
As the months passed following the Lusitania disaster, Wilson kept up the diplomatic pressure on the German government to a degree that alarmed some congressmen and other prominent Americans.

Senator Wesley Jones of Washington implored the president “to be careful, to proceed slowly, to make no harsh or arbitrary demands, to keep in view the rights of 99,999,000 people at home rather than of the 1,000 reckless, inconsiderate and unpatriotic citizens who insist on going abroad in belligerent ships.” Senator Robert La Follette of Wisconsin spoke of the wisdom of Wilson’s Mexico policy as compared with the president’s policy regarding American sea travel into the European war zone. The policy of warning Americans that they traveled to Mexico at their own risk was, he said, “a small sacrifice on the part of the few to preserve the peace of the nation. But how much less sacrifice it requires for our citizens to refrain from travel on armed belligerent ships.”

Cite This Article
"How Did WW1 Change American Society?" History on the Net
© 2000-2019, Salem Media.
November 13, 2019 <https://www.historyonthenet.com/how-did-ww1-change-american-society>