Wednesday, December 14, 2011

What did America gain from entering both WWI and WWII?

Entering WWI and WWII established American power. The wars showed that the US was capable of fighting on a global scale and allowed us to take a more active role in foreign affairs.
