Ericka Ohagan
Wednesday, December 14, 2011
What did America gain from entering both WWI and WWII?
WWI and WWII established American power. The two wars demonstrated that the US could wage war on a global scale, and they opened the door to a more active American role in foreign affairs.