1. The United States took a leadership role on the world stage in the 1940s. Evaluate the United States' performance in that regard from World War II through the Cold War.

2. Evaluate the role of the federal government in ensuring equality for all Americans from the Great Depression through the 1960s.

3. Evaluate the role of government in American society from the New Deal to the New Right.

4. Americans have an innate distrust of government. Do you accept that generalization, or is it too simplistic? Using evidence from the New Deal to the present, what major factors support your argument?

For more information on the 1940s, see: https://en.wikipedia.org/wiki/1940s