
The aftermath of World War I in Germany
Germany's defeat at the end of World War I bred a culture of resentment and blame, which fed ugly politics, while the country itself was bankrupt and unable to take its place in a European order. The United States, under President Woodrow Wilson, tried to shape the peace, crafting a new international order intended to prevent another global conflict.