Germany–United States relations

Germany–United States relations are the bilateral relations between Germany and the United States. The two nations fought against each other in both world wars. Today, the relationship is strongly positive, grounded in shared democratic ideals, a common Cold War stance against communism, and a high volume of bilateral trade.