Is the United States in Decline?
By Ray Williams
September 21, 2021

The United States has been a powerful leading force for democracy and world order since the Great Depression, but there are clear signs of America's decline. While many American leaders and the media have repeatedly asserted...