Why has racism increased in America?
Why has racial discrimination grown so sharply in the United States? It has become commonplace for white Americans to express hostility toward African Americans and other ethnic groups. Everywhere you look, racial prejudice and anti-Semitism seem to be on the rise. Are white Americans instinctively racist?