Is Christian supremacy, enforced both socially and legally, a growing threat to nonreligious people in the West, particularly in the U.S.? The current president of the United States and a growing number of conservative politicians assert that Christianity is the religion of our nation. The nation was indeed once Christian through and through, but scientific advances have led to growing doubt about the need for religion to explain life, and this has produced a large number of semi-religious and nonreligious people in the U.S. Many American conservatives, however, have reacted angrily to this change, going so far as to push for religion to be taught in public schools and for abortion to be banned entirely. The phrase "one nation under God" has become an ever-louder rallying cry for an angry and increasingly marginalized Christian base that wants to be the majority once more. This anger appears to have been provoked in part by Muslim refugees and Muslim terrorism. Is Christian supremacy a growing menace to the future of secular society?