Fundamentalism and American Culture
A full history of the origins and development of fundamentalism in America traces the important influence of this religious movement and its conservative Christian views on American culture, social institutions, politics, education, and more.