History of American Capitalism
For better or for worse, capitalism is the economic system that has come to define the United States. In this essay, Beckert examines the historiography of American capitalism, a subject that, he argues, was long and somewhat ironically neglected by historians until recently.