Does anyone know a good article or book (or several) that explains why Americans in the U.S. are so religious/Christian (for instance, compared to Europe, which is so secular)? I'm sure I could find some on my own, but I'm wondering if there are any that people know of that are actually good, or widely accepted as pretty on-target. I know, I know. I'm in divinity school and should know this. But I don't.