Ever since the field of biology emerged in the United States and Europe at the start of the nineteenth century, it has been bound up in debates over sexual, racial, and national politics. And as our social viewpoints have shifted, so has the science of the body.
