Medicine in the American West

Explores medical practices and advances in the American West during the 1800s, describing the often dangerous surgical and medical procedures of the era and the widespread lack of sanitation.
