+18 votes
Why do Americans trust their doctors when they admit they don't know everything about health?
by (4.4k points)

2 Answers

+123 votes
 
Best answer
I am not sure they all do, but as a whole, doctors have performed pretty well. They can't force us to eat well, live less stressful lives, or love more; they only end up trying to fix us after we are already broken. I like to keep in mind that it is called the "practice of medicine" — what does that tell you? :) They are still in the process of learning.
by (4.5k points)
selected by
0 votes
Wow... and exactly where did you come upon this startling information about our American doctors (not that you're generalizing or anything)? If you're going to post such a bold claim, tell us where we can read or hear for ourselves that American doctors are "clearly admitting" they can't do their jobs.
by (4.6k points)
...