
What is Happening to American Medical Care?

Posted Sep 07 2012 3:03pm

Almost daily I hear accounts of bad U.S. medical care. Today, a woman told me about a friend who went to her doctor with several questions. The young woman doctor answered one question, and when the patient tried to ask a second, she said, "I only answer one question per visit; you will have to make another appointment!" I hope the patient immediately changed doctors.

I have great concerns about what will happen to U.S. medical care, as doctors seem to care less and less. Many young doctors seem more interested in making money than in practicing good medicine, and many young women doctors either don't work after their training, work just a day or two a week, or stop working after a few years.

Often physician assistants and nurse practitioners will listen to problems more attentively than the doctors do, and yet their training is inadequate for them to be in sole charge of a patient. This is particularly true with children. I shudder when I hear an announcement in a drugstore or grocery store saying, "Have your child's school physical done by our nurse practitioner in our retail clinic." When a child is not undressed down to their shorts or underpants and their abdomen examined on an examining table, a great deal can be missed. Likewise, if an adequate medical history, diet history, and family history are not taken, much can be overlooked.

I think parents and patients have to insist on good medical care, and if they are not getting it, they need to change doctors and then complain to the local medical society. Probably nothing more than a slap on the wrist will come to the offending doctors, but at least they can be put on notice that better medical care is needed.
