Could artificial intelligence end the electronic medical record nightmare? 

Medicine is an oral science. People talk to doctors about their problems. Doctors listen and ask questions. Doctors tell the patient's story to other doctors to share information and gain new ideas. Doctors and nurses talk about a patient's progress. They talk to social workers and physical therapists and all sorts of experts who can help solve the patient's problems and improve their health care.

The electronic medical record has killed the oral science. Doctors now hunt and peck for the information to share. Nurses stare at screens, spending half an hour entering data that used to take three minutes to record. As far as I can see, everyone in health care hates the new quantified medical record except the insurance companies. Hundreds of editorials by doctors document that they can see only two-thirds of the patients they used to see when they have to spend their day entering data.

Apple's Siri, IBM's Watson and their relatives could solve this.

Here is an example:

"Siri, I would like to admit Ms. Jones to the hospital for her knee replacement."

"Sure, Dr. Stone, shall I use your pre-op order set?"

"Yes."

"Tell me the medications she is on." After I speak the medications' names, Siri might ask, "OK, let's be sure to let her cardiologist know to adjust her blood thinners a few days before surgery. And by the way, the medication she is on has been recalled and this alternative is recommended."

You can see how this could go. Artificial intelligence has long been able to handle highly formulaic exchanges like these and could prompt doctors to be better at their jobs. Medical staff would never have to waste precious time clicking through a dozen menus and screens of unrelated information.

The nurses could dictate their findings on their rounds and, through Siri, alert the doctor if a wound is not looking right. The patients could tell Siri their medical history before coming to the office, and then the doctor could review it with them and fill in the details the patient didn't think of.

The point is, we can all talk. We just can't type and hunt and peck efficiently. And Siri can both listen and add information we may never have learned in medical school, making both the patient and the doctor smarter.

And even the big insurance companies may love it, as the medical record might finally be accurate instead of a cut-and-paste job from previously entered records. Not to mention that Siri could prompt the doctor to consider a less expensive alternative to a drug, a dressing or a therapy. With just conversation, all doctors could have the knowledge base of supercomputers behind their differential diagnoses. So, shall we all talk to each other again?

Dr. Kevin R. Stone is an orthopedic surgeon at The Stone Clinic and chairman of the Stone Research Foundation in San Francisco. He pioneers advanced orthopedic surgical and rehabilitation techniques to repair, regenerate and replace damaged cartilage and ligaments. For more info, visit www.stoneclinic.com.
