Will the healthcare system be fully automated by robots?
Robots - the doctors of the future?
DUSSELDORF. Artificial intelligence (AI) has the potential to change the entire healthcare system, and with it medical treatment. Even today, its targeted use can yield enormous savings in time and cost through faster diagnoses and more targeted therapies.
Will we soon be sitting in "Dr. Algorithm's" consultation hours? Will he shake our hand, ask about our symptoms, make a diagnosis within seconds, and discharge us with the appropriate prescription? Or will a visit to the doctor become entirely unnecessary in many cases?
In Germany there is still considerable skepticism, but according to a PwC survey, 41 percent of Germans are generally open to treatment by an AI.
The humanoid robot Xiaoyi ("little doctor") passed the Chinese medical licensing exam in November 2017 with an above-average score of 96 points. In preparation for the exam, it had been fed around one million medical images, 53 medical books, two million medical records, and 400,000 medical articles and reports.
Big data requires the use of AI
Xiaoyi now possesses the medical expertise required to practice as a licensed doctor in China. According to the plans of IFlyTek, the company that developed and built Xiaoyi, it will soon be available for purchase.
According to the Master Plan 2025, AI is to be integrated more closely into professional fields in China, including the healthcare sector.
But in Germany, too, the use of AI in healthcare has already produced numerous achievements. Robots in the operating theater assist with highly complex interventions with the greatest precision, blood and tissue samples are examined fully automatically in the laboratory, and on hospital wards robot systems monitor the administration of medication and patients' vital functions.
The enormous amounts of data that arise today, known as big data, make the use of AI in everyday clinical practice a necessity. AI can match or even exceed the diagnostic quality of doctors. It not only processes existing information; it can also learn new things by recording symptoms, analyzing X-rays, and making initial diagnoses.
The advantage is speed: all of this happens within a very short time. Whether for brain tumors or intestinal polyps, self-learning AI algorithms are already used today for the early detection of tumors and the evaluation of X-rays.
For this purpose, the AI-supported, image-based diagnostic systems draw on databases that were previously fed with thousands of images.
The result is an "artificial superbrain" that not only delivers the most accurate diagnosis possible within a few seconds, together with a corresponding therapy proposal, but above all, unlike a human brain, never forgets.
Doctor's reservation is a high hurdle
The aim is to use AI despite the existing hurdles in medical law. It is noteworthy, however, that certain measures may, at least for now, only be carried out by doctors: a medical license is required to practice medicine.
This doctor's reservation applies to services that require specialist medical knowledge to assess and treat health risks. The use of software that circumvents this reservation is therefore currently limited to special pilot projects.
In addition, there is the principle of personal service provision, one of the essential characteristics of medical practice, which is chiefly defined by personal doctor-patient contact.
A doctor may delegate services within limits. However, the doctor must act personally whenever the difficulty, danger, or unpredictability of a case requires his or her specific specialist knowledge.
The reimbursement system of the health insurance funds also presupposes a medical service. Can a treatment be remunerated if it is performed not by the doctor but by artificially intelligent software or an AI robot? Does such an activity fall within the scope of delegable services? Nothing would be gained for the doctor or the hospital if diagnosis and therapy were carried out by AI but no one paid for them.
Liability issues too
Beyond these questions, the use of artificial intelligence always raises the issue of liability from a legal perspective. If someone is harmed, is it an operating error by the hospital staff, or an organizational fault of the hospital?
According to the current state of the political and legal debate, responsibility should primarily remain with the person using the system, not with the robot, its AI algorithm, or its manufacturer.
In any case, technological development calls for new regulation of liability issues and insurance models. Things become particularly interesting when it comes to maintaining medical confidentiality while using AI.
In the past, professional secrecy, which is protected by criminal law, made it difficult to use software solutions that could transmit patient data to third parties.
Following a reform of the relevant criminal law provisions, doctors and hospitals can now rely on external service providers to an unprecedented extent. This should also benefit the solutions and applications discussed here.
As before, however, when engaging external service providers, the existing requirements for so-called commissioned data processing must be observed and corresponding contracts concluded.
Trust in a doctor has an impact on the success of the therapy
There is no question that medical practice will change in the future. AI can support medical work in certain areas, but it can never replace the doctor as the final authority.
The influence of trust in the treating doctor, and of the interpersonal doctor-patient relationship, on the success of therapy should not be underestimated. Targeted use of AI can instead reduce doctors' workload through increased efficiency.
As a result, the doctor could ideally devote more time to the individual patient. Few believe that AI can ever learn empathy and replace this special, interpersonal relationship of trust.
Karolina Lange, LL.M (medical law), is a lawyer in the Düsseldorf law firm Taylor Wessing who specializes in regulatory advice in the healthcare sector.
Jana Hammesfahr works in the same law firm as a research assistant in the field of health law.