Don't Thwart Innovation!
Whoever wants to take part in the global race to develop artificial intelligence (AI) for medicine must first create the right conditions for it. At the Siemens Healthineers symposium during the German government’s Digital Summit 2018, innovators, politicians, and lawyers made the case for better framework conditions – and a public discourse.
Photos: Sebastian Gabriel
At a time of pervasive electronic networking, classic healthcare is developing into a “digital health ecosystem” that not only connects medical facilities but also draws on countless other data sources to provide optimal care and involve patients early. The individual patient should be central in all this, said Professor Björn Eskofier of the Friedrich-Alexander University Erlangen-Nuremberg.
Data must be more readily available
The head of the Machine Learning and Data Analytics Lab illustrated this point with two examples of how wearables and other sensors help create highly individual data sets that can then feed into therapeutic decisions. The first was an AI-based device for monitoring the heartbeat of unborn children. The second was a wearable AI system for motion analysis in Parkinson's patients that uses deep learning algorithms to evaluate, for the doctor, parameters that correlate with the course of the disease, such as step length and stance phase.
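To make the second example more concrete, the sketch below shows, in very simplified form, how gait parameters such as step time and step length might be derived from wearable sensor data. It is not the deep learning system described at the symposium: the function names, the sampling rate, the peak-detection thresholds, and the constant-walking-speed assumption are all placeholders chosen for illustration.

```python
# Illustrative sketch only: a highly simplified gait-parameter estimate from a
# single accelerometer channel. A real clinical system, such as the one
# described in the article, would use trained deep learning models and
# clinically validated parameters; the values below are assumptions.
import numpy as np
from scipy.signal import find_peaks

def estimate_gait_parameters(vertical_acc, fs=100.0, walking_speed=1.2):
    """Estimate step times and rough step lengths from vertical acceleration.

    vertical_acc : 1-D array of vertical acceleration (m/s^2), gravity removed
    fs           : sampling rate in Hz (assumed)
    walking_speed: assumed average walking speed in m/s (hypothetical constant)
    """
    # Heel strikes appear as acceleration peaks; height and minimum distance
    # are placeholder thresholds, not validated clinical values.
    peaks, _ = find_peaks(vertical_acc, height=1.0, distance=int(0.4 * fs))
    step_times = np.diff(peaks) / fs           # seconds between successive steps
    step_lengths = step_times * walking_speed  # crude constant-speed estimate
    return step_times, step_lengths

if __name__ == "__main__":
    # Synthetic signal standing in for a real sensor recording (~2 steps/second)
    fs = 100.0
    t = np.arange(0, 10, 1 / fs)
    synthetic = 1.5 * np.abs(np.sin(2 * np.pi * 1.0 * t))
    times, lengths = estimate_gait_parameters(synthetic, fs)
    print(f"mean step time: {times.mean():.2f} s, "
          f"mean step length: {lengths.mean():.2f} m")
```

In a production system, such hand-crafted rules would be replaced by models trained on annotated patient data, and parameters like stance phase would be validated against clinical gait-lab measurements.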
But a comprehensive digital ecosystem that makes these and other applications possible does not come about by itself. Political and legal conditions must be created, and the ethical aspects of the use of AI in medicine need to be widely discussed in society. “Our wish list for politicians includes better support for startups and more regulatory clarity. In addition, we need measures to make capital and data more readily available, and a training program to attract enough skilled workers,” said Eskofier.
“We have a lot of catching up to do”
Politicians also recognize that there is much scope for optimization on the regulatory side: “We have a lot of catching up to do in this area,” said Tino Sorge, a member of the Health Committee of the German Bundestag. Sorge emphasized that it was up to the legislature to ensure that laws such as the forthcoming “E-Health Act II” enable innovation and meet international standards. “We shouldn’t only be talking about data protection,” said the politician.
As President of the Board of Medical Valley European Metropolitan Region Nuremberg, Professor Erich Reinhardt welcomed the increasing willingness of politicians to make progress in the digitization of healthcare and the introduction of medical AI applications. However, it is not enough to talk; action needs to be taken, stressed the former Siemens manager: “We have test tracks on highways for autonomous vehicles. It would be good if we could similarly establish test tracks for digital health.”
Making the case for personal health managers
Reinhardt also pointed out that in a digital public health ecosystem it must be possible to access all kinds of medical data, with the consent of the patient or citizen. In the case of AI applications specifically, it would also be useful if large annotated data sets were made available, which small and medium-sized companies in particular could use to train algorithms: “That would speed many things up enormously and help make healthcare more efficient and effective – for the benefit of the citizen,” said Reinhardt.
From a legal perspective, Professor Christian Dierks of Dierks+Company emphasized that even after the implementation of the European General Data Protection Regulation, questions remain about the rights to, and the handling of, health data. The lawyer expects the role of the patient to be redefined within the framework of the “E-Health Act II,” giving patients the means to decide to whom, and to what extent, their data should be made available: “Some say patients cannot do this, but they will have to be able to, because no one else will do it.” It is also conceivable that the patient would delegate this task: “Maybe there will be a new profession of personal health data manager,” said Dierks.
Black box algorithms: Can an AI application be liable?
Another aspect of artificial intelligence that needs further discussion was raised by Thomas Friese, PhD, General Manager Data Architecture & Technology Platforms at Siemens Healthineers. He reported that researchers are putting a great deal of effort into opening up the “black box” of AI under the banner of “explainable AI.” Newer deep learning algorithms in particular tend not to reveal why they make the recommendations they do.
Is that a problem in medicine? Dierks sees ambiguity, at least in terms of civil liability law, because as a rule it is the agent who is liable. If an AI application is designed in such a way that doctors have to accept its recommendations, then strictly speaking they are no longer liable, as all they can do is check the plausibility of the recommendation. Should this plausibility check become too time-consuming, there is a risk that at least some AI applications will be put on hold.
How that should be handled, whether from a technical or a legal perspective, is open. “One possibility would be to give AI its own legal status so that it can be held liable and would then also be insurable,” said Dierks, introducing a completely new idea to the debate. Whatever solution is finally arrived at in these liability and data sovereignty questions, the public discourse should be taking place now.
Philipp Grätzel von Grätz lives and works as a freelance medical journalist in Berlin. His specialist areas are digitalization, technology, and cardiovascular therapy.