An expert maintains the view that product liability laws should be applied to ensure the safety of artificial intelligence in case of medical accidents caused by medical AI. (Credit: Getty Images)

As the use of artificial intelligence (AI) grows in the medical field, experts are calling for an overhaul of the legal system to ensure that AI is used safely.

More specifically, they maintain that the product liability law should be applied, assigning responsibility to manufacturers in case of medical errors caused by medical AI.

One such expert is Professor Kim Hwa of Ewha Womans University School of Law, who recently released a paper titled "Medical Artificial Intelligence and Civil Liability" in Bioethics Policy Research published by the Ewha Institute for Biomedical Law & Ethics.

In the paper, Professor Kim noted that there are efforts to utilize AI in the medical field by, for instance, using AI in diagnosis and image reading to relieve doctors of overwork and allow them to focus on their practice. Kim emphasized that it is necessary to consider how legal responsibility should be assigned if a medical accident occurs while AI is in use.

Professor Kim explained that the current use of AI in the medical field adopts a "human-intervention type" method, in which a doctor makes the final decision. So, if a medical accident occurs due to AI, the doctor will be held responsible, Kim added.

"In the case of medical AI, it is inevitable that it will, in principle, be designed as human-intervention AI, which requires direct human involvement, because it can affect the body and health," Kim said. "Therefore, even with the help of AI, doctors will be held responsible for medical treatment and final judgment."

However, as medical AI technology advances, the intervention of doctors, who bear ultimate responsibility, may decrease. The legal system therefore needs to be supplemented so that someone can still be held accountable for adverse outcomes.

"Incentives to utilize medical AI are necessary to reduce the burden on individuals due to the shortage of medical personnel. However, it is difficult to agree on how to reduce doctors' liability for medical errors," he said.

As medical AI technology develops further, doctors' role as the final responsible party may diminish. In that case, questions will arise about who should share responsibility for adverse outcomes caused by medical AI, and how, Kim added.

To ensure AI's safety, he suggested the Product Liability Act, which imposes partial liability on manufacturers, as an alternative. Under the duty to monitor manufactured products and prevent risks stipulated in Article 48, paragraph 2 of the Product Liability Act, manufacturers can be held responsible for managing the safety of medical AI after it is on the market.

"Product liability law can be a reasonable alternative for medical AI," Professor Kim said. "The manufacturing inspection and risk prevention duty in Article 48(2) of the Product Liability Act imposes an obligation on manufacturers to prevent damages that may occur after a product is placed on the market, so it is likely to have important implications for adverse outcomes caused by AI. This will help ensure the safety of AI after the fact."

However, it is difficult to interpret medical AI that exists only as software as a "product" under the current law. Therefore, legislative efforts are needed to bring software-based AI within the scope of the product liability law, Kim said.

"It may be difficult to recognize defects in products due to the characteristics of AI, whose algorithms change according to the external environment and learning," Kim said. "It may be necessary to introduce new rules on the existence of defects and the presumption of causation so that defects in AI can be recognized more easily as the environment changes."

 

Copyright © KBR. Unauthorized reproduction and redistribution prohibited.