IMDRF Lays Groundwork for Harmonized Machine Learning Regulation

The International Medical Device Regulators Forum (IMDRF) has released a document that provides a number of key terms and definitions for machine learning (ML) that regulatory agencies may find helpful as they develop their own regulations. One of the key discussions in the document addresses the meaning of the “locked” algorithm, a consideration that may prove pivotal to advancing harmonization over the next few years.

The IMDRF has proposed a new regulatory term, the machine learning-enabled medical device (MLMD), and the question of how to characterize the various types of changes to an MLMD appears in Section 7.1.1. That section states that the word “locked” has been described in a number of different ways, including when the algorithm does not perform continuous learning and when the developer has no imminent plans to adapt the algorithm. This may be a key definitional problem to resolve, because the way the term is defined could lead to different regulatory determinations as to when an algorithm is and is not an MLMD.

Complex Regulatory Questions Encoded in Definitions

The document provides definitions for a wide range of terms, such as continuous learning and reliability. However, a number of more complex considerations are also taken up. Among the key questions addressed in Section 7.1.1 is how to describe the aspects of a change to an MLMD, such as cause, effect, trigger, domain, and effectuation. The domain of the change can be heterogeneous when it describes a local adaptation of the algorithm at one clinical site or a series of sites within a given region.

Heterogeneous changes can also refer to the operation of the algorithm within the context of a single demographic. Conversely, a global change is one that occurs to all instances of the algorithm regardless of location. Another point of consideration is the agent of effectuation of a change, which can be external or internal to the algorithm. The external effectuator may be the operator of the clinical site, whereas the internal effectuator is a predetermined change encoded in the software. The task for regulatory agencies in this context may be to make a clear statement regarding the locus of regulatory liability when the software is licensed by a developer to a clinical operator, particularly when the licensing agreement fails to delineate that responsibility.

Another critical piece of the regulatory puzzle, described in Section 7.1.2, is a change to the data environment. The cause of an environmental change might stem from changes to the format and/or quality of the data inputs, or from an addition to or change in the patient population. The effect on the algorithm’s operation may be positive or negative, and the term “domain” in this context also covers homogeneous and heterogeneous changes, much as it does for changes to the algorithm itself.

Some of the terms found in this document were already in place when the IMDRF formed, as the Global Harmonization Task Force (GHTF) had laid some of the definitional groundwork. IMDRF cited a standard still in development, ISO/IEC DIS 22989, for the definition of machine learning, which describes the term as the use of computational techniques to optimize the operation of the model such that the model’s behavior “reflects the data or experience.” IMDRF said that an MLMD is a medical device that makes use of machine learning, in part or in whole, to achieve its intended medical purpose, or intended use, a term of regulatory art with an interesting history.

There are a number of potential regulatory conflicts when the ML component is an accessory to a medical device because some jurisdictions treat accessories as devices whereas others do not, IMDRF said. There is also some consideration for how a medical device is defined in a given jurisdiction, given the differences in how particular product types and/or intended uses are handled. IMDRF pointed out, for example, that the commercial distribution of devices that incorporate human tissue is not permitted by all regulators.

In addition to ISO/IEC DIS 22989, the document refers to ISO/IEC TR 24027 for control of bias. ISO/IEC TR 24027 describes both wanted and unwanted bias, offering as an example of wanted bias an algorithm trained to detect a specific disease or condition preferentially over similar conditions. Presumably a desired bias becomes part of the statement of intended use, but the document provides no additional background on this consideration. Other terms defined in the IMDRF document include supervised and semi-supervised learning, both of which are described in ISO/IEC DIS 22989.

Harmonization on Uneven Footing

The context for this IMDRF glossary is complex, given that regulation of artificial intelligence in the European Union is itself in a state of flux. The EU’s Artificial Intelligence Act could be duplicative of the Medical Device Regulation (MDR), which is not yet fully implemented. Another complicating factor is that the EU’s new European Health Data Space legislative proposal could create an environment in which an electronic health record that makes use of artificial intelligence of any sort is subject to a total of three regulatory regimes: the MDR, the Artificial Intelligence Act, and the Health Data Space legislation.

Fortunately, not all the regulatory and legislative activity in the areas of AI and ML seems to contribute to regulatory disharmony. The FDA, Health Canada, and the Medicines and Healthcare products Regulatory Agency (MHRA) recently struck a blow for harmonization with their joint paper on good machine learning practices, although the MHRA is still forging a new regulatory system for devices of all types. Thus, developers of MLMDs should stay closely attuned to these developments in order to minimize regulatory friction on their way to market.
