Clarifying and strengthening the regulation of Artificial Intelligence (AI)
Feedback updated 7 Feb 2025
We asked
The TGA sought feedback on the consultation paper Clarifying and strengthening the regulation of Artificial Intelligence (AI) from 12 September 2024 to 20 October 2024. The consultation was part of the Australian Government’s Supporting Safe and Responsible AI Budget measure and included several proposals aimed at managing future risks and leveraging opportunities associated with the use of AI models and systems within, or as, therapeutic goods.
Specifically, we sought feedback regarding:
- potential changes to language, terminology and definitions within the Therapeutic Goods Act 1989 (the Act) and the Therapeutic Goods (Medical Devices) Regulations 2002 (the Regulations)
- the appropriateness of the TGA’s existing regulatory approach and requirements for medical devices that are, or incorporate, AI
- risks and/or advantages of maintaining international harmonisation
- the appropriateness of software currently excluded under the Therapeutic Goods (Excluded Goods) Determination 2018 (the Determination)
- public perception of what ‘transparency’ means in the context of AI technology, and what measures might be put in place to improve transparency
- education material and guidance needed to provide clarity.
You said
Fifty-three (53) responses were received from a range of stakeholders from within the healthcare and therapeutic goods sectors, including members of consumer representative organisations, health professional peak bodies, government entities and the medical device industry, including sponsors and manufacturers (developers) of software.
Most stakeholders agreed that:
- the TGA’s existing risk and principles-based regulatory framework is flexible, robust and largely fit for purpose to meet the current and emerging risks associated with AI technology
- there is opportunity for amendments and improved guidance resources that will help improve stakeholder understanding by clarifying and strengthening the existing framework
- ongoing review and refinement of existing definitions or clarification through guidance should be undertaken in harmonisation with broader national and international activities to ensure clarity
- if future refinements are required, they should be informed by further consultation with stakeholders.
Stakeholder feedback identified areas for further review and public consultation including:
Definitions
The majority (91%) of stakeholders confirmed that the TGA’s existing framework and technology-agnostic approach is sufficiently robust and flexible to effectively regulate AI where it is, or is incorporated within, a therapeutic good. However, 78% of stakeholders from the healthcare and therapeutic goods sectors commented that terminology in the Act and the Regulations, whilst generally understood by traditional medical device manufacturers, is not intuitive to developers within the software industry.
Specifically, stakeholders recommended further review and clarification of certain terminology and definitions including ‘manufacturer’, ‘sponsor’, and ‘supply’ to ensure clarity and alignment with other legislation and international frameworks. More broadly, stakeholders proposed the inclusion of terms such as ‘software’, ‘bias’, ‘AI drift’, ‘locked model’, ‘autonomous learning’, ‘substantial change’, ‘incorporates software’, and ‘programmed or programmable medical device’ to the legislation.
Stakeholders also requested guidance be developed to interpret these terms, to ensure clarity and resolve disparity with terminology used in other legislation and international frameworks. Feedback from the therapeutic goods industry further stated that any proposed clarifications or amendments should be made only where necessary and in alignment with international standards.
Roles and responsibilities
The majority (81%) of stakeholders agreed that a review of the definitions within the Act and subordinate legislation is required to clarify responsibility for the development, deployment, and use of AI models and systems. Stakeholders expressed concern that existing legislation does not clearly stipulate who is responsible for the outcomes of advanced AI technologies, including adaptive AI, where these outputs constitute an offence under the Act. Specifically, stakeholders recommended that legal clarity be provided as to what constitutes an offence where AI replaces human services or where the deployer is unaware of a model’s outputs.
Medical device industry stakeholders noted clarity is required regarding regulatory responsibilities for activities unique to software and AI technologies. These activities include:
- supply through online marketplaces where products are hosted on overseas servers
- use of open-source software and datasets
- how to account for outputs of adaptive or generative AI models where the original deployer does not have oversight of the relevant output.
Health professionals strongly expressed views that the legal responsibility for the safety, quality, and performance of software, including AI, should be assigned to manufacturers and sponsors. Health professionals stated that their responsibility was to understand the risks, safe operation, ideal use cases, and how to verify the AI models and systems they choose to use within the context of their clinical practice.
Compliance
Stakeholders confirmed the use of AI products is already prevalent within the healthcare sector, driven by benefits including increased efficiency, improved patient health outcomes, cost reduction, improved accessibility and capability. However, there were issues associated with the use of these products, both observed and reported in responses received, including:
- a general lack of understanding of what products meet the definition of a medical device and are therefore regulated by the TGA
- inappropriate use of AI-enabled products due to a lack of understanding or misinformation about the intended purpose of these products.
The majority (78%) of stakeholders requested that the TGA continue to directly engage with the software sector to deliver improved education and guidance materials to help clarify existing regulatory obligations.
Classification rules
Most stakeholders (61%) indicated the current classification rules are largely appropriate for use in medical devices that are, or include, AI models or systems and immediate changes are not required.
Additionally, most stakeholders (76%) indicated a future review of the existing classification rules is needed for software-based medical devices intended to provide a prediction or prognosis. Therapeutic goods industry stakeholders suggested these changes should only be initiated when more evidence is available regarding the use of these kinds of products, and when other jurisdictions are considering a similar classification rule change.
Essential principles
Most stakeholders (64%) broadly agreed the existing essential principles for safety and performance remain appropriate for use in medical devices that are, or incorporate, AI. However, respondents indicated more information should be developed and disseminated to explain the TGA’s requirements with respect to:
- ongoing validation of adaptive and generative AI
- use of open datasets
- open-source software
- performance reporting in clinical settings
- labelling requirements
- instructions for use.
Regulation for specific AI types and subtypes
Members of the medical device industry expressed concern that regulating AI as a whole or as individual subsets will be challenging because of the variety and complexity associated with these technologies. These respondents indicated regulatory requirements should continue to be centred on risk factors (such as the severity of the condition a device is intended to treat, the intended user of the device, etc.) rather than specific technologies or subtypes of AI. Where definitions and clarity about AI types and subtypes are sought, stakeholders generally felt this would be better achieved through guidance rather than specific regulations.
Excluded software
Most respondents expressed that many of the current software exclusions remain appropriate, although further guidance is needed to better support stakeholders in understanding the conditions of exclusion. However, 62% of consumer and health professional stakeholders expressed concern regarding the existing conditional exclusion for certain low-risk digital mental health tools, particularly where these tools are supplied to consumers without clinician oversight and review.
Most (53%) respondents from across all stakeholder cohorts called for a more detailed review and ongoing monitoring of the conditional exclusions, including digital scribes and consumer health products. Some members of the medical device industry cautioned that changes to the conditional exclusions should avoid regulating low-risk products and recommended any proposed changes be the subject of future consultation.
Advertising and transparency
A consistent theme from stakeholders (71%) across all cohorts was the desire for access to more information about therapeutic goods and how they are assessed and approved by the TGA, including when accessing digital therapeutic goods through virtual and online environments.
Consumer and health professional stakeholders requested access to additional publicly available information including:
- the model and/or trade name(s) for ARTG included goods
- specific intended purpose or indications of the good
- whether the device is, or operates using, an AI model or system, and information about the datasets that have been used to train and test the AI
- greater transparency regarding updates and new versions to help assess whether the outputs of a product are likely to change as a result
- in-app or in-product notifications tied to risks the user should be aware of when using it.
Stakeholders acknowledged the utility the unique device identifier (UDI) system will offer in assisting with transparency. However, access to information about all therapeutic goods, including software-based medical devices, may require further review of the Advertising Code and medical device labelling requirements.
Guidance
Most respondents (78%) from across all stakeholder cohorts noted that while many guidelines and standards relating to the regulation of therapeutic goods and provision of healthcare services exist, there are few resources explaining how they should be applied to emerging technologies including AI. Some members of the medical device industry acknowledged review of these resources and development of new standards and guidelines is currently underway.
These stakeholders also suggested a general review of the TGA website is required to improve accessibility, searchability, and software-related content generally. The majority of respondents (91%) requested the reinstatement of specific content and landing pages for consumers and health professionals. Members of the medical device industry requested the development of more guidance and resources for AI-specific topics, including explicit information about the technical requirements for different subtypes of AI. Stakeholders from the medical device industry also requested that guidance and information be made available through a range of mechanisms, including social media platforms.
Continuous change control for adaptive AI
Members of the medical device industry and health professionals noted that continuously adapting AI models are likely to require constant monitoring and real-time evaluation of performance to ensure the quality of system outputs does not degrade over time. These stakeholders also expressed that tailored AI models would likely need to be deployed by manufacturers to perform this rolling review of model performance.
Some stakeholders voiced a desire for clear guidance regarding what constitutes a ‘significant change’ in the context of software as a medical device. These stakeholders also requested more information be developed to clarify the regulatory processes associated with assessing and approving changes in adaptive models and systems.
International harmonisation
The majority (95%) of stakeholders from across the therapeutic goods sector stated international harmonisation and engagement with comparable jurisdictions should be maintained as far as possible to minimise regulatory burden and disruption to supply of innovative devices. Some therapeutic goods industry stakeholders suggested that Australia should be responsive and reactive to developments in other comparable jurisdictions and the identification of specific risks with respect to AI, rather than proactive.
We did
This consultation was undertaken under the Budget measure for Safe and Responsible AI and was intended to identify areas for improvement in the legislative framework to mitigate future risks and leverage opportunities associated with the use of AI within the therapeutic goods sector, including software as a medical device. The TGA has provided a report responding to the findings of the AI review to the Australian Government to inform policy decisions.
Published responses
View submitted responses where consent has been given to publish the response.
Overview
Artificial Intelligence (AI) has already made a difference to many lives, offering the potential to solve problems faster and opening up opportunities to get things done in smarter and better ways. If safely deployed, its development and adoption can improve wellbeing, quality of life and economic growth. At the same time, caution is needed, as AI has the potential to create or amplify harms to individuals, organisations, communities and social cohesion through risks associated with its use, including inherent bias, accuracy and data quality. These harms may disproportionately affect vulnerable and marginalised groups, including people with cognitive disability, displaced workers, older people, culturally and linguistically diverse communities, regional communities, women, girls, gender diverse people and people who are mentally or physically unwell.
Because much of the general public does not understand how AI works, there is low public trust that AI systems are being designed, developed, deployed and used safely and responsibly, particularly in high-risk settings. People are concerned about personal privacy, the impact of bias and errors, and a near future where people can’t tell real from fake.
Through the 2024-2025 Budget measure for Safe and Responsible AI, the Australian Government is acting to ensure that the design, development and deployment of AI systems in Australia in legitimate but high-risk settings is safe and can be relied upon, while ensuring the use of AI in low-risk settings can continue to flourish largely unimpeded.
As part of the Australian Government’s Department of Health and Aged Care, the Therapeutic Goods Administration (TGA) regulates therapeutic goods, including AI models and systems when they meet the definition of a medical device under Section 41BD of the Therapeutic Goods Act 1989. The TGA has issued this consultation paper as part of the review of priority areas in the health and aged care sector.
Why your views matter
This consultation seeks feedback on proposals identified for mitigating risks and leveraging opportunities associated with the use of AI models and systems across our regulated environment, and aligns with the Department’s broader review. Your feedback will help shape the Australian Government's approach to clarifying and strengthening the regulation of AI within the therapeutic goods sector.
What happens next
This consultation is an in-principle consultation intended to identify the areas of potential reform or refinement to address the risks and leverage the opportunities associated with the increasing use of AI models and systems in our sector. There are two main outcomes that will arise from this consultation:
- A report to the Australian Government.
- Based on a decision and guidance from the government, a potential forward program of work for the TGA which may include further consultation.
Report to government
Our report aims to:
- Identify areas of our legislative and regulatory framework where strengths currently exist and where potential changes could be considered to strengthen the mitigation of risks associated with AI models and systems.
- Raise matters for consideration that are broader than the TGA’s responsibilities, including legislation that is interconnected with AI in therapeutic goods and requires cross-agency consideration, such as privacy, data security, and cybersecurity.
- Emphasise the importance of international harmonisation being maintained as much as possible to assist access to products and export opportunities.
Events
- Webinar registrations open - TGA AI review - Clarifying and strengthening the regulation of AI
  From 26 Sep 2024 at 09:30 to 26 Sep 2024 at 11:30. This webinar will provide an overview of the consultation paper and an opportunity for stakeholders to ask questions.
  To register, please copy the following link into your browser:
  https://events.teams.microsoft.com/event/1c1989dd-5225-4653-ae85-a09539e7e570@34a3929c-73cf-4954-abfe-147dc3517892
- Webinar registrations open - TGA AI review - Clarifying and strengthening the regulation of AI
  From 1 Oct 2024 at 13:00 to 1 Oct 2024 at 15:00. This webinar will provide an overview of the consultation paper and an opportunity for stakeholders to ask questions.
  To register, please copy the following link into your browser:
  https://events.teams.microsoft.com/event/67af7a9f-3dc4-435b-9a64-dc3a7ae96185@34a3929c-73cf-4954-abfe-147dc3517892
Audiences
- Aboriginal and Torres Strait Islander People
- Seniors
- Men
- Women
- Carers and guardians
- Families
- Parents
- Young people
- Academics
- Consumers
- Non-government organisations
- State government agencies
- Commonwealth agencies
- Health professionals
- Health workforce
- General public
- Community groups
- Businesses
- Contracted Service Providers
- Aged care service providers
- Aged care workforce
- Aged care professionals
- Industry
- Sponsors
- Manufacturers
- BPRU staff
- Graduates
- Online and Publications staff
- Secretariat
- PCCD
- Health staff
- HPRG (TGA) Staff
- Prescription medicines
- Complementary medicines
- Over-the-counter medicines
- Medical Devices & IVDs
- Biologicals
- Other
Interests
- Hospitals
- e-Health
- Health technology
- Legislation
- Rural health services
- Home Care
- Mental health
- Prescription drugs
- Dental health
- Non-prescription medicines
- Strategic Policy
- Policy Development