Artificial intelligence living evidence

New and promising innovations in the use of artificial intelligence (AI) to support healthcare.

There are three topics in this series.

N.B.: While the living tables note evidence of model accuracy and precision, many models require local data for rollout and so are not (yet) directly transferable to new clinical contexts.

Definitions and background

AI is the ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings. Automated decision-making means any technology that either assists or replaces the judgment of human decision-makers.1

AI includes a vast array of systems, software, intelligent processes and tools, ranging in sophistication and complexity. While some applications – such as ChatGPT – are relatively new, many others have been in use for decades.

Functionally, AI tools can:2

  • gather and collate data, e.g. collect and transmit vital signs from wearable technology such as smartwatches
  • apply rules to complex data sources, e.g. incorporate algorithms, decision trees and alerts within electronic health record (EHR) systems
  • learn, e.g. identify patterns in big datasets. This has applications for imaging analysis and predicting patient treatment responses, including for precision medicine
  • act and perform, e.g. compose clinical notes or discharge summaries (generative AI), perform surgery independently.3-5
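The "apply rules" capability above can be illustrated with a minimal sketch: a threshold-based check over vital signs, in the spirit of the decision-tree alerts embedded in EHR systems. The thresholds, field names and messages here are hypothetical and for illustration only, not drawn from any real clinical system.

```python
# Illustrative rule-based alerting over vital-sign data.
# All thresholds and field names are hypothetical examples.

def vital_sign_alerts(vitals: dict) -> list:
    """Return a list of alert strings for out-of-range vital signs."""
    rules = [
        ("heart_rate", lambda v: v > 130 or v < 40, "heart rate out of range"),
        ("spo2", lambda v: v < 90, "low oxygen saturation"),
        ("temp_c", lambda v: v >= 38.5, "fever"),
    ]
    alerts = []
    for field, is_abnormal, message in rules:
        value = vitals.get(field)
        # Skip fields that were not recorded; flag any that breach a rule.
        if value is not None and is_abnormal(value):
            alerts.append(f"{message} ({field}={value})")
    return alerts

print(vital_sign_alerts({"heart_rate": 142, "spo2": 96, "temp_c": 37.1}))
# → ['heart rate out of range (heart_rate=142)']
```

Real EHR alerting systems are far more elaborate (trend analysis, patient-specific baselines, escalation pathways), but the core pattern is the same: explicit rules applied to structured clinical data.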

In medicine, AI also has the power to drive innovation in:

  • new science, such as drug discovery6
  • new medicine, such as innovative ways to promote consumer engagement
  • new organisation, such as novel ways to organise administrative tasks, rostering, supply chains and finance.

Challenges in the field and the state of the evidence:

  • Although AI systems have demonstrated success in a wide variety of research studies, relatively few AI tools have been widely translated into medical practice.7, 8
  • Randomised controlled trials involving AI often have suboptimal quality of reporting practices and transparency.9, 10
  • New AI methods can be difficult to fit to existing healthcare systems. They may work only in a narrow domain, and they can have built-in biases that disproportionately affect already marginalised groups.11 Assessing AI’s readiness for real-life clinical application is therefore vital and should incorporate a range of stakeholders.12
  • Despite the increasingly rich AI literature in healthcare, the research mainly concentrates on a few disease types: cancer, nervous system disease and cardiovascular disease.13
  • Systemic human biases often make their way into AI models, including widespread and deeply rooted bias based on sex and gender, race and ethnicity, age, socioeconomic status, geographic location, and urban or rural contexts. Biased and imbalanced datasets are the primary factors responsible for AI biases in the healthcare field.14-16 This can generate misleading or inaccurate information that could pose risks to health, equity and inclusiveness.17, 18

References

  1. Australian Medical Association. Artificial Intelligence in Healthcare. Australia: AMA; 2023 [cited 19 Dec 2023]. Available from: https://www.ama.com.au/articles/artificial-intelligence-healthcare
  2. Davenport T, Kalakota R. The potential for artificial intelligence in healthcare. Future Healthc J. 2019 Jun;6(2):94-8. DOI: 10.7861/futurehosp.6-2-94
  3. Han J, Davids J, Ashrafian H, et al. A systematic review of robotic surgery: From supervised paradigms to fully autonomous robotic approaches. The International Journal of Medical Robotics and Computer Assisted Surgery. 2022 Apr;18(2):e2358. DOI: 10.1002/rcs.2358
  4. Shoja M, Van de Ridder JMM, Rajput V. The Emerging Role of Generative Artificial Intelligence in Medical Education, Research, and Practice. Cureus. 2023 Jun 24. DOI: 10.7759/cureus.40883
  5. Koohi-Moghadam M, Bae KT. Generative AI in Medical Imaging: Applications, Challenges, and Ethics. Journal of Medical Systems. 2023 Aug 31;47(1):94. DOI: 10.1007/s10916-023-01987-4
  6. Nature Editorial Team. AI’s potential to accelerate drug discovery needs a reality check. London: Nature; 2023 [cited 17 Oct 2023]. Available from: https://www.nature.com/articles/d41586-023-03172-6
  7. Rajpurkar P, Chen E, Banerjee O, et al. AI in health and medicine. Nature Medicine. 2022 Jan;28(1):31-8. DOI: 10.1038/s41591-021-01614-0
  8. Banerji CRS, Chakraborti T, Harbron C, et al. Clinical AI tools must convey predictive uncertainty for each individual patient. Nature Medicine. 2023 Oct 11. DOI: 10.1038/s41591-023-02562-7
  9. Shahzad R, Ayub B, Siddiqui MAR. Quality of reporting of randomised controlled trials of artificial intelligence in healthcare: a systematic review. BMJ Open. 2022;12(9):e061519. DOI: 10.1136/bmjopen-2022-061519
  10. Plana D, Shung DL, Grimshaw AA, et al. Randomized Clinical Trials of Machine Learning Interventions in Health Care: A Systematic Review. JAMA Network Open. 2022;5(9):e2233946-e. DOI: 10.1001/jamanetworkopen.2022.33946
  11. Beam AL, Drazen JM, Kohane IS, et al. Artificial Intelligence in Medicine. New England Journal of Medicine. 2023 Mar 30;388(13):1220-1. DOI: 10.1056/NEJMe2206291
  12. Hogg HDJ, Al-Zubaidy M, Talks J, et al. Stakeholder Perspectives of Clinical Artificial Intelligence Implementation: Systematic Review of Qualitative Evidence. J Med Internet Res. 2023 Jan 10;25:e39742. DOI: 10.2196/39742
  13. Jiang F, Jiang Y, Zhi H, et al. Artificial intelligence in healthcare: past, present and future. Stroke and Vascular Neurology. 2017;2(4):230-43. DOI: 10.1136/svn-2017-000101
  14. European Parliament. Artificial intelligence in healthcare: Applications, risks, and ethical and societal impacts. European Parliament; 2022 [cited 13 Jul 2023]. Available from: https://www.europarl.europa.eu/RegData/etudes/STUD/2022/729512/EPRS_STU(2022)729512_EN.pdf
  15. Grant C. Algorithms Are Making Decisions About Health Care, Which May Only Worsen Medical Racism. USA: American Civil Liberties Union; 2022 [cited 13 Jul 2023]. Available from: https://www.aclu.org/news/privacy-technology/algorithms-in-health-care-may-worsen-medical-racism
  16. Abdulazeem H, Whitelaw S, Schauberger G, et al. A systematic review of clinical health conditions predicted by machine learning diagnostic and prognostic models trained or validated using real-world primary health care data. PLoS One. 2023;18(9):e0274276. DOI: 10.1371/journal.pone.0274276
  17. World Health Organization (WHO). WHO calls for safe and ethical AI for health. Geneva: WHO; 2023 [cited 24 Jul 2023]. Available from: https://www.who.int/news/item/16-05-2023-who-calls-for-safe-and-ethical-ai-for-health
  18. Dorr DA, Adams L, Embí P. Harnessing the Promise of Artificial Intelligence Responsibly. JAMA. 2023;329(16):1347-8. DOI: 10.1001/jama.2023.2771

Living evidence tables include some links to low quality sources and an assessment of the original source has not been undertaken. Sources are monitored regularly but due to rapidly emerging information, tables may not always reflect the most current evidence. The tables are not peer reviewed, and inclusion does not imply official recommendation nor endorsement of NSW Health.

Last updated on 19 Jan 2024