A large AI/ML language model for use in electronic health records

There is growing interest in using artificial intelligence (AI) and electronic health records (EHRs) to improve healthcare delivery and outcomes. EHRs contain large amounts of structured data (e.g., disease and medication codes) and unstructured data (e.g., clinical narratives), but entering information into structured fields during clinical documentation can be a burden for healthcare providers. Natural language processing (NLP) is a key technology that enables medical AI systems to understand the clinical language used in healthcare, and most NLP solutions are based on deep learning models built on neural network architectures such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs).
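
To make the structured/unstructured distinction concrete, the sketch below shows a purely hypothetical patient record. The field names, codes, and note text are assumptions made for illustration only and are not drawn from the article or from real data.

    # A hypothetical EHR record illustrating structured vs. unstructured data.
    # All field names, codes, and text below are invented for illustration.
    ehr_record = {
        "patient_id": "example-001",
        "structured": {
            "diagnosis_codes": ["I10", "E11.9"],        # illustrative ICD-10-style codes
            "medication_codes": ["MED-0001", "MED-0002"] # placeholder medication codes
        },
        "unstructured": {
            "clinical_note": (
                "Patient presents for follow-up of hypertension and type 2 "
                "diabetes. Reports good adherence to metformin; no new complaints."
            )
        },
    }

    # NLP models operate on the free-text note, while the structured codes
    # can be consumed directly by rule-based or statistical systems.
    print(ehr_record["unstructured"]["clinical_note"])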

More recently, transformer architectures, which rely on a self-attention mechanism, have become the state of the art, achieving the best results on many NLP benchmarks. These models are typically trained in two stages: language-model pretraining on a large corpus of unlabeled text, followed by fine-tuning on specific tasks with labeled data.
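
As a concrete illustration of this two-stage workflow, the sketch below loads a transformer that has already been pretrained on clinical text and fine-tunes it on a tiny, made-up classification task using the Hugging Face transformers and datasets libraries. The checkpoint name, labels, and example sentences are assumptions for illustration; the article does not prescribe any particular model or library.

    # Minimal sketch of pretraining + fine-tuning (stage 1 is reused, stage 2 is shown).
    from transformers import (
        AutoTokenizer,
        AutoModelForSequenceClassification,
        Trainer,
        TrainingArguments,
    )
    from datasets import Dataset

    # Stage 1 (pretraining) has already been done; we download pretrained weights.
    # "emilyalsentzer/Bio_ClinicalBERT" is one publicly available clinical-domain
    # checkpoint, used here only as an example.
    checkpoint = "emilyalsentzer/Bio_ClinicalBERT"
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

    # Stage 2 (fine-tuning): a tiny, invented labeled dataset of clinical snippets.
    examples = {
        "text": [
            "Patient denies chest pain or shortness of breath.",
            "Severe substernal chest pain radiating to the left arm.",
        ],
        "label": [0, 1],
    }
    dataset = Dataset.from_dict(examples)

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

    dataset = dataset.map(tokenize, batched=True)

    training_args = TrainingArguments(
        output_dir="clinical-finetune",
        num_train_epochs=1,
        per_device_train_batch_size=2,
    )

    trainer = Trainer(model=model, args=training_args, train_dataset=dataset)
    trainer.train()

In practice, the pretraining stage itself requires far larger corpora and compute than the toy fine-tuning step shown here; the sketch only illustrates how the two stages fit together.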

This article explores the use of large transformer models in the clinical domain, where the sensitive nature of the data calls for careful ethical consideration.



