


transcripts of clinical encounters will likely produce significantly more accurate results. Some of these models are already available, for example, Alphabet's Med-PaLM.⁹ Some companies even offer end-to-end pipelines that run from speech-to-text transcription all the way to note generation.¹⁰
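Such a pipeline can be illustrated with a minimal sketch, assuming the open-source Whisper package for transcription and a generic Hugging Face summarization model as a stand-in for a clinical note generator; the model choices and the draft-note format are illustrative assumptions, not a description of any particular vendor's product.

```python
# Minimal transcription-to-note pipeline sketch (illustrative assumptions only).
# Requires the `openai-whisper` and `transformers` packages; models are placeholders.
import whisper
from transformers import pipeline

def draft_note(audio_path: str) -> str:
    # Step 1: speech-to-text transcription of the recorded encounter.
    asr_model = whisper.load_model("base")
    transcript = asr_model.transcribe(audio_path)["text"]

    # Step 2: condense the transcript into a draft note with a general-purpose
    # summarization model (a stand-in for a clinically tuned LLM). Long
    # transcripts are simply truncated in this toy example.
    summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
    summary = summarizer(transcript, max_length=256, min_length=64,
                         truncation=True)[0]["summary_text"]

    # Step 3: return a draft for clinician review; the output is never final.
    return "DRAFT NOTE (requires clinician review):\n" + summary
```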
3.4. Retrieval-augmented generation (RAG)

LLMs store knowledge in their parameters. However, their ability to retrieve and present precise information remains limited, leading to subpar performance in knowledge-intensive tasks compared with more task-specific architectures. This can be overcome by giving the model access to explicit "non-parametric" data at inference time, an approach known as retrieval-augmented generation (RAG). Combining parametric knowledge with explicit non-parametric information can lead to much more accurate output.¹¹ Applied to AI scribes, this can potentially improve the quality of the generated note significantly.
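As an illustration, the sketch below retrieves the most relevant snippets from a small local knowledge base (for example, documentation guidelines or note templates) and prepends them to the prompt before note generation. The embedding model, the example snippets, and the build_prompt helper are illustrative assumptions, not part of any published AI scribe.

```python
# Minimal RAG sketch: retrieve relevant reference snippets and prepend them
# to the prompt so the LLM is not limited to its parametric knowledge.
# Assumes the `sentence-transformers` package; contents are placeholders.
import numpy as np
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # placeholder embedding model

# Tiny stand-in for a real knowledge base (guidelines, templates, etc.).
knowledge_base = [
    "Document allergies and adverse drug reactions in every note.",
    "A SOAP note contains Subjective, Objective, Assessment, and Plan sections.",
]
kb_vectors = embedder.encode(knowledge_base, normalize_embeddings=True)

def retrieve(query: str, top_k: int = 2) -> list[str]:
    # Cosine similarity reduces to a dot product on normalized vectors.
    query_vec = embedder.encode([query], normalize_embeddings=True)[0]
    scores = kb_vectors @ query_vec
    best = np.argsort(scores)[::-1][:top_k]
    return [knowledge_base[i] for i in best]

def build_prompt(transcript: str) -> str:
    # The retrieved snippets become explicit, non-parametric context for
    # whichever LLM generates the note.
    context = "\n".join(retrieve(transcript))
    return (
        "Reference material:\n" + context
        + "\n\nTranscript:\n" + transcript
        + "\n\nDraft a clinical note based on the transcript."
    )
```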
3.5. Small LLMs

Another option is the use of "tiny" or "small" LLMs, that is, LLMs with a smaller number of parameters. The idea is that large LLMs contain vast amounts of generic knowledge that adds little value to a specific task. Smaller models can therefore be trained and fine-tuned on smaller amounts of more specific, high-quality data to improve their performance on that task while keeping their size, and thus their computational expense, low. The performance of these smaller LLMs for text summarization has been shown to be poor compared with larger LLMs.¹² However, there is potential for improvement through various methodologies. For example, knowledge can be transferred from a larger LLM to a smaller one to improve the "reasoning" of the smaller model; with this approach, the smaller LLM can even outperform some larger LLMs on certain tasks.¹³ In the context of AI scribes, this can be very beneficial: not only can the output be improved, but the financial burden of implementing in-house LLMs can also be reduced significantly.
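A common way to transfer knowledge from a large model to a small one is distillation, in which the student model is trained to match the teacher's output distribution. The sketch below shows only the core loss computation under that generic formulation; it is not the exact method of the cited work, and the temperature value, model loading, and training loop are assumed or omitted.

```python
# Generic knowledge-distillation loss sketch: the small "student" model is
# trained to match the softened output distribution of the large "teacher".
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    # Softening both distributions with a temperature exposes the teacher's
    # relative preferences over tokens to the student.
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # KL divergence between teacher and student, rescaled by T^2 so the
    # gradient magnitude stays comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * temperature ** 2
```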

3.6. In-house AI solutions

To ensure true data security, complete control over data, and customization, in-house AI solutions could be implemented. Training and deploying LLMs is computationally heavy and necessitates a significant financial investment in the initial phase. However, this initial investment will pay off in the long term and may even prove more profitable by reducing physician burnout, improving efficiency, and ensuring data security. It will also offer seamless integration with in-house systems, reducing technical difficulties.
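As a rough illustration of what an in-house deployment involves, the sketch below loads an open-weight model on local hardware with the Hugging Face transformers library so that transcripts never leave the institution's network. The model identifier is hypothetical, and a real deployment would add access controls, monitoring, and clinical validation.

```python
# Sketch of serving an open-weight model on local infrastructure.
# The model name is a hypothetical placeholder; assumes the `torch`,
# `transformers`, and `accelerate` packages.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "example-org/open-weight-clinical-llm"  # hypothetical identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_NAME,
    torch_dtype=torch.float16,  # half precision to reduce GPU memory use
    device_map="auto",          # place layers on available local hardware
)

def generate_note(prompt: str, max_new_tokens: int = 512) -> str:
    # All inference happens on-premises; nothing is sent to external APIs.
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```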
3.7. Involvement of clinicians in co-design and implementation

Clinicians' input in the design and implementation of any new system in healthcare is crucial for its success. As the primary users of AI scribe applications, clinicians have a deep understanding of their operational needs and workflow requirements. Involving them will not only make such applications more effective but also improve adoption. Similarly, ongoing education and training of clinicians, along with the collection of their feedback, will ensure seamless integration as well as continuous improvement of the application.

4. Conclusion and prospects

Clinical documentation is a crucial component of modern healthcare, but it also contributes significantly to clinician burnout. AI-based technologies, such as AI scribes, offer a potential solution to alleviate this burden. Even though the current technology is not without challenges, the prospects are promising. Anticipating widespread demand, EHR vendors will likely incorporate AI models into the core of their software, enabling not just AI scribes but also improved clinical decision support systems, automatic summarization of medical histories, and research. Furthermore, AI also enables advances in patient-facing EHR systems that allow for documentation and the provision of personalized information.

Acknowledgments

None.

Funding

None.

Conflict of interest

The author is the founder of a company that specializes in AI scribe services, which is relevant to the topic of this article. This has not influenced the content of the manuscript. No reference to the author's company is made, but it is declared for full transparency.

Author contributions

This is a single-authored article.

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.


