Core Concepts
Generative LLMs efficiently summarize clinical dialogues through prompt tuning.
Stats
We developed prompt-tuning algorithms that instruct generative LLMs of up to 20 billion parameters to summarize clinical text.
The GatorTronGPT-20B model achieved the best performance on all evaluation metrics.
Quotes
"Prompt-based learning is the key technology that utilizes a ‘prompt’—additional instructional information added to the input data—to guide LLMs in generating text that follows these instructions."
"The proposed solution has a low computing cost as the LLM parameters are not updated during prompt-tuning."
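The quotes above describe the core mechanism: trainable prompt vectors steer a frozen LLM, so no model weights change during tuning. A minimal NumPy sketch of that idea follows; the toy embedding table, pooled forward pass, and finite-difference update are illustrative stand-ins, not the paper's GatorTronGPT implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "LLM": a toy embedding table and output head (stand-ins, never updated).
vocab, d_model, n_prompt = 50, 8, 4
embed = rng.normal(size=(vocab, d_model))   # frozen
head = rng.normal(size=(d_model, vocab))    # frozen

# Trainable soft prompt: the only parameters updated during prompt tuning.
soft_prompt = rng.normal(size=(n_prompt, d_model), scale=0.1)

def forward(token_ids, prompt):
    # Prepend soft-prompt vectors to the token embeddings, then mean-pool
    # and project to vocabulary logits (a drastic simplification of an LLM).
    x = np.vstack([prompt, embed[token_ids]])
    return x.mean(axis=0) @ head

def loss(prompt, token_ids, target):
    # Cross-entropy for a single target token.
    logits = forward(token_ids, prompt)
    z = logits - logits.max()
    return -(z[target] - np.log(np.exp(z).sum()))

# One gradient-descent step on the prompt alone, via finite differences.
tokens, target = np.array([3, 7, 11]), 5
lr, eps = 0.1, 1e-5
grad = np.zeros_like(soft_prompt)
for i in range(n_prompt):
    for j in range(d_model):
        p = soft_prompt.copy()
        p[i, j] += eps
        grad[i, j] = (loss(p, tokens, target) - loss(soft_prompt, tokens, target)) / eps
updated_prompt = soft_prompt - lr * grad
# embed and head are untouched: the LLM stays frozen, keeping compute cost low.
```

Because only the small prompt matrix receives gradients, the memory and compute cost of tuning stays far below full fine-tuning, which is the efficiency argument the second quote makes.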