I'm a Master's student at EPFL and a Natural Language Processing (NLP) researcher on the EPFL LLM team, collaborating with the NLP and MLO labs. My current focus is adapting Large Language Models (LLMs) for healthcare.
My latest project is MEDITRON, currently the best open-source medical LLM in the world.
For the past six months, we've worked with an amazing team of 20 researchers from EPFL and the Yale School of Medicine to extend LLaMA-2 with high-quality medical knowledge.
We've publicly released the weights for Meditron-70B and Meditron-7B on Hugging Face.
- 🦾 GitHub Repo: epfLLM/meditron
- 📖 Paper: MEDITRON-70B: Scaling Medical Pre-Training For Large Language Models (pre-print)
- 📢 Announcement: LinkedIn post
- 🗞️ Press Release: EPFL's new Large Language Model for Medical Knowledge
- 🧬 NEW medical dataset: Clinical Guidelines Corpus
- MSc in Computer Science @ EPFL, Swiss Federal Institute of Technology
- BSc Honours in Computer Science and Mathematics @ McGill University