MAGMA – Multimodal Augmentation of Generative Models through Adapter-based Finetuning, 21.10.2024, Lea Chrysanthopoulou
BLIP-2: Bootstrapping Language-Image Pre-training with Frozen Image Encoders and Large Language Models, 28.10.2024, Louis Müller
Flamingo: a Visual Language Model for Few-Shot Learning, 04.11.2024, Nilushika Nadaraj
Self-supervised transformer for sparse and irregularly sampled multivariate clinical time-series, 04.11.2024, Zhao Gwinn
Research on Multimodal Fusion of Temporal Electronic Medical Records, 11.11.2024, Zhaokun Wang
PromptEHR: Conditional Electronic Healthcare Records Generation with Prompt Learning, 11.11.2024, Lisanne Rüh
A Multimodal Transformer: Fusing Clinical Notes with Structured EHR Data for Interpretable In-Hospital Mortality Prediction, 18.11.2024, Gianina Krone
Deep multi-modal intermediate fusion of clinical record and time series data in mortality prediction, 18.11.2024, Jakob Forstmann
FuseMoE: Mixture-of-Experts Transformers for Fleximodal Fusion, 25.11.2024, Yutong He
Learning Missing Modal Electronic Health Records with Unified Multi-modal Data Embedding and Modality-Aware Attention, 25.11.2024, Xiran Hu
Integrating Physiological Time Series and Clinical Notes with Transformer for Early Prediction of Sepsis, 02.12.2024, Finn Hillengass
Multimodal Pretraining of Medical Time Series and Notes, 02.12.2024, Junhong Cai
Learning to write notes in electronic health records, 09.12.2024, Tim Kolber
The shaky foundations of large language models and foundation models for electronic health records, 09.12.2024, Maya Arseven
Improving Medical Predictions by Irregular Multimodal Electronic Health Records Modeling, 16.12.2024, Shaowei Zhang
Time Series as Images: Vision Transformer for Irregularly Sampled Time Series, 16.12.2024, Weiguo Li
Hierarchical Pretraining on Multimodal Electronic Health Records, 07.01.2025, Mengtong Guo
Subtle variation in sepsis-III definitions markedly influences predictive performance within and across methods, 07.01.2025, Yaxi Zhuang
Language Models Still Struggle to Zero-shot Reason about Time Series, 14.01.2025, Yuzhen He
OR Unleash The Power of Pre-Trained Language Models for Irregularly Sampled Time Series, 14.01.2025, Yuzhen He