---
language: en
license: mit
tags:
- electrocardiogram
- multimodal-learning
- zero-shot-learning
- medical-ai
- interpretability
---

# Interpretable Multimodal Zero-Shot ECG Diagnosis (ZETA)

[![📄 Paper](https://img.shields.io/badge/Paper-Nature-blue)](https://www.nature.com/articles/s44325-025-00099-x) [![💻 GitHub](https://img.shields.io/badge/Code-GitHub-black)](https://github.com/Tang-Jia-Lu/Zeta)

## 🧠 Overview

We propose **ZETA**, a zero-shot multimodal framework for ECG diagnosis that aligns ECG signals with **structured clinical observations**. Instead of predicting disease labels directly, ZETA **compares an ECG signal against positive and negative clinical evidence**, mimicking a clinician's differential diagnosis.

## 🖼️ Framework

## ⚙️ Method

- **Structured observations**: LLM-generated and expert-validated
- **Multimodal alignment**: pretrained ECG-text model
- **Inference**:
  - ✅ match with positive observations
  - ❌ match with negative observations

The prediction is based on **relative evidence strength** between the two sets of matches.

## 📊 Results

## 📦 Checkpoint

```bash
ZETA/checkpoints/best.pt
```
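As a minimal sketch of the zero-shot inference step described above: once the pretrained ECG-text model has produced embeddings, the prediction reduces to comparing evidence strength for positive versus negative observations. The function names and the toy embeddings below are hypothetical, purely for illustration; see the GitHub repo for the actual ZETA implementation.

```python
import numpy as np

def cosine_sim(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def zero_shot_score(ecg_emb, pos_obs_embs, neg_obs_embs) -> float:
    """Relative evidence strength: mean similarity to positive
    observations minus mean similarity to negative observations."""
    pos = np.mean([cosine_sim(ecg_emb, e) for e in pos_obs_embs])
    neg = np.mean([cosine_sim(ecg_emb, e) for e in neg_obs_embs])
    return pos - neg

# Toy example with random vectors standing in for embeddings
# (real ones would come from the pretrained ECG-text model).
rng = np.random.default_rng(0)
ecg = rng.normal(size=128)
pos = [ecg + 0.1 * rng.normal(size=128)]  # observation aligned with the ECG
neg = [rng.normal(size=128)]              # unrelated observation
print(zero_shot_score(ecg, pos, neg) > 0)
```

A positive score means the ECG aligns more strongly with the positive observations, so the corresponding diagnosis is favored.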