---
title: README
emoji: πŸ’»
colorFrom: blue
colorTo: purple
sdk: static
pinned: false
---

# LLM-Drop

πŸ€— LLM-Drop hosts research artifacts for efficient foundation models, with a focus on large language models and unified multimodal models.

Our research studies how modern foundation models can be made more efficient while preserving their core capabilities. This page collects model weights, code links, project pages, and related resources from our projects.

## πŸ“Œ Projects

### 🧩 LLM-Drop

*Uncovering the Redundancy in Transformers via a Unified Study of Layer Dropping*
TMLR 2026

### πŸ” Pruning on Representations

*Demystifying When Pruning Works via Representation Hierarchies*

### 🌐 Sparse Unified Models

*Understanding and Harnessing Sparsity in Unified Multimodal Models*

## πŸ“¬ Contact

For questions or collaborations, please contact: