
Sparsity in LLMs Workshop at ICLR 2025

@sparseLLMs

Workshop on Sparsity in LLMs: Deep Dive into Mixture of Experts, Quantization, Hardware, and Inference @iclr_conf 2025.

Pinned

We are happy to announce that the Workshop on Sparsity in LLMs will take place @iclr_conf in Singapore! For details: sparsellm.org Organizers: @TianlongChen4, @utkuevci, @yanii, @BerivanISIK, @Shiwei_Liu66, @adnan_ahmad1306, @alexinowak


Sparsity in LLMs Workshop at ICLR 2025 reposted

@AggieInCA giving the first Oral talk at #ICLR2025 @sparseLLMs workshop


Sparsity in LLMs Workshop at ICLR 2025 reposted

Presenting shortly! 👉🏼 Mol-MoE: leveraging model merging and RLHF for test-time steering of molecular properties. 📆 today, 11:15am to 12:15pm 📍 Poster session #1, GEM Bio Workshop @gembioworkshop @sparseLLMs #ICLR #ICLR2025

In Mol-MoE, we propose a framework to train router networks to reuse property-specific molecule generators. This makes it possible to personalize drug generation at test time by following property preferences! We discuss some challenges. @proceduralia @pierrelux arxiv.org/abs/2502.05633
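As a rough illustration of the idea in this tweet: a router maps user-specified property preferences to mixing weights over property-specific expert models, which can then be merged in weight space. A minimal sketch; `PreferenceRouter` and `merge_experts` are hypothetical names, and the actual Mol-MoE training setup (RLHF over the router) is in the paper:

```python
import torch
import torch.nn as nn

class PreferenceRouter(nn.Module):
    # Maps a vector of desired property scores (e.g. solubility, toxicity)
    # to mixing weights over property-specific expert generators.
    def __init__(self, num_properties: int, num_experts: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_properties, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_experts),
        )

    def forward(self, preferences: torch.Tensor) -> torch.Tensor:
        # Softmax makes the expert weights a convex combination.
        return torch.softmax(self.net(preferences), dim=-1)

def merge_experts(expert_state_dicts, weights):
    # Weight-space merging: build one model as a weighted sum of experts.
    merged = {}
    for name in expert_state_dicts[0]:
        merged[name] = sum(w * sd[name] for w, sd in zip(weights, expert_state_dicts))
    return merged
```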



Sparsity in LLMs Workshop at ICLR 2025 reposted

@black_samorez giving his oral talk at the #ICLR2025 @sparseLLMs workshop


Sparsity in LLMs Workshop at ICLR 2025 reposted

We are presenting “Prefix and output length-aware scheduling for efficient online LLM inference” at the ICLR 2025 (@iclr_conf) Sparsity in LLMs workshop (@sparseLLMs). 🪫 Challenge: LLM inference in data centers benefits from data parallelism. How can we exploit patterns in…

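The tweet is truncated, so the mechanism below is only a guess from the title: a toy scheduler that groups requests sharing a prompt prefix (so KV caches can be reused) and serves shorter predicted outputs first within each group. `Request` and `schedule` are hypothetical, not the paper's actual system:

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class Request:
    predicted_output_len: int              # estimate for shortest-job-first
    prompt: str = field(compare=False)

def schedule(requests, batch_size=8, prefix_len=32):
    # Group requests sharing a prompt prefix so their KV caches can be
    # reused, then serve shorter predicted outputs first in each group.
    by_prefix = {}
    for r in requests:
        by_prefix.setdefault(r.prompt[:prefix_len], []).append(r)
    batches = []
    for group in by_prefix.values():
        group.sort()                       # shortest predicted output first
        for i in range(0, len(group), batch_size):
            batches.append(group[i:i + batch_size])
    return batches
```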

Sparsity in LLMs Workshop at ICLR 2025 reposted

@tydsh giving his invited talk at the #ICLR2025 @sparseLLMs workshop!


Sparsity in LLMs Workshop at ICLR 2025 reposted

our workshop on sparsity in LLMs is starting soon in Hall 4.7! we’re starting strong with an invited talk from @DAlistarh and an exciting oral on scaling laws for MoEs!


Sparsity in LLMs Workshop at ICLR 2025 reposted

@DAlistarh giving his invited talk at the #ICLR2025 @sparseLLMs workshop


Sparsity in LLMs Workshop at ICLR 2025 reposted

Our ICLR 2025 Workshop on Sparsity in LLMs (@sparseLLMs) kicks off with a talk by @DAlistarh on near-lossless (~1% performance drop) LLM compression using quantization across various benchmarks.

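For context on the kind of quantization the talk refers to: a minimal sketch of standard group-wise symmetric 4-bit weight quantization (not the speaker's actual algorithm, just the common baseline such methods build on):

```python
import numpy as np

def quantize_int4(w, group_size=128):
    # Group-wise symmetric quantization: each group of weights shares one
    # FP scale; values are rounded into the int4 range [-8, 7].
    w = w.reshape(-1, group_size)
    scale = np.maximum(np.abs(w).max(axis=1, keepdims=True) / 7.0, 1e-8)
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize_int4(q, scale):
    return q.astype(np.float32) * scale

w = np.random.randn(4096).astype(np.float32)
q, s = quantize_int4(w)
err = np.abs(dequantize_int4(q, s).reshape(-1) - w).mean()
print(f"mean abs quantization error: {err:.4f}")
```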

Sparsity in LLMs Workshop at ICLR 2025 reposted

First poster session at the #ICLR2025 @sparseLLMs workshop


Sparsity in LLMs Workshop at ICLR 2025 reposted

a PACKED hall for @tydsh's talk at our sparsity in LLMs workshop - not surprising! we have another oral right after this, and then we'll have the first of 2 poster sessions before lunch! @iclr_conf


Sparsity in LLMs Workshop at ICLR 2025 reposted

@DAlistarh giving his invited talk at the #ICLR2025 @sparseLLMs workshop now!


Sparsity in LLMs Workshop at ICLR 2025 reposted
#ICLR2025 @sparseLLMs Workshop Panel with @PavloMolchanov @ayazdanb @DAlistarh @realDanFu @YangYou1991 and Olivia Hsu, moderated by @PandaAshwinee

Sparsity in LLMs Workshop at ICLR 2025 reposted

If you’re at #ICLR2025, go watch @AggieInCA give an oral presentation at the @SparseLLMs workshop on scaling laws for pretraining MoE LMs! Had a great time co-leading this project with @samira_abnar & @AggieInCA at Apple MLR last summer. When: Sun Apr 27, 9:30a Where: Hall 4-07

🚨 One question that has always intrigued me is the role of different ways to increase a model's capacity: parameters, parallelizable compute, or sequential compute? We explored this through the lens of MoEs:

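A back-of-the-envelope illustration of the capacity axes this tweet contrasts: in an MoE layer, total parameters scale with the number of experts while per-token compute scales only with the number of experts actually routed to. Illustrative counting only, not the paper's scaling-law methodology:

```python
def moe_layer_cost(d_model, d_ff, num_experts, top_k):
    # Total parameter count grows with num_experts, while per-token
    # compute grows only with top_k (the experts a token is routed to).
    expert_params = 2 * d_model * d_ff       # up- and down-projection
    total_params = num_experts * expert_params
    flops_per_token = 2 * top_k * expert_params  # ~2 FLOPs per active weight
    return total_params, flops_per_token

# 8 experts with top-2 routing: 8x the parameters of a dense FFN,
# but only 2x its per-token FLOPs.
print(moe_layer_cost(d_model=1024, d_ff=4096, num_experts=8, top_k=2))
```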


The Sparse LLM workshop will run on Sunday with two poster sessions, a mentoring session, 4 spotlight talks, 4 invited talks and a panel session. We'll host an amazing lineup of researchers: @DAlistarh @vithursant19 @tydsh @ayazdanb @gkdziugaite Olivia Hsu @PavloMolchanov Yang You


Sparsity in LLMs Workshop at ICLR 2025 reposted

I will be travelling to Singapore 🇸🇬 this week for the ICLR 2025 Workshop on Sparsity in LLMs (SLLM) that I'm co-organizing! We have an exciting lineup of invited speakers and panelists including @DAlistarh, @gkdziugaite, @PavloMolchanov, @vithursant19, @tydsh and @ayazdanb.


Sparsity in LLMs Workshop at ICLR 2025 reposted

Check out this post highlighting Apple research that will be presented at ICLR 2025 in 🇸🇬 this week. I will be at ICLR and will be presenting some of our work (led by @samira_abnar) at the SLLM @sparseLLMs workshop. Happy to chat about JEPAs as well!

New post: "Apple Machine Learning Research at @iclr_conf 2025" - highlighting a selection of the many Apple #ML research papers to be presented at the conference this week: machinelearning.apple.com/research/iclr-…



Sparsity in LLMs Workshop at ICLR 2025 reposted

Our QuEST paper was selected for Oral Presentation at ICLR @sparseLLMs workshop! QuEST is the first algorithm with Pareto-optimal LLM training for 4bit weights/activations, and can even train accurate 1-bit LLMs. Paper: arxiv.org/abs/2502.05003 Code: github.com/IST-DASLab/QuE…

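For readers unfamiliar with quantized training: the baseline trick such methods improve on is the straight-through estimator (STE), which fake-quantizes in the forward pass and passes gradients through unchanged. A minimal sketch only; QuEST's actual estimator is more sophisticated (see the paper):

```python
import torch

class FakeQuant4(torch.autograd.Function):
    # Forward: fake-quantize to 4-bit; backward: straight-through estimator.
    @staticmethod
    def forward(ctx, x):
        qmax = 7                                    # int4 range [-8, 7]
        scale = x.abs().max().clamp(min=1e-8) / qmax
        return torch.clamp(torch.round(x / scale), -8, qmax) * scale

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output                          # pass gradients unchanged

x = torch.randn(16, requires_grad=True)
FakeQuant4.apply(x).sum().backward()
print(x.grad)                                       # all ones: the STE at work
```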

