
Abhinav Menon
@anscombes_razor
always ready to learn something! professional: pursuing my PhD, working in interpretability in NLP. personal: movies, languages, books, and history
You might like
New paper alert! 🧵👇 We show representations of concepts seen by a model during pretraining can be morphed to reflect novel semantics! We do this by building a task based on the conceptual role semantics "theory of meaning"--an idea I'd been wanting to pursue for SO long! 1/n
Check out our recent work on identifying the limitations and properties of SAEs! We use formal languages as a synthetic testbed to evaluate the methodology and suggest further steps.
Paper alert––*Awarded best paper* at NeurIPS workshop on Foundation Model Interventions! 🧵👇 We analyze the (in)abilities of SAEs by relating them to the field of disentangled rep. learning, where limitations of AE based interpretability protocols have been well established!🤯
Can RL fine-tuning endow MLLMs with fine-grained visual understanding? Using our training recipe, we outperform SOTA open-source MLLMs on fine-grained visual discrimination with ClipCap, a mere 200M param simplification of modern MLLMs!!! 🚨Introducing No Detail Left Behind:…

🚨 Introducing Detect, Describe, Discriminate: Moving Beyond VQA for MLLM Evaluation. Given an image pair, it is easier for an MLLM to identify fine-grained visual differences during VQA evaluation than to independently detect and describe such differences 🧵(1/n):
