Yongchang Hao
@yongchanghao
PhD student @UAlbertaCS w/ @AmiiThinks. Interned @NetflixResearch, @RBCBorealis, @TencentGlobal AI Lab, @Google
Started trying Claude Code this weekend, have to disagree with @karpathy's idea that vibe coding may be the gateway drug to the real coding: it's the gateway drug to more powerful vibe coding.
That so many people are still not paying adequate attention to this vision is what makes the lesson bitter
> be me, first year PhD student
> get obsessed with polyphonic music transcription during pandemic
> for some reason, advisor lets me work on music
> brilliant idea: explicitly encode chord structure into the model
> spend months implementing complicated chord-aware audio…
Wait, this is not a Black Mirror episode??
Very well deserved! Rich has been a great professor as well. Go @UAlberta @AmiiThinks!!
Meet the recipients of the 2024 ACM A.M. Turing Award, Andrew G. Barto and Richard S. Sutton! They are recognized for developing the conceptual and algorithmic foundations of reinforcement learning. Please join us in congratulating the two recipients! bit.ly/4hpdsbD
Mitigating racial bias from LLMs is a lot easier than removing it from humans! Can’t believe this happened at the best AI conference @NeurIPSConf We have ethical reviews for authors, but missed it for invited speakers? 😡
Focused and precise! @yongchanghao (PhD at @UAlbertaCS) took the challenge and delivered his #ICML paper in 115.18 seconds. Title: FLORA: Low-Rank Adapters Are Secretly Gradient Compressors
🇹🇭 I'm in Bangkok this week for ACL 2024 and will present this paper. Poster: Mon 11:00–12:30 @ Convention Center A1. Oral: Tue 10:30 @ World Ballroom B, NLP Application I. Looking forward to connecting with you all and making new friends! #ACL2024
💦 Text watermarking has emerged as an effective method to tag and identify content generated by LLMs. However, what if the watermarked text is translated into different languages? Let me introduce our work on the cross-lingual consistency of text watermarks:
arXiv -> alphaXiv Students at Stanford have built alphaXiv, an open discussion forum for arXiv papers. @askalphaxiv You can post questions and comments directly on top of any arXiv paper by changing arXiv to alphaXiv in any URL!
Somehow got an additional stop in Prague, which wasn't on the original itinerary (the coffee here is good though)
- me: travel tomorrow, gonna check some lovely news and have a good sleep - the news: bloomberg.com/news/articles/…
bloomberg.com
Airline Fallout Mounts as Carriers Work Through IT Meltdown
Thousands of delayed and canceled flights piled up Friday as airlines slowly resumed flying after a widespread global software meltdown, with disruptions set to cascade through the weekend.
Really cool work from my former colleagues at Borealis. Reduces GPU memory in training / fine tuning by a factor of three. Faster models! Larger batches! And only one or two easy code changes to make this happen.
👋 Introducing Flora: a breakthrough in reducing #GPU memory demands for training large #neuralnetworks. Learn how it works, and how you can incorporate #Flora with a one-liner, in our latest #research blog. 📚 Read it here: borealisai.com/research-blogs…. Work to appear in #icml2024.
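The tweets above describe FLORA's core observation, that low-rank adapters act as gradient compressors, so gradients can be projected into a small random subspace during training. Here is a minimal, hypothetical sketch of that idea using a random projection; the variable names and shapes are illustrative assumptions, not the paper's actual API or the Borealis one-liner.

```python
import numpy as np

# Illustrative sketch (assumed, not FLORA's real implementation):
# compress a full gradient matrix by projecting onto a random
# low-rank subspace, then decompress to a low-rank estimate.

rng = np.random.default_rng(0)
d_out, d_in, rank = 256, 512, 16

grad = rng.standard_normal((d_out, d_in))                 # full gradient
proj = rng.standard_normal((d_in, rank)) / np.sqrt(rank)  # random projection

compressed = grad @ proj        # (d_out, rank): stored instead of full grad
decompressed = compressed @ proj.T  # low-rank estimate used for the update

print(f"memory kept: {compressed.size / grad.size:.2%}")
```

Because `E[proj @ proj.T]` is the identity (up to scaling), the decompressed gradient is an unbiased estimate in expectation, which is roughly why such projections can stand in for full gradients while cutting optimizer-state memory.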