Program Counter
@program_counter
all things toward agi
Beautiful technical debugging detective longread that starts with a suspicious loss curve and ends all the way down in the Objective-C++ depths of PyTorch's MPS backend, where addcmul_ silently fails on non-contiguous output tensors. I wonder how long before an LLM can do all of this.
New blog post: The bug that taught me more about PyTorch than years of using it. It started with a simple training loss plateau... and ended with me digging through optimizer states, memory layouts, and kernel dispatch, and finally understanding how PyTorch actually works!
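For context, a minimal repro sketch of the failure mode described above (my reconstruction, not the post's code): an in-place addcmul_ into a non-contiguous output tensor on the MPS device, checked against the CPU result. The helper name and toy shapes are mine.

```python
import torch

# Sketch of the reported failure mode: in-place addcmul_ on a NON-contiguous
# output tensor, run on the MPS backend, compared against the CPU reference.

def check_addcmul_contiguity(device: str) -> None:
    # A transposed view is a classic way to get a non-contiguous tensor.
    out = torch.zeros(4, 4, device=device).t()   # non-contiguous view
    assert not out.is_contiguous()
    a = torch.ones(4, 4, device=device)
    b = torch.full((4, 4), 2.0, device=device)

    out.addcmul_(a, b, value=0.5)                # out += 0.5 * a * b

    expected = torch.full((4, 4), 1.0)           # 0 + 0.5 * 1 * 2
    ok = torch.allclose(out.cpu(), expected)
    print(f"{device}: addcmul_ on non-contiguous output correct -> {ok}")

check_addcmul_contiguity("cpu")
if torch.backends.mps.is_available():
    # On builds affected by the bug described in the post, this may print False.
    check_addcmul_contiguity("mps")
```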
24 years old, still holds up
RL is a pain sometimes, and torchforge could handle a lot of the messiness! Thanks to the @PyTorch and @CoreWeave teams for letting us test it out!
Today Meta announced torchforge, a brand-new PyTorch-native library that makes it easy to use reinforcement learning (RL) to train AI agents. Forge provides high-performance building blocks and ready-to-use examples, so you can focus on what’s novel about your use case rather…
One of the best SIMD programmers I’ve had the pleasure of interacting with is becoming available. Real work, with real code, that you almost certainly interact with every single day.
I am looking for a job starting May 2026. I am an expert in SIMD programming, in particular for non-numeric applications such as text processing or database programming. Please have a look at my website for the sort of work I do. I am located in Berlin, Germany.
At PyTorch 2025, where NVIDIA decided to unveil more details about cuTile and TileIR.
The OG PyTorch blog post, explaining the mechanics and concepts behind the framework's internals. It basically lets you navigate the complete codebase, enabling better contributions. Definitely worth a read, then another! Blog by @ezyang
A year and a half after starting the first draft of the first chapter, look what arrived in the mail!
Good point. My first paper was on time travel in the Gödel universe. ML was easy to pick up after that :) journals.aps.org/prd/abstract/1…
We’ve found a ton of value hiring folks with strong theory backgrounds with little to no production ML experience. One of our members of technical staff got his phd in pure math/the geometry of black holes and had no prior ML experience. Within days of hiring him we released our…
it is indeed blog post catch-up day (i'm behind by 6 weeks)
I guess it's blog post catch-up day instead of paper catch-up day x.com/omouamoua/stat…
I've left NVIDIA Research and joined AIRoA Tokyo as Team Lead, VLA Dev. We're pushing VLA and building a Japan-wide real-world data ecosystem with major partners in retail/logistics/construction to deploy hundreds of humanoids. 🔥We're hiring researchers and DM me if interested!
bro @karpathy literally re-implemented the entire lm-eval-harness in 2 Python files. It's been very useful for my own repo and easy to adapt for the SuperBPE case
I'll be presenting Formalized Kernel Derivation to @GPU_MODE w/ @GioeleZardini; discord dot gg/gpumode at noon PST today! Will be uploaded to the GPU Mode YT afterward. Somewhere at the intersection of art and science. Come for the diagrams, stay for the math.
RIP. Markov processes and Yang-Mills: tinyurl.com/3huay75x
Prof. Chen Ning Yang, a world-renowned physicist, Nobel Laureate in Physics, Academician of the Chinese Academy of Sciences, Professor at Tsinghua University, and Honorary Director of the Institute for Advanced Study at Tsinghua University, passed away in Beijing due to illness…
Check out our latest work on Gaussian Splatting for LiDAR with 3DGUT!
[1/N] Excited to introduce "SimULi: Real-Time LiDAR and Camera Simulation with Unscented Transforms." We extend 3DGUT with LiDAR support and render a wide range of sensors 10-20x faster than ray tracing and 1.5-10x faster than prior rasterization work. research.nvidia.com/labs/sil/proje…
At @Berkeley_EECS we always work to keep our curriculum fresh. Our intro ML course CS 189 just got a drastic makeover this semester (thanks @profjoeyg @NargesNorouzi!) and now includes ~12 lectures on e.g. Adam, PyTorch, various NN architectures, LLMs, and more (see…
Harvard and Stanford students tell me their professors don't understand AI and the courses are outdated. If elite schools can't keep up, the credential arms race is over. Self-learning is the only way now.
I am really considering changing my research topic to Robot Learning instead of SLAM. I'm getting the feeling that there isn't much left for me to do in academia with traditional approaches. Or maybe do both? 🤔
Industry SLAM systems are far ahead of academic open source systems.
I received many requests to share materials from our undergraduate course “Machine Learning in Chemistry” — here you go! A preprint summarizing insights and lessons learned: chemrxiv.org/engage/chemrxi… A Jupyter Notebook Tutorial Gallery: xuhuihuang.github.io/mlchem/html/ex…
My focus for Spring 2025: launching an undergraduate course @UWMadisonChem @TCI_UW_Madison developed from scratch - "Chem361: Machine Learning in Chemistry"! Here's a glimpse of what we'll explore:
*Depth of second order optimization*
1/8 Second Order Optimizers like SOAP and Muon have shown impressive performance on LLM optimization. But are we fully utilizing the potential of second order information? New work: we show that a full second order optimizer is much better than existing optimizers in terms of…
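To make "full second order" concrete, here's an illustrative sketch (not from the paper; the toy loss and variable names are mine): a Newton step preconditions the gradient with the exact inverse Hessian, which is exactly the O(d^2) memory / O(d^3) compute that structured approximations like SOAP and Muon try to avoid at LLM scale.

```python
import torch

# Illustrative only: full second-order step on a tiny ill-conditioned quadratic.

def loss_fn(w: torch.Tensor) -> torch.Tensor:
    # Any twice-differentiable loss works here; a quadratic keeps the math obvious.
    A = torch.diag(torch.tensor([1.0, 100.0]))
    return 0.5 * w @ A @ w

w = torch.tensor([1.0, 1.0])
g = torch.func.grad(loss_fn)(w)        # first-order information
H = torch.func.hessian(loss_fn)(w)     # full second-order information
newton_step = torch.linalg.solve(H, g) # H^{-1} g
w_next = w - newton_step               # exact minimizer in one step on a quadratic

print("loss before:", loss_fn(w).item(), "after:", loss_fn(w_next).item())
```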
This tutorial is strongly biased and (effectively) covers only a very small fraction of robot learning frontiers. But understandably, HF has no need to pay attention to the non-data-intensive ones.
A comprehensive, hands-on tutorial on the most recent advancements in robotics 🤟 ...with self-contained explanations of modern techniques for end-to-end robot learning & ready-to-use code examples using @LeRobotHF and @huggingface. Now available everywhere! 🤗