RoboPapers

@RoboPapers

@chris_j_paxton & @micoolcho geeking out weekly with authors of robotics AI papers. On YouTube / X / Spotify

Full episode dropping soon! Geeking out with @SihengZhao @ZeYanjie on ResMimic: From General Motion Tracking to Humanoid Whole-Body Loco-Manipulation via Residual Learning resmimic.github.io Co-hosted by @micoolcho @chris_j_paxton


Full episode dropping soon! Geeking out with @GeYan_21 on ManiFlow: A General Robot Manipulation Policy via Consistency Flow Training maniflow-policy.github.io Co-hosted by @micoolcho @chris_j_paxton


Just collecting manipulation data isn't enough for robots: they need to be able to move around in the world, which poses a whole different set of challenges from pure manipulation. And bringing navigation and manipulation together in a single framework is even more challenging.…


Full episode dropping soon! Geeking out with @fancy_yzc @still_wtm on HERMES: Human-to-Robot Embodied Learning From Multi-Source Motion Data for Mobile Dexterous Manipulation gemcollector.github.io/HERMES/ Co-hosted by @micoolcho @chris_j_paxton


Reasoning over long horizons would allow robots to generalize better to unseen environments and settings zero-shot. One mechanism for this kind of reasoning would be world models, but traditional video world models still tend to struggle with long horizons, and are very data…


Full episode dropping soon! Geeking out with @nishanthkumar23 on From Pixels to Predicates: Learning Symbolic World Models via Pretrained Vision-Language Models pix2pred.csail.mit.edu Co-hosted by @micoolcho @chris_j_paxton


Walking robots can do all kinds of exciting things like dancing, running, and martial arts. But for them to be useful, they must be able to use their legs to handle terrain, moving over obstacles, not just around them. So, how can we best train walking policies for legged…


Full episode dropping soon! Geeking out with @ChongZitaZhang on Attention-based map encoding for learning generalized legged locomotion science.org/doi/10.1126/sc… Co-hosted by @micoolcho @chris_j_paxton


With enough data, robots and AI can learn “world models” that let them predict the results of their actions. These models are a way to learn how embodied AI agents can perform a wide variety of useful tasks — but they require a huge amount of data. The team at General Intuition…


Full episode dropping soon! Geeking out with @PimDeWitte @AdamJelley2, cofounders of @gen_intuition, who are leveraging large-scale game recordings to build agents with deep spatial and temporal reasoning. Built upon the work by Adam, @EloiAlonso1 @micheli_vincent (all cofounders at…


How can we make a humanoid robot play table tennis? The robot must hit a moving ball and return it over and over again, requiring precise whole-body control. @ZhiSu22 tells us how he developed a hierarchical approach for planning and whole-body control that lets…


How can we build robotic hands with truly superhuman dexterity? @DaxoRobotics is developing a unique tendon-driven soft robot hand, which aims to be tougher and more capable than a traditional humanoid hand. Each finger consists of many different tendons, which act in concert to…


Full episode dropping soon! Geeking out with @ZhiSu22 on HITTER: A HumanoId Table TEnnis Robot via Hierarchical Planning and Learning humanoid-table-tennis.github.io Co-hosted by @micoolcho @chris_j_paxton

