
Kevin Lin

@linkevin0

CS PhD @UTAustin. Research Intern @NVIDIA GEAR. CS MS @Stanford. EECS BS @Berkeley_EECS. Opinions are my own.

Excited to share GR00T N1, our open-source, open-weight 2B-parameter foundation model for humanoid robots!

Excited to announce GR00T N1, the world’s first open foundation model for humanoid robots! We are on a mission to democratize Physical AI. The power of a general robot brain, in the palm of your hand - with only 2B parameters, N1 learns from the most diverse physical action dataset…



Fun idea: get the robot to pick up and use another hand for cross-embodiment transfer.

Humans, regardless of arm size or handedness, can fasten screws with the same tool: a screwdriver. What if robots could share tools—and skills too? Introducing LEGATO, a cross-embodiment learning framework using a handheld gripper! 🌐 ut-hcrl.github.io/LEGATO 👇 See more🧵(1/5)



Kevin Lin reposted

With the recent progress in large-scale multi-task robot training, how can we advance the real-world deployment of multi-task robot fleets? Introducing Sirius-Fleet✨, a multi-task interactive robot fleet learning framework with 𝗩𝗶𝘀𝘂𝗮𝗹 𝗪𝗼𝗿𝗹𝗱 𝗠𝗼𝗱𝗲𝗹𝘀! 🌍 #CoRL2024


Teleoperating humanoid robots is expensive and time-consuming. We introduce DexMimicGen: a pipeline for automatically generating data for humanoid robots from a few human demonstrations.

How can we scale up humanoid data acquisition with minimal human effort? Introducing DexMimicGen, a large-scale automated data generation system that synthesizes trajectories from a few human demonstrations for humanoid robots with dexterous hands. (1/n)



Robosuite v1.5: supports more robots (e.g., humanoids), robot controllers (e.g., whole-body controllers), and teleoperation devices. Check it out!

This Tweet is unavailable.
