Walk a mile in my shoes: robot tries on a pair of white Nike sneakers


HumanPlus, a humanoid robot developed by researchers at Stanford University, is learning life skills by imitating humans.

In a video clip uploaded by the scientists, HumanPlus is shown sitting on a chair, putting on a white Nike skateboard trainer while already wearing another one. The robot then ties the shoelaces, stands up, and takes several awkward steps forward.

Several other short clips shared by the researchers show the robot folding clothes, jumping, typing “AI” on a keyboard, picking and placing objects in a warehouse, and shaking hands with another robot, among other tasks.

According to the researchers, HumanPlus learned all these tasks by first “shadowing” human motion. This means a human operator initially controlled the robot remotely to perform various tasks, collecting whole-body data for the machine to learn from.

“Using the data collected, we then perform supervised behavior cloning to train skill policies using egocentric vision, allowing humanoids to complete different tasks autonomously by imitating human skills,” they said.
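The pipeline the researchers describe — record (observation, action) pairs while a human teleoperates the robot, then fit a policy by supervised learning — can be sketched in a few lines. This is an illustrative toy, not the HumanPlus code: the synthetic "observations" stand in for egocentric vision features, the "actions" for whole-body joint targets, and the linear least-squares fit stands in for whatever network the authors actually train.

```python
import numpy as np

# Toy behavior-cloning sketch (illustrative only, not the HumanPlus code).
rng = np.random.default_rng(0)

obs_dim, act_dim, n_demos = 8, 3, 200
# Hypothetical expert mapping from observations to actions.
true_policy = rng.normal(size=(obs_dim, act_dim))

# Step 1: "shadowing" — collect (observation, action) pairs while a
# human operator controls the robot.
observations = rng.normal(size=(n_demos, obs_dim))
actions = observations @ true_policy + 0.01 * rng.normal(size=(n_demos, act_dim))

# Step 2: supervised behavior cloning — regress recorded actions on
# the observations that accompanied them.
learned_policy, *_ = np.linalg.lstsq(observations, actions, rcond=None)

# Step 3: autonomous execution — the cloned policy maps a fresh
# observation to a predicted action, with no operator in the loop.
new_obs = rng.normal(size=(1, obs_dim))
predicted_action = new_obs @ learned_policy
```

In the real system the regression is replaced by training a neural network on camera images, but the structure — teleoperated data collection followed by supervised imitation — is the same.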

The ability to train robots on extensive human data is one argument for building humanoid robots – those that are similar in form to human beings.

“Yet, doing so has remained challenging in practice due to the complexities in humanoid perception and control, lingering physical gaps between humanoids and humans in morphologies and actuation, and lack of a data pipeline for humanoids to learn autonomous skills from egocentric vision,” the scientists explained in the paper detailing the study.

During their research, the team ran into some of these limitations, such as their hardware having “fewer degrees of freedom compared to human anatomy.” Despite that, a customized version of their humanoid robot used for the tests achieved 60–100% success rates on tasks trained with up to 40 demonstrations.

“We hope to address these limitations in the future, and to enable more autonomous and robust humanoid skills that can be applied in various real-world tasks,” the researchers said.