OpenAI, the artificial intelligence research lab co-founded by Elon Musk, previously introduced a program to train robots entirely in simulation. Now it has added a new algorithm, called one-shot imitation learning, which requires a human to demonstrate a task only once in VR for a robot to learn it.
The system is powered by two neural networks. The first takes a camera image and determines objects' spatial positions relative to the robot -- and it was trained solely on simulated images, meaning it learned how to perceive the real world before it ever actually saw it. The second imitates the demonstrated task by scanning through the recorded actions and attending to the frames that tell it what to do next.
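The two-network split described above can be sketched in plain Python. This is a toy illustration under loose assumptions, not OpenAI's actual architecture: the "vision network" is faked with centroid math instead of a convolutional net, and the "imitation network" is faked with a distance-based attention weighting over demonstration frames.

```python
# Toy sketch of the two-network pipeline: a vision stage that maps an
# image to object positions, and an imitation stage that attends over a
# recorded demonstration. All names and shapes are illustrative.
import math

def vision_network(camera_image):
    """Stand-in for the first network: maps a 2D 'image' (grid of object
    ids, 0 = background) to each object's position relative to the robot.
    A real system would use a conv net trained on simulated renders."""
    positions = []
    for obj_id in {p for row in camera_image for p in row if p}:
        coords = [(r, c)
                  for r, row in enumerate(camera_image)
                  for c, p in enumerate(row) if p == obj_id]
        rows, cols = zip(*coords)
        positions.append((obj_id, (sum(rows) / len(rows),
                                   sum(cols) / len(cols))))
    return sorted(positions)

def imitation_policy(demo_frames, current_step):
    """Stand-in for the second network: weights demonstration frames by
    proximity to the step the robot believes it is at, then returns the
    action from the most relevant frame."""
    weights = [math.exp(-abs(i - current_step))
               for i in range(len(demo_frames))]
    best = max(range(len(demo_frames)), key=lambda i: weights[i])
    return demo_frames[best]["action"]
```

In use, the robot would call `vision_network` on each new camera frame and feed the resulting positions, together with the single VR demonstration, into `imitation_policy` to choose its next action.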
This training model is only a prototype, but teaching robots entirely in simulation could let researchers prepare them for complex tasks without needing any physical setup at all. That would allow humans to safely and cheaply simulate extreme environments like arctic waters or areas soaked in nuclear radiation -- or even other planets.