Boston Dynamics’ Atlas robot can now pick car parts on its own
In a new video spotted by TechCrunch, Boston Dynamics’ humanoid robot Atlas can be seen working autonomously in a demo space, sorting engine parts between numbered bins. The company says Atlas does not need to be controlled by humans to do the job.
According to the video description, Atlas uses machine learning to detect changes in its environment and adapt to them. It also runs a “specialized grasping policy” that helps it keep a firm grip on objects by continuously estimating the position of whatever it’s holding. Once it’s given a set of bins to sort parts between, Atlas works without any prescribed movements, choosing its actions on its own.
In contrast, Tesla’s Optimus robots were reportedly assisted remotely by humans despite claims of autonomy: at a live event, the robots themselves admitted, when asked, that they were being assisted.
The video shows Atlas sorting engine parts and moving them to designated spots, handling each one smoothly. After placing one part in a bin, the robot repeats the process for the rest.
Atlas has undergone a lot of changes since its 2013 unveiling, including a shift from hydraulic to electric actuation after nearly 11 years of development, and the old and current versions barely resemble each other. Check out the video below to see for yourself.
NVIDIA’s ACE-powered AI NPCs left me cold at CES
The use of generative AI in games seems almost inevitable; the medium has always toyed with new ways to make enemies and NPCs smarter and more realistic. Even so, watching several NVIDIA ACE demos back to back left my stomach churning.
This isn’t just slightly smarter enemy AI: ACE can generate entire conversations out of thin air, synthesize voices, and try to give NPCs a sense of personality, and it does all of this locally on your PC, powered by NVIDIA’s RTX GPUs. It might sound good on paper, but I hated almost every moment of the AI NPCs I saw in action.
TiGames’ ZooPunk is a prime example: it relies on NVIDIA ACE to generate dialogue, a synthetic voice, and lip-syncing for an NPC named Buck. But as you can see in the video above, Buck sounds like a robot with a slightly rustic accent. If he’s supposed to have some kind of relationship with the main character, you can’t tell from his performance.
I think my deep dislike of NVIDIA’s ACE-powered AI comes down to this: there’s just nothing charming about it. No joy, no warmth, no humanity.
Every ACE AI character sounds like a developer cutting corners in the worst way possible, as if contempt for the audience had been poured into the shape of a boring NPC. I’d much rather scroll through on-screen text; at least then I wouldn’t have to endure weird robotic voices.
During NVIDIA’s Editor’s Day at CES, a gathering for media to learn more about RTX 5000-series GPUs and their related technology, I was also disappointed by a demo of PUBG’s AI Ally. Its answers sounded like something you’d hear from a pre-recorded phone tree.
Ally also failed to find a gun when the player asked for one, a mistake that could prove fatal on a crowded map. At one point, the PUBG companion spent nearly 15 seconds shooting at enemies while the demo player yelled at it to get in the car. What’s the point of an AI assistant that plays like a novice?
Visit NVIDIA’s YouTube channel and you’ll find other disappointing ACE examples, like the rudimentary speaking animations in the MMO World of Jade Dynasty (above) and Alien: Rogue Incursion. I’m sure many developers would love to skip building decent lip-syncing tech of their own, or licensing someone else’s, but leaning on AI for it serves these games poorly.