How The Tesla Bot Is Made

Tesla is building an army - the biggest the world has ever seen. Millions strong.

Only this army isn’t going to war - they’re going to work. The robots will save manufacturing someday - but first, we’ve got to manufacture the robots - that’s the trick.

The Muscle

If our army is going to be successful at taking over the world, then it’s going to need some serious muscle. The Tesla Bot isn’t here to do tricks and play with cups - it’s going to work. Optimus needs a specific combination of dexterity and raw power, and this is where Tesla’s robotic actuators come into play.

Since an actuator is essentially an electric motor paired with a drive mechanism, this is a field where Tesla has a lot of experience. Not only do they have the electric motors that drive the car forwards and backwards, they also have smaller actuators that control the steering when Autopilot is engaged - on the Cybertruck, the 4 wheel steering is handled entirely by actuators, with no mechanical linkage at all.

Still, this is fairly basic stuff as far as robotics is concerned - going forwards and backwards, left and right, amounts to only two ‘degrees of freedom’, the robotics term for the number of independent ways a component can move.

The actuators inside a Tesla Bot have to cover a wide range of movements with a very high degree of precision. The human body has over 200 degrees of freedom; Optimus approximates it with 28 across the body, plus 11 degrees of freedom in each hand powered by 6 tiny actuators.

There are two main categories of actuator used in the Tesla Bot - linear and rotary - and within each category there are three specific designs based on the action required - basically a small, medium and large size for each style of motor.

Let’s start with linear actuators, sometimes referred to as screw type motors. Elon Musk likes these because they are a very simple and reliable design - essentially just a motor driven piston that extends out and retracts.

We’ve recently seen SpaceX swap the thrust vector control of their Raptor engines from a hydraulic drive system to electric linear actuators, and following that switch we saw an immediate improvement in Starship’s performance. The hydraulic system failed during liftoff on April 20, 2023, and led to the entire rocket spinning out of control and exploding - since the switch, the electric thrust vectoring has twice made it all the way to a successful stage separation for the Starship booster.

Optimus is doing something similar for humanoid robots. For most of the past decade, our go-to example of a humanoid robot has been the Boston Dynamics Atlas - a hydraulic powered robot - and there’s nothing inherently wrong with that. We frequently use hydraulics for all kinds of important things all around the world, like steering jet airplanes or digging large holes, and hydraulic actuators can be incredibly powerful. But a hydraulic system is just not practical for a consumer product that we expect to be used around the house and in the workplace by millions of people - electric motors are quieter, cleaner, less prone to leakage and require far less maintenance.

Linear actuators are relatively uncommon on humanoid robots - which is a bit weird, because they function in a way that’s mechanically very similar to human muscles. The original ‘Bumble C’ prototype of the Tesla Bot mostly used rotary actuators to drive the arms and legs, and the same goes for the new Figure One humanoid robot, which appears to use exclusively rotary actuators.

Looking over at our friend Optimus, we can visually spot three primary applications of the linear actuator - the forearms, thighs and calves. 

In each section of the body, the linear actuators are doubled up to work in pairs for additional torque - one on each side of the ‘muscle group’ - and this way one actuator can push while the other pulls, creating an additional degree of freedom.

On the thigh segment, the two large sized linear motors operate the Tesla Bot’s human inspired knee and hip joints. The mechanism is referred to as a ‘4 bar link’, and it helps maintain a more linear torque curve across the leg’s range of motion. Imagine standing up from a squat: the more bent the knee is, the more force is required to straighten it back out.

With just one big rotary actuator on the knee, like Figure One, the actuator requires a tremendous amount of torque when the knee is bent and very little as the knee straightens out - a very steep curve in the amount of energy required.

With linear actuators and the 4 bar link, a more consistent amount of force can be applied across the entire movement without going to the extremes - a flatter, more linear curve, which is a more efficient use of energy.
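To make that difference concrete, here’s a toy calculation - the numbers and the linkage geometry are illustrative assumptions, not anything from Tesla’s spec sheet - comparing the torque a direct rotary knee actuator must produce against the force a linear actuator needs when the linkage lengthens its moment arm as the knee bends:

```python
import math

# Toy model (assumed numbers, not Tesla specs): torque needed at the
# knee to support a 50 kg load on a 0.4 m thigh segment while squatting.
MASS_KG = 50.0
G = 9.81
THIGH_M = 0.4

def knee_torque(bend_deg):
    """Gravitational torque about the knee grows with the bend angle."""
    return MASS_KG * G * THIGH_M * math.sin(math.radians(bend_deg))

# A direct rotary actuator must supply this torque itself, so its demand
# spans a wide range: near zero when straight, large when deeply bent.
rotary = {deg: knee_torque(deg) for deg in (10, 45, 90)}

# A linear actuator on a 4 bar link pushes through a moment arm that the
# linkage geometry (hypothetical profile below) makes LONGER as the knee
# bends, so the force the motor must produce stays comparatively flat.
def moment_arm(bend_deg):
    # Assumed linkage: arm grows from 3 cm to 8 cm as the bend increases.
    return 0.03 + 0.05 * (bend_deg / 90.0)

linear = {deg: knee_torque(deg) / moment_arm(deg) for deg in (10, 45, 90)}

for deg in (10, 45, 90):
    print(f"{deg:>3} deg  rotary torque {rotary[deg]:6.1f} N*m  "
          f"linear force {linear[deg]:7.1f} N")
```

In this sketch the rotary demand spans roughly a 6x range between a nearly straight and a deeply bent knee, while the linear actuator’s force stays within about a 3x band - the flatter curve the 4 bar link is after.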

The calf actuators work in pretty much the same way as the wrists - working together, they drive the heel up and down, and working in opposition, they create a second degree of freedom to slightly angle the foot from side to side.

Looking at the foot of the Tesla Bot, while we’re down here - it does have a degree of freedom, there’s a pivot point at the ‘toes’, but it’s not actuated - because you don’t really need to exert force from your toes unless you’re jumping or sprinting - so in the case of the Tesla Bot foot, there's just a spring loaded mechanism that allows the toe box to pivot and then return to the resting position automatically.

Let’s talk about the rotational actuators - these are pretty close to the same electric motors that drive a car or spin a washing machine. The Tesla Bot actuators use permanent magnets to generate the rotation, while many electric motors use AC induction instead - the difference is that permanent magnets provide more torque in a smaller package and generate a more consistent torque curve over the whole operating range of the actuator, which is better for precision control. Permanent magnet motors are generally more expensive because of the need for rare-earth elements like neodymium - but we know that Tesla is developing a new motor for the Redwood EV platform that eliminates the need for rare earths, so in theory the same cost saving can transfer over to the Tesla Bot as well.

The most prominent use of the rotational actuator on the Tesla Bot is the pelvis area, where there are 6 fundamental degrees of freedom. We’ve got one motor pointing upwards that controls the rotation of the torso, coupled to another motor pointing forward that controls the left-right tilt of the torso. It’s important to note here that the Tesla Bot can’t bend forward at the torso the way a person can - all of the forward and backward tilt is handled by the 4 bar links in the hip joints and actuated by the linear motors in the thighs.

That same dual motor combo is implemented above each hip joint - one rotational actuator controls the sideways angle of the leg, so it can widen or narrow the stance of the Bot and sway the hips for balance while walking, while a second actuator rotates the whole leg to turn it from side to side.

We’re also getting a lot of rotational action up in the shoulder joints - there are three degrees of freedom on each shoulder - one actuator for the lateral raising and lowering of the arm, one for the front to back rotation, and one actuator just below the shoulder joint that can rotate the upper arm left and right.

It’s easier to see the shoulders on older versions of the Bot.

The Hands

The hand of the Tesla Bot is designed to be as human-like as possible - and the reason for that comes down to ease of use - the world is already designed to be ergonomic to the human hand, so there’s no point in trying to fight against that. Also the human hand is one of the most amazing products to come out of 3 billion years of natural evolution, so you’d be pretty stupid to think that you could do any better.

Just like in nature, the Tesla Bot fingers are driven by metallic tendons that are pulled on by the smallest of the actuator designs. Each actuator has a built-in clutch to prevent backdrive - when the finger is contracted, a mechanism locks it into place so that it can’t spring back. This way the actuator doesn’t have to constantly pull against the weight that the Bot is carrying - then, when it wants to release, the clutch mechanism disengages.
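The clutch behavior described above can be sketched as a tiny state machine - this is a hypothetical illustration of the concept, not Tesla’s actual firmware or API:

```python
# Hypothetical sketch of a tendon actuator with a backdrive-prevention
# clutch: once the finger is contracted, the clutch locks so the motor
# can de-energize while the grip holds; releasing requires disengaging
# the clutch first. Names and structure are illustrative assumptions.
class TendonActuator:
    def __init__(self):
        self.position = 0.0    # 0.0 = finger open, 1.0 = fully contracted
        self.clutch_locked = False
        self.motor_on = False

    def contract(self, target):
        self.motor_on = True          # motor pulls the tendon...
        self.position = min(1.0, target)
        self.clutch_locked = True     # ...then the clutch locks the joint...
        self.motor_on = False         # ...so the motor can stop holding load

    def release(self):
        self.clutch_locked = False    # disengage the clutch before moving
        self.position = 0.0

finger = TendonActuator()
finger.contract(0.8)
# Grip is held with the motor powered off - no energy spent holding.
assert finger.clutch_locked and not finger.motor_on
finger.release()
```

The point of the design shows up in the assertion: between `contract` and `release`, the joint is held mechanically rather than electrically, so carrying a load costs no actuator power.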

The only human feature that the Tesla Bot still lacks is the middle knuckle - Optimus fingers are jointed where they meet the hand and at the fingertip, but they don’t have that pivot point right in the middle that allows us to make a tight fist - probably better that the robot can’t do that.




You’ll notice a couple of interesting things here - for one, unlike a human, the Tesla Bot has 4 fingers of equal length - this just makes sense from a manufacturing point of view. And for two, when the robot hand is at rest, the fingers actually curve inwards a little bit, just like a person.

If we flip over to our new friend, Figure One, we’ll notice two more things about the hands. Thing one: Figure actually does have the middle knuckle - though I’m not convinced it makes the hand any more useful. And thing two: when Figure is at rest, it just sticks all of the fingers straight out, and that makes it appear much more robotic and even a bit unsettling.

Anyway, the movement of the hand is driven by 6 actuators and has 11 degrees of freedom. There’s so much precision required here that Tesla has a small computer controller embedded into each hand to drive the fingers and receive feedback from the pressure sensors in each fingertip. The internal controllers also help Optimus recognize where his hands are in physical space.

The Head

The head of the Tesla Bot is mostly hollow - there’s not much up there right now except for digital cameras. The head is actuated with two degrees of freedom - it can look up and down and side to side for an increased field of view - but this is mostly done to make it appear more human-friendly; the Bot could easily have cameras pointing out of the head in every direction, seeing everything at once.

As far as we know, Tesla hasn’t done anything yet with the front screen, or ‘face’, of the Bot - it’s just a black abyss. That’s definitely better than other robots that try to have a human or animal-like face - those are not cool - but the deep black is a bit disconcerting. I like what Figure did with their front screen: it’s a simple light display that lets you know what the robot is up to.

The Heart

Inside the chest cavity of the Tesla Bot you’ll find mostly battery cells - the same 2170 sized Panasonic batteries that power the Model 3 and Model Y. They’re very energy dense and allow you to pack the most energy into the smallest space, and that matters because power consumption is a crucial factor for a humanoid robot - if these things are supposed to be more productive than human beings, then they can’t spend too much time sitting in a charging station.

Tesla has gotten the Bot’s power consumption down to just 100 watts when sitting, ramping up to 500 watts during a brisk walk. All of that stuff we just talked about with the actuators plays a big part in power consumption too - more efficient linear actuators reduce the energy demand, and they also reduce the weight of the arms and legs, which in turn demands less energy to move. It’s all interconnected.

The battery pack offers 2.3 kilowatt hours of storage, and Tesla says this is good for a full day of work, which we assume means 8 or 9 hours of continuous operation. Just like on a Tesla vehicle, the battery pack has an integrated computer that manages charging, power distribution and cooling.
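Running the numbers quoted above (a 2.3 kWh pack, roughly 100 W sitting and 500 W walking) shows how the ‘full day of work’ claim could hold up - the duty-cycle split below is an assumption for illustration, not a Tesla figure:

```python
# Back-of-envelope runtime from the figures quoted in the text:
# 2.3 kWh pack, ~100 W seated, ~500 W at a brisk walk.
PACK_WH = 2300

for label, watts in [("sitting", 100), ("brisk walk", 500)]:
    hours = PACK_WH / watts
    print(f"{label:>10}: {hours:4.1f} h")

# A mixed shift - say 70% light work at ~150 W and 30% walking at
# ~500 W (assumed split) - averages 0.7*150 + 0.3*500 = 255 W:
avg_watts = 0.7 * 150 + 0.3 * 500
print(f"mixed duty: {PACK_WH / avg_watts:4.1f} h")
```

At that assumed mix the pack lasts right around 9 hours of continuous operation, which lines up with the full-working-shift claim.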

The chest of the Tesla Bot is also where you’ll find the brain - Tesla’s own custom designed ‘system-on-a-chip’, also known as the Autopilot computer - with Wi-Fi connectivity in addition to LTE. This computer processes all of the vision data from the head cameras and the sensory input from the fingers to make decisions in real time based on its neural network training.

The computer vision and occupancy networks used by Optimus are ported directly from the vehicle-based Autopilot software - the only real change is the training data. Optimus doesn’t need to know how to drive - not yet - but he does need to know how to interact with a three-dimensional world that has depth and height. This is a totally new perspective for Tesla’s AI team: the number one goal of a driving-based AI was to stay inside the lines and not hit anything, while Optimus needs to interact with his world - he needs to touch and grasp objects, he needs to understand shelves and tables and deep bins full of objects…

There is so much more to see - and the way forward is also very similar to the Tesla self-driving car - they just need to make a whole lot of these robots and then get them out into the world to start learning from experience - just like Autopilot, Optimus will make mistakes, but every error is just one step closer to success.
