Robotics is emerging as a proving ground for edge computing, and when Qualcomm chose to demonstrate a humanoid with a bendy back, it was not just putting on a show: it was signaling where AI workloads are headed, off the server rack and into power-constrained machines that must sense, plan, and move in real time.

At CES, Qualcomm positioned its robotics push as a “full-stack architecture” under the Dragonwing IQ10 Series name, scaling from small consumer devices up to human-scale applications. The prototype on stage was Motion 2, a general-purpose humanoid by Vinmotion, demonstrated crouching to pick up a teddy bear, striking a board, and, most strikingly, flexing its torso with a degree of spinal articulation uncharacteristic of today's rigid-trunk robots. Another indicator that this is a platform strategy rather than a one-off demonstration is Qualcomm's partnerships with industrial and humanoid developers such as Figure, Kuka Robotics, and others.
“Whether from enterprise to consumer, I think the type of silicon that we develop for phones and for the edge is the perfect silicon for robots,” said Qualcomm CEO Cristiano Amon.
That assertion reflects an engineering constraint that robotics keeps re-teaching the rest of the AI community: embodied systems cannot be built like data centers. Sensors, inference, safety logic, and control loops must fit into a moving chassis with tight thermal constraints and a limited battery. Qualcomm's Dragonwing IQ10 pitch leans on its automotive-compute experience, where architecture is defined by packaging, power efficiency, and functional safety, and transfers those assumptions to robots that must run perception and interaction continuously while also handling motion planning and manipulation. Practically, a “robot brain” is only as good as the performance it can sustain at the edge without throttling or overheating during bursty operations such as vision processing and grasp planning.
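To make the constraint concrete, a back-of-envelope budget shows why sustained draw, not peak performance, dominates on a battery-powered robot. Every number below is an illustrative assumption, not a Qualcomm or Vinmotion specification:

```python
# Illustrative power budget for an edge robot compute module.
# All figures are hypothetical assumptions, not vendor data.

def average_power(idle_w: float, burst_w: float, burst_duty: float) -> float:
    """Average draw when bursty workloads (vision, grasp planning)
    run for a fraction `burst_duty` of the time."""
    return idle_w * (1.0 - burst_duty) + burst_w * burst_duty

def runtime_hours(battery_wh: float, avg_power_w: float) -> float:
    """Battery runtime at a given average electrical load."""
    return battery_wh / avg_power_w

# Hypothetical humanoid: 500 Wh battery for compute, 15 W baseline
# perception/control, 60 W bursts active 30% of the time.
avg = average_power(idle_w=15.0, burst_w=60.0, burst_duty=0.3)
print(f"average compute draw: {avg:.1f} W")                    # 28.5 W
print(f"compute-only runtime: {runtime_hours(500.0, avg):.1f} h")
```

Doubling burst power without improving efficiency roughly halves the headroom, which is why sustained performance-per-watt, not a peak TOPS figure, is the number that matters in a chassis.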
The deeper AI transformation also changes what “robot compute” means. The classic autonomy stack separated perception, mapping, planning, and control into software modules with tightly tuned interfaces. Increasingly, robot builders want models that fold language, vision, and action into a single policy layer capable of understanding a goal and generating practical motions. Qualcomm's message reflected that industry drift toward vision-language-action models, a middle ground between today's large language models and fully realized autonomy, with on-device inference being the difference between a responsive machine and a slow, cloud-reliant one.
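The architectural shift is easiest to see as an interface change: instead of separate modules passing hand-tuned messages, one policy consumes a multimodal observation and emits low-level actions. The stub below sketches the shape of such an interface; the class and field names are invented for illustration, and the placeholder logic stands in for what would be an on-device neural network:

```python
# Generic sketch of a vision-language-action (VLA) policy interface.
# Names are illustrative; no vendor's actual model or API is shown.

from dataclasses import dataclass
from typing import List

@dataclass
class Observation:
    image: List[List[int]]        # camera frame (stand-in for a real tensor)
    instruction: str              # natural-language goal
    joint_positions: List[float]  # proprioceptive state

@dataclass
class Action:
    joint_targets: List[float]    # next joint setpoints
    gripper_closed: bool

class VLAPolicy:
    """One policy layer replacing the perception->planning->control split."""

    def act(self, obs: Observation) -> Action:
        # Placeholder: a real VLA model maps (image, text, state) to
        # actions in a single forward pass on the edge accelerator.
        close = "pick" in obs.instruction.lower()
        return Action(joint_targets=list(obs.joint_positions),
                      gripper_closed=close)

policy = VLAPolicy()
obs = Observation(image=[[0]], instruction="pick up the teddy bear",
                  joint_positions=[0.0, 0.5, -0.2])
action = policy.act(obs)
print(action.gripper_closed)  # True
```

The point of the single interface is that the latency budget now lives inside one inference call, which is exactly the call that must run on-device to keep the machine responsive.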
Competition for that on-device role is intensifying, because the prize is not an individual robot program; it is the ecosystem that builds up around toolchains, reference stacks, and verification paths. NVIDIA, in particular, has been explicit that humanoids need a layered compute solution, from hardware abstraction through real-time control and perception to high-level reasoning, and it sells Jetson Thor as a robotics computer delivering up to 2,070 FP4 teraflops at 130 W. Qualcomm's differentiator is not raw TOPS marketing; it is the claim that its smartphone-and-automotive DNA (power-managed silicon, mature connectivity, and established safety practices) transfers cleanly to robots that will work close to people.
The “bendy back” detail on Motion 2 is not a mere crowd-pleaser. Flexible humanoids represent a mechanical tradeoff in favor of capability: a compliant torso can expand the reachable workspace, reduce peak actuator loads during lifting, and help keep a head-mounted vision stack stable during gait and perturbation. A related research direction is tensegrity-inspired spines, in which balanced tension and compression produce trunks that are both load-bearing and passively compliant. One recently announced tensegrity-spine system reported bearing a 15 kg static load with the trunk unpowered, under specific test fixtures, illustrating how compliance and load redistribution can reduce momentary demand on motors and gearboxes. Such mechanical intelligence pairs well with edge AI because it lowers the control burden: fewer emergency corrections and less aggressive actuation mean less compute and power spent maintaining stability.
Mechanical range of motion is not purely a control problem, either. As robots move out of fenced cells and into mixed human environments, they carry with them lessons the industrial world learned about risk assessment, safeguarding, end-effectors, and maintenance procedures. The updated U.S. industrial robot safety framework, ANSI/A3 R15.06-2025, adds new guidance, makes functional-safety expectations more explicit than before, and pulls cybersecurity concerns into the safety discussion. This matters for any full-stack robotics platform because the line between software bugs, security vulnerabilities, and physical safety is thin when perception errors can become motion errors. For silicon and SDK vendors, safety is no longer a downstream integrator's paperwork; it is a design input that shapes compute partitioning, watchdog strategies, sensor pipelines, and update mechanisms.
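One concrete example of safety as a design input is a perception-deadline watchdog: if the vision pipeline misses its freshness deadline, motion is inhibited rather than continued on stale data. The sketch below is a simplified illustration with an explicit clock parameter; production systems implement this in safety-rated hardware or firmware, not application Python:

```python
# Minimal perception-deadline watchdog sketch. Real systems place this
# logic in safety-rated hardware/firmware; this only shows the idea.

class PerceptionWatchdog:
    def __init__(self, deadline_s: float):
        self.deadline_s = deadline_s
        self.last_feed = None  # timestamp of the last fresh frame

    def feed(self, now: float) -> None:
        """Called by the perception loop each time a frame is processed."""
        self.last_feed = now

    def motion_allowed(self, now: float) -> bool:
        """Permit motion only while perception data is fresh."""
        if self.last_feed is None:
            return False  # never fed: fail safe, no motion
        return (now - self.last_feed) <= self.deadline_s

wd = PerceptionWatchdog(deadline_s=0.1)  # 100 ms freshness requirement
wd.feed(now=0.00)
print(wd.motion_allowed(now=0.05))  # True: data is 50 ms old
print(wd.motion_allowed(now=0.25))  # False: stale perception, halt motion
```

The design choice worth noting is the fail-safe default: an unfed watchdog denies motion, so a crashed perception process cannot silently leave the robot moving blind.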
In that light, Qualcomm's developer narrative is part of the bet. Its robotics SDKs emphasize ROS packages, end-to-end samples, cross-compilation toolchains, and baseline nodes for camera capture and neural inference: components designed to shorten the path from evaluation kit to prototype. The practical appeal is that robotics projects can start from reference parts instead of assembling a fragile stack out of unrelated kernels, middleware projects, and driver forks. Qualcomm's robotics platform documentation anchors these workflows to supported hardware, including the Dragonwing RB3 Gen 2 and IQ-9075 platforms, prioritizing tooling support and sample applications over custom board-support engineering.
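The baseline-node pattern those samples follow, a camera source publishing frames to a downstream inference stage, can be sketched generically. The classes and callbacks below are hypothetical stand-ins, not Qualcomm's actual SDK or the ROS API; a real project would use ROS nodes and the vendor's inference runtime:

```python
# Generic camera -> inference pipeline sketch in the spirit of SDK
# baseline sample nodes. All names here are illustrative inventions.

from typing import Callable, List

class CameraNode:
    """Produces frames and hands them to downstream subscribers."""
    def __init__(self) -> None:
        self.subscribers: List[Callable[[list], None]] = []

    def subscribe(self, callback: Callable[[list], None]) -> None:
        self.subscribers.append(callback)

    def publish_frame(self, frame: list) -> None:
        for cb in self.subscribers:
            cb(frame)

class InferenceNode:
    """Runs a stub detector on each incoming frame."""
    def __init__(self) -> None:
        self.detections: List[str] = []

    def on_frame(self, frame: list) -> None:
        # Stand-in for on-device neural inference on the frame.
        if sum(frame) > 0:
            self.detections.append("object")

camera = CameraNode()
inference = InferenceNode()
camera.subscribe(inference.on_frame)
camera.publish_frame([0, 1, 0])  # frame with something in it
camera.publish_frame([0, 0, 0])  # empty frame
print(inference.detections)  # ['object']
```

The value of shipping such nodes prebuilt is that teams wire topology and swap models rather than writing capture and inference plumbing from scratch.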
Robotics platform plays fail when they treat the robot as a generic edge box. A robot is an integrated electromechanical system: compute choices change the wiring harness, thermal solution, sensor placement, and even structural geometry, which in turn change the control problem the compute must solve. Qualcomm's choice to debut with a humanoid capable of dramatic torso motion is a reminder that the future of embodied AI will not be won simply by bigger models. It will be won by integrated stacks that respect latency, power, safety, and the messy physics of getting a machine to pick something up without toppling over.
