A New Chapter in Humanoid Robots! Texas Instruments (TXN.US) Teams Up with NVIDIA (NVDA.US) to Integrate AI and Sensors, Igniting the "Physical AI" Revolution
The Tongtong Finance APP has learned that Texas Instruments (TXN.US), the chip giant focused on analog chips and embedded processing solutions and long known as the “barometer of global chip demand,” will integrate its real-time control, sensing, and power products with those of NVIDIA (NVDA.US), the world’s most valuable company. NVIDIA’s advanced robotic computing components, Ethernet-based sensor connectivity, and proprietary simulation technology will give developers significant technical support, helping them build, deploy, and mass-produce humanoid robots and other so-called “Physical AI” end devices at scale.
According to current media reports, the collaboration between analog chip leader Texas Instruments and NVIDIA is expected to push humanoid robot intelligence to a higher stage; it is more than a superficial “partnership to make robots.” The cooperation is aimed at building a more complete, safer, and more scalable robotic intelligence infrastructure at the foundational technology layer, which should substantially advance the commercialization of humanoid robots.
As market expectations continue to rise for combining massive AI inference workloads with physical execution, the partnership between NVIDIA and Texas Instruments is not just a stacking of chips on a sensing layer but a joint build-out spanning AI inference, real-time perception, and the underlying control system. This is a crucial foundation for bringing humanoid robots into real-world applications.
Giovanni Campanella, General Manager of Texas Instruments’ Industrial Automation and Robotics division, said: “TI’s comprehensive product portfolio bridges the gap between NVIDIA’s powerful AI computing capabilities and practical applications, enabling developers to validate complete humanoid operating systems earlier.” He added: “This integrated approach will accelerate the evolution from product prototypes to commercial humanoid robots, ensuring these robots can work safely alongside humans.”
Recently, NVIDIA has been committed to pushing cutting-edge AI technology into broader fields, such as robots and autonomous vehicles, which are considered “Physical AI” end devices, in order to keep expanding demand and find new growth beyond its data center business. According to NVIDIA CEO Jensen Huang, “Physical AI” means enabling robots and autonomous systems to perceive, reason, and act in the real world; these three capabilities are the key to advancing models from “just conversation” to “working in the physical environment.” In his view, an era in which human civilization evolves with the help of “Physical AI” is imminent.
Texas Instruments joins NVIDIA to tackle the hardest three-layer challenge in robotics: perception, control, and AI inference
As part of this collaboration, Texas Instruments has designed a sensor fusion solution that combines its millimeter-wave radar technology with NVIDIA’s Jetson Thor robotics platform, utilizing NVIDIA’s exclusive Holoscan Sensor Bridge to achieve low-latency 3D perception and safety awareness, supporting humanoid robot development. This latest joint development will be showcased at the highly anticipated NVIDIA GTC event in San Jose, California, from March 16 to 19.
Deepu Talla, Vice President of NVIDIA’s Robotics and Edge AI Business, said: “Safe operation of humanoid robots in unpredictable environments requires very powerful computing and processing capabilities to synchronize highly complex AI models, real-time sensor data, and motor control systems.”
By integrating high-definition camera and radar data, the joint solution from Texas Instruments and NVIDIA improves object detection, localization, and tracking while reducing false positives and system false alarms, enhancing humanoid robots’ real-time decision-making.
Robotics experts generally believe that truly general-purpose autonomous humanoid robots are still several years away. However, systematic progress in perception, reasoning, and motion coordination is a necessary prerequisite for commercialization. The collaboration between Texas Instruments and NVIDIA is a key step in moving the industry from “algorithm and simulation validation” to “safe operation in the real world,” which should improve overall development efficiency, strengthen system robustness, and shorten mass-production timelines.
In robot development, the sim-to-real gap has long been one of the biggest challenges: AI algorithms that perform well in simulation may fail in complex real environments. NVIDIA’s Jetson Thor, a high-performance inference platform already used by many companies for robot applications, gains from Texas Instruments’ control and sensing modules the ability to interact directly with the physical world. Combining the two lets developers verify system perception, motion, and safety earlier and more accurately, shortening prototype validation cycles and reducing iteration costs.
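One common technique for narrowing the sim-to-real gap is domain randomization: varying simulator parameters each training episode so a policy cannot overfit to one idealized world. The article does not describe NVIDIA’s or TI’s actual method; the sketch below is a purely illustrative, framework-free example, and every parameter name and range in it is hypothetical.

```python
import random


def randomized_sim_params(rng: random.Random) -> dict:
    """Sample one simulator configuration per training episode.

    Randomizing friction, sensor noise, and payload mass forces a learned
    policy to cope with a range of conditions rather than a single
    idealized physics model. All ranges here are made up for illustration.
    """
    return {
        "floor_friction": rng.uniform(0.4, 1.2),      # slippery to grippy
        "sensor_noise_std": rng.uniform(0.0, 0.05),   # metres of perception noise
        "payload_mass_kg": rng.uniform(0.0, 5.0),     # unexpected carried load
    }


# Generate a reproducible batch of randomized training configurations.
rng = random.Random(42)
episodes = [randomized_sim_params(rng) for _ in range(1000)]
```

A policy trained across such a spread of configurations is more likely to tolerate the unmodeled variation it meets on real hardware.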
Texas Instruments integrates its real-time controllers, sensors (such as millimeter-wave mmWave radar), and power management technologies with NVIDIA’s high-performance robotics platform (Jetson Thor) and Holoscan Sensor Bridge, forming a complete chain from sensing and control to inference computing. Compared with traditional architectures that rely solely on visual cameras plus GPU inference, this sensor fusion solution can achieve low-latency 3D perception and safety awareness, improving a robot’s overall understanding of its environment, an essential step toward deployable systems.
When humanoid robots perform tasks, they need not only complex AI inference but also real-time sensor fusion, multi-joint motion control, and edge safety decision-making, all completed within extremely short timeframes. Texas Instruments’ millimeter-wave radar and Ethernet bridging technology help robots reliably detect and track objects in difficult conditions (such as glass doors, bright or dim light, smoke, and dust), providing a solid hardware perception foundation for real-world operation.
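To make the camera-plus-radar fusion idea concrete: a simple confidence-weighted scheme averages the two sensors’ position estimates and suppresses detections the sensors disagree on, which is one way false alarms get reduced. This is an illustrative toy sketch only, not TI’s or NVIDIA’s actual pipeline; the `Detection` record, the `fuse` function, and the one-metre gate are all hypothetical.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Detection:
    # Hypothetical detection record: position in metres, confidence in [0, 1].
    x: float
    y: float
    confidence: float


def fuse(camera: Detection, radar: Detection,
         gate: float = 1.0) -> Optional[Detection]:
    """Confidence-weighted fusion of one camera and one radar detection.

    Returns None when the two sensors disagree by more than `gate` metres,
    treating the pair as a likely false positive rather than a real object.
    """
    if abs(camera.x - radar.x) > gate or abs(camera.y - radar.y) > gate:
        return None  # sensors disagree: suppress instead of raising an alarm
    w_cam = camera.confidence / (camera.confidence + radar.confidence)
    w_rad = 1.0 - w_cam
    return Detection(
        x=w_cam * camera.x + w_rad * radar.x,
        y=w_cam * camera.y + w_rad * radar.y,
        confidence=max(camera.confidence, radar.confidence),
    )
```

For example, a confident camera detection at (1.0, 2.0) merged with a weaker radar return at (1.2, 2.0) yields a fused position pulled toward the camera estimate, while a camera ghost with no nearby radar return is dropped entirely.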
The Great Wave of Humanoid Robots
Many US-based tech companies are dedicated to developing high-end embodied AI humanoid robots. For example, Elon Musk’s Tesla (TSLA.US), a leader in electric vehicles, AI, and robotics, is developing a humanoid robot called Optimus, planned for industrial and consumer applications.
Supported by Microsoft (MSFT.US) and OpenAI, Figure AI is trying to create a general-purpose humanoid robot capable of handling various tasks. Figure AI states: “These robots can eliminate unsafe and unpleasant work, ultimately allowing human society to live happier and more meaningful lives.” Boston Dynamics clearly hopes its Atlas robot will “revolutionize industrial work environments.”
Globally, from Tesla’s Optimus to Figure AI’s Helix system and other tech companies’ R&D efforts, capital and industry are deploying heavily in this segment. Industry data show clear progress in prototype function, perception, and motion control; capabilities such as bipedal balance, environmental sensing, and multimodal decision-making are gradually maturing. With supply-chain costs falling and key component performance improving, multiple technological routes are competing, pushing the field from conceptual research toward real-world pilot applications. These dynamics suggest the industry is moving from a “hotspot hype” phase toward genuine technological accumulation and large-scale deployment, although widespread adoption is still some years off. Market research predicts that this market will grow significantly over the next decade, with projects like Tesla’s Optimus aiming for high reliability and safety and planning mass production in the coming years.
The core driver of humanoid robot development today is the deep integration of AI perception, decision-making, and motion control, including large models’ understanding of language and visual information, reinforcement learning, and sensor fusion (such as vision, radar, and force sensing). Such systems can not only walk in controlled environments but also perform higher-level tasks such as logistics handling, maintenance inspection, or service work involving human collaboration. Morgan Stanley and other institutions regard this technological breakthrough as the key to commercial deployment; Morgan Stanley analysts estimate that the humanoid robot market will eventually surpass the traditional automotive industry, with global annual revenue exceeding $5 trillion by 2050 and the installed base potentially surpassing 1 billion units.
However, Ken Goldberg, a professor at UC Berkeley and a robotics expert, recently stated in a journal article that there is still a long way to go before engineers can produce humanoid robots with real-world skills.
Goldberg said: “We are all very familiar with ChatGPT and the astonishing work it has done in vision and language, but most professional researchers are very nervous about the analogy that, having solved those problems, we are now ready to tackle the major issues of humanoid robots and it will all happen next year. I’m not saying it won’t happen, but I am saying it won’t happen within two, five, or even ten years. We just want to reset expectations to avoid creating a bubble that ultimately leads to a huge backlash.”