Host:
Hello and welcome to “Vision Vitals”, the e-con Systems podcast.
At e-con Systems, we've been a thought leader in embedded vision for over 20 years. And this podcast is another way for us to open up the conversation, bringing you insights from our experts and from across the field.
For this episode, we're starting with a big one: What is NVIDIA Jetson AGX Thor? This new platform is already being called a game-changer for robotics, AI, and advanced vision systems.
Speaker:
I'm excited to be here and to talk about Jetson Thor. It's certainly one of the most anticipated releases in embedded computing.
Host:
Let's start at the top. What exactly is Jetson AGX Thor, and how does it fit into the Jetson family?
Speaker:
Jetson AGX Thor is NVIDIA's newest system-on-module for robotics and physical AI. It sits above the Jetson AGX Orin, both in terms of raw performance and overall capability. What makes Jetson Thor stand out is that it's built to handle advanced robotics workloads where real-time reasoning is critical.
It symbolizes NVIDIA's vision of enabling what they call “Physical AI”, which refers to machines that can perceive, understand, and interact with the real world in a much richer way.
Host:
The specs are getting a lot of attention — up to 2070 FP4 TFLOPS and 128 gigabytes of memory. What do those numbers mean in practice for developers?
Speaker:
Those numbers translate to massive headroom for AI workloads. TFLOPS measure how many trillions of floating-point operations per second the system can perform, and FP4 refers to 4-bit floating-point precision, which shrinks the memory and compute footprint of a model. So with 2070 FP4 TFLOPS, developers can run very large models directly at the edge.
Combine that with 128 GB of memory, and you can run multiple large-scale neural networks at the same time. That could mean multi-camera sensor fusion, large language model integration, or advanced 3D mapping. The best part is that all of this happens without relying on constant cloud offloading.
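To make that concrete, here is a back-of-envelope sketch of how many model parameters could fit in 128 GB at FP4 precision. The 30% overhead reserved for activations, KV cache, and the OS is an assumption for illustration, not an NVIDIA figure:

```python
# Rough capacity estimate: how many FP4 model parameters fit in 128 GB?
# (The 30% overhead reservation is an illustrative assumption.)
BYTES_PER_FP4_PARAM = 0.5      # 4-bit weights = half a byte per parameter
MEMORY_GB = 128                # Jetson AGX Thor's stated memory capacity
OVERHEAD = 0.30                # assumed headroom for activations, cache, OS

usable_bytes = MEMORY_GB * (1 - OVERHEAD) * 1e9
max_params = usable_bytes / BYTES_PER_FP4_PARAM

print(f"~{max_params / 1e9:.0f} billion parameters fit in weights alone")
```

Even with generous overhead, the weights of a model well over 100 billion parameters could reside entirely in memory, which is what makes on-device LLM integration plausible.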
Host:
NVIDIA highlights the flexible power envelope of 40 to 130 watts. Why is that such a key factor for robotics and edge devices?
Speaker:
Robotics platforms are always constrained by size, weight, and power. With Jetson Thor, 40 watts allows mobile robots, drones, or compact devices to operate efficiently. On the other end, 130 watts gives you maximum performance for larger systems with higher thermal and power budgets.
Such flexibility means you can design once and scale across multiple robot types with the same core platform.
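As a quick illustration of why that 40-130 W range matters for mobile platforms, here is a simple runtime calculation. The 500 Wh battery and the assumption that compute is the only load are hypothetical, chosen just to show the trade-off:

```python
# Illustrative battery-runtime trade-off across Thor's power envelope.
# (The 500 Wh battery and compute-only draw are assumptions, not specs.)
BATTERY_WH = 500.0             # hypothetical robot battery capacity

runtimes = {}
for watts in (40, 130):        # endpoints of the stated power envelope
    runtimes[watts] = BATTERY_WH / watts
    print(f"{watts:>3} W compute budget -> ~{runtimes[watts]:.1f} h runtime")
```

The same module spans roughly a threefold difference in runtime, which is why one platform can serve both a compact mobile robot and a large stationary system.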
Host:
We've heard the term “Physical AI” used a lot. Can you explain what that means, and how Thor enables it?
Speaker:
Physical AI is about AI operating in the real world, perceiving and acting in dynamic environments. Jetson Thor enables this with high compute density, huge memory capacity, and support for state-of-the-art neural networks.
Imagine robots that can collaborate safely with humans on factory floors, autonomous machines that navigate unpredictable outdoor settings, or assistive robots that adapt to people's behavior. That's physical AI in action, and Thor is purpose-built for it.
Host:
How does Jetson Thor compare with its predecessor, the AGX Orin?
Speaker:
Thor delivers about 7.5 times the AI performance and 3.5 times the power efficiency of Orin. It integrates the latest GPU architecture, CPU upgrades, and faster I/O interfaces. Most importantly, it supports much larger models, including multi-modal and LLM-based AI for robotics.
For developers, that means a smoother path to scaling from existing Orin projects into more advanced workloads.
Host:
What industries or applications do you see benefiting most from Jetson Thor?
Speaker:
The impact is broad. In robotics, you'll see Thor driving autonomous mobile robots, humanoids, and warehouse automation systems. In industrial automation, it will power inspection platforms and predictive maintenance systems.
Mobility applications include autonomous vehicles, last-mile delivery robots and off-road vehicles. Even in healthcare and service robotics, Thor enables assistive devices that need both safety and real-time intelligence.
Host:
And finally, what does Thor unlock that wasn't possible before for developers in embedded vision?
Speaker:
Thor enables much more advanced vision workloads. You can handle higher-resolution multi-camera arrays and perform sensor fusion across vision, LiDAR, and radar in real time. That opens doors to true 3D perception, complex scene understanding, and predictive modeling.
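A first step in fusing vision, LiDAR, and radar is aligning samples in time across sensors running at different rates. The sketch below shows nearest-timestamp matching with illustrative frame times; it is a generic example, not tied to any particular SDK:

```python
# Minimal sketch of time-based sensor alignment, a common first step in
# multi-sensor fusion: pair each camera frame with the LiDAR sweep that
# is closest in time. Timestamps below are illustrative.
import bisect

def nearest_sample(timestamps, t):
    """Return the value in a sorted timestamp list closest to t."""
    i = bisect.bisect_left(timestamps, t)
    candidates = timestamps[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda s: abs(s - t))

camera_ts = [0.0, 0.033, 0.066, 0.1]   # ~30 fps camera frames (seconds)
lidar_ts = [0.0, 0.05, 0.1]            # ~20 Hz LiDAR sweeps (seconds)

pairs = [(t, nearest_sample(lidar_ts, t)) for t in camera_ts]
```

Real pipelines add interpolation, extrinsic calibration, and tolerance checks on top of this, but temporal association like this is the foundation of multi-sensor perception.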
For embedded vision leaders like e-con Systems, it means building camera solutions that are not only easy to integrate but also powerful enough to support next-generation AI robotics right out of the box.
Host:
That's a powerful way to close. Thank you for walking us through Jetson AGX Thor.
Speaker:
My pleasure. Always glad to share insights on where robotics and vision are heading.
Host:
That's all for today. Thank you for joining us on day one of this journey.
Please make sure to follow, so you never miss an episode.
If you'd like to learn more about e-con Systems, please visit www.e-consystems.com.
We'll be back next week with more conversations that put the future of vision in focus. Until then, take care!