Nvidia’s New $250 ‘Jetson Computer’ Lets Hobbyists Play Around With AI Locally

Nvidia has released a new $249 version of its Jetson computer for running artificial intelligence applications locally. The palm-sized Orin Nano doubles the speed and efficiency of its predecessor at half the price, and it can process roughly 70% more computational tasks, according to Nvidia.

The Orin Nano is ideal for hobbyists looking to train their own artificial intelligence applications, or for developers who want robots and other industrial tools to run sophisticated applications without connecting to the cloud.

In a brief video on YouTube announcing the product, Nvidia CEO Jensen Huang stands in his kitchen as he pulls a tray out of his oven to reveal the small, palm-sized computer. Huang goes on to say the computer can process almost “seventy trillion” operations per second and draws just 25 watts of power.

Nvidia CEO Jensen Huang on Tuesday debuted Nvidia’s latest Jetson computer for running AI programs locally. Credit: Nvidia
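Taking Huang's figures at face value, a quick back-of-the-envelope calculation (a sketch, not an official Nvidia benchmark) shows the efficiency those numbers imply:

```python
# Rough efficiency estimate from the figures in Nvidia's announcement:
# about 70 trillion operations per second (70 TOPS) at 25 watts of power.
ops_per_second = 70e12  # ~70 TOPS, per Huang's stated figure
power_watts = 25        # stated power draw

tops_per_watt = (ops_per_second / 1e12) / power_watts
print(f"{tops_per_watt:.1f} TOPS per watt")  # → 2.8 TOPS per watt
```

That works out to roughly 2.8 TOPS per watt, which is what lets a device this size run AI workloads without active cooling infrastructure or a data-center power budget.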

“A long time ago, we created a brand new type of processor, it was a robotics processor,” Huang says. “Nobody understood what we were building at the time, and we imagined that someday these deep learning models would evolve and we would have robots for everything.” The Jetson computers were intended to power robots, but today can also run large language models like Meta’s Llama.

Sam Altman’s startup World (formerly Worldcoin), which aims to authenticate human identity with iris scans, uses a Jetson module inside its Orb scanning device. In a blog post back in October, World said, “In its newest iteration, the Orb is equipped with the most advanced NVIDIA Jetson module with nearly 5x the AI performance over the previous version to enable even faster, more seamless proof of human verifications.”

In essence, the Orin Nano is a portable brain that can be plugged into other hardware to power its AI functionality. Cloud hyperscalers, a term for companies like Amazon and Google that build massive data centers for AI, charge for access to servers and AI models, and those costs can add up. Certain applications, such as warehouse robots, also need guaranteed uptime and minimal latency, so depending on a remote cloud hosting provider is not ideal. That said, a computer like the Orin Nano will only be capable of running more lightweight AI applications; it will not replace Nvidia’s high-end GPUs, which cost tens of thousands of dollars and can train and run inference on large-scale AI models.

Still, in a world where running AI is becoming increasingly cheap and accessible, startups are quickly going to learn the value of having real intellectual property. Shoving an AI model into a smart home camera is no longer going to be good enough when anyone can easily do it.
