Nvidia upgrades tiny Jetson Orin Nano dev kits for the holidays

'Super' edition promises 67 TOPS and 102GB/s of memory bandwidth for your GenAI projects


Nvidia is bringing the AI hype home for the holidays with the launch of a tiny new dev board called the Jetson Orin Nano Super.

Nvidia cheekily bills the board as the "most affordable generative AI supercomputer," though that might be stretching the term quite a bit.

The dev kit consists of a system on module, similar to a Raspberry Pi Compute Module, that sits atop a reference carrier board for I/O and power. And, just like a Raspberry Pi, Nvidia's diminutive dev kit is aimed at developers and hobbyists looking to experiment with generative AI.

Nvidia's Jetson Orin Nano Super development kit: a compute module containing the SoC, attached to a carrier board for I/O and power

Under the hood, the Orin Nano features six Arm Cortex-A78AE cores along with an Nvidia GPU based on its older Ampere architecture with 1024 CUDA cores and 32 tensor cores.

The hardware design appears to be identical to the original Jetson Orin Nano. However, Nvidia says the board now ships alongside upgraded software that unlocks additional performance and comes at a lower price of $249, compared to the original $499 price tag.

In terms of performance, the Jetson Orin Nano Super packs 67 TOPS of INT8 compute, which is faster than the NPUs in any of Intel, AMD, or Qualcomm's latest AI PCs, and is backed by 8GB of LPDDR5 memory capable of 102GB/s of memory bandwidth. According to Nvidia, these specs reflect a 70 percent uplift in performance and 50 percent more memory bandwidth than its predecessor.

The bandwidth boost is particularly important for those looking to play with the kind of large language models (LLMs) that power modern AI chatbots at home. At 102GB/s, we estimate the dev kit should be able to generate words at around 18-20 tokens a second when running a 4-bit quantized version of Meta's 8-billion-parameter Llama 3.1 model.
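As a rough back-of-the-envelope sketch (ours, not Nvidia's): decode-phase LLM inference is usually memory-bandwidth bound, because generating each token requires reading essentially all of the model's weights, so peak token rate is roughly bandwidth divided by the size of the quantized weights. The figures below are assumptions for illustration only.

```python
# Back-of-the-envelope estimate of bandwidth-bound token throughput.
# Assumption: tokens/sec ~= memory bandwidth / bytes of weights read per token.

params = 8e9              # Llama 3.1 8B parameter count
bits_per_weight = 4.5     # ~4-bit quantization plus scale/zero-point overhead
bandwidth_bps = 102e9     # Jetson Orin Nano Super memory bandwidth, bytes/sec

weight_bytes = params * bits_per_weight / 8          # ~4.5GB of weights
peak_tok_per_sec = bandwidth_bps / weight_bytes      # theoretical ceiling, ~22.7

# Real systems rarely sustain peak bandwidth; assume ~85 percent efficiency.
print(f"theoretical ceiling: {peak_tok_per_sec:.1f} tok/s")
print(f"with ~85% efficiency: {peak_tok_per_sec * 0.85:.1f} tok/s")
```

With those assumptions the ceiling works out to roughly 22 tokens a second, and a realistic efficiency factor lands in the 18-20 token-per-second range quoted above.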

If you're curious about how TOPS, memory capacity, and bandwidth relate to model performance, you can check out our guide here.

For I/O, the dev kit's carrier board features the usual fare of connectivity for an SBC, including gigabit Ethernet, DisplayPort, four USB 3.2 Gen 2 type-A ports, USB-C, dual M.2 slots with M and E keys, along with a variety of expansion headers.

In terms of software support, you might think those Arm cores would be problematic; however, that really isn't the case. Nvidia has supported GPUs on Arm processors for years, with its most sophisticated designs - the GH200 and GB200 - pairing its accelerators with its custom Arm-based Grace CPU. This means you can expect broad support for the GPU giant's software suite, including Nvidia Isaac, Metropolis, and Holoscan, to name a few.
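If you want to sanity-check that the Ampere GPU is visible to CUDA-aware software on the Arm side, a quick probe from Python works the same way it would on an x86 box. This is a minimal sketch that assumes a JetPack-compatible PyTorch build is installed on the device; the printed values will vary by board.

```python
# Minimal sketch: confirm the Jetson's GPU is visible to CUDA-aware software.
# Assumes a JetPack-compatible PyTorch wheel is installed on the device.
import torch

if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))            # e.g. an Orin-class device
    print("Compute capability:", torch.cuda.get_device_capability(0))
else:
    print("No CUDA device visible - check the JetPack/PyTorch install")
```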

Along with a variety of open AI models available via Nvidia's Jetson AI lab, the dev kit also supports up to four cameras for robotics or vision processing workloads.
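For the vision side, a first "is the camera alive" check doesn't need any Nvidia-specific tooling. The snippet below is a generic sketch that assumes a USB (UVC) camera at index 0; CSI cameras on Jetson boards are typically driven through a GStreamer pipeline passed to OpenCV instead of a plain device index.

```python
# Generic sketch: grab a single frame from a USB camera with OpenCV.
# Assumes a UVC camera at index 0; Jetson CSI cameras usually require a
# GStreamer pipeline string passed to VideoCapture instead.
import cv2

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print("Captured frame:", frame.shape)   # e.g. (height, width, 3)
    cv2.imwrite("test_frame.jpg", frame)
else:
    print("No frame - check the camera connection or device index")
cap.release()
```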

Alongside the new dev kit, Nvidia is also rolling out a software update to its older Jetson Orin NX and Nano system on modules, which it says should boost GenAI performance by 1.7x, so if you already picked up an Orin Nano, you shouldn't be missing out on too much. ®
