It all started with two software engineers and a tomato farmer on a road trip down the West Coast.

Visiting farms to research their needs, the three hatched a plan in an apple orchard: to build a highly adaptive 3D vision AI system to automate field tasks.

Verdant, based in the San Francisco Bay Area, is developing artificial intelligence that promises universal farm assistance in the form of tractor-powered equipment for weeding, fertilizing and spraying.

Founders Lawrence Ibarria, Gabe Sibley and Curtis Garner—two Cruise Automation engineers and a tomato-growing manager—use the NVIDIA Jetson edge AI platform and NVIDIA Metropolis SDKs, including the TAO Toolkit and DeepStream, for this ambitious piece of farm automation.

The startup, founded in 2018, has commercial deployments on carrot farms and trials on apple, garlic, broccoli and lettuce farms in California’s Central Valley and Imperial Valley, as well as in Oregon.

Verdant aims to help farms go organic by reducing production costs, increasing yields and supplying the labor: it employs the tractor drivers, who are trained to operate its AI-controlled implements. The company’s robots-as-a-service (RaaS) model lets farmers see results in increased yields and reduced chemical costs, and pay per acre for those results.

“We wanted to do something meaningful to help the environment,” said Ibarria, Verdant’s chief operating officer. “And this not only reduces costs for farmers, but also increases their yields.”

The company recently raised more than $46 million in Series A funding.

Another recent development at Verdant was the hiring of Chief Technology Officer Frank Dellaert, who is known for using graphical models to solve large-scale mapping and 4D reconstruction problems. A Georgia Institute of Technology faculty member, Dellaert led work at Skydio, Facebook Reality Labs and Google AI while on sabbatical from the university.

“One of the things that impressed me when I joined Verdant was how they measure performance in real time,” noted Dellaert. “It’s a promise to the grower, but it’s also a promise to the environment. It shows whether we are really delivering the savings on chemicals used in the field.”

Verdant is a member of NVIDIA Inception, a free program that provides startups with technical training, go-to-market support and AI platform guidance.

Companies around the world — Monarch Tractor, Bilberry, Greeneye, FarmWise, John Deere and many others — are building a new generation of sustainable agriculture on NVIDIA Jetson edge AI.

Working with Bolthouse Farms

Verdant has partnered with Bolthouse Farms, based in Bakersfield, California, to help its carrot growing business transition to regenerative farming practices. The goal is to use more sustainable farming methods, including reducing herbicides.

Verdant is starting with weeding for Bolthouse and will then expand into precision fertilization.

Verdant’s computing and automation have helped Bolthouse Farms see how to reach its sustainable farming goals, according to the farm’s management.

Riding the Jetson AGX Orin

Verdant puts the Jetson AGX Orin system-on-module inside the tractor cab. The company says Orin’s computing muscle and its availability in protective enclosures from vendors make it the clear choice for agricultural applications. Verdant also works with Jetson ecosystem partners, including RidgeRun and Leopard Imaging.

The module lets Verdant generate 3D visualizations that show the tractor operator how plants are being treated. The company uses two stereo cameras for field visualization and inference, and for collecting field data used to train models on NVIDIA DGX systems, powered by NVIDIA A100 Tensor Core GPUs, at its headquarters. DGX performance lets Verdant train on larger datasets for better model accuracy at inference.
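
As a rough illustration of how a stereo camera pair yields the depth information behind such 3D views, here is a minimal OpenCV sketch; the file names, matcher parameters and calibration values are assumptions for illustration, not details of Verdant’s pipeline.

```python
# Minimal stereo-depth sketch (assumed setup, not Verdant's pipeline).
import cv2
import numpy as np

# Rectified left/right frames from a stereo pair (hypothetical file names).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching; parameters here are illustrative guesses.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point to pixels

# Depth (meters) from disparity: depth = focal_length_px * baseline_m / disparity.
focal_length_px, baseline_m = 1000.0, 0.12  # assumed calibration values
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = focal_length_px * baseline_m / disparity[valid]
```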

“We’re showing a model of the tractor and a 3D rendering of every single carrot, every single weed, and the actions we’re taking, so it helps customers see what the robot is seeing and doing,” Ibarria said, noting that all of this runs on a single AGX Orin module, delivering inference at 29 frames per second in real time.

Apple Vision based on DeepStream

Verdant relies on NVIDIA DeepStream as the foundation for running its core machine learning, strengthening its detection and segmentation. It also uses custom CUDA kernels to perform much of the tracking and positioning work.
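
For readers unfamiliar with DeepStream, the sketch below shows the general shape of a GStreamer-based detection pipeline built from standard DeepStream elements; the source file, resolution and config name are placeholder assumptions, not Verdant’s production setup.

```python
# Minimal DeepStream-style detection pipeline sketch (placeholder sources and configs).
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# filesrc -> decode -> stream mux -> nvinfer (detector) -> on-screen display -> sink.
# "sample.h264" and "detector_config.txt" are hypothetical placeholders; the display
# sink is platform-dependent (e.g. Jetson may need nvegltransform before nveglglessink).
pipeline = Gst.parse_launch(
    "filesrc location=sample.h264 ! h264parse ! nvv4l2decoder ! "
    "m.sink_0 nvstreammux name=m batch-size=1 width=1920 height=1080 ! "
    "nvinfer config-file-path=detector_config.txt ! "
    "nvvideoconvert ! nvdsosd ! nveglglessink"
)

# Run until end-of-stream or error, then clean up.
pipeline.set_state(Gst.State.PLAYING)
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE, Gst.MessageType.EOS | Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)
```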

Verdant founder and CEO Sibley, whose postdoctoral research was in simultaneous localization and mapping (SLAM), brought that experience to agriculture. It comes in handy for building a consistent view of the farm over time, Ibarria said. “We can see things and know when and where we saw them,” he said.

This matters for apples, he said. Treating them can be difficult because apples and branches often overlap, making it hard to find the best way to spray them. The 3D visualizations made possible by AGX Orin give a better understanding of occlusions and the correct spraying path.

“With apples, when you see blooms, you can’t just spray them the moment you see them; you have to wait 48 hours,” Ibarria said. “We do this by creating a map and localizing ourselves against it, so we can say, ‘That flower was blooming, I saw it two days ago, so it’s time to spray.’”
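
Conceptually, that workflow amounts to keeping a geo-referenced log of bloom sightings with timestamps and spraying only once enough time has passed. The sketch below illustrates the idea in simplified form; apart from the 48-hour wait taken from the quote, every name and value is assumed.

```python
# Simplified sketch of a timestamped bloom map (illustrative, not Verdant's code).
from dataclasses import dataclass
from datetime import datetime, timedelta

SPRAY_DELAY = timedelta(hours=48)  # wait time mentioned by Ibarria

@dataclass
class BloomObservation:
    x: float            # field coordinates in meters (assumed reference frame)
    y: float
    seen_at: datetime   # when the bloom was first detected

def ready_to_spray(obs: BloomObservation, now: datetime) -> bool:
    """A bloom becomes sprayable once 48 hours have passed since it was first seen."""
    return now - obs.seen_at >= SPRAY_DELAY

# Usage: check an observation recorded roughly two days earlier.
obs = BloomObservation(x=12.4, y=87.1, seen_at=datetime(2023, 3, 1, 9, 0))
print(ready_to_spray(obs, now=datetime(2023, 3, 3, 10, 0)))  # True
```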

NVIDIA TAO for 5x Faster Model Production

Verdant relies on the NVIDIA TAO Toolkit for model development. The transfer learning capability in the TAO Toolkit lets the team take pretrained models and quickly fine-tune them on images captured in the field. For example, it made it possible to go from detecting carrots to detecting onions in just a day. Previously, building models from scratch to an acceptable level of accuracy took about five days.
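
TAO wraps this workflow in its own launcher and spec files, so the sketch below instead illustrates the underlying transfer-learning pattern in plain PyTorch: load a pretrained network, freeze the backbone and fine-tune a new head on a small set of field images. The dataset path, class labels and training settings are assumptions for illustration.

```python
# Generic transfer-learning sketch in PyTorch (illustrative; TAO uses its own CLI/spec workflow).
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Pretrained backbone; freeze its weights and retrain only a new classification head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)  # e.g. crop vs. weed (assumed classes)

# Small field-image dataset laid out as one folder per class (hypothetical path).
tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
loader = torch.utils.data.DataLoader(
    datasets.ImageFolder("field_images/", transform=tfm), batch_size=16, shuffle=True
)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):  # a short fine-tune, not days of training from scratch
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```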

“One of our goals here is to use technologies like TAO and transfer learning to get up and running in new circumstances very quickly,” Dellaert said.

Along with reducing model build time by a factor of five, the company has been able to achieve 95% accuracy in its vision systems using these techniques.

“Transfer learning is a big weapon in our arsenal,” he said.