At a U.S. military base in central California, a four-seater all-terrain vehicle roams a hillside path. This is training, but not for the people in the vehicle. It's an effort to train AI models to operate in conflict zones.
The autonomous military ATVs are operated by Scout AI, a startup founded in 2024 by Colby Adcock and Collin Otis that calls itself a "frontier lab for defense." The company announced Wednesday that it has raised $100 million in a Series A round led by Align Ventures and Draper Associates, following a $15 million seed round in January 2025.
Scout invited TechCrunch on an exclusive tour of the training operation at the military base, which the company asked us not to name.
The company is building an AI model called "Fury" to operate and command military assets. Initially it will be used for logistics support, but over time it could also be used for autonomous weapons. CTO Collin Otis likens the effort, which builds on existing LLMs, to training soldiers.
"They start at 18 years old, or in some cases after college, so you start with that base level of intelligence," Otis told TechCrunch. "It's helpful to start with that investment already made and ask how we can teach this to be an amazing military AGI, rather than just a broadly intelligent AGI."
Scout has won military technology development contracts totaling $11 million from DARPA, the Army Applied Research Laboratory, and other Department of Defense customers. The company is one of 20 autonomy companies whose technology the U.S. Army's 1st Cavalry Division is using during routine training at Fort Hood, Texas. Scout hopes to have a proven product ready when the unit next deploys in 2027.
Scout's internal testing is where the rubber meets the dirt on the base's hilly terrain. There, the company's operations team, staffed by former soldiers, runs the vehicles through simulated missions.
Self-driving cars are appearing in more cities around the world, where they operate in structured, rule-bound environments. Operating autonomously on unmarked trails and off-road is another challenge entirely. Otis, a former executive at the self-driving truck company Kodiak, said he wanted to start Scout because he realized the systems he helped develop were not intelligent enough to operate in unpredictable combat zones.

A new approach to autonomy
Scout focuses on a newer autonomy technology called the vision-language-action model (VLA), which controls robots using LLM-based intelligence. The technique, first published by Google DeepMind in 2023, has seeded robotics startups like Physical Intelligence and the humanoid robot company Figure AI, led by Adcock's brother Brett.
Adcock sits on Figure's board, and he says that experience convinced him of the opportunity to bring broad intelligence to the military's growing fleet of autonomous vehicles. His brother introduced him to Otis, who was advising Figure, and the two set out to apply modern AI to military problems.
"Right now, if I hand you a drone controller and you put on a headset, you can learn to fly a drone in minutes," Otis said. "It's really just learning how to connect your prior knowledge to these little joysticks. It's not a huge leap. That's how we think about VLAs and why VLAs are such an unlock."
In fact, I had the opportunity to drive one of Scout's ATVs on rutted trails, and the terrain was challenging: steep hills, loose sand around the turns, tracks that faded away, and confusing intersections. I'm not an experienced ATV driver, but I did pretty well on my first try (if I do say so myself). That's the kind of general competence the company is after in its models. Scout started the process just six weeks ago and has been conducting training with these civilian ATVs since.
I also rode in an autonomously driven ATV and could feel the difference: it accelerates harder than a human driver mindful of passenger comfort. The operations team, like the training drivers, noted how the vehicle keeps to the right on wide roads but stays centered on narrow ones. And when it gets confused, it suddenly slows down to consider its next move, which happened several times during the 6.5-kilometer loop that took us back to base.
Although VLAs are new enough that no company has yet deployed one in an operational environment, "the technology is good enough to be tested in the field with soldiers to find the most effective method for the U.S. military," said Stuart Young, a former DARPA program manager who worked on ground-vehicle autonomy. And like other autonomy companies, Scout's full autonomy stack includes deterministic systems and other types of AI to round out the agent's capabilities.
Young, who left DARPA for industry this month, managed a program called RACER, which pushed companies to develop fast, autonomous off-road vehicles and seeded the field much as the agency's Grand Challenge boosted self-driving cars. Two competitors in the space, Field AI and Overland AI, spun out of that program; Scout joined as a later addition.
The first application of ground autonomy will be autonomous resupply, Scout executives and military officers say. Hauling water and ammunition to distant observation posts, or running convoys of six to ten autonomous vehicles trailed by a single manned truck, frees up valuable human labor for more important missions. Brian Maswich, an active-duty infantry officer serving as a military fellow at Scout, recalled a recent exercise in Alaska where he led a supply convoy in pitch darkness, wishing autonomous vehicles could have taken the job.

Adding intelligence to the Army's motor pool
Scout considers itself primarily a software company that builds the intelligence layer for military machines. Rather than building autonomous vehicles itself, Scout builds on top of existing ones.
Adcock expects the startup's first widely adopted product to be something called Ox: the company's command-and-control software bundled with add-on compute hardware (GPUs, communications, cameras). The idea is to let individual soldiers coordinate multiple drones and autonomous ground vehicles with commands like "Go to this waypoint and observe enemy forces."
However, the software requires on-vehicle training to work. That's the job of the Foundry, the company's name for its training range at the military base. There, drivers spend eight-hour shifts driving the ATVs at their own pace, while a reinforcement learning system records where they had to take over and uses that data to improve the model. The base's commander has asked to use the company's ATVs for security patrols.
One hypothesis Scout is testing is that VLAs can turn this relatively limited dataset, combined with simulated training data, into a fully capable driving agent. For now, the vehicle looks comfortable on trails but is not ready to operate fully off-road.
Scout is also training drones for reconnaissance and as weapons, giving them intelligence through vision-language models, a variant of multimodal LLMs.
Scout is working on a system in which groups of munition drones would fly alongside a larger "quarterback" platform carrying more compute to command them. One mission might involve drones searching a geographic area for hidden enemy tanks and attacking them, perhaps without human intervention. Otis argues that the alternative in this scenario would be indirect artillery fire, which is inaccurate compared with drone strikes.
Autonomous weapons are a flashpoint in defense technology policy, but experts note the concept is old: heat-seeking missiles and landmines have been in use for decades. The question for engineers is how the weapon is controlled, Jay Adams, a retired U.S. Army captain who heads Scout's operations team, told TechCrunch.
He points out that the company's military drones can be programmed to attack only threats inside a specific geographic area, or only after human confirmation. He also said an autonomous weapons platform is unlikely to open fire out of fear, the way an 18-year-old soldier might.
VLAs could also enable better targeting. Scout says its models are pre-trained on military-specific data so that, for example, a vehicle can recognize an enemy tank it encounters during a resupply mission. Lt. Col. Nick Rinaldi, who oversees Scout's research at the Army Applied Research Laboratory, said automated targeting is difficult and unlikely to be used in the short term outside of constrained environments, but VLAs' potential for reasoning about threats makes the technology promising to investigate.
Adams said drones' ability to identify their own targets is key to future warfare. Russia's invasion of Ukraine has sparked intense interest in drone warfare, and the U.S. military believes that one human controlling one UAV cannot scale against the sheer number of low-cost unmanned systems now threatening it.
A mission to counter anti-military sentiment

Like many defense startups, Scout has a pointed mission statement, and its executives don't hesitate to criticize companies reluctant to hand over technology to the government. Google, for example, reportedly pulled out of a Pentagon competition to develop a control system for autonomous drone swarms, a capability Scout is also working on.
"AI people don't want to work with the military," Otis told TechCrunch, referring to Anthropic's dispute with the Department of Defense over usage terms. "None of them will accept running agents on one-way attack drones or on missile systems."
Nevertheless, Scout currently builds its agent on top of an existing LLM, though it won't say which one. Otis said the company has contracted with "a very well-known hyperscaler" to provide pre-trained intelligence for Scout's base model. He declined to say whether it uses open-weight models like those offered by Chinese companies, which many inference-heavy companies build on because they run at lower cost than models from frontier labs such as Anthropic and OpenAI.
Scout plans to address this by building its own models from scratch over the next few years, and the founders say much of the new capital will go toward their training and compute costs. Otis even wonders whether Scout could beat the established leaders to AGI, since Scout's models constantly interact with the real world.
"There is an argument in the AGI community about whether you can get to intelligence just by reading the internet, or whether most intelligence comes from interacting with the world," Otis said.
So does that mean Adcock is competing with his brother's humanoid robot army at Figure? No, says Otis, but "we can scale faster because our customers have the assets," referring to the Department of Defense.
