Trained on classified battlefield data, AI multiplies effectiveness of Ukraine’s drones: Report
“These systems can often achieve objectives using just one or two drones per target rather than eight or nine,” Ukrainian-American scholar Kateryna Bondar, a former advisor to Kyiv, writes in a new report released today by CSIS.


Ukrainian soldiers, nicknamed ‘Doc’ and ‘Dean’, work together piloting an FPV as the 112th brigade’s 244th battalion of the Ukrainian army conducts surveillance and FPV drone attacks against Russians defending their positions on the horizon in Chasiv Yar, Ukraine on January 17, 2025. (Photo by Andre Luis Alves/Anadolu via Getty Images)
WASHINGTON — Ukraine has taken publicly available AI models, retrained them on its own extensive real-world data from frontline combat, and deployed them on a variety of drones — increasing their odds of hitting Russian targets “three- or four-fold,” according to a new think tank report.
“By removing the need for constant manual control and stable communications … drones enabled with autonomous navigation raise the target engagement success rate from 10 to 20 percent to around 70 to 80 percent,” writes Ukrainian-American scholar Kateryna Bondar, a former advisor to Kyiv, in a new report released today by the Center for Strategic and International Studies. “These systems can often achieve objectives using just one or two drones per target rather than eight or nine.”
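Those two figures are consistent with each other under a simple model. Treating each drone sortie as an independent attempt, the expected number of drones expended per destroyed target is just the reciprocal of the per-sortie hit rate — a back-of-the-envelope sketch (the hit rates are the report’s; the independence assumption is ours):

```python
# Expected drones expended per destroyed target, modeling each sortie as an
# independent Bernoulli trial: E[drones] = 1 / p_hit.

def drones_per_target(p_hit: float) -> float:
    """Expected number of drones to score one hit at per-sortie hit rate p_hit."""
    if not 0 < p_hit <= 1:
        raise ValueError("hit rate must be in (0, 1]")
    return 1.0 / p_hit

# Manual control under jamming: roughly 10 to 20 percent per sortie.
manual = [drones_per_target(p) for p in (0.10, 0.20)]   # 10.0 and 5.0
# Autonomous final approach: roughly 70 to 80 percent per sortie.
auto = [drones_per_target(p) for p in (0.70, 0.80)]     # ~1.43 and 1.25

print(f"manual control: {manual[1]:.1f} to {manual[0]:.1f} drones per target")
print(f"AI terminal guidance: {auto[1]:.2f} to {auto[0]:.2f} drones per target")
```

Even this simple model lands in the same neighborhood as the report’s numbers: several drones per target under manual control, versus “one or two” with autonomous terminal guidance.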
To be clear, Ukraine has not built the Terminator. “We’re very far from killer robots,” Bondar told Breaking Defense in an exclusive interview. But in contrast to the more cautious bureaucracy of the West, she said, “the Ukrainians are more open to testing and trying anything and everything that can kill more Russians.”
The AI in question, Bondar explains, relies on the human to select a target; only then can the AI make the final approach on its own, autonomously flying the last 100 to 1,000 meters. While very limited, this autonomous final approach is still a huge improvement over most drones on both sides of the war, which require a human hand on the controls to guide them all the way to impact. If that human hand is too tired, shaking with fear, or just poorly trained, or if the control signal is disrupted by increasingly omnipresent frontline radio jamming, the remote-controlled drone will crash uselessly into the countryside.
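The report doesn’t specify the guidance law these drones use; the sketch below is only an illustrative assumption — a minimal pure-pursuit loop showing the division of labor Bondar describes, where the operator locks a target and the drone then closes the final few hundred meters with no further control-link input:

```python
import math

def terminal_approach(drone, target, speed=30.0, dt=0.1, lock_range=1000.0):
    """Illustrative pure-pursuit terminal guidance (an assumption, not the
    fielded algorithm): once the operator locks a target inside lock_range,
    step the drone toward it each tick without any further operator input.
    Positions are (x, y) in meters; speed in m/s. Returns the flight path."""
    if math.dist(drone, target) > lock_range:
        raise ValueError("target outside terminal-guidance lock range")
    path = [drone]
    x, y = drone
    tx, ty = target
    while math.dist((x, y), (tx, ty)) > speed * dt:
        heading = math.atan2(ty - y, tx - x)   # steer at last-known target position
        x += speed * dt * math.cos(heading)
        y += speed * dt * math.sin(heading)
        path.append((x, y))
    path.append((tx, ty))                      # final impact
    return path

# Operator locks a target ~620 m out; the drone flies the rest autonomously.
route = terminal_approach(drone=(0.0, 0.0), target=(600.0, 150.0))
print(f"impact after {len(route)} guidance steps at {route[-1]}")
```

The key property, and the reason jamming matters so much less, is that nothing inside the `while` loop depends on a radio link — only on what the drone itself can sense.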
For now, Bondar found, the vast majority of Ukrainian drones still require human control all the way to the target. Of nearly 2 million put on contract by Kyiv in 2024 — 96 percent of which, incidentally, were built in Ukraine — the report says only 10,000 definitely used AI guidance, less than half of one percent. (The actual total may be much higher, she cautioned.) While 10,000 may seem like a lot by the standards of anemic Western peacetime procurement, by some estimates it’s merely the number of drones Ukraine expends in the average month.
However, that relative handful of autonomous-final-approach drones has proved so effective in combat that Kyiv now plans to ramp up production drastically. While there’s no publicized target, Bondar estimates that the Ukrainian military wants at least half the drones it buys in 2025 to have AI guidance — an increase from 0.5 percent to 50 percent.
If realized, that ambition means deploying about a million AI-assisted drones, each three or four times as likely to hit its target as current remote-controlled models — easily a twelve-fold increase in killing power. In a war of grinding attrition, where drones have replaced artillery, the traditional “king of battle,” as the main cause of casualties, those numbers could be decisive.
How Ukraine Did It: Keep It Simple, Suchka
Just 13 months ago, this kind of autonomous-final-approach AI was still unworkable on the small, cheap drones that have dominated fighting in Ukraine. Russia had rolled out and then apparently abandoned such a feature on its widely used Lancet, while Ukrainian counterparts remained promised products rather than practical weapons.
But starting this fall, reports of AI-guided drones began popping up again, this time on the Ukrainian side. How did a country under constant bombardment, with a GDP comparable to Iraq’s, an arms industry inherited from the Soviet Union, and erratic support from Western allies, manage such a feat?
To start with, most of the algorithms guiding Ukraine’s autonomous-final-approach drones are derived from free, internationally available open-source models, Bondar found. That allowed Ukrainians to skip some of the most expensive and time-consuming parts of AI development.
From that open-source baseline, Bondar explained, the Ukrainians intensively retrained the models on Kyiv’s own classified, real-world battle data — using datasets tailored not just to current combat conditions but, often, to a specific sector of the front and specific types of drone.
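The report doesn’t name the specific models or frameworks involved. The toy sketch below only illustrates the transfer-learning pattern it describes — keep a pretrained feature extractor frozen and retrain a small task-specific head on the new, domain-specific data; the synthetic data and every name here are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen, pretrained backbone: a fixed projection mapping raw
# inputs to feature vectors. In practice this would be the open-source model's
# pretrained layers, left untouched during retraining.
W_backbone = rng.normal(size=(16, 8))

def frozen_features(x):
    return np.tanh(x @ W_backbone)

# Synthetic stand-in for sector-specific battlefield data: labels are a linear
# function of the frozen features, so the small head alone can learn them.
X = rng.normal(size=(400, 16))
true_w = rng.normal(size=8)
y = (frozen_features(X) @ true_w > 0).astype(float)

# Retrain ONLY the small task head (a logistic regression) on the new data;
# the backbone's weights never change.
w = np.zeros(8)
lr = 0.5
for _ in range(300):
    p = 1 / (1 + np.exp(-frozen_features(X) @ w))
    w -= lr * frozen_features(X).T @ (p - y) / len(X)   # gradient step on head only

acc = (((frozen_features(X) @ w) > 0) == (y == 1)).mean()
print(f"head-only training accuracy: {acc:.2f}")
```

The payoff of this pattern, per the report, is that the expensive part of AI development — pretraining the backbone — is skipped entirely, while the cheap part is repeated per front sector and per drone type.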
“The frontline is very long … and the situation is very different on different parts of the front,” Bondar explained. For instance, she said, the infamous Bakhmut sector is now stable and static, with both sides dug in to avoid artillery and drones. In other areas, however, the Russians continue to advance, sending “wave after wave” of expendable “Storm Z” troops in small squads across dangerously open terrain.
The kind of drone the algorithm is meant for also matters, Bondar noted. High-altitude reconnaissance drones, for example, see the battlefield and targets from a different angle and at greater distances than the low-flying First Person View (FPV) drones that do most of the actual strikes, so imagery collected by one may not be suitable for training the other. Even the specific type of camera matters.
All these factors go to show that in Ukraine, as in machine learning development generally, the right training data is critical to making a model actually learn something useful. In Ukraine, getting the data right required a combination of bottom-up innovation by private-sector techies and top-down organization by government officials.
The initial outpouring of international aid and domestic innovation after February 2022 had left Ukraine with a bewildering variety of equipment, including information technology. “They call it a ‘zoo’ of technologies,” Bondar said. “One warfighter can have 10 different software systems on his tablet or phone.”
Some intelligence data comes through official channels from military-operated recon drones, but much of the information is still cobbled together by volunteers combing social media apps like Telegram for reports, photos, and videos of Russian vehicles.
“Now the government is trying to increase interoperability among these systems and integrate them,” Bondar explained. “They started to address this issue by creating, basically, one universal military dataset.”
While much of the data is still uploaded and even labeled by volunteers, those labels now have to conform to government standards for tags and categories. That better-organized data is then housed on a volunteer-developed but government-run military intelligence system known as Delta.
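Delta’s actual data standards aren’t public; the sketch below merely illustrates what enforcing a shared tag-and-category standard on volunteer-submitted labels might look like, with hypothetical field names and categories:

```python
# Hypothetical label schema: required fields plus a closed category vocabulary.
ALLOWED_CATEGORIES = {"tank", "ifv", "artillery", "truck", "infantry"}
REQUIRED_FIELDS = {"category", "bbox", "source", "timestamp"}

def validate_label(label: dict) -> list[str]:
    """Return a list of problems with a volunteer-submitted label (empty = valid)."""
    problems = []
    missing = REQUIRED_FIELDS - label.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if label.get("category") not in ALLOWED_CATEGORIES:
        problems.append(f"unknown category: {label.get('category')!r}")
    bbox = label.get("bbox")
    if not (isinstance(bbox, (list, tuple)) and len(bbox) == 4):
        problems.append("bbox must be [x, y, width, height]")
    return problems

# A conforming label passes; a free-form volunteer submission gets flagged.
ok = {"category": "tank", "bbox": [10, 20, 64, 48],
      "source": "telegram", "timestamp": "2024-11-02T09:30:00Z"}
bad = {"category": "APC", "bbox": [10, 20]}
print(validate_label(ok))    # [] -- conforms to the schema
print(validate_label(bad))
```

This is the unglamorous step that turns a “zoo” of crowdsourced reports into one universal training dataset: every label, whoever submitted it, ends up in the same machine-readable shape.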
To protect the data, Bondar went on, “they have created a secure training environment.” Private companies can access the information and create custom datasets tailored to their specific training needs, but the data itself remains on government computers, and it’s on those computers that the algorithms are actually trained.
Only the final products, the guidance algorithms themselves, are exported and installed on drones, and even then only in encrypted form so the Russians can’t easily copy captured tech.
All of this only works because the Ukrainians limited their appetites for AI, Bondar emphasized, developing lots of small, specialized systems instead of a few mega-projects — a lesson the US defense sector is starting to learn.
“They also started with this idea of creating a huge mega-platform which would cover everything and do everything,” she said. “They slowly realized … the current level of AI development allows you to train models on very small datasets.”
“These small models are easier to train, easier to update,” Bondar said. “They’re way cheaper.”