Drones are life-saving public safety tools. You’ve seen them deployed over wildfires, hovering during floods, or scouting collapsed buildings. But how do we know which ones are actually good enough for the job?

The Department of Homeland Security tested five U.S.-approved drones made for first responders through its Science and Technology Directorate. Not in a lab. Out in the field. In urban chaos.

The Skydio X10D ranked first of the five, scoring an impressive 4.0 out of 5.0 and beating heavyweights like the Freefly Astro and Parrot’s ANAFI USA GOV. Evaluators praised its sharp EO camera, stable connection, and fast response time.

These drones were rated by firefighters and police officers from across the country—people who actually use them when lives are on the line. Let’s look at how the tech delivers in real urban emergencies.

Background: Compliance Factor in Urban Emergency Response

Not all drones make the cut. Only certain models meet federal safety, security, and performance standards. That’s where the “Blue UAS” label comes in.

The Department of Defense’s Defense Innovation Unit created the Blue UAS list to flag drone models that are cleared for U.S. government use. These drones have been tested for cybersecurity risks, and they’re barred from using components or software from restricted foreign sources, in line with Section 848 of the FY2020 National Defense Authorization Act (NDAA).

Basically, if it’s not Blue-listed, it’s off-limits for most federal operations.

The Department of Homeland Security took that list and ran its own hands-on assessment, focusing on how these drones perform in urban emergencies. Only five drones made it to the test.

You’ll want to know how they did. Especially if you’re in charge of buying one.

DHS Science and Technology (S&T) Assessment Conditions

In November 2024, the Science and Technology Directorate of the U.S. Department of Homeland Security (DHS) tapped the National Urban Security Technology Laboratory (NUSTL) and its System Assessment and Validation for Emergency Responders (SAVER) program to run a boots-on-the-ground evaluation. The goal was to find out how Blue UAS drones actually perform in real-world settings.

The assessment went down in Miami, Florida, complete with real urban obstacles. Concrete, glass, shadows, signal interference. The works. Local support came from the City of Miami Fire Rescue and the Miami Police Department.

But the test wasn’t run by a single agency. It was national.

Eight expert drone operators—firefighters, police officers, and public safety tech specialists—came from California, Colorado, Georgia, Massachusetts, Michigan, New York, Texas, and Virginia. Every one of them had experience flying UAS during emergencies.

They put five drones through 15 different tests. Evaluators prepped, launched, flew, landed, changed batteries, re-deployed, and even performed maintenance. All in both full daylight and low-light conditions. 

They tested latency. Control link quality. Camera performance. Time to deploy. Time to re-deploy. How easy the ground station was to read when the sun was high or the scene was dim.

The Contenders: Five Drones Evaluated

Five drones. One urban test. Real-world stakes.

Every drone in this assessment made it through cybersecurity screening and was listed as a Blue UAS. That means they all met federal standards, but they didn’t all hold up equally when pushed in the field.

Let’s break down who was tested.

  • Skydio X10D: $28,382
  • Freefly Systems Astro: $47,018
  • Ascent AeroSystems Spirit: $56,195 (the highest price tag in the group)
  • Parrot ANAFI USA GOV: $13,964
  • Teal Drones Teal 2: $15,073

Cost was only the beginning. What mattered more was how each drone handled pressure. The assessment focused on four SAVER categories: Capability, Deployability, Usability, and Maintainability.

Each category broke down into several criteria, 15 in all: everything from camera quality (EO and IR), latency, command and control link quality, flight time, portability, and time to get airborne, right down to in-house maintenance needs.

Evaluators launched, landed, adjusted payloads, tested thermal vision, lost signal behind buildings, and tried to reconnect. They flew during the day and after dark. They acted like it was a real emergency—because that’s what these machines are for. By the end, scores stretched from 4.0 (highest) to 2.7 (lowest). Those numbers showed what each drone could do—and what it couldn’t—under pressure that mirrors your field.

Skydio X10D: Performance at a Glance

Let’s start with where it stood out.

  • Electro-optical (EO) camera performance: Highest score in the group. Evaluators gave it top marks for clarity, zoom, and autofocus. During daylight flights, it delivered sharp visuals at a distance—key for standoff operations like search and rescue or threat monitoring.
  • Latency: Best again. It had the fastest response between ground control and drone movement. This matters when every second counts. While some evaluators noticed occasional video skipping and lag in tight spots, it still outperformed the rest.
  • Command and control link quality: Also top-rated. The drone held a stable signal in most urban scenarios. Though a few evaluators had trouble locking onto the ideal frequency, they still found the connection reliable enough for high-stakes use.

Breakdown by category:

  • Deployability: 4.2
  • Usability: 4.1
  • Maintainability: 4.3
  • Capability: 3.8

No other drone in the assessment posted numbers this strong across all four categories. Some came close in one area or another. None were as consistent.

The X10D wasn’t flawless, though. A few responders mentioned minor input delays during complex tasks. And yes, the frequency issue could cost you a few seconds during deployment.

But if you’re weighing this system against others, here’s what you’re looking at: balanced performance, strong visual output, and dependable signal control under pressure. It wasn’t perfect. But it was the most reliable across the board. And when the pressure’s on, that kind of consistency is what matters.

How the Other Drones Compared

Let’s start with Freefly Systems Astro. It landed second with a score of 3.5 out of 5.0. That’s a strong showing—but not without caveats. Evaluators liked its usability and interface. It also scored well in low-light conditions, thanks to its EO/IR and laser range finder payload. The IR camera didn’t take the top spot, but it delivered stable thermal imagery with decent resolution and contrast. However, Astro’s pricing was a concern. At $47,018, it’s significantly more expensive than the top scorer. It also lacks official GSA pricing, which complicates procurement for agencies that need predictable costs.

Then there’s the Ascent AeroSystems Spirit. Its overall score was 3.1. Spirit stood out in one big way—it received the highest score for infrared (IR) camera visual acuity. Evaluators noted its thermal resolution and contrast held up even when zoomed in. That’s not a minor detail when you’re searching for heat signatures in dense environments. But Spirit struggled with usability and deployability. It scored just 2.8 in both. That means responders found it harder to operate and slower to get in the air. For real-world operations, those seconds count.

Parrot’s ANAFI USA GOV also scored 3.1 overall. Its strength was maintainability. It earned a 4.0 in that category, which reflects easy upkeep and lower servicing time. But it fell short in one critical area: IR camera performance. The drone received the lowest IR score in the test. Evaluators cited weak resolution and poor thermal contrast, making it harder to rely on for night or smoke-obstructed missions.

And then there’s the Teal 2. At $15,073, it was one of the most affordable. But affordability came at a cost. It ranked last overall with a score of 2.7. It also finished at the bottom in latency and command and control link quality. Testers reported pixelated images, delayed response to input, and connection drops when objects blocked the signal. It did have one bright spot—maintainability. With a score of 4.8, it outperformed all others in that category. That’s valuable if your team does a lot of in-house repairs.

Still, the test made one thing clear: price doesn’t always match performance.

What This Means for First Responders

You don’t have time to wrestle with lag. Or guess what you’re seeing. Or lose signal mid-flight. Every second matters. Every decision counts. That’s why these results hit differently when you’re the one responding.

Quick deployability isn’t a luxury. The drones that took longer to get in the air—like the Spirit—scored lower on speed. Skydio X10D, on the other hand, clocked the fastest time to deploy and redeploy, giving it an edge when seconds separate action from delay.

Low-light performance wasn’t equal, either. Not all drones handled darkness well. You work nights. You work in smoke. You work in chaos. You need visuals that hold up. Skydio stood out in day operations with EO clarity. Spirit led in IR at night. Others fell short, forcing operators to descend just to make sense of the image.

Then there’s usability. It’s not just about whether a drone flies—it’s how easily it works in your hands. The Freefly Astro scored well here. So did Skydio. But Teal 2, while the cheapest, lagged in flight response and image reliability.

And customization is often what makes or breaks your workflow. Covertness. Battery swaps. Ground control interface. Those little things add up. Especially when you’re coordinating across teams under pressure. So, no: this test wasn’t about picking a winner. It was about showing you what’s reliable, and when.

What the DHS Evaluation Tells Us

Buying drones isn’t just a tech decision. It’s a safety decision. That’s why independent testing matters.

You don’t want to rely on marketing promises or press releases. You want field data from real responders flying real missions. The DHS assessment delivered just that—hands-on testing in full light, low light, urban clutter, and real deployment conditions. These weren’t ideal lab settings. They were messy. Complex. Exactly like the scenes you show up to.

And then there’s the issue of security. Every drone in the assessment had something in common: they were all on the Blue UAS list. That means they were cleared by the Department of Defense’s Defense Innovation Unit. No banned parts. No risky firmware. No ties to foreign vendors restricted under Section 848 of the NDAA.

Skydio’s Win and the Bigger Picture

The Skydio X10D’s ranking reflects strong visuals, low latency, and stable link quality in demanding urban environments. Keep in mind that this assessment only included drones on the Blue UAS list. That list is limited to U.S.-made systems vetted for cybersecurity and supply chain security. It excludes any platform that uses parts, software, or services from restricted foreign entities. That includes well-known commercial names often praised for innovation and performance.

So, while this was a thorough and realistic field test, it wasn’t a complete market comparison. Some foreign-manufactured drones that perform well on technical merit couldn’t be included. They didn’t meet Blue UAS eligibility—not because of performance, but because of sourcing and compliance. That distinction matters. Especially if you’re comparing performance across international suppliers.

At the same time, the assessment offers value that goes beyond rankings. It gives you a clear, side-by-side look at how five compliant drones handled real-world stress. Whether you’re planning a new purchase, rethinking your fleet, or just curious about what’s working out there, this report gives you something solid to build on.

Full test results available here:

Blue UAS test