How Advanced Navigation's validation team went from constrained by their tools to delivering answers in real time.
Advanced Navigation builds the navigation hardware that guides autonomous systems through GPS-denied environments on land, air, sea, and space. Their devices, including the Certus EVO and the Boreas family (D90, D50), are trusted by companies and governments worldwide.
The company is in a period of significant growth, expanding their product line, shipping new devices like the Boreas D50, and serving a growing number of customers across multiple domains. With that growth, the demands on their Product Validation team have intensified.
Everything hinges on real-world testing. Jai Castle leads the Product Validation team responsible for collecting, analysing, and sharing performance data. Their ability to turn raw field data into usable answers directly determines how fast Advanced Navigation can respond to customers, validate new products for launch, and close deals.
> "We used to spend a full day after every field test just doing analysis. Now with Alloy, that's just ten minutes, and the conversation has completely shifted. Instead of 'can we get this done in time', it's 'what else can we go after.'"
>
> Jai Castle — Product Validation Manager, Advanced Navigation
The validation team is highly capable, but each product demonstration demanded significant processing and analysis work. As the company scaled, the volume simply outpaced what any team could manage with legacy tools.
With those legacy tools, every validation carried an extra day of manual processing: transforming coordinates with custom Python scripts, converting data formats, calculating errors, and plotting charts before a report could be shared. This wasn't occasional — it was the process for every single validation.
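To make the manual workload concrete, here is a minimal sketch of the kind of per-test error calculation such scripts typically perform. This is an illustration, not the team's actual code: the function names (`haversine_m`, `rms_horizontal_error`) and the choice of a simple great-circle error metric are assumptions for the example.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points,
    using a mean Earth radius (sufficient for short-baseline errors)."""
    R = 6371008.8  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def rms_horizontal_error(estimated, reference):
    """RMS horizontal position error between two sample-aligned tracks.
    Each track is a list of (lat, lon) pairs in degrees."""
    errors = [
        haversine_m(e_lat, e_lon, r_lat, r_lon)
        for (e_lat, e_lon), (r_lat, r_lon) in zip(estimated, reference)
    ]
    return math.sqrt(sum(e * e for e in errors) / len(errors))
```

Multiply one such script by format conversion, charting, and report publishing, for every test, and the "extra day" adds up quickly.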
Like many hardware teams managing field data across multiple offices and field sites, they found keeping a single source of truth nearly impossible. Validation data ended up spread across network drives, cloud storage, local laptops, and FTP servers, depending on which office or test site it came from. As Jai put it: "Previously, we were reliant on keeping track of paths to files on the company's NAS and reconciling which one was which."
The result was a team that could manage roughly two validations per week. Meanwhile, Advanced Navigation's product lineup grew, and with it, validation requests. A product manager needs device comparisons. A customer wants field data for a specific use case. Engineering asks if a firmware regression has occurred. Each one required the team to dig through data, manually analyse, and prepare a response.
Before Alloy, even the fastest turnaround meant Jai's team could locate, analyse, and report on the data no sooner than the end of the following day.
Alloy replaced the entire manual stack: the data wrangling, the Python scripts, the coordinate transforms, the charting, and the Confluence publishing.
Now, the team uploads raw validation data from their inertial navigation system (INS) devices, and Alloy handles the rest: automated mission reports, cross-test comparisons, error analysis, and natural-language search across all historical validations.
A team member received a question about whether a specific firmware version had been validated on the Certus EVO. They pasted the question into Alloy's AI chat. In seconds, the agent searched the full validation history, identified the relevant runs, and returned the answer "Validated ten times, clear improvement over previous versions", citing the specific missions and firmware comparisons behind that conclusion. Answering the same question manually would have taken more than an hour of searching and comparing old logs.
> "When a customer comes to us and asks if we have any real-world validation data for a particular scenario, now I say, let me type it into Alloy, and here it is."
>
> Jai Castle — Product Validation Manager, Advanced Navigation
As one engineer noted: "Some of the questions I've asked of Alloy's AI chat have been quite complex, and it is able to think and run multiple scripts and SQL queries. The layers of thinking it's able to go through is quite impressive."
Midway through the pilot, the platform had already reached the level of capability Jai had expected only at its end: "Where we're up to now is what I had anticipated for the time we get to the end of the pilot. We've all got confidence in the system now."
At the end of one check-in call, Jai offered the simplest summary of all: "The Alloy platform is making my job of managing multiple complex validation campaigns a lot easier."
| | Before Alloy | With Alloy |
|---|---|---|
| Analysis & reporting time per test | ~1 full day | ~10 minutes |
| Testing capacity (tests per week) | 2 | 10 |
| Data-supported customer response | 1–2 days | Same day |
The numbers tell the speed story. But what matters more is what the team can now spend their time on. With the analysis automated, the validation team shifted from manual processing to strategic work like deciding what to test, expanding coverage across more devices and conditions, and designing tests that answer the questions that matter most.
Within the first three months, the team had already completed well over 60 validations. With the analysis constraint removed, they began running more devices in parallel on every validation, extracting more insight from each field test.
The result compounds over time: more field trials, greater value from each one, and a growing library that makes every future analysis faster and more informed.
Advanced Navigation has since transitioned to an ongoing annual partnership with Alloy.
> "Given the consistent ROI we've seen, we have full support for continuing with Alloy and want to extend our engagement to further leverage the platform's capabilities."
>
> Jai Castle — Product Validation Manager, Advanced Navigation
Book a call to see how Alloy works with your data.