Computer vision, generative AI, resilient communications, and more…
Transcript
SUMMARY KEYWORDS
Software Defined Warfare, Navy Technology, Integration Challenges, Resilient Communications, Computer Vision, Generative AI, Military-Civil Fusion, Unmanned Systems, Data Integration, Operational Outcomes, Commercial Technology, Defense Innovation, Acquisition Strategy, Geopolitical Challenges.
SPEAKERS
Akhil, Artem, Justin, Maggie
Maggie 00:30
Welcome to the Mission Matters podcast, a podcast from Shield Capital where we explore the technical opportunities and challenges of developing and deploying commercial technology for national security customers. In this episode, we're joined by Justin Fanelli, the CTO of the Navy and technical lead for PEO Digital, as well as Artem Sherbinin, the CTO of the Surface Navy. Justin and Artem are two of the key technical leaders bringing the Navy into the 21st century. Now, Akhil, I figure this conversation probably hit home especially hard for you, based on your firsthand experience with the current state of technology in the Department of the Navy from your time as a Marine back in the day.
Akhil 01:31
Yeah, thanks, Maggie. You know, preparing for this discussion did, in fact, remind me of my brief but very memorable time aboard naval shipping, and really experiencing firsthand the technical challenges facing our maritime forces, whether that's planning and executing as a disaggregated force under limited communications and, oh, while having to use systems that have to survive the salt water and the extremes of the maritime space. I think what also makes this conversation particularly timely is just everything happening in the world right now. Justin and Artem are at the forefront of making the Navy more lethal and capable in confined waters like the Persian Gulf and the Red Sea, as well as being able to do things like maintain presence and deter our adversaries in the vast expanses of the Pacific Ocean. This conversation is going to cover all of that, and I'm excited to get into the nitty-gritty details, from deploying computer vision for the maritime domain, to what technology looks like in potential conflicts with peer adversaries, to what software-defined warfare actually means.
Yeah. Artem, Justin, I think this will be a bit of a unique podcast for us. We've spent the last couple episodes really diving deep technologically, but I think here we'll have a chance to hit that intersection: you both come from deeply technical worlds, but are also involved in managing technology adoption. Before we get to the latter, though, we'd love to ask you both a softball question: what's the one tech area, application, or challenge that keeps you up at night, that you can't read enough about, and that you're really trying to get your arms around?
Justin 03:04
For me right now, it's resilient comms, which is finally starting to actually be resilient. Resilient comms used to mean, like, two paths, and now we can be way more diversified in the different inputs. But I'd say a lot of these communities are kind of nascent, or they didn't used to talk to each other. When I was at Davos this year, we said it was unthinkable for the space folks and the 5G folks to have conversations just maybe 18 months ago. So how do all of these different communities of interest blend to solve harder problems together? It's a great time to be a versatilist, and I think there are more of them coming together, and that's going to result in, to me, new protocols, new interfaces, and then ultimately new opportunities for deploying capability faster and solving effects problems in more creative ways.
Artem 04:10
If you ask two Navy people what the hardest technical problem is, the answer always comes back to integration. Integrating with large legacy capital assets is always hard, especially if they're made by, like, a prime contractor. Whatever it is, resilient comms, AI, software, hardware, anything, integration will always end up being the hard problem to solve.
Justin 04:36
Now that said, there are reasons to believe, whispers, that this could be the golden age of integration, because of what we're seeing with generative AI, because of Model Context Protocol and a few other things that are coming out. So if, I don't know, 70 cents on the dollar goes to integration, and folks specifically on the software development side are using the vibe coding tools, things like Windsurf or other products, to go much faster, then if we level that up in a significant way for interfaces, we can really go after the bottlenecks across the board. I would love to have someone who's listening to this podcast say, "Hey, we've done this for a defense application, and here is the time we've saved." We've seen one-offs, but benchmarking that and then making that the new baseline would be a real win.
Maggie 05:46
So Justin, you're telling me your vision for the future of the Navy is just vibe coding our way into the future of conflict.
Justin 05:55
That is another interesting topic. I think what we said was, if we can cut integration complexity to a tenth or a twentieth by applying solutions that we've seen in other places, then perhaps the sky's the limit.
Akhil 06:15
That's great, Justin. In order to do that, we actually have to get some of these applications and protocols on network, on the environments where we need them. But I concur with you: a future in which we can collapse data fusion, ETL, and network-to-network collaboration is pretty awesome. I know both of you work a lot of issues related to actually deploying software and adopting technology at scale, to your point. Artem, you didn't really answer the question on a specific tech area beyond integration. Is your focus still the integration piece, or is there a specific tech area that has gotten you excited?
Artem 06:57
Yes, integration is always a focus. The specific tech area, and you kind of hinted at it at the end there, is data integration. It's not particularly sexy; ETL is somewhat of a solved problem, right? Lots of options, ranging from Qlik to Etlworks to, you know, Foundry from Palantir. But it's a skipped step, right? The entire department is very hyped up on AI. The entire US government is very hyped up on AI. None of that matters if your data isn't right and if you have not solved the data integration problem. So I would say that's my kind of favorite problem to work on, because it combines all three aspects of my job: technical, policy, and actual implementation. All three of those have a part to play there.
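[Editor's note] A minimal sketch may help ground the data integration point above. This is an illustrative toy, not any Navy system: two hypothetical sensor feeds with different field names and units are normalized into one common schema, the unglamorous ETL step that has to happen before any AI model can use the data. All field and feed names here are invented for the example.

```python
# Toy extract-transform-load step: normalize heterogeneous feeds
# into one common schema and quarantine records that don't fit.

COMMON_SCHEMA = ("track_id", "lat", "lon", "timestamp")

def transform_feed_a(record: dict) -> dict:
    """Feed A already uses names close to the common schema."""
    return {
        "track_id": record["id"],
        "lat": record["latitude"],
        "lon": record["longitude"],
        "timestamp": record["ts"],
    }

def transform_feed_b(record: dict) -> dict:
    """Feed B stores position as one string and uses epoch milliseconds."""
    lat, lon = (float(x) for x in record["position"].split(","))
    return {
        "track_id": record["track"],
        "lat": lat,
        "lon": lon,
        "timestamp": record["time_ms"] / 1000.0,  # normalize to seconds
    }

def load(records, transforms):
    """Run each record through its feed's transform; reject bad rows
    instead of letting them poison the downstream dataset."""
    clean, rejected = [], []
    for feed, record in records:
        try:
            row = transforms[feed](record)
            if set(row) != set(COMMON_SCHEMA):
                raise ValueError("schema mismatch")
            clean.append(row)
        except (KeyError, ValueError):
            rejected.append((feed, record))
    return clean, rejected
```

The quarantine list matters as much as the clean list: "solving the data integration problem" largely means knowing which records you could not trust, not just merging the ones you could.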
Maggie 07:45
I want to take a step back and really set the stage here. Why does all of this matter in the first place? What is the state of geopolitics with our adversaries? What is the Navy's role in a potential conflict? And why does it matter for us to rethink the way we design our force right now? Is there a sense of urgency?
Artem 08:10
Defense tech is really hot right now. I think that's why you guys started a podcast. And the reasons for defense tech being attractive to venture capital are sometimes different than the reasons that we need new technology, but those two are intersecting, right? If I look at the last five years of venture investing, you had this big peak around 2020 in SaaS, right, Software as a Service things. And then it took, like, a nosedive right around COVID. That money had to go somewhere, and it conveniently found its way into defense. From around 2021 to 2023, you see this just huge uptick in VCs investing: across those three years, nearly $150 billion in private capital invested in defense.
At the same time as that's happening, over in the public sector you see this big push: across political parties, across government agencies, whether you're in defense, commerce, or the intelligence community, everyone is coalescing around one singular strategic challenge, and that's the, at the moment, peaceful rise of the People's Republic of China. And so I think that the reasons that defense tech is hot and the reasons that the government said, hey, China's a problem, were probably different. But now we're all on the same team. Now we've hit the intersection point. We're all driving towards the same problem.
So I think we should really understand that problem well, and I'll break it down very simply. The leadership of the People's Republic of China, the Chinese Communist Party, and specifically their president and general secretary of the party, Xi Jinping, has publicly stated objectives to reshape the international world order that is currently led by the United States and our Western allies, and has publicly stated that they would like to reunify with Taiwan. Those are two very big strategic challenges for the United States, and so from a national security standpoint, it is our job to field the very best military that we can to deter those potential objectives of the Chinese Communist Party. So I think that's why this matters and why those two spaces have come together.
What does all that mean next for these spaces? It means that right now, we're in a really critical time period to bring in that technology and execute deterrence. There are a number of factors: more conflict around the world; economic stagnation and deflation inside of China that's potentially driving leaders to make maybe even rash decisions; and a military on the United States side that is more expensive and, by extension, smaller than it ever has been. At the height of the Cold War, we had 800 ships under Reagan in the 1980s. Today we have under 300. We said we want 350. We only build one and a half destroyers a year, and roughly the same number of submarines; the Chinese build three. So in this present moment, all of those factors have collided. Not only is it critical that we get these technologies in, there's also just an immense amount of urgency to do so.
Maggie 11:33
So what is the current state, as far as we know in the open-source domain, of our adversaries' adoption of next-generation maritime technologies that would be relevant to the Navy, whether that's autonomous systems or AI or others? And how does the US stack up?
Artem 11:52
So our number one adversary, as we've talked about a lot, is the People's Republic of China, and they have a concept, which actually riffs off of how the US won the Cold War, called Military-Civil Fusion. Military-Civil Fusion is the Chinese looking at how the US won the Cold War, which was through world-class companies in Silicon Valley bringing technologies like the microprocessor to life, and then the government adopting the microprocessor and sticking it into weapon systems like cruise missiles and dumb bombs, making them smart cruise missiles and very smart bombs, right?
So the Chinese looked at this and said, hey, the United States is, like, really good at the diffusion of technology; we should do that too. And so they developed Military-Civil Fusion, which is this idea that Chinese companies' work directly supports the technologies being developed for the People's Liberation Army, which is the Chinese military. And they're doing this across the board, in areas ranging from energetics to rocket propulsion to AI to autonomy. Some examples: in the open source you can find Chinese specifications for a software-defined warship. In fact, they launched a medium unmanned surface vessel for experimental purposes that meets some of those specifications. So that's a real example of that partnership coming to bear. The good news is the United States is just far better at this, right? We have a long history of taking innovation and fielding it for military advantage, and I feel fairly confident that we'll be able to continue doing that.
Maggie 14:00
What do you feel like the US most misunderstands about our adversaries' approach to military technology?
Artem 14:10
Man, that's a tough question. I don't think that there's something we don't understand; I think we're just in the middle of it. It's hard to be number one for a long period of time, right? The United States has not had a peer adversary since the fall of the Soviet Union in 1991, and today we do. And when I say peer, I mean this is the first time you could really say peer competitor. When you look at the Soviet Union, by scale of military they were a peer competitor, but in every other aspect of interstate competition they were not; they were a near-peer. China is most definitely a peer. The GDP of China is just a smidge below ours. The GDP per capita is actually far lower, but the overall GDP is just below ours. So China is the world's second largest economy. They have the world's largest navy. They have the world's second largest army. By every metric, they are close to us. And we have no institutional memory of facing a competitor like this since probably the time of the American Revolution, when Britannia ruled the waves and was the global superpower of the day. Since that time, the US has never faced a competitor that could match us, and actually exceed us, in areas like industrial output. That has just never been the case. So it's not necessarily that we misunderstand something about our adversaries. It's that we're facing a completely new kind of adversary.
Maggie 15:54
So one of the big topics of conversation that I know you and I have discussed, and I think it's a hot topic right now in the broader defense tech community, is: what lessons should we be learning from the war in Ukraine as they apply to a potential future conflict in the Indo-Pacific, and what lessons should we not be over-learning?
Artem 16:17
So I'll start with the easy one: what should we be learning? It aligns well with what I said before. Software-defined platforms are critical, and cycle times of change are faster, largely driven by the software-defined platforms those changes occur on. We're seeing that across the board, from legacy things like tanks, where the physical tank has to be modified because of the advent of first-person-view drones and top-attack munitions, all the way to electronic warfare systems, which are more of a software-defined platform than an analog tank is. The rate of change is just so much faster. And I suspect that any fight in the Indo-Pacific would encounter something very similar in this very rapid rate of change.
What lessons should we not be taking away? There's a statistic thrown around all the time that 70 to 80% of Russian and Ukrainian casualties on the front line today are the result of first-person-view drones. I won't comment on the factual basis for that statistic itself, but the idea that all fires are now based on drones is probably an overreaction and a bridge too far. Just yesterday, as of the filming of this podcast, Russia launched over 300 drones at Kyiv and just under 70 cruise missiles, I think it was 64, as reported in The Economist this morning. One of the reasons is that Russia can produce Shahed drones, at roughly a rate of 300 every three days, far faster than they can produce cruise missiles. So drone warfare is not necessarily just a function of the fact that this is the new way of war; drone warfare in Ukraine is a function of what those countries can produce. And I suspect a fight in the Indo-Pacific would look different. Moreover, the fight in the Indo-Pacific will be between large capital assets. Armies are still going to clash on islands, whether that's on Taiwan or other parts of the Western Pacific, but at the end of the day, it will be ships and aircraft that have the lead, and that is just going to be a fundamentally different conflict.
Akhil 18:49
As we switch a little bit to the actual technology, and to managing technology adoption, I'm reminded of a story from the turn of the 20th century, a young Lieutenant William Sims, which is a classic case study of the Navy basically rejecting new innovation. Right, for the listeners out there who don't know the story: this young lieutenant, out in the Philippines, stationed in the western Pacific in 1900, learned about this new technique of continuous-aim fire for gunnery. The Navy spent a year saying this is probably not going to work, and did their own flawed tests, and it basically took this young lieutenant writing a letter to Theodore Roosevelt, a former Assistant Secretary of the Navy, to get it elevated to the level it needed to be and ultimately adopted. I think there was some wild statistic of a 500% increase in gunnery accuracy in the Navy from this one technique that was initially cast out. So I think there's an argument to be made that, over its history, the Navy has had a real challenge when it comes to adoption of new technology or just new practices, whether that's because of legacy institutional culture or something else.
I'd be curious, though: the Navy has changed a lot since 1900. It is actually, in a lot of ways, at the forefront of what is happening geopolitically, from an experimentation standpoint, and of where the potential points of friction and conflict are. So to start off, I'd be curious to get a high-level take: how would you assess the way the Navy has tried to adopt new technology or novel techniques? And where can we still go? If you had only one hour or one marginal dollar, where would you spend it to enhance the way we adopt technology?
Artem 20:56
I'll start with the last question first. If I had to pick one place where I would change how the Navy adopts technology, I would say that every platform should be software-defined, right? Historically, we've heavily vertically integrated our platforms. Whether that's a ship or an aircraft, it's usually made by some prime vendor; that prime vendor will deliver the hardware, and that hardware, in order to put software on top of it, will have some sort of proprietary interface. Occasionally we'll have data rights; occasionally we won't. Even if we have data rights, we'll have to pay for, like, the decoder ring to be able to interface those two things together, and they're very tightly coupled. And unlike an iPhone or modern consumer electronics, which benefit significantly from vertical integration, we've found that defense things don't. So effectively, building software-defined platforms on open protocols, with open source code, is, I think, the future. We've already seen this in Ukraine, where software-defined platforms ranging from drones to electronic warfare systems are updated overnight with new code, which in turn leads to some sort of operational outcome. That should be our gold standard: every night a ship is in contact with an adversary, potentially in a high-end fight, that ship just gets better because we deploy new code to it. So that's the one kind of vision of the future I have. Going back into the past: the Navy has struggled with technology adoption. I think the Sims story is just one of dozens. And I would point to a couple of high-level things that make working with the Navy potentially challenging from a technology integration standpoint. The first is that large-capital-asset bit that I mentioned earlier: 70% of the Navy's budget goes to ships, carriers, submarines, airplanes, etc.
That's different than the other services. Akhil, you're a Marine: 65% of your service's budget goes to humans, right? So we're used to buying things; we're not used to buying software. And the second is that the Navy has a unique culture of bottom-up innovation, but that means top-down is hard. That dates back to the notion of independent command at sea, the idea that a ship captain goes over the horizon, loses communications with everybody, and just has to make decisions on their own. That's created kind of a centralized command and control system and a very hierarchical approach to how our service operates. That's hard. We're working on getting better.
Akhil 23:42
Thanks, Artem. I wanted to pick up quickly on your software-defined point. Spending more time in this space, I think folks sometimes get what software-defined actually means wrong. So I want to really distill: what, to you, is software-defined in the military context? And maybe as part of that question, since you talk a lot about open source and being able to iterate constantly: are we always going to be in constant tension between the DevSecOps process, however we handle the security side, and our ability to continuously iterate and ship and commit code on a nightly basis at scale?
Artem 24:22
So in the military context, software-defined warfare refers to the idea that your hardware platforms are getting better at a rapid rate, in hours and days vice months and years, because of the deployment of software to those hardware platforms. Just like a Tesla or your refrigerator potentially gets a software update overnight, and that piece of hardware now has new functionality, we want to see that in the military domain. And adaptation matters: the side that adapts the fastest in conflict wins. Historically, adaptation came in the form of doctrine, tactics, techniques, and procedures, or potentially rolling out a new technology; the tank replaces the horse. Today, adaptation means deploying code, and in turn, the deployment of code drives those other changes. It drives tactics, techniques, and procedures. It drives doctrine. It drives training. It's a new paradigm, and it has really changed the character of war.
Justin 25:33
So building off of that just a little bit: Artem is invoking both Darwin and Krepinevich, right? This goes from the species, to The Origins of Victory, all the way to getting very wonky, like DOTMLPF. Here's the point: maneuver warfare became the primary mode of Marine philosophy, essentially of execution, in 1989. Scaling that out and applying it to technology, as we've watched the wave of software grow, is something that needs to be more tightly approached. What we've seen is that this OODA loop so many people talk about is very measurable in terms of how long it takes to adapt. So when you mentioned Sims and the 500%: there's a lot of talk, there's a lot of activity, but ultimately the time it takes to turn a ship, or the time it takes to change the way we are defending or countering, is a very measurable thing. We talked about software-defined warfare; obviously that's the fastest-moving piece. I'd say, extending that a little bit, being modular across the board is particularly important.
So depending on how we do interfaces, including hardware, the more of these components that are swappable, the better. Whether we have a new line of shipbuilders, or the ability to navigate for autonomy with or without GPS, these are open questions. The idea of doing this vertically is an open question. We've seen some success with, for instance, In-Q-Tel taking Anello Photonics and saying, hey, instead of your alt-PNT solution, swap that out and swap in this new piece of hardware. That ability to mix and match all the way through hardware is ultimately something that I think is going to be extremely important to adaptation.
And so we have a few adaptation cells. We have one that we work with within the SEAL community, and we have a few back-office technical teams, and those mix-and-match cycles run in days and weeks. That's extremely important, especially if we're not getting live production contact. I would say that watching this, and being able to make those trade-offs in real time, is what's allowing this bottom-up innovation to be pulled all the way through. It confirms something we already knew: there are often really good solutions closer to the problem, right? You're more likely to solve a problem if you're in it than if you're in some building far, far away. But how long does it take to make it through, and how many letters do you have to write to Congress or somebody else? We want that to be more data-driven.
There's an obvious difference between the private sector and the public sector. The public sector is not measured on return on investment; historically, there's been no good way of doing that. So when you say something is five times better, that's really impressive, but the denominator is not standard at all. So we've standardized the denominator to say: if you are changing outcomes in a significant way, it's less important where the solution comes from in terms of where it sits in the stack, and more important that you're moving the needle for the warfighter. A tactical example of this: occasionally we'll use KPIs that specify how something works in the requirements document. We'll say there needs to be this upload speed and this download speed, but then we also specify that we need coax cable at this installation. And the reality is we just need connectivity. So what we've recently done is allow proliferated Low Earth Orbit to compete with traditional wired connectivity. Across the board, hardware and software, where outcomes are the basis for acquisition and scaling decisions, everyone is better off.
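[Editor's note] The outcome-based KPI idea above can be sketched in a few lines. This is a hypothetical illustration, not an actual requirements document: the specification states only the connectivity outcomes, and any transport (coax, pLEO, anything else) that clears the thresholds qualifies. The KPI names, numbers, and candidate options are all invented for the example.

```python
# Outcome-based requirement: thresholds on what the link must do,
# with no mention of the medium that delivers it.
REQUIRED = {"download_mbps": 100, "upload_mbps": 20, "availability": 0.99}

def meets_outcome(candidate: dict) -> bool:
    """A candidate qualifies if it meets every KPI floor, regardless
    of whether it is wired, wireless, or satellite."""
    return all(candidate.get(kpi, 0) >= floor for kpi, floor in REQUIRED.items())

candidates = [
    {"name": "legacy coax run", "download_mbps": 300, "upload_mbps": 30, "availability": 0.999},
    {"name": "pLEO terminal",   "download_mbps": 150, "upload_mbps": 25, "availability": 0.995},
    {"name": "old DSL circuit", "download_mbps": 40,  "upload_mbps": 5,  "availability": 0.990},
]

qualified = [c["name"] for c in candidates if meets_outcome(c)]
# Both the wired and the satellite option qualify; only the outcome matters.
```

Writing the requirement this way is what lets a new transport compete on day one: nothing in the spec has to change when a better medium shows up.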
Maggie 30:20
Could you give a story of a time when fielding a software defined system really made a difference in some environment?
Artem 30:31
I think the best real-world example today is Ukraine, which we brought up, but I'll use a relevant example from the US Surface Navy. Since October of 2023, our ships have been engaged in active combat operations in the Red Sea against the Houthis, an Iranian-backed militia movement in Yemen. That means that our ships, for the first time since World War Two, have been inside the weapons engagement zone of an adversary for a sustained period of time. As of the filming of this podcast, we have engaged roughly 300-plus air-breathing threats. That's a lot of data points. At the start of that conflict, it took us weeks to get data off our ships. We have since gotten that down to days, and we want to get it down to hours. And the data from those ships, when it gets back to CONUS, to the United States, has improved everything on the spectrum, from the tactics those ships are using to shoot down those threats, all the way to the actual radar systems on board. We have analyzed the data and sent a software update back on a CD. Ideally it would go over the air, but baby steps. We sent it back to the ship, and the ship's radar systems have improved. So that's just one example of a software-defined platform yielding a measurable impact in combat.
Justin 32:10
And there are EW cases, and there are also just infrastructure and connectivity cases that allow everything above them to improve. We weren't doing a ton of telemetry a few years ago. At this point, the way this works CONUS and OCONUS is that we're watching those flows proactively. We can do a cyber example after this, but now, instead of using phone calls or emails as the trigger for where connectivity problems exist, we're extremely proactive, and we can do hot-swap redirects or load balancing proactively. In this particular case, this is a re-envisioning of how we modernly manage the highway on which all of the cars, the bits and flows, operate. I would say that prior to a couple of years ago, that wasn't pro-grade, and now it's first in class.
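[Editor's note] The proactive monitoring described above can be illustrated with a small sketch. This is a toy model, not the actual Navy system: link telemetry is checked against degradation thresholds, and traffic is redirected to the healthiest link before anyone has to pick up a phone. Link names, metrics, and thresholds are assumptions for the example.

```python
# Hypothetical degradation thresholds; a real system would tune
# these per link type and mission.
THRESHOLDS = {"loss_pct": 2.0, "latency_ms": 250.0}

def degraded(sample: dict) -> bool:
    """True if any watched metric exceeds its threshold."""
    return any(sample[metric] > limit for metric, limit in THRESHOLDS.items())

def pick_active_link(telemetry: dict) -> str:
    """Return the link to carry traffic: prefer the lowest-latency
    healthy link; if everything is degraded, fall back to the link
    with the least packet loss."""
    healthy = [link for link, s in telemetry.items() if not degraded(s)]
    if healthy:
        return min(healthy, key=lambda link: telemetry[link]["latency_ms"])
    return min(telemetry, key=lambda link: telemetry[link]["loss_pct"])
```

The point of the sketch is the trigger: the decision comes from the telemetry itself, sampled continuously, rather than from a user reporting an outage after the fact.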
Maggie 33:30
As we talk about fielding more of these software-defined systems, what does that mean for the current capabilities that we have? Does this mean we need to throw away all of our aircraft carriers and F-35s and field just armies and armies of USVs and UAVs? Or does it mean something different? And if it means something different, how do we envision the future of this software-defined force?
Artem 33:56
I really love this question, because the Navy gets this, I think, more than the other services, because we are so platform-centric. We get told all the time that, hey, the character of war has shifted towards autonomous systems that are software-defined, so you can just throw out all the expensive manned things. There was a prominent tweet suggesting that the $135 million F-35 could be replaced by the $10,000 hobbyist drone, and that just isn't the case.
What we have found in our war games is that generally, when we face off against a peer competitor, usually the People's Republic of China, our program-of-record force, meaning the things we have today, our aircraft carriers, ships, submarines, et cetera, will occasionally win and occasionally lose. We want to win every single time. And when we war-game that same scenario and add these new software-defined capabilities in a hybrid fashion, traditional manned platforms operating with unmanned systems, so think an F-35 operating with a collaborative combat aircraft, or a large Navy destroyer operating with a medium unmanned surface vessel, which itself has its own little flotilla of small unmanned surface vessels, when we war-game that hybrid force against the People's Republic of China, we come out on top more often than not. So that's the Surface Navy strategy: how do we field the best hybrid force? We've had that consistent message for a few years now, because we recognize that you're not going to replace these large capital assets, but you need a high-low mix, which is a Cold War term, of a small number of high-end exquisite platforms and a large number of low-end attritable systems.
Justin 35:47
And so that barbell strategy, I think, applies in a number of different places, right? Ultimately, it's as much an economic strategy as it is a technology strategy. But where that plays out, we can double-click on kind of any domain. Let's say, from a logistics perspective: how many different government off-the-shelf applications do you need? Ultimately, the way that I'm viewing this, and we now have a number of memos and invitations to this effect, is let's not just let commercial in, but let's do one-to-many replacements. So we're piloting commercial capabilities, but they go so much better when someone says, "Here are the seven aspects of logistics IT that we can wipe out from a GOTS perspective with one COTS application." If all we get is the BASF model, to harken back to the early 90s, where we make everything a little bit better but we don't replace anything, that's not good enough. And so we are looking for divestments across the board, even if they're not apples to apples.
Another example of this is software-defined, but perhaps with a little twist. I was recently with the strategic submarine PEO, and they were working with Gecko Robotics. In that Gecko Robotics example, I asked, "What are you replacing?" They're replacing manual action. They're replacing services. And with that, they're getting both productivity gains and safety gains. So this is a place where, between maintenance, cleaning, and manufacturing, we had one way of doing things, and we would like to open the door to say: we will give you our problem and you figure it out. Or: we will give you how we're doing this right now, and instead of just incremental improvement, what is your horizon-three solution for doing this differently?
The hardest part here is that it is not intuitive that there are incentives for people to change the way they work. We know there are mavericks, 10 to 15% of any organization, willing to work against their own self-interest for the greater good. We have an incredible mission, one of the best imaginable. We've catalyzed that into 15 to 20% of groups that are willing to do more work and adopt commercial solutions or think creatively about something. It's almost always more work at first, both for the company bringing it in and for the PM and support staff reimagining all these different pieces. Most people would prefer to do things the way they've always been done. But if there are more groups who want to be pilot leads and accelerate commercial adoption, then, as Steve Spear, professor at MIT, says, great organizations always look for the suck. There's suck in every organization, and you can always find the weakest links.
If we're doing this quantitatively, we're saying: either look at the numbers or talk to a sailor or Marine. Where are the pain points? Then, how do I extrapolate that to a solvable technical problem and scale it across the board? We can have fewer systems that perform at a higher level, combining hardware and software. This is a time for reinvention, and we can do that quantitatively, even from the outside in. So whether it's someone from the private sector we are pulling through, or, like the earlier point, someone at the ground floor who sees something, tries it, and then we scale it, in both cases we now have the ability to stamp something as an enterprise service. Across the board, those can be apples to oranges, as long as they have calories, as long as they nourish. We're ready to make those trades.
Maggie 40:36
So I wanted to ask about an earlier point we touched on, integrating these new technologies with our existing program-of-record force. I want to see if one of you could tell a story of a startup or company that did a particularly good job of that, and what were some of the steps they had to get through. Step one, what's required to even integrate with a data source coming off of a ship or an aircraft carrier? What kind of hardware and edge compute constraints, and what kind of access, do we have? What kinds of AI models and data do you have? I know, Artem, one of the things we've talked about is computer vision and some of the challenges of doing that at sea. So I wanted to see if maybe you guys could give us the life cycle of what it looks like to actually integrate a system with the program-of-record force that we have today.
Artem 41:36
I'll start by saying it's not a zero or one; you're not totally air-gapped and you're not totally integrated. There is a spectrum in between. And I think the success stories we've seen in the last two years sit in that middle area, where we've delivered, in the case of computer vision, some CV models to our ships, and those models live and run inference on commoditized commercial hardware, and that hardware, in most cases, is air-gapped from the rest of the combat system. Is fully connecting those the best thing possible? Absolutely. But could we deliver the commoditized hardware and the model to run at the edge in a month, as opposed to going through a three-year-long integration process? Yes. And so that's why we chose the in-between solution, where an operator is looking at a laptop, as opposed to looking at their display in the combat information center as part of the Aegis combat system.
So that particular example is an in-between integration story. But we've also seen examples where we've done the full integration work from start to finish to bring some AI models to our combat system, and that involves working with the prime contractor that delivers that combat system. That means working with a third party that does the integration work, because that prime contractor's code is in Ada or in Fortran; Ada, for those who don't know, was the DoD's own programming language. That's why you need all of those moving pieces, and it took a couple of years. I'm being nonspecific about the industry partners due to the classification of the work involved. But we've done both, and we're doing more of the first, which is what I urge most startups to do. Startups want to get a product out and make sure it's usable, or get an MVP out. You're not going to integrate with an F-35 or an Arleigh Burke-class destroyer in a month. So get some standalone hardware that maybe has one integration point; if you're moving data, maybe you use the ship's routers and switches, but you're not tied into the combat system, or you're delivering a model but running on your own hardware, as opposed to the ship's hardware. Yes, that's a limitation of working with the Navy, but I think it's one that we're working toward overcoming.
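The in-between pattern Artem describes can be sketched in a few lines. Everything below is illustrative and hypothetical: the detector is a stand-in stub (a real deployment would load a trained model, for example an ONNX export, onto the commodity hardware), and the one design constraint the sketch encodes is that detections flow only to the operator's standalone laptop, never into the combat system.

```python
# Sketch of the "in-between" integration pattern: a CV model runs
# inference on air-gapped commodity hardware, and results render only
# on a local operator display. The detect() function is a stub that
# stands in for a real trained model.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    confidence: float

def detect(frame: list[list[int]]) -> list[Detection]:
    """Stub detector: flags any bright region as a possible contact."""
    for row in frame:
        if max(row) > 200:  # crude brightness check stands in for a model
            return [Detection(label="surface_contact", confidence=0.72)]
    return []

def operator_display(detections: list[Detection]) -> str:
    """Render to the standalone laptop display, the only output path."""
    if not detections:
        return "no contacts"
    return ", ".join(f"{d.label} ({d.confidence:.0%})" for d in detections)

# Simulated frame from the ship's own camera feed, the single touch point.
frame = [[10, 30, 250], [5, 12, 40]]
print(operator_display(detect(frame)))
```

The point of the structure, not the stub logic, is the integration boundary: swap `detect` for a real model and the laptop-only output path is unchanged.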
Justin 44:17
And then another ship example, but not weapon systems: CANES, the Consolidated Afloat Networks and Enterprise Services. This is the largest shipboard network, and it spans some NIPR and some SIPR, so some unclassified and some classified functions, with a lot of different capabilities running on one backbone and one set of computing. In this particular case, we have taken some of the capabilities that are virtualized and tried, in a lab, to find more efficient ways to do them, again one for the price of multiple, and swap those in. And so we've now unleashed some COTS vendors to look at a native capability within CANES and replace it in a virtualized fashion, and just hot-swap those. That's something that we're getting a little bit better at.
We're also doing that with Flank Speed Edge, that more modern hardware. And so between those two, here are ways that we can take external software capabilities, pull them in, and lower the time to integration. That, to me, is repeatable. And then, to call back to the initial point where we said integration can go very differently: when I started doing that type of work 18 years ago, when I was actually doing ship installs, we had a binder, and you had to follow all of these steps, and you're in the operating system, swapping out Solaris and other lower-down components. How all of that goes forward is something we've toyed with streamlining, and I think it looks very different in the next year. To this point, when we look at what can be replaced, the more that we sandbox it up, and the more that we can invite people through structured challenges to figure out what they can divest, instead of doing a one-for-one request to divest, this becomes like downhill running.
Maggie 47:20
What do people outside of the Navy or the Department of Defense most commonly misunderstand about Navy or DoD tech adoption as a whole?
Artem 47:34
I know we've talked so much about integration, and I hate belaboring this point, but I think people really misunderstand how hard it is to put something on a warship. You're not just bolting a computer to the side, right? I mean, even to upgrade the afloat OT and IT infrastructure on ships, which Justin talked about at some point, replacing that means cutting a hole in the side of the ship, taking out a bunch of server racks, and then putting them back in. That's not a week-long process or a month-long process. That's a 180-day-long process. And so technology integration with the DoD is hard. And it's not just hard for the technical reasons I just described. It's also hard because of our risk tolerance.
Remember that when you're a startup delivering a capability to a warfighter, there's a human being, an American Sailor, Soldier, Airman, Marine, or Guardian from the Space Force, who is potentially in close contact with an adversary trying to kill them. And so our risk tolerance and our threshold for something working is just so much higher than deploying an app that turns out to be buggy and potentially not working, where you lose some points on the Dow that day. Because of that higher risk threshold, we're going to do test and evaluation in a way that maybe mimics industry in terms of process, but with a longer timeline and some extra steps, and you really have to think carefully about how that technology will be used, especially when it fails, and what failure looks like, because failure can be catastrophic.
Maggie 49:26
So you're telling me all DoD software is completely bug-free and running perfectly smoothly?
Artem 49:30
Yeah, it's excellent in every regard. No. In fact, what I'll say is the DoD sometimes mistakes process for risk mitigation. Let's say you're getting an authority to operate, an ATO, which means your application has been designated as cyber-safe by the DoD because a human being read all of your source code and then we deployed it to a secure environment. That doesn't magically mitigate cyber risk. Similarly, writing a requirement by committee and having ten people on the committee give a thumbs-up to the requirement (in the Navy, we call them requirements and resources boards) doesn't magically de-risk the technology. And so I would say we mistake our processes for risk mitigators, when true risk mitigation is a very sound technical implementation.
Maggie 50:30
Are there any special considerations that startups need to consider when building for the Navy, as opposed to some of the other DoD services?
Artem 50:40
Again, because our large capital assets occupy most of our budget, and large capital assets are not built by startups. They're built by traditional prime vendors, and the defense primes refers to companies like Lockheed Martin, Raytheon, General Dynamics, Boeing, et cetera. When you build for the Navy, your technology is going to interface with the technologies made by those vendors. You're not just building to interface with a common API where you can find the API key online and you're good to go. You're building on top of decades of legacy hardware and software. We talked about how interfaces are hard. You're going to have to really work with those prime vendors and build partnerships at a human level with those engineers who painstakingly, over the course of decades, have built the world's strongest military, and today are also striving to bring in new technology.
Maggie 51:48
Artem, one of the major themes that we have been discussing in this conversation is how we get commercial technology that we already more or less know works into the Navy and the Department of Defense. Can you maybe share an example of a technology that we think of as mature, but that has some unique considerations or challenges when applied to the maritime domain?
Artem 52:17
A problem that's really near and dear to my heart, because of my time supporting Maven, is computer vision, and I think this is largely believed to be a solved problem in Silicon Valley, right? AlexNet was 2012, and since then almost everything, whether it's your iPhone or your vehicle, has some sort of CV model running very effectively with great F1 scores. You're detecting everything, and we think of that as a solved problem.
Well, when you pivot that same technology to the maritime domain, it is absolutely not a solved problem. It is on the cutting edge of DoD AI problems, because, for one, you don't just have thousands of ships going around with cameras producing model-ready data that you can feed into an open-source model like YOLOv10 and say, look, I now have computer vision at sea. So you have an absence of data. You're also dealing with a really unique environment, with significant background noise and atmospheric changes that you just don't encounter if you're a car driving down a street.
So computer vision in the maritime domain is a really great example of a commercial technology that is considered a solved problem but requires some additional steps when you're looking to work with the DoD. And this is why partnership is just so important. The DoD is the world's largest, and I'm not 100% certain on this, but I feel confident enough to make the statement: we own the most data of ships floating in the ocean. So if you want to do CV for horizontal motion imagery at sea, come work with the DoD. There's a partnership there. Figure out which prime vendors you need to partner with, because they're the ones deploying those electro-optical and infrared sensors at sea today. And then figure out which government labs have dedicated decades of research to this problem, a problem that may not have a commercial application today, so a normal startup just wouldn't go after it, but the government labs, which have those long 10- and 20-year horizons, did choose to go after it and have done a lot of great research to help us field these capabilities.
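One concrete, hedged illustration of why the data gap matters: teams commonly stretch scarce at-sea imagery with augmentations that mimic maritime conditions such as haze and sun glint. The sketch below is not any specific program's pipeline; it operates on a plain grayscale grid so it stays self-contained, where a real pipeline would use an image library and feed the augmented frames into training for a detector like YOLOv10.

```python
# Illustrative maritime augmentations on a tiny grayscale "image"
# (a list of pixel rows, values 0-255). These mimic two effects Artem
# names: atmospheric haze and background noise such as sun glint.

import random

def add_haze(img, strength=0.4):
    """Blend each pixel toward white, mimicking atmospheric haze."""
    return [[int(p * (1 - strength) + 255 * strength) for p in row]
            for row in img]

def add_glint(img, prob=0.05, seed=0):
    """Randomly saturate pixels, mimicking sun glint off the sea surface."""
    rng = random.Random(seed)  # seeded so augmentations are reproducible
    return [[255 if rng.random() < prob else p for p in row] for row in img]

img = [[50, 80], [120, 30]]
augmented = add_glint(add_haze(img), prob=0.5)
print(augmented)
```

The design point is that each augmentation is a cheap, composable transform, so a small at-sea dataset can be multiplied into many plausible sea states before training.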
Maggie 54:44
So now this is a question I have to ask, because I'm sitting in San Francisco, and all anybody ever talks about is generative AI. Where are you seeing the most interesting and exciting use cases for generative AI in the Navy today?
Artem 54:59
Man, there's so much I can say about generative AI. All right, so the first thing is that there are no special use cases that are exclusive to the government on generative AI. Most of the things that are adding value in the commercial sector, and I would point to RAG as an example, basically search your own documents, or in some cases search your own knowledge base, so instead of writing a SQL query you're now having a conversation with a model, those are the areas where the government is seeing the most use, which is one-for-one analogous to what we're seeing in the commercial space.
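The RAG pattern Artem points to ("search your own documents" instead of writing a SQL query) reduces to retrieve-then-prompt. The example below is a deliberately minimal, hypothetical sketch: retrieval is bag-of-words overlap rather than embeddings and a vector store, the documents are invented, and the assembled prompt would be handed to whatever model a real deployment uses.

```python
# Minimal retrieval-augmented generation (RAG) sketch: rank your own
# documents against a query, then ground the model's answer in the
# top matches rather than its training data.

def score(query: str, doc: str) -> int:
    """Count shared words between query and document (toy retrieval)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents with the highest overlap score."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble the grounded prompt that would be sent to an LLM."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Maintenance logs for the radar array are filed monthly.",
    "The galley menu rotates on a ten day cycle.",
    "Radar array faults must be reported to the duty officer.",
]
print(build_prompt("who do I report radar faults to", docs))
```

Swapping the overlap score for embedding similarity and the `print` for an LLM call turns this skeleton into the knowledge-base search Artem describes, without changing its shape.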
How I see generative AI more generally, in the competition between the US and China in the military space, is actually a more interesting question. United States labs such as OpenAI and Anthropic have really been focused on the model and on staying at the cutting edge of model development. In China, with the exception of the innovations we saw with DeepSeek R1, you're seeing a lot of emphasis on the actual commercialization of the technology. And that's a problem, because historically, if you look at how a generalizable technology, like electricity then and AI today, has won a great power competition, it was not the country that developed the technology first, or even the country that refined it first, that won. It was the country that was able to diffuse the technology and yield outsized economic benefits as a result. And I'm not so sure that, whether you're a shareholder of a Fortune 500 company or the leader of a government agency such as the Department of Defense, you're really seeing that significant return on investment, or the diffusion of that technology, today. Conversely, in China, you're seeing significant diffusion of AI into all sorts of technology verticals, and that doesn't bode well for long-term great power competition.
Justin 57:26
There are still folks who are clinging to building things in government that probably shouldn't be built there anymore, and so we've explicitly stated that we're going with a fast-follower-first strategy in acquisition. Of the 18 program executive offices within the Department of the Navy, five of them have shifted over to portfolios so that we can make better buying decisions based on capabilities and effects. As that shifts, we want to reward both the people who are contributing from every aspect of that and the ones who are actually making the difference, doing what we have requested and what the Secretary of Defense has requested and what the President has requested. And so the PMs of Lionfish, the small UUV; PRIME, the small USV; and Dragonfish, the large UUV: these are cases where PEO USC and PEO UWS are pulling in different things, and that may even compete with something happening in a lab or in internal development. But those shifts right now are either going into overdrive or we're not moving fast enough.
Maggie 58:45
Yeah. Justin, thank you for naming a few folks who are really doing a good job right now when it comes to fielding the systems we're going to need in this increasingly geopolitically complicated world. Could you maybe tell us what about those organizations has been so successful, and what the rest of the Department of Defense or the Navy can learn from some of these success stories?
Justin 59:19
Yeah, so number one, trying to do something new alone, the first time that we endeavor, it rarely works well unless we have a really strong partner. In all three of those cases, Captain Alex Campbell at DIU was involved. This is a person who gets it. This is a person who is bringing the weight of that organization, who has been interacting with industry in a mature way. It doesn’t take a whole program office to agree. What I have seen is that it just takes an O-4, O-5, O-6, or a courageous civilian to say, “Okay, we will figure out how these compare, and we will show that even if it is not a perfect fit, there is likely a requirement or capability needs statement that will allow us to do things differently.” Translating the technology into outcomes where it is affecting results in a more significant way allows us to provide top cover.
I will use a specific example. We had an O-6 who was a CSO on a carrier, and he said, "Send me what you are doing for the ashore systems. Send me a hyper-converged infrastructure stack to my carrier and we will deploy that." There was a lot of paperwork, like Akhil and I were talking about, because this is not how it is supposed to work. Sometimes people get bogged down with emphasizing every aspect of process as opposed to the outcome. This was a combination of heads down, nudging through (it is always more work than you expect), and then waivers and justification. Part of the way we justified it was by showing the difference it was going to make. This was not just about being the loudest voice. These were two senior people saying, "Here is the baseline for connectivity on ships right now. Here is what that is doing from a quality of life and quality of service perspective. And here is the difference we can make."
That leap still required influence, but we had the numbers. Captain Kevin White made it happen on one ship, and I believe it was the documentation from that case showing that it was a thousand times better for connectivity and user experience—one of our outcome-driven metrics—that led to deployments on five more carriers, and then to Flank Speed wireless. The capability we are talking about, known as “Sea to Sailor, edge afloat and ashore,” is now turning into a fully scaled capability at a high level, and it happened very quickly.
Artem 1:02:39
The best way to get tech into government is obviously partnering with commercial; that's a given. That's partnership number one. And then inside of government, you need three groups. First, the end user, meaning the warfighter: the sailor on a ship, the Marine in the field, the airman on a maintenance line. Then you need the acquisitions community to help you bring the commercial thing in. And then you need a resource sponsor, somebody sitting in the Pentagon who is willing to do the mountain of paperwork that Justin described to make sure that this capability, which is clearly, measurably better than what we had before, actually gets into the budget and stays funded for the long term. You have to have all four of those. And I think we've become really good at the flywheel effect of building those teams and those public-private partnerships.
Akhil 1:03:39
If you had a single slide to articulate one metric for your respective office. What would that metric be?
Artem 1:03:48
We’re in the government. We love slides. If I had to pick one thing–
Akhil 1:03:52
I know, I asked you for a single slide, not 40 slides here.
Artem 1:03:57
Well, if I had to pick a metric, it would be how few slides we use to get something across the finish line. Bit of a joke there. But I would say that today we measure things by what Justin hinted at: cost, schedule, and performance. Cost is how expensive a thing is; did we go above or below budget? Schedule is whether it was delivered on time; usually the answer is no. And performance is whether you met a requirement, meaning: did you meet the mark according to a piece of paper that was likely developed five to ten years ago? That isn't how we want to judge success. We want to judge success by operational outcomes. So if it's a user-facing piece of software, operator input would be number one: here's how I used to do my job, here's how I do my job now. Because when you're delivering a software application, the process is the product; you're changing how that operator does something. And then number two, closely behind it, would be the real-world operational impact. If I'm delivering a new command-and-control user interface for unmanned systems: were the sailors able to control 10 robots before this, and now 20? Can the robots now swarm because it used to take 10 clicks and now it takes two? Those are the kinds of metrics I'd like to display on that slide: user-facing and operational outcomes.
Maggie 1:05:33
Can you tell us about a startup, or maybe a VC backed startup that has been particularly successful building and deploying technology for the Navy, and what can other startups learn from what they did in order to achieve success?
Justin 1:05:49
When you say building and deploying technology for the Navy, let's talk about cases where they're building and deploying a capability and the Navy is making use of it. So maybe Project AMMO for MLOps, where we had a decent amount of in-house software, and then kind of a coagulation of COTS. We had companies like Domino Data Lab and Latent AI that were doing commercial work and said, we do this for pharmaceutical and healthcare; we can do machine learning operations for defense functions better than you can build something up yourself. And so there was a "Hey, meet this team, work on this problem, take a look at our algorithms." And that relationship grew. They showed the difference they were making. They brought that business mindset where they could show, here is the difference that we're making.
And then they did something else that is an unlock: they found other people's money to apply to the problem. They applied to the APFIT program and got a significant plus-up there, which helped, certainly, the Navy, and hopefully also the Marine Corps. As that group showed the difference they were making, understood who the players were, applied their knowledge and experience from other sectors, and then found funding for that valley of death until we could get them into the larger-scale sustainment budget, those were the big factors in how they made that difference. And now they've also bought enough time to show how outsized effects can come out of their next batch of work.
Artem 1:07:53
I'll use two really quick examples. One is a hardware example. The Navy brought in Saronic to build under PRIME, which is a DIU (Defense Innovation Unit) contract for small unmanned surface vessels, sUSVs. A year ago, we did not have thousands of attritable sUSVs; a year from now, we will. When we actually ran the PRIME solicitation, Saronic had not yet built the Corsair, which is the boat the Navy is buying in mass. In fact, that company did not exist three years ago, and the idea of using a small, fast-moving boat to blow up large capital warships didn't exist in the same way that it does today. So Saronic is a great example of a hardware company that the Navy has brought in; we've adopted the technology and are now developing the tactics. We've actually set up two unmanned surface vessel squadrons, two new units that didn't exist, and one of them didn't exist even a month ago. That's the rate of change here.
And then the second, a software example that's really a success story. Justin mentioned MLOps; I'm an alum of something called Project Maven in the DoD, which was the application of computer vision to overhead intelligence-gathering imagery satellites. The success of Project Maven is that the government set up a data environment, gave a bunch of industry partners, ranging from large companies like Microsoft down to small startups like Modern Intelligence, access to the same data, and said: compete and build us the best models. And so that's a success story for two reasons. One, we brought a bunch of new entrants into the defense industrial base; companies that previously wouldn't have been able to work with the Department of Defense now can, because of the technical implementation of this program. And two, we're bringing market forces into government. Usually the government has one program of record to build a missile. Well, wouldn't it be great if there were one program of record but five missile vendors below it, all building and competing with each other? That's the environment we set up for Maven for the delivery of computer vision models into the DoD, and as a result it is now the joint program of record for automatic target recognition in the Department.
Akhil 1:10:22
Justin, Artem, as we're coming up on time here, I wanted to come back to your thoughts and perspectives for the startup and industry community. Artem, I thought you made some great points about aligning the right incentives between venture capital, which is spending more time in the space; the government, which has a real need based on geopolitics; and industry, which wants to get involved, either because of the mission or something else. Sometimes those incentives can be a little misaligned. So say I'm a new startup in this space: what are some pieces of advice for startup founders who might be thinking about getting into the government space or working on a really tough and important national security problem set?
Artem 1:11:10
It might surprise folks, but the best piece of advice you can give any startup founder, I think, is always know the problem you're solving, and really know the people whose problem you're solving. That isn't always obvious in government. The DoD is the single largest employer in the world, so where do you even begin? Who is your end user? Let's say you're really passionate about solving an electronic warfare problem. You could go talk to a requirements officer in the Pentagon, somebody at the Washington Navy Yard who works in a program office that buys things, or a Navy SEAL at a unit forward-deployed in the western Pacific. All three of those could potentially be some semblance of an end user, so it can be challenging to figure out who you really need to talk to. My number one piece of advice is to start with the person closest to the edge: the operator in a camouflage uniform, ideally on active duty orders somewhere forward-deployed, is the end user. No matter what a program office tells you, the person at the edge isn't responsible for buying things, and that's okay. You should still be talking to them, because that's whose problem you're solving.
And the second piece of advice is that it's going to be a long, long game. Defense is slow-moving capital. We operate on what are called FYDPs, Future Years Defense Programs, so we operate on long capital cycle times. Recognize that if you're solving a defense problem, not only are you choosing to solve one of the hardest problems out there, both from a technical standpoint and from a capability delivery standpoint, and some of the most important problems, I might add, you're also dealing with slow money. So persistence is going to be key. Those are my two big pieces of advice.
Justin 1:13:25
On that last piece: you have to be impatiently patient or patiently impatient. Which is to say, even if it's slow, there is work to do every day, and I would put the bulk of my effort into talking with people who get it. If people are thinking about problems traditionally, you're unlikely to change their minds. Spending time talking to people who have done things the other way is good community service, but if you find fellow travelers, people who want to be commercial adopters, who are disruptive, you will notice that, and that's where I would dedicate my time. It's table stakes to have something that is too good to be ignored. It is also table stakes, in my opinion, to have something that you can turn off. When you find what you're solving and you're close to that problem, and you have people with the light bulb over their heads who get it with you, then work with them to denote what is severable and what you can carve out in fiscal year '26, fiscal year '27, fiscal year '28, until we get that refined budget. We should be turning off or replacing capability and solving problems at the same time. You have to do the gain creation and the pain reduction at the same time in order to walk the walk. Otherwise, you have to tell your investors, hey, they really like it, we'll see something in three years. This is a contact sport; if you are not severing something in the meantime, then you're probably at the wrong part of the stack.
Maggie 1:15:31
If you were granted three wishes from a genie, and you could only use them to advance the state of DoD tech adoption, what would you wish for?
Artem 1:15:39
Do we both get three, or do we get three total?
Maggie 1:15:43
Okay, you guys can both get three.
Justin 1:15:46
All right, that's a nice Genie.
Artem 1:15:48
That is a nice genie. Yeah, it's like there's a queue for the genie, and each person gets three wishes. This is great.
Justin 1:15:54
I'm down. I think we should both get three and then decide on a consensus three.
Artem 1:16:00
I like that. My number one is that every program manager in the government is tied at the hip to an operator, one who is ideally, like, under the age of 30, brand new, has not seen how we've always done it, and is just eager to try new things. So each program manager is tied at the hip to an operator, and they're solving that operator's problem.
Number two is that that program manager is doing it with commercial tech. Instead of looking at something in a government lab or looking to do a five-year or ten-year acquisition, they're looking to bring in a thing that's ready to go off the shelf today.
And the third is that the government has more flexibility. Specifically, when I say the government, I mean the executive branch has more flexibility to move money around inside of the execution year. Justin talked about fiscal year 28 and 29; that's where our heads are right now. When we're in the Pentagon, we're thinking about what we're going to buy five years from now. That's kind of a ridiculous model. We should be thinking about what we want to buy right now, because we don't know what the tech is going to be five years from now. So I hope that genie lets me move money around today to divest from legacy things and recapitalize that money immediately into a new thing, using that program manager who knows commercial tech and is tied at the hip to an operator.
Justin 1:17:30
I think those are really good. If there's anyone listening to the show who either has done Maverick things that we haven't documented or wants to, please reach out to us: on LinkedIn if you're on the outside, or Teams if you're on the inside. We want to expand the Unleashed base so that we can show a compounding return on investment, from hearts and minds to effects across the board. Now is the time.
Maggie 1:18:04
Thank you guys so much for coming on the show and sharing some of these great success stories.
Akhil 1:18:09
Hey everyone. Thanks for listening to the Mission Matters podcast from Shield Capital. Tune in again next month for another conversation with founders building for a mission that matters. And if you yourself are looking to build in the national security space, please reach out to us.