The "Boring" Part of AI and Robots in War
Changing the way we think about maintenance and logistics for autonomous systems.
Today, The War Zone published a piece on how the US Marines need robots in “every formation” for the future of warfighting. Their Intelligent, Robotic, Autonomous Systems (IRAS) need to supplement, augment, and “obviate” human processes. The Marines are not alone in their autonomous fighting futures; every Service has strategies related to this, and frankly has had them for over a decade.
While that all certainly sounds pretty cool, I do wonder how much time is being spent on the less “sexy” part of the equation. The boring part of armed conflict: logistics and maintenance.
I’m certainly a fan of conceptualizing future Concepts of Operations (CONOPs) and Concepts of Employment (CONEMPs), as those are incredibly important for having a vision of how forces will fight, not just on their own but also in joint operations and with allied partners. The trouble, for me, is that there remains no call to “arms” when it comes to finding the magical unicorns who will be the maintainers of these systems.
Why are these guys or gals going to be unicorns? Well… here is a little insight from my time working on testing, evaluation, verification and validation (TEVV) problems associated with autonomous systems: TEVV for autonomous systems is super hard, even for people whose *only* job is TEVV. One source, who shall remain unnamed but resides rather high up in this TEVV ecosystem, told me directly that “if he had even $100,000 to test an AI-enabled autonomous system,” he “wouldn’t know how to do it, what to buy to test it with, or who to assign to do the testing.” In fact, that particular individual was sending one of his testers out for more coursework and certification on testing autonomous systems, so there would be someone with some certs and knowledge. (This is, of course, not industry or a big company that devotes far more money and time to testing, and also not big tech basically using the civilian population to beta test its autonomous cars…)
I’m not saying you can’t test them. You can. It depends on the system, on that system’s architecture, on whether the system is still learning once deployed or its learning was “frozen” before testing, and on whether you need additional software constraints in the architecture, say ground collision avoidance, geofencing, or other guardian-type software systems. The trouble, though, is that all sorts of complex factors exist in the real world when deployed that don’t exist in the test environments. What’s more, testing becomes trickier if you are changing physical environments drastically for operational push and pull.
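To make the “guardian-type software” idea concrete, here is a toy sketch of a geofence veto layer that sits between an autonomy stack’s proposed command and the actuators. Everything here (the class names, the bounding-box fence, the hold-position fallback) is a hypothetical illustration of the pattern, not any real fielded system; real geofences use polygonal boundaries and far richer vehicle state.

```python
# Hypothetical sketch: a geofence "guardian" that vetoes any commanded
# waypoint falling outside an allowed box. Illustrative only.
from dataclasses import dataclass


@dataclass
class Command:
    target_lat: float
    target_lon: float


@dataclass
class GeofenceGuardian:
    # Axis-aligned bounding box for simplicity; real fences are polygons.
    min_lat: float
    max_lat: float
    min_lon: float
    max_lon: float

    def inside(self, lat: float, lon: float) -> bool:
        return (self.min_lat <= lat <= self.max_lat
                and self.min_lon <= lon <= self.max_lon)

    def filter(self, proposed: Command, last_safe: Command) -> Command:
        # Pass through commands inside the fence; veto the rest and
        # fall back to the last known-safe command (hold position).
        if self.inside(proposed.target_lat, proposed.target_lon):
            return proposed
        return last_safe


guardian = GeofenceGuardian(34.0, 34.1, -117.1, -117.0)
safe = Command(34.05, -117.05)
unsafe = Command(35.00, -116.00)
assert guardian.filter(safe, safe) is safe      # inside: passes through
assert guardian.filter(unsafe, safe) is safe    # outside: vetoed
```

Note that the guardian itself is easy to test; the hard TEVV problem is everything feeding into it, such as whether the vehicle’s position estimate is even right when the check runs.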
Sure, you can model and sim your way through a lot of training. But M&S is only as good as the assumptions made in the M&S. For some stuff that isn’t too much of a barrier… DoD does M&S all the time. But for some systems, ones that will operate in far more complex and cluttered environments, the M&S won’t suffice. Ground-based systems face notorious challenges in this respect. It is not just hills, trenches, roads, trees, people, steps, stairs, asphalt, ice, dirt, snow, mud, sand, occlusions to all those things… it is everything… all the time. Training those systems to work reliably and safely (amongst their operators and crew, or their “teams” if you will) is no small feat. Just ask a roboticist who tries to build and train industrial robots to work alongside humans. And sure, we can digital-twin our way around some of these problems. But even then we need enough real-world testing to build a good digital twin, and testing potentially sensitive military technology out in the open isn’t an easy thing.
So, let’s say we can build a ground-based robot, built on commercial technology, that will be in it to win it with dismounted soldiers, marines, special operators, what have you. Even if we get that system acquired (whoa, Nelly!), and we get it deployed with trained operators (challenge two), we still have to have those maintainers! Maintainers who will need to know not only how the thing works from a software perspective, but also how it works across its various networked connections, plus a good understanding of the hardware, because it will likely take a beating in the field. I don’t know about you, but I’ve seen gear come back, and it never looks pretty.
Then you have the other difficulty with the hardware-software issue: parts. Relying on these systems means that whoever maintains them has to have access to replacement parts. Unless we are saying these systems are fully “attritable,” which, OK, fine, but that has its own cost (Replicator?). It also means there may need to be two maintainers: the software one and the hardware one (the mechanic). Yet, given the diversity of these hypothesized systems, that is still a tough find.
The DoD is not ignorant of this fact. It well understands the talent acquisition and management struggle it is facing. How it responds effectively, and how the Services do in particular, remains to be seen. Essentially, you are requiring either one genius, or a myriad of folks cobbled together to travel along with the robot. Paying those folks is going to be difficult given current pay levels.
Or maybe the future force is made up of Uber-Einhorns: deployed warfighters who do the fighting and are the technical whizzes and maintainers too. That would be something.
There are multiple objections I can see flying, crawling, slithering, swimming, or rolling my way: I’m stating the obvious; these are already known problems; we are fixing them; we are going to create new MOSs for the jobs; or even “hey! the Ukrainians are doing fine without experts.” Or: no one is going to have access to unicorns or Uber-Einhorns, so don’t worry about it. I have retorts for all of those, but won’t list them here.
The main point, though, is that if “the army marches on its stomach” (as it did back in the 17th and 18th centuries), it is going to be marching on a lot more than that now… like batteries, for one. There will have to be a cultural change in attitudes toward maintainers and logisticians if this pivot to our robot future really happens. Think back to when folks scoffed at “cyber warriors” or protested giving “drone” (aka remotely piloted aircraft) pilots medals. There are cultural connotations and systems of power and reward at work within DoD that go beyond how much you pay someone. So even mission-driven motivation can be dampened if these folks aren’t respected or valued the same. I don’t see that cultural movement in lockstep with the acknowledgement that these folks are going to be the thing the DoD is marching on…