there is a difference between the requirements changing and a poor-quality, quickly made implementation proving to be inadequate.
agile approaches are based on quick implementations, redone as needed.
my favorite life cycle:
1> Start with requirements identification for the entire system.
2> Pick a subset of requirements to implement and demonstrate (or deliver) to the customer.
3> Refine the requirements as needed.
4> Go to step 2.
The key is that you have an idea of the overall system requirements and what is needed, in the end, for the software you are writing. Thus the refactoring and redesign due to things not included in the sprint do not occur (or occur less).
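The loop above can be sketched in code. This is purely illustrative; the function name, the requirement strings, and the fixed batch size are all invented for the example:

```python
def iterative_lifecycle(all_requirements, batch_size=2):
    """Illustrative sketch of the lifecycle above; names are invented."""
    remaining = list(all_requirements)   # step 1: requirements for the entire system
    delivered = []
    iterations = 0
    while remaining:
        iterations += 1
        subset = remaining[:batch_size]  # step 2: pick a subset to implement
        remaining = remaining[batch_size:]
        # ... implement `subset`, then demonstrate (or deliver) it to the customer ...
        delivered.extend(subset)
        # step 3: refine requirements here; feedback may reorder or extend `remaining`
        # step 4: loop back to step 2
    return delivered, iterations

reqs = ["login", "reports", "export", "audit log", "search"]
done, rounds = iterative_lifecycle(reqs)
print(done, rounds)  # all five requirements delivered across 3 iterations
```

The point the sketch makes is simply that `remaining` is known up front, so each iteration is picked with the whole system in view rather than discovered mid-sprint.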
"eventually get to live in a world where all the baby boomers are dead"
man, that stings, as a member of the birth class of 1963.
each of us is a product of our times. i bear no ill will toward those younger or older than myself. personally, i have lived my life in a way to be a good steward of the world. was it always successful? no.
not malice, perhaps ignorance. please enjoy what is left of the world. i did my best to leave it better than it was when i received it.
if you don't like the world, try to change it. you have agency.
at 10 miles, the data link cannot be jammed. and it won't be observed, either.
the military is very good at this 'mesh networking' thing. Link 16 is 40 years old at this point; I expect they have something much better.
mistakes in A/A combat can have serious repercussions. not only loss of expensive air vehicles, but things like civilian airliners.
'loyal wingman' gives the kill / no kill decision to an Air Force officer. And having the decision maker geographically close eliminates jamming, delays, and the requirement for a satellite infrastructure (as is required for Predator UAVs).
i hope we never assign a piece of code, AI or not, to be the decision maker.
Airbus has publicized that it is working on a Project Maven style project with France's DGA [0][1].
Thales also publicly launched and demonstrated SkyDefender a couple days ago [2].
Mistral AI also announced in January 2026 that it is working with the DGA to productionize its models for military applications [3] - ironically, similar to how the DoD was using Claude but is now using Gemini and GPT.
No country is going to leave networked, autonomous offensive and defensive capabilities on the table.
That's not really what they meant. They meant that the weapon is guided by software that decides which targets to pick and autonomously makes that decision without a human in the loop. The device seeks you instead of you going to it.
A landmine has no friend-or-foe-or-noncombatant decision engine, it will kill you or maim you just like it will kill or maim the guy that laid it or any other passer by.
You missed the point. The Mk 60 Captor is not a "land" mine. It is guided by software and autonomously makes the decision to launch a homing torpedo without a human in the loop.
A homing torpedo/release mechanism is not AI by the normal definition of the word. You're welcome to redefine words as much as you want but it's a bit silly. We also don't use that term for heatseekers or for line followers.
The 'signature' bit is interesting, but I'd still not label that AI, and neither does anybody else. It is a loitering munition, I'll give you that, and I think that brings it closer to the 'mine' definition of things than the 'AI killbot'.
No, it obviously does not only apply to LLMs. It just does not apply to loitering munitions from decades ago. And I'm pretty sure nobody ever labeled that thing AI before you did.
I understand the point you are trying to make here.
But to be very clear, the mine still has to be placed. What's happening now is that step is also automated, and may be automated by a system controlling many other weapons at the same time, across land/sea/air.
It's not remotely the same thing in terms of the scope and scale of what is being done with the networked weapon systems.
Maybe if you tied several hundred of them together with different weapon effects, sensors and capabilities operating across thousands of km at once you would start to get close.
I'm talking about weapon systems that know what color clothes the target is wearing and where his kids go to school.
You know the statement that the "network is the computer"?
Same applies here. We've connected sensors and weapon systems and LLMs together with data sources.
Drones don't automatically mean no pilot or no human in the loop. Motherships and relays to command centers solve nearly everything 'loyal wingman' offers. The only question is data access (which is still a major issue for manned jets operating in EW conditions far from home).
Same with the idea that drones can't have high-end radars and other stuff that requires a fancy human jet in the loop. Dedicated single-purpose drones with high-end sensors can serve a similar purpose at much lower risk and cost.
in the 1990s, Turkey had a woman prime minister.
she ended up resigning in a scandal caused by her husband accepting a boat (or work on the boat, i don't remember). the scandal was caused by the amount of the bribe. it was too low. the Turkish people could understand some corruption, but to be able to bribe the top leader for $50k? Unacceptable. If it had been $100 million, it would not have been a scandal.
This. The issue in the USA is that if you don't accept the money offered, you get primaried, and the bribe money you would have gotten just goes to the campaign of your primary opponent.
Rinse and repeat. Unless politicians band together and say "we need the full ROI of your project, and NONE of us will even talk to you unless we get half the profits, and you can't primary all of us at once".
Many lifetimes ago I worked on a project where all customer-desired changes were 'scoped' by engineering. A little report was written about how the change could be implemented, including any possible complications and a LOC estimate.
My manager did not want us to do a customer-desired change. So he wrote the 'impact report' himself.
The problem basically required taking functionality that applied to one thing and making it apply to an (existing) array of things. His estimate was written as if arrays did not exist, so the resulting LOC figure was N times bigger than it was actually going to be.
Of course, this change was not authorized until the manager left and someone who knew it was BS spoke up.
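For scale, the honest version of that kind of change is usually a handful of lines, not N copies. A hypothetical sketch (the function names and the string-cleanup example are invented, not the actual project code):

```python
# Before: the functionality applies to one thing.
def process(item):
    return item.strip().lower()

# After: applying it to an existing array of things is a few lines,
# not N near-identical copies of the original function -- which is
# roughly what the padded LOC estimate pretended it would take.
def process_all(items):
    return [process(item) for item in items]

print(process_all(["  Foo", "BAR "]))  # ['foo', 'bar']
```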
Same thing here.