
The aircraft flew at speeds of up to 1,200 mph. DARPA did not reveal which aircraft won the dogfight.

[-] KeenFlame@feddit.nu 44 points 6 months ago

I am a FIRM believer that any automated kill without a human pulling the trigger is a war crime

Yes mines, yes UAVs, yes, yes, yes

It is a crime against humanity

Stop

[-] antidote101@lemmy.world 10 points 6 months ago

What if the human pulls the trigger to "paint the target" and tag it for hunt-and-destroy, and then the drone goes and kills it? Because that's how lots of missiles already work. So where's the line?

[-] KeenFlame@feddit.nu 7 points 6 months ago

The line is where an automated process targets and executes a human being. Arming a device is not sufficient to count as human interaction, and as such mines are also not allowed.

This should in my opinion always have been the case. Mines are indiscriminate and have proven to be wildly inhumane in several ways. Significantly, innocents are often killed.

But mines don't paint the picture of what automated slaughter can lead to.

The point has been made that when a conscious mind has to do the killing, war keeps an important way to end: in the mind.

The dangers extend well beyond killing innocent targets. Another part is the coldness of allowing a machine to decide, which is beyond morally corrupt. There is something terrifying about the very idea that, facing one of these weapons, there is nothing to negotiate; the cold calculations that want to kill you are not human. It is a place where no human ever wants to be. But war is horrible. It's the escalation of automated triggers, which can lead to exponential death with no remorse, that is the truly terrible danger.

These murder weapons have nobody's intent behind them, except very far back, in the arming and the programming. That opens up scenarios where mass murder becomes easy and terrifyingly cold.

Kind of like the prisoner's dilemma shows us: when war escalates, it can quickly devolve into revenge narratives, and when either side has access to cold, impersonal kills, they will use them. This removes even more humanity from the acts, and the violence can reach new heights beyond our comprehension.
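To make the escalation point concrete, here's a toy iterated prisoner's dilemma sketch (made-up payoff values, not from any real model): both sides play tit-for-tat, one side "defects" once, and nothing in either strategy can choose to de-escalate.

```python
# Toy iterated prisoner's dilemma: two tit-for-tat players,
# one provocation, and no mechanism for de-escalation.
PAYOFFS = {  # (my move, their move) -> my payoff; made-up values
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def play(rounds=10):
    """Both sides mirror the opponent's previous move; A defects once in round 3."""
    a_hist, b_hist = [], []
    for r in range(rounds):
        a = "D" if r == 3 else (b_hist[-1] if b_hist else "C")
        b = a_hist[-1] if a_hist else "C"
        a_hist.append(a)
        b_hist.append(b)
    return "".join(a_hist), "".join(b_hist)

def score(mine, theirs):
    return sum(PAYOFFS[(m, t)] for m, t in zip(mine, theirs))

a, b = play()
print(a, b)                      # CCCDCDCDCD CCCCDCDCDC
print(score(a, b), score(b, a))  # both worse than 30 (unbroken cooperation)
```

One provocation turns into an alternating revenge cycle for the rest of the game, and both sides score worse than if cooperation had held. That's the worry with automated retaliation: the loop has no component that can decide to stop.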

Weapons of mass destruction with automated triggers will eventually seal our fate if we don't abolish them outright. It has been seen over and over that the human factor is the only grace that ever ends or contains a war. Without that component, I think we are doomed: the last intent humans ever hold will be revenge, and the last emotions fear and complete hopelessness.

[-] antidote101@lemmy.world 2 points 6 months ago* (last edited 6 months ago)

Well, that's all very idealistic, but it's likely not going to happen.

Israel already used AI to pick bombing sites; those bombs and missiles would have been programmed with altitudes and destinations (armed), then dropped. The pilot's only job these days is to avoid interception, fly to the bombing locations, tag the target when acquired, and drop. Most of this is already done in software.

Eventually humans will leave the loop because unlike self-driving cars, these technologies won't risk the lives of the aggressor's citizens.

If the technology is seen as unstoppable enough, there may be calls for warnings to be given, but I suspect that's all the mercy that will be shown...

... especially if it's a case of a country with automated technologies killing one without them, or with stochastically meaningless defenses (e.g. defenses that modelling and simulations show won't be able to prevent such attacks).

No, in all likelihood the US will tell the country the attack sites; the country either will or will not have the technical capability to prevent some of the damage, will evacuate all necessary personnel, and whoever doesn't get the message or get out in time will be automatically killed.

Where defenses are partially successful, that information will go into the training data for the next model, or upgrade, and the war machine will roll on.

[-] KeenFlame@feddit.nu 0 points 6 months ago

Sorry I was stressed when replying. Yeah in those cases humans have pulled the trigger. At several stages.

When you arm a murder-bot ship and send it off to erase an island of life, you then lose control. That person is not pulling loads and loads of triggers. The triggers are pulled automatically, by a machine making the decision to end those lives.

And that is the danger, same as with engineered bio-warfare. It just cannot be let out of the box at all, or we may all die extremely quickly.

[-] KeenFlame@feddit.nu -2 points 6 months ago

You described scenarios where a human was involved at several stages of the killing, so it's no wonder those don't hold up

[-] postmateDumbass@lemmy.world 6 points 6 months ago

If it is a bad kill, is there a person who will go to jail or be executed for it?

[-] antidote101@lemmy.world 5 points 6 months ago* (last edited 6 months ago)

Only the losing side is subject to war crimes trials, and no doubt rules of engagement will be developed and followed to prevent people going to jail due to "bad kills".

There are really no "bad kills" in the armed services, just scandals with limited public exposure.

Especially for the US, which doesn't subject itself to international courts like The Hague. So any atrocities, accidents, or war crimes will still just be internal scandals, and temporary ones.

Same as it is today.

[-] KeenFlame@feddit.nu 0 points 6 months ago

If a country implements murder machines that efficiently slay a continent, and then they don't stop at the sea,

will really nobody do anything?

Is that your belief for bad kills? Same with gas and engineered disease?

[-] KeenFlame@feddit.nu 0 points 6 months ago

Biological and chemical warfare

[-] VirtualOdour@sh.itjust.works 1 points 6 months ago

Of course there isn't, just like there isn't when a human makes a mistake on the battlefield. Do you think every civilian killed by an American soldier in Afghanistan resulted in a trial and punishment? America hasn't executed any soldiers since 1961 (for the rape and attempted murder of a child in Austria, not during war).

Honestly, at least the military code will obey orders and focus only on the objective, rather than raping and murdering for fun.

[-] KeenFlame@feddit.nu 0 points 6 months ago

Like if someone made a biological weapon that wipes out a continent

Will someone go to prison?

There's no difference

[-] NeatNit@discuss.tchncs.de 8 points 6 months ago

I see this as a positive: when both sides have AI unmanned planes, we get cool dogfights without human risk! Ideally over ocean or desert and with Hollywood cameras capturing every second in exquisite detail.

[-] Emmie@lemm.ee 8 points 6 months ago* (last edited 6 months ago)

I am a firm believer that any war is a crime and there is no ethical way to wage one, lmao. It's some kind of naive idea from extremely out-of-touch politicians.

War never changes.

The idea that we don't do war crimes and they do is only there to placate our fragile conscience. To assure us that yes, we are indeed the good guys. That the infants killed by our soldiers are merely collateral. A necessary price.

[-] KeenFlame@feddit.nu 1 points 6 months ago

Absolutely. But

There's a science and whole cultures built around war now

It is important to not infantilize the debate by being absolutist and just shutting any action out.

I am a hard core pacifist at heart.

But this law I want is just not related to that. It is something I feel is needed simply so we don't spell doom for our species. Like with biological warfare

How often do robots fail? How can anyone be so naive as to not see the same danger here as with bio warfare? You can't guarantee a robot won't become a cold-ass genocidal perpetual murder machine. And that's a no-no if we want to keep existing.

[-] DreamlandLividity@lemmy.world 8 points 6 months ago* (last edited 6 months ago)

You mean it should be a war crime, right? Or is there some treaty I am unaware of?

Also, why? I don't necessarily disagree, I am just curious about your reasoning.

[-] Hacksaw@lemmy.ca 23 points 6 months ago

Not OP, but if you can't convince a person to kill another person, then you shouldn't be able to kill them anyway.

There are points in historical conflicts, from revolutions to wars, when the very people you picked to fight for your side think "are we the baddies?" and just stop fighting. This generally leads to fewer deaths and sometimes a more democratic outcome.

If you can just get a drone to keep killing when any reasonable person would surrender you're empowering authoritarianism and tyranny.

[-] ohwhatfollyisman@lemmy.world 8 points 6 months ago

See the Star Trek: TNG episode "The Arsenal of Freedom" for a more explicit visualisation of this ☝️ guy's point.

[-] n3m37h@sh.itjust.works 8 points 6 months ago

Take the WWI Christmas truce, when everyone got out of the trenches and played some football (no, not American football, where a foot touches the ball 3x a game)

It almost ended the war

[-] KeenFlame@feddit.nu 4 points 6 months ago

Yes the humanity factor is vital

Imagine the horrid, destructive, cold force of automated genocide. It cannot be met by anything other than the same or worse, and at that point we are truly doomed,

Because there will then be no one that can prevent it anymore

It must be met with even fiercer opposition than biological warfare faced after WWI, hopefully before a tragedy rather than after one

[-] i_love_FFT@lemmy.ml 6 points 6 months ago* (last edited 6 months ago)

Mines are designated war crimes by the ~~Geneva convention~~ Ottawa treaty because of their indiscriminate killing. Many years ago, good human rights lawyers could have extended that to drones... (Source: I had close friends in international law)

But I feel like now the tides have changed, and tech companies have influenced the general population into thinking that AI is good enough to prevent "indiscriminate" killing.

Edit: fixed the treaty name, thanks!

[-] tal@lemmy.today 8 points 6 months ago* (last edited 6 months ago)

Mines are designated war crimes by the Geneva convention

Use of mines is not designated a war crime by the Geneva Convention.

Some countries are members of a treaty that prohibits the use of some types of mines, but that is not the Geneva Convention.

https://en.wikipedia.org/wiki/Ottawa_Treaty

[-] DreamlandLividity@lemmy.world 3 points 6 months ago* (last edited 6 months ago)

Mines are not part of what people refer to as the Geneva conventions. There is a separate treaty specifically banning some landmines, that was signed by a lot of countries but not really any that mattered.

[-] KeenFlame@feddit.nu 2 points 6 months ago

Yes

Because it is a slippery slope and dangerous to our future existence as a species

[-] DreamlandLividity@lemmy.world 1 points 6 months ago
[-] KeenFlame@feddit.nu 0 points 6 months ago

First it is enemy tanks. Then enemy air. Then enemy boats and vehicles, then foot soldiers, and when these weapons are used, the same happens to their enemy. Then at last, one day, all humans are killed.

[-] xor@lemmy.blahaj.zone 2 points 6 months ago

I broadly agree, but that's not what this is, right?

This is a demonstration of using AI to execute combat against an explicitly selected target.

So it still needs the human to pull the trigger; it's just that the trigger does some sick plane stunts rather than firing a bullet in a straight line.

[-] KeenFlame@feddit.nu 2 points 6 months ago

I would imagine it was more than evasive since they called it a dogfight, but ye

this post was submitted on 19 Apr 2024
373 points (97.7% liked)
