Frank Sauer on Autonomous Weapon Systems

  • Published 28 May 2024
  • Frank Sauer joins the podcast to discuss autonomy in weapon systems, killer drones, low-tech defenses against drones, the flaws and unpredictability of autonomous weapon systems, and the political possibilities of regulating such systems. You can learn more about Frank's work here: metis.unibw.de/en/
    Timestamps:
    00:00 Autonomy in weapon systems
    12:19 Balance of offense and defense
    20:05 Killer drone systems
    28:53 Is autonomy like nuclear weapons?
    37:20 Low-tech defenses against drones
    48:29 Autonomy and power balance
    1:00:24 Tricking autonomous systems
    1:07:53 Unpredictability of autonomous systems
    1:13:16 Will we trust autonomous systems too much?
    1:27:28 Legal terminology
    1:32:12 Political possibilities
  • Science & Technology

Comments • 5

  • @MrWingman2009 · 5 months ago

    This is so good. It's concerning to me that there are so few views.

  • @danaut3936 · 5 months ago

    Another great conversation. Thank you!

  • @D2jspOFFICIAL · 3 months ago

    Dr Sauer 👍

  • @user-fl1rr6vg1y · 5 months ago

    Can't share this, though I would love to. Toss out "weapon" and replace it with "machine" or "system", and Mr. Sauer's point becomes even more relevant.

  • @blahblahsaurus2458 · 2 months ago

    "We", "us", "our", "humans".
    I am constantly shocked at the lack of imagination by everyone in these discussions. When mostly-autonomous control of weapons exists, and mostly-autonomous control of industrial tasks exists, we will reach a point where a very small group of people can build an army and fight a war all on their own. This will be possible very soon - if it isn't already - for all sorts of groups: corporations, rulers of countries, or anyone else with the necessary resources. And every time AI becomes more capable, this will be possible for a smaller group of people.
    When we reach this point, most of the considerations discussed in this podcast won't matter. Laws won't matter unless they are enforced by someone with an army that is keeping up with the arms race. Economics as we know it won't matter, because only people in control of their own industries and resources will have anything worth trading. Public opinion barely matters today.
    Most of the challenges discussed around AI weapons concern how to avoid friendly fire and collateral damage. But if the people with their own armies only care about themselves and the couple of thousand workers they still need, collateral damage is no longer a concern, and friendly fire is only a matter of efficiency.
    There is no "we" anymore. There are only humans who have their own army, and humans who don't. Very soon there are going to be factions that separate from the human-based economy; those factions will engage in an arms race, that arms race will force them into war, and that war will probably kill most humans, whether purposefully or as an unintended consequence.
    Nuclear weapons will suddenly become viable. If most of the population is not needed for the war effort, nuclear winter is not a deterrent. And if you're living in a well-hidden bunker, mutually assured destruction is no longer a deterrent either.