2020-05-25 智邦网 (Zhibang Network)
Compiled by 致远
According to a 19 May 2020 report on flightglobal.com, Germany has launched an independent process to consider the ethical and legal implications of autonomy and artificial intelligence as part of the multinational Future Combat Air System (FCAS) programme, which is due to enter service around 2040.
Thomas Grohs, FCAS chief engineer at Airbus Defence & Space, notes that the French-German-Spanish programme has initially focused on technical and design requirements, but that ethical considerations must also be addressed. He says resolving such questions on a multinational project will require a modular approach to designing and implementing neural networks, since behaviour "may differ from the different users according to their ethical understanding".
German air force Brigadier General Gerald Funke, national programme leader for FCAS, says: "Complexity is growing, and so are the chances and risks of using those technologies, not just in the military." Dr Wolfgang Koch, chief scientist at the Fraunhofer Institute for Communication, Information Processing and Ergonomics (FKIE), says that from its outset the European FCAS programme has raised "an intellectual struggle surrounding the technical implementation of ethical and legal principles", creating a need for "compliance by design".
Theologian Ellen Ueberschaer calls for clear "red lines" to be defined around the military use of artificial intelligence. "The deepest humiliation for a human being is to be killed by a machine," she says. "The core question is: can we at all stages, and even in the fog of war, guarantee meaningful human control?" Dr Frank Sauer, a senior researcher at the German military university in Munich, warns: "From a strategic perspective, fighting at machine speed is accompanied by a real risk of escalation at machine speed. So it is very prudent to keep humans involved as circuit breakers." "Delegating kill decisions to a machine is unacceptable, and goes against fundamental human values," he adds.
Berlin launched related activities at a state level in 2019 to assess the suitability of using future autonomous weapon systems, with the involvement of its defence and foreign affairs ministries. On 14 May, the effort was expanded with the support of participants including Airbus Defence & Space and the Fraunhofer Institute for Communication, Information Processing and Ergonomics (FKIE), plus think tanks and universities.
During a webcast event to launch the expanded initiative, Airbus Defence & Space FCAS chief engineer Thomas Grohs noted that initial activities on the French-German-Spanish programme have been focused on technical and design requirements, but that participants understand “there are also the ethical considerations”.
Grohs says addressing such aspects during a multinational project will require a modular approach when designing and implementing neural networks, since “this behavior may differ from the different users according to their ethical understanding”.
Describing the military’s consideration of employing such technology, German air force Brigadier General Gerald Funke, national programme leader for FCAS, comments: “Complexity is growing, and so are the chances and risks of using those technologies – not just in the military.”
Dr Wolfgang Koch, chief scientist at FKIE, says that from its outset, the European FCAS programme raises “an intellectual struggle surrounding the technical implementation of ethical and legal principles”, and carries with it a need for “compliance by design”.
This view is echoed by theologian Ellen Ueberschaer, who is calling for clear “red lines” to be defined around the use of AI in a military setting. “The deepest humiliation for a human being is to be killed by a machine,” she notes. “The core question is can we at all stages, and even in the fog of war, guarantee meaningful human control?”
Dr Frank Sauer, a senior researcher at the German military university in Munich, also cautions: "From a strategic perspective, fighting at machine speed is accompanied by a real risk of escalation at machine speed. So it is very prudent to keep humans involved as circuit breakers." "Delegating kill decisions to a machine is unacceptable, and goes against fundamental human values," Sauer adds.
The parties behind the newly established expert panel hope to continue providing information to the German public in a transparent manner, and expect the process to in time also be expanded to involve participation by other nations involved in the FCAS effort. “We are well in front, and can prompt discussion in France and Spain,” Funke believes.
France and Germany formally launched the FCAS programme at last year’s Paris air show, where Spain also signaled its intention to participate. Targeting a replacement capability for the nations’ current Dassault Rafale and Eurofighter combat aircraft, the effort is expected to deliver a New Generation Fighter, to be accompanied by so-called remote carrier vehicles and advanced air-launched weapons, all operating within a networked system.