Deepfake research. CIA’s cloud contract. 5G infrastructure in Germany.
November 25, 2020
US Senate authorizes deepfake research.
The Cyber Security News reports that the US Senate unanimously passed a bill directing NSF and NIST to sponsor research on generative adversarial networks (GANs), systems trained on real inputs to produce inauthentic outputs. The Identifying Outputs of Generative Adversarial Networks (IOGAN) Act would cost the two agencies $6 million over the next five years, "subject to the availability of appropriated funds," according to a Congressional Budget Office document.
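The bill's definition tracks the standard GAN setup, in which a generator network and a discriminator network are trained against each other until the generator's inauthentic outputs become difficult to distinguish from real data. As a rough, illustrative sketch only (a toy PyTorch example on 1-D data, not any system referenced in the bill), the adversarial loop looks like this:

```python
# Toy GAN: a generator learns to mimic "real" 1-D samples (a Gaussian)
# while a discriminator learns to separate real from generated data.
# Illustrative only; deepfake models operate on images, audio, or video.
import torch
import torch.nn as nn

torch.manual_seed(0)

def real_batch(n=64):
    # "Authentic inputs": samples from N(3.0, 0.5)
    return torch.randn(n, 1) * 0.5 + 3.0

generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(),
                              nn.Linear(16, 1), nn.Sigmoid())

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    # Discriminator step: label real samples 1, generated samples 0.
    real = real_batch()
    fake = generator(torch.randn(64, 8)).detach()
    d_loss = (bce(discriminator(real), torch.ones(64, 1)) +
              bce(discriminator(fake), torch.zeros(64, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator step: try to get generated samples labeled as real.
    fake = generator(torch.randn(64, 8))
    g_loss = bce(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# The generator's output mean should drift toward 3.0 as training converges.
print(generator(torch.randn(1000, 8)).mean().item())
```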
NSF would be tasked with awarding competitive grants on "manipulated digital content and information authenticity" (research it already sponsors) in coordination with DARPA and IARPA. NIST would likely need to hire ten people to stand up research into "measurements and standards for the development of technological tools that examine" deepfake technology. Both agencies would also be directed to explore public-private partnerships. The next stop for the bill is the House of Representatives, which passed a "nearly identical" bill last year, as noted by Biometric Update.
For the curious, NSF’s mandate specifically covers the following:
- Tools for detecting inauthenticity and confirming authenticity.
- Behavioral studies of interactions with manipulated media.
- Effective methods of public education.
We also heard from some industry experts on the prospects of the research. Yossi Zekri, President and CEO at Acuant, said, “This legislation is vital to defend against growing threats that engage deepfake technology. While several instances of these attacks have gone viral, there is real damage being done on a much broader scale as well when it comes to financial transactions and fraudsters using this tech to subvert identity verification. It is imperative that both governments and businesses take measures to defeat bad actors via legislation and by employing technology that uses advanced AI that can prevent this fraud.”