Someday My 'Nets Will Code
Episode Synopsis
Information about the AI Event Series mentioned in this episode: https://twitter.com/CNA_org/status/1400808135544213505?s=20 To RSVP, contact Larry Lewis at [email protected].

Andy and Dave discuss the latest in AI news, including a report on Libya from the UN Security Council's Panel of Experts, which notes the March 2020 use of the "fully autonomous" Kargu-2 to engage retreating forces; it is unclear whether anyone died in the incident, and many other important details are missing. The Biden Administration releases its FY22 DoD budget, which increases the RDT&E request and includes $874M for AI research. NIST proposes an evaluation model for user trust in AI and seeks feedback; the model includes definitions for terms such as reliability and explainability. EleutherAI provides an open-source alternative to GPT-3, called GPT-Neo, which trains on the 825GB "Pile" dataset and comes in 1.3B- and 2.7B-parameter versions. CSET takes a hands-on look at how transformer models such as GPT-3 can aid disinformation, publishing its findings in Truth, Lies, and Automation: How Language Models Could Change Disinformation. IBM introduces CodeNet, a project aimed at teaching AI to code, built on a large dataset containing 500 million lines of code across 55 legacy and active programming languages. In a separate effort, researchers at Berkeley, Chicago, and Cornell publish results on using transformer models as "code generators," creating a benchmark (the Automated Programming Progress Standard) to measure progress; they find that GPT-Neo could pass approximately 15% of introductory problems, with GPT-3's 175B-parameter model performing much worse (presumably due to the inability to fine-tune the larger model). The CNA Russia Studies Program releases an extensive report on AI and Autonomy in Russia, capping off its biweekly newsletters on the topic.
Arthur Holland Michel publishes Known Unknowns: Data Issues and Military Autonomous Systems, which identifies the known data issues that cause problems for military autonomous systems. The short story of the week comes from Asimov in 1956, with "Someday." And the Naval Institute Press publishes a collection of essays, AI at War: How Big Data, AI, and Machine Learning Are Changing Naval Warfare. Finally, Diana Gehlhaus from Georgetown's Center for Security and Emerging Technology (CSET) joins Andy and Dave to preview an upcoming event, "Requirements for Leveraging AI." The interview with Diana Gehlhaus begins at 33:32.