Monday, 14 May 2018

RAND looks at the uses of AI and then takes a long look at the chances of an accidental Nuclear War!

This is not a passing reference to Dr. Strangelove; in fact, quite the opposite. In that film, the humans try to de-escalate after an errant commander's psychosis sets off a chain of events leading to nuclear Armageddon (sorry, should have said "spoiler alert" before that). RAND's interpretation of the risk is somewhat different, and more frightening.
It is especially worrying because "Deep Learning" is much more of a black box than the symbolic, rule-based expert systems of GOFAI (Good Old Fashioned Artificial Intelligence). As an old (and distinguished) AI professor on the Aberdeen University faculty once said of intelligent autonomous systems (in this case, air traffic control): "Yes, very clever, but keep a (sensible) human in the loop please; that might be my flight you happen to be working with!" In other words: always watch the watchers.
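The contrast can be sketched in a few lines. This is a toy illustration, not anything from the RAND report: the rule names, thresholds, and weights below are all made up. The point is that a symbolic rule can return its reasoning alongside its verdict, while a learned model typically returns only a number.

```python
# Toy contrast: a GOFAI-style rule vs. an opaque learned scorer.
# All inputs, thresholds, and weights here are illustrative assumptions.

def expert_system_alert(radar_blips: int, comms_lost: bool) -> tuple[bool, str]:
    """Symbolic rule: the decision carries a human-readable justification."""
    if radar_blips > 5 and comms_lost:
        return True, "Rule fired: more than 5 radar blips AND comms lost"
    return False, "No rule fired"

def black_box_alert(radar_blips: int, comms_lost: bool) -> bool:
    """Opaque model: a weighted sum with no stated rationale.
    In a real system the weights would come from training data."""
    score = 0.31 * radar_blips + 2.7 * int(comms_lost) - 3.0
    return score > 0  # the caller sees only the verdict

alert, reason = expert_system_alert(7, True)
print(alert, "-", reason)        # the human in the loop can audit this
print(black_box_alert(7, True))  # a bare verdict, with nothing to audit
```

The professor's "keep a human in the loop" is much easier to honour with the first style: there is a `reason` string for the human to check before acting.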
