DARPA's XAI Explainable Artificial Intelligence Future

Politics / AI May 15, 2018 - 02:00 PM GMT

By: BATR

The popular scenario has AI deploying autonomous killer military robots as storm troopers. The mission of DARPA is to create the cutting edge of weaponized technology. So when a report contends that the Pentagon is now using Jade Helm exercises to teach Skynet how to kill humans, it is not simply a screenplay for a Hollywood blockbuster.

"Simply AI quantum computing technology that can produce the holographic battle simulations and, in addition, "has the ability to use vast amounts of data being collected on the human domain to generate human terrain systems in geographic population centric locations" as a means of identifying and eliminating targets - insurgents, rebels or "whatever labels that can be flagged as targets in a Global Information Grid for Network Centric Warfare environments."


While this assessment may alarm the most fearful, Steven Walker, director of the Defense Advanced Research Projects Agency, presents a far more sedate viewpoint in "DARPA: Next-generation artificial intelligence in the works".

"Walker described the current generation of AI as its “second wave,” which has led to breakthroughs like autonomous vehicles. By comparison, “first wave” applications, like tax preparation software, follow simple logic rules and are widely used in consumer technology.

While second-wave AI technology has the potential to, for example, control the use of the electromagnetic spectrum on the battlefield, Walker said the tools aren’t flexible enough to adapt to new inputs.

The third wave of AI will rely on contextual adaptation — having a computer or machine understand the context of the environment it’s working in, and being able to learn and adapt based on changes in that environment."
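To make the quoted notion of contextual adaptation a little more concrete, here is a minimal, purely illustrative sketch (my own toy example, not any DARPA system): an agent that uses a constant learning rate, so its estimates keep tracking an environment whose behaviour shifts midway through.

```python
# Toy illustration of "learn and adapt based on changes in the environment".
# Nothing here is drawn from DARPA's actual systems.
import random

class AdaptiveAgent:
    """Epsilon-greedy agent with a constant step size, so recent outcomes
    outweigh old ones and the agent can track a drifting environment."""
    def __init__(self, n_actions, step_size=0.1, epsilon=0.1):
        self.values = [0.0] * n_actions
        self.step_size = step_size
        self.epsilon = epsilon

    def act(self):
        if random.random() < self.epsilon:
            return random.randrange(len(self.values))
        return max(range(len(self.values)), key=lambda a: self.values[a])

    def learn(self, action, reward):
        # Constant step size -> exponentially fading memory of the past,
        # which is what lets the agent adapt when the context changes.
        self.values[action] += self.step_size * (reward - self.values[action])

agent = AdaptiveAgent(n_actions=2)
for t in range(2000):
    best = 0 if t < 1000 else 1          # the "environment" changes at t = 1000
    action = agent.act()
    agent.learn(action, 1.0 if action == best else 0.0)

print(agent.values)  # after the shift, action 1's estimate overtakes action 0's
```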

Here is where the XAI model comes into play. The authoritative publication Jane's, in "DARPA's XAI seeks explanations from autonomous systems", reports: "According to DARPA, XAI aims to “produce more explainable models, while maintaining a high level of learning performance (prediction accuracy); and enable human users to understand, appropriately trust, and effectively manage the emerging generation of artificially intelligent partners”."

Mr. David Gunning provides an insight into why Explainable Artificial Intelligence (XAI) is the next development.

"XAI is one of a handful of current DARPA programs expected to enable “third-wave AI systems”, where machines understand the context and environment in which they operate, and over time build underlying explanatory models that allow them to characterize real world phenomena.

The XAI program is focused on the development of multiple systems by addressing challenge problems in two areas: (1) machine learning problems to classify events of interest in heterogeneous, multimedia data; and (2) machine learning problems to construct decision policies for an autonomous system to perform a variety of simulated missions. These two challenge problem areas were chosen to represent the intersection of two important machine learning approaches (classification and reinforcement learning) and two important operational problem areas for the DoD (intelligence analysis and autonomous systems)."

The FedBizOpps government site provides this synopsis: "The goal of Explainable AI (XAI) is to create a suite of new or modified machine learning techniques that produce explainable models that, when combined with effective explanation techniques, enable end users to understand, appropriately trust, and effectively manage the emerging generation of AI systems."
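As a minimal sketch of that synopsis (the dataset and library here are my own choices for illustration, not anything named by DARPA), an intrinsically interpretable classifier can act as both the predictive model and its own explanation:

```python
# Illustrative only: an "explainable model" whose learned rules can be
# printed for a human user to inspect and decide whether to trust.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
model = DecisionTreeClassifier(max_depth=3, random_state=0)
model.fit(iris.data, iris.target)

# The explanation is just the model's own decision rules in plain text.
print(export_text(model, feature_names=list(iris.feature_names)))
```

The trade-off the synopsis alludes to is that such directly readable models often give up some prediction accuracy compared with opaque ones, which is why the program also asks for explanation techniques layered on top of higher-performing learners.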

The private sector is involved in these developments. The question that gets lost involves national security, since Xerox is being bought by Fujifilm. But why worry over such mere details when the machines are on a path to become self-directed networks?

"PARC, a Xerox company, today announced it has been selected by the Defense Advanced Research Projects Agency (DARPA), under its Explainable Artificial Intelligence (XAI) program, to help advance the underlying science of AI. For this multi-million dollar contract, PARC will aim to develop a highly interactive sense-making system called COGLE (COmmon Ground Learning and Explanation), which may explain the learned performance capabilities of autonomous systems to human users."

With the news that the Xerox sale to Fuji has been called off, could the PARC component have been a deal-breaker?

As for trusting the results of the technology, just ask the machine. It will tell the human user what to believe. Another firm involved with XAI is Charles River Analytics. Its stated objective is to overcome the current limitations of the human interface. "The Department of Defense (DoD) is investigating the concept that XAI -- especially explainable machine learning -- will be essential if future warfighters are to understand, appropriately trust, and effectively manage an emerging generation of artificially intelligent machine partners."

The Defense Department's program is described in "New project wants AI to explain itself":

"Explainable Artificial Intelligence (XAI), which looks to create tools that allow a human on the receiving end of information or a decision from an AI machine to understand the reasoning that produced it. In essence, the machine needs to explain its thinking.

More recent efforts have employed new techniques such as complex algorithms, probabilistic graphical models, deep learning neural networks and other methods that have proved to be more effective but, because their models are based on the machines’ own internal representations, are less explainable.

The Air Force, for example, recently awarded SRA International a contract to focus specifically on the trust issues associated with autonomous systems."
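The explainability gap the quote points to is commonly attacked with post-hoc techniques. The sketch below, written in plain NumPy and not tied to any DARPA, Air Force, or SRA International method, shows one such idea: fitting a local linear surrogate (in the spirit of LIME) around a single prediction of an otherwise opaque model to estimate which inputs drove it.

```python
# Illustrative post-hoc explanation: approximate an opaque model near one
# input with a weighted linear fit, and read the coefficients as influence.
import numpy as np

def black_box(x):
    """Stand-in for an opaque model; returns a score for input vector(s) x."""
    return 1.0 / (1.0 + np.exp(-(2.0 * x[..., 0] - 3.0 * x[..., 1] + 0.5 * x[..., 2])))

def explain_locally(model, x0, n_samples=500, scale=0.1, seed=0):
    """Perturb x0, query the model, and fit a proximity-weighted linear
    surrogate whose coefficients approximate each feature's local influence."""
    rng = np.random.default_rng(seed)
    samples = x0 + scale * rng.standard_normal((n_samples, x0.size))
    preds = model(samples)
    weights = np.exp(-np.sum((samples - x0) ** 2, axis=1) / (2 * scale ** 2))
    X = np.hstack([samples, np.ones((n_samples, 1))])   # add an intercept column
    sw = np.sqrt(weights)[:, None]                       # weighted least squares
    coef, *_ = np.linalg.lstsq(sw * X, sw[:, 0] * preds, rcond=None)
    return coef[:-1]                                     # per-feature influence

x0 = np.array([0.2, -0.4, 1.0])
print(explain_locally(black_box, x0))  # roughly proportional to (2, -3, 0.5)
```

A human analyst reading those per-feature numbers gets a local, approximate answer to "why this score for this input", which is the kind of account the quoted program is asking machines to provide.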

It would be a mistake to equate an AI system with just an advanced autopilot device that navigates an aircraft. While the outward description of an objective to create AI communication with a human interface sounds reassuring, the actual risk of generating an entirely independent computerized decision structure is mostly ignored.

Just look at the dangerous use of AI at Facebook, raised in "AI Is Inventing Languages Humans Can’t Understand. Should We Stop It?" Can DARPA be confident that it can control a self-generating, thinking Artificial Intelligence entity that may very well see a human component as unnecessary? Imagine a future combat regiment that sees its commanding officer as inferior to the barking of a drill-sergeant computer terminal. In such an environment, where would a General Douglas MacArthur fit in?

XAI rests on the overly optimistic belief that humans can always pull the plug on a rogue machine. Well, such a conviction would need to be approved by the AI cloud computer.

SARTRE

Source: http://batr.org/utopia/051518.html

Discuss or comment about this essay on the BATR Forum

http://www.batr.org

"Many seek to become a Syndicated Columnist, while the few strive to be a Vindicated Publisher"

© 2018 Copyright BATR - All Rights Reserved

Disclaimer: The above is a matter of opinion provided for general information purposes only and is not intended as investment advice. Information and analysis above are derived from sources and utilising methods believed to be reliable, but we cannot accept responsibility for any losses you may incur as a result of this analysis. Individuals should consult with their personal financial advisors.


© 2005-2019 http://www.MarketOracle.co.uk - The Market Oracle is a FREE Daily Financial Markets Analysis & Forecasting online publication.

