Nvidia’s Chips Have Powered Nearly Every Major AI Breakthrough

Companies / AI Dec 24, 2020 - 06:02 PM GMT

By: Stephen_McBride

 “Within 20 years, machines will be capable of doing anything man can do.”

Take a stab at when this quote is from. It wasn’t this year, 2010, or even during the ‘90s tech boom. It’s from one of America’s top computer scientists, writing in 1960.

You’ve surely heard about Artificial Intelligence (AI) before. “AI” often conjures up images of intelligent robots taking over the world. You’ll often read that it’s only a matter of time before AI steals all our jobs.

But the idea of humanoid machines is nothing new. It began with the “heartless” Tin Man from The Wizard of Oz. By the 1950s, a generation of scientists and engineers were convinced we’d soon co-exist with clever robots.


The term artificial intelligence was coined in 1956 at Dartmouth during the world’s first AI conference. Attendee Marvin Minsky, who later founded MIT’s AI lab, said, “In 3–8 years, we will have a machine with the intelligence of a human.”

A couple of years later, Stanford created its AI project “with the goal of building a fully intelligent machine in a decade.”

This idea gripped Hollywood, too. Ever watch the sci-fi classic 2001: A Space Odyssey? The 1968 movie is best remembered for the intelligent supercomputer, HAL 9000. HAL could think just like a human and had the ability to scheme against anyone who threatened its survival.

Soon novels like I, Robot packed our bookshelves. We got stories of robots gone mad, mind-reading robots, robots with a sense of humor, and robots that secretly run the world.

Even the US military was convinced, so it pumped billions of dollars into AI research. In the ‘50s, we imagined bionic men would soon be running factories. Within a decade, cyborgs would be doing our housework. We were promised a new breed of machines.

70 years later, what did we get? Dishwashers, air conditioners, and microwaves!

How Do Robots Learn?

Despite many lofty predictions and billions of dollars in funding, we never got machines with human-like intelligence. You have to dig into how machines learn to see why the idea was a flop from the get-go.

“AI” is a term that’s shrouded in a weird mix of hype and complexity. But the core idea of artificial intelligence is a machine that learns and thinks just like you or me. Most importantly, it learns all by itself, without human intervention.

Of course, learning doesn’t come naturally to robots. To overcome this challenge, scientists created neural networks in the late 1950s. In short, neural networks are computer programs that mimic how the human brain works. They are made of thousands—sometimes millions—of artificial “brain cells” that learn through analyzing examples.

Say you’re creating a machine that can recognize cats. First, you’ll feed tons of cat pictures into the neural network. After analyzing, say, 1,000 examples, it starts to learn what a cat looks like. Then you can show it a real cat it’s never seen before, and it will know what it is.
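
The cat example above can be sketched in miniature. This is a toy stand-in, not a real image classifier: each “image” is boiled down to two made-up numeric features, and a single artificial “brain cell” (a logistic unit) learns the pattern purely from labeled examples, then gets tested on examples it has never seen.

```python
import numpy as np

# Toy stand-in for the cat example: each "image" is reduced to two
# invented features, and the labels follow a hidden rule the network
# must discover purely from examples.
rng = np.random.default_rng(0)

def make_data(n):
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)  # the hidden "is it a cat?" rule
    return X, y

X, y = make_data(1000)  # 1,000 labeled examples, as in the text

# One artificial "brain cell": two weights plus a bias, logistic activation.
w = np.zeros(2)
b = 0.0
lr = 0.1
for _ in range(200):                      # repeated passes over the examples
    p = 1 / (1 + np.exp(-(X @ w + b)))    # predicted probability of "cat"
    grad_w = X.T @ (p - y) / len(y)       # gradient of the log-loss
    grad_b = np.mean(p - y)
    w -= lr * grad_w                      # learning = nudging the weights
    b -= lr * grad_b

# Now show it examples it has never seen before.
X_new, y_new = make_data(200)
pred = (1 / (1 + np.exp(-(X_new @ w + b))) > 0.5).astype(float)
accuracy = np.mean(pred == y_new)
print(f"accuracy on unseen examples: {accuracy:.2f}")
```

Real neural networks stack thousands or millions of these units, but the learning loop—predict, measure the error, nudge the weights—is the same.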

Scientists who believed neural networks would breed intelligent computers were right on the money. Problem was… they lacked the raw materials needed to fuel their ambitions.

Remember, machines learn through analyzing examples, or data. And it turns out you need to feed them truly enormous amounts of data to kindle any kind of intelligence. So machines need to see hundreds of thousands, if not millions, of cat pictures before they “learn” what a cat looks like. But in the ‘60s and ‘70s, we didn’t have that much data. The internet hadn’t been invented yet, so we had almost no digital text or images. Books, photo libraries, and documents were still in the physical world, which meant converting them into digital files was inefficient and expensive.

And get this: the lack of data wasn’t even the greatest hurdle to building intelligent computers. Designing computer programs that mimic the human brain was genius. The drawback was neural networks needed hyper-fast computers to function.

Even in 1995, supercomputers were shockingly slow. For example, it took a giant “render farm” of 117 Sun Microsystems workstations running 24/7 to produce the original Toy Story. The machines worked non-stop for seven weeks to produce the 78-minute film.

A Match Made in Heaven

After 40 years in the wilderness, two huge breakthroughs are fueling an AI renaissance.

The internet handed us a near unlimited amount of data. A recent IBM paper found 90% of the world’s data has been created in just the last two years. From the 290+ billion photos shared on Facebook, to millions of e-books, billions of online articles and images, we now have endless fodder for neural networks.

The breathtaking jump in computing power is the other half of the equation. RiskHedge readers know computer chips are the “brains” of electronics like your phone and laptop. Chips contain billions of “brain cells” called transistors. The more transistors on a chip, the faster it is.

Your phone is more powerful than the render farm that produced Toy Story. The 117 Sun Microsystems workstations had 1 billion transistors, combined. There are 8.7 billion packed onto the chip inside the latest iPhone!

And in the past decade, a special type of computer chip emerged as the perfect fit for neural networks.

Do you remember the blocky graphics on video games like Mario and Sonic from the ‘90s? If you have kids who are gamers, you’ll know graphics have gotten far more realistic since then. Here’s each Lara Croft from the Tomb Raider series since 1996:


Source: Epic Games

This incredible jump is due to chips called graphics processing units (GPUs). GPUs can perform thousands of calculations all at once, which helps create these movie-like graphics. Traditional chips, by contrast, handle calculations one at a time.
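
The one-at-a-time versus all-at-once distinction can be felt even on a CPU. In this sketch, NumPy’s vectorized arithmetic stands in for GPU-style parallel processing (it is not an actual GPU, just the closest everyday analogue): the same per-pixel operation is run first element by element, then as one bulk operation.

```python
import time

import numpy as np

# A million "pixels" to brighten -- the kind of identical per-pixel
# arithmetic a GPU spreads across thousands of cores at once.
pixels = np.random.default_rng(1).random(1_000_000)

# One by one, the way a single traditional core proceeds.
t0 = time.perf_counter()
scaled_loop = [p * 1.5 for p in pixels]
t_loop = time.perf_counter() - t0

# All at once, in the spirit of GPU parallel processing
# (NumPy's bulk kernels stand in for the GPU here).
t0 = time.perf_counter()
scaled_vec = pixels * 1.5
t_vec = time.perf_counter() - t0

print(f"loop: {t_loop:.4f}s  vectorized: {t_vec:.4f}s")
```

Both produce identical results; the bulk version is typically orders of magnitude faster, and a real GPU widens that gap further by running thousands of such operations on dedicated hardware.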

Around 2006, Stanford researchers discovered that GPUs’ “parallel processing” abilities were perfect for AI training. For example, do you remember Google’s Brain project? The machine taught itself to recognize cats and people by watching YouTube videos. It was powered by one of Google’s giant data centers, running on 2,000 traditional computer chips. In fact, the project cost a hefty $5 billion.

Stanford researchers then built the same machine with GPUs instead. A dozen GPUs delivered the same data crunching performance of 2,000 traditional chips. And it slashed costs from $5 billion to $33,000! The huge leap in computing power and explosion of data means we finally have the “lifeblood” of AI.

America’s Most Important Company

Artificial intelligence is the ultimate buzzword in tech these days. Data from Bloomberg shows a record 840 US firms mentioned AI at least once in recent earnings reports. In short, it’s become a “mating call” for companies trying to attract investor dollars.

The reality is few of these companies are building intelligent systems. For example, venture capital firm MMC Ventures recently studied 2,830 AI start-ups. In 40% of cases, it found no evidence AI was an important part of their business.

You only need to ask one simple question to weed out the fakes: what percent of their sales comes from AI? I’ve done the work, and I can tell you only a handful make any money from this budding disruption.
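
That screening question is easy to mechanize. The company names and revenue figures below are entirely made up for illustration—the point is only the filter itself: keep firms whose AI revenue is a meaningful share of total sales.

```python
# Hypothetical screen -- every name and number here is invented purely
# to illustrate the "what percent of sales comes from AI?" test.
companies = {
    "ChipMakerA": {"revenue": 16.7, "ai_revenue": 2.8},  # figures in $ billions
    "BuzzwordCo": {"revenue": 40.0, "ai_revenue": 0.1},
    "StartupX":   {"revenue": 0.5,  "ai_revenue": 0.3},
}

def ai_share(c):
    """Fraction of total sales that actually comes from AI."""
    return c["ai_revenue"] / c["revenue"]

# Keep only firms where AI is a real business, not a mating call.
real_ai = {name for name, c in companies.items() if ai_share(c) >= 0.10}
print(sorted(real_ai))
```

On these made-up figures, BuzzwordCo’s 0.25% AI share fails the test while the other two pass—the same cut that separates genuine AI businesses from companies merely mentioning the buzzword on earnings calls.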

The one company with a booming AI business is NVIDIA (NVDA). NVIDIA invented graphics processing units back in the 1990s. It’s solely responsible for the realistic video game graphics we have today. And then we discovered these gaming chips were perfect for training neural networks.

NVIDIA stumbled into AI by accident, but it realized early on that this was a huge opportunity. Soon after, NVIDIA started building chips specifically optimized for machine learning. And in the first half of 2020, AI-related sales topped $2.8 billion. In fact, more than 90% of neural network training runs on NVIDIA GPUs today.

Its AI chips are light-years ahead of the competition. Its newest system, the A100, is described as an “AI supercomputer in a box.” With more than 54 billion transistors, it’s the most powerful chip system ever created.

In fact, just one A100 packs the same computing power as 300 data center servers. And it does it for one-tenth the cost, takes up one-sixtieth the space, and runs on one-twentieth the power consumption of a typical server room. A single A100 reduces a whole room of servers to one rack.

The Epicenter of Disruption

NVIDIA has a virtual monopoly on neural network training. And every breakthrough worth mentioning has been powered by its GPUs.

Computer vision is one of the world’s most important disruptions. And graphics chips are perfect for helping computers to “see.”

NVIDIA crafted its DRIVE chips specially for self-driving cars. These chips power several robocar startups including Zoox, which Amazon just snapped up for $1.2 billion. With NVIDIA’s backing, vision disruptor Trigo is transforming grocery stores into giant supercomputers.

Trigo fits out stores with a network of cameras and sensors, which feed its neural network with reams of data. In short, the network has learned to “see” what items customers throw in their baskets. So when you’re finished shopping, you simply walk out. Trigo then sends the store a tally, and the store bills you for that amount.
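
The billing step at the end of that pipeline is simple once the vision system has done the hard part. This sketch is hypothetical—it is not Trigo’s actual software, and the item names and prices are invented—but it shows the tally logic: turn a stream of “picked up” / “put back” events into a bill.

```python
from collections import Counter

# Made-up catalogue; a real deployment would pull prices from the store.
PRICES = {"milk": 1.20, "bread": 0.90, "apples": 2.10}

def tally(events):
    """Turn a stream of (action, item) events -- as a vision system
    might emit them -- into the amount to bill the shopper."""
    basket = Counter()
    for action, item in events:
        if action == "pick":
            basket[item] += 1
        elif action == "return":
            basket[item] -= 1
    return round(sum(PRICES[i] * n for i, n in basket.items() if n > 0), 2)

bill = tally([("pick", "milk"), ("pick", "bread"),
              ("pick", "apples"), ("return", "bread")])
print(bill)  # milk + apples only; the returned bread isn't charged
```

The difficulty, of course, is all upstream: producing those events reliably from camera feeds is what the 40–50 GPUs per store are for.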

Trigo’s computer vision system is powered by NVIDIA chips and software. The UK’s largest grocer, Tesco, is trialing Trigo in several of its stores, and each system runs on 40–50 GPUs.

But hands-down the biggest breakthroughs are happening in America’s most broken industry—healthcare.

Cancer is the #2 killer in America, responsible for 600,000 deaths last year. Catching the disease early has proven to be an effective way of beating it. But today, spotting tumors is a manual, time-consuming process.

Medical imaging disruptor Paige.AI built an AI system that could revolutionize cancer diagnosis. Paige.AI fed millions of real-life medical images into its neural network. Using 10 NVIDIA GPUs, it trained the system to detect early signs of tumors.

The neural network recently tested itself by scanning 12,000 medical images for potential tumors. It had never seen these images before, yet was able to “achieve near perfect accuracy.” After announcing these results, Paige.AI was granted “Breakthrough Designation” by the FDA, the first ever for an AI in cancer diagnosis.

NVIDIA is also opening the door to early detection of Alzheimer’s. Stanford researchers built an AI system that detects Alzheimer’s disease from scanning MRIs with 94% accuracy. Powered by six GPUs, it “learned” what biomarkers were most commonly associated with early signs of the disease.

The powerful GPU/AI combo is also saving victims of strokes. During a stroke, patients lose roughly 1.9 million brain cells every minute. So interpreting their CT scans even one second faster matters.

Medical imaging startup Deep01 has created a neural network, DeepCT, that evaluates strokes almost instantly. DeepCT reaches 95% accuracy within 30 seconds per case, roughly 10x faster than traditional methods. The system was trained on 60,000 medical images, using NVIDIA chips. And get this… Deep01 is the first Asian firm to be granted FDA clearance for an AI product.

I could pepper you with dozens more examples, but you see my point. NVIDIA’s chips have powered almost every major AI breakthrough. It’s a buy today.

The Great Disruptors: 3 Breakthrough Stocks Set to Double Your Money
Get my latest report where I reveal my three favorite stocks that will hand you 100% gains as they disrupt whole industries. Get your free copy here.

By Stephen McBride

http://www.riskhedge.com

© 2020 Copyright Stephen McBride - All Rights Reserved Disclaimer: The above is a matter of opinion provided for general information purposes only and is not intended as investment advice. Information and analysis above are derived from sources and utilising methods believed to be reliable, but we cannot accept responsibility for any losses you may incur as a result of this analysis. Individuals should consult with their personal financial advisors.


© 2005-2019 http://www.MarketOracle.co.uk - The Market Oracle is a FREE Daily Financial Markets Analysis & Forecasting online publication.

