
Nvidia’s Chips Have Powered Nearly Every Major AI Breakthrough

Companies / AI Dec 24, 2020 - 06:02 PM GMT

By: Stephen_McBride


 “Within 20 years, machines will be capable of doing anything man can do.”

Take a stab at when this quote is from. It wasn’t this year, 2010, or even the ‘90s tech boom. It was made by one of America’s top computer scientists in 1960.

You’ve surely heard about Artificial Intelligence (AI) before. “AI” often conjures up images of intelligent robots taking over the world. You’ll often read that it’s only a matter of time before AI steals all our jobs.

But the idea of humanoid machines is nothing new. It began with the “heartless” Tin Man from The Wizard of Oz. By the 1950s, a generation of scientists and engineers were convinced we’d soon co-exist with clever robots.

The term artificial intelligence was coined in 1956 at Dartmouth during the world’s first AI conference. Attendee Marvin Minsky, who later co-founded MIT’s AI lab, predicted: “In 3–8 years, we will have a machine with the intelligence of a human.”

A couple of years later, Stanford created its AI project “with the goal of building a fully intelligent machine in a decade.”

This idea gripped Hollywood, too. Ever watch the sci-fi classic 2001: A Space Odyssey? The 1968 movie is best remembered for the intelligent supercomputer, HAL 9000. HAL could think just like a human and had the ability to scheme against anyone who threatened its survival.

Soon novels like I, Robot packed our bookshelves. We got stories of robots gone mad, mind-reading robots, robots with a sense of humor, and robots that secretly run the world.

Even the US military was convinced, so it pumped billions of dollars into AI research. In the ‘50s, we imagined bionic men would soon be running factories. Within a decade, cyborgs would be doing our housework. We were promised a new breed of machines.

70 years later, what did we get? Dishwashers, air conditioners, and microwaves!

How Do Robots Learn?

Despite many lofty predictions and billions of dollars in funding, we never got machines with human-like intelligence. You have to dig into how machines learn to see why the idea was a flop from the get-go.

“AI” is a term shrouded in a weird mix of hype and complexity. But the core idea of artificial intelligence is a machine that learns and thinks just like you or me. Most importantly, it learns all by itself, without human intervention.

Of course, learning doesn’t come naturally to machines. To overcome this challenge, scientists created neural networks in the late 1950s. In short, neural networks are computer programs that mimic how the human brain works. They are made of thousands, sometimes millions, of artificial “brain cells” that learn by analyzing examples.

Say you’re creating a machine that can recognize cats. First, you’ll feed tons of cat pictures into the neural network. After analyzing, say, 1,000 examples, it starts to learn what a cat looks like. Then you can show it a real cat it’s never seen before, and it will know what it is.
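To make the learn-by-example loop concrete, here is a toy sketch in Python with NumPy (an illustration of mine, not code from the article). A single artificial “neuron” learns to separate two clusters of 2-D points, a stand-in for cat vs. not-cat pictures, purely from labeled examples, then classifies a point it has never seen:

```python
import numpy as np

# Toy stand-in for "cat vs. not-cat": each "image" is just a 2-D point.
rng = np.random.default_rng(0)
cats     = rng.normal(loc=[2.0, 2.0],   scale=0.5, size=(500, 2))  # label 1
not_cats = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(500, 2))  # label 0
X = np.vstack([cats, not_cats])
y = np.concatenate([np.ones(500), np.zeros(500)])

w, b = np.zeros(2), 0.0                       # the neuron's learnable weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))  # squashes output to a 0..1 score

for _ in range(200):                 # learning = repeated small corrections
    p = sigmoid(X @ w + b)           # current guesses for every example
    grad_w = X.T @ (p - y) / len(y)  # how wrong each weight is, on average
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w                # nudge weights to reduce the error
    b -= 0.5 * grad_b

# Show it a point it has never seen before
new_point = np.array([1.5, 2.5])
print("cat probability:", sigmoid(new_point @ w + b))
```

Real image classifiers stack millions of such neurons in layers, but the mechanism, guess, measure the error, adjust, repeat, is the same.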

Scientists who believed neural networks would breed intelligent computers were right on the money. Problem was… they lacked the raw materials needed to fuel their ambitions.

Remember, machines learn by analyzing examples, or data. And it turns out you need to feed them truly enormous amounts of data to kindle any kind of intelligence. Machines need to see hundreds of thousands, if not millions, of cat pictures before they “learn” what a cat looks like. But in the ‘60s and ‘70s, we didn’t have that much data. The internet didn’t exist yet, so there was almost no digital text or imagery. Books, photo libraries, and documents lived in the physical world, and converting them into digital files was slow and expensive.

And get this: the lack of data wasn’t even the greatest hurdle to building intelligent computers. Designing computer programs that mimic the human brain was genius. The drawback was that neural networks needed hyper-fast computers to function.

And even in 1995, supercomputers were shockingly slow. For example, it took a giant “render farm” of 117 Sun Microsystems workstations running 24/7 to produce the original Toy Story. The machines worked non-stop for seven weeks to render the 78-minute film.

A Match Made in Heaven

After 40 years in the wilderness, two huge breakthroughs are fueling an AI renaissance.

The internet handed us a near-unlimited amount of data. A recent IBM paper found 90% of the world’s data was created in just the last two years. From the 290+ billion photos shared on Facebook to millions of e-books and billions of online articles and images, we now have endless fodder for neural networks.

The breathtaking jump in computing power is the other half of the equation. RiskHedge readers know computer chips are the “brains” of electronics like your phone and laptop. Chips contain billions of “brain cells” called transistors. The more transistors on a chip, the faster it is.

Your phone is more powerful than the render farm that produced Toy Story. Those 117 Sun Microsystems workstations had 1 billion transistors, combined. There are 8.7 billion packed onto the chip inside the latest iPhone!

And in the past decade, a special type of computer chip emerged as the perfect fit for neural networks.

Do you remember the blocky graphics on video games like Mario and Sonic from the ‘90s? If you have kids who are gamers, you’ll know graphics have gotten far more realistic since then. Just compare the original 1996 Lara Croft from the Tomb Raider series with today’s version (image source: Epic Games).

This incredible jump is due to chips called graphics processing units (GPUs). GPUs can perform thousands of calculations all at once, which is what makes movie-like graphics possible. That’s different from traditional chips, which work through calculations one at a time.
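The contrast between the two styles can be sketched in a few lines of Python. NumPy on a CPU is only a stand-in here (real GPU code would use CUDA or similar), but the shape of the two approaches is the same: one loops through the work a step at a time, the other hands the whole computation over in bulk so the hardware can run the pieces in parallel:

```python
import numpy as np

def dot_one_by_one(a, b):
    """Traditional-chip style: one multiply-add per step."""
    total = 0.0
    for x, y in zip(a, b):   # sequential: one element pair at a time
        total += x * y
    return total

def dot_all_at_once(a, b):
    """GPU style: express the whole job as one bulk operation."""
    return float(a @ b)      # the element-wise multiplies can run in parallel

a = np.arange(1000, dtype=np.float64)
b = np.arange(1000, dtype=np.float64)
print(dot_one_by_one(a, b), dot_all_at_once(a, b))  # same answer, different style
```

Neural network training is dominated by exactly this kind of bulk arithmetic (huge matrix multiplications), which is why the parallel style wins so decisively.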

Around 2006, Stanford researchers discovered GPUs’ “parallel processing” abilities were perfect for AI training. For example, remember Google’s Brain project? The machine taught itself to recognize cats and people by watching YouTube videos. It was powered by one of Google’s giant data centers, running on 2,000 traditional computer chips, and the project cost a hefty $5 billion.

Stanford researchers then built the same machine with GPUs instead. A dozen GPUs delivered the same data-crunching performance as 2,000 traditional chips, and slashed the cost from $5 billion to $33,000! The huge leap in computing power and the explosion of data mean we finally have the “lifeblood” of AI.

America’s Most Important Company

Artificial intelligence is the ultimate buzzword in tech these days. Data from Bloomberg shows a record 840 US firms mentioned AI at least once in recent earnings reports. In short, it’s become a “mating call” for companies trying to attract investor dollars.

The reality is few of these companies are building intelligent systems. For example, venture capital firm MMC Ventures recently studied 2,830 AI start-ups. In 40% of cases, it found no evidence AI was an important part of their business.

You only need to ask one simple question to weed out the fakes: What percent of their sales comes from AI? I’ve done the work, and I can tell you only a handful make any money from this budding disruption.

The one company with a booming AI business is NVIDIA (NVDA). NVIDIA invented graphics processing units back in the 1990s and is largely responsible for the realistic video game graphics we have today. Then researchers discovered these gaming chips were perfect for training neural networks.

NVIDIA stumbled into AI by accident, but it realized early on that this was a huge opportunity. Soon after, it started building chips specifically optimized for machine learning. In the first half of 2020, AI-related sales topped $2.8 billion, and more than 90% of neural network training today runs on NVIDIA GPUs.

Its AI chips are light-years ahead of the competition. Its newest system, the A100, is described as an “AI supercomputer in a box.” With more than 54 billion transistors, it’s the most powerful chip system ever created.

In fact, just one A100 packs the same computing power as 300 data center servers. And it does it for one-tenth the cost, takes up one-sixtieth the space, and runs on one-twentieth the power consumption of a typical server room. A single A100 reduces a whole room of servers to one rack.

The Epicenter of Disruption

NVIDIA has a virtual monopoly on neural network training. And every breakthrough worth mentioning has been powered by its GPUs.

Computer vision is one of the world’s most important disruptions. And graphics chips are perfect for helping computers to “see.”

NVIDIA crafted its DRIVE chips specially for self-driving cars. These chips power several robocar startups including Zoox, which Amazon just snapped up for $1.2 billion. With NVIDIA’s backing, vision disruptor Trigo is transforming grocery stores into giant supercomputers.

Trigo fits stores out with a network of cameras and sensors, which feed its neural network with reams of data. In short, the network has learned to “see” what items customers put in their baskets. So when you’re finished shopping, you simply walk out. Trigo sends the store a tally, and the store bills you for that amount.

Trigo’s computer vision system is powered by NVIDIA chips and software. The UK’s largest grocer, Tesco, is trialing Trigo in several of its stores, and each system runs on 40–50 GPUs.

But hands-down the biggest breakthroughs are happening in America’s most broken industry—healthcare.

Cancer is the #2 killer in America, responsible for 600,000 deaths last year. Catching the disease early has proven to be an effective way of beating it. But today, spotting tumors is a manual, time-consuming process.

Medical imaging disruptor Paige.AI built an AI system that could revolutionize cancer diagnosis. Paige.AI fed millions of real-life medical images into its neural network. Using 10 NVIDIA GPUs, it trained the system to detect early signs of tumors.

The neural network recently tested itself by scanning 12,000 medical images for potential tumors. It had never seen these images before, yet was able to “achieve near perfect accuracy.” After announcing these results, Paige.AI was granted “Breakthrough Designation” by the FDA, the first ever for an AI in cancer diagnosis.

NVIDIA is also opening the door to early detection of Alzheimer’s. Stanford researchers built an AI system that detects Alzheimer’s disease from scanning MRIs with 94% accuracy. Powered by six GPUs, it “learned” what biomarkers were most commonly associated with early signs of the disease.

The powerful GPU/AI combo is also saving victims of strokes. During a stroke, patients lose roughly 1.9 million brain cells every minute. So interpreting their CT scans even one second faster matters.

Medical imaging startup Deep01 has created a neural network, DeepCT, that evaluates strokes almost instantly. DeepCT delivers 95% accuracy within 30 seconds per case, roughly 10x faster than traditional methods. The system was trained on 60,000 medical images using NVIDIA chips. And get this: Deep01 is the first Asian firm to be granted FDA clearance for an AI product.

I could pepper you with dozens more examples, but you see my point. NVIDIA’s chips have powered almost every major AI breakthrough. It’s a buy today.

The Great Disruptors: 3 Breakthrough Stocks Set to Double Your Money
Get my latest report where I reveal my three favorite stocks that will hand you 100% gains as they disrupt whole industries. Get your free copy here.

By Stephen McBride

© 2020 Copyright Stephen McBride - All Rights Reserved Disclaimer: The above is a matter of opinion provided for general information purposes only and is not intended as investment advice. Information and analysis above are derived from sources and utilising methods believed to be reliable, but we cannot accept responsibility for any losses you may incur as a result of this analysis. Individuals should consult with their personal financial advisors.

© 2005-2019 - The Market Oracle is a FREE Daily Financial Markets Analysis & Forecasting online publication.
