
Nvidia’s Chips Have Powered Nearly Every Major AI Breakthrough

Companies / AI Dec 24, 2020 - 06:02 PM GMT

By: Stephen_McBride


 “Within 20 years, machines will be capable of doing anything man can do.”

Take a stab at when this quote is from. It wasn’t this year, 2010, or even during the ‘90s tech boom. It was said by one of America’s top computer scientists… in 1960.

You’ve surely heard about Artificial Intelligence (AI) before. “AI” often conjures up images of intelligent robots taking over the world. You’ll often read that it’s only a matter of time before AI steals all our jobs.

But the idea of humanoid machines is nothing new. It began with the “heartless” Tin Man from The Wizard of Oz. By the 1950s, a generation of scientists and engineers were convinced we’d soon co-exist with clever robots.

The term artificial intelligence was coined in 1956 at Dartmouth during the world’s first AI conference. Attendee Marvin Minsky, who later co-founded MIT’s AI lab, said, “In 3–8 years, we will have a machine with the intelligence of a human.”

A couple of years later, Stanford created its AI project “with the goal of building a fully intelligent machine in a decade.”

This idea gripped Hollywood, too. Ever watch the sci-fi classic 2001: A Space Odyssey? The 1968 movie is best remembered for the intelligent supercomputer, HAL 9000. HAL could think just like a human and had the ability to scheme against anyone who threatened its survival.

Soon novels like I, Robot packed our bookshelves. We got stories of robots gone mad, mind-reading robots, robots with a sense of humor, and robots that secretly run the world.

Even the US military was convinced, so it pumped billions of dollars into AI research. In the ‘50s, we imagined bionic men would soon be running factories. Within a decade, cyborgs would be doing our housework. We were promised a new breed of machines.

70 years later, what did we get? Dishwashers, air conditioners, and microwaves!

How Do Robots Learn?

Despite many lofty predictions and billions of dollars in funding, we never got machines with human-like intelligence. You have to dig into how machines learn to see why the idea was a flop from the get-go.

“AI” is a term shrouded in a weird mix of hype and complexity. But the core idea of artificial intelligence is a machine that learns and thinks just like you or me. Most importantly, it learns all by itself, without human intervention.

Of course learning doesn’t come naturally to robots. To overcome this challenge, scientists created neural networks in the late 1950s. In short, neural networks are computer programs that mimic how the human brain works. They are made of thousands—sometimes millions—of artificial “brain cells” that learn through analyzing examples.

Say you’re creating a machine that can recognize cats. First, you’ll feed tons of cat pictures into the neural network. After analyzing, say, 1,000 examples, it starts to learn what a cat looks like. Then you can show it a real cat it’s never seen before, and it will know what it is.
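The learn-from-examples loop can be made concrete with a toy neural network. The sketch below is a minimal stand-in, not a real cat classifier: plain NumPy, with invented two-feature “pictures” instead of pixels. It trains on 1,000 labeled examples, then classifies one it has never seen:

```python
import numpy as np

# Toy stand-in for "cat pictures": 2-feature examples instead of pixels.
# Invented rule: label 1 ("cat") if the two features sum to more than 1.
rng = np.random.default_rng(0)
X = rng.random((1000, 2))                       # 1,000 training examples
y = (X.sum(axis=1) > 1.0).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of artificial "brain cells" (8 neurons).
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(3000):                           # learn by analyzing examples
    h = sigmoid(X @ W1 + b1)                    # forward pass
    p = sigmoid(h @ W2 + b2)
    grad_p = (p - y) / len(X)                   # backpropagate the error
    grad_h = grad_p @ W2.T * h * (1 - h)
    W2 -= lr * h.T @ grad_p; b2 -= lr * grad_p.sum(axis=0)
    W1 -= lr * X.T @ grad_h; b1 -= lr * grad_h.sum(axis=0)

# Show it an example it has never seen before (features sum to 1.7 -> "cat").
new = np.array([[0.9, 0.8]])
pred = sigmoid(sigmoid(new @ W1 + b1) @ W2 + b2)
print(f"P(cat) = {pred[0, 0]:.2f}")
```

The principle scales directly: swap the two invented features for millions of pixel values and the 1,000 examples for hundreds of thousands of photos, and you have the data-hungry training problem the rest of this article describes.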

Scientists who believed neural networks would breed intelligent computers were right on the money. Problem was… they lacked the raw materials needed to fuel their ambitions.

Remember, machines learn through analyzing examples, or data. And it turns out you need to feed them with truly enormous amounts of data to kindle any kind of intelligence. So machines need to see hundreds of thousands, if not millions, of cat pictures before they “learn” what a cat looks like. But in the ‘60s and ‘70s, we didn’t have that much data. The internet wasn’t invented, so we had almost no digital text or images. Books, photo libraries, and documents were still in the physical world, which meant converting them into digital files was inefficient and expensive.

And get this: the lack of data wasn’t even the greatest hurdle to building intelligent computers. Designing computer programs that mimic the human brain was genius. The drawback was neural networks needed hyper-fast computers to function.

And even as late as 1995, high-end computers were shockingly slow. For example, it took a giant “render farm” of 117 Sun Microsystems workstations running 24/7 to produce the original Toy Story. The machines worked non-stop for seven weeks to produce the 78-minute film.

A Match Made in Heaven

After 40 years in the wilderness, two huge breakthroughs are fueling an AI renaissance.

The internet handed us a near-unlimited amount of data. A recent IBM paper found 90% of the world’s data has been created in just the last two years. From the 290+ billion photos shared on Facebook to millions of e-books and billions of online articles and images, we now have endless fodder for neural networks.

The breathtaking jump in computing power is the other half of the equation. RiskHedge readers know computer chips are the “brains” of electronics like your phone and laptop. Chips contain billions of “brain cells” called transistors. The more transistors on a chip, the faster it is.

Your phone is more powerful than the render farm that produced Toy Story. Those 117 Sun Microsystems workstations had 1 billion transistors, combined. There are 8.7 billion packed onto the chip inside the latest iPhone!

And in the past decade, a special type of computer chip emerged as the perfect fit for neural networks.

Do you remember the blocky graphics on video games like Mario and Sonic from the ‘90s? If you have kids who are gamers, you’ll know graphics have gotten far more realistic since then. Here’s each Lara Croft from the Tomb Raider series since 1996:

Source: Epic Games

This incredible jump is due to chips called graphics processing units (GPUs). GPUs can perform thousands of calculations at once, which is what makes these movie-like graphics possible. That’s different from traditional chips, which perform calculations one at a time.
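You can feel this difference even on an ordinary CPU: a vectorized library call batches many multiply-adds together, the same principle GPUs push to thousands of parallel cores. A rough illustration in Python with NumPy (an analogy only, not GPU code; timings will vary by machine):

```python
import time
import numpy as np

a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)

# "One at a time": a plain Python loop, like a chip working sequentially.
t0 = time.perf_counter()
total_loop = 0.0
for x, y in zip(a, b):
    total_loop += x * y
t_loop = time.perf_counter() - t0

# "All at once": a single vectorized call that batches the multiply-adds,
# the same idea GPUs scale out across thousands of cores.
t0 = time.perf_counter()
total_vec = float(np.dot(a, b))
t_vec = time.perf_counter() - t0

print(f"loop: {t_loop:.3f}s  vectorized: {t_vec:.5f}s")
```

Both computations produce the same dot product; only the batched version exploits parallel hardware, which is why the speed gap grows with the size of the data.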

Around 2006, Stanford researchers discovered that GPUs’ “parallel processing” abilities were perfect for AI training. For example, do you remember Google’s Brain project? The machine taught itself to recognize cats and people by watching YouTube videos. It was powered by one of Google’s giant data centers, running on 2,000 traditional computer chips. In fact, the project cost a hefty $5 billion.

Stanford researchers then built the same machine with GPUs instead. A dozen GPUs delivered the same data crunching performance of 2,000 traditional chips. And it slashed costs from $5 billion to $33,000! The huge leap in computing power and explosion of data means we finally have the “lifeblood” of AI.

America’s Most Important Company

Artificial intelligence is the ultimate buzzword in tech these days. Data from Bloomberg shows a record 840 US firms mentioned AI at least once in recent earnings reports. In short, it’s become a “mating call” for companies trying to attract investor dollars.

The reality is few of these companies are building intelligent systems. For example, venture capital firm MMC Ventures recently studied 2,830 AI start-ups. In 40% of cases, it found no evidence AI was an important part of their business.

You only need to ask one simple question to weed out the fakes: what percent of their sales come from AI? I’ve done the work, and I can tell you only a handful make any money from this budding disruption.

The one company with a booming AI business is NVIDIA (NVDA). NVIDIA invented graphics processing units back in the 1990s. It’s solely responsible for the realistic video game graphics we have today. And then we discovered these gaming chips were perfect for training neural networks.

NVIDIA stumbled into AI by accident, but it realized early on that this was a huge opportunity. Soon after, NVIDIA started building chips specifically optimized for machine learning. In the first half of 2020, its AI-related sales topped $2.8 billion. In fact, more than 90% of neural network training runs on NVIDIA GPUs today.

Its AI chips are light-years ahead of the competition. Its newest system, the A100, is described as an “AI supercomputer in a box.” With more than 54 billion transistors, it’s the most powerful chip system ever created.

In fact, just one A100 packs the same computing power as 300 data center servers. And it does it for one-tenth the cost, takes up one-sixtieth the space, and runs on one-twentieth the power consumption of a typical server room. A single A100 reduces a whole room of servers to one rack.

The Epicenter of Disruption

NVIDIA has a virtual monopoly on neural network training. And every breakthrough worth mentioning has been powered by its GPUs.

Computer vision is one of the world’s most important disruptions. And graphics chips are perfect for helping computers to “see.”

NVIDIA crafted its DRIVE chips specially for self-driving cars. These chips power several robocar startups including Zoox, which Amazon just snapped up for $1.2 billion. With NVIDIA’s backing, vision disruptor Trigo is transforming grocery stores into giant supercomputers.

Trigo fits out stores with a network of cameras and sensors, which feed its neural network with reams of data. In short, the network has learned to “see” what items customers throw in their baskets. So when you’re finished shopping, you simply walk out. Trigo then sends the store a tally, and the store bills you for that amount.

Trigo’s computer vision system is powered by NVIDIA chips and software. The UK’s largest grocer, Tesco, is trialing Trigo in several of its stores, and each system runs on 40–50 GPUs.
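Conceptually, the walk-out step reduces to aggregating whatever the vision system has “seen” into a single bill. Trigo’s actual pipeline isn’t public; the item names, prices, and `tally` helper below are invented for illustration:

```python
from collections import Counter

# Hypothetical output of the vision system: the items it "saw" a customer
# place in their basket, in detection order (duplicates = multiple units).
detections = ["milk", "bread", "milk", "apples"]

# Assumed store catalogue mapping items to prices.
prices = {"milk": 1.20, "bread": 0.95, "apples": 2.10}

def tally(detections, prices):
    """Sum detected items into the amount the store bills on walk-out."""
    basket = Counter(detections)
    return round(sum(prices[item] * n for item, n in basket.items()), 2)

print(tally(detections, prices))  # → 5.45
```

The hard part, of course, is not the arithmetic but producing the `detections` list reliably, which is exactly what the GPU-powered neural network does.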

But hands-down the biggest breakthroughs are happening in America’s most broken industry—healthcare.

Cancer is the #2 killer in America, responsible for 600,000 deaths last year. Catching the disease early has proven to be an effective way of beating it. But today, spotting tumors is a manual, time-consuming process.

Medical imaging disruptor Paige.AI built an AI system that could revolutionize cancer diagnosis. Paige.AI fed millions of real-life medical images into its neural network. Using 10 NVIDIA GPUs, it trained the system to detect early signs of tumors.

The neural network recently tested itself by scanning 12,000 medical images for potential tumors. It had never seen these images before, yet was able to “achieve near perfect accuracy.” After announcing these results, Paige.AI was granted “Breakthrough Designation” by the FDA, the first ever for an AI in cancer diagnosis.

NVIDIA is also opening the door to early detection of Alzheimer’s. Stanford researchers built an AI system that detects Alzheimer’s disease from scanning MRIs with 94% accuracy. Powered by six GPUs, it “learned” what biomarkers were most commonly associated with early signs of the disease.

The powerful GPU/AI combo is also saving victims of strokes. During a stroke, patients lose roughly 1.9 million brain cells every minute. So interpreting their CT scans even one second faster matters.

Medical imaging startup Deep01 has created a neural network, DeepCT, which evaluates strokes almost instantly. It has a 95% accuracy rate within 30 seconds per case, roughly 10x faster than traditional methods. The system was trained on 60,000 medical images, using NVIDIA chips. And get this… Deep01 is the first Asian firm to be granted FDA clearance for an AI product.

I could pepper you with dozens more examples, but you see my point. NVIDIA’s chips have powered almost every major AI breakthrough. It’s a buy today.

The Great Disruptors: 3 Breakthrough Stocks Set to Double Your Money
Get my latest report where I reveal my three favorite stocks that will hand you 100% gains as they disrupt whole industries. Get your free copy here.

By Stephen McBride

© 2020 Copyright Stephen McBride - All Rights Reserved Disclaimer: The above is a matter of opinion provided for general information purposes only and is not intended as investment advice. Information and analysis above are derived from sources and utilising methods believed to be reliable, but we cannot accept responsibility for any losses you may incur as a result of this analysis. Individuals should consult with their personal financial advisors.

© 2005-2019 - The Market Oracle is a FREE Daily Financial Markets Analysis & Forecasting online publication.
