
Several Google engineers have left one of its most secretive AI projects to form a stealth start-up

Key Points
  • Groq founders previously helped create Google's Tensor Processing Unit.
  • Venture capitalist Chamath Palihapitiya led a $10 million investment in the start-up.
  • The start-up remains in stealth mode.

Google has slowly been pulling back the curtain on homegrown silicon that could define the future of machine learning and artificial intelligence.

Some key creators of that project — the Tensor Processing Unit, or TPU — recently left to team up with Chamath Palihapitiya, one of Silicon Valley's most prominent and outspoken young venture investors, on a stealth start-up.

Groq Inc. is the name of the company, at least for the time being.

There are no promotional materials or website. All that exists online are a couple of SEC filings from October and December showing that the company raised $10.3 million, and an incorporation filing in the state of Delaware on Sept. 12.

"We're really excited about Groq," Palihapitiya wrote in an e-mail. "It's too early to talk specifics, but we think what they're building could become a fundamental building block for the next generation of computing."

Groq names three principals in the SEC documents: Jonathan Ross, who helped invent the TPU; Douglas Wightman, an entrepreneur and former engineer at the Google X "moonshot factory"; and Palihapitiya, founder of investment firm Social Capital. The listed address is Social Capital's headquarters.

Palihapitiya told CNBC last month that he invested in a team of ex-Googlers who helped build the chip, which he first heard about on an earnings call 2½ years ago.

"They randomly mentioned that they built their own chip for AI and I thought, 'what is going on here, why is Google competing with Intel?'" Palihapitiya said in an interview on "Squawk Box."

VIDEO: Chamath Palihapitiya: I spent 1½ years trying to find the people who made Google's AI chip

The company (which we now know is Groq) has eight of the first 10 people from the TPU team "building a next-generation chip," he said.

All start-ups are hard, but a new chip company is something most venture capitalists won't touch. The research and development costs required to get a working prototype can be exorbitant. Then there's manufacturing and the Herculean challenge of finding device makers to take a chance on unproven technology.

Also, the incumbents — Intel, Qualcomm and Nvidia — are massive, and Google, Apple and Amazon are developing their own silicon.

As crazy as it may be, Palihapitiya is taking the plunge. This next wave of chip innovation "can empower companies like Facebook and Amazon, Tesla, the government to do things with machine learning and computers that nobody could do before," he said in last month's interview.

Ross' LinkedIn page says he left Google in September and is currently "gainfully unemployed." Wightman's profile shows he left the same month but doesn't say where he went. Wightman confirmed the funding in an e-mail and said, "We're still heads down right now."

Launching the TPU

Google made its first public pronouncement about the TPU in May with a blog post just ahead of the company's annual developers conference. Norm Jouppi, one of the project heads, said Google had been using the technology internally for over a year for things like improving the relevance of search results and the accuracy of its maps. The technology also ties into Google's Cloud Platform, which lets other companies run machine learning workloads built with Google's open-source TensorFlow framework in its data centers.

The essence of TPUs is the ability to squeeze heavy, highly sophisticated computation into less silicon. Machine learning, in which computers get better as data sets grow rather than through explicit human programming, is weaving its way into all types of consumer and business apps. At Google's scale, that work is too taxing for today's general-purpose processors.
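For a rough sense of what such a workload looks like, here is a minimal, hypothetical sketch using the open-source TensorFlow framework mentioned above; the tiny model and random data are invented purely for illustration and have nothing to do with Google's internal systems. The dense matrix arithmetic inside a training loop like this is the kind of computation that specialized chips such as the TPU are built to accelerate.

# A minimal, illustrative machine learning workload in TensorFlow's Keras API.
# The data and model are fabricated for demonstration only.
import numpy as np
import tensorflow as tf

# Toy data set: 1,000 examples with 20 features and binary labels.
x = np.random.rand(1000, 20).astype("float32")
y = np.random.randint(0, 2, size=(1000,)).astype("float32")

# A small neural network with two dense layers.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Training is dominated by large matrix multiplications; accelerators like
# the TPU exist to run exactly this kind of arithmetic efficiently.
model.fit(x, y, epochs=5, batch_size=32, verbose=0)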

Palihapitiya said at a conference in January that the main reason he's bullish on Google as an investment is because "they're an order of magnitude ahead" of everyone else.

He's not the only investor paying attention. Denny Fish, a portfolio manager who focuses on technology at Janus, said Google is doing everything it can to ensure maximum performance and efficiency to handle the heft of machine learning workloads.

Google's Tensor Processing Unit, or TPU.
Source: Google

"They've said, if we're going to do this and win, we need the most efficient way using the least amount of power," he said. "They feel like they've cracked the code."

Earlier this month, Google provided its first real update on TPUs in a 17-page report titled "In-Datacenter Performance Analysis of a Tensor Processing Unit."

In a summary blog post, Jouppi wrote that AI workloads running on TPUs were 15 to 30 times faster than contemporary processors, with performance per watt 30 to 80 times better. The study compared TPUs with chips from Intel and Nvidia.

Ross was one of 75 authors of the report. He's also listed in the paper as an inventor on four patents, all tied to neural network processing and computation. Neural networks can be thought of as computing systems designed to loosely mimic the brain.

Ross' LinkedIn page says he started the TPU with one other engineer as part of a 20 percent project, the perk that lets Google employees spend one-fifth of their time working on a side project they think will benefit the company.

From 2013 to 2015, he worked as a Google hardware engineer in Madison, Wisconsin, one of the main hubs of TPU development.

Wightman co-founded four companies before his Google days. At Google X, he worked on futuristic projects like Loon, a network of high-altitude balloons meant to extend internet access to more of the world.