Semiconductor Startup Funding Looks To Bounce Back After Lackluster 2023
June 20, 2024

Black Semiconductor was the latest chip startup to make headlines when it raised nearly $275 million — mainly from the German government — last week for its next-gen chip tech.

It was just the latest sign that chip startups can raise big money as one of tech's most foundational technologies once again grabs investors' attention around the world, mainly thanks to AI.

Global venture funding for semiconductor startups appears well on its way to bouncing back this year after a forgettable 2023. Thus far this year, VC-backed chip startups have raised nearly $5.3 billion across just 175 deals, per Crunchbase data.

Those numbers put this year well ahead of last year's pace, when such startups raised less than $8.8 billion across 490 deals. In 2022, chip startups locked up almost $10.9 billion across 447 deals.

More big rounds could be on the way, as it was reported last week that smartphone maker Samsung is leading a round of at least $300 million for Toronto-based AI chip startup Tenstorrent.

US chip boom

U.S. startups are playing a key role in the surge of funding. Domestic startups have raised almost the same amount of money, about $1.2 billion, across nearly the same number of deals (24 versus 22) as in all of last year, per Crunchbase.

It is important to note that figure is greatly boosted by PsiQuantum, which focuses on semiconductor process development and integrated photonic devices and systems. The company landed a $620 million financial package from the Australian Commonwealth and Queensland governments this spring to build a quantum computer in Brisbane, Australia. The round is actually a mix of equity, grants and loans.

Even without that round, U.S. startups would be ahead of last year’s pace. While many of the biggest rounds this year went to Chinese chipmakers like ChangXin Memory Technologies, Unisoc and AaltoSemi, some large financings also went to U.S.-based semiconductor startups, including:

“Semi used to be a four-letter word in the Valley, but now it’s sexy,” said Sriram Viswanathan, founding managing partner at San Francisco-based Celesta Capital. Along with its Recogni investment, the deep-tech firm’s portfolio includes Palo Alto, California-based SambaNova Systems.

AI effect

Of course, what is behind this renewed investor interest is the key driver of so many things in the tech world: AI.

Artificial intelligence is the driving reason chip giant Nvidia is now a $3 trillion-plus company. And while shares of Astera Labs — which provides data and memory connectivity solutions for some of the biggest chipmakers in the world, including Intel and Taiwan Semiconductor Manufacturing — are off their highs, they are still well above their IPO price from March. Astera’s IPO, in particular, was seen as a bellwether offering for both the semiconductor and AI industries.

Both those companies show there is significant public investor interest in the chip market — and that usually translates to VC interest in the private markets.

“While the full promise of AI commercialization has not been fully evidenced, the ‘FOMO’ of (the) AI race is pushing a lot of hot money into the value chain from AI applications to data infrastructure to semiconductors,” said Lorin Gu, founding partner at New York-based Recharge Capital, an investor in wireless-device chip manufacturer Airoha Technology.

“Given that at-scale AI application often requires retooling or new build of infrastructure, there is a strong cyclical demand for semis at the moment,” Gu added.

While the space has become more competitive to invest in, it also has become more creative in terms of financing, with more hybrid deals and investors analyzing the industry's risks and capex in more granular detail, Gu said.

Viswanathan added that the semi and hardware space as it relates to AI has been inundated with capital of late and is somewhat “over-inflated.”

Despite the influx of money and investors in the space, Viswanathan said there are opportunities at the silicon and hardware level. That includes startups looking to make AI inference — a model’s ability to use new data to make predictions and draw conclusions — more efficient.

However, it is important to remember that chipmaking can be an expensive proposition and that the industry is dominated by a few big players like Nvidia.

While those in the AI space may be looking for an alternative to Nvidia, it can be a difficult market for any startup to make headway in.

Nevertheless, it seems at least for now investors are willing to take that risk.

Illustration: Dom Guzman

Hailo Raises $60M In Series B Funding To Scale Out Its AI-On-The-Edge Chip Business
March 5, 2020

Watching a machine learning model at work can sometimes feel like magic, but in reality it's just math. A lot of math. Nothing terribly fancy, mind you: statistics, probability theory, multivariate calculus, linear algebra and algorithms. OK, it's a little fancy, but the point is that there's a lot of math to do. A reasonably accurate statistical model, with its weights and biases learned from training data, does not get calculated on the back of an envelope. It's calculated on computer chips, and as the volume of data and demand for insights from that data grow, so does the demand for ever-faster silicon capable of crunching those numbers.
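
As a rough illustration of the arithmetic in question, here is a minimal sketch in NumPy of a single dense layer's forward pass, with hypothetical toy dimensions; the weights-and-biases math that specialized chips are built to run at vastly larger scale looks much like this, repeated billions of times:

```python
import numpy as np

# Hypothetical toy dimensions: a batch of 32 inputs, 128 features each,
# passed through one dense layer with 64 output units.
rng = np.random.default_rng(0)
x = rng.standard_normal((32, 128))   # input batch
W = rng.standard_normal((128, 64))   # learned weights
b = rng.standard_normal(64)          # learned biases

# One forward pass: a matrix multiply, a bias add, and a ReLU nonlinearity.
# Training repeats variations of this (plus its gradients) over and over.
y = np.maximum(x @ W + b, 0.0)
print(y.shape)  # (32, 64)
```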

At a certain level of computational complexity, regular central processing units don’t cut it. They’ll do the math just fine, but they’ll take a long time to do it. Graphics cards were designed for massively parallel computing operations, like rendering and driving the multiplying number of pixels on our ever-denser visual displays; and it just so happens that that embarrassingly parallel architecture is well-suited to doing machine learning math quickly.
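
To see why that architecture fits, consider a minimal sketch (plain NumPy, toy sizes, not tied to any particular chip): every element of a matrix product is an independent dot product, so thousands of cores can each take one without waiting on the others.

```python
import numpy as np

A = np.arange(6.0).reshape(2, 3)
B = np.arange(12.0).reshape(3, 4)

# Spelling out the independent work units: each output element C[i, j]
# depends only on row i of A and column j of B, never on other outputs.
C = np.zeros((2, 4))
for i in range(2):
    for j in range(4):
        C[i, j] = A[i, :] @ B[:, j]

# The single vectorized call hands all of those dot products to optimized,
# parallel hardware at once, which is the pattern GPUs are built for.
assert np.allclose(C, A @ B)
```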

But there are plenty of applications where running a bunch of graphics processing units (GPUs) in a data center is not practical. Take an autonomous vehicle, for example: It doesn't make sense to pipe all the data streaming off the onboard cameras and LIDAR sensors up to a cloud service, wait for it to be processed, and then pipe the results back into the car's onboard computer. At 60 miles an hour, that kind of latency could be lethal.
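
A rough back-of-the-envelope sketch makes the point; the round-trip figures below are illustrative assumptions, not measurements from any particular system:

```python
# How far does a car travel while waiting on a cloud round trip?
# The latency figures are hypothetical, purely for illustration.
speed_mph = 60
speed_m_per_s = speed_mph * 1609.34 / 3600   # ~26.8 m/s

for round_trip_ms in (50, 100, 200):
    distance_m = speed_m_per_s * round_trip_ms / 1000
    print(f"{round_trip_ms} ms round trip -> ~{distance_m:.1f} m traveled")
# 50 ms -> ~1.3 m, 100 ms -> ~2.7 m, 200 ms -> ~5.4 m
```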

As the world becomes more data-driven and our tech uses inference to be more responsive, a new generation of computer chips is required to make all the math-magic happen. At a certain scale of computational complexity, or in situations where electrical consumption has to be kept to a minimum, GPUs don’t cut it either.

Headquartered in Tel Aviv, Israel, Hailo is one of several companies vying for its spot in the competitive market for specialized artificial intelligence chips built for computing on the edge: automotive applications, mobile devices, AI-augmented home devices and industrial use cases.

Today the company announced that it’s raised $60 million in Series B funding. The round was led by existing backers but saw participation from new strategic investors including the venture arm of robotics and automation company ABB, Japanese IT conglomerate NEC Corporation and London-based VC Latitude Ventures.

The company says that its new funding will “bolster the ongoing global rollout of its breakthrough Hailo-8 Deep Learning chip and to reach new markets and industries worldwide.”

Hailo says its chip is capable of up to 26 trillion operations per second while drawing less than 5 watts at full utilization. Hailo-8 supports popular machine learning frameworks like TensorFlow and Open Neural Network Exchange and meets several compliance standards for automotive applications.
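
Taking those figures at face value, the implied efficiency is straightforward to work out; the quick sketch below simply divides the company's stated numbers:

```python
# Implied efficiency from the company's stated figures.
tops = 26      # trillion operations per second (stated peak)
watts = 5.0    # stated upper bound on power draw at full utilization
print(f"~{tops / watts:.1f} TOPS per watt")  # ~5.2, a lower bound, since
                                             # actual draw is "less than 5 watts"
```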

According to the company, the chip’s “Structure-Defined Dataflow Architecture translates into higher performance, lower power, and minimal latency, enabling more privacy and better performance for smart devices operating at the edge, including partially autonomous vehicles, smart cameras, smartphones, drones and AR/VR platforms.”

“This immense vote of confidence from our new strategic and financial investors, along with existing ones, is a testimony to our breakthrough innovation and market potential,” said Orr Danon, CEO and co-founder of Hailo. “The new funding will help us expedite the deployment of new levels of edge computing capabilities in smart devices and intelligent industries around the world, including areas such as mobility, smart cities, industrial automation, smart retail and beyond.”

Since its inception in February 2017, the company has raised $88 million in total funding, inclusive of the round announced today. In January 2019, the company closed $8.5 million in Series A funding led by Chinese venture firm Glory Ventures. No additional details about the company's revenue or valuation were disclosed.

Illustration: Li-Anne Dias

Specialized AI Chipmaker Graphcore Extends Series D Round With $150M, Valued At $1.95B
February 26, 2020

Artificial intelligence and machine learning carry the promise of delivering optimization and personalization to all manner of systems. The challenge is that the math behind them is somewhat complicated, and that it has to be run, over and over, across vast quantities of data to suss out the statistical weights and biases of a particular system.

At sufficient scale, the computational complexity of machine learning model training overwhelms general-purpose CPUs. The work will get done; it might just take a long time. Data scientists and machine learning researchers have long used graphics processing units (GPUs) instead, because of their highly parallelized architecture and relatively abundant on-chip memory.

But as industry and research groups alike seek more efficiency and need to accommodate ever-larger quantities of information, more specialized computing hardware is required for the task.

Headquartered in Bristol, U.K., Graphcore is in the business of producing silicon purpose-built for munching through machine-learning math at high speed while using less electricity than GPUs. Benchmarks for Graphcore's Intelligence Processing Unit (IPU) indicate notably lower latency, higher computational throughput and lower power use than GPUs.
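
Claims like lower latency and higher throughput ultimately come down to wall-clock measurements. As a vendor-neutral sketch (not Graphcore's own benchmark code, and with a stand-in NumPy workload rather than a real model), such numbers are typically gathered along these lines:

```python
import time
import numpy as np

def benchmark(fn, runs=100):
    """Time a callable and report mean latency (ms) and throughput (runs/s)."""
    fn()                                   # warm-up run, excluded from timing
    start = time.perf_counter()
    for _ in range(runs):
        fn()
    elapsed = time.perf_counter() - start
    return elapsed / runs * 1000, runs / elapsed

# Stand-in workload: a matrix multiply roughly the shape of one model layer.
x = np.random.rand(256, 1024)
w = np.random.rand(1024, 1024)
latency_ms, throughput = benchmark(lambda: x @ w)
print(f"~{latency_ms:.2f} ms per run, ~{throughput:.0f} runs/s")
```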

The company announced that it raised an additional $150 million in fresh investor capital in an extension of its Series D round. The extension was led by Mayfair Equity Partners; new investors Baillie Gifford and M&G Investments joined in as well. The deal also saw participation from a number of prior investors. The first tranche of the company’s Series D was closed in December 2018, netting the company $200 million.

The Series D extension values Graphcore at $1.95 billion, according to the company. Taken together, the company has raised $460 million, according to Crunchbase data. Its shareholders include the likes of Draper Esprit, Dell Technologies Capital, C4 Ventures, various entities associated with Samsung, BMW i Ventures, Sequoia Capital and Microsoft, among others.

In a press release provided to Crunchbase News by the company, Graphcore highlighted a number of milestones from 2019. In partnership with strategic investor Dell Technologies, the companies co-developed and launched the DSS8440, a production-ready server built around Graphcore’s IPUs. Alongside Microsoft, another strategic investor, the company launched the Microsoft Azure IPU-Cloud service, as well as the IPU-Bare Metal Cloud service it launched in partnership with Cirrascale. The company’s publicly announced customers include Microsoft, Citadel Securities, Carmot Capital, and European search engine company Qwant.

The company says the new round brings its cash reserves up to $300 million. Graphcore has plotted an ambitious growth plan for itself. According to its press release, the company has devoted significant resources to research and development. It doubled headcount at its Bristol headquarters and at its engineering center in Oslo, Norway, and says its sales and support office in Palo Alto, California, saw similar scaling up in 2019. The company also opened a new sales, support and engineering office in Beijing, alongside an engineering center in Cambridge, U.K., and an operations facility in Taiwan.

“Demand for our Intelligence Processor Unit products is increasing at existing and new customers and the outlook for our business in Fiscal 2020 is extremely positive. The major investments that we have made during 2018 and 2019 will help us to meet this strong demand by extending the capabilities of our technology and ecosystem, and will support long-term revenue growth and returns for our investors,” said Graphcore CEO Nigel Toon in a statement.

The company declined to answer questions from Crunchbase News about its revenue and profitability, whether it has its own fabrication facilities, what the company’s future exit prospects might look like and whether it may be affected by Brexit or the emerging SARS-CoV-2 virus situation in Asia.

Illustration: Li-Anne Dias
