
    $60B AI chip darling Cerebras nearly died early on, burning $8M a month

By Naveed Ahmad · 17/05/2026 · 5 Mins Read


Today, Cerebras Systems is a public company that sells AI chips for inference to giants like OpenAI and AWS. It held a blockbuster IPO on Thursday, making each of its co-founders a billionaire, and ended the week worth about $60 billion.

But in 2019, when it was three years old, it came dangerously close to failure, incinerating a staggering amount of cash. It was trying to solve a technical problem no one in the semiconductor industry thought could be done.

“We were spending about $8 million a month,” founder and CEO Andrew Feldman told TechCrunch of that period. “At this point, we had incinerated almost $200 million trying to solve one technical problem.”
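The two figures in Feldman's quote imply roughly two years of sustained burn, a quick sanity check:

```python
# Back-of-the-envelope check of the burn figures in the quote above.
# Both inputs come from the article; the duration is derived, not reported.
monthly_burn = 8_000_000     # USD per month, per Feldman
total_burned = 200_000_000   # USD total, per Feldman

months = total_burned / monthly_burn
print(f"{months:.0f} months (~{months / 12:.1f} years)")
```

That works out to about 25 months of spending at that rate.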

Every few weeks, Feldman was forced to make the painful walk of shame to the board meeting to report another failure and more money burned.

But he had no choice. Without a solution, Cerebras was dead anyway.

Cerebras was founded on an idea that was simple on paper. The microprocessor industry had spent its entire 50-plus years making CPUs faster and cheaper by cramming more transistors onto a silicon wafer and dicing wafers into ever-tinier pieces. But AI required so much compute power that many chips had to be strung together and then forced to talk to one another. Cerebras' founders believed that turning a whole, even bigger wafer into one giant, powerful chip would work faster.

The problem was, no one had ever successfully done this before, for any reason, AI or not. Orchestrating that many microscopic electronic components on a larger, but still thin, surface introduced compounding engineering problems.

Once Cerebras crossed the first threshold of designing the mega chip and then manufacturing it with TSMC, the team hit the real roadblock.

They couldn’t solve “packaging.” This covers everything after manufacturing the silicon itself: adhering it to a motherboard, getting power to it, dealing with heating and cooling, as well as the pipes that deliver and return data, Feldman said.

Cerebras’ chips “were 58 times larger. We were using 40 times as much power as anybody had ever used,” he said. There were no premade heat sinks. No vendors. No manufacturing partners. The brightest minds in microprocessor engineering had tried for decades to build such massive, ever-denser chips, and failed.

The Cerebras team was left with trial and error, through which “we destroyed an enormous number of chips” and an enormous amount of money. But without functional packaging, the chip was useless.

After exhaustive analysis of each failure, the team finally solved enough problems: how to cool the chip and how to move data around. In one instance, they had to invent their own machine that could drive 40 screws simultaneously to secure the wafer to a board without cracking it.

Feldman still remembers the day in July 2019 when all of it, miraculously, worked.

They installed the packaged chip into a computer, turned it on, and the entire founding team (pictured below) “just stood in the lab and stared at it,” he said. “Watching a computer run is about as exciting as watching paint dry. But there we were, watching lights flashing on the computer, stunned that we’d solved this.”

“That was one of the greatest moments of my life,” he said. That is significant, because this same founding team had previously built and sold a pioneering cloud server startup, SeaMicro, to AMD for $334 million in 2012.

Cerebras Systems’ founding team in 2015: Andrew Feldman, Gary Lauterbach, Michael James, Sean Lie, and Jean-Philippe Fricker. Image Credits: Cerebras Systems

The day the chip finally worked was also about two years after OpenAI had talked to Cerebras about acquiring it, which Feldman confirmed to TechCrunch happened just as the publicly revealed emails said it did.

Those talks fell through amid growing squabbling among the OpenAI founders, several of whom are angel investors in Cerebras.

Today OpenAI is a customer and a partner, having loaned Cerebras $1 billion secured by warrants. Those warrants conditionally grant OpenAI about 33 million shares of Cerebras stock, the S-1 discloses. (33 million shares are worth over $9 billion at Friday’s closing price of $279.)
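The parenthetical valuation follows directly from the two disclosed numbers:

```python
# Check of the warrant valuation stated above.
# Inputs are the figures from the article: ~33 million shares,
# Friday's closing price of $279 per share.
shares = 33_000_000
close_price = 279  # USD per share

warrant_value = shares * close_price
print(f"${warrant_value / 1e9:.2f}B")
```

That gives roughly $9.2 billion, consistent with the “over $9 billion” figure.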

Interestingly, Cerebras also agreed not to sell its wares to specific OpenAI rivals as part of that loan deal. Feldman wouldn’t confirm the obvious company this involves: Anthropic. He did, however, say that the restriction is temporary.

“It is limited in time, and it was designed to make sure that we could get OpenAI the capacity,” he said.

The truth is, Cerebras hasn’t yet grown big enough to handle multiple fast-growing model makers anyway. Feldman likened selling AI compute capacity to an all-you-can-eat buffet. Instead of trying to stuff itself on all potential customers, “We’ll work with a part of the buffet only, and we’ll get comfortable with that, before we attack the rest,” he said.

When you buy through links in our articles, we may earn a small commission. This doesn’t affect our editorial independence.




    Naveed Ahmad

    Naveed Ahmad is a technology journalist and AI writer at ArticlesStock, covering artificial intelligence, machine learning, and emerging tech policy. Read his latest articles.
