Magic, an AI startup building models to generate code and automate a range of software development tasks, has raised a large tranche of cash from investors, including ex-Google CEO Eric Schmidt.
In a blog post on Thursday, Magic said that it closed a $320 million fundraising round with contributions from Schmidt, as well as Alphabet's CapitalG, Atlassian, Elad Gil, Jane Street, Nat Friedman and Daniel Gross, Sequoia and others. The funding brings the company's total raised to nearly half a billion dollars ($465 million), catapulting it into a cohort of better-funded AI coding startups that includes Codeium, Cognition, Poolside, Anysphere and Augment. (Apparently, Schmidt is backing Augment, too.)
In July, Reuters reported that Magic was seeking to raise over $200 million at a $1.5 billion valuation. Evidently, the round came in above expectations, though the startup's current valuation couldn't be ascertained; Magic was valued at $500 million in February.
Magic also announced on Thursday a partnership with Google Cloud to build two "supercomputers" on Google Cloud Platform. The Magic-G4 will be made up of Nvidia H100 GPUs, and the Magic-G5 will use Nvidia's next-gen Blackwell chips, which are scheduled to come online next year. (GPUs, thanks to their ability to run many computations in parallel, are commonly used to train and serve generative AI models.)
Magic says it aims to scale the latter cluster to "tens of thousands" of GPUs over time, and that together, the clusters will be able to achieve 160 exaflops, where one exaflop is equal to one quintillion computer operations per second.
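For a sense of scale, here is a quick back-of-envelope conversion of that figure (our arithmetic, not a number published by Magic or Google):

```python
# 1 exaflop = 10**18 (one quintillion) operations per second.
EXAFLOP_PER_SECOND = 10**18

combined_target = 160 * EXAFLOP_PER_SECOND   # the 160-exaflop goal cited above
print(f"{combined_target:.1e} operations per second")  # prints 1.6e+20
```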
"We are excited to partner with Google and Nvidia to build our next-gen AI supercomputer on Google Cloud," Magic co-founder and CEO Eric Steinberger said in a statement. "Nvidia's [Blackwell] system will greatly improve inference and training efficiency for our models, and Google Cloud offers us the fastest timeline to scale, and a rich ecosystem of cloud services."
Steinberger and Sebastian De Ro co-founded Magic in 2022. In an earlier interview, Steinberger told TechCrunch that he was inspired by the potential of AI at a young age; in high school, he and his friends wired up the school's computers to train machine-learning algorithms.
That experience planted the seeds for Steinberger's computer science bachelor's program at Cambridge (he dropped out after a year) and, later, his job at Meta as an AI researcher. De Ro hailed from German business process management firm FireStart, where he worked his way up to the role of CTO. Steinberger and De Ro met through the environmental volunteer organization Steinberger co-created, ClimateScience.org.
Magic develops AI-driven tools (not yet on the market) designed to help software engineers write, review, debug and plan code changes. The tools function like an automated pair programmer, attempting to understand and continuously learn more about the context of various coding projects.
A number of platforms do the same, including the elephant in the room, GitHub Copilot. But one of Magic's innovations lies in its models' ultra-long context windows. It calls the models' architecture "Long-term Memory Network," or "LTM" for short.
A model's context, or context window, refers to input data (e.g. code) that the model considers before generating output (e.g. more code). A simple question, such as "Who won the 2020 U.S. presidential election?", can serve as context, as can a movie script, a show or an audio clip.
As context windows grow, so does the size of the documents, or codebases as the case may be, that can fit into them. Long context can prevent models from "forgetting" the content of recent docs and data, and from veering off topic and extrapolating wrongly.
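To make the concept concrete, here is a minimal sketch of how an entire codebase might be packed into a model's context before asking it a question; the helper function and the commented-out model call below are purely illustrative, not Magic's actual API:

```python
from pathlib import Path

def build_context(repo_dir: str, question: str) -> str:
    """Concatenate every Python file in a repo, then append a question.

    Everything in the returned string counts against the model's context window.
    """
    files = sorted(Path(repo_dir).rglob("*.py"))
    code = "\n\n".join(f"# file: {f}\n{f.read_text()}" for f in files)
    return f"{code}\n\n# Question:\n{question}"

prompt = build_context("./my_project", "Where is the password check implemented?")
# Hypothetical call: the model reads the entire prompt (its context) before
# generating a reply.
# answer = long_context_model.generate(prompt)
```

The longer the window, the more of a project fits into that prompt without trimming, which is the bet Magic is making.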
Magic claims its latest model, LTM-2-mini, has a 100 million-token context window. (Tokens are subdivided bits of raw data, like the syllables "fan," "tas" and "tic" in the word "fantastic.") One hundred million tokens is equivalent to around 10 million lines of code, or 750 novels. And it's by far the largest context window of any commercial model; the next-largest belong to Google's Gemini flagship models, at 2 million tokens.
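Those equivalences imply a couple of rough ratios worth spelling out; this is simple arithmetic from the figures above, not data Magic has published:

```python
CONTEXT_TOKENS = 100_000_000           # LTM-2-mini's claimed window

lines_of_code = 10_000_000             # equivalence cited above
print(CONTEXT_TOKENS / lines_of_code)  # ~10 tokens per line of code

novels = 750                           # equivalence cited above
print(CONTEXT_TOKENS // novels)        # ~133,333 tokens per novel
```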
Magic says that thanks to its long context, LTM-2-mini was able to implement a password strength meter for an open source project and create a calculator using a custom UI framework virtually autonomously.
The company is now in the process of training a larger version of that model.
Magic has a small team (around two dozen people) and no revenue to speak of. But it's going after a market that could be worth $27.17 billion by 2032, according to an estimate by Polaris Research, and investors perceive that to be a worthwhile (and possibly quite lucrative) endeavor.
Despite the security, copyright and reliability concerns around AI-powered assistive coding tools, developers have shown enthusiasm for them, with the vast majority of respondents in GitHub's latest poll saying they've adopted AI tools in some form. Microsoft reported in April that Copilot had over 1.3 million paying users and more than 50,000 enterprise customers.
And Magic's ambitions are grander than automating routine software development tasks. On its website, the company speaks of a path to AGI: AI that can solve problems more reliably than humans can alone.
In pursuit of such AI, San Francisco-based Magic recently hired Ben Chess, a former lead on OpenAI's supercomputing team, and plans to grow its cybersecurity, engineering, research and systems engineering teams.