Fractile funding news – UK-based Fractile Secures $15 Million in Seed Funding
Jul 26, 2024 | By Team SR
Fractile, an AI chip firm, has raised $15 million in seed funding. Currently available AI processors are the main obstacle to improved AI performance: every major AI company uses essentially the same processors, which work well for training LLMs but poorly for inference, the process of running real data through a trained model.
SUMMARY
- Fractile, an AI chip firm, has raised $15 million in seed funding. Currently available AI processors are the main obstacle to improved AI performance.
- Fractile is building chips to run large language models two orders of magnitude faster.
This constraint limits what AI models can achieve in future, makes them prohibitively expensive to run, and holds back their performance. It also makes it difficult for AI model developers to differentiate their offerings meaningfully.
For a company looking to improve its hardware for AI inference, there are two options. The first is specialisation: focusing on very specific workloads and designing chips purpose-built to meet those demands.
Companies taking this approach face the challenge of aiming at a moving target whose precise direction is unknown: model architectures in artificial intelligence change quickly, while designing, verifying, manufacturing, and testing semiconductors takes time.
Rethinking computational operations
The other option is to radically rethink how computational operations are carried out, develop completely new processors using these new building blocks, and then construct enormously scalable systems on top of them. This is the strategy used by Fractile, which will enable ground-breaking performance in a variety of current and future AI models.
A Fractile system will use innovative circuits to carry out 99.99% of the operations required to run AI model inference, resulting in astounding performance: early targets are 100x faster and 10x cheaper.
Reduced power use
Fractile offers not only significant cost and speed benefits but also significantly lower power consumption. Power is the largest fundamental barrier to scaling up AI compute performance, which is why efficiency is commonly expressed in Tera Operations Per Second per Watt (TOPS/W) (see notes below for additional details).
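For illustration, here is a minimal sketch of how the TOPS/W figure of merit works, using hypothetical accelerator numbers rather than any Fractile specification: at a fixed efficiency, pushing for more throughput draws proportionally more power, which is why power ends up capping overall performance.

```python
# Illustrative sketch of the TOPS/W efficiency metric.
# All figures are hypothetical assumptions, not Fractile (or any vendor) specifications.

def tops_per_watt(tera_ops_per_second: float, watts: float) -> float:
    """Energy efficiency: tera-operations per second delivered per watt consumed."""
    return tera_ops_per_second / watts

# A chip sustaining 400 TOPS of inference throughput while drawing 200 W:
print(tops_per_watt(400, 200))   # 2.0 TOPS/W

# At fixed efficiency, 10x more throughput needs 10x more power:
required_watts = 4000 / 2.0
print(required_watts)            # 2000.0 W: power, not silicon, becomes the ceiling
```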
Lead investors in the round were Kindred Capital, the NATO Innovation Fund, and Oxford Science Enterprises; angel investors included Hermann Hauser (co-founder, Acorn and Amadeus Capital), Stan Boland (ex-Icera, NVIDIA, Element 14 and Five AI), and Amar Shah (co-founder, Wayve). Fractile has raised a total of $17.5 million in investment so far.
Dr Walter Goodwin, CEO and Founder of Fractile: “In today’s AI race, the limitations of existing hardware — nearly all of which is provided by a single company — represent the biggest barrier to better performance, reduced cost, and wider adoption. Fractile’s approach supercharges inference, delivering astonishing improvements in terms of speed and cost. This is more than just a speed-up – changing the performance point for inference allows us to explore completely new ways to use today’s leading AI models to solve the world’s most complex problems.”
John Cassidy, Partner at Kindred Capital, said, “AI is evolving so rapidly that building hardware for it is akin to shooting at a moving target in the dark. Because Fractile’s team has a deep background in AI, the company has the depth of knowledge to understand how AI models are likely to evolve, and how to build hardware for the requirements of not just the next two years, but 5-10 years into the future. We’re excited to partner with Walter and the team on this journey.”
Stan Boland, angel investor said, “There’s no question that, in Fractile, Walter is building one of the world’s future superstar companies. He’s a brilliant AI practitioner but he’s also listening intently to the market so he can be certain of building truly compelling products that other experts will want to use at scale. To achieve this, he’s already starting to build one of the world’s best teams of semiconductor, software and tools experts with track records of flawless execution. I’ve no doubt Fractile will become the most trusted partner to major AI model providers in short order.”
With senior hires from NVIDIA, ARM, and Imagination, Fractile has already assembled a top-tier team. It has also filed patents to safeguard important circuits and its distinct method of in-memory computing.
The firm is already in discussions with potential partners and anticipates signing agreements before its first commercial AI accelerator hardware is produced. With the funding, Fractile will expand its workforce and move more quickly towards producing its first product.
About Fractile
Fractile is building chips to run large language models two orders of magnitude faster. Existing hardware is good for training LLMs, but very poorly suited to subsequent inference of the trained model, which is increasingly the dominant workload. A network’s weights need to be moved onto a chip once per word generated, and this movement takes a few hundred times longer than the subsequent computations themselves.
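To make the scale of that data-movement penalty concrete, the back-of-envelope sketch below uses assumed figures (a hypothetical 70-billion-parameter model in FP16, 3 TB/s of memory bandwidth, and 1 PFLOP/s of usable compute; none of these are Fractile numbers). Under those assumptions, streaming the weights for each generated token takes a few hundred times longer than the arithmetic itself, in line with the ratio described above.

```python
# Back-of-envelope estimate of why LLM inference is memory-bound.
# Every figure below is an assumption for illustration, not a vendor specification.

params = 70e9                   # assumed model size: 70B parameters
bytes_per_param = 2             # FP16 weights
weight_bytes = params * bytes_per_param          # ~140 GB streamed per generated token

memory_bandwidth = 3e12         # assumed memory bandwidth: 3 TB/s
compute_rate = 1e15             # assumed usable compute: 1 PFLOP/s
flops_per_token = 2 * params    # ~2 FLOPs per parameter per generated token

time_moving_weights = weight_bytes / memory_bandwidth   # ~0.047 s per token
time_computing = flops_per_token / compute_rate         # ~0.00014 s per token

print(round(time_moving_weights / time_computing))      # ~333: data movement dominates
```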