
Inephany secures $2.2M for more efficient AI model training
London-based AI startup Inephany has raised $2.2M in a pre-seed funding round to develop a platform aimed at optimising the training of neural networks, including large language models (LLMs).
The round was led by Amadeus Capital Partners, with participation from Sure Valley Ventures and Professor Steve Young, who joins as both an angel investor and chair.
Founded in July 2024 by Dr John Torr, Hami Bahraynian, and Maurice von Sturm, Inephany seeks to address the escalating costs and inefficiencies associated with training advanced AI models. The company’s AI-driven optimisation system offers a novel approach to controlling the training process in real time, enhancing sample efficiency, accelerating training, and reducing overall development time while cutting compute costs.
The rapid advancement of generative AI has led to a surge in demand for more efficient training methods. Training a model like GPT-4 is estimated to have cost between $60M and $100M, with next-generation models approaching $1B in expenses. Inephany’s platform aims to make AI development more accessible and sustainable by offering a solution that is at least ten times more cost-effective than traditional methods.
Amelia Armour, Partner at Amadeus Capital Partners, commented: “We very much look forward to backing John, Hami, and Maurice as they tackle key efficiency challenges in current AI training. Their innovative approach to automating and optimising neural network training has the potential to reduce costs by an order of magnitude and accelerate advancements across AI applications. If rolled out at scale, the impact of this on what models can deliver will be very substantial.”
John Torr, CEO at Inephany, said: “We are thrilled to be backed by such experienced investors, and having a seasoned entrepreneur and AI pioneer like Professor Steve Young as our chair is a true privilege. Current approaches to training LLMs and other neural networks are extremely wasteful across multiple dimensions.
“Our unique solution tackles this inefficiency head-on, with the potential to radically reduce both the cost and time required to train and optimise state-of-the-art models. As we prepare to deliver our first products later this year, we are incredibly excited to embark on the next chapter of our journey—and to help shape the ongoing AI revolution by transforming AI optimisation.”
Professor Steve Young said: “As the use of AI spreads ever wider, moving beyond the traditional applications of speech, language and vision into new and diverse areas such as weather prediction, healthcare, drug discovery and materials design, the need for very efficient training of accurate neural models is becoming critical. The groundbreaking new approach being developed by Inephany marks a step change in neural model training technology and I am delighted to join the team as chair and investor.”
Inephany plans to utilise the new funding to expand its engineering team, advance its optimisation platform, and onboard its first enterprise customers.
https://tech.eu/2025/04/16/inephany-secures-2-2m-for-more-efficient-ai-model-training/