Lamini, a Palo Alto-based startup building a platform for enterprises to deploy generative AI, has secured $25 million in funding, with notable investment from Stanford computer science professor Andrew Ng.
Founded by Sharon Zhou and Greg Diamos, Lamini distinguishes itself by tailoring its platform specifically to corporate needs, aiming to address the shortcomings of existing generative AI solutions. Zhou and Diamos argue that many current platforms lack the infrastructure and scalability enterprises require, which is why Lamini has prioritized accuracy and scalability in its development.
Zhou, Lamini’s CEO, highlights the prevalent desire among CEOs, CIOs, and CTOs to harness generative AI for maximal return on investment within their organizations. However, despite widespread experimentation, only a small fraction of organizations have achieved widespread adoption of generative AI. This gap is attributed to various challenges, including inadequate IT infrastructure, governance issues, skills shortages, and high implementation costs. Security concerns also loom large, impacting the ability of companies to fully leverage generative AI technology.
Lamini aims to close this gap, helping enterprises deploy generative AI successfully across business functions. With the backing of investors like Andrew Ng, the company is positioned to make significant strides in the corporate AI landscape with solutions that prioritize both efficacy and scalability.
So what’s Lamini’s answer?
Zhou emphasizes that Lamini’s entire technological infrastructure has been meticulously optimized to handle enterprise-scale generative AI workloads. This optimization spans every aspect, from hardware to software, encompassing engines for model orchestration, fine-tuning, running, and training. One innovative approach introduced by Lamini is what Zhou refers to as “memory tuning,” a technique designed to train models on data in a way that enables them to accurately recall specific details.
The concept of memory tuning aims to mitigate instances of model hallucinations, where the AI generates false information in response to queries. According to Nina Wei, an AI designer at Lamini, this approach involves training models on proprietary data containing crucial facts and figures, enabling them to memorize and accurately recall specific information rather than relying on generalizations or fabrications.
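Lamini has not published the details of memory tuning, but the description above — training on proprietary data containing crucial facts so the model recalls them exactly rather than generalizing — implies a pipeline that starts by expanding each fact into many supervised training examples. The sketch below is a hypothetical illustration of that data-preparation step only, not Lamini's actual method; the fact records and question templates are invented for the example.

```python
# Hypothetical sketch of preparing fact-recall training data.
# Assumption: each proprietary fact is expanded into several
# (prompt, completion) pairs whose completion is the exact stored
# value, so supervised fine-tuning rewards verbatim recall over
# paraphrase or fabrication. Not Lamini's actual implementation.

FACTS = [
    {"entity": "Contract 4471", "field": "renewal date", "value": "2025-03-31"},
    {"entity": "SKU A-100", "field": "unit price", "value": "$42.50"},
]

# Varied phrasings of the same question, so recall is not tied to
# one surface form of the prompt.
TEMPLATES = [
    "What is the {field} of {entity}?",
    "Tell me the {field} for {entity}.",
    "{entity}: what {field} is on record?",
]

def build_training_pairs(facts, templates):
    """Produce (prompt, completion) pairs; every completion is the
    exact stored value for the fact being asked about."""
    pairs = []
    for fact in facts:
        for tmpl in templates:
            prompt = tmpl.format(entity=fact["entity"], field=fact["field"])
            pairs.append((prompt, fact["value"]))
    return pairs

pairs = build_training_pairs(FACTS, TEMPLATES)
print(len(pairs))  # 2 facts x 3 templates = 6 pairs
```

The resulting pairs would then feed an ordinary fine-tuning run; the point of the expansion is that the model sees each exact value many times, in many phrasings, which is one plausible way to bias it toward memorized recall.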
However, the validity of “memory tuning” as a distinct technique remains open to scrutiny. While Lamini asserts its effectiveness, there is no academic research or published paper on this specific method, and the onus is on Lamini to provide empirical evidence that memory tuning outperforms other techniques for reducing hallucinations.
Nevertheless, Lamini offers additional features beyond memory tuning that set it apart. Zhou highlights the platform’s ability to operate in highly secure environments, including air-gapped setups. Lamini enables companies to run, fine-tune, and train models across a variety of configurations, from on-premises data centers to public and private clouds. Furthermore, the platform boasts elastic scalability, capable of expanding workloads to over 1,000 GPUs as needed.
Zhou emphasizes the importance of aligning incentives in the market, particularly in contrast to closed-source models. Lamini aims to empower more stakeholders by returning control over proprietary data to enterprises, recognizing their stake in maintaining ownership and confidentiality. This ethos aligns with Lamini’s broader mission to democratize access to AI technologies and data control, particularly for enterprises with significant proprietary interests.
Lamini’s co-founders boast impressive credentials in the field of AI, which likely contributed to the company’s success in fundraising. Sharon Zhou, formerly a faculty member at Stanford University, led a research group focusing on generative AI. Prior to her doctoral studies in computer science under Andrew Ng, she served as a machine learning product manager at Google Cloud.
Greg Diamos, on the other hand, co-founded MLCommons, an engineering consortium aimed at establishing standard benchmarks for AI models and hardware, as well as the MLPerf benchmarking suite. His experience includes leading AI research at Baidu and working closely with Ng during Ng’s tenure as chief scientist at the company. Diamos also held a role as a software architect on Nvidia’s CUDA team.
The co-founders’ extensive industry connections likely played a significant role in Lamini’s fundraising efforts. Notably, investors include prominent figures such as Andrew Ng, Dylan Field (CEO of Figma), Drew Houston (CEO of Dropbox), Andrej Karpathy (co-founder of OpenAI), and Bernard Arnault (CEO of LVMH). Additionally, AMD Ventures, despite Diamos’ background at Nvidia, joined as an investor early on, supplying Lamini with data center hardware. Today, Lamini utilizes AMD Instinct GPUs for many of its models, diverging from the industry norm. The support from such high-profile investors underscores the confidence in Lamini’s technology and leadership team.
Lamini’s claim that its models train and run at performance comparable to equivalent Nvidia GPUs is certainly ambitious, but without independent verification it remains to be seen how it holds up. Third-party testing would provide more clarity on this front.
Having raised $25 million across seed and Series A rounds, with Amplify leading the Series A, Lamini is poised for growth. The company plans to triple its 10-person team, expand its compute infrastructure, and focus on deeper technical optimizations with the funding.
In the competitive landscape of enterprise-oriented generative AI vendors, Lamini faces formidable competitors such as Google, AWS, and Microsoft, all of which have been aggressively targeting the enterprise market with features like streamlined fine-tuning and the ability to fine-tune on private data.
Regarding Lamini’s customers and revenue, CEO Sharon Zhou was tight-lipped, citing the company’s early stage. However, she mentioned that Lamini counts AMD, AngelList, and NordicTrack among its early paying users, along with undisclosed government agencies. Despite being inundated with inbound demand, Lamini is focused on serving its customers and ensuring growth.
Amplify general partner Mike Dauber expressed confidence in Lamini’s potential, highlighting the company’s dedication to addressing enterprise needs and unlocking the value of private data while meeting compliance and security requirements. This endorsement underscores Lamini’s position as a serious contender in the generative AI space for enterprises.