Together AI raises $102.5M to grow its cloud platform for training open-source generative AI models

Now that the OpenAI drama is in the rearview mirror, investors are returning with billions of dollars to back promising generative AI startups. The latest to receive this funding windfall is Together, an AI startup offering a cloud platform for building and running open-source generative AI models.

Today, Together announced it has closed a $102.5 million Series A round led by Kleiner Perkins, with participation from investors including NVIDIA and Emergence Capital. As part of the deal, Kleiner Perkins partner Bucky Moore will join Together AI’s board.

The round, five times larger than the company’s previous one, will be used to accelerate Together’s mission of building the fastest cloud platform for generative AI applications. In a blog post, Together co-founder and CEO Vipul Ved Prakash shared the company’s vision:

“Startups and enterprises alike are looking to build a generative AI strategy for their business that is free from lock-in to a single vendor. Open-source AI provides a strong foundation for these applications with increasingly powerful generative models being released almost weekly … We believe generative AI is a platform technology, a new operating system for applications, and will have a long-range impact on human society. The AI ecosystem will consist of proprietary models and open models, and it’s incredibly important that this future has choice and options.”

Founded in June 2022 by Prakash along with Ce Zhang, Chris Ré, and Percy Liang, Together has assembled a team of top-notch scientists and engineers. Prakash, who previously founded the social media search platform Topsy, acquired by Apple in 2013, emphasized the importance of generative AI as a platform technology with far-reaching implications for a wide range of applications.

The company’s chief scientist, Tri Dao, and his collaborators released FlashAttention-2 earlier this year, a technique used by industry players such as OpenAI, Anthropic, Meta, and Mistral to develop leading LLMs. Together’s work on inference, which draws on techniques such as Medusa and Flash-Decoding, has produced what the company describes as the fastest inference stack for transformer models. Available through the Together Inference API, the stack provides access to more than 100 open models.
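For readers unfamiliar with this kind of model-serving API, the sketch below shows roughly what calling a hosted open model over HTTP looks like. It is a minimal illustration only: the endpoint path, environment variable, model identifier, and response shape are assumptions made for the example, not documentation of the Together Inference API.

```python
# Minimal sketch of querying a hosted open model through an HTTP inference API.
# Endpoint, env var, model name, and response fields are illustrative assumptions.
import os
import requests

API_URL = "https://api.together.xyz/v1/chat/completions"  # assumed OpenAI-style route
API_KEY = os.environ["TOGETHER_API_KEY"]                   # assumed API-key env var

payload = {
    "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",  # illustrative open model ID
    "messages": [
        {"role": "user", "content": "Summarize FlashAttention in one sentence."}
    ],
    "max_tokens": 128,
}

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
# Assumes an OpenAI-compatible response body with a "choices" list.
print(resp.json()["choices"][0]["message"]["content"])
```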

In addition, Together said its research lab is at the forefront of sub-quadratic models, which promise a more compute-efficient approach to longer-context AI. The company currently operates a cloud spanning data centers in Europe and the U.S. that delivers approximately 20 exaflops of compute in total. Notable customers include Pika Labs, NexusFlow, Voyage AI, and Cartesia, some of which use Together’s model-serving APIs.

Prakash also described the company’s approach, adding, “By creating custom infrastructure, we can offer significantly better economics on pre-training and inference workloads. The Together AI platform allows developers to quickly and easily integrate leading open-source models or create their own models through pre-training or fine-tuning. Our customers choose to bring their generative AI workloads to Together AI owing to our industry-leading performance and reliability, while still having comfort that they own the result of their investment in AI and are always free to run their model on any platform.”

Commenting on the funding, Bucky Moore, partner at Kleiner Perkins, said: “AI is a new layer of infrastructure that is changing how we develop software. To maximize its impact, it’s important that we make it available to developers everywhere. We expect the vast and growing landscape of open-source models to see widespread adoption as their performance approaches that of closed alternatives. Together AI enables any organization to build fast, reliable applications on top of them.”
