Together AI is a research-driven artificial intelligence company. Its mission is to create the fastest cloud platform for building and running generative AI and to advance the frontier of AI through open-source contributions of research, models, and datasets. Together AI aims to democratize access to advanced AI tools and solutions, making them accessible to businesses and developers of all sizes. The company believes that open and transparent AI systems will drive innovation and create the best outcomes for society.
Together AI’s goal is to empower developers and researchers to train, fine-tune, and deploy generative AI models efficiently. The company provides a platform for managing the entire generative AI lifecycle with high performance, control, and cost-efficiency. Through a decentralized cloud platform and a suite of open-source tools, Together AI seeks to foster innovation within the AI community, and it is recognized for its commitment to open-source principles and for providing scalable, customizable AI solutions.
Offerings, Capabilities, and Integrations
Together AI provides a cloud platform focused on generative artificial intelligence. Its core offerings are access to a large catalog of open-source AI models, tools for customizing those models through fine-tuning, and the infrastructure to run them at scale for inference. This is delivered through a full-stack approach encompassing compute resources, software for model optimization, and APIs for integration. The combination of speed, cost-efficiency, and support for open-source models gives Together AI a competitive edge, positioning it as a key enabler for developers and enterprises that want to build and deploy AI applications without being locked into proprietary systems.
The platform is designed for the entire generative AI lifecycle, from research and development to production deployment. Key capabilities include high-performance GPU clusters, a fast inference stack, and tools for fine-tuning models with proprietary datasets. Together AI supports integrations with various tools and platforms, including Weights & Biases for monitoring fine-tuning jobs, and offers API compatibility, such as with the OpenAI SDK, to facilitate easier adoption. It also integrates with platforms like Vercel, Appwrite, and SkyDeck AI, allowing developers to incorporate its models into their applications and workflows. The company also recently acquired Refuel.ai to enhance its platform with purpose-built models and orchestration capabilities for data workflows.
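Because the API surface is OpenAI-compatible, existing OpenAI-style client code can typically be redirected to Together AI by overriding the base URL. The sketch below builds an OpenAI-style chat-completion request body without sending anything; the base URL is Together AI's documented endpoint, but the model name is illustrative and should be checked against the current model catalog:

```python
import json

# Together AI exposes an OpenAI-compatible API; OpenAI-style SDKs can
# usually be pointed at it by overriding the client's base URL.
TOGETHER_BASE_URL = "https://api.together.xyz/v1"

def chat_completion_payload(model: str, user_message: str) -> dict:
    """Build an OpenAI-style chat-completion request body (no request is sent)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

# Model identifier shown for illustration only.
payload = chat_completion_payload(
    "meta-llama/Meta-Llama-3-8B-Instruct",
    "Summarize open-source LLM licensing.",
)
print(json.dumps(payload))
```

In practice, an OpenAI SDK client configured with this base URL and a Together AI API key would send the same payload shape to the `/chat/completions` route.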
Products and Services
Together AI’s offerings are centered around its AI cloud platform, which provides the following key products and services:
- Together Inference: A high-speed inference stack for running generative AI models. Users can deploy a wide range of open-source models and scale inference through serverless endpoints or dedicated instances, with an emphasis on speed and cost-effectiveness.
- Together Fine-Tuning: This service enables users to customize leading open-source models with their own datasets. Customers can use APIs to upload data, manage fine-tuning jobs, and deploy the resulting custom models on the Together AI platform or download them.
- Together GPU Clusters: The company offers access to high-performance GPU clusters optimized for AI training and inference workloads. These clusters are equipped with Together AI’s fast training stack.
- Access to Open-Source Models: A core part of Together AI’s offering is providing access to a broad library of pre-trained open-source models, including those for text generation, image generation, and code.
- Together API: This allows developers to integrate Together AI’s model capabilities into their own applications and services.
- Together Enterprise Platform: This allows for the deployment of the Together AI platform within a customer’s own Virtual Private Cloud (VPC) on services like Amazon EKS, offering enhanced security and control.
A recent addition to its platform capabilities is the integration of Refuel LLM-2, following the acquisition of Refuel.ai, which supports serverless inference and LoRA fine-tuning for high-accuracy data workflows.
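The fine-tuning flow described above (upload data via API, run a job, then deploy or download the resulting model) typically begins with a training file. A minimal sketch of preparing a JSON Lines dataset locally, assuming a chat-style record schema; the field names here are illustrative, not Together AI's documented format, which should be checked against its API reference:

```python
import json
import tempfile

def write_jsonl(examples: list[dict], path: str) -> None:
    """Serialize training examples as JSON Lines, one record per line."""
    with open(path, "w") as f:
        for example in examples:
            f.write(json.dumps(example) + "\n")

# Chat-style training records; the "messages" schema is an assumption
# modeled on common fine-tuning data formats.
examples = [
    {"messages": [
        {"role": "user", "content": "What does Together Fine-Tuning do?"},
        {"role": "assistant",
         "content": "It customizes open-source models with your data."},
    ]},
]

path = tempfile.mktemp(suffix=".jsonl")
write_jsonl(examples, path)
with open(path) as f:
    lines = f.read().splitlines()
print(len(lines))  # one line per training example
```

A file in this shape would then be uploaded through the files API before creating a fine-tuning job.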
Target Customers
Together AI targets a diverse range of users, from individual developers and AI researchers to startups and large enterprises. The platform is designed to serve organizations of all sizes that are looking to build, train, fine-tune, and deploy generative AI models.
Specific market segments include:
- Developers: Individual developers and small teams can leverage Together AI’s platform to experiment with and build AI-powered applications, benefiting from its pay-as-you-go pricing and access to open-source models.
- AI Startups: Startups in the AI space, such as Pika Labs, utilize Together AI for its cost-effective and scalable infrastructure to power their generative AI applications.
- Enterprises: Large companies across various industries, including healthcare, finance, retail, and entertainment, can use Together AI to integrate AI into their operations, develop custom AI solutions, and manage the entire generative AI lifecycle with enhanced performance, security, and cost-efficiency. Zoom and Quora are cited as customers.
- AI Researchers: The academic and research community can use the platform for experimenting with AI models and advancing AI innovation.
These target customers benefit from Together AI’s offerings by gaining access to a powerful and flexible platform for generative AI development that emphasizes open-source models, cost savings, and high performance. This allows them to innovate faster, build custom AI solutions tailored to their specific needs, and scale their AI applications efficiently. For enterprises, the platform offers enterprise-grade security, privacy controls, and the ability to retain ownership of custom models.
Cloud Integrations and Marketplaces
Together AI has a presence on the Amazon Web Services (AWS) Marketplace and offers integrations with other platforms.
- AWS Marketplace: Together AI is available on the AWS Marketplace, allowing AWS customers to use their existing cloud credits and committed spend to access the Together AI platform. This offering aims to provide faster inference times, optimized performance for leading models, and reduced GPU costs. The platform enables users to run serverless or on-demand models, scale with monthly reserved instances and VPC, fine-tune models via API, and deploy the Together Enterprise Platform in their VPC on EKS. Together AI has also joined the AWS Partner Network (APN).
- Google Cloud Marketplace: A search for “Together AI” on the Google Cloud Marketplace (https://console.cloud.google.com/marketplace/browse?q=Together%20AI) did not yield a direct listing for Together AI’s core platform at the time of this research. However, various AI and machine learning tools and services from Google and other third-party vendors are available.
- Microsoft Azure Marketplace: There is no specific information in the search results indicating that Together AI’s core platform is directly available on the Microsoft Azure Marketplace. The Azure Marketplace does list various AI services and solutions from Microsoft and other partners.
- Other Integrations:
- LiveKit Agents: Together AI’s Llama 2 and Llama 3 models can be integrated with LiveKit Agents using an OpenAI plugin. This allows for the use of these models in voice agent applications.
- Vapi: Together AI is a supported model provider for Vapi, a platform for building voice AI applications, allowing Vapi users to leverage Together AI’s models and inference capabilities.
- Dell Technologies: Together AI is collaborating with Dell Technologies to scale its AI cloud platform, integrating Dell’s AI Factory with NVIDIA technology. This collaboration focuses on providing scalable and efficient AI infrastructure.
Together AI’s platform is designed as an AI Acceleration Cloud, enabling users to run, fine-tune, and manage open-source and custom AI models at production scale. It supports a wide range of open-source models and offers tools for fine-tuning and training with private data.
Key People
- Founder & CEO: Vipul Ved Prakash
- Founder & CTO: Ce Zhang
- Founder: Chris Ré
- Founder: Percy Liang
- Founding Chief Scientist: Tri Dao
- Chief Revenue Officer: Kai Mak
- Founding SVP Product: Jamie de Guerre
- Director, People & Ops: Nicolette Lea
- CMO (interim): Rajan Sheth
- VP Sales and BD: Arielle Fidel
- SVP of Engineering: Albert Meixner
- Founding VP Engineering: Charles Srisuwananukorn
Key Facts
- Headquarters Location: San Francisco, California, United States.
- Number of Employees: Approximately 170-197.
- Annual Revenue: Estimated $50 million to $130 million.
- Parent Company: None.
- Subsidiary Companies: Refuel.
- Publicly Listed: No.
Analyst Recognition
Based on the available information, Together AI has not been recognized under its own name in reports by Gartner, Forrester, IDC, or Everest Group in the common technology categories these firms cover, such as “Cloud AI Developer Services”, “AI Foundation Models”, “AI Service Providers”, or “Data and AI Services”. Reports from these firms tend to focus on larger, more established players or on a broader range of services than Together AI’s current primary focus: a cloud platform for building and fine-tuning generative AI models, with an emphasis on open-source LLMs. It is important to note that the AI and generative AI market is rapidly evolving, and analyst coverage is continually updated.
While direct mentions of Together AI in specific Gartner Magic Quadrants, Forrester Waves, IDC MarketScapes, or Everest Group PEAK Matrix assessments were not found in the search results, the available information indicates that Together AI is considered an emerging and innovative company in the AI infrastructure space, particularly for its contributions to open-source AI development and its efforts to democratize access to generative AI models.
Other companies are frequently cited by these analyst groups in various AI-related categories. For example:
- Gartner has recognized companies like Google, Microsoft, AWS, H2O.ai, and ServiceNow in reports such as the “Magic Quadrant for Cloud AI Developer Services” and “Magic Quadrant for AI Applications in IT Service Management”.
- Forrester has recognized firms like Databricks, McKinsey, Kameleoon, and ZS in its “Wave” reports for categories including “AI Foundation Models for Language”, “AI Service Providers”, “Feature Management and Experimentation Solutions”, and “Customer Analytics Services”.
- IDC has positioned companies such as Capgemini, TCS, Smart Communications, Aisera, and EY as Leaders or Major Players in its “MarketScape” reports covering areas like “Worldwide AI Services”, “Worldwide Cloud Professional Services”, “Intelligent Customer Communications Management”, and “Worldwide Conversational AI Software Platforms”.
- Everest Group has featured companies like NICE, Capgemini, Xebia, Deloitte, and Blend in its “PEAK Matrix” assessments for categories such as “Conversational AI Products”, “Artificial Intelligence (AI) and Generative AI Services”, “Data and AI (D&AI) Services for Mid-market Enterprises”, and “Salesforce Services”.
The absence of Together AI in these specific analyst reports does not necessarily reflect its market position or potential, as analyst coverage often has specific inclusion criteria and timelines. Together AI is noted for its work with Dell Technologies and NVIDIA to scale its AI cloud platform.