Blog

Democratizing AI Compute Series
Go behind the scenes of the AI industry with Chris Lattner
Latest

Increasing development velocity of giant AI models
Machine learning models are getting larger and larger — some might even say, humongous. The world’s most advanced technology companies have been in an arms race to see who can train the largest model (MUM, OPT, GPT-3, Megatron), while other companies focused on production systems have scaled their existing models to great effect. Through all the excitement, what’s gone unsaid is the myriad of practical challenges larger models present for existing AI infrastructure and developer workflows.

The Case for a Next-Generation AI Developer Platform
AI promised to profoundly change everything from healthcare to manufacturing, finance, climate, communication, and travel, to how we live and work. So why hasn't it? AI can help solve any problem that can be represented by data, given the right algorithms and enough computational resources.

The future of AI depends on Modularity
Platforms like TensorFlow, PyTorch, and CUDA do not focus on modularity (there, we said it!). They are sprawling technologies with thousands of evolving, interdependent pieces that have grown organically into complicated structures over time. AI software developers must deal with this sprawl while deploying workloads to servers, mobile devices, microcontrollers, and web browsers across multiple hardware platforms and accelerators.

Get started guide
Install MAX with a few commands and deploy a GenAI model locally.
Read Guide
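Once a model is deployed locally, it can be queried over an OpenAI-compatible chat endpoint. As a rough sketch only (the port, endpoint path, and model id below are assumptions for illustration; the guide has the exact install and serve commands):

    # Minimal sketch: query a locally served GenAI model through an
    # OpenAI-compatible chat endpoint. The base_url, api_key placeholder,
    # and model id are assumptions for illustration.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

    response = client.chat.completions.create(
        model="modularai/Llama-3.1-8B-Instruct-GGUF",  # assumed model id
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
    )
    print(response.choices[0].message.content)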
Browse open models
500+ models, many optimized for lightning-fast performance
Browse models



