MAX empowers you to own & control your AI

Download MAX
Latest changelog

A new framework for GenAI, and the best way to deploy PyTorch

Unparalleled GenAI performance

  • Llama3 - Optimized pipelines

    mojo ../run-pipeline.🔥 llama3 --max-tokens 200 --prompt "Why is the sky blue?" --quantization-encoding q4_0
    
    Loading tokenizer...
    Building model...
    Compiling...
    Executing...
    
    The sky appears blue due to a phenomenon called Rayleigh scattering...
    
    Prompt size: 10
    Output size: 190
  • Deploy LLMs with a single command

    Check out MAX Builds

The best way to deploy PyTorch

  • SOTA performance in just 3 lines of code

    Drop in your PyTorch or ONNX models and get an instant performance boost from our next-generation inference runtime for CPUs and GPUs, as sketched in the example below.

    See for yourself
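
To make the "3 lines of code" concrete, here is a minimal sketch of the drop-in workflow using the MAX Engine Python API. The model path, input tensor name, and shape are illustrative assumptions rather than details from this page; substitute whatever names your exported model actually uses.

    import numpy as np
    from max import engine

    # The three essential lines: create a session, compile the model, execute it.
    session = engine.InferenceSession()
    model = session.load("model.onnx")  # hypothetical path to your exported ONNX model

    # Keyword arguments must match the model's input tensor names.
    outputs = model.execute(input=np.random.rand(1, 3, 224, 224).astype(np.float32))
    print(outputs)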

Compatible with what you use today

  • Supports all your use cases and existing tools

    Use the MAX APIs to build, optimize, and deploy everything from a single model to more complex GenAI pipelines, on CPUs or GPUs.

    Supported model formats

Build locally. Deploy easily across hardware in the cloud

  • Compute Abstraction for AI

    Build your AI applications, then package and deploy them across CPU and GPU platforms, including Apple, ARM, Intel, AMD, and NVIDIA, without code changes.

    Supported hardware
  • Accelerate your time to market with MAX on AWS

    Get help from the experts using a production-grade managed service on AWS.

    Learn more

Develop with Python, extend with Mojo🔥

  • Use what you know with Python APIs in MAX

    Use our Python integration to interoperate with your existing workloads and offload onto MAX where it matters, as sketched in the example after this list.

    Using Python with MAX
  • Learn how to scale your AI with Mojo

    No need to learn C or CUDA; Mojo is the easiest way to program CPUs and GPUs.

    Take a tour of Mojo🔥
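
As a hypothetical illustration of the Python-interop point above: keep your existing Python pre- and post-processing exactly as it is, and offload only the inference hot path onto the MAX Engine runtime. The preprocessing function, model file, and input tensor name below are invented for illustration.

    import numpy as np
    from max import engine

    def preprocess(batch: np.ndarray) -> np.ndarray:
        # Stand-in for whatever your existing workload already does.
        return (batch / 255.0).astype(np.float32)

    # One-time setup: compile the model with MAX Engine for the local CPU or GPU.
    session = engine.InferenceSession()
    model = session.load("resnet50.onnx")  # hypothetical model file

    def classify(batch: np.ndarray):
        # Existing Python code stays as-is; only inference is offloaded to MAX.
        return model.execute(pixel_values=preprocess(batch))  # kwarg must match the model's input name

    print(classify(np.random.randint(0, 256, size=(1, 3, 224, 224))))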

What developers are saying about MAX

“I'm excited, you're excited, everyone is excited to see what's new in Mojo and MAX and the amazing achievements of the team at Modular.”

Eprahim

“Max installation on Mac M2 and running llama3 in (q6_k and q4_k) was a breeze! Thank you Modular team!”

NL

“The Community is incredible and so supportive. It’s awesome to be part of.”

benny.n

“I am focusing my time to help advance @Modular. I may be starting from scratch but I feel it’s what I need to do to contribute to #AI for the next generation.”

mytechnotalent

“What @modular is doing with Mojo and the MaxPlatform is a completely different ballgame.”

scrumtuous

“Mojo and the MAX Graph API are the surest bet for longterm multi-arch future-substrate NN compilation”

pagilgukey

“I'm very excited to see this coming together and what it represents, not just for MAX, but my hope for what it could also mean for the broader ecosystem that mojo could interact with.”

strangemonad
