Using Mojo🔥 with Python🐍

October 2, 2023

Jack Clayton

AI Developer Advocate

Mojo allows you to access the entire Python ecosystem, but environments can vary depending on how Python was installed. It's worth taking some time to understand exactly how modules and packages work in Python, as there are a few complications to be aware of. If you've had trouble calling into Python code before, this will help you get started.

Let's start with Python. Say we have two files in the same directory:

Dir

.
├── main.py
└── mod.py

If mod.py has a single function and a variable:

Python ./mod.py

def foo(arg):
    print(f'arg = {arg}')

bar = [5, 10, 15, 20]

You can call it from any file in the same directory:

Python ./main.py

from mod import foo, bar

foo("test")
print(bar)
Output python main.py

arg = test
[5, 10, 15, 20]

mod.py is treated as a module named mod. You can also import any module that is on sys.path. Let's take a look:

Python

import sys

sys.path
Output

['/usr/lib/python311.zip',
 '/usr/lib/python3.11',
 '/usr/lib/python3.11/lib-dynload',
 '/home/j/.local/lib/python3.11/site-packages',
 '/usr/lib/python3.11/site-packages',
 '/home/j/blog']

Because I'm running the python interpreter from /home/j/blog, that's the last thing on my python path, and the reason mod.py is accessible.

Just like your system $PATH environment variable, each path is checked in order until Python finds your module.
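
If you want to check which file a module would be loaded from without actually importing it, the standard library's importlib can resolve it against sys.path; a minimal sketch:

Python

import importlib.util

# Resolve the module against sys.path without importing it
spec = importlib.util.find_spec("mod")
print(spec.origin)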

If we look inside /usr/lib/python3.11, we'll find the standard library for Python 3.11:

Python

import os

# The exact indices will differ between systems
os.listdir(sys.path[1])[196:200]
Output

['os.py', 'base64.py', 'tempfile.py', 'pkgutil.py']

Everything there can be used as a module because it's on sys.path. For example, we can import a function from tempfile.py:

Python

from tempfile import mkstemp

tmp = mkstemp(".py")
print("Generated temp file path:", tmp[1])
Output

Generated temp file path: /tmp/tmp2go6n4ir.py

We can also add to sys.path with an env var to give us access to modules elsewhere:

Bash

export PYTHONPATH="/home/j"

If you start a new interpreter and check the first entry on sys.path, it'll now be there:

Python

sys.path[0]
Output

/home/j

You can also edit sys.path directly:

Python

sys.path.append("/tmp")

To convince ourselves this works, create a file at /tmp/mod2.py:

Python /tmp/mod2.py

a = 42

You can now access mod2 as a module:

Python

import mod2

mod2
Output

module 'mod2' from '/tmp/mod2.py'

Anything inside that module is accessible:

Python

mod2.a
Output

42

It's considered bad practice to modify sys.path directly; you should use the module system as intended. Still, it's a good way to see how things work under the hood.

There are a lot more details on how Python finds and loads modules and packages; you can read the module search path docs and the site module docs. And if you're wondering how sys.path is generated, it's described at a high level in the source code and the sys path init docs.

The most important part for our purposes is sys.prefix, which is the base directory where the lib and bin folders are located for the Python binary we're using:

Bash

which python
Output

/usr/bin/python

Now notice the prefix is two folders above the python executable; this is the root that the sys.path entries are built from:

Python

sys.prefix
Output

/usr

Depending on the system you're running on this will look very different, but you'll always see lib and bin at sys.prefix, and this is where the first few paths like /usr/lib/python3.11 and /usr/lib/python3.11/site-packages are added. There are ways to hack this as per the docs above, but that's the general rule.
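
To see the concrete directories derived from sys.prefix on your own machine, the standard library's sysconfig module can print them; a quick sketch:

Python

import sys
import sysconfig

# The stdlib and site-packages locations are derived from the prefix
print(sys.prefix)
print(sysconfig.get_path("stdlib"))   # e.g. /usr/lib/python3.11
print(sysconfig.get_path("purelib"))  # e.g. /usr/lib/python3.11/site-packages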

Calling Python from Mojo🔥

You can install the Mojo SDK here.

In Mojo we can use any Python module on sys.path in the same way. To work out exactly which environment is being used, let's start by checking the prefix:

Mojo

from python import Python

def main():
    let sys = Python.import_module("sys")
    print(sys.prefix)
Output

/usr

Calling Python could result in an error, so we need to either use def main(): as the entrypoint, mark the function with fn main() raises: as a strict Mojo function that can raise an error, or handle the errors in a try/except block:

Mojo

fn main():
    try:
        let x = Python.import_module("fake")
    except e:
        # mojo 0.3.1
        print("Failed to import:", e.value)
        # next release
        print("Failed to import:", e)

In the next release of Mojo, this will print the error from Python if something goes wrong; currently a generic error is raised for Python failures:

Output

# 0.3.1
Failed to import: An error occurred in Python.
# Next Release
Failed to import: No module named 'fake'
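
If you'd rather let the error propagate than handle it yourself, the same import can live in a raising main; a minimal sketch of that variant:

Mojo

from python import Python

# Any Python error propagates out of main instead of being caught here
fn main() raises:
    let sys = Python.import_module("sys")
    print(sys.version)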

You can also print everything that's on sys.path from Mojo:

Mojo

for p in sys.path:
    print(p)
Output

/usr/lib/python3.11/site-packages
/usr/lib/python311.zip
/usr/lib/python3.11
/usr/lib/python3.11/lib-dynload
/home/j/.local/lib/python3.11/site-packages

Notice anything different here? We don't have our current directory on path! You can add it with:

Mojo

# Relative Path
Python.add_to_path(".")
# Absolute Path
Python.add_to_path("/home/j/blog")

print(sys.path[-1])
sys.path[-2]
Output

/home/j/blog
.

Now any modules in the current directory are available. Let's access the mod.py we created earlier from Mojo:

Mojo

let mod = Python.import_module("mod")
mod.bar
Output

[5, 10, 15, 20]
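
Calling the function from mod.py works the same way; a short sketch (the Mojo string literal is converted to a Python str), which should print arg = hello from Mojo:

Mojo

mod.foo("hello from Mojo")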

What is a Package?

Mojo behaves the same as Python: a package is a subfolder containing an __init__.🔥 or __init__.mojo with any initialization logic. It also allows you to access other modules in the same directory:

main.🔥 can access the drive.🔥 module through import vehicles.car.drive
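
Based on the file paths used below, the layout looks roughly like this (the plane sub-package is only sketched in for illustration):

Dir

.
├── main.🔥
└── vehicles
    ├── __init__.🔥
    ├── car
    │   ├── __init__.🔥
    │   └── drive.🔥
    ├── common
    │   ├── __init__.🔥
    │   └── liquids.🔥
    └── plane
        └── __init__.🔥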

A question that often comes up in both Python and Mojo is relative imports; they're possible with the above structure. car, plane and common can access each other because they are sub-packages of the package vehicles:

Mojo ./vehicles/common/liquids.🔥

var fuel = 0

fn refuel():
    print("refuelling!")
Mojo ./vehicles/car/__init__.🔥

from ..common.liquids import fuel, refuel

fuel and refuel are now accessible from ./vehicles/car/drive.🔥:

Mojo ./vehicles/car/drive.🔥

fn move_forward():
    if fuel == 0:
        refuel()

    print("moving forward!") 
Mojo ./main.🔥

from vehicles.car.drive import move_forward

fn main():
    move_forward()
Output mojo main.🔥

refuelling!
moving forward!

This is different from other languages, where file structure doesn't matter as much; keep this in mind as you build out your code base.

Note that as of Python 3.3 the __init__.py file is not required if you don't have any initialization logic, thanks to PEP 420; however, it's still currently a requirement in Mojo to mark a folder as a package.

Creating a virtual environment with venv

venv comes with Python and can be used to generate a virtual environment from the python binary we have on $PATH; for more details you can read a primer here.

First check which python binary you're using to make sure it's the one you want:

Bash

which python3
python3 --version
Output

/usr/bin/python3
Python 3.11.5

Check to make sure Mojo is linking to the same libpython as the executable we're going to use to create the venv:

Bash

python3 -c 'import sys; print(sys.prefix)'
Output

/usr

We can find the associated libpython:

Bash

ls /usr/lib/libpython*
Output

/usr/lib/libpython3.11.so
/usr/lib/libpython3.11.so.1.0
/usr/lib/libpython3.so

Make sure that matches what's in ~/.modular/modular.cfg; you want libpython[major].[minor].so (if you're on macOS it'll end in .dylib):

Cfg ~/.modular/modular.cfg

...
python_lib = /usr/lib/libpython3.11.so;
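
A quick way to check that line, assuming the default config location:

Bash

grep python_lib ~/.modular/modular.cfg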

Now that we're certain Mojo is linking to the correct Python on our system, we can create a virtual environment and install dependencies into it:

Bash

python3 -m venv ~/venv
source ~/venv/bin/activate

Activating a venv is simple: it just sets a few env vars and puts ~/venv/bin at the front of your system $PATH variable:

Bash

echo $PATH | cut -d':' -f1
Output

/home/j/venv/bin

If we look inside that path, you can see these commands will now take precedence:

Bash

ls ~/venv/bin
Output

activate
activate.csh
activate.fish
Activate.ps1
pip
pip3
pip3.11
python
python3
python3.11

Install a pretty-printing library named rich into the venv and take a look at the path:

Bash

pip install rich
Output

Collecting rich
Installing collected packages: pygments, mdurl, markdown-it-py, rich
Successfully installed markdown-it-py-3.0.0 mdurl-0.1.2 pygments-2.16.1 rich-13.5.3
Bash

python -c 'import sys; import rich; rich.print(sys.path)'
Output

[
    '',
    '/usr/lib/python311.zip',
    '/usr/lib/python3.11',
    '/usr/lib/python3.11/lib-dynload',
    '/home/j/venv/lib/python3.11/site-packages'
]

The two site-packages folders that were controlled by my system package manager have been removed, and the venv's /home/j/venv/lib/python3.11/site-packages is now on our sys.path. Let's have a look inside:

Bash

ls ~/venv/lib/python3.11/site-packages
Output

Pygments-2.16.1.dist-info
_distutils_hack
distutils-precedence.pth
markdown_it
markdown_it_py-3.0.0.dist-info
mdurl
mdurl-0.1.2.dist-info
pip
pip-23.2.1.dist-info
pkg_resources
pygments
rich
rich-13.5.3.dist-info
setuptools
setuptools-68.1.2.dist-info

rich and its dependencies were installed into the venv, so we can now access the module.

Let's check if sys.prefix has changed:

Bash

python -c "import sys; print(sys.prefix)"
Output

/home/j/venv

It has, but how are we still getting all the standard library modules on our sys.path? The sys.base_prefix is what adds the Python modules from the base installation:

Bash

python -c "import sys; print(sys.base_prefix)"
Output

/usr
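
As an aside, comparing the two prefixes is a common way to check whether you're inside a venv; this prints True while the venv is active:

Bash

python -c "import sys; print(sys.prefix != sys.base_prefix)"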

We can access modules from Mojo in the same way, as long as the venv is activated. Let's print where rich is located from Mojo:

Mojo

let rich = Python.import_module("rich")

rich
Output

module 'rich' from '/home/j/venv/lib/python3.11/site-packages/rich/__init__.py'
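
As a quick check that the package is usable and not just importable, you can call into it directly; a minimal sketch (the Mojo string literal is converted to a Python str):

Mojo

rich.print("[bold green]rich imported from the venv[/bold green]")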

This works, but it's finicky and breakable. The python used to create the venv must have the same sys.prefix as the libpython Mojo links to, and the venv must be activated before running Mojo so that ~/venv/bin is at the front of our system $PATH env var.

Installing libpython with Conda

To use a specific version of Python with Mojo, you can install it with conda and link to the libpython it installs. This not only isolates Python dependencies, but also C/C++ system libraries like openssl and cuda that are notorious for causing cross-platform and Linux distribution problems.

If you don't have conda, you can install miniconda here.

Bash

conda create -yn py310 python=3.10
Output

## Package Plan ##

  environment location: /home/j/miniconda3/envs/py310

  added / updated specs:
    - python=3.10

The following NEW packages will be INSTALLED:

  _libgcc_mutex      pkgs/main/linux-64::_libgcc_mutex-0.1-main
  _openmp_mutex      pkgs/main/linux-64::_openmp_mutex-5.1-1_gnu
  bzip2              pkgs/main/linux-64::bzip2-1.0.8-h7b6447c_0
  ca-certificates    pkgs/main/linux-64::ca-certificates-2023.08.22-h06a4308_0
  ld_impl_linux-64   pkgs/main/linux-64::ld_impl_linux-64-2.38-h1181459_1
  libffi             pkgs/main/linux-64::libffi-3.4.4-h6a678d5_0
  libgcc-ng          pkgs/main/linux-64::libgcc-ng-11.2.0-h1234567_1
  libgomp            pkgs/main/linux-64::libgomp-11.2.0-h1234567_1
  libstdcxx-ng       pkgs/main/linux-64::libstdcxx-ng-11.2.0-h1234567_1
  libuuid            pkgs/main/linux-64::libuuid-1.41.5-h5eee18b_0
  ncurses            pkgs/main/linux-64::ncurses-6.4-h6a678d5_0
  openssl            pkgs/main/linux-64::openssl-3.0.11-h7f8727e_2
  pip                pkgs/main/linux-64::pip-23.2.1-py310h06a4308_0
  python             pkgs/main/linux-64::python-3.10.13-h955ad1f_0
  readline           pkgs/main/linux-64::readline-8.2-h5eee18b_0
  setuptools         pkgs/main/linux-64::setuptools-68.0.0-py310h06a4308_0
  sqlite             pkgs/main/linux-64::sqlite-3.41.2-h5eee18b_0
  tk                 pkgs/main/linux-64::tk-8.6.12-h1ccaba5_0
  tzdata             pkgs/main/noarch::tzdata-2023c-h04d1e81_0
  wheel              pkgs/main/linux-64::wheel-0.41.2-py310h06a4308_0
  xz                 pkgs/main/linux-64::xz-5.4.2-h5eee18b_0
  zlib               pkgs/main/linux-64::zlib-1.2.13-h5eee18b_0

Now we have an isolated environment with its own system libraries, Python packages, and, importantly, a fresh libpython, which Mojo uses directly for Python interop.

For Mojo to use this libpython, and by extension the python environment at runtime, you can set the $MOJO_PYTHON_LIBRARY env var:

Bash

# The path of the freshly installed libpython3.10:
export MOJO_PYTHON_LIBRARY="$(conda info --base)/envs/py310/lib/libpython3.10.so"
echo $MOJO_PYTHON_LIBRARY

# Export to whichever shells you're using to set it permanently:
echo "export MOJO_PYTHON_LIBRARY=$MOJO_PYTHON_LIBRARY" >> ~/.zshrc
echo "export MOJO_PYTHON_LIBRARY=$MOJO_PYTHON_LIBRARY" >> ~/.bashrc
echo "set -gx MOJO_PYTHON_LIBRARY $MOJO_PYTHON_LIBRARY" >> ~/.config/fish/config.fish
Output

/home/j/miniconda3/envs/py310/lib/libpython3.10.so

Or you can edit it directly in ~/.modular/modular.cfg with an absolute path under the python_lib key, as shown previously:

Cfg ~/.modular/modular.cfg

python_lib = /home/j/miniconda3/envs/py310/lib/libpython3.10.so;
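
If you're unsure of the exact filename the environment ships, you can list the candidates first, for example:

Bash

ls "$(conda info --base)/envs/py310/lib/" | grep libpython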

Now when you run mojo again, you'll see that it's getting modules from the directories in our isolated Python 3.10 environment:

Mojo

for p in sys.path:
    print(p)
Output

/home/j/miniconda3/envs/py310/lib/python310.zip
/home/j/miniconda3/envs/py310/lib/python3.10
/home/j/miniconda3/envs/py310/lib/python3.10/lib-dynload
/home/j/miniconda3/envs/py310/lib/python3.10/site-packages

Great, now it's not a mystery where our Python modules are coming from!

Let's install a Python package into our conda environment now:

Bash

conda activate py310
conda install numpy -y
Output

## Package Plan ##

  environment location: /home/j/miniconda3

  added / updated specs:
    - numpy

The following NEW packages will be INSTALLED:

  blas               pkgs/main/linux-64::blas-1.0-mkl
  intel-openmp       pkgs/main/linux-64::intel-openmp-2023.1.0-hdb19cb5_46305
  mkl                pkgs/main/linux-64::mkl-2023.1.0-h213fc3f_46343
  mkl-service        pkgs/main/linux-64::mkl-service-2.4.0-py311h5eee18b_1
  mkl_fft            pkgs/main/linux-64::mkl_fft-1.3.8-py311h5eee18b_0
  mkl_random         pkgs/main/linux-64::mkl_random-1.2.4-py311hdb19cb5_0
  numpy              pkgs/main/linux-64::numpy-1.26.0-py311h08b1b3b_0
  numpy-base         pkgs/main/linux-64::numpy-base-1.26.0-py311hf175353_0
  tbb                pkgs/main/linux-64::tbb-2021.8.0-hdb19cb5_0

The following packages will be UPDATED:

  ca-certificates                     2023.05.30-h06a4308_0 --> 2023.08.22-h06a4308_0
  certifi                          2023.5.7-py311h06a4308_0 --> 2023.7.22-py311h06a4308_0
  conda                              23.5.2-py311h06a4308_0 --> 23.7.4-py311h06a4308_0
  openssl                                  3.0.9-h7f8727e_0 --> 3.0.11-h7f8727e_2

numpy is complicated underneath and requires many system libraries for Fortran routines, linear algebra, and hardware acceleration. All the system libraries above are installed into the isolated environment, and they're checked for compatibility against the version of Python and all the other packages you're installing. Having an older or newer Linux distribution won't break it, and we don't have to mess around with our system package manager to install system dependencies. Installing numpy on Apple Silicon uses entirely different libraries to take advantage of the different hardware:

Output

## Package Plan ##

  environment location: /opt/homebrew/Caskroom/miniconda/base/envs/py310

  added / updated specs:
    - numpy

The following NEW packages will be INSTALLED:

  blas               conda-forge/osx-arm64::blas-2.118-openblas
  blas-devel         conda-forge/osx-arm64::blas-devel-3.9.0-18_osxarm64_openblas
  libblas            conda-forge/osx-arm64::libblas-3.9.0-18_osxarm64_openblas
  libcblas           conda-forge/osx-arm64::libcblas-3.9.0-18_osxarm64_openblas
  libgfortran        conda-forge/osx-arm64::libgfortran-5.0.0-13_2_0_hd922786_1
  libgfortran5       conda-forge/osx-arm64::libgfortran5-13.2.0-hf226fd6_1
  liblapack          conda-forge/osx-arm64::liblapack-3.9.0-18_osxarm64_openblas
  liblapacke         conda-forge/osx-arm64::liblapacke-3.9.0-18_osxarm64_openblas
  libopenblas        conda-forge/osx-arm64::libopenblas-0.3.24-openmp_hd76b1f2_0
  llvm-openmp        conda-forge/osx-arm64::llvm-openmp-16.0.6-h1c12783_0
  numpy              anaconda/osx-arm64::numpy-1.22.3-py310hdb36b11_0
  numpy-base         anaconda/osx-arm64::numpy-base-1.22.3-py310h5e3e9f0_0
  openblas           conda-forge/osx-arm64::openblas-0.3.24-openmp_hce3e5ba_0

Now we can access numpy from Mojo and print the path to the module:

Mojo

let numpy = Python.import_module("numpy")
numpy
Output

module 'numpy' from '/home/j/miniconda3/base/envs/py310/lib/python3.10/site-packages/numpy/__init__.py'
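
A short sketch of actually using it; the arrays come back as Python objects, so their methods are available dynamically:

Mojo

let arr = numpy.arange(5)
# sum() is resolved on the underlying numpy array at runtime
print(arr.sum())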

If we can't find something in conda, we can simply install it with pip:

Bash

pip install pillow
Output

Collecting pillow
  Using cached Pillow-10.0.1-cp311-cp311-manylinux_2_28_x86_64.whl (3.6 MB)
Installing collected packages: pillow
Successfully installed pillow-10.0.1

And it'll now be available:

Mojo

let pillow = Python.import_module("PIL")
pillow
Output

module 'PIL' from '/home/j/miniconda3/lib/python3.11/site-packages/PIL/__init__.py'

Now, to create a reproducible environment, you can run:

Bash

conda env export > environment.yml
cat environment.yml
Output

name: py310
channels:
  - defaults
dependencies:
  - _libgcc_mutex=0.1=main
  - _openmp_mutex=5.1=1_gnu
  - blas=1.0=mkl
  - bzip2=1.0.8=h7b6447c_0
  - ca-certificates=2023.08.22=h06a4308_0
  - intel-openmp=2023.1.0=hdb19cb5_46305
  - ld_impl_linux-64=2.38=h1181459_1
  - libffi=3.4.4=h6a678d5_0
  - libgcc-ng=11.2.0=h1234567_1
  - libgomp=11.2.0=h1234567_1
  - libstdcxx-ng=11.2.0=h1234567_1
  - libuuid=1.41.5=h5eee18b_0
  - mkl=2023.1.0=h213fc3f_46343
  - mkl-service=2.4.0=py310h5eee18b_1
  - mkl_fft=1.3.8=py310h5eee18b_0
  - mkl_random=1.2.4=py310hdb19cb5_0
  - ncurses=6.4=h6a678d5_0
  - numpy=1.26.0=py310h5f9d8c6_0
  - numpy-base=1.26.0=py310hb5e798b_0
  - openssl=3.0.11=h7f8727e_2
  - pip=23.2.1=py310h06a4308_0
  - python=3.10.13=h955ad1f_0
  - readline=8.2=h5eee18b_0
  - setuptools=68.0.0=py310h06a4308_0
  - sqlite=3.41.2=h5eee18b_0
  - tbb=2021.8.0=hdb19cb5_0
  - tk=8.6.12=h1ccaba5_0
  - tzdata=2023c=h04d1e81_0
  - wheel=0.41.2=py310h06a4308_0
  - xz=5.4.2=h5eee18b_0
  - zlib=1.2.13=h5eee18b_0
  - pip:
      - pillow==10.0.1
prefix: /home/j/miniconda3/envs/py310

This works as a lockfile for the specific arch and OS you're running on; note that we're including system libraries specific to Linux. If you're building something cross-platform, let conda resolve all the dependencies and just specify what you need:

Bash

conda env export --from-history > environment.yml
cat environment.yml
Output

name: py310
channels:
  - defaults
dependencies:
  - python=3.10
  - numpy

prefix: /home/j/miniconda3/envs/py310

Just be careful, as pip dependencies have to be added back in manually when using this technique. It's best to edit the file by hand, remove the prefix, and set minimum versions as required so that it ends up looking like this:

Output

name: py310
channels:
  - defaults
dependencies:
  - python=3.10
  - numpy>=1.26
  - pip:
      - pillow>=10.0

As a user you can install this environment by running:

Bash

conda env create -yn my-new-env --file environment.yml

Make sure to always pass -n <env-name>, so it installs to your <conda-install>/envs/ folder and ignores any hard-coded prefix.

Conclusion

The aim of this post was to make it clear how you can access Python modules from Mojo, so that you can troubleshoot anything that goes wrong yourself, and to demonstrate the two most common methods of creating virtual environments. There are other solutions like poetry and pdm that have nice features, but conda is the most foolproof way, as it installs any version of Python with all the required system libraries into an isolated environment. This mitigates the huge number of system configuration problems and library conflicts that can arise when distributing Python applications.

In my next post we'll be creating a GUI app using Python libraries that call performant Mojo functions. Stay tuned!

Mojo Boitata image credit: David Ragazzi
