
Mojo: The Open-Source AI Language Outpacing Python by 90,000 Times


Mojo has recently been open-sourced by Modular Inc., marking a significant milestone for the programming language designed for AI software development. First unveiled in May 2023, Mojo has attracted over 175,000 developers and 50,000 organizations.

Traditionally, AI models are created using various programming languages, with Python being favored for its simplicity despite its slower performance. Developers often turn to C++ for its speed but face a steeper learning curve. Mojo aims to bridge this gap by offering a user-friendly syntax akin to Python while delivering execution speeds thousands of times faster. This allows developers to create efficient AI models without delving into more complex languages like C++.
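
To make that claim concrete, here is a minimal, hypothetical sketch of what Mojo code can look like: the typed `fn` form compiles to native machine code while reading much like an ordinary Python `def`. This example is not taken from Modular's documentation, and Mojo's syntax is still changing between releases.

```mojo
# Illustrative sketch only; written against early-2024 Mojo syntax,
# which may differ in later releases.

fn fib(n: Int) -> Int:
    # A typed `fn` is statically compiled, yet reads like a Python def.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

fn main():
    print(fib(10))  # prints 55
```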

When Mojo was first revealed, many developers were eager but cautious, particularly regarding its open-source status. Chris Lattner, co-founder of Modular, responded to inquiries about an open-source release with uncertainty, leading to skepticism in the developer community:

“The promotion is great, but if it’s not open source, I won’t spend any time trying it.”

“It’s clearly an overhyped programming language, and it’s not open source! Chris Lattner wants to deceive millions of Python developers!”

“I can’t spend time on a language that might or might not be open source, especially considering the current commercial environment of OSS…”

Now, with its open-source launch, Mojo has quickly garnered 17.6k stars and 2.1k forks!

01 The Beginning of Mojo’s Open Source Journey

Today, Modular announced that it is open-sourcing the core components of Mojo's standard library. This library contains the essential syntax and features that form the backbone of the language, including tools for tuning the AI hyperparameters that guide how neural networks process data.

“The Mojo standard library is still undergoing vigorous development and rapid changes, so we are open sourcing its core modules first. This marks an important starting point for our open-source journey, not the end,” the company stated.

Modular believes that open-sourcing will help gather valuable feedback from the developer community, enhancing Mojo's development. They emphasize a comprehensive approach to open source, encouraging external contributions through GitHub pull requests and fostering community involvement.

Additionally, they will publish nightly builds of the Mojo compiler, allowing developers to test the latest features and engage in continuous integration testing.

Last year, Modular launched MAX, a commercial AI platform with tools for building high-performance AI applications deployable across various hardware environments, including Kubernetes. Future plans include open-sourcing some MAX components.

They have adopted the Apache License 2.0 with LLVM Exceptions, a modified version of Apache 2.0 designed to remain compatible with GPL-2.0-licensed software and to relax the attribution requirements that would otherwise apply when the code is shipped in compiled binaries.

02 The Future of AI Programming with Mojo

In May 2023, Modular claimed that Mojo was 35,000 times faster than raw Python for certain algorithms. By September, this figure was updated to 68,000 times, and by October, it was reported to be 90,000 times faster.

Chris Lattner described Mojo as a transformative addition to the Python family, enhancing its capabilities and allowing Python developers to explore new domains without needing to switch to C++.

Mojo leverages advanced compiler technology from MLIR, an evolution of LLVM, allowing programmers to optimize their code extensively. It is designed to meet the requirements of Python developers while introducing new optimization techniques.

The Mojo team also draws inspiration from Rust, which they acknowledge as a significant influence on their design.

While Modular has made many comparisons between Mojo and Python, they have also addressed how Mojo stands against Rust. Notably, a recent video showed Mojo outperforming Rust by 50% when parsing DNA sequences, sparking considerable interest given Rust's reputation in the AI domain.

A notable perspective from the Netflix engineer and Rust advocate @ThePrimeagen highlighted that Mojo's familiarity and performance could make it a strong contender in the AI programming landscape.

“If Mojo officially enters the fray, then I believe Mojo will undoubtedly emerge victorious. The reason Mojo will win is that it doesn’t require any changes to the paradigms developers are already familiar with.”

Luca Palmieri, a respected Rust contributor, pointed out two challenges Rust faces in AI applications: slow compilation speeds and the reluctance of Python developers to learn new languages.

Mojo seeks to provide an intuitive experience for Python users. For instance, one developer reported picking up Mojo and implementing SIMD-optimized algorithms within just a few weeks.
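
As a rough illustration of what that style of code can look like, here is a small, hypothetical snippet using Mojo's builtin SIMD type. The values and lane width are made up for the example, it is not the developer's actual code, and details may differ across Mojo versions.

```mojo
# Hypothetical sketch of Mojo's builtin SIMD type: one expression
# operates on all four lanes at once.

fn main():
    var v = SIMD[DType.float32, 4](1.0, 2.0, 3.0, 4.0)
    var squared = v * v   # a single multiply covers every lane
    print(squared)        # [1.0, 4.0, 9.0, 16.0]
```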

03 Built on Cutting-Edge Compiler Technology

Mojo is set apart by its foundation in MLIR, a more modern compiler stack compared to LLVM, which Rust uses. Chris Lattner, who founded LLVM, later contributed to the development of MLIR at Google, focusing on AI accelerator projects.

Modular claims that Mojo is the first language to fully utilize MLIR's advanced features, enabling superior performance and support for GPU acceleration.

Two highlights of Mojo's design are its ergonomic SIMD capabilities and its eager destruction feature. Unlike Rust’s RAII model, which holds onto memory until the end of a scope, Mojo releases memory as soon as it is no longer needed—an advantage in AI scenarios that require managing GPU memory efficiently.
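
A rough sketch of the difference, using a hypothetical Buffer type with a custom destructor: in Mojo, a value can be destroyed immediately after its last use rather than at the end of the enclosing scope. The type and prints below are illustrative only, and the syntax follows early-2024 Mojo releases.

```mojo
# Hypothetical example of Mojo's ASAP ("eager") destruction.

struct Buffer:
    var size: Int

    fn __init__(inout self, size: Int):
        self.size = size
        print("allocated")

    fn __del__(owned self):
        print("freed")

fn main():
    var buf = Buffer(1024)
    print(buf.size)       # last use of `buf`; Mojo may free it right here
    print("later work")   # a scope-based (RAII) model would still hold it
```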

In conclusion, developers have long had to choose between user-friendly languages and the raw performance of C, C++, or Rust; Mojo aims to offer both. The Mojo team encourages developers to explore this promising language, which they hope will shape AI development over the next 50 years.
