Rust and Artificial Intelligence: the Rust Foundation’s Position

This document outlines the Rust Foundation’s latest perspective on AI’s potential and responsible use in the Rust ecosystem.

The views shared in this position statement are those of the Rust Foundation and not necessarily those of Rust Project maintainers/community members.

Published on Thursday, May 8, 2025.


Empowerment is central to the Rust programming language’s mission. As Artificial Intelligence and Machine Learning quickly transform the technological landscape, the Rust Foundation is dedicated to helping individuals and organizations adopt the Rust language to build innovative solutions. We envision Rust as both a tool and a catalyst for transformative breakthroughs that promote safety, speed, and collaboration, ultimately enriching our global community and fostering meaningful progress.

Regarding Artificial Intelligence, the Rust Foundation…

  • Believes the Rust programming language can become synonymous with ultra-reliable, production-grade AI systems by harnessing its memory safety and performance to power complex training pipelines, large-scale model deployments, and inference (especially on edge devices) without sacrificing speed.
  • Aims to support companies and individuals in creating a strong ecosystem of Rust-based frameworks, libraries, and tools.
  • Acknowledges Rust’s potential to effectively connect various AI tools and integrate them with different programming languages and frameworks. Our goal is to help create an environment where developers can thrive at all levels of the modern AI/ML stack, including training, inference, and hardware.
  • Maintains that comprehensive documentation, accessible educational resources, and supportive community channels can smooth the learning curve for AI/ML development in Rust as the world grows increasingly invested in these technologies.
  • Wants newcomers to AI development to be able to upskill and prototype quickly, and advanced, responsible users to be empowered to scale their projects with minimal friction and workflow overhead.

Rust’s Real-World Potential in AI Inference & Training

Inference is what brings an AI model’s training to life in day-to-day operations. It determines how swiftly insights translate into impactful decisions for users. Successful deployment is not just about training powerful models; it depends on fast, reliable inference that seamlessly integrates into production environments.

Rust’s powerful type system, ownership model, and compile-time memory safety make it well-suited for performance-critical tasks such as real-time inference or parallel processing. Its concurrency model facilitates safe parallelism, which is essential for leveraging modern hardware during inference.
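As a minimal sketch of the safe parallelism described above (an illustrative example, not an official Rust Foundation or framework API): the standard library's `std::thread::scope` lets worker threads borrow disjoint mutable chunks of a shared output buffer, with the borrow checker rejecting any overlapping mutable access at compile time. Here that is used to apply a ReLU activation, a common inference step, across threads.

```rust
use std::thread;

// Illustrative sketch: apply ReLU (max(x, 0)) over a vector in parallel.
// `thread::scope` guarantees all spawned threads finish before the scope
// ends, so borrowing `input` and chunks of `output` is safe without locks.
fn parallel_relu(input: &[f32], num_threads: usize) -> Vec<f32> {
    let mut output = vec![0.0f32; input.len()];
    // Ceiling division so every element lands in some chunk.
    let chunk_len = input.len().max(1).div_ceil(num_threads.max(1));
    thread::scope(|s| {
        for (in_chunk, out_chunk) in
            input.chunks(chunk_len).zip(output.chunks_mut(chunk_len))
        {
            // Each thread gets exclusive access to its own output chunk;
            // overlapping chunks would fail to compile.
            s.spawn(move || {
                for (out, &x) in out_chunk.iter_mut().zip(in_chunk) {
                    *out = x.max(0.0);
                }
            });
        }
    });
    output
}

fn main() {
    let activations = vec![-1.5, 0.5, -0.25, 2.0];
    println!("{:?}", parallel_relu(&activations, 2)); // [0.0, 0.5, 0.0, 2.0]
}
```

The same pattern generalizes to heavier per-chunk work (matrix tiles, token batches), which is where data-parallel inference on multi-core hardware pays off.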

While Python has established itself as the incumbent for large-scale model training, Rust is poised to revolutionize real-time inference. Projects like Burn demonstrate that Rust is capable of both training and inference, and innovative models such as Moshi prove that Rust can be used to train and deploy models efficiently, underscoring its expanding role in the AI pipeline. The Rust Foundation is confident that frameworks built in Rust will enable significantly faster and safer training and cross-platform model deployment.

Inference on the Edge: Practical Applications of Rust in Memory-Limited Devices

Edge devices such as smartphones, smart glasses, and wearables represent one of the most critical emerging frontiers for AI. Rust’s efficiency and cross-platform capabilities enable the training and deployment of models across varied hardware, and its safety guarantees position it to harness the full potential of local on-device inference. However, a significant challenge remains: convincing vendors to fully embrace Rust as a viable option in the AI-on-edge landscape. This challenge extends beyond Rust’s technical advantages; it also involves addressing ecosystem maturity, toolchain support, and developer familiarity. The Rust Foundation, with the proper support, is ready to work towards making Rust a prominent choice among hardware vendors.

Harnessing Rust’s Energy Efficiency for Sustainable AI

Today’s AI infrastructure and inference stack are extremely resource-intensive, presenting critical environmental and sustainability challenges that we should all take seriously. While we acknowledge the progress being made toward enhancing the energy efficiency of AI infrastructure, the Rust Foundation is committed to being a responsible participant in this endeavor. We aim to support companies and individuals in effectively leveraging Rust’s capabilities to help mitigate the environmental impact of various components within the inference stack.

Although the primary training and inference computations still rely on C++ libraries running on GPUs, strategic investment in using Rust in these areas can contribute to a more sustainable and efficient AI ecosystem. A key development in this direction is the Rust Foundation’s Interop Initiative, which aims to improve interoperability between Rust, C++, Python, and other languages.
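The interoperability mentioned above ultimately rests on the C ABI, which Rust speaks natively. As a hedged sketch (the function name and shape here are illustrative, not part of the Interop Initiative or any specific binding layer), a Rust routine can be exported with `extern "C"` and `#[no_mangle]`, making it callable from C, C++, or Python via `ctypes`; higher-level tools such as cxx and PyO3 build safer layers on this same mechanism.

```rust
// Illustrative sketch: exposing a Rust function over the C ABI so that
// C++ or Python code can call into Rust-implemented numeric kernels.

/// Dot product over raw pointers, callable from other languages.
#[no_mangle]
pub extern "C" fn dot_product(a: *const f32, b: *const f32, len: usize) -> f32 {
    // SAFETY: the caller must pass valid pointers to `len` readable f32s.
    let (a, b) = unsafe {
        (
            std::slice::from_raw_parts(a, len),
            std::slice::from_raw_parts(b, len),
        )
    };
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

fn main() {
    // Exercised here from Rust itself; a C caller would declare:
    //   float dot_product(const float *a, const float *b, size_t len);
    let (a, b) = ([1.0f32, 2.0, 3.0], [4.0f32, 5.0, 6.0]);
    println!("{}", dot_product(a.as_ptr(), b.as_ptr(), 3)); // 32
}
```

Compiled as a `cdylib`, such a function can replace an equivalent C++ hot path one symbol at a time, which is the incremental adoption story the Interop Initiative is meant to smooth.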


Advancements in model architecture, the availability of vast training data, powerful computing resources, and real-world incentives are driving the rapid progression of artificial intelligence, and its continued growth is undeniable. As stewards of a programming language that is admired for its security and scalability, the Rust Foundation believes that Rust can play a vital role in developing practical, secure, and sustainable AI solutions.

We invite organizations, initiatives, and individuals passionate about building innovative AI solutions with Rust to connect with us. If you have ideas for meaningful and responsible collaboration in this space, we would love to hear from you.

contact@rustfoundation.org