MZLA Technologies, a subsidiary of the Mozilla Foundation and the organization behind Thunderbird, has introduced Thunderbolt, an open-source AI client for organizations that prefer to run AI on their own infrastructure rather than depend on third-party hosted services.
According to the project’s documentation, Thunderbolt is an AI client that lets users switch between different modes of use and connect to different model providers. The repository lists Chat Mode, Search Mode, Research Mode (preview), Tasks (preview), custom models and providers, Google and Microsoft integrations, Ollama compatibility, MCP support (preview), and OIDC support.
Put simply, Thunderbolt is meant to serve as the front end where an organization’s users interact with AI for chat, search, research workflows, and task-based automation, while the back end connects to whichever models and systems the organization chooses.
Moreover, Thunderbolt is designed to work with multiple model sources rather than a single fixed AI vendor. The documentation states that LLM calls are routed through a backend inference proxy and lists support for providers such as Anthropic, OpenAI, Mistral, and OpenRouter, while the roadmap separately lists Ollama compatibility, which points to support for locally hosted models as well.
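To make the local-model angle concrete, here is a minimal sketch of the kind of setup that Ollama compatibility typically implies: Ollama exposes an OpenAI-compatible endpoint on the local machine, so a client can send the same chat-completion requests to a self-hosted model that it would otherwise send to a cloud provider. This example is illustrative only and is not taken from the Thunderbolt codebase; the endpoint and model name are standard Ollama defaults, not Thunderbolt configuration.

```python
# Illustrative sketch: talking to a locally hosted model via Ollama's
# OpenAI-compatible endpoint. Not Thunderbolt code.
from openai import OpenAI

# Ollama listens on localhost:11434 by default and ignores the API key,
# but the client library requires one to be set.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3.2",  # any model already pulled into the local Ollama instance
    messages=[{"role": "user", "content": "Summarize this week's team updates."}],
)
print(response.choices[0].message.content)
```

Because the request format is the same whether it targets a hosted provider or a local endpoint, an inference proxy of the kind Thunderbolt describes can swap back ends by changing the base URL and model name rather than the client code.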
According to the project’s GitHub repository, Thunderbolt is available for web, Linux, Windows, macOS, iOS, and Android. It supports frontier, local, and on-premises models, emphasizing model choice, data ownership, and avoidance of vendor lock-in.
Keep in mind, however, that Thunderbolt is not yet a finished mass-market product. The GitHub repository notes that the project is under active development, undergoing a security audit, and preparing for enterprise production readiness. While the code is public and the product has been officially introduced, the platform is still working toward enterprise-ready status.
For more details, visit the official Thunderbolt website.
