6 Mobile Apps for Running Local LLMs (Offline)

7 Min Read

Ever wished you could have a super-powered AI assistant chilling on your phone, ready to answer your questions and whip up creative text formats on the fly? Yeah, us too. The problem? Those fancy Large Language Models (LLMs) usually require a beefy computer and an internet connection. But what if we told you there's a way to bring that AI magic to your smartphone without breaking the bank or becoming a tech whiz? Check out these six apps that can turn your phone into a local LLM powerhouse!

Apps to Run LLM on Your Smartphone Locally

Layla & Layla Lite


Meet Layla! This app lets you run powerful AI right on your phone, without needing internet. It's super private and always there for you, anywhere.

Think of it like having a personal AI assistant, but it works offline! There are two versions: Layla Full for newer phones with lots of memory, and Layla Lite for older phones.

Even though it's still new (they update it constantly!), Layla is already pretty impressive. In the future, they plan on making it work even better with your phone's features and adding more cool stuff.

Basically, if you want an AI sidekick for your phone, check out Layla or Layla Lite. Whether you go full power or lite depends on your phone.


ChatterUI

ChatterUI is a mobile frontend for managing chat files and character cards. It supports several backends, including KoboldAI, AI Horde, text-generation-webui, Mancer, and local text completion via llama.rn, and it links against the ggml library to run LLaMA models directly on the device. That makes it a great tool for experimenting with different models on your smartphone without a constant internet connection.

Keep in mind that ChatterUI is still experimental, so updates may wipe your chat histories. On the upside, its backend flexibility lets you pick whichever model best suits your needs, and its user-friendly interface makes it an approachable way to explore what local LLMs can do.


Maid
  • UI: Character-based; one of the nicer interfaces here.
  • Pros:
    • Cross-platform compatibility (Windows, Linux, Android).
    • Supports both local and remote operations.
    • Detailed instructions for each mode of operation.
  • Cons:
    • Not yet available for macOS and iOS

Maid is a cross-platform Flutter app that interfaces with GGUF/llama.cpp models locally, and with Ollama and OpenAI models remotely. The app is designed for use on multiple platforms, including Windows, Linux, and Android, though macOS and iOS releases are not yet available. It offers both local and remote modes of operation, with detailed instructions for each, and has been tested on a variety of devices and operating systems. However, it's important to note that Maid does not provide the LLaMA models themselves; it is only an environment for running them.
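Maid's remote mode talks to an Ollama server over its REST API. As a rough illustration, here is a minimal, stdlib-only Python sketch of the kind of request such a client sends. The model name is an assumption (use whatever you've pulled with `ollama pull`), the address uses Ollama's default port 11434, and the actual network call is kept in a separate function so the snippet runs even without a server up.

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_generate_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {
        "model": model,    # e.g. a model you've pulled with `ollama pull`
        "prompt": prompt,
        "stream": False,   # ask for one JSON response instead of a stream
    }

def send(payload: dict) -> dict:
    """POST the payload to a running Ollama server (requires one to be up)."""
    req = request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

payload = build_generate_payload("llama3", "Why is the sky blue?")
print(json.dumps(payload))
```

With a server running, `send(payload)["response"]` would hold the generated text; apps like Maid wrap exactly this kind of exchange behind their chat UI.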


MLC LLM
  • UI: Extremely minimal.
  • Pros:
    • Universal solution for various hardware backends and native applications.
    • Supports a wide range of operating systems and web browsers.
  • Cons:
    • Model license considerations for pre-packaged demos

MLC LLM is a universal solution that allows any language model to be deployed natively on a variety of hardware backends and native applications. It supports iOS, Android, Windows, Linux, macOS, and web browsers. The iOS app, MLCChat, is available for iPhone and iPad, and an Android demo APK is also available for download. MLC LLM's C++ interface supports a range of GPUs. Additionally, WebLLM is a companion project that runs MLC LLM natively in browsers using WebGPU and WebAssembly. However, it's important to be aware of the model licenses associated with the pre-packaged demos.


Sherpa

  • UI: Very minimal but with adequate features.
  • Pros:
    • Mobile implementation of llama.cpp model.
    • Supports multiple devices.
    • Functions as an offline chatbot

Sherpa is a mobile implementation of the llama.cpp model, functioning as a demo app that creates an offline chatbot similar to OpenAI's ChatGPT. It works with Vicuna and other recent models and supports multiple platforms, including Windows, Mac, and Android. The app is developed using Flutter and requires downloading the 7B LLaMA model from Meta for research purposes. Sherpa serves as a demo of the model's capabilities and does not provide the LLaMA models themselves.
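Whether a 7B model is actually usable on a given phone comes down mostly to memory: the weights alone take roughly parameter count × bits per weight ÷ 8 bytes. A quick back-of-the-envelope calculation (ignoring runtime overhead like the KV cache) shows why quantized variants are the norm on mobile hardware:

```python
def model_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate weight size in GB: params * bits / 8, overhead ignored."""
    return n_params * bits_per_weight / 8 / 1e9

PARAMS_7B = 7e9
for label, bits in [("FP16", 16), ("8-bit", 8), ("4-bit", 4)]:
    print(f"{label}: ~{model_size_gb(PARAMS_7B, bits):.1f} GB")
# FP16: ~14.0 GB, 8-bit: ~7.0 GB, 4-bit: ~3.5 GB
```

So a 4-bit quantization squeezes a 7B model into roughly 3.5 GB of weights, which is why phones with 6–8 GB of RAM are the realistic floor for this class of app.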

Jan (Coming Soon)

  • UI: The best of the apps compared here.
  • Pros:
    • Designed for 100% offline use.
    • Customizable AI assistants and features like global hotkeys.
    • Focus on privacy with secure data storage.
  • Cons:
    • Still in development, may contain bugs

Jan is an open-source ChatGPT alternative designed to run 100% offline on your computer. It is currently in development and may have bugs. Jan aims to boost productivity with customizable AI assistants, global hotkeys, and in-line AI. It is built with an open-source focus and plans to offer a mobile app for seamless integration into mobile workflows. Additionally, Jan provides an OpenAI-equivalent API server for compatibility with other apps. It emphasizes privacy with local data storage that is secure, exportable, and can be deleted at any time.
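Because Jan exposes an OpenAI-equivalent API server, existing OpenAI-style client code can target it just by swapping the base URL. Here's a hedged, stdlib-only sketch of what such a request looks like; the port, model name, and URL are assumptions (check Jan's local server settings for the real values), and the network call is kept separate so the snippet runs on its own.

```python
import json
from urllib import request

# Assumed local endpoint; verify the port in Jan's API server settings.
JAN_BASE_URL = "http://localhost:1337/v1"

def build_chat_request(model: str, user_message: str):
    """Build an OpenAI-style chat completion request for a local server."""
    url = f"{JAN_BASE_URL}/chat/completions"
    body = {
        "model": model,  # name of a model you've downloaded in Jan
        "messages": [{"role": "user", "content": user_message}],
    }
    return url, body

def send(url: str, body: dict) -> dict:
    """POST to the local server (requires Jan's API server to be running)."""
    req = request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

url, body = build_chat_request("mistral-7b", "Summarize this note for me.")
print(url)
```

The payoff of the OpenAI-compatible shape is that any tool already speaking that protocol can point at Jan instead of the cloud, keeping requests on your machine.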

SK is a versatile writer deeply passionate about anime, evolution, storytelling, art, AI, game development, and VFX. His writings transcend genres, exploring these interests and more. Dive into his captivating world of words and explore the depths of his creative universe.