5 Best Apps to Run LLM on Your Smartphone Locally


Ready to break free from the cloud and unleash the AI powerhouse tucked inside your smartphone? We’re peeling back the curtain on the world of locally runnable LLMs – large language models that live right on your phone, bringing the next era of AI into your pocket.

This ain’t your grandma’s chatbot! Forget laggy connections and data privacy worries. This guide unveils the coolest apps that let you chat with cutting-edge AI, generate custom text formats, and unlock a treasure trove of offline creative powers – all without ever needing a Wi-Fi signal.

Apps to Run LLM on Your Smartphone Locally

ChatterUI


ChatterUI is a mobile frontend for managing chat files and character cards. It supports several backends, including KoboldAI, AI Horde, text-generation-webui, and Mancer, and it can also run models entirely on-device through llama.rn, its local text-completion backend built on llama.cpp/GGML. That makes it a handy option for experimenting with LLaMA-family models on your phone without a constant internet connection.

Keep in mind that ChatterUI is still experimental, so an update may wipe your chat histories. On the upside, its backend flexibility lets you pick whichever model source suits your needs, and its straightforward interface makes it an approachable way to explore what local LLMs can do.
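
Several of ChatterUI’s remote backends (text-generation-webui among them) expose OpenAI-compatible endpoints, which is essentially what the app talks to when you point it at a self-hosted server. Here is a rough sketch of that interaction; the port and model name are assumptions, so adjust them to whatever your own server uses.

```python
# Rough sketch of the request a chat frontend sends to a self-hosted,
# OpenAI-compatible backend. The port and model name are assumptions;
# adjust them to match your own server.
import requests

resp = requests.post(
    "http://localhost:5000/v1/chat/completions",
    json={
        "model": "local-model",  # placeholder; many local servers ignore this
        "messages": [
            {"role": "user", "content": "Write a one-line greeting for a character card."}
        ],
        "max_tokens": 64,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```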

Maid

  • UI: Character-based, with one of the cleaner interfaces of the group.
  • Pros:
    • Cross-platform compatibility (Windows, Linux, Android).
    • Supports both local and remote operations.
    • Detailed instructions for each mode of operation.
  • Cons:
    • No macOS or iOS releases yet.

Maid is a cross-platform Flutter app that runs GGUF/llama.cpp models locally and connects to Ollama and OpenAI models remotely. It targets multiple platforms, including Windows, Linux, and Android, though macOS and iOS releases are not yet available. Both local and remote modes are supported, each with detailed setup instructions, and the app has been tested across a range of devices and operating systems. Note that Maid does not bundle any LLaMA models itself; it is purely an environment for running them.
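
To give a feel for Maid’s remote mode, here is a minimal sketch of the kind of request a client makes to a local Ollama server. It assumes Ollama is running on its default port (11434) and that a model has already been pulled; the model name llama3 is just an example.

```python
import requests

# Ask a locally running Ollama server for a completion.
# Assumes `ollama pull llama3` has already been run and the
# server is listening on its default port 11434.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",   # example model name
        "prompt": "Explain what a GGUF file is in one sentence.",
        "stream": False,     # return the full answer at once
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```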

MLC LLM

  • UI: Extremely minimal.
  • Pros:
    • Universal solution for various hardware backends and native applications.
    • Supports a wide range of operating systems and web browsers.
  • Cons:
    • Model license considerations for the pre-packaged demos.

MLC LLM is a universal solution for deploying language models natively across a wide range of hardware backends and native applications. It supports iOS, Android, Windows, Linux, macOS, and web browsers. The iOS app, MLCChat, is available for iPhone and iPad, and an Android demo APK can be downloaded as well. The project’s C++ interface supports a variety of GPUs, and a companion project, WebLLM, runs MLC LLM directly in the browser using WebGPU and WebAssembly. Just be aware of the model licenses attached to the pre-packaged demos.
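
The project also offers a Python interface alongside the C++ one, which gives a sense of how the same compiled models can be driven from code on the desktop. The sketch below is illustrative only: the MLCEngine class and the model identifier are assumptions based on the project’s OpenAI-style quickstart and may differ between versions.

```python
# Illustrative only: the class name and model ID are assumptions based on
# MLC LLM's documented Python quickstart and may vary between versions.
from mlc_llm import MLCEngine

model = "HF://mlc-ai/Llama-3-8B-Instruct-q4f16_1-MLC"  # example model ID
engine = MLCEngine(model)

response = engine.chat.completions.create(
    messages=[{"role": "user", "content": "What is WebGPU?"}],
    model=model,
    stream=False,
)
print(response.choices[0].message.content)

engine.terminate()  # release the resources held by the engine
```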

Sherpa

  • UI: Very minimal but with adequate features.
  • Pros:
    • Mobile implementation of llama.cpp.
    • Supports multiple devices.
    • Functions as an offline chatbot.

Sherpa is a mobile implementation of llama.cpp that serves as a demo app for building an offline chatbot along the lines of OpenAI’s ChatGPT. It works with Vicuna and other recent models and supports multiple platforms, including Windows, Mac, and Android. The app is built with Flutter and requires you to download the 7B LLaMA model from Meta (released for research purposes) yourself; like the others on this list, Sherpa showcases the model’s capabilities but does not ship the LLaMA weights.
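
Under the hood, llama.cpp-based apps like Sherpa and Maid wrap the same inference loop you can try on a desktop first with the llama-cpp-python bindings. This sketch assumes you have already downloaded a GGUF model file; the path shown is a placeholder.

```python
# Desktop analogue of what llama.cpp-based mobile apps do on-device.
# Requires: pip install llama-cpp-python, plus a GGUF model file on disk.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-7b.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,     # context window
    n_threads=4,    # CPU threads; tune for your hardware
)

out = llm(
    "Q: Name three uses for an offline chatbot.\nA:",
    max_tokens=128,
    stop=["Q:"],
)
print(out["choices"][0]["text"].strip())
```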

Jan (Coming Soon)

  • UI: The most polished of the apps covered here.
  • Pros:
    • Designed for 100% offline use.
    • Customizable AI assistants and features like global hotkeys.
    • Focus on privacy with secure data storage.
  • Cons:
    • Still in development; may contain bugs.

Jan is an open-source ChatGPT alternative designed to run 100% offline on your computer. It is still in development, so expect some bugs. Jan aims to boost productivity with customizable AI assistants, global hotkeys, and in-line AI, and the team plans to ship a mobile app for seamless integration into mobile workflows. It also provides an OpenAI-equivalent API server so other apps can plug into it, and it emphasizes privacy: your data is stored locally, kept secure, and can be exported or deleted at any time.
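
Because Jan exposes an OpenAI-equivalent API, existing OpenAI client code can usually be pointed at it with nothing more than a base-URL change. The sketch below assumes the local server is enabled; the port and model name are placeholders you would need to check against your own Jan settings.

```python
# Point the standard OpenAI client at a local OpenAI-compatible server.
# The base_url port and model name below are placeholders; check the
# local-server settings in Jan for the actual values on your machine.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1337/v1",  # placeholder local endpoint
    api_key="not-needed-for-local",       # local servers usually ignore this
)

chat = client.chat.completions.create(
    model="mistral-7b-instruct",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize why local LLMs help privacy."}],
)
print(chat.choices[0].message.content)
```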
