Invoke - Local LLM Client

Invoke - Local LLM Client: Private AI Chat Powered by Your Own Hardware

The seamless mobile interface for Ollama and LM Studio. Secure, fully offline, and privacy-focused AI interactions within your local network.

Publisher

kazuhiko sugimoto

Category

Developer Tools

Downloads

2K+

User Rating

3.2/5

Total Ratings

0

Locales

2

The Private Interface

Discover the interface used by 2K+ users.

Configuration screens for LM Studio and Ollama showing how to enable network access for local LLM connections.
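The network-access step shown in these screenshots matters because both servers listen only on localhost by default. A minimal sketch for Ollama (`OLLAMA_HOST` is Ollama's documented environment variable; the LAN address `192.168.1.10` is a placeholder for your machine's actual address); for LM Studio, the equivalent is enabling the serve-on-local-network option in its server settings:

```shell
# Expose Ollama on all interfaces instead of 127.0.0.1 only
OLLAMA_HOST=0.0.0.0 ollama serve

# Verify reachability from another device on the LAN
# (replace 192.168.1.10 with the host machine's address)
curl http://192.168.1.10:11434/api/tags
```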

Server settings screen in the Invoke app showing Ollama configuration and language model selection

A chat conversation between a user and the gemma3-1b local model within the Invoke app interface

Master Your Private AI Ecosystem

The tools that make this app stand out, trusted by 2K+ users.

🔒

Absolute Data Sovereignty

Keep your conversations entirely within your local network. No external APIs, no cloud tracking, and zero data leakage to third parties.

Seamless Server Sync

Instantly connect to Ollama or LM Studio. Switch between models like Llama 3 or Mistral on the fly with real-time message streaming.

📱

The Lean-Back Experience

Untether from your workstation. Access your home lab hardware from your couch with a fluid, dark-mode optimized chat interface.
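The server and model switching described above can be sketched as a small request builder. The endpoint paths and default ports (Ollama's `/api/chat` on 11434, LM Studio's OpenAI-compatible `/v1/chat/completions` on 1234) match those servers' published HTTP APIs, but `build_chat_request` itself is a hypothetical helper, not the app's actual code:

```python
import json

# Default ports and chat endpoints per the public Ollama and
# LM Studio (OpenAI-compatible) HTTP APIs.
DEFAULT_PORTS = {"ollama": 11434, "lmstudio": 1234}
CHAT_PATHS = {"ollama": "/api/chat", "lmstudio": "/v1/chat/completions"}

def build_chat_request(backend, host, model, messages, stream=True):
    """Return (url, json_body) for a streaming chat call.

    Illustrative helper: switching `model` here is all it takes to
    move between e.g. Llama 3 and Mistral on the same server.
    """
    if backend not in CHAT_PATHS:
        raise ValueError(f"unknown backend: {backend}")
    url = f"http://{host}:{DEFAULT_PORTS[backend]}{CHAT_PATHS[backend]}"
    body = json.dumps({"model": model, "messages": messages, "stream": stream})
    return url, body

url, body = build_chat_request(
    "ollama", "192.168.1.10", "llama3",
    [{"role": "user", "content": "Hello!"}],
)
print(url)  # http://192.168.1.10:11434/api/chat
```

Because both backends accept the same `model`/`messages`/`stream` JSON shape, a client can keep one payload format and vary only the URL.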

About the app

Everything you need to know about Invoke - Local LLM Client.

Description

Note: This app is intended for users who are able to set up a local LLM server (Ollama or LM Studio) within their own LAN environment. Some technical setup is required.

Chat with your local LLM! Seamlessly connect to Ollama or LM Studio for a fully offline, privacy-focused AI chat experience!

This iOS app connects to a locally hosted Large Language Model (LLM) server and enables seamless, natural conversations. Compatible with Ollama and LM Studio via HTTP, it provides real-time message streaming and intuitive chat history management. The app operates entirely within a local network, with no internet connection required, making it ideal for those who prioritize privacy and security.

Key Features:
- Easy connection to local LLM servers (Ollama / LM Studio)
- Natural chat UI with bubble-style layout
- Auto-saving and browsing of chat history
- Server and model selection via settings screen
- Supports Dark Mode
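The real-time streaming mentioned in the description works, on Ollama's side, by sending newline-delimited JSON chunks over HTTP, each carrying a fragment of the assistant's reply. A minimal sketch of how a client might reassemble those chunks (the chunk shape follows Ollama's documented `/api/chat` streaming format; `accumulate_stream` is an illustrative name, not the app's code):

```python
import json

def accumulate_stream(ndjson_lines):
    """Concatenate content fragments from Ollama-style NDJSON chat chunks.

    Each chunk looks like {"message": {"role": "assistant",
    "content": "..."}, "done": false}; the final chunk has "done": true.
    """
    text = []
    for line in ndjson_lines:
        if not line.strip():
            continue
        chunk = json.loads(line)
        text.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            break
    return "".join(text)

# Simulated stream of three chunks, as the server would emit them
chunks = [
    '{"message": {"role": "assistant", "content": "Hel"}, "done": false}',
    '{"message": {"role": "assistant", "content": "lo!"}, "done": false}',
    '{"message": {"role": "assistant", "content": ""}, "done": true}',
]
print(accumulate_stream(chunks))  # Hello!
```

In a live client the same loop would run over the HTTP response body line by line, appending each fragment to the chat bubble as it arrives.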

Latest Version

1.1.1

Size

2.7 MB

First Released

Aug 3, 2025

Command Your Local Models From Anywhere

Bridge the gap between your workstation and your mobile device. Experience a polished, dark-mode chat UI for your Ollama or LM Studio setup—no data ever leaves your LAN.

Download on App Store

App information, icons, screenshots, and descriptions displayed on this page are sourced from the Apple App Store and are the property of their respective developers. Download estimates and rankings are based on MWM's proprietary models and may not reflect actual figures. This page is provided for informational and analytical purposes only.

Believe this page infringes your intellectual property? File a dispute