Invoke - Local LLM Client

by Kazuhiko Sugimoto

What is it about?

Note: This app is intended for users who are able to set up a local LLM server (Ollama or LM Studio) within their own LAN environment. Some technical setup is required.

App Details

Version: 1.0.1
Rating: N/A
Size: 2 MB
Genre: Utilities, Developer Tools
Last updated: August 7, 2025
Release date: August 3, 2025

App Store Description

Note: This app is intended for users who are able to set up a local LLM server (Ollama or LM Studio) within their own LAN environment. Some technical setup is required.

Chat with your local LLM! Seamlessly connect to Ollama or LM Studio for a fully offline, privacy-focused AI chat experience!

This iOS app connects to a locally hosted Large Language Model (LLM) server and enables seamless, natural conversations.
Compatible with Ollama and LM Studio via HTTP, it provides real-time message streaming and intuitive chat history management.
The app operates entirely within a local network—no internet connection required—making it ideal for those who prioritize privacy and security.
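Both Ollama and LM Studio expose simple HTTP chat endpoints on the local network. As an illustration of the kind of traffic such an app handles, the sketch below builds a request in the shape Ollama's documented /api/chat endpoint expects and reassembles a streamed reply from newline-delimited JSON chunks. The endpoint path and field names follow Ollama's public API; the sample chunks are invented for demonstration, and this is not the app's actual code.

```python
import json

# Build a chat request in the shape Ollama's /api/chat endpoint expects.
# (You would POST this to http://<server>:11434/api/chat on your LAN.)
def build_chat_request(model, user_message):
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": True,  # ask the server to stream the reply piece by piece
    })

# With stream=True the server sends one JSON object per line; each carries
# a partial assistant message, and the final chunk has "done": true.
def assemble_stream(lines):
    reply = []
    for line in lines:
        chunk = json.loads(line)
        if chunk.get("done"):
            break
        reply.append(chunk["message"]["content"])
    return "".join(reply)

# Invented sample chunks, shaped like Ollama's streaming output.
sample = [
    '{"message": {"role": "assistant", "content": "Hello"}, "done": false}',
    '{"message": {"role": "assistant", "content": ", world!"}, "done": false}',
    '{"done": true}',
]
print(assemble_stream(sample))  # -> Hello, world!
```

Rendering each partial chunk as it arrives is what produces the real-time "typing" effect in the chat UI.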

Key Features:
- Easy connection to local LLM servers (Ollama / LM Studio)
- Natural chat UI with bubble-style layout
- Auto-saving and browsing chat history
- Server and model selection via settings screen
- Supports Dark Mode
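The model-selection feature presumably relies on the server's model-listing endpoint. On Ollama this is GET /api/tags, which returns a JSON object with a `models` array; a minimal sketch of parsing it (the response sample is invented, but follows the documented shape):

```python
import json

# Ollama's GET /api/tags returns {"models": [{"name": ...}, ...]};
# extracting the names gives the choices for a model-selection screen.
def model_names(tags_json):
    return [m["name"] for m in json.loads(tags_json)["models"]]

# Invented sample response in the documented shape.
sample_response = '{"models": [{"name": "llama3:8b"}, {"name": "mistral:7b"}]}'
print(model_names(sample_response))  # -> ['llama3:8b', 'mistral:7b']
```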
