Invoke - Local LLM Client
What is it about?
Note: This app is intended for users who are able to set up a local LLM server (Ollama or LM Studio) within their own LAN environment. Some technical setup is required.
App Store Description
Chat with your local LLM! Seamlessly connect to Ollama or LM Studio for a fully offline, privacy-focused AI chat experience!
This iOS app connects to a locally hosted Large Language Model (LLM) server and enables seamless, natural conversations.
Compatible with Ollama and LM Studio via HTTP, it provides real-time message streaming and intuitive chat history management.
The app operates entirely within a local network—no internet connection required—making it ideal for those who prioritize privacy and security.
Key Features:
- Easy connection to local LLM servers (Ollama / LM Studio)
- Natural chat UI with bubble-style layout
- Automatic saving and browsing of chat history
- Server and model selection via settings screen
- Supports Dark Mode
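For context on how a client like this talks to a local server, the sketch below shows the shape of a streaming chat exchange with Ollama's `/api/chat` endpoint (the server address, model name, and sample response chunk are illustrative assumptions, not values from this app):

```python
import json

# Hypothetical LAN address for an Ollama server; the default port is 11434.
OLLAMA_CHAT_URL = "http://192.168.1.50:11434/api/chat"

def build_chat_request(model, history, prompt):
    """Build the JSON body for Ollama's /api/chat endpoint with streaming enabled."""
    messages = history + [{"role": "user", "content": prompt}]
    return {"model": model, "messages": messages, "stream": True}

def parse_stream_line(line):
    """Ollama streams one JSON object per line; return (text_delta, done_flag)."""
    chunk = json.loads(line)
    return chunk.get("message", {}).get("content", ""), chunk.get("done", False)

# Example: first request of a conversation.
body = build_chat_request("llama3", [], "Hello!")

# Example streamed line as Ollama emits it (sample data, not a live response).
delta, done = parse_stream_line(
    '{"message":{"role":"assistant","content":"Hi"},"done":false}'
)
```

LM Studio instead exposes an OpenAI-compatible `/v1/chat/completions` endpoint, so a client supporting both servers typically keeps two request builders behind one chat interface.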