AppFlowy Local AI - Ollama
AppFlowy leverages Ollama integration to provide local AI functionality, allowing you to run AI features directly on your device without relying on cloud services. This means you can enjoy the power of AI while keeping your data fully private and secure.
Why use AppFlowy Local AI?
Maximum privacy and security – Your data is not sent to AI service providers
Stay in the flow – Get powerful AI assistance without interruptions
All-in-one experience – Enjoy AppFlowy's rich features without switching tools
⚠️ For optimal performance, ensure your system has the following minimum RAM:
Setup Process Overview
The setup involves three main components:
Part I: Installing and configuring Ollama with the required models
Part II: Installing the AppFlowy Local AI plugin
Part III: Enabling the plugin in the AppFlowy desktop application (supports macOS, Windows, and Linux)
Part I: Install Ollama
Simply download and install Ollama on your desktop device.
You can verify the installation by running the version command in your terminal:
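A quick check that the Ollama binary is on your PATH:

```shell
# Print the installed Ollama version; a version string confirms the install succeeded
ollama --version
```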
Download Required Models
By default, we use the following models:
llama3.1
nomic-embed-text
Run these commands in your terminal to download the models:
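The two default models can be pulled like this:

```shell
# Download the default chat model
ollama pull llama3.1

# Download the default embedding model
ollama pull nomic-embed-text
```

Model downloads are several gigabytes, so the first pull may take a while depending on your connection.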
Check out this video tutorial to learn more about how to use Ollama.
Verify the Download:
Check installed models with:
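To list the models available to your local Ollama instance:

```shell
# Show all models downloaded to this machine
ollama list
```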
You should see llama3.1 and nomic-embed-text in the output.
Start the Ollama Server (Optional)
After installing Ollama, the server starts automatically. To view its logs, first stop the running server, then open a new terminal window and run:
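Running the server in the foreground keeps its logs visible in the terminal:

```shell
# Start the Ollama server in the foreground (logs print to this terminal)
ollama serve
```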
⚠️ Keep the Server Running:
The terminal window where you run ollama serve must remain open. For production setups, run Ollama as a background service (e.g., use systemd on Linux, or nohup ollama serve &).
Part II: Install AppFlowy Local AI (AppFlowy LAI)
LAI is pronounced “lay”
macOS
Step 1: Download the Application
Visit the AppFlowy-LocalAI releases page.
Download the latest version of AppFlowy LAI.
Step 2: Install
Unzip the downloaded file.
Drag AppFlowy LAI to your Applications folder.
Launch AppFlowy LAI from your Applications folder.
Important: Do not move or delete the AppFlowy LAI application if you plan to continue using local AI features in AppFlowy.
Step 3: Install Application
Step 4: Verify Installation (Optional)
Open a new terminal window.
Run this command to confirm the AI plugin is accessible:
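Assuming the macOS plugin binary is named ollama_ai_plugin (matching the ollama_ai_plugin.exe referenced in the Windows section below), a lookup along these lines should print its installation path:

```shell
# Locate the AppFlowy LAI plugin binary on your PATH
# (the binary name is assumed from the Windows section of this guide)
which ollama_ai_plugin
```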
✅ Expected Output:
Windows
Step 1: Download the Application
Visit the AppFlowy-LocalAI releases page to find the latest Windows version.
Download the latest release suitable for Windows (typically a .zip file).
Step 2: Extract the Application
Locate the downloaded zip file (AppFlowyLAI.zip) in your Downloads folder.
Right-click the file and select Extract All to unzip it to your preferred location.
Step 3: Launch AppFlowy LAI
Navigate to the extracted folder (AppFlowyLAI).
Double-click the executable file (AppFlowyLAI.exe) to launch the application.
Important: Do not move or delete the AppFlowyLAI.exe file or its containing folder if you intend to keep using local AI features.
Step 4: Verify Installation (Optional)
Open a new PowerShell tab or window.
Verify the installation path of ollama_ai_plugin.exe:
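In PowerShell, the Get-Command cmdlet resolves an executable's location; a check along these lines (the plugin name is taken from the step above) should print the path:

```powershell
# Resolve the full path of the plugin executable from PATH
(Get-Command ollama_ai_plugin.exe).Source
```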
Expected output:
Ensure the output matches the location of your extracted AppFlowyLAI.exe.
Part III: Enable AppFlowy LAI
By default, AppFlowy LAI is disabled. To enable it, open the settings page in AppFlowy Desktop and toggle the local AI option.
Once enabled, the application will begin initializing AppFlowy LAI. Make sure Ollama is running!
After a few seconds—or minutes, depending on your machine’s performance—you’ll see that AppFlowy LAI is running.
With Local AI toggled on, all available AI features use local AI models. To use an online model such as GPT-4o, you need to turn off Local AI.
Q & A
1. How do I use other models?
If you want to use additional models, first download (pull) them and then update the configuration in the settings page of AppFlowy.
To use the deepseek-r1 model, first download it with the following command:
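For example:

```shell
# Pull the deepseek-r1 model so it can be selected as the chat model in AppFlowy
ollama pull deepseek-r1
```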
Once the download completes, update the configuration in the AppFlowy settings page by renaming the chat model from llama3.1 to deepseek-r1.
If you attempt to use a model that hasn't been downloaded, AppFlowy will display a "model not found" message. You can download the model and then reopen Settings.
2. What features does AppFlowy LAI support?
AppFlowy LAI supports the following features:
AI Chat (refer to this guide for more details)
Please note that text-to-image is not yet available in AppFlowy LAI.
AI Writers in Document
Ask AI anything
Summarize
Fix spelling & grammar
Continue writing
Improve writing
Explain
Make longer
Make shorter
Upcoming features:
Chat with PDF
Chat with image
Chat with local files