AppFlowy Local AI - Ollama
AppFlowy leverages Ollama integration to provide local AI functionality, allowing you to run AI features directly on your device without relying on cloud services. This means you can enjoy the power of AI while keeping your data fully private and secure.
Why use AppFlowy Local AI?
Maximum privacy and security – Your data is not sent to AI service providers
Stay in the flow – Get powerful AI assistance without interruptions
All-in-one experience – Enjoy AppFlowy's rich features without switching tools
⚠️ For optimal performance, ensure your system has the following minimum RAM:
The setup involves three main components:
Part I: Installing and configuring Ollama with the required models
Part II: Installing the AppFlowy Local AI plugin
Part III: Enabling the plugin in the AppFlowy application (supports macOS, Windows, and Linux)
Simply download and install Ollama on your desktop device.
You can verify the installation in the terminal by running:
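The version check uses Ollama's standard CLI flag:

```shell
ollama -v
```

If the command prints a version number, Ollama is installed correctly.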
By default, we use the following models:
llama3.1
nomic-embed-text
Run these commands in your terminal to download the models:
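With the default models listed above, the pull commands are:

```shell
ollama pull llama3.1
ollama pull nomic-embed-text
```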
Verify the Download:
Check installed models with:
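Using Ollama's standard CLI, this is:

```shell
ollama list
```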
You should see llama3.1 and nomic-embed-text in the output.
After installing Ollama, the server starts automatically. To view its logs, first stop the server and then start a new terminal window and run:
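How to stop the auto-started server depends on your platform (e.g., quitting the Ollama menu-bar app on macOS); the foreground server itself is started with:

```shell
ollama serve
```

While this runs, its logs stream directly to the terminal.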
⚠️ Keep the Server Running:
The terminal window where you run ollama serve must remain open.
For production setups, run Ollama as a background service (e.g., use systemd on Linux or nohup ollama serve &).
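As a sketch of the systemd approach (the unit path, binary path, and user are assumptions; adjust them for your install), a minimal unit file might look like:

```ini
# /etc/systemd/system/ollama.service (hypothetical path)
[Unit]
Description=Ollama local model server
After=network-online.target

[Service]
# Adjust the binary path to match your installation
ExecStart=/usr/local/bin/ollama serve
Restart=always
User=ollama

[Install]
WantedBy=multi-user.target
```

Enable and start it with `systemctl enable --now ollama`.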
LAI is pronounced “lay”
Download the latest version of AppFlowy LAI.
Unzip the downloaded file.
Drag AppFlowy LAI to your Applications folder.
Launch AppFlowy LAI from your Applications directory. Important: Do not move or delete the AppFlowy LAI application if you plan to continue using local AI features in AppFlowy.
Open a new terminal window.
Run this command to confirm the AI plugin is accessible:
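Assuming the plugin binary is named ollama_ai_plugin (as the Windows section suggests), the check would be:

```shell
which ollama_ai_plugin
```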
✅ Expected output: the path to the ollama_ai_plugin binary.
Download the latest release suitable for Windows (typically a .zip file).
Locate the downloaded zip file (AppFlowyLAI.zip) in your Downloads folder.
Right-click the file and select Extract All to unzip it to your preferred location.
Navigate to the extracted folder (AppFlowyLAI).
Double-click the executable file (AppFlowyLAI.exe) to launch the application.
Important: Do not move or delete the AppFlowyLAI.exe file or its containing folder if you intend to keep using local AI features.
Open a new PowerShell tab or window.
Verify the installation path of ollama_ai_plugin.exe:
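In PowerShell, one way to do this (assuming the plugin binary is on your PATH) is:

```powershell
Get-Command ollama_ai_plugin.exe
```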
Expected output:
Ensure the output matches the location of your extracted AppFlowyLAI.exe.
By default, AppFlowy LAI is disabled. To enable it, open the settings page in AppFlowy Desktop and toggle the local AI option.
Once enabled, the application will begin initializing AppFlowy LAI. Make sure Ollama is running!
After a few seconds—or minutes, depending on your machine’s performance—you’ll see that AppFlowy LAI is running.
With Local AI toggled on, all available AI features use local AI models. To use an online model such as GPT-4o, you need to turn off Local AI.
If you want to use additional models, first download (pull) them and then update the configuration in the settings page of AppFlowy.
To use the deepseek-r1 model, first download it with the following command:
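Following the same pull pattern as before:

```shell
ollama pull deepseek-r1
```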
Once the download completes, update the configuration in the AppFlowy settings page by renaming the chat model from llama3.1 to deepseek-r1.
If you attempt to use a model that hasn't been downloaded, AppFlowy will display a "model not found" message. You can download the model and then reopen Settings.
AppFlowy LAI supports the following features:
Please note text-to-image is not yet available in AppFlowy LAI.
AI Writers in Document
Ask AI anything
Summarize
Fix spelling & grammar
Continue writing
Improve writing
Explain
Make longer
Make shorter
Upcoming features:
Chat with PDF
Chat with image
Chat with local files
Check out the Ollama documentation to learn more about how to use Ollama.
Visit the AppFlowy LAI releases page to find the latest Windows version.
AI Chat