AppFlowy Local AI - Ollama

Last updated 2 months ago

AppFlowy leverages Ollama integration to provide local AI functionality, allowing you to run AI features directly on your device without relying on cloud services. This means you can enjoy the power of AI while keeping your data fully private and secure.

Why use AppFlowy Local AI?

  • Maximum privacy and security – Your data is not sent to AI service providers

  • Stay in the flow – Get powerful AI assistance without interruptions

  • All-in-one experience – Enjoy AppFlowy's rich features without switching tools

⚠️ For optimal performance, ensure your system has at least the following RAM:

  • 7B models: 8 GB

  • 13B models: 16 GB

  • 33B models: 32 GB
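If you're unsure how much memory your machine has, you can check from the terminal. These are standard operating-system utilities, not AppFlowy commands:

```shell
# Linux: report total RAM in GB (skipped if `free` is unavailable)
if command -v free >/dev/null; then
  free -g | awk '/^Mem:/ {print $2 " GB total"}'
fi

# macOS: hw.memsize reports bytes; convert to GB
if command -v sysctl >/dev/null && sysctl -n hw.memsize >/dev/null 2>&1; then
  sysctl -n hw.memsize | awk '{printf "%.0f GB total\n", $1/1024/1024/1024}'
fi
```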

Setup Process Overview

The setup involves three main components:

  • Part I: Installing and configuring Ollama with the required models

  • Part II: Installing the AppFlowy Local AI plugin

  • Part III: Enabling the plugin in the AppFlowy application (supports macOS, Windows, and Linux)

Part I: Install Ollama

Simply download and install Ollama on your desktop device.

You can verify the installation in the terminal by running

ollama --version

Download Required Models

By default, we use the following models:

  • llama3.1

  • nomic-embed-text

Run these commands in your terminal to download the models:

ollama pull llama3.1  
ollama pull nomic-embed-text

Verify the Download:

Check installed models with:

ollama list

You should see llama3.1 and nomic-embed-text in the output.
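If you'd like to script this check, here is a small sketch of a shell helper (the helper name and approach are our own convenience, not part of Ollama) that verifies the required models appear in the `ollama list` output:

```shell
# Hypothetical helper: succeed only if every required model name appears
# at the start of a line in the given `ollama list` output.
check_models() {
  listing="$1"; shift
  for model in "$@"; do
    if ! printf '%s\n' "$listing" | grep -q "^$model"; then
      echo "missing: $model"
      return 1
    fi
  done
  echo "all required models present"
}

# Usage: check_models "$(ollama list)" llama3.1 nomic-embed-text
```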


Start the Ollama Server (Optional)

After installing Ollama, the server starts automatically. To view its logs, first stop the running server, then open a new terminal window and run:

ollama serve

⚠️ Keep the Server Running:

  • The terminal window where you run ollama serve must remain open.

  • For production setups, run Ollama as a background service (e.g., use systemd on Linux or nohup ollama serve &).
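As a sketch, the systemd option above might look like the following minimal user unit. The unit name, path, and options here are illustrative assumptions, not files shipped by Ollama; adjust ExecStart to wherever ollama is installed:

```ini
# ~/.config/systemd/user/ollama.service (illustrative)
[Unit]
Description=Ollama server

[Service]
ExecStart=/usr/local/bin/ollama serve
Restart=on-failure

[Install]
WantedBy=default.target
```

Enable it with systemctl --user enable --now ollama.service.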

Part II: Install AppFlowy Local AI (AppFlowy LAI)

LAI is pronounced “lay”

macOS

Step 1: Download the Application

  1. Download the latest version of AppFlowy LAI.

Step 2: Install

  1. Unzip the downloaded file.

  2. Drag AppFlowy LAI to your Applications folder.

  3. Launch AppFlowy LAI from your Applications directory. Important: Do not move or delete the AppFlowy LAI application if you plan to continue using local AI features in AppFlowy.

Step 3: Verify Installation (Optional)

  1. Open a new terminal window.

  2. Run this command to confirm the AI plugin is accessible:

command -v ollama_ai_plugin  

✅ Expected Output:

/usr/local/bin/ollama_ai_plugin  

Windows

Step 1: Download the Application

  1. Download the latest release suitable for Windows (typically a .zip file).


Step 2: Extract the Application

  1. Locate the downloaded zip file (AppFlowyLAI.zip) in your Downloads folder.

  2. Right-click the file and select Extract All to unzip it to your preferred location.


Step 3: Launch AppFlowy LAI

  1. Navigate to the extracted folder (AppFlowyLAI).

  2. Double-click the executable file (AppFlowyLAI.exe) to launch the application.

Important: Do not move or delete the AppFlowyLAI.exe file or its containing folder if you intend to keep using local AI features.


Step 4: Verify Installation (Optional)

  1. Open a new PowerShell tab or window.

  2. Verify the installation path of ollama_ai_plugin.exe:

Get-Command ollama_ai_plugin | Select-Object -ExpandProperty Definition

Expected output:

C:\path\to\AppFlowyLAI\ollama_ai_plugin.exe

Ensure the output points to the ollama_ai_plugin.exe inside your extracted AppFlowyLAI folder.

Part III: Enable AppFlowy LAI

By default, AppFlowy LAI is disabled. To enable it, open the settings page in AppFlowy Desktop and toggle the local AI option.

Once enabled, the application will begin initializing AppFlowy LAI. Make sure Ollama is running!
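One quick way to confirm that Ollama is running is to query its HTTP API on the default port (11434):

```shell
# Prints the server version if Ollama is up; otherwise prints a warning.
curl -s http://localhost:11434/api/version || echo "Ollama is not running"
```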

After a few seconds—or minutes, depending on your machine’s performance—you’ll see that AppFlowy LAI is running.

With Local AI toggled on, all available AI features use local AI models. To use an online model such as GPT-4o, you need to turn off Local AI.

Q & A

1. How do I use other models?

If you want to use additional models, first download (pull) them and then update the configuration in the settings page of AppFlowy.

To use the deepseek-r1 model, first download it with the following command:

ollama pull deepseek-r1

Once the download completes, update the configuration in the AppFlowy settings page by changing the chat model from llama3.1 to deepseek-r1.

If you attempt to use a model that hasn't been downloaded, AppFlowy will display a "model not found" message. You can download the model and then reopen Settings.
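If you switch models often, a small shell helper (our own convenience, not an Ollama feature) can pull a model only when it isn't already downloaded:

```shell
# Hypothetical helper: pull a model only if `ollama list` doesn't already show it.
pull_if_missing() {
  if ollama list | grep -q "^$1"; then
    echo "$1 is already downloaded"
  else
    ollama pull "$1"
  fi
}

# Usage: pull_if_missing deepseek-r1
```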

2. What features does AppFlowy LAI support?

AppFlowy LAI supports the following features:

  • AI Chat

  • AI Writers in Document

    • Ask AI anything

    • Summarize

    • Fix spelling & grammar

    • Continue writing

    • Improve writing

    • Explain

    • Make longer

    • Make shorter

Please note that text-to-image is not yet available in AppFlowy LAI.

Upcoming features:

  • Chat with PDF

  • Chat with image

  • Chat with local files

Check out this video tutorial to learn more about how to use Ollama.

Visit the AppFlowy-LocalAI releases page to download the latest macOS version.

Visit the AppFlowy-LocalAI releases page to find the latest Windows version.