I’ve been looking at self-hosted workflow automation tooling with an eye on LLM and AI Agent capabilities.
One of the most popular tools for building LLM-powered agents and workflows right now is LangChain (GitHub repo). It comes in multiple flavors - Python and JavaScript frameworks, plus managed SaaS offerings and additional tooling directly from the developers of the project.
Several new tools have sprung up around this ecosystem, leveraging the open source Python and JS implementations. These aim to provide graphical workflow builders in contrast to what is normally a code-only experience - I feel this brings down the barrier to entry somewhat and lets people validate and iterate on their ideas a bit faster.
To that end, I’ve selected 3 tools I want to explore in more detail:
- LangFlow
- Flowise
- n8n
This is not an in-depth evaluation of these tools. That will come later. Probably.
Initial Thoughts
All three tools are GUI workflow / agent builders leveraging LangChain to varying degrees.
LangFlow and Flowise are the most focused on LLM agents - building LLM-powered apps is their primary purpose. They’re the most specialized for LLMs and give you options to integrate with or embed within other products.
On the other hand, you’ve got n8n - a business workflow automation platform which recently started placing an emphasis on ML / AI / LLM workflows. It’s got plenty of integrations for other software and services and appears quite mature.
I’m exploring all three right now, trying to figure out which one will let me build something useful. Will post an update on how that goes.
For anyone in a similar spot, I’m publishing my docker compose template below. It’s meant for spinning these up and having a quick look - don’t use it in production. It’s basic, not secured, and your data is at risk!
If you want to deploy one of these tools in production, please review their respective documentation and follow their recommendations.
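If you still want a small safety margin while experimenting, one cheap option is to publish the UI ports on the loopback interface only, so nothing outside your machine can reach them - the n8n service in the template below already does this. A minimal sketch of the same change for the other two services (same service and port names as the template):

```yaml
services:
  langflow-example:
    ports:
      # bind to 127.0.0.1 so the UI is only reachable from this machine
      - "127.0.0.1:7701:7860"
  flowise-example:
    ports:
      - "127.0.0.1:7702:7702"
```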
Setup
- Download and install Docker Desktop or a compatible alternative.
- Create a folder for your project.
- Inside this folder, create a `docker-compose.yaml` file.
- Populate the `.yaml` file with the docker compose config found below.
- Open a terminal of your choice.
- Navigate to your project directory.
- Run `docker compose up -d`. This will take a while to download, unpack, and deploy the images - about 6GB in total. (The full terminal sequence is shown after this list.)
- Go and explore, play with the tools.
- When you’re done, run `docker compose stop` to stop or `docker compose down` to delete.
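For reference, the whole thing in a terminal looks roughly like this (the folder name is just an example):

```sh
mkdir llm-playground && cd llm-playground
# create docker-compose.yaml here and paste in the config from below, then:
docker compose up -d      # pull the images (~6GB) and start everything in the background
docker compose stop       # later: stop the containers, keep the data
docker compose down       # or: remove the containers (volumes are kept unless you add -v)
```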
The three tools will be available at these local addresses:
- LangFlow - http://127.0.0.1:7701/
- Flowise - http://127.0.0.1:7702/
- n8n - http://127.0.0.1:7703/
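If a page doesn’t load straight away, give the containers a minute to initialise - you can also check from the terminal that each one is answering (each should return an HTTP status line once the app is up):

```sh
curl -sI http://127.0.0.1:7701/ | head -n 1   # LangFlow
curl -sI http://127.0.0.1:7702/ | head -n 1   # Flowise
curl -sI http://127.0.0.1:7703/ | head -n 1   # n8n
```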
Notes on docker usage, if you’re new:
- If you run the docker compose command with this config multiple times, Docker will refuse to spin up more containers - container names, volumes, and ports need to be unique. You must spin down the old containers first.
- `docker compose down` will spin down your containers, but will preserve volumes and the data they contain. Run `docker compose down -v` if you also want to delete the volumes.
- By default, docker compose will use the folder name as your project name. You can give your project a different name like so: `docker compose -p YourProjectName up -d`
- If you run `docker compose up` without the `-d` flag, the containers run in the foreground attached to your terminal and dump all their output / logs to the console. Don’t do it unless you’re debugging a container.
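A couple of standard compose commands that are handy while poking around:

```sh
docker compose ps                      # list this project's containers and their ports
docker compose logs -f n8n-example     # follow the logs of a single service
docker compose restart flowise-example # restart one service without touching the others
```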
Docker Compose
### Template for deploying LangFlow, Flowise, and n8n side by side.
### Don't use in production.
services:
  ### LangFlow service
  langflow-example-postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: langflowuser
      # "$" must be written as "$$" so docker compose doesn't try to interpolate it as a variable
      POSTGRES_PASSWORD: lfPW$$90!kfgf135toab!
      POSTGRES_DB: langflowdb
    ports:
      - "7710:5432"
    volumes:
      - langflow-example-postgres_data:/var/lib/postgresql/data
  langflow-example:
    image: langflowai/langflow:latest
    pull_policy: always
    ports:
      - "7701:7860"
    depends_on:
      - langflow-example-postgres
    environment:
      - LANGFLOW_DATABASE_URL=postgresql://langflowuser:lfPW$$90!kfgf135toab!@langflow-example-postgres:5432/langflowdb
    volumes:
      - langflow-example-data:/app/langflow

  ### Flowise service
  flowise-example:
    image: flowiseai/flowise
    restart: always
    environment:
      - PORT=7702
      - DATABASE_PATH=/root/.flowise
      - APIKEY_PATH=/root/.flowise
      - SECRETKEY_PATH=/root/.flowise
      - LOG_PATH=/root/.flowise/logs
      - BLOB_STORAGE_PATH=/root/.flowise/storage
    ports:
      - '7702:7702'
    volumes:
      - flowise-example_data:/root/.flowise
    entrypoint: /bin/sh -c "sleep 3; flowise start"

  ### n8n service
  n8n-example:
    image: docker.n8n.io/n8nio/n8n
    restart: always
    ports:
      - "127.0.0.1:7703:5678"
    environment:
      - N8N_PORT=5678
      - NODE_ENV=production
      # no quotes around the value - compose would pass them through as part of it
      - GENERIC_TIMEZONE=Europe/London
    volumes:
      - n8n-example_data:/home/node/.n8n

### Data volumes
volumes:
  ### LangFlow volumes
  langflow-example-postgres_data:
  langflow-example-data:
  ### Flowise volumes
  flowise-example_data:
  ### n8n volumes
  n8n-example_data:
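If you end up keeping this template around, one small cleanup worth considering: docker compose reads a `.env` file sitting next to the compose file and interpolates `${VARIABLES}` from it, so the Postgres password doesn’t need to be hard-coded twice. A sketch, using a variable name of my own choosing (`LANGFLOW_DB_PASSWORD` is not part of the template above):

```yaml
# .env (same folder as docker-compose.yaml):
#   LANGFLOW_DB_PASSWORD=pick-something-better

# docker-compose.yaml, relevant excerpts:
services:
  langflow-example-postgres:
    environment:
      POSTGRES_PASSWORD: ${LANGFLOW_DB_PASSWORD}
  langflow-example:
    environment:
      - LANGFLOW_DATABASE_URL=postgresql://langflowuser:${LANGFLOW_DB_PASSWORD}@langflow-example-postgres:5432/langflowdb
```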