Getting Started
This guide will help you install and start using Devlog in minutes.
Prerequisites
Before installing Devlog, ensure you have:
- Rust 1.70 or later - install via rustup (https://rustup.rs)
- Git - Any recent version
- A Git repository - To analyze commit history
- (Optional) Ollama or llama.cpp - For AI-powered features
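If you want to check the Rust prerequisite before building, a small shell helper can compare versions; `version_ok` below is just an illustration (it expects "major.minor" strings) and is not part of Devlog:

```shell
# Compare an installed "major.minor" version against a required minimum.
version_ok() {
  got_major=${1%%.*}; got_minor=${1#*.}
  req_major=${2%%.*}; req_minor=${2#*.}
  [ "$got_major" -gt "$req_major" ] ||
    { [ "$got_major" -eq "$req_major" ] && [ "$got_minor" -ge "$req_minor" ]; }
}

# rustc --version prints e.g. "rustc 1.74.0 (...)"; keep only major.minor.
have=$(rustc --version 2>/dev/null | sed -n 's/^rustc \([0-9]*\.[0-9]*\).*/\1/p')
if [ -n "$have" ] && version_ok "$have" "1.70"; then
  echo "Rust $have OK"
else
  echo "Rust 1.70+ required (found: ${have:-none})"
fi
```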
Installation
Option 1: Homebrew (macOS)
brew tap ananno/devlog https://gitlab.com/ananno/homebrew-devlog.git
brew install devlog
Option 2: From Source
# Clone the repository
git clone https://gitlab.com/aice/devlog.git
cd devlog
# Build release binary
cargo build --release
# The binary will be available at target/release/devlog
./target/release/devlog --help
# Optional: Install to system path
cargo install --path .
Option 3: Direct Cargo Install
cargo install --git https://gitlab.com/aice/devlog.git
Building Without Cloud LLM Support
If you only need local LLM support:
cargo build --release --no-default-features
This removes OpenAI and Anthropic dependencies, reducing binary size.
Setting Up Local LLM (Recommended)
For privacy-first AI features, set up a local LLM:
Ollama (Easiest)
1. Install Ollama
# macOS/Linux
curl https://ollama.ai/install.sh | sh
# Or download from: https://ollama.ai
2. Pull a model
# Download llama3.2 (recommended, ~2GB)
ollama pull llama3.2
# Or use other models
ollama pull mistral
ollama pull codellama
3. Start the Ollama server
ollama serve
The server runs on http://localhost:11434 by default.
4. Verify installation
devlog --llm ollama:llama3.2 --help
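You can also confirm the server is reachable before pointing Devlog at it; `/api/tags` is part of Ollama's standard HTTP API and lists the models you have pulled:

```shell
# Probe the default Ollama endpoint; prints one of two fixed status lines.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "ollama: reachable"
else
  echo "ollama: not running (start it with 'ollama serve')"
fi
```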
llama.cpp (Advanced)
1. Clone and build
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make
2. Download a model
- Get GGUF-format models from Hugging Face
- Place them in the llama.cpp/models/ directory
3. Start the server
./server -m models/your-model.gguf -c 2048
The server runs on http://localhost:8080 by default.
4. Verify installation
devlog --llm llamacpp:MODEL --help
First Changelog
Let's generate your first changelog!
Step 1: Navigate to Your Repository
cd /path/to/your/git/repository
Step 2: Check Available Tags
git tag -l
Output example:
v1.0.0
v1.1.0
v2.0.0
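If you don't want to pick tags by hand, `git describe --tags --abbrev=0` (standard Git) resolves the most recent tag reachable from HEAD. The throwaway repository below exists only to make the example self-contained:

```shell
# Build a disposable repo with two tags, then resolve the latest one.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.email=you@example.com -c user.name=you commit -q --allow-empty -m "first"
git tag v1.0.0
git -c user.email=you@example.com -c user.name=you commit -q --allow-empty -m "second"
git tag v2.0.0
git describe --tags --abbrev=0   # → v2.0.0
```

In your own repository you can then run `devlog -f $(git describe --tags --abbrev=0) -t HEAD`.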
Step 3: Generate Basic Changelog
# Between two versions
devlog -f v1.0.0 -t v2.0.0
# From version to current HEAD
devlog -f v2.0.0 -t HEAD
# Last 20 commits
devlog -n 20
Output (plain mode — no LLM, clean markdown):
# Changelog for my-repo
Changes from `v1.0.0` to `v2.0.0`
## Commits
- `auth` add user authentication system (`abc1234`)
- Merge pull request #125 from feature/dashboard (`def5678`)
- `auth` improve error handling (`ghi9012`)
Step 4: Add AI Enhancement (Optional)
When --llm is set, diff analysis is automatic — devlog analyzes actual code changes, not just commit messages.
devlog -f v1.0.0 -t v2.0.0 --llm ollama:llama3.2
Output (LLM diff-analysis mode — outputs to stdout with progress stages):
======================================================================
Development Log: v1.0.0 → v2.0.0
======================================================================
- (2024-01-05) [PR] Add user authentication system (#123)
• What changed: Implemented JWT-based authentication
• Summary:
- Added JWT token generation and validation
- Created login and logout endpoints
- Added middleware for protected routes
- Integrated bcrypt for password hashing
- (2024-01-08) [Merge] Dashboard feature integration (#125, #126)
• What changed: Merged dashboard with analytics integration
• Summary:
- Created dashboard layout components
- Integrated analytics API
======================================================================
Summary: 2 units displayed (1 filtered as minor)
======================================================================
To disable diff analysis while still using LLM summaries (uses -o / --format):
devlog -f v1.0.0 -t v2.0.0 --llm ollama:llama3.2 --no-diff -o CHANGELOG.md
Common Use Cases
Release Notes Generation
# Generate release notes for v2.0.0 (LLM mode outputs to stdout — redirect to file)
devlog -f v1.9.0 -t v2.0.0 \
--llm ollama:llama3.2 \
> RELEASE_NOTES.md
Changelog File Creation
# Create/update CHANGELOG.md (plain mode — -o flag writes to file)
devlog -f v1.0.0 -t HEAD -o CHANGELOG.md
# With LLM diff analysis — redirect stdout to file
devlog -f v1.0.0 -t HEAD --llm ollama:llama3.2 > CHANGELOG.md
Feature Summary for Stakeholders
# High-level feature overview
devlog -f v1.0.0 -t v2.0.0 \
--llm ollama:llama3.2 \
--group-features
Quick Recent Changes
# Last 10 commits
devlog -n 10 --llm ollama:llama3.2
CI/CD Integration
# Generate changelog in CI pipeline (LLM mode — redirect stdout to file)
devlog -f $LAST_TAG -t $CI_COMMIT_SHA \
--llm ollama:llama3.2 \
-y \
> artifacts/CHANGELOG.md
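Note that `$LAST_TAG` is not set by CI automatically. One common way to derive it, with a fallback to the repository's root commit when no tags exist yet, is sketched below (demonstrated in a throwaway repo so the example is self-contained; in a pipeline you would run only the `LAST_TAG=...` line in your checkout):

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "init"
git tag v0.1.0
# Latest reachable tag if any, otherwise the root commit hash:
LAST_TAG=$(git describe --tags --abbrev=0 2>/dev/null || git rev-list --max-parents=0 HEAD)
echo "$LAST_TAG"   # → v0.1.0
```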
Output Options
Save to File
devlog -f v1.0.0 -t v2.0.0 -o CHANGELOG.md
JSON Output
devlog -f v1.0.0 -t v2.0.0 --format json -o changelog.json
Grouped by Commit Type
devlog -f v1.0.0 -t v2.0.0 --group-by-type
Output:
# Changelog for my-repo
Changes from `v1.0.0` to `v2.0.0`
## Features
- `auth` add user authentication (`abc1234`)
- `api` implement REST endpoints (`def5678`)
## Bug Fixes
- `auth` correct token expiration (`ghi9012`)
- `db` resolve connection pool issue (`jkl3456`)
With Author Information
devlog -f v1.0.0 -t v2.0.0 --authors
Output:
- `auth` add user authentication (`abc1234`) (_John Doe_)
Configuration
Environment Variables
Create a .env file or export in your shell:
# For cloud LLMs (optional)
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
# Logging level
export RUST_LOG="devlog=debug" # trace, debug, info, warn, error
Privacy Levels
When using cloud providers:
# Maximum privacy (default)
devlog --llm openai --privacy strict
# Keep file paths
devlog --llm openai --privacy moderate
# No sanitization (local LLMs only)
devlog --llm ollama:llama3.2 --privacy relaxed
Verifying Installation
Check Version
devlog --version
Run Tests
cd devlog
cargo test
Check Help
devlog --help
Test with Sample Repository
# Clone a sample repository
git clone https://github.com/rust-lang/rust-analyzer.git
cd rust-analyzer
# Generate changelog
devlog -n 10
Troubleshooting
"Command not found: devlog"
Solution 1: Use full path
./target/release/devlog --help
Solution 2: Install to system
cargo install --path .
# Ensure ~/.cargo/bin is in PATH
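Whether `~/.cargo/bin` is already on your PATH can be checked with a POSIX sh one-liner:

```shell
# Look for ~/.cargo/bin among the colon-separated PATH entries.
case ":$PATH:" in
  *":$HOME/.cargo/bin:"*) echo "cargo bin dir is on PATH" ;;
  *) echo 'add it: export PATH="$HOME/.cargo/bin:$PATH"' ;;
esac
```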
"Failed to connect to Ollama"
Check Ollama is running:
curl http://localhost:11434/api/tags
Start Ollama:
ollama serve
"No commits found in range"
Check your Git tags:
git tag -l
git log --oneline
Use valid references:
# Use commit hashes instead
devlog -f abc1234 -t def5678
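You can also confirm that a range actually contains commits before running devlog; `git rev-list --count` is standard Git, and the disposable repo below only makes the example self-contained:

```shell
# Count commits in a range; 0 means devlog has nothing to report.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.email=you@example.com -c user.name=you commit -q --allow-empty -m "first"
git tag v1.0.0
git -c user.email=you@example.com -c user.name=you commit -q --allow-empty -m "second"
git rev-list --count v1.0.0..HEAD   # → 1
```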
"API key not found"
For OpenAI:
export OPENAI_API_KEY="sk-..."
For Anthropic:
export ANTHROPIC_API_KEY="sk-ant-..."
"Permission denied"
Make binary executable:
chmod +x target/release/devlog
Next Steps
Now that you have Devlog installed:
- Explore Features - Learn about all available features
- Read Usage Guide - Detailed usage examples
- Best Practices - Tips for optimal results
- Troubleshooting - Solutions to common issues
Getting Help
- Documentation: GitLab Pages
- Issues: GitLab
- Discussions: GitLab Discussions