Devlog - AI-Powered Development Log Generator
Privacy-First Changelog Generation with Local LLM Integration
What is Devlog?
Devlog is an intelligent changelog generator that transforms your Git commit history into human-readable development logs. Unlike traditional changelog generators, Devlog analyzes actual code changes using AI while keeping your data secure through local-first processing.
Key Highlights
- Privacy-First Architecture - Works entirely offline with local LLMs (Ollama, llama.cpp)
- Intelligent Analysis - Analyzes code diffs, not just commit messages
- Unit-Based Processing - Groups commits into logical units (PRs, merges, direct commits, tags)
- Multiple Output Modes - Plain text, AI-enhanced, diff-analysis, or feature-grouped
- Zero Configuration - Works out of the box, customize when needed
- Multi-Platform - Linux, macOS, and Windows support
Quick Links
- Getting Started - Installation and first steps
- Features - Comprehensive feature overview
- Usage Guide - Detailed usage examples
- Technical Details - Architecture and implementation
- Best Practices - Tips and recommendations
- Troubleshooting - Common issues and solutions
Repository Links
- GitLab: https://gitlab.com/aice/devlog
- Documentation: GitLab Pages
Quick Start
Installation
git clone https://gitlab.com/aice/devlog.git
cd devlog && cargo build --release
Basic Usage
# Generate changelog for current repository
devlog --from v1.0.0 --to v2.0.0
# With AI-enhanced summaries (local LLM)
devlog --from v1.0.0 --to v2.0.0 --llm ollama
# Analyze code changes with diff analysis
devlog --diff-analysis --llm ollama --from HEAD~20 --to HEAD
Four Operating Modes
Mode 1: Plain Mode (No LLM)
Generate changelogs from commit messages without AI processing.
devlog --from v1.0.0 --to v1.1.0
Output: Chronological list with unit badges [PR], [Merge], [Direct], [Tag]
Mode 2: AI-Enhanced (No Diff Analysis)
Use AI to summarize commit messages for better readability.
devlog --from v1.0.0 --to v1.1.0 --llm ollama
Output: Chronological list with AI-generated summaries
Mode 3: Diff Analysis (No Feature Grouping)
Analyze actual code changes with AI for detailed insights.
devlog --from v1.0.0 --to v1.1.0 --llm ollama --diff-analysis
Output: Detailed code change analysis with "What changed" and impact summaries
Mode 4: Feature-Grouped Analysis
Group related changes by features for high-level overview.
devlog --from v1.0.0 --to v1.1.0 --llm ollama --diff-analysis --group-features
Output: Feature-based changelog with PRs and commits organized by purpose
Why Devlog?
Privacy-First Design
- Local-Only Processing - No API tokens or network access required for diff analysis
- Data Sanitization - Automatic PII removal for optional cloud providers
- Transparent Operations - See exactly what data is processed
- Zero Telemetry - No tracking or analytics
Intelligent Analysis
- Code-Aware - Analyzes actual code diffs, not just messages
- Smart Grouping - Consolidates related commits into logical units
- Noise Filtering - Automatically filters whitespace-only changes and minor edits
- Merge Consolidation - Properly handles merge commits with M^1..M^2 range analysis
Developer-Friendly
- Multiple Output Formats - Markdown, JSON
- Flexible Filtering - By date ranges, commit ranges, or limits
- Progress Indicators - Clear feedback with spinners and progress bars
- Conventional Commits - Full support with heuristic fallback
Example Output
Diff Analysis Mode
======================================================================
Development Log: v1.0.0 → v1.1.0
======================================================================
- [Tag] v1.0.0 Release (2024-01-01)
- [PR] Add user authentication system (2024-01-05)
PR: #123
• What changed: Implemented JWT-based authentication
• Summary:
- Added JWT token generation and validation
- Created login and logout endpoints
- Added middleware for protected routes
- Integrated bcrypt for password hashing
- [Merge] Feature/dashboard integration (2024-01-08)
PR: #125, #126
• What changed: Merged dashboard feature with 15 commits
• Summary:
- Created dashboard layout components
- Integrated analytics API
- Added user preferences storage
- [Tag] v1.1.0 Release (2024-01-15)
Released with: authentication system, dashboard
======================================================================
Summary: 4 units displayed (2 minor filtered)
======================================================================
Community & Support
- Issues: GitLab Issues
- Discussions: GitLab Discussions
- License: MIT License - see LICENSE
Contributing
Contributions are welcome! Please see WARP.md for development guidelines.
Made for privacy-conscious developers
Getting Started
This guide will help you install and start using Devlog in minutes.
Prerequisites
Before installing Devlog, ensure you have:
- Rust 1.70 or later - install via rustup (https://rustup.rs)
- Git - Any recent version
- A Git repository - To analyze commit history
- (Optional) Ollama or llama.cpp - For AI-powered features
Installation
From Source
# Clone the repository
git clone https://gitlab.com/aice/devlog.git
cd devlog
# Build release binary
cargo build --release
# The binary will be available at target/release/devlog
./target/release/devlog --help
# Optional: Install to system path
cargo install --path .
From crates.io (Coming Soon)
# Coming soon: Install directly from crates.io
cargo install devlog
Building Without Cloud LLM Support
If you only need local LLM support:
cargo build --release --no-default-features
This removes OpenAI and Anthropic dependencies, reducing binary size.
Setting Up Local LLM (Recommended)
For privacy-first AI features, set up a local LLM:
Ollama (Easiest)
1. Install Ollama
   # macOS/Linux
   curl https://ollama.ai/install.sh | sh
   # Or download from: https://ollama.ai
2. Pull a model
   # Download llama3.2 (recommended, ~2GB)
   ollama pull llama3.2
   # Or use other models
   ollama pull mistral
   ollama pull codellama
3. Start Ollama server
   ollama serve
   The server runs on http://localhost:11434 by default.
4. Verify installation
   devlog --llm ollama --llm-model llama3.2 --help
llama.cpp (Advanced)
1. Clone and build
   git clone https://github.com/ggerganov/llama.cpp
   cd llama.cpp
   make
2. Download a model
   - Get GGUF format models from HuggingFace
   - Place in the llama.cpp/models/ directory
3. Start server
   ./server -m models/your-model.gguf -c 2048
   Runs on http://localhost:8080 by default.
4. Verify installation
   devlog --llm llamacpp --help
First Changelog
Let's generate your first changelog!
Step 1: Navigate to Your Repository
cd /path/to/your/git/repository
Step 2: Check Available Tags
git tag -l
Output example:
v1.0.0
v1.1.0
v2.0.0
Step 3: Generate Basic Changelog
# Between two versions
devlog --from v1.0.0 --to v2.0.0
# From version to current HEAD
devlog --from v2.0.0 --to HEAD
# Last 20 commits
devlog --limit 20
Output:
======================================================================
Development Log: v1.0.0 → v2.0.0
======================================================================
- [PR] feat(auth): add user authentication system (2024-01-05)
PR: #123
- [Merge] Merge pull request #125 from feature/dashboard (2024-01-08)
PR: #125, #126
- [Direct] refactor(auth): improve error handling (2024-01-10)
======================================================================
Summary: 3 units displayed
======================================================================
Step 4: Add AI Enhancement (Optional)
If you have Ollama installed:
devlog --from v1.0.0 --to v2.0.0 --llm ollama
Output with AI summaries:
======================================================================
Development Log: v1.0.0 → v2.0.0
======================================================================
- [PR] Add user authentication system (2024-01-05)
PR: #123
Summary: Implemented JWT-based authentication with secure token
handling and session management
- [Merge] Dashboard feature integration (2024-01-08)
PR: #125, #126
Summary: Integrated comprehensive dashboard with analytics, user
preferences, and responsive design
======================================================================
Summary: 2 units displayed
======================================================================
Step 5: Try Diff Analysis
For detailed code change analysis:
devlog --from v1.0.0 --to v2.0.0 --llm ollama --diff-analysis
Output with diff analysis:
======================================================================
Development Log: v1.0.0 → v2.0.0
======================================================================
- [PR] Add user authentication system (2024-01-05)
PR: #123
• What changed: Implemented JWT-based authentication
• Summary:
- Added JWT token generation and validation
- Created login and logout endpoints
- Added middleware for protected routes
- Integrated bcrypt for password hashing
• Impact: Moderate
• Files: 8 files, +423/-12 lines
======================================================================
Summary: 1 unit displayed
======================================================================
Common Use Cases
Release Notes Generation
# Generate release notes for v2.0.0
devlog --from v1.9.0 --to v2.0.0 \
--llm ollama \
--diff-analysis \
--output RELEASE_NOTES.md
Changelog File Creation
# Create/update CHANGELOG.md
devlog --from v1.0.0 --to HEAD \
--llm ollama \
--output CHANGELOG.md
Feature Summary for Stakeholders
# High-level feature overview
devlog --from v1.0.0 --to v2.0.0 \
--llm ollama \
--diff-analysis \
--group-features
Quick Recent Changes
# Last 10 commits
devlog --limit 10 --llm ollama
CI/CD Integration
# Generate changelog in CI pipeline
devlog --from $LAST_TAG --to $CI_COMMIT_SHA \
--llm ollama \
--output artifacts/CHANGELOG.md \
--no-consent-prompt
Output Options
Save to File
devlog --from v1.0.0 --to v2.0.0 --output CHANGELOG.md
JSON Output
devlog --from v1.0.0 --to v2.0.0 --format json --output changelog.json
Grouped by Commit Type
devlog --from v1.0.0 --to v2.0.0 --group
Output:
## Features
- feat(auth): add user authentication
- feat(api): implement REST endpoints
## Bug Fixes
- fix(auth): correct token expiration
- fix(db): resolve connection pool issue
With Author Information
devlog --from v1.0.0 --to v2.0.0 --authors
Output:
- feat(auth): add user authentication [abc1234] (John Doe)
Configuration
Environment Variables
Create a .env file or export in your shell:
# For cloud LLMs (optional)
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
# Logging level
export DEVLOG_LOG_LEVEL="info" # trace, debug, info, warn, error
export RUST_LOG="devlog=debug"
# Privacy mode (for cloud providers)
export DEVLOG_PRIVACY_MODE="strict" # strict, moderate, relaxed
Privacy Levels
When using cloud providers:
# Maximum privacy (default)
devlog --llm openai --privacy-level strict
# Keep file paths
devlog --llm openai --privacy-level moderate
# No sanitization (local LLMs only)
devlog --llm ollama --privacy-level relaxed
Verifying Installation
Check Version
devlog --version
Run Tests
cd devlog
cargo test
Check Help
devlog --help
Test with Sample Repository
# Clone a sample repository
git clone https://github.com/rust-lang/rust-analyzer.git
cd rust-analyzer
# Generate changelog
devlog --limit 10
Troubleshooting
"Command not found: devlog"
Solution 1: Use full path
./target/release/devlog --help
Solution 2: Install to system
cargo install --path .
# Ensure ~/.cargo/bin is in PATH
"Failed to connect to Ollama"
Check Ollama is running:
curl http://localhost:11434/api/tags
Start Ollama:
ollama serve
"No commits found in range"
Check your Git tags:
git tag -l
git log --oneline
Use valid references:
# Use commit hashes instead
devlog --from abc1234 --to def5678
"API key not found"
For OpenAI:
export OPENAI_API_KEY="sk-..."
For Anthropic:
export ANTHROPIC_API_KEY="sk-ant-..."
"Permission denied"
Make binary executable:
chmod +x target/release/devlog
Next Steps
Now that you have Devlog installed:
- Explore Features - Learn about all available features
- Read Usage Guide - Detailed usage examples
- Best Practices - Tips for optimal results
- Troubleshooting - Solutions to common issues
Getting Help
- Documentation: GitLab Pages
- Issues: GitLab
- Discussions: GitLab Discussions
Usage Guide
Practical examples and use cases for Devlog.
Basic Workflows
Release Notes Generation
Generate release notes between two versions:
# Standard release notes
devlog --from v1.0.0 --to v2.0.0 --output RELEASE_NOTES.md
# With AI summaries
devlog --from v1.0.0 --to v2.0.0 --llm ollama --output RELEASE_NOTES.md
# With detailed analysis
devlog --from v1.0.0 --to v2.0.0 \
--llm ollama \
--diff-analysis \
--output RELEASE_NOTES.md
Maintaining CHANGELOG.md
Keep your changelog updated:
# Update for new release
devlog --from v1.0.0 --to HEAD \
--llm ollama \
--output CHANGELOG.md
# Append to existing changelog
devlog --from v1.0.0 --to HEAD --llm ollama >> CHANGELOG.md
Feature Summary for Stakeholders
Generate high-level feature overview:
devlog --from v1.0.0 --to v2.0.0 \
--llm ollama \
--diff-analysis \
--group-features
Advanced Usage
Custom Ranges
# Last N commits
devlog --limit 20 --llm ollama
# Specific commit range
devlog --from abc1234 --to def5678 --llm ollama
# Since yesterday (reflog syntax; resolves against your local reflog)
devlog --from HEAD@{1.day.ago} --to HEAD
Filtering and Grouping
# Group by commit type
devlog --from v1.0.0 --to v2.0.0 --group
# Include merge commits
devlog --from v1.0.0 --to v2.0.0 --include-merges
# With author attribution
devlog --from v1.0.0 --to v2.0.0 --authors
Output Formats
# Markdown (default)
devlog --from v1.0.0 --to v2.0.0 --output changelog.md
# JSON for programmatic access
devlog --from v1.0.0 --to v2.0.0 \
--format json \
--output changelog.json
# Pretty-print to console
devlog --from v1.0.0 --to v2.0.0
CI/CD Integration
GitHub Actions
name: Generate Changelog
on:
push:
tags:
- 'v*'
jobs:
changelog:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Install Devlog
run: |
cargo install --git https://gitlab.com/aice/devlog.git
- name: Setup Ollama
run: |
curl https://ollama.ai/install.sh | sh
ollama pull llama3.2
- name: Generate Changelog
run: |
PREV_TAG=$(git describe --tags --abbrev=0 HEAD^)
devlog --from $PREV_TAG --to $GITHUB_REF_NAME \
--llm ollama \
--no-consent-prompt \
--output CHANGELOG.md
- name: Upload Changelog
uses: actions/upload-artifact@v4
with:
name: changelog
path: CHANGELOG.md
GitLab CI
generate_changelog:
stage: deploy
image: rust:latest
script:
- cargo install --git https://gitlab.com/aice/devlog.git
- curl https://ollama.ai/install.sh | sh
    - ollama serve &
    - sleep 3  # give the server a moment to start before pulling
    - ollama pull llama3.2
- PREV_TAG=$(git describe --tags --abbrev=0 HEAD^)
- devlog --from $PREV_TAG --to $CI_COMMIT_TAG \
--llm ollama \
--no-consent-prompt \
--output CHANGELOG.md
artifacts:
paths:
- CHANGELOG.md
only:
- tags
Working with Different Repository Types
Monorepo
# Focus on specific path
cd packages/frontend
devlog --from v1.0.0 --to v2.0.0
# Or filter by scope
devlog --from v1.0.0 --to v2.0.0 | grep "feat(frontend)"
Multiple Branches
# Compare feature branch to main
devlog --from main --to feature/new-feature
# Changes in current branch since branching
devlog --from $(git merge-base main HEAD) --to HEAD
Privacy Modes
Local-Only (Maximum Privacy)
# Only use local LLMs
devlog --llm ollama --diff-analysis
# Never requires network access
devlog --llm llamacpp --diff-analysis
Cloud with Sanitization
# Strict mode (default) - removes all PII
devlog --llm openai --privacy-level strict
# Moderate mode - keeps file paths
devlog --llm openai --privacy-level moderate
# Check what would be sent
devlog --llm openai --dry-run
Tips & Tricks
Speed Up Analysis
# Use smaller model
devlog --llm ollama --llm-model llama3.2 # instead of larger models
# Limit scope
devlog --limit 50 --llm ollama
# Skip diff analysis for quick results
devlog --from v1.0.0 --to v2.0.0 --llm ollama # no --diff-analysis
Better Results
# Use diff analysis for detailed insights
devlog --diff-analysis --llm ollama
# Group by features for overview
devlog --diff-analysis --group-features --llm ollama
# Include merge commits
devlog --include-merges --llm ollama
Debugging
# Enable debug logging
export RUST_LOG="devlog=debug"
devlog --from v1.0.0 --to v2.0.0
# Dry run to see what would be sent
devlog --llm openai --dry-run
# Estimate costs before running
devlog --llm openai --estimate-cost
Common Patterns
Pre-Release Checklist
#!/bin/bash
# Generate changelog before release
devlog --from $(git describe --tags --abbrev=0) --to HEAD \
--llm ollama \
--diff-analysis \
--output RELEASE_NOTES.md
# Review generated changelog
cat RELEASE_NOTES.md
# Create release commit
git add RELEASE_NOTES.md
git commit -m "docs: update release notes for vX.Y.Z"
Weekly Team Updates
#!/bin/bash
# Generate weekly summary
LAST_WEEK=$(date -d "7 days ago" +%Y-%m-%d)  # GNU date; on macOS use: date -v-7d +%Y-%m-%d
devlog --from HEAD@{$LAST_WEEK} --to HEAD \
--llm ollama \
--diff-analysis \
--group-features
Integration with Semantic Release
// In .releaserc.js (for package.json, use plain JSON without comments)
{
"plugins": [
"@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator",
[
"@semantic-release/exec",
{
"prepareCmd": "devlog --from ${lastRelease.gitTag} --to ${nextRelease.gitTag} --llm ollama --output CHANGELOG.md"
}
]
]
}
Best Practices
See Best Practices Guide for detailed recommendations.
Features
Devlog offers a comprehensive set of features designed for modern software development workflows while prioritizing privacy and security.
Core Features
Smart Commit Parsing
Conventional Commits Support
- Fully compliant with Conventional Commits specification
- Automatic detection of commit types: feat, fix, docs, style, refactor, perf, test, build, ci, chore, revert
- Breaking change detection via `!` suffix or `BREAKING CHANGE:` footer
- Scope extraction and categorization
Heuristic Fallback
- Intelligently parses non-standard commits
- Extracts meaningful information from freeform commit messages
- No strict format requirements - works with any Git repository
Unit-Based Processing
Devlog organizes commits into logical "units":
Unit Types:
1. [PR] Pull Request Commits
- Groups consecutive commits with same PR reference
- Consolidates squash commits
- Shows PR links and numbers
- Example: Multiple commits in PR#123 → Single unit with combined analysis
2. [Merge] Merge Commits
- Analyzes incoming branch using M^1..M^2 range
- Consolidates all commits from feature branch
- Rolls up statistics and summaries
- Treats merge as atomic unit of work
3. [Direct] Direct Commits
- Groups related direct commits to main branch
- Uses temporal proximity and file overlap heuristics
- Segments by author and timeframe
- Filters noise while preserving context
4. [Tag] Tag Milestones
- Detects both annotated and lightweight tags
- Distinguishes release tags (v1.2.3) from general tags
- Generates release summaries
- Interleaves chronologically with commits
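The M^1..M^2 notation above is standard Git revision syntax: for a merge commit M, M^1 is the branch that was merged into and M^2 is the incoming branch, so the range selects exactly the commits the merge brought in. A throwaway repository makes this concrete (illustrative only; devlog performs the equivalent walk internally via libgit2):

```shell
# Build a tiny repo with a feature branch and a no-ff merge,
# then list only the commits the merge brought in.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
main=$(git symbolic-ref --short HEAD)   # default branch name varies across git versions
c() { git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m "$1"; }
c "chore: initial commit"
git checkout -q -b feature
c "feat: add login endpoint"
c "feat: add logout endpoint"
git checkout -q "$main"
git -c user.name=demo -c user.email=demo@example.com merge -q --no-ff -m "Merge branch feature" feature
git log --format=%s HEAD^1..HEAD^2      # prints only the two feature commits
```

The same range works with any merge hash: `git log M^1..M^2` lists what devlog consolidates into one [Merge] unit.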
Multiple Output Formats
Markdown (Default)
- Human-readable changelog format
- Grouped by commit type or chronological order
- Breaking changes prominently displayed
- Author attribution optional
JSON
- Structured data for programmatic access
- Complete commit metadata
- Integration-friendly format
- Schema-based validation
Flexible Filtering Options
By Version Range
# Between two tags
devlog --from v1.0.0 --to v2.0.0
# From tag to HEAD
devlog --from v1.0.0 --to HEAD
By Commit Range
# Last N commits
devlog --limit 50
# Specific commit range
devlog --from abc1234 --to def5678
By Date Range
# Coming soon: date-based filtering
Minor Commit Filtering
- Automatic detection of whitespace-only changes
- Filters documentation-only commits
- Identifies typo fixes and formatting changes
- Configurable thresholds
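As a rough sketch of the whitespace filter (a simplified stand-in, not devlog's actual implementation): a change counts as whitespace-only when the files differ normally but compare equal once whitespace is ignored.

```shell
# Whitespace-only heuristic: the diff vanishes under `diff -w`,
# which ignores whitespace differences.
is_ws_only() {
  ! diff -q "$1" "$2" > /dev/null && diff -w -q "$1" "$2" > /dev/null
}
printf 'hello\n'   > before.txt
printf 'hello  \n' > after.txt   # trailing spaces are the only change
if is_ws_only before.txt after.txt; then
  echo "whitespace-only change - filter it"
fi
```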
AI-Powered Features
Multiple LLM Providers
Local LLMs (Recommended)
- Ollama
- Easy installation and setup
- Multiple model support (llama3.2, mistral, etc.)
- Automatic model download
- No API keys required
- llama.cpp
- Maximum performance
- Custom model support
- GGUF format compatibility
- Complete offline operation
Cloud LLMs (Optional)
- OpenAI (GPT-4)
- Highest quality summaries
- Fast processing
- Requires API key
- Cost per request
- Anthropic (Claude)
- Excellent code understanding
- Long context support
- Requires API key
- Cost per request
Privacy-First Architecture
Three Privacy Levels:
1. Strict Mode (Default for cloud)
- Removes all PII (emails, URLs, paths)
- Sanitizes ticket IDs
- Redacts sensitive patterns
- Maximum privacy protection
2. Moderate Mode
- Keeps file paths intact
- Removes emails and URLs
- Balances utility and privacy
- Suitable for internal tools
3. Relaxed Mode (Local only)
- No sanitization
- Full context for AI
- Only available with local LLMs
- Best analysis quality
Privacy Guarantees:
- Local LLMs never send data over network
- Cloud providers require explicit opt-in
- Dry-run mode shows exactly what would be sent
- Cost estimation before any API calls
- Transparent data handling
- Zero telemetry or tracking
Diff Analysis Mode
What It Does:
- Analyzes actual code changes, not just commit messages
- Extracts file changes, insertions, deletions
- Identifies affected code areas
- Assesses change impact (Major, Moderate, Patch, Minor)
- Generates detailed summaries of modifications
How It Works:
- Extracts diff for each commit vs. parent
- Analyzes diff content with local LLM
- Categorizes changes by area and impact
- Consolidates unit-level summaries
- Filters noise and minor changes
Output Structure:
- [PR] Feature name (date)
PR: #123
• What changed: High-level description
• Summary:
- Detailed bullet point 1
- Detailed bullet point 2
- Detailed bullet point 3
• Impact: Moderate
• Files: 5 files, +234/-67 lines
Requirements:
- Only works with local LLMs (Ollama, llama.cpp)
- No network access required
- Analyzes full git history
- Works with any repository
Feature Grouping
Automatic Feature Detection:
- Groups related units by logical features
- Identifies common themes across commits
- Separates PRs from direct commits
- Generates feature-level summaries
Output Format:
======================================================================
Feature Change Log: v1.0.0 → v1.1.0
======================================================================
### User Authentication
**Summary:**
- Implemented JWT-based authentication
- Added login and logout endpoints
- Middleware for protected routes
**PRs:**
- [#123](url) Add user authentication system
**Commits:**
- [abc1234](url) Refactor authentication middleware
### Dashboard Feature
**Summary:**
- Created dashboard layout components
- Integrated analytics API
**PRs:**
- [#125](url), [#126](url) Feature/dashboard integration
======================================================================
Use Cases:
- Release notes generation
- Feature tracking
- High-level project overview
- Stakeholder communication
Developer Experience
Progress Indicators
8-Stage Processing Pipeline:
- Stage 0: Preflight - Privacy checks, LLM availability
- Stage 1: Collection - Gathering commits and tags
- Stage 2: Unit Formation - Classifying and grouping
- Stage 3: Diff Extraction - Extracting code changes
- Stage 4: Analysis - AI analyzing each unit
- Stage 5: Feature Grouping - Organizing by features
- Stage 6: Filtering - Applying noise filters
- Stage 7: Rendering - Generating output
Visual Feedback:
- Spinners for indeterminate operations
- Progress bars with counters for analysis
- Stage completion checkmarks
- Clear error messages with remediation steps
Conventional Commits Integration
Supported Types:
- `feat` - New features
- `fix` - Bug fixes
- `docs` - Documentation changes
- `style` - Code style changes (formatting, etc.)
- `refactor` - Code refactoring
- `perf` - Performance improvements
- `test` - Test changes
- `build` - Build system changes
- `ci` - CI/CD configuration changes
- `chore` - Maintenance tasks
- `revert` - Reverted changes
Breaking Changes:
feat!: remove deprecated API endpoint
feat(api)!: change response format
fix: correct token validation
BREAKING CHANGE: API v1 endpoints removed
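The detection rules above can be sketched in plain shell (illustrative only; devlog's actual parser is written in Rust and also honors the `BREAKING CHANGE:` footer, which this sketch omits):

```shell
# Split a Conventional Commits subject into type, scope, and a breaking flag.
parse_subject() {
  subject=$1
  type=$(printf '%s\n' "$subject" | sed -n 's/^\([a-z]*\).*/\1/p')
  scope=$(printf '%s\n' "$subject" | sed -n 's/^[a-z]*(\([^)]*\)).*/\1/p')
  case "$subject" in
    *"!:"*) breaking=yes ;;
    *)      breaking=no ;;
  esac
  echo "type=$type scope=$scope breaking=$breaking"
}
parse_subject 'feat(api)!: change response format'   # -> type=feat scope=api breaking=yes
parse_subject 'fix: correct token validation'        # -> type=fix scope= breaking=no
```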
Grouping & Organization
By Commit Type:
devlog --group
Groups commits into sections: Features, Bug Fixes, Documentation, etc.
Chronological:
devlog
Lists commits in time order with unit badges.
By Feature:
devlog --diff-analysis --group-features --llm ollama
Groups related work into logical features.
Author Attribution
devlog --authors
Includes commit author information in output:
- feat(auth): add JWT support [abc1234] (John Doe)
Security Features
Secret Detection Prevention
- Never logs API keys or tokens
- Sanitizes sensitive environment variables
- Redacts credentials from output
- Secure token handling
Data Sanitization
Automatic Redaction:
- Email addresses → [EMAIL]
- URLs → [URL]
- File paths → [PATH] (strict mode)
- Ticket IDs → [TICKET] (strict mode)
Pattern Detection:
- API keys and tokens
- Credit card numbers
- Social security numbers
- Private keys
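A minimal sketch of the redaction pass (the patterns here are illustrative; the real sanitizer covers more cases such as paths, ticket IDs, and key material):

```shell
# Replace URLs first, then emails, with placeholder tokens as in strict mode.
sanitize() {
  sed -E \
    -e 's#https?://[^[:space:]]+#[URL]#g' \
    -e 's/[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+/[EMAIL]/g'
}
echo 'Ping alice@example.com, docs at https://example.com/guide' | sanitize
# -> Ping [EMAIL], docs at [URL]
```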
Audit Trail
- Transparent operation logging
- No hidden network calls
- Dry-run mode for verification
- Cost estimation before spending
Integration Features
Git Hosting Support
GitHub Integration:
- Pull request link generation
- Issue reference resolution
- GitHub-flavored markdown
- Actions workflow compatible
GitLab Integration:
- Merge request links
- Issue tracking integration
- GitLab CI/CD compatible
- GitLab Pages deployment
CI/CD Compatibility
Automated Changelog Generation:
# In GitHub Actions or GitLab CI
- name: Generate changelog
run: |
devlog --from ${{ env.LAST_TAG }} --to HEAD \
--output CHANGELOG.md \
--no-consent-prompt
No Interactive Prompts:
devlog --no-consent-prompt
Skip user confirmation for CI/CD environments.
Exit Codes:
- 0 - Success
- 1 - General error
- 2 - Configuration error
- 3 - Git repository error
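In scripts, those codes can drive error handling. A sketch (the devlog call is replaced by a stub returning code 3 so the example is self-contained):

```shell
# Branch on devlog's documented exit codes.
devlog() { return 3; }   # stub standing in for the real binary (hypothetical failure)
rc=0
devlog --from v1.0.0 --to HEAD --output CHANGELOG.md || rc=$?
case $rc in
  0) echo "changelog written" ;;
  1) echo "general error" ;;
  2) echo "configuration error" ;;
  3) echo "git repository error - run inside a git checkout" ;;
esac
# -> git repository error - run inside a git checkout
```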
Configuration
Environment Variables
# LLM API Keys
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
# Logging
export DEVLOG_LOG_LEVEL="info" # trace, debug, info, warn, error
export RUST_LOG="devlog=debug"
# Privacy enforcement
export DEVLOG_PRIVACY_MODE="strict"
Configuration File Support
Coming soon: .devlogrc or devlog.toml for project-specific settings.
Performance
Optimization Features
- Parallel commit processing
- Efficient git operations via libgit2
- Streaming diff analysis
- Memory-efficient large repository handling
Benchmarks
- 1000 commits: ~30 seconds (local LLM)
- 100 commits with diff analysis: ~2-3 minutes (Ollama)
- Feature grouping: +10-15 seconds overhead
Roadmap
Planned features (see GitLab Issues):
- Configuration file support (.devlogrc.toml)
- Date-based filtering
- Interactive mode for commit selection
- Plugin system for custom formatters
- Web UI for repository visualization
- Git hosting API integration (automatic PR/MR fetching)
- Multi-repository changelog aggregation
- Custom LLM provider support
- Template-based output generation
Best Practices
Recommendations for getting the most out of Devlog.
Commit Message Best Practices
Use Conventional Commits
Follow the Conventional Commits specification for best results:
<type>(<scope>): <description>
[optional body]
[optional footer]
Good Examples:
feat(auth): add JWT token validation
fix(api): correct rate limiting logic
docs(readme): update installation instructions
Avoid:
updated stuff
misc changes
WIP
Be Descriptive
Write clear, descriptive commit messages that explain what and why:
feat(auth): add JWT token validation
Implements JWT token validation middleware to secure API endpoints.
Uses RS256 algorithm for signature verification.
Closes #123
Using Devlog Effectively
Choose the Right Mode
Plain Mode - Quick changelog without AI:
devlog --from v1.0.0 --to v2.0.0
Use when: Speed is priority, simple overview needed
AI-Enhanced - Improved summaries:
devlog --from v1.0.0 --to v2.0.0 --llm ollama
Use when: Better readability needed, commit messages are unclear
Diff Analysis - Detailed code analysis:
devlog --from v1.0.0 --to v2.0.0 --llm ollama --diff-analysis
Use when: Need technical details, understanding code changes is important
Feature-Grouped - High-level overview:
devlog --from v1.0.0 --to v2.0.0 --llm ollama --diff-analysis --group-features
Use when: Presenting to stakeholders, creating release notes
Optimize Performance
Limit Scope:
# Instead of full history
devlog --llm ollama --diff-analysis
# Use reasonable ranges
devlog --from v1.0.0 --to v2.0.0 --llm ollama --diff-analysis
Use Smaller Models:
devlog --llm ollama --llm-model llama3.2 # 7B model, faster
Cache Results:
# Save to file for reuse
devlog --from v1.0.0 --to v2.0.0 --llm ollama --output changelog.md
Privacy Recommendations
For Open Source Projects
Use local LLMs to protect contributor privacy:
devlog --llm ollama --diff-analysis
For Private/Enterprise Code
Never use cloud LLMs with diff analysis:
# Good - local only
devlog --llm ollama --diff-analysis
# Bad - sends code to cloud
devlog --llm openai --diff-analysis # This will fail by design
For commit message summaries (no code):
# Use strict mode (default)
devlog --llm openai --privacy-level strict
# Or moderate if file paths are not sensitive
devlog --llm openai --privacy-level moderate
Always Verify with Dry-Run
Before using cloud providers:
devlog --llm openai --dry-run
CI/CD Best Practices
Pin Versions
# Don't use latest
- cargo install devlog # Bad
# Pin to specific version or commit
- cargo install --git https://gitlab.com/aice/devlog.git --rev abc1234 # Good
Use --no-consent-prompt
Skip interactive prompts in CI:
devlog --llm ollama --no-consent-prompt --output CHANGELOG.md
Cache Dependencies
Speed up CI builds by caching Ollama models:
- name: Cache Ollama models
uses: actions/cache@v3
with:
path: ~/.ollama
key: ollama-llama3.2
Output Organization
Maintain CHANGELOG.md
Keep a versioned changelog file:
CHANGELOG.md
├── ## [Unreleased]
├── ## [2.0.0] - 2024-01-15
├── ## [1.0.0] - 2023-12-01
└── ...
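devlog's --output writes a complete file rather than editing CHANGELOG.md in place, so keeping the newest release on top takes a small prepend step (a sketch; file names are illustrative, and NEW_SECTION.md stands in for devlog's output):

```shell
# Prepend the freshly generated section to the existing changelog.
printf '## [1.0.0] - 2023-12-01\n- initial release\n' > CHANGELOG.md      # existing history
printf '## [2.0.0] - 2024-01-15\n- new features\n'    > NEW_SECTION.md    # normally: devlog ... --output NEW_SECTION.md
cat NEW_SECTION.md CHANGELOG.md > CHANGELOG.tmp && mv CHANGELOG.tmp CHANGELOG.md
head -1 CHANGELOG.md   # -> ## [2.0.0] - 2024-01-15
```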
Separate Release Notes
Create per-release files:
releases/
├── v2.0.0.md
├── v1.0.0.md
└── ...
Use Consistent Format
Stick to one format across your project:
# Always use the same flags
devlog --from $PREV_TAG --to $NEW_TAG \
--llm ollama \
--diff-analysis \
--output RELEASE_NOTES.md
Model Selection
For Speed
devlog --llm ollama --llm-model llama3.2 # Small, fast
For Quality
devlog --llm ollama --llm-model codellama # Better code understanding
For Privacy
devlog --llm llamacpp # Maximum control, fully local
Common Pitfalls
Don't Over-Analyze
Analyzing thousands of commits is slow and often unnecessary:
# Bad - entire history
devlog --llm ollama --diff-analysis
# Good - reasonable range
devlog --from v1.0.0 --to v2.0.0 --llm ollama --diff-analysis
Don't Skip Testing
Always test generated changelogs before publishing:
# Generate to file
devlog --from v1.0.0 --to v2.0.0 --llm ollama --output DRAFT.md
# Review
cat DRAFT.md
# Publish if good
mv DRAFT.md CHANGELOG.md
Don't Ignore Privacy
Never send sensitive code to cloud providers:
# If code is sensitive, use local LLM only
devlog --llm ollama --diff-analysis
Documentation
Keep README Updated
Document your changelog generation process:
## Generating Changelog
We use Devlog to generate changelogs:
devlog --from v1.0.0 --to v2.0.0 --llm ollama --output CHANGELOG.md
Add Pre-Release Scripts
Create scripts for common tasks:
#!/bin/bash
# scripts/generate-changelog.sh
PREV_TAG=$(git describe --tags --abbrev=0)
devlog --from $PREV_TAG --to HEAD \
--llm ollama \
--diff-analysis \
--output RELEASE_NOTES.md
Version Control
Commit Generated Changelogs
git add CHANGELOG.md
git commit -m "docs: update changelog for v2.0.0"
Tag Releases Properly
Use semantic versioning:
git tag -a v2.0.0 -m "Release v2.0.0"
git push origin v2.0.0
Troubleshooting
Common issues and their solutions.
Installation Issues
"Command not found: devlog"
Cause: Binary not in system PATH
Solutions:
- Use full path: ./target/release/devlog
- Install to system: cargo install --path .
- Add to PATH: export PATH="$HOME/.cargo/bin:$PATH"
Build Fails with "linker error"
Cause: Missing system dependencies
Solutions:
# Ubuntu/Debian
sudo apt-get install build-essential libssl-dev pkg-config
# macOS
xcode-select --install
# Fedora/RHEL
sudo dnf install gcc openssl-devel
LLM Connection Issues
"Failed to connect to Ollama"
Check if Ollama is running:
curl http://localhost:11434/api/tags
Start Ollama:
ollama serve
Check firewall: Ensure port 11434 is not blocked
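These checks can be scripted as a readiness probe that retries briefly before giving up. A sketch assuming the default port; the `wait_for_ollama` function and the `OLLAMA_URL` override are our own conventions:

```shell
# Poll the Ollama API until it responds, or give up after N attempts.
wait_for_ollama() {
  local url="${OLLAMA_URL:-http://localhost:11434}"
  local attempts="${1:-5}"
  local i
  for ((i = 1; i <= attempts; i++)); do
    if curl -sf "$url/api/tags" >/dev/null 2>&1; then
      echo "ollama is up"
      return 0
    fi
    sleep 1
  done
  echo "ollama did not respond at $url" >&2
  return 1
}

# Usage: wait_for_ollama 10 && devlog --llm ollama ...
```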
"Model not found"
Pull the model:
ollama pull llama3.2
List available models:
ollama list
"Connection refused to llama.cpp"
Check server is running:
curl http://localhost:8080/health
Start llama.cpp server:
./server -m models/your-model.gguf -c 2048
Git Repository Issues
"No commits found in range"
Check tags exist:
git tag -l
Use commit hashes instead:
devlog --from abc1234 --to def5678
Check you're in a git repo:
git status
"Failed to open repository"
Ensure you're in git directory:
cd /path/to/git/repo
devlog --repo .
Check git is initialized:
git log --oneline
API Key Issues
"OpenAI API key not found"
Set environment variable:
export OPENAI_API_KEY="sk-..."
Or run without the variable set and enter the key interactively (not recommended for scripts):
devlog --llm openai --llm-model gpt-4
# Will prompt for key
"Anthropic authentication failed"
Set API key:
export ANTHROPIC_API_KEY="sk-ant-..."
Verify the key is set without printing the secret:
test -n "$ANTHROPIC_API_KEY" && echo "key is set"
Performance Issues
"Analysis is very slow"
Use smaller model:
# Instead of a large 70B model, use the small default:
ollama pull llama3.2 # Small default model
devlog --llm ollama --llm-model llama3.2
Limit commit range:
devlog --limit 50 # Instead of full history
Use plain mode for quick results:
devlog --from v1.0.0 --to v2.0.0 # No LLM
"Out of memory"
Increase system memory
Use llama.cpp with quantized model:
# Download smaller quantized model (Q4 instead of F16)
Process in smaller batches:
# Split into smaller ranges
devlog --from v1.0.0 --to v1.5.0
devlog --from v1.5.0 --to v2.0.0
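Splitting like this can be automated by walking consecutive tag pairs. A sketch that builds the commands rather than running them; the tag names and output filenames are illustrative:

```shell
# Build consecutive --from/--to commands from an ordered tag list.
tags=(v1.0.0 v1.5.0 v2.0.0)   # oldest -> newest
cmds=()
for ((i = 0; i + 1 < ${#tags[@]}; i++)); do
  from="${tags[i]}"
  to="${tags[i + 1]}"
  cmds+=("devlog --from $from --to $to --output notes-$to.md")
done

printf '%s\n' "${cmds[@]}"
```

In a real script, each entry would be executed in turn (e.g. with `eval` or by calling `devlog` directly inside the loop).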
Output Issues
"No output generated"
Check for errors:
devlog --from v1.0.0 --to v2.0.0 2>&1 | tee debug.log
Enable debug logging:
export RUST_LOG="devlog=debug"
devlog --from v1.0.0 --to v2.0.0
"Output is garbled"
Specify output file:
devlog --output CHANGELOG.md
Check terminal encoding: Ensure UTF-8 support
Privacy & Security
"Dry-run shows sensitive data"
Use stricter privacy mode:
devlog --llm openai --privacy-level strict --dry-run
Only use local LLMs for sensitive code:
devlog --llm ollama --diff-analysis
"Data sanitization too aggressive"
Use moderate mode:
devlog --llm openai --privacy-level moderate
Or use relaxed with local LLM:
devlog --llm ollama --privacy-level relaxed
Getting Help
Still having issues?
- Documentation: GitLab Pages
- GitLab Issues: https://gitlab.com/aice/devlog/-/issues
- Discussions: GitLab Discussions
When reporting issues, include:
- Operating system and version
- Rust version (rustc --version)
- Devlog version (devlog --version)
- Full error message
- The exact command you ran
Devlog Documentation
This directory contains comprehensive documentation for Devlog.
Documentation Structure
- index.md - Main landing page with overview and quick links
- getting-started.md - Installation and first steps
- features.md - Complete feature documentation
- usage-guide.md - Practical examples and use cases
- best-practices.md - Recommendations and tips
- troubleshooting.md - Common issues and solutions
Viewing Documentation
Online
The documentation is automatically deployed to:
- GitLab Pages: https://aice.gitlab.io/devlog/
Locally
You can view the documentation locally using mdBook:
# Install mdBook
cargo install mdbook
# Serve locally with hot reload
mdbook serve
# Or build static site
mdbook build
Contributing to Documentation
When adding or updating documentation:
- Follow the existing structure and format
- Use relative links between documents
- Keep examples practical and tested
- Ensure cross-links work correctly
Deployment
GitLab Pages
Documentation is automatically deployed via GitLab CI when changes are pushed to the main branch:
- Job: pages in .gitlab-ci.yml
- Trigger: push to the main branch with documentation changes
- Tool: mdBook (Rust-native documentation generator)
- URL: https://aice.gitlab.io/devlog/
License
Documentation is licensed under MIT License, same as the project.