Best Practices

Recommendations for getting the most out of Devlog.


Commit Message Best Practices

Use Conventional Commits

Follow the Conventional Commits specification for best results:

<type>(<scope>): <description>

[optional body]

[optional footer]

Good Examples:

feat(auth): add JWT token validation
fix(api): correct rate limiting logic
docs(readme): update installation instructions

Avoid:

updated stuff
misc changes
WIP
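The format above can be enforced locally with a Git commit-msg hook. The sketch below is hypothetical (the type list and regex are illustrative, not part of Devlog or the Conventional Commits spec itself):

```shell
#!/bin/sh
# Minimal commit-msg hook sketch: save as .git/hooks/commit-msg and make it
# executable. Rejects first lines that don't follow Conventional Commits.
# The type list here is illustrative, not exhaustive.
CC_PATTERN='^(feat|fix|docs|style|refactor|perf|test|build|ci|chore|revert)(\([a-z0-9-]+\))?!?: .+'

# Git passes the path of the commit message file as $1.
if [ -n "$1" ] && ! head -n 1 "$1" | grep -qE "$CC_PATTERN"; then
  echo "commit-msg: first line must match '<type>(<scope>): <description>'" >&2
  exit 1
fi
```

With this in place, messages like "feat(auth): add JWT token validation" pass, while "updated stuff" is rejected before the commit is created.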

Be Descriptive

Write clear, descriptive commit messages that explain both what changed and why:

feat(auth): add JWT token validation

Implements JWT token validation middleware to secure API endpoints.
Uses RS256 algorithm for signature verification.

Closes #123

Using Devlog Effectively

Choose the Right Mode

Plain Mode - Quick changelog without AI:

devlog --from v1.0.0 --to v2.0.0

Use when: Speed is the priority, simple overview needed

AI-Enhanced - Improved summaries:

devlog --from v1.0.0 --to v2.0.0 --llm ollama

Use when: Better readability needed, commit messages are unclear

Diff Analysis - Detailed code analysis:

devlog --from v1.0.0 --to v2.0.0 --llm ollama --diff-analysis

Use when: Need technical details, understanding code changes is important

Feature-Grouped - High-level overview:

devlog --from v1.0.0 --to v2.0.0 --llm ollama --diff-analysis --group-features

Use when: Presenting to stakeholders, creating release notes

Optimize Performance

Limit Scope:

# Instead of full history
devlog --llm ollama --diff-analysis

# Use reasonable ranges
devlog --from v1.0.0 --to v2.0.0 --llm ollama --diff-analysis

Use Smaller Models:

devlog --llm ollama --llm-model llama3.2  # small model, faster

Cache Results:

# Save to file for reuse
devlog --from v1.0.0 --to v2.0.0 --llm ollama --output changelog.md

Privacy Recommendations

For Open Source Projects

Use local LLMs to protect contributor privacy:

devlog --llm ollama --diff-analysis

For Private/Enterprise Code

Never use cloud LLMs with diff analysis:

# Good - local only
devlog --llm ollama --diff-analysis

# Bad - would send code to the cloud (devlog rejects this by design)
devlog --llm openai --diff-analysis

For commit message summaries (no code):

# Use strict mode (default)
devlog --llm openai --privacy-level strict

# Or moderate if file paths are not sensitive
devlog --llm openai --privacy-level moderate

Always Verify with Dry-Run

Before using cloud providers:

devlog --llm openai --dry-run

CI/CD Best Practices

Pin Versions

# Don't use latest
- cargo install devlog  # Bad

# Pin to specific version or commit
- cargo install --git https://gitlab.com/aice/devlog.git --rev abc1234  # Good

Skip Interactive Prompts

Skip interactive consent prompts in CI:

devlog --llm ollama --no-consent-prompt --output CHANGELOG.md

Cache Dependencies

Speed up CI builds by caching Ollama models:

- name: Cache Ollama models
  uses: actions/cache@v3
  with:
    path: ~/.ollama
    key: ollama-llama3.2

Output Organization

Maintain CHANGELOG.md

Keep a versioned changelog file:

CHANGELOG.md
├── ## [Unreleased]
├── ## [2.0.0] - 2024-01-15
├── ## [1.0.0] - 2023-12-01
└── ...

Separate Release Notes

Create per-release files:

releases/
├── v2.0.0.md
├── v1.0.0.md
└── ...

Use Consistent Format

Stick to one format across your project:

# Always use the same flags
devlog --from "$PREV_TAG" --to "$NEW_TAG" \
       --llm ollama \
       --diff-analysis \
       --output RELEASE_NOTES.md

Model Selection

For Speed

devlog --llm ollama --llm-model llama3.2  # Small, fast

For Quality

devlog --llm ollama --llm-model codellama  # Better code understanding

For Privacy

devlog --llm llamacpp  # Maximum control, fully local

Common Pitfalls

Don't Over-Analyze

Analyzing thousands of commits is slow and often unnecessary:

# Bad - entire history
devlog --llm ollama --diff-analysis

# Good - reasonable range
devlog --from v1.0.0 --to v2.0.0 --llm ollama --diff-analysis

Don't Skip Testing

Always test generated changelogs before publishing:

# Generate to file
devlog --from v1.0.0 --to v2.0.0 --llm ollama --output DRAFT.md

# Review
cat DRAFT.md

# Publish if good
mv DRAFT.md CHANGELOG.md

Don't Ignore Privacy

Never send sensitive code to cloud providers:

# If code is sensitive, use local LLM only
devlog --llm ollama --diff-analysis

Documentation

Keep README Updated

Document your changelog generation process:

## Generating Changelog

We use Devlog to generate changelogs:

```bash
devlog --from v1.0.0 --to v2.0.0 --llm ollama --output CHANGELOG.md
```

Add Pre-Release Scripts

Create scripts for common tasks:

#!/bin/bash
# scripts/generate-changelog.sh
PREV_TAG=$(git describe --tags --abbrev=0)
devlog --from "$PREV_TAG" --to HEAD \
       --llm ollama \
       --diff-analysis \
       --output RELEASE_NOTES.md

Version Control

Commit Generated Changelogs

git add CHANGELOG.md
git commit -m "docs: update changelog for v2.0.0"

Tag Releases Properly

Use semantic versioning:

git tag -a v2.0.0 -m "Release v2.0.0"
git push origin v2.0.0
