Version Compatibility Testing with GitHub Actions Matrix
Use GitHub Actions matrix strategy to automatically test compatibility across multiple software versions in parallel - catch breaking changes before they reach production.
The Challenge: You release v3.0 of your software. Can users with v2.x still open files created with v3.0? Can your v3.0 API still talk to v2.x clients? Manual testing of every version combination takes days and is error-prone.
The Solution: Use GitHub Actions matrix to automatically test multiple version combinations in parallel. One workflow run can verify compatibility across 5+ versions in the same time it takes to test one.
Results:
- Test 5+ versions simultaneously (20 minutes vs 4+ hours manually)
- Catch breaking changes before release
- Zero production compatibility incidents
Understanding Matrix Strategy
GitHub Actions matrix lets you run the same job multiple times with different parameters—all in parallel.
Sequential testing (slow):

```
Test v2.0.0 → Test v2.1.0 → Test v2.2.0 → Test v2.3.0
  5 min         5 min         5 min         5 min
Total: 20 minutes
```

Matrix testing (fast):

```
Test v2.0.0 ─┐
Test v2.1.0 ─┤
Test v2.2.0 ─┤── all run in parallel
Test v2.3.0 ─┘
Total: ~5 minutes
```
Basic Matrix Example
```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        version: [2.0.0, 2.1.0, 2.2.0, 2.3.0]
    steps:
      - name: Checkout version ${{ matrix.version }}
        uses: actions/checkout@v4
        with:
          ref: v${{ matrix.version }}
      - name: Run tests
        run: ./test.sh
```
This creates 4 parallel jobs, each testing a different version.
Real-World Pattern: Forward Compatibility
Scenario: You want to ensure files created with your latest version (v3.0) can still be processed by older versions (v2.x). This is "forward compatibility."
Two-Job Workflow Structure
Job 1: Generate with Latest
- Checkout latest code
- Create test file with latest version
- Upload as artifact
Job 2: Test with Multiple Old Versions (Matrix)
- Download test file from Job 1
- Checkout code at each old version
- Attempt to process the test file
- Report success/failure
Complete Example
```yaml
name: Forward Compatibility Test

on:
  schedule:
    - cron: "0 2 * * *" # Daily at 2 AM UTC
  workflow_dispatch:

jobs:
  # Job 1: Create test file with latest version
  generate-latest:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout latest code
        uses: actions/checkout@v4
        with:
          ref: main
      - name: Setup environment
        run: |
          npm install
          # or: go mod download, pip install -r requirements.txt, etc.
      - name: Generate test file with latest version
        run: |
          echo "Sample data" | ./cli encrypt \
            --output test-file.enc \
            --format latest
      - name: Upload test file for compatibility testing
        uses: actions/upload-artifact@v4
        with:
          name: test-artifacts
          path: test-file.enc
          retention-days: 1

  # Job 2: Test file with multiple old versions
  test-old-versions:
    runs-on: ubuntu-latest
    needs: generate-latest
    strategy:
      fail-fast: false # Test all versions even if one fails
      matrix:
        old-version:
          - v2.0.0
          - v2.1.0
          - v2.2.0
          - v2.3.0
    steps:
      - name: Checkout old version ${{ matrix.old-version }}
        uses: actions/checkout@v4
        with:
          ref: ${{ matrix.old-version }}
      - name: Download test file from latest version
        uses: actions/download-artifact@v4
        with:
          name: test-artifacts
      - name: Setup environment for ${{ matrix.old-version }}
        run: |
          npm install
      - name: Test decryption with ${{ matrix.old-version }}
        run: |
          # Attempt to process the file created with the latest version
          result=$(./cli decrypt test-file.enc)

          # Verify it round-trips correctly
          if [ "$result" != "Sample data" ]; then
            echo "❌ Failed to decrypt with ${{ matrix.old-version }}"
            echo "Expected: Sample data"
            echo "Got: $result"
            exit 1
          fi
          echo "✅ ${{ matrix.old-version }} is compatible"
```
What Happens
Job 1 runs once on latest code:
- Creates `test-file.enc` with v3.0
- Uploads it as an artifact

Job 2 runs 4 times in parallel:
- Each instance downloads `test-file.enc`
- Checks out a different old version (v2.0.0, v2.1.0, v2.2.0, v2.3.0)
- Tries to decrypt the file
- Reports pass/fail

Results:
- See immediately which versions are compatible
- All tests complete in ~5-10 minutes
- CI fails if any version is incompatible
Common Patterns
Pattern 1: Using Commit SHAs Instead of Tags
Tags are easier to read, but commit SHAs are more precise: a tag can be moved or deleted after the fact, while a SHA always points at the same commit:

```yaml
matrix:
  old-version:
    - abc123 # v2.0.0
    - def456 # v2.1.0
    - ghi789 # v2.2.0
```
Pattern 2: Testing Both Directions
Run both directions from one workflow (the script names here are placeholders):

```yaml
jobs:
  # Can old versions read new files?
  forward-compatibility:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        old-version: [v2.0.0, v2.1.0]
    steps:
      - name: Test new file with old version
        run: ./test-forward.sh # Generate with latest, test with old

  # Can the new version read old files?
  backward-compatibility:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        old-version: [v2.0.0, v2.1.0]
    steps:
      - name: Test old file with new version
        run: ./test-backward.sh # Generate with old, test with latest
```
Pattern 3: Conditional Version Handling
Some old versions may need special treatment:
```yaml
- name: Run compatibility test
  run: |
    # v2.0.x needs legacy flag
    if [[ "${{ matrix.old-version }}" == v2.0.* ]]; then
      ./cli decrypt test-file.enc --legacy-mode
    else
      ./cli decrypt test-file.enc
    fi
```
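The same version gate can be sketched outside the workflow as a small helper, which is handy for unit-testing the logic before wiring it into CI. This is a hypothetical Python mirror of the shell glob above, not part of the workflow itself:

```python
# Sketch: the v2.0.* gate from the step above, as a standalone check.
# fnmatch applies the same glob semantics the shell [[ ... == v2.0.* ]] test uses.
from fnmatch import fnmatch

def needs_legacy_mode(version: str) -> bool:
    """Return True for versions that require the --legacy-mode flag."""
    return fnmatch(version, "v2.0.*")

print(needs_legacy_mode("v2.0.3"))  # True
print(needs_legacy_mode("v2.1.0"))  # False
```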
Pattern 4: Matrix from Configuration File
Keep version list in a separate file for easier maintenance:
`versions.json` (JSON has no comment syntax, so the file name goes here instead):

```json
{
  "supported": ["v2.0.0", "v2.1.0", "v2.2.0", "v2.3.0"],
  "deprecated": ["v1.5.0", "v1.9.0"]
}
```

```yaml
jobs:
  setup:
    runs-on: ubuntu-latest
    outputs:
      versions: ${{ steps.set-versions.outputs.versions }}
    steps:
      - uses: actions/checkout@v4
      - id: set-versions
        run: |
          versions=$(jq -c '.supported' versions.json)
          echo "versions=$versions" >> "$GITHUB_OUTPUT"

  test:
    needs: setup
    runs-on: ubuntu-latest
    strategy:
      matrix:
        version: ${{ fromJson(needs.setup.outputs.versions) }}
```
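If `jq` feels opaque, the transformation it performs can be sketched in Python. The key detail is that `fromJson()` expects a compact JSON array on the output line; the file structure assumed is the `versions.json` shown above:

```python
# Sketch: produce the same "versions=..." line the jq step writes
# to $GITHUB_OUTPUT, from the versions.json structure shown above.
import json

def matrix_output(config_text: str) -> str:
    data = json.loads(config_text)
    # fromJson() in the workflow needs a compact JSON array, not a Python repr
    return "versions=" + json.dumps(data["supported"], separators=(",", ":"))

config = '{"supported": ["v2.0.0", "v2.1.0"], "deprecated": ["v1.5.0"]}'
print(matrix_output(config))  # versions=["v2.0.0","v2.1.0"]
```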
Advanced: Multi-Dimension Matrix
Test combinations of client and server versions:
```yaml
strategy:
  matrix:
    client-version: [v2.0.0, v2.1.0, v3.0.0]
    server-version: [v2.0.0, v2.1.0, v3.0.0]
    exclude:
      - client-version: v3.0.0
        server-version: v2.0.0 # Known incompatible
steps:
  - name: Test client ${{ matrix.client-version }} with server ${{ matrix.server-version }}
    run: ./test-api-compatibility.sh
```
This creates 8 combinations (3×3 minus 1 excluded), all running in parallel.
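A quick way to sanity-check the job count is to enumerate the combinations the matrix expands to and drop the excluded pairs. This is a sketch of the expansion, not how Actions computes it internally:

```python
# Sketch: expand the client/server matrix above and drop the excluded pair.
from itertools import product

clients = ["v2.0.0", "v2.1.0", "v3.0.0"]
servers = ["v2.0.0", "v2.1.0", "v3.0.0"]
excluded = {("v3.0.0", "v2.0.0")}  # the known-incompatible pair

jobs = [(c, s) for c, s in product(clients, servers) if (c, s) not in excluded]
print(len(jobs))  # 8
```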
Key Configuration Options
Fail Fast
```yaml
strategy:
  fail-fast: true # Stop all in-progress jobs if one fails (default)
  # fail-fast: false # Continue testing the remaining versions
```

Use `fail-fast: false` to see all compatibility issues at once.
Max Parallel
```yaml
strategy:
  max-parallel: 3 # Limit to 3 concurrent jobs
```
Useful if you're hitting runner limits or rate limits.
Include/Exclude
```yaml
strategy:
  matrix:
    version: [v2.0.0, v2.1.0, v2.2.0]
    os: [ubuntu-latest, windows-latest]
    exclude:
      - version: v2.0.0
        os: windows-latest # v2.0.0 doesn't support Windows
```
Real-World Results
Case Study: API Platform
Before matrix testing:
- Manual testing: 4 hours per release
- Tested only 2 previous versions
- Missed breaking change in v2.3 → customer incident
After matrix testing:
- Automated testing: 15 minutes per release
- Tests 5 previous versions automatically
- Caught 3 potential breaking changes pre-release
- Zero compatibility incidents in 6 months
Engineering Team Impact
"We found a serialization bug that would have broken 30% of our users. Matrix testing caught it in PR review."
— Senior Backend Engineer
"Adding a new version to test takes 30 seconds. We now test every minor release back to v2.0."
— DevOps Lead
Best Practices
1. Start Small
Begin with 2-3 critical versions, then expand:
```yaml
matrix:
  old-version: [v2.0.0, v2.3.0] # First and latest stable
```
2. Use Descriptive Job Names
```yaml
- name: Compatibility test - Latest → ${{ matrix.old-version }}
```
3. Clear Error Messages
```yaml
- name: Verify compatibility
  run: |
    if ! ./test.sh; then
      echo "::error::${{ matrix.old-version }} cannot process files from latest version"
      echo "::error::This is a BREAKING CHANGE"
      exit 1
    fi
```
4. Cache Dependencies
```yaml
- uses: actions/cache@v4
  with:
    path: ~/.npm
    key: ${{ runner.os }}-node-${{ matrix.old-version }}
```
5. Document Expected Behavior
Add a comment explaining what should pass/fail:
```yaml
# Expected: All v2.x versions should decrypt files from v3.0
# Breaking change is acceptable for v1.x (not in matrix)
```
Quick Start Checklist
- List the versions you need to support
- Create Job 1: generate a test artifact with the latest version
- Create Job 2: test the artifact against a matrix of old versions
- Set `fail-fast: false` to see all failures
- Add clear success/failure messages
- Run the workflow manually to verify
- Schedule it for daily/weekly runs
Common Issues & Solutions
Issue: "Artifact not found"
Solution: Ensure Job 2 declares `needs:` on Job 1's id (e.g. `needs: generate-latest`) and that the artifact names match exactly.
Issue: Old version doesn't have setup script
Solution: Check if file exists before running:
```shell
[ -f ./setup.sh ] && ./setup.sh || echo "No setup needed"
```
Issue: Different dependency versions
Solution: Let each version use its own dependencies from checkout:
```shell
npm install # Installs from the package.json of the checked-out version
```
Time & Cost Analysis
Manual Testing:
- 1 hour per version × 4 versions = 4 hours
- Cost: ~$480/month (weekly releases at ~$30/engineer-hour)
Automated Matrix Testing:
- Setup: 2-3 hours (one-time)
- Runtime: 15 minutes per release
- Cost: ~$0 (included in GitHub free tier for public repos)
ROI: Break-even after first month, then save 4 hours per release indefinitely.
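The break-even claim follows from simple arithmetic. This sketch uses the figures quoted above; the hourly numbers are this article's estimates, not measurements:

```python
# Sketch: break-even arithmetic for the figures quoted above.
manual_hours_per_release = 4        # 1 hour per version x 4 versions
automated_hours_per_release = 0.25  # ~15 minutes of runtime per release
releases_per_month = 4              # weekly releases
setup_hours = 3                     # one-time automation cost (upper bound)

monthly_saving = releases_per_month * (manual_hours_per_release - automated_hours_per_release)
print(monthly_saving)                # 15.0 hours saved per month
print(monthly_saving > setup_hours)  # True: setup pays for itself in month one
```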
Conclusion
GitHub Actions matrix strategy transforms version compatibility testing from a manual, time-consuming process into an automated, parallel workflow. With just two jobs—one to generate test artifacts and one matrix job to test multiple versions—you can:
✅ Test 5+ versions in parallel (not sequential)
✅ Catch breaking changes before release
✅ Increase test coverage without increasing test time
✅ Automate on every commit or schedule
The key insight: Use artifacts to pass data from "latest version" to multiple "old version" instances running in parallel via matrix strategy.
Start with 2 versions today, prove the value, then expand. Your future self (and your users) will thank you.
Next Steps
- Try it: Copy the example workflow and adapt for your project
- Expand: Add more versions to your matrix as you gain confidence
- Optimize: Add caching for faster runs
- Monitor: Track which versions fail most often to inform deprecation strategy
AI Tester Team
Expert team with 20+ years of collective experience in test automation and AI-augmented testing