# Embedding Drift in Your Test Framework
If you already have a test framework (Jest, Playwright, JUnit, pytest, etc.), you can embed Drift as a test case within it. This allows you to:
- Run Drift tests alongside unit/integration tests in your CI/CD pipeline
- Use your framework's reporting (JUnit, HTML, etc.)
- Access mocks and stubs from your existing test infrastructure
- Reuse test databases and fixtures
- Combine with state management from the previous tutorial
## The Architecture

```
┌──────────────────────────────┐
│     Your Test Framework      │
│   (Jest, Playwright, etc)    │
└──────────────┬───────────────┘
               │ spawns process
               ▼
┌──────────────────────────────┐
│        Drift Verifier        │
│       (CLI subprocess)       │
└──────────────┬───────────────┘
               │ validates
               ▼
┌──────────────────────────────┐
│       Your API Server        │
│   (Express, FastAPI, etc)    │
└──────────────┬───────────────┘
               │ uses
               ▼
┌──────────────────────────────┐
│       Shared Resources       │
│  - Database                  │
│  - Mocks/Stubs               │
│  - Test Fixtures             │
└──────────────────────────────┘
```
## Why Embed Drift?
| Scenario | Benefit |
|---|---|
| You have Jest/pytest tests | Run Drift with same setup/teardown as other tests |
| Need mocks/stubs for APIs | Access your mock server from Drift tests |
| Share database fixtures | Use same test data as unit tests |
| Report in one place | Combine API contract tests with other tests |
| Run in CI/CD | Single test command runs everything |
## Step 1: Create a Drift Wrapper
Create a utility function that spawns Drift as a subprocess and captures the exit code.
### Node.js/Jest Example

Create `automation/drift.js`:

```js
const { spawn } = require('child_process');

const runDrift = (options = {}) => {
  const {
    testFile = './drift.yaml',
    serverUrl = 'http://localhost:8080',
    outputDir = './output',
    logLevel = 'info'
  } = options;

  return new Promise((resolve, reject) => {
    console.log(`\n🚀 Running Drift tests from: ${testFile}\n`);

    const child = spawn('drift', [
      'verify',
      '--test-files', testFile,
      '--server-url', serverUrl,
      '--log-level', logLevel,
      '--output-dir', outputDir
    ], {
      stdio: 'inherit', // Shows Drift output in test output
      shell: true
    });

    child.on('error', (err) => {
      reject(new Error(`Failed to run Drift: ${err.message}`));
    });

    child.on('close', (code) => {
      resolve(code ?? 1);
    });
  });
};

module.exports = { runDrift };
```
Key options:

- `stdio: 'inherit'` - displays Drift output directly in your test runner
- Exit code `0` = all tests passed; non-zero = failures
### Python/pytest Example

Create `automation/drift.py`:

```python
import subprocess


def run_drift(test_file='./drift.yaml', server_url='http://localhost:8080',
              output_dir='./output', log_level='info'):
    """Run the Drift verifier and return its exit code."""
    print(f"\n🚀 Running Drift tests from: {test_file}\n")
    result = subprocess.run([
        'drift',
        'verify',
        '--test-files', test_file,
        '--server-url', server_url,
        '--log-level', log_level,
        '--output-dir', output_dir
    ])
    return result.returncode
```
## Step 2: Start Your API Server in the Test

Create a Jest test that starts your API server, then runs Drift.

### Jest Example

Create `tests/api.test.js`:

```js
const express = require('express');
const { runDrift } = require('../automation/drift');

// Use an in-memory database for tests (no cleanup needed)
process.env.REPOSITORY_TYPE = 'inmemory';

describe('API Contract Tests', () => {
  let server;

  // Setup: start the API server before the tests run
  beforeAll(async () => {
    const app = express();

    // Mount your API routes
    app.use(require('../src/product/routes'));
    app.use(require('../automation/test-routes')); // State mgmt routes

    // Start the server on port 8080 and wait until it is listening
    await new Promise((resolve) => {
      server = app.listen(8080, resolve);
    });
    console.log('✅ API server started on http://localhost:8080');
  });

  // Cleanup: stop the server after the tests finish
  afterAll(() => {
    return new Promise((resolve) => {
      server.close(resolve);
    });
  });

  // The actual test: run Drift
  it('Validates API conforms to OpenAPI specification', async () => {
    const exitCode = await runDrift({
      testFile: './drift.yaml',
      serverUrl: 'http://localhost:8080',
      logLevel: 'debug'
    });

    // Drift exit code 0 = all tests passed
    expect(exitCode).toBe(0);
  });
});
```
### pytest Example

Create `tests/test_api_contract.py`:

```python
import threading

import pytest
from werkzeug.serving import make_server

from your_app import create_app


@pytest.fixture(scope='module')
def api_server():
    """Start the API server in a background thread for the test module."""
    app = create_app(config='testing')
    app.config['REPOSITORY_TYPE'] = 'inmemory'

    # app.run() blocks, so serve the app from a background thread instead
    server = make_server('localhost', 8080, app)
    thread = threading.Thread(target=server.serve_forever, daemon=True)
    thread.start()

    yield server

    server.shutdown()
    thread.join()


def test_api_conforms_to_openapi(api_server):
    """Drift validates the API against its OpenAPI spec."""
    from automation.drift import run_drift

    exit_code = run_drift(
        test_file='./drift.yaml',
        server_url='http://localhost:8080',
        log_level='debug'
    )
    assert exit_code == 0, "Drift tests failed"
```
## Step 3: Combine with State Management

The Drift tests can use the state management pattern from the previous tutorial.

Your `drift.yaml` stays the same:

```yaml
# yaml-language-server: $schema=https://download.pactflow.io/drift/schemas/drift.testcases.v1.schema.json
drift-testcase-file: v1
title: "Product API Tests"

sources:
  - name: source-oas
    path: ./openapi.yaml
  - name: state-mgmt
    path: ./state-management.lua

plugins:
  - name: oas
  - name: json

global:
  auth:
    apply: true
    parameters:
      authentication:
        scheme: bearer
        token: ${state-mgmt:bearer_token}

operations:
  getProductByID_Success:
    target: source-oas:getProductByID
    parameters:
      path:
        id: 10
    expected:
      response:
        statusCode: 200
```

The `state-management.lua` script (from the previous tutorial) handles setup and teardown automatically.
## Step 4: Run Your Tests

```bash
# Run all tests (unit + Drift contract tests)
npm test

# Run only the API contract tests
npm test -- api.test.js

# Run with a specific log level
LOG_LEVEL=debug npm test
```

Output:

```
PASS tests/api.test.js
  API Contract Tests
    ✓ Validates API conforms to OpenAPI specification (2345ms)

🚀 Running Drift tests from: ./drift.yaml

✓ getProductByID_Success
✓ getProductByID_NotFound
✓ createProduct_Success
✓ deleteProduct_Success

Test Suites: 1 passed, 1 total
Tests:       1 passed, 1 total
```
## Advanced: Multiple Test Variants

You can run Drift against different configurations:

```js
describe('API Contract Tests', () => {
  // Test with the in-memory database
  it('Works with in-memory database', async () => {
    process.env.REPOSITORY_TYPE = 'inmemory';
    const exitCode = await runDrift();
    expect(exitCode).toBe(0);
  });

  // Test with PostgreSQL
  it('Works with PostgreSQL', async () => {
    process.env.REPOSITORY_TYPE = 'postgres';
    process.env.DATABASE_URL = 'postgres://test:test@localhost/testdb';
    const exitCode = await runDrift({
      testFile: './drift-postgres.yaml'
    });
    expect(exitCode).toBe(0);
  });

  // Test specific tags only
  it('Smoke tests pass', async () => {
    const exitCode = await runDrift({
      // Uses the --tags flag in the drift command;
      // runDrift() would need to be enhanced to support --tags
      testFile: './drift.yaml'
    });
    expect(exitCode).toBe(0);
  });
});
```
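The smoke-test variant notes that `runDrift()` would need to be enhanced to support `--tags`. One way to sketch that enhancement is to factor argument construction into a pure helper that the wrapper can pass to `spawn`. The `buildDriftArgs` name is invented here for illustration; the flags mirror the wrapper from Step 1:

```javascript
// Hypothetical helper: builds the argument list for the `drift` CLI,
// optionally appending a --tags filter. Kept pure so it is easy to test.
const buildDriftArgs = ({
  testFile = './drift.yaml',
  serverUrl = 'http://localhost:8080',
  outputDir = './output',
  logLevel = 'info',
  tags = []
} = {}) => {
  const args = [
    'verify',
    '--test-files', testFile,
    '--server-url', serverUrl,
    '--log-level', logLevel,
    '--output-dir', outputDir
  ];
  // Only add the tag filter when tags were requested
  if (tags.length > 0) {
    args.push('--tags', tags.join(','));
  }
  return args;
};

module.exports = { buildDriftArgs };
```

`runDrift()` can then call `spawn('drift', buildDriftArgs(options), ...)` and stay otherwise unchanged.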
## Combining with Existing Test Setup

If you have existing test setup (fixtures, database seeding, etc.), Drift reuses it:

```js
describe('Product API', () => {
  let database;
  let server;

  beforeAll(async () => {
    // Your existing setup
    database = await setupTestDatabase();
    await seedTestData(database);
    server = startServer(database);
  });

  // Your existing tests
  it('Creates a product', async () => {
    const response = await fetch('http://localhost:8080/products', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ name: 'Test' })
    });
    expect(response.status).toBe(201);
  });

  // Drift test - uses the same database and server
  it('Validates API contract with Drift', async () => {
    const exitCode = await runDrift();
    expect(exitCode).toBe(0);
  });

  afterAll(async () => {
    server.close();
    await database.close();
  });
});
```
## Running in CI/CD

In your CI pipeline, Drift runs like any other test:

### GitHub Actions

```yaml
name: Tests

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest

    services:
      postgres:
        image: postgres:15
        env:
          POSTGRES_PASSWORD: test
        options: >-
          --health-cmd pg_isready
          --health-interval 10s

    steps:
      - uses: actions/checkout@v3

      - name: Install Drift
        run: |
          wget -O - https://download.pactflow.io/drift/latest/linux-x86_64.tgz | tar xz -C /usr/local/bin
          drift --version

      - name: Install dependencies
        run: npm install

      - name: Run all tests (including Drift)
        run: npm test
        env:
          NODE_ENV: test
          DATABASE_URL: postgres://postgres:test@localhost/testdb
```
### GitLab CI

```yaml
test:
  image: node:20
  services:
    - postgres:15
  script:
    - wget -O - https://download.pactflow.io/drift/latest/linux-x86_64.tgz | tar xz -C /usr/local/bin
    - drift --version
    - npm install
    - npm test
  variables:
    POSTGRES_DB: testdb
    POSTGRES_PASSWORD: test
```
## Benefits of This Approach

- ✅ **Single test runner** - run all tests with one command
- ✅ **Shared infrastructure** - use the same database, mocks, and fixtures
- ✅ **Better reporting** - all tests in one report
- ✅ **Faster feedback** - catch API contract breaks immediately
- ✅ **Easy maintenance** - one repository, one deployment flow
- ✅ **Developer experience** - `npm test` runs everything
## Troubleshooting

### Drift can't find the API

Ensure the server is started before the Drift tests run:

```js
beforeAll(async () => {
  server = app.listen(8080);
  // Give the server time to start
  await new Promise(resolve => setTimeout(resolve, 100));
});
```
### Port already in use

Check whether another process is using port 8080:

```bash
lsof -i :8080
# Kill it if needed: kill -9 <PID>
```
### Drift not on PATH

Make sure Drift is installed and on your `PATH`:

```bash
which drift
# If not found, install it or add it to your PATH:
export PATH="/path/to/drift:$PATH"
```
### Different behavior in CI vs local

Ensure the environment variables are the same in both environments:

```js
console.log('NODE_ENV:', process.env.NODE_ENV);
console.log('REPOSITORY_TYPE:', process.env.REPOSITORY_TYPE);
```
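Logging is a start; failing fast when a required variable is missing surfaces the mismatch before any test runs. A small sketch (`checkRequiredEnv` is a name invented here for illustration):

```javascript
// Return the names of required environment variables that are unset or
// empty, so test setup can fail fast with a clear message.
const checkRequiredEnv = (required, env = process.env) =>
  required.filter((name) => env[name] === undefined || env[name] === '');

module.exports = { checkRequiredEnv };
```

For example, in `beforeAll`: `const missing = checkRequiredEnv(['NODE_ENV', 'REPOSITORY_TYPE']); if (missing.length) throw new Error(\`Missing env vars: ${missing.join(', ')}\`);`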
## Complete Working Example

The full implementation is available in the example repository:

📦 View the complete example project

This includes:

- Complete Jest test setup
- Drift wrapper utility
- State management integration
- CI/CD configuration
- Multiple database variants
## Next Steps

- **Lifecycle Hooks** - learn about event handlers in Drift
- **CI/CD Integration** - deploy Drift tests in your pipeline
- **Debugging** - troubleshoot failing tests
- **Testing with State Dependencies** - the previous tutorial on state management