
Embedding Drift in Your Test Framework

If you already have a test framework (Jest, Playwright, JUnit, pytest, etc.), you can embed Drift as a test case within it. This allows you to:

  • Run Drift tests alongside unit/integration tests in your CI/CD pipeline
  • Use your framework's reporting (JUnit, HTML, etc.)
  • Access mocks and stubs from your existing test infrastructure
  • Reuse test databases and fixtures
  • Combine with state management from the previous tutorial

The Architecture

ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
│     Your Test Framework      │
│   (Jest, Playwright, etc)    │
ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜
               │
               │ spawns process
               │
               ā–¼
ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
│        Drift Verifier        │
│       (CLI subprocess)       │
ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜
               │
               │ validates
               │
               ā–¼
ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
│       Your API Server        │
│   (Express, FastAPI, etc)    │
ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜
               │
               │ uses
               │
               ā–¼
ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
│       Shared Resources       │
│   - Database                 │
│   - Mocks/Stubs              │
│   - Test Fixtures            │
ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜

Why Embed Drift?

| Scenario | Benefit |
| --- | --- |
| You have Jest/pytest tests | Run Drift with the same setup/teardown as other tests |
| Need mocks/stubs for APIs | Access your mock server from Drift tests |
| Share database fixtures | Use the same test data as unit tests |
| Report in one place | Combine API contract tests with other tests |
| Run in CI/CD | A single test command runs everything |

Step 1: Create a Drift Wrapper

Create a utility function that spawns Drift as a subprocess and captures the exit code.

Node.js/Jest Example

Create automation/drift.js:

const { spawn } = require('child_process');

const runDrift = (options = {}) => {
  const {
    testFile = './drift.yaml',
    serverUrl = 'http://localhost:8080',
    outputDir = './output',
    logLevel = 'info'
  } = options;

  return new Promise((resolve, reject) => {
    console.log(`\nšŸ“‹ Running Drift tests from: ${testFile}\n`);

    const child = spawn('drift', [
      'verify',
      '--test-files', testFile,
      '--server-url', serverUrl,
      '--log-level', logLevel,
      '--output-dir', outputDir
    ], {
      stdio: 'inherit', // Shows Drift output in test output
      shell: true
    });

    child.on('error', (err) => {
      reject(new Error(`Failed to run Drift: ${err.message}`));
    });

    child.on('close', (code) => {
      resolve(code ?? 1);
    });
  });
};

module.exports = { runDrift };

Key options:

  • stdio: 'inherit' - Displays Drift output directly in your test runner
  • Exit code 0 = all tests passed, non-zero = failures

Python/pytest Example

Create automation/drift.py:

import subprocess

def run_drift(test_file='./drift.yaml', server_url='http://localhost:8080',
              output_dir='./output', log_level='info'):
    """Run the Drift verifier and return its exit code"""

    print(f"\nšŸ“‹ Running Drift tests from: {test_file}\n")

    result = subprocess.run([
        'drift',
        'verify',
        '--test-files', test_file,
        '--server-url', server_url,
        '--log-level', log_level,
        '--output-dir', output_dir
    ])

    return result.returncode

Step 2: Start Your API Server in the Test

Create a Jest test that starts your API server, then runs Drift.

Jest Example

Create tests/api.test.js:

const express = require('express');
const { runDrift } = require('../automation/drift');

// Use in-memory database for tests (no cleanup needed)
process.env.REPOSITORY_TYPE = 'inmemory';

describe('API Contract Tests', () => {
  let server;

  // Setup: Start the API server before tests
  beforeAll(async () => {
    const app = express();

    // Mount your API routes
    app.use(require('../src/product/routes'));
    app.use(require('../automation/test-routes')); // State mgmt routes

    // Start server on port 8080 and wait until it is listening
    await new Promise((resolve) => {
      server = app.listen(8080, resolve);
    });
    console.log('āœ“ API server started on http://localhost:8080');
  });

  // Cleanup: Stop the server after tests
  afterAll(async () => {
    return new Promise((resolve) => {
      server.close(resolve);
    });
  });

  // The actual test: Run Drift
  it('Validates API conforms to OpenAPI specification', async () => {
    const exitCode = await runDrift({
      testFile: './drift.yaml',
      serverUrl: 'http://localhost:8080',
      logLevel: 'debug'
    });

    // Drift exit code 0 = all tests passed
    expect(exitCode).toBe(0);
  });
});
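One caveat: a full Drift run can exceed Jest's default 5-second per-test timeout. You can raise the limit for this file (the 60-second value below is an arbitrary example; pick one that suits your suite):

```
// Raise Jest's per-test timeout for this file; a full Drift run
// can take longer than the default 5 seconds.
jest.setTimeout(60000);

// Or per test, via the timeout argument to it():
// it('Validates API conforms to OpenAPI specification', async () => { /* ... */ }, 60000);
```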

pytest Example

Create tests/test_api_contract.py:

import threading

import pytest
from your_app import create_app

@pytest.fixture(scope='module')
def api_server():
    """Start the API server for the test module"""
    app = create_app(config='testing')
    app.config['REPOSITORY_TYPE'] = 'inmemory'

    # app.run() blocks, so run it in a background daemon thread;
    # the thread is torn down automatically when pytest exits
    thread = threading.Thread(
        target=lambda: app.run(port=8080, debug=False, use_reloader=False),
        daemon=True,
    )
    thread.start()
    yield app

def test_api_conforms_to_openapi(api_server):
    """Drift validates the API against its OpenAPI spec"""
    from automation.drift import run_drift

    exit_code = run_drift(
        test_file='./drift.yaml',
        server_url='http://localhost:8080',
        log_level='debug',
    )

    assert exit_code == 0, "Drift tests failed"

Step 3: Combine with State Management

The Drift tests can use the state management pattern from the previous tutorial.

Your drift.yaml stays the same:

# yaml-language-server: $schema=https://download.pactflow.io/drift/schemas/drift.testcases.v1.schema.json
drift-testcase-file: v1
title: "Product API Tests"

sources:
  - name: source-oas
    path: ./openapi.yaml
  - name: state-mgmt
    path: ./state-management.lua

plugins:
  - name: oas
  - name: json

global:
  auth:
    apply: true
    parameters:
      authentication:
        scheme: bearer
        token: ${state-mgmt:bearer_token}

operations:
  getProductByID_Success:
    target: source-oas:getProductByID
    parameters:
      path:
        id: 10
    expected:
      response:
        statusCode: 200

The state-management.lua script (from the previous tutorial) handles setup/teardown automatically.

Step 4: Run Your Tests

# Run all tests (unit + Drift contract tests)
npm test

# Run only API contract tests
npm test -- api.test.js

# Run with specific log level
LOG_LEVEL=debug npm test

Output:

PASS tests/api.test.js
  API Contract Tests
    āœ“ Validates API conforms to OpenAPI specification (2345ms)

šŸ“‹ Running Drift tests from: ./drift.yaml

āœ“ getProductByID_Success
āœ“ getProductByID_NotFound
āœ“ createProduct_Success
āœ“ deleteProduct_Success

Test Suites: 1 passed, 1 total
Tests:       1 passed, 1 total
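Note that `LOG_LEVEL=debug npm test` only reaches Drift if the wrapper actually reads that variable; the `runDrift()` sketch in Step 1 takes `logLevel` as an option but does not consult the environment. A small helper (hypothetical name, shown as a sketch) can bridge the two:

```javascript
// Resolve the Drift log level from the environment, falling back to a default.
// This lets `LOG_LEVEL=debug npm test` flow through to `drift verify`.
function resolveLogLevel(defaultLevel = 'info') {
  return process.env.LOG_LEVEL || defaultLevel;
}

module.exports = { resolveLogLevel };
```

In the wrapper, replace `logLevel = 'info'` with `logLevel = resolveLogLevel()` so both the option and the environment variable work.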

Advanced: Multiple Test Variants

You can run Drift against different configurations:

describe('API Contract Tests', () => {
  // Test with in-memory database
  it('Works with in-memory database', async () => {
    process.env.REPOSITORY_TYPE = 'inmemory';
    const exitCode = await runDrift();
    expect(exitCode).toBe(0);
  });

  // Test with PostgreSQL
  it('Works with PostgreSQL', async () => {
    process.env.REPOSITORY_TYPE = 'postgres';
    process.env.DATABASE_URL = 'postgres://test:test@localhost/testdb';

    const exitCode = await runDrift({
      testFile: './drift-postgres.yaml'
    });
    expect(exitCode).toBe(0);
  });

  // Test specific tags only
  it('Smoke tests pass', async () => {
    const exitCode = await runDrift({
      // Use --tags flag in drift command
      testFile: './drift.yaml'
      // Would need to enhance runDrift() to support --tags
    });
    expect(exitCode).toBe(0);
  });
});
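The last variant assumes `runDrift()` forwards a `--tags` flag, which the Step 1 wrapper does not yet do. One way to add it is to build the argument list in a pure function so it can be unit-tested without spawning a process. This is a sketch: the comma-joined `--tags` value is an assumption, so check `drift verify --help` for the exact syntax before relying on it.

```javascript
// Build the CLI argument list for `drift verify`, including optional tags.
// Keeping this pure makes it easy to test without spawning a subprocess;
// the wrapper can then call spawn('drift', buildDriftArgs(options), ...).
function buildDriftArgs({
  testFile = './drift.yaml',
  serverUrl = 'http://localhost:8080',
  outputDir = './output',
  logLevel = 'info',
  tags = []
} = {}) {
  const args = [
    'verify',
    '--test-files', testFile,
    '--server-url', serverUrl,
    '--log-level', logLevel,
    '--output-dir', outputDir
  ];
  if (tags.length > 0) {
    // Assumed syntax: --tags smoke,regression
    args.push('--tags', tags.join(','));
  }
  return args;
}

module.exports = { buildDriftArgs };
```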

Combining with Existing Test Setup

If you have existing test setup (fixtures, database seeding, etc.), Drift can reuse it:

describe('Product API', () => {
  let database;
  let server;

  beforeAll(async () => {
    // Your existing setup
    database = await setupTestDatabase();
    await seedTestData(database);

    server = startServer(database);
  });

  // Your existing tests
  it('Creates a product', async () => {
    const response = await fetch('http://localhost:8080/products', {
      method: 'POST',
      body: JSON.stringify({ name: 'Test' })
    });
    expect(response.status).toBe(201);
  });

  // Drift test - uses same database and server
  it('Validates API contract with Drift', async () => {
    const exitCode = await runDrift();
    expect(exitCode).toBe(0);
  });

  afterAll(async () => {
    server.close();
    await database.close();
  });
});

Running in CI/CD

In your CI pipeline, Drift runs like any other test:

GitHub Actions

name: Tests
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:15
        env:
          POSTGRES_PASSWORD: test
        options: >-
          --health-cmd pg_isready
          --health-interval 10s

    steps:
      - uses: actions/checkout@v3

      - name: Install Drift
        run: |
          wget -O - https://download.pactflow.io/drift/latest/linux-x86_64.tgz | tar xz -C /usr/local/bin
          drift --version

      - name: Install dependencies
        run: npm install

      - name: Run all tests (including Drift)
        run: npm test
        env:
          NODE_ENV: test
          DATABASE_URL: postgres://postgres:test@localhost/testdb

GitLab CI

test:
  image: node:20
  services:
    - postgres:15
  script:
    - wget -O - https://download.pactflow.io/drift/latest/linux-x86_64.tgz | tar xz -C /usr/local/bin
    - drift --version
    - npm install
    - npm test
  variables:
    POSTGRES_DB: testdb
    POSTGRES_PASSWORD: test

Benefits of This Approach

āœ… Single test runner - Run all tests with one command
āœ… Shared infrastructure - Use same database, mocks, fixtures
āœ… Better reporting - All tests in one report
āœ… Faster feedback - Catch API contract breaks immediately
āœ… Easy maintenance - One repository, one deployment flow
āœ… Developer experience - npm test runs everything

Troubleshooting

Drift can't find the API

Ensure the server is started before Drift tests run:

beforeAll(async () => {
  // Wait for the listen callback instead of guessing at a delay
  await new Promise((resolve) => {
    server = app.listen(8080, resolve);
  });
});
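If the server takes longer to come up (for example, when it is spawned as a separate process rather than in the same Node process), a short readiness poll against a known route is more robust than any fixed delay. A sketch, assuming Node 18+ (global `fetch`) and that your API serves some GET route to poll; the `/health` path used here is hypothetical:

```javascript
// Poll the server until it answers, instead of sleeping a fixed amount.
// Any route your API serves will do; /health is an assumed example.
async function waitForServer(url, timeoutMs = 5000, intervalMs = 100) {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    try {
      const res = await fetch(url);
      if (res.ok) return true; // server is up and answering
    } catch (_) {
      // Connection refused: server not accepting connections yet
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Server at ${url} did not become ready within ${timeoutMs}ms`);
}

module.exports = { waitForServer };
```

Call it in `beforeAll` (e.g. `await waitForServer('http://localhost:8080/health')`) before handing control to Drift.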

Port already in use

Check if another process is using port 8080:

lsof -i :8080
# Kill if needed: kill -9 <PID>

Drift not on PATH

Make sure Drift is installed and in your PATH:

which drift
# If not found, install or add to PATH
export PATH="/path/to/drift:$PATH"

Different behavior in CI vs local

Ensure environment variables are the same:

console.log('NODE_ENV:', process.env.NODE_ENV);
console.log('REPOSITORY_TYPE:', process.env.REPOSITORY_TYPE);

Complete Working Example

The full implementation is available in the example repository:

šŸ“¦ View the complete example project

This includes:

  • Complete Jest test setup
  • Drift wrapper utility
  • State management integration
  • CI/CD configuration
  • Multiple database variants

Next Steps