
Testing Schemas

The atlas schema test command allows you to write tests for your database schema using familiar software testing paradigms.

Schema testing is available exclusively to Atlas Pro users (both free and paid plans). To use this feature, run:

atlas login

Introduction

Atlas schema testing is inspired by the way popular databases test their public (and private) APIs. The test structure is defined in HCL files (suffixed with .test.hcl), while the underlying testing logic is written in plain SQL. The following document describes the different structure options, flags, and capabilities supported by the testing framework.

Flags

  • --url (-u) - a list of URLs to the tested schema: can be a database URL, an HCL, SQL or ORM schema, or a migration directory.
  • --dev-url - a URL to the Dev-Database to run the tests on.
  • --run (optional) - run only tests matching the given regexp.

Examples

atlas schema test --dev-url "docker://postgres/15/dev" --url "file://schema.hcl" .
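To run only the test cases whose names match a regular expression, add the --run flag to the command above. A minimal sketch, assuming a test named "postal" is defined in the current directory:

atlas schema test --dev-url "docker://postgres/15/dev" --url "file://schema.hcl" --run "postal" .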

The test "schema" block

The test "schema" "<name>" block describes a test case. The second label defines the test case name, and the two arguments below are supported:

  • skip (bool) - indicates whether the test case should be executed or skipped (can be computed). Defaults to false.
  • parallel (bool) - indicates whether this test case can be executed in parallel with all other cases defined this way. Test cases that are stateless (e.g., without INSERT or CREATE commands) should be set as parallel tests to speed up test time.

Before running a test case, Atlas creates the desired schema on the dev-database, making the environment ready for testing, and cleans the schema after the test is done, regardless of its result.

Example

test "schema" "postal" {
parallel = true
# The "exec" command is explained below.
exec {
sql = "select '12345'::us_postal_code"
}
# The "catch" command is explained below.
catch {
sql = "select 'hello'::us_postal_code"
}
}

A test case is composed of zero or more commands that are executed in order, and it is aborted if any of the commands fail. The supported commands are:

exec command

The exec command expects an SQL statement to pass. If output or match is defined, the output of the SQL statement is compared to the expected value.

  • sql (string) - the SQL statement to execute.
  • format (optional) - the format of the output (default: csv). Can be table or csv.
  • output (optional) - the expected output of the SQL statement.
  • match (optional) - a regular expression to match the output of the SQL statement.
schema.test.hcl
test "schema" "postal" {
# Expected exec to pass.
exec {
sql = <<SQL
CREATE TABLE t (a INT, b TEXT);
INSERT INTO t VALUES (1, 'one');
INSERT INTO t VALUES (2, 'two');
SQL
}
# Expected exec to pass and output
# be equal to the expected table.
exec {
sql = "SELECT a, b FROM t;"
format = table
output = <<TAB
a | b
---+-----
1 | one
2 | two
TAB
}
}

catch command

The catch command expects an SQL statement to fail. If error is defined, the error message is compared to the expected value.

  • sql (string) - the SQL statement to execute.
  • error (optional) - matches the error message.
schema.test.hcl
test "schema" "postal" {
catch {
sql = "SELECT 1+"
error = "incomplete input"
}
}

assert command

The assert command expects an SQL statement to pass and its output to be a single row, with a single column, containing a true value.

  • sql (string) - the SQL statement to execute.
  • error_message (optional) - the error message to display if the assertion fails.
schema.test.hcl
test "schema" "postal" {
assert {
sql = "SELECT json_valid('{}')"
}
assert {
sql = "SELECT json_valid('{')"
error_message = "expects a valid JSON"
}
}

log command

The log command logs a message to the test output. It can be useful to report the progress of the test case.

  • message (string) - the message to log.
schema.test.hcl
test "schema" "seed" {
exec {
sql = "SELECT seed()"
}
log {
message = "Seeded the database"
}
}

external command

The external command runs an external program and expects it to pass. If output or match is defined, the output (stdout) of the program is compared to the expected value.

  • program ([]string) - The first element of the list is the program to run. The remaining elements are optional command-line arguments.
  • working_dir (optional) - The working directory to run the program from. Defaults to the current working directory.
  • output (optional) - The expected output of the program.
  • match (optional) - A regular expression to match the output of the program.
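For example, a test case might run an external verification script and match its standard output. A minimal sketch, where the program, its arguments, and the expected output are illustrative assumptions rather than part of Atlas:

schema.test.hcl
test "schema" "verify" {
  external {
    # Hypothetical script; replace with your own program and arguments.
    program = ["python3", "scripts/verify_schema.py"]
    match   = "OK"
  }
}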

cleanup command

The cleanup command runs an SQL statement after the test case is done, regardless of its result. Note that cleanup commands are invoked in the reverse order in which they are defined.

  • sql (string) - the SQL statement to execute.
schema.test.hcl
test "schema" "rls" {
exec {
sql = "SELECT SELECT set_config('app.current_tenant', 'a8m', false)"
}
cleanup {
sql = "SELECT set_config('app.current_tenant', NULL, false)"
}
# Test logic goes here.
}
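To illustrate the ordering, here is a minimal sketch (with illustrative table names): because cleanup commands run in reverse definition order, the dependent children table is dropped before the parents table it references.

schema.test.hcl
test "schema" "cleanup_order" {
  exec {
    sql = "CREATE TABLE parents (id int PRIMARY KEY)"
  }
  cleanup {
    # Runs second, after the dependent "children" table has been dropped.
    sql = "DROP TABLE parents"
  }
  exec {
    sql = "CREATE TABLE children (id int PRIMARY KEY, parent_id int REFERENCES parents (id))"
  }
  cleanup {
    # Runs first: cleanup commands execute in reverse definition order.
    sql = "DROP TABLE children"
  }
  # Test logic goes here.
}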

Input Variables

Test files can be parameterized using variables, and their values can be set through the atlas.hcl config file. For example, given this test file:

schema.test.hcl
variable "seed_file" {
type = string
}

test "schema" "seed" {
exec {
sql = "SELECT seed('${var.seed_file}')"
}
}

Test configuration can be defined in the env block (or globally) and executed using the --env flag:

atlas.hcl
env "dev" {
src = "<schema to test>"
dev = "<docker-based dev-url>"

# Test configuration for local development.
test {
schema {
src = ["schema.test.hcl"]
vars = {
seed_file = "filename.sql"
variable2 = var.name
variable3 = data.external.value
}
}
}
}
atlas schema test --env dev
Note: Input variables can be defined statically per environment, injected from the CLI using the --var flag, or computed from a data source.
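For example, the seed_file variable defined above could also be supplied from the command line. A sketch, assuming the usual key=value form of the --var flag:

atlas schema test --env dev --var seed_file=filename.sql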

Table Driven Tests

Test blocks support the for_each meta-argument, which accepts a map or a set of values and is used to generate a test case for each item in the set or map. See the example below:

schema.test.hcl
test "schema" "seed" {
for_each = [
{input: "hello", expected: "HELLO"},
{input: "world", expected: "WORLD"},
]
log {
message = "Testing ${each.value.input} -> ${each.value.expected}"
}
exec {
sql = "SELECT upper('${each.value.input}')"
output = each.value.expected
}
}
atlas schema test --env dev
Test Output
-- PASS: seed/1 (901µs)
schema.test.hcl:6: Testing hello -> HELLO
-- PASS: seed/2 (89µs)
schema.test.hcl:6: Testing world -> WORLD
PASS

Testing Queries

With Atlas schema testing, users can validate the correctness of their SQL queries. To test a query:

  1. Seed the tables the query references with test data.
  2. Execute the query.
  3. Compare the result against the expected output attribute.

Suppose you have a schema file that defines three tables: products, users, and orders, along with a query that returns the top 3 users by total amount spent:

schema.sql
-- Create "products" table
CREATE TABLE "products" (
"id" serial NOT NULL,
"name" text NOT NULL,
"price" numeric(10,2) NOT NULL,
PRIMARY KEY ("id")
);

-- Create "users" table
CREATE TABLE "users" (
"id" serial NOT NULL,
"name" text NOT NULL,
"email" text NOT NULL,
PRIMARY KEY ("id"),
CONSTRAINT "users_email_key" UNIQUE ("email")
);

-- Create "orders" table
CREATE TABLE "orders" (
"id" serial NOT NULL,
"user_id" integer NOT NULL,
"product_id" integer NOT NULL,
"quantity" integer NOT NULL,
"created_at" timestamptz NULL DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY ("id"),
CONSTRAINT "orders_product_id_fkey" FOREIGN KEY ("product_id") REFERENCES "products" ("id"),
CONSTRAINT "orders_user_id_fkey" FOREIGN KEY ("user_id") REFERENCES "users" ("id"),
CONSTRAINT "orders_quantity_check" CHECK (quantity > 0)
);

Let's create a test file that seeds the tables with test data and verifies the query's output. The tested SQL query can be defined inline in the test file, injected via variables, read from an external file, or executed through an external program.

schema.test.hcl
test "schema" "query_top_users" {
exec {
sql = <<-SQL
-- Seed: users
INSERT INTO users (id, name, email) VALUES
(1, 'Alice', 'alice@example.com'),
(2, 'Bob', 'bob@example.com'),
(3, 'Charlie', 'charlie@example.com'),
(4, 'Dana', 'dana@example.com');

-- Seed: products
INSERT INTO products (id, name, price) VALUES
(1, 'Widget', 10.00),
(2, 'Gadget', 25.00),
(3, 'Doohickey', 15.00);

-- Seed: orders
INSERT INTO orders (user_id, product_id, quantity, created_at) VALUES
(1, 1, 2, '2024-01-01'), -- Alice buys 2 Widgets
(1, 2, 1, '2024-01-02'), -- Alice buys 1 Gadget
(2, 1, 3, '2024-01-03'), -- Bob buys 3 Widgets
(2, 3, 1, '2024-01-03'), -- Bob buys 1 Doohickey
(3, 1, 1, '2024-01-04'), -- Charlie buys 1 Widget
(3, 2, 2, '2024-01-04'), -- Charlie buys 2 Gadgets
(4, 2, 1, '2024-01-05'); -- Dana buys 1 Gadget
SQL
}
exec {
sql = <<-SQL
SELECT
u.name,
ROUND(SUM(o.quantity * p.price), 2) AS total_spent
FROM orders o
JOIN users u ON o.user_id = u.id
JOIN products p ON o.product_id = p.id
GROUP BY u.id
ORDER BY total_spent DESC, u.name
LIMIT 3;
SQL
output = <<-OUT
Charlie,60
Alice,45
Bob,45
OUT
}
}

Note that because the test seeds data into the database, it cannot run in parallel with other test cases. Therefore, you should not set the parallel attribute in your test case.

What to Test

While Atlas schema testing provides the flexibility to integrate with external programs for certain validation tasks, its core strength lies in directly testing and asserting the behavior of database schema elements and related logic.

Here are key areas you can effectively test using Atlas's built-in commands:

  • Constraints: Ensure primary keys, foreign keys, unique constraints, check constraints, and domain types are defined correctly (see the sketch after this list).
  • View Queries: Ensure that views return the expected data.
  • Generated Columns: Validate the logic of generated columns to confirm they compute the correct values based on their dependencies.
  • Data Integrity: Test constraints and triggers that maintain data integrity, ensuring that invalid data cannot be inserted or updated.
  • Stored Procedures and Functions: Test the logic of stored procedures and functions, including their input parameters and return values.
  • Event Triggers: Validate the behavior of event triggers that listen to DDL events and ensure they perform the expected actions.
  • SQL Queries: Validate the correctness of your application's SQL queries by seeding data and comparing the query results against expected outputs. See the Testing Queries section for more details.
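
As a sketch of the first point, the following test exercises the orders_quantity_check constraint from the schema defined earlier; the seed values are illustrative:

schema.test.hcl
test "schema" "orders_quantity_check" {
  exec {
    sql = "INSERT INTO users (id, name, email) VALUES (1, 'Alice', 'alice@example.com')"
  }
  exec {
    sql = "INSERT INTO products (id, name, price) VALUES (1, 'Widget', 10.00)"
  }
  # A non-positive quantity violates the CHECK constraint and must be rejected.
  catch {
    sql   = "INSERT INTO orders (user_id, product_id, quantity) VALUES (1, 1, 0)"
    error = "orders_quantity_check"
  }
  # A positive quantity is accepted.
  exec {
    sql = "INSERT INTO orders (user_id, product_id, quantity) VALUES (1, 1, 2)"
  }
}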