Automate Database CI/CD with Azure DevOps
Atlas provides an Azure DevOps extension with the `AtlasAction` task for running actions on Azure Pipelines. We recommend setting the `githubConnection` input so the task can report results back to GitHub.
This guide walks you through setting up `AtlasAction` with Azure DevOps. In each example, `$(ATLAS_TOKEN)` is a secret variable holding the Atlas token used to authenticate with Atlas Cloud.
To access the outputs generated by the task, you need to name your steps in the pipeline.
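For example, the outputs of a named step can be read by later steps using the `$(<StepName>.<output>)` syntax, as shown in this sketch (the step name `ApplySchema` and the `schema apply` inputs are illustrative):

```yaml
steps:
  - task: AtlasAction@1
    name: ApplySchema # naming the step exposes its outputs
    inputs:
      action: 'schema apply'
      env: 'ci'
  # Outputs of a named step are referenced as $(<StepName>.<output>):
  - script: echo "schema apply error (if any): $(ApplySchema.error)"
    displayName: Print task output
```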
migrate apply
Applies a migration directory on a target database
Usage
Add azure-pipelines.yml
to your repo with the following contents:
Deploy a directory from the git repository
```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate apply' # Required
      url: $(DatabaseURL)
      dir: 'file://migrations'
```
Deploy a directory from the cloud
```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate apply' # Required
      url: $(DatabaseURL)
      dir: 'atlas://my-project'
```
Inputs
- `action` - (Required) Always `migrate apply`.
- `githubConnection` - (Optional) The connection to GitHub.
- `allow_dirty` - (Optional) Allow working on a non-clean database.
- `amount` - (Optional) The maximum number of migration files to apply. Default is all.
- `dir` - (Optional) The URL of the migration directory to apply. For example: `atlas://dir-name` for cloud-based directories or `file://migrations` for local ones.
- `dry_run` - (Optional) Print SQL without executing it. Either "true" or "false".
- `revisions_schema` - (Optional) The name of the schema containing the revisions table.
- `tx_mode` - (Optional) Transaction mode to use. Either "file", "all", or "none". Default is "file".
- `url` - (Optional) The URL of the target database. For example: `mysql://root:pass@localhost:3306/dev`.
- `working_directory` - (Optional) Atlas working directory. Default is the project root.
- `config` - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `env` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `vars` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
Outputs
The `AtlasAction` task will generate the following outputs for this action:
- `applied_count` - The number of migrations that were applied.
- `current` - The current version of the database (before applying migrations).
- `pending_count` - The number of migrations that will be applied.
- `target` - The target version of the database.
migrate autorebase
Automatically resolves `atlas.sum` conflicts and rebases the migration directory onto the target branch.
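The guide does not include a usage example for this action; the pipeline below is a sketch modeled on the other examples in this guide, with placeholder values (`master`, `file://migrations`) that you should adapt to your repository:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate autorebase' # Required
      base_branch: 'master'        # assumed default branch
      dir: 'file://migrations'
```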
Inputs
- `action` - (Required) Always `migrate autorebase`.
- `githubConnection` - (Optional) The connection to GitHub.
- `base_branch` - (Optional) The base branch to rebase the migration directory onto. Defaults to the default branch of the repository.
- `dir` - (Optional) The URL of the migration directory to rebase. By default: `file://migrations`.
- `remote` - (Optional) The remote to fetch from. Defaults to `origin`.
- `working_directory` - (Optional) Atlas working directory. Default is the project root.
migrate diff
Automatically generate versioned migrations whenever the schema is changed, and commit them to the migration directory.
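As a sketch, a pipeline for this action might look like the following, modeled on the other examples in this guide; the `to` URL (`file://schema.hcl`) and the MySQL dev-database are assumptions to adapt to your project:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate diff'  # Required
      dir: 'file://migrations'
      to: 'file://schema.hcl' # assumed desired-state URL
      dev_url: 'docker://mysql/8/dev'
```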
Inputs
- `action` - (Required) Always `migrate diff`.
- `githubConnection` - (Optional) The connection to GitHub.
- `dir` - (Optional) The URL of the migration directory. For example: `file://migrations`. Read more about Atlas URLs.
- `remote` - (Optional) The remote to push changes to. Defaults to `origin`.
- `to` - (Optional) The URL of the desired state.
- `working_directory` - (Optional) Atlas working directory. Default is the project root.
- `config` - (Optional) The path to the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `env` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `vars` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `dev_url` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
Outputs
The `AtlasAction` task will generate the following outputs for this action:
- `url` - The URL of the CI report in Atlas Cloud, containing an ERD visualization and analysis of the schema migrations.
migrate down
Reverts deployed migration files on a target database
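The guide does not include a usage example for this action; the pipeline below is a sketch modeled on the other examples in this guide, with placeholder values (`$(DatabaseURL)`, `atlas://my-project`, `amount: '1'`) to adapt:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate down' # Required
      url: $(DatabaseURL)
      dir: 'atlas://my-project'
      amount: '1' # revert only the most recently applied migration
```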
Inputs
- `action` - (Required) Always `migrate down`.
- `githubConnection` - (Optional) The connection to GitHub.
- `amount` - (Optional) The number of applied migrations to revert. Mutually exclusive with `to_tag` and `to_version`.
- `dir` - (Optional) The URL of the migration directory to apply. For example: `atlas://dir-name` for cloud-based directories or `file://migrations` for local ones.
- `revisions_schema` - (Optional) The name of the schema containing the revisions table.
- `to_tag` - (Optional) The tag to revert to. Mutually exclusive with `amount` and `to_version`.
- `to_version` - (Optional) The version to revert to. Mutually exclusive with `amount` and `to_tag`.
- `url` - (Optional) The URL of the target database. For example: `mysql://root:pass@localhost:3306/dev`.
- `wait_interval` - (Optional) Time in seconds between different migrate down attempts.
- `wait_timeout` - (Optional) Time after which no other retry attempt is made and the action exits.
- `working_directory` - (Optional) Atlas working directory. Default is the project root.
- `config` - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `env` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `vars` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `dev_url` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
Outputs
The `AtlasAction` task will generate the following outputs for this action:
- `current` - The current version of the database (before applying migrations).
- `planned_count` - The number of migrations that will be applied.
- `reverted_count` - The number of migrations that were reverted.
- `target` - The target version of the database.
- `url` - If given, the URL for reviewing the revert plan.
migrate lint
CI for database schema changes with Atlas
Usage
Add azure-pipelines.yml
to your repo with the following contents:
MySQL:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate lint' # Required
      dir_name: 'my-project'
      env: 'ci'
      dev_url: 'docker://mysql/8/dev'
      githubConnection: '<Connection to your GitHub>'
```

Postgres:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate lint' # Required
      dir_name: 'my-project'
      env: 'ci'
      dev_url: 'docker://postgres/15/dev?search_path=public'
      githubConnection: '<Connection to your GitHub>'
```

MariaDB:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate lint' # Required
      dir_name: 'my-project'
      env: 'ci'
      dev_url: 'docker://maria/latest/schema'
      githubConnection: '<Connection to your GitHub>'
```

SQL Server:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate lint' # Required
      dir_name: 'my-project'
      env: 'ci'
      dev_url: 'docker://sqlserver/2022-latest?mode=schema'
      githubConnection: '<Connection to your GitHub>'
```

ClickHouse:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate lint' # Required
      dir_name: 'my-project'
      env: 'ci'
      dev_url: 'docker://clickhouse/23.11/dev'
      githubConnection: '<Connection to your GitHub>'
```

SQLite:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate lint' # Required
      dir_name: 'my-project'
      env: 'ci'
      dev_url: 'sqlite://db?mode=memory'
      githubConnection: '<Connection to your GitHub>'
```
Inputs
- `action` - (Required) Always `migrate lint`.
- `githubConnection` - (Optional) The connection to GitHub.
- `dir` - (Optional) The URL of the migration directory to lint. For example: `file://migrations`. Read more about Atlas URLs.
- `dir_name` - The name (slug) of the project in Atlas Cloud.
- `tag` - (Optional) The tag of migrations to use as the base for linting. By default, the `latest` tag is used.
- `working_directory` - (Optional) Atlas working directory. Default is the project root.
- `config` - (Optional) The path to the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `env` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `vars` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `dev_url` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
Outputs
The `AtlasAction` task will generate the following outputs for this action:
- `url` - The URL of the CI report in Atlas Cloud, containing an ERD visualization and analysis of the schema migrations.
migrate push
Push the current version of your migration directory to Atlas Cloud.
Usage
Add azure-pipelines.yml
to your repo with the following contents:
MySQL:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate push' # Required
      dir_name: 'my-project'
      env: 'ci'
      dev_url: 'docker://mysql/8/dev'
      githubConnection: '<Connection to your GitHub>'
```

Postgres:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate push' # Required
      dir_name: 'my-project'
      env: 'ci'
      dev_url: 'docker://postgres/15/dev?search_path=public'
      githubConnection: '<Connection to your GitHub>'
```

MariaDB:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate push' # Required
      dir_name: 'my-project'
      env: 'ci'
      dev_url: 'docker://maria/latest/schema'
      githubConnection: '<Connection to your GitHub>'
```

SQL Server:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate push' # Required
      dir_name: 'my-project'
      env: 'ci'
      dev_url: 'docker://sqlserver/2022-latest?mode=schema'
      githubConnection: '<Connection to your GitHub>'
```

ClickHouse:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate push' # Required
      dir_name: 'my-project'
      env: 'ci'
      dev_url: 'docker://clickhouse/23.11/dev'
      githubConnection: '<Connection to your GitHub>'
```

SQLite:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate push' # Required
      dir_name: 'my-project'
      env: 'ci'
      dev_url: 'sqlite://db?mode=memory'
      githubConnection: '<Connection to your GitHub>'
```
Inputs
- `action` - (Required) Always `migrate push`.
- `githubConnection` - (Optional) The connection to GitHub.
- `dir` - (Optional) The URL of the migration directory to push. For example: `file://migrations`. Read more about Atlas URLs.
- `dir_name` - (Optional) The name (slug) of the project in Atlas Cloud.
- `latest` - (Optional) If true, also push to the "latest" tag.
- `tag` - (Optional) The tag to apply to the pushed migration directory. By default, the current git commit hash is used.
- `working_directory` - (Optional) Atlas working directory. Default is the project root.
- `config` - (Optional) The path to the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `env` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `vars` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `dev_url` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
migrate test
CI for database schema changes with Atlas
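The guide does not include a usage example for this action; the pipeline below is a sketch modeled on the other examples, with placeholder values (the local directory, the MySQL dev-database, and the `run` filter) to adapt:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'migrate test' # Required
      dir: 'file://migrations'
      dev_url: 'docker://mysql/8/dev'
      run: '^test_.*' # optional: only run tests starting with test_
```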
Inputs
- `action` - (Required) Always `migrate test`.
- `githubConnection` - (Optional) The connection to GitHub.
- `dir` - (Optional) The URL of the migration directory to apply. For example: `atlas://dir-name` for cloud-based directories or `file://migrations` for local ones.
- `paths` - (Optional) List of directories containing test files.
- `revisions_schema` - (Optional) The name of the schema containing the revisions table.
- `run` - (Optional) Filter tests to run by regexp. For example, `^test_.*` will only run tests that start with `test_`. Default is to run all tests.
- `working_directory` - (Optional) Atlas working directory. Default is the project root.
- `config` - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `env` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `vars` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `dev_url` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
monitor schema
Sync the database schema to Atlas Cloud.
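The guide does not include a usage example for this action; as a sketch modeled on the other examples, a sync pipeline might look like the following. The secret name `$(ATLAS_CLOUD_TOKEN)` and the `slug` value are assumptions; define your own secret and identifier:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - task: AtlasAction@1
    inputs:
      action: 'monitor schema' # Required
      url: $(DatabaseURL)
      cloud_token: $(ATLAS_CLOUD_TOKEN) # assumed secret; pass as a secret variable
      slug: 'my-database'               # assumed server identifier
```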
Inputs
- `action` - (Required) Always `monitor schema`.
- `githubConnection` - (Optional) The connection to GitHub.
- `cloud_token` - The token used to connect to Atlas Cloud (should be passed as a secret).
- `exclude` - (Optional) List of patterns to exclude from inspection. See: https://atlasgo.io/declarative/inspect#exclude-schemas
- `schemas` - (Optional) List of database schemas to include (by default, all schemas are included). See: https://atlasgo.io/declarative/inspect#inspect-multiple-schemas
- `slug` - (Optional) A unique identifier for the database server.
- `url` - (Optional) URL of the database to sync (mutually exclusive with `config` and `env`).
- `config` - (Optional) The URL of the Atlas configuration file (mutually exclusive with `url`). For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `env` - (Optional) The environment to use from the Atlas configuration file. For example, `dev` (mutually exclusive with `url`).
Outputs
The `AtlasAction` task will generate the following outputs for this action:
- `url` - URL of the schema of the database inside Atlas Cloud.
schema apply
Applies schema changes to a target database
Usage
Add azure-pipelines.yml
to your repo with the following contents:
```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    name: ApplySchema
    inputs:
      action: 'schema apply' # Required
      env: 'ci'
```
Inputs
- `action` - (Required) Always `schema apply`.
- `githubConnection` - (Optional) The connection to GitHub.
- `auto_approve` - (Optional) Automatically approve and apply changes. Either "true" or "false".
- `dry_run` - (Optional) Print SQL without executing it. Either "true" or "false".
- `exclude` - (Optional) List of glob patterns used to filter resources from applying. See: https://atlasgo.io/declarative/inspect#exclude-schemas
- `include` - (Optional) List of glob patterns used to select which resources to keep in inspection. See: https://atlasgo.io/declarative/inspect#include-schemas
- `lint_review` - (Optional) Automatically generate an approval plan before applying changes. Options are "ALWAYS", "ERROR" or "WARNING". Use "ALWAYS" to generate a plan for every apply, or "WARNING" and "ERROR" to generate a plan only based on the review policy.
- `plan` - (Optional) The plan to apply. For example, `atlas://<schema>/plans/<id>`.
- `schema` - (Optional) List of database schema(s). For example: `public`.
- `to` - (Optional) URL(s) of the desired schema state.
- `tx_mode` - (Optional) Transaction mode to use. Either "file", "all", or "none". Default is "file".
- `url` - (Optional) The URL of the target database to apply changes to. For example: `mysql://root:pass@localhost:3306/prod`.
- `wait_interval` - (Optional) Time in seconds between different apply attempts.
- `wait_timeout` - (Optional) Time after which no other retry attempt is made and the action exits.
- `working_directory` - (Optional) Atlas working directory. Default is the project root.
- `config` - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `env` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `vars` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `dev_url` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
Outputs
The `AtlasAction` task will generate the following outputs for this action:
- `error` - The error message if the action fails.
schema lint
Lint database schema with Atlas
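The guide does not include a usage example for this action; the pipeline below is a sketch modeled on the other examples, with placeholder values (`file://schema.hcl` and the MySQL dev-database) to adapt:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'schema lint' # Required
      url: 'file://schema.hcl' # assumed schema file
      dev_url: 'docker://mysql/8/dev'
```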
Inputs
- `action` - (Required) Always `schema lint`.
- `githubConnection` - (Optional) The connection to GitHub.
- `schema` - (Optional) The database schema(s) to include. For example: `public`.
- `url` - (Optional) Schema URL(s) to lint. For example: `file://schema.hcl`. Read more about Atlas URLs.
- `working_directory` - (Optional) Atlas working directory. Default is the project root.
- `config` - (Optional) The path to the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `env` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `vars` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `dev_url` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
schema plan
Plan a declarative migration to move from the current state to the desired state
Usage
Add azure-pipelines.yml
to your repo with the following contents:
MySQL:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'schema plan' # Required
      dev_url: 'docker://mysql/8/dev'
      env: 'ci'
```

Postgres:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'schema plan' # Required
      dev_url: 'docker://postgres/15/dev?search_path=public'
      env: 'ci'
```

MariaDB:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'schema plan' # Required
      dev_url: 'docker://maria/latest/schema'
      env: 'ci'
```

SQL Server:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'schema plan' # Required
      dev_url: 'docker://sqlserver/2022-latest?mode=schema'
      env: 'ci'
```

ClickHouse:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'schema plan' # Required
      dev_url: 'docker://clickhouse/23.11/dev'
      env: 'ci'
```

SQLite:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'schema plan' # Required
      dev_url: 'sqlite://db?mode=memory'
      env: 'ci'
```
Inputs
- `action` - (Required) Always `schema plan`.
- `githubConnection` - (Optional) The connection to GitHub.
- `exclude` - (Optional) List of glob patterns used to filter resources from applying. See: https://atlasgo.io/declarative/inspect#exclude-schemas
- `from` - (Optional) URL(s) of the current schema state.
- `include` - (Optional) List of glob patterns used to select which resources to keep in inspection. See: https://atlasgo.io/declarative/inspect#include-schemas
- `name` - (Optional) The name of the plan. By default, Atlas will generate a name based on the schema changes.
- `schema` - (Optional) List of database schema(s). For example: `public`.
- `schema_name` - (Optional) The name (slug) of the project in Atlas Cloud.
- `to` - (Optional) URL(s) of the desired schema state.
- `working_directory` - (Optional) Atlas working directory. Default is the project root.
- `config` - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `env` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `vars` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `dev_url` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
Outputs
The `AtlasAction` task will generate the following outputs for this action:
- `link` - Link to the schema plan on Atlas.
- `plan` - The plan to be applied or generated (e.g., `atlas://<schema>/plans/<id>`).
- `status` - The status of the plan. For example, `PENDING` or `APPROVED`.
schema plan approve
Approve a migration plan by its URL
Usage
Add azure-pipelines.yml
to your repo with the following contents:
```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'schema plan approve' # Required
      env: 'ci'
```
Inputs
- `action` - (Required) Always `schema plan approve`.
- `githubConnection` - (Optional) The connection to GitHub.
- `exclude` - (Optional) List of glob patterns used to filter resources from applying. See: https://atlasgo.io/declarative/inspect#exclude-schemas
- `from` - (Optional) URL(s) of the current schema state.
- `include` - (Optional) List of glob patterns used to select which resources to keep in inspection. See: https://atlasgo.io/declarative/inspect#include-schemas
- `plan` - (Optional) The URL of the plan to be approved. For example, `atlas://<schema>/plans/<id>`. If not provided, Atlas will search the registry for a plan corresponding to the given schema transition and approve it (typically, this plan is created during the PR stage). If multiple plans are found, an error will be thrown.
- `schema` - (Optional) List of database schema(s). For example: `public`.
- `schema_name` - (Optional) The name (slug) of the project in Atlas Cloud.
- `to` - (Optional) URL(s) of the desired schema state.
- `working_directory` - (Optional) Atlas working directory. Default is the project root.
- `config` - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `env` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `vars` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `dev_url` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
Outputs
The `AtlasAction` task will generate the following outputs for this action:
- `link` - Link to the schema plan on Atlas.
- `plan` - The plan to be applied or generated (e.g., `atlas://<schema>/plans/<id>`).
- `status` - The status of the plan (e.g., `PENDING`, `APPROVED`).
schema push
Push a schema version with an optional tag to Atlas
Usage
Add azure-pipelines.yml
to your repo with the following contents:
```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    name: PushSchema
    inputs:
      action: 'schema push' # Required
      env: 'ci'
      latest: true
  - script: echo "Pushed schema to $(PushSchema.link)"
    displayName: Print the URL of schema
```
Inputs
- `action` - (Required) Always `schema push`.
- `githubConnection` - (Optional) The connection to GitHub.
- `description` - (Optional) The description of the schema.
- `latest` - (Optional) If true, also push to the "latest" tag.
- `schema` - (Optional) List of database schema(s). For example: `public`.
- `schema_name` - (Optional) The name (slug) of the schema repository in Atlas Registry. Read more on the Atlas website: https://atlasgo.io/registry.
- `tag` - (Optional) The tag to apply to the pushed schema. By default, the current git commit hash is used.
- `url` - (Optional) Desired schema URL(s) to push. For example: `file://schema.lt.hcl`.
- `version` - (Optional) The version of the schema.
- `working_directory` - (Optional) Atlas working directory. Default is the project root.
- `config` - (Optional) The path to the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `env` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `vars` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `dev_url` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
Outputs
The `AtlasAction` task will generate the following outputs for this action:
- `link` - Link to the schema version on Atlas.
- `slug` - The slug of the pushed schema version.
- `url` - The URL of the pushed schema version.
schema test
Run schema tests against the desired schema
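The guide does not include a usage example for this action; the pipeline below is a sketch modeled on the other examples, with placeholder values (`file://schema.hcl` and the MySQL dev-database) to adapt:

```yaml
trigger:
  - master
pool:
  vmImage: ubuntu-latest
steps:
  - script: curl -sSf https://atlasgo.sh | sh
    displayName: Install Atlas
  - script: atlas login --token $(ATLAS_TOKEN)
    displayName: Atlas Login
  - task: AtlasAction@1
    inputs:
      action: 'schema test' # Required
      url: 'file://schema.hcl' # assumed schema file
      dev_url: 'docker://mysql/8/dev'
```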
Inputs
- `action` - (Required) Always `schema test`.
- `githubConnection` - (Optional) The connection to GitHub.
- `paths` - (Optional) List of directories containing test files.
- `run` - (Optional) Filter tests to run by regexp. For example, `^test_.*` will only run tests that start with `test_`. Default is to run all tests.
- `url` - (Optional) The desired schema URL(s) to test.
- `working_directory` - (Optional) Atlas working directory. Default is the project root.
- `config` - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `env` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `vars` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `dev_url` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.