
How To Use Liquibase with Spring Boot

Sep 2, 2025
Catherine Edelveis

Liquibase is a database migration tool that allows you to track, apply, and roll back changes to the database schema using declarative changelog files. It integrates with version control systems, ensures automatic and reproducible deployment of migrations across different environments, makes the change history transparent, and works consistently with various databases.

This article will guide you through setting up and using Liquibase with a Spring Boot application and Maven.

You will learn how to:

  • Set up Liquibase for PostgreSQL,
  • Write migration scripts in YAML and SQL,
  • Use preconditions,
  • Perform rollbacks,
  • Use Liquibase diff to compare two versions of a database and create migration scripts automatically,
  • Run database migrations in CI using GitHub Actions.

The code from this article is available on GitHub.

Liquibase Structure and Commands

Liquibase enables you to evolve database schemas using database-agnostic formats: YAML, XML, and JSON. These formats are helpful if you need to manage one schema across several databases. However, you can also write scripts in SQL if you prefer.

To work with Liquibase, you need to familiarize yourself with the following key concepts.

Changelogs

Changelog files are the main building blocks of database migrations. They sequentially describe the history of the database changes. You can write a changelog in any supported format, and Liquibase will detect the format by the file extension.

It is a good practice to maintain a main changelog that includes links to other changelogs with the actual schema changes. 

Changesets

Changesets are the atomic units of work included in a changelog. For easier rollback, it is a good practice to specify one schema change per changeset. Changesets are applied once and tracked by Liquibase.

Change types

Change types are database-independent operations specified in a changeset. They correspond to SQL statements applied to the database. Some examples are createTable, createIndex, addColumn, addForeignKeyConstraint, etc.

Changelog tags

Changelog tags can be described as execution controls: they specify when and where a changeset should run. The three types of changelog tags are:

  • Preconditions perform checks against the current database state, for instance, “table X must exist.” If the check fails, Liquibase can halt, skip the changeset, or mark it as ran, as you instruct.
  • Context tags are boolean flags you attach to changesets, e.g., dev, staging. You pass a --context-filter at runtime, and only matching changesets execute.
  • Labels are similar to context tags but are attached to changesets as labels and selected at runtime with the --label-filter option.
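As a sketch, contexts and labels are attached directly to a changeset and matched at runtime (the changeset content below is illustrative):

```yaml
databaseChangeLog:
  - changeSet:
      id: 010
      author: cyberjar
      context: staging      # runs only when the context filter matches
      labels: seed-data     # can also be selected with a label filter
      changes:
        - insert:
            tableName: cyberware
            columns:
              - column: { name: name, value: "Test Implant" }
```

With the Maven plugin, something like `mvn liquibase:update -Dliquibase.contexts=staging` would then execute only the matching changesets.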

Liquibase Commands

Liquibase supports ~40 commands that can be classified into six categories:

  • Init to bootstrap Liquibase to a project.
  • Update to apply any pending changes to the target database.
  • Rollback to undo changes by count, tag, or date.
  • Inspection to compare sources and generate diffs.
  • Change tracking to get the deployment status of changes.
  • Utility to manage schema documents and tasks.
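For orientation, here are a few representative CLI invocations from those categories (connection flags omitted; command names as in the Liquibase documentation):

```shell
liquibase init project       # bootstrap Liquibase in the current directory
liquibase update             # apply pending changesets
liquibase rollback-count 1   # undo the last changeset
liquibase diff               # compare two database states
liquibase status             # list changesets that have not been deployed
liquibase db-doc docs/       # generate changelog documentation
```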

Integrate Liquibase into a Spring Boot Application

Prerequisites:

  • Java 17 or higher. As we are working with Spring Boot, I use Liberica JDK 24 recommended by Spring.
  • Local PostgreSQL Server or Docker Compose if you want to spin up a database in a container. This demo uses the second option.
  • Your favorite IDE.

I’m going to use a cyberpunk-themed demo project called Neurowatch. You can clone the project from GitHub to follow along, use your own app, or create a basic Spring Boot app. 

If you use your own project, add the following dependency on Liquibase:

<dependency>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-core</artifactId>
</dependency>

In the application.properties file, disable Hibernate DDL, enable Liquibase and specify the path to the main changelog file:

# Hibernate is disabled because Liquibase will manage schema
spring.jpa.hibernate.ddl-auto=none
spring.jpa.show-sql=true

# Liquibase
spring.liquibase.enabled=true
spring.liquibase.change-log=classpath:db/changelog/db.changelog-master.yaml

If you added Liquibase when creating the project, Spring created db/changelog directories for you under resources. Otherwise, you need to create them manually.

In the db/changelog directory, create a db.changelog-master.yaml file with the following content:

databaseChangeLog:
  - include:
      file: /src/main/resources/db/changelog/001-create-tables.yaml

The 001-create-tables.yaml will be our first changelog. Other changelogs will be added to the main changelog in a similar fashion.

We’re all set. It's time to write some migrations!

Write a Changelog File in YAML

The first changelog will describe the initial database schema.

The file starts with a top-level key pointing out that this file is a changelog:

databaseChangeLog:

After that, you add a changeSet entry and specify its id, author, any other required info, and preconditions if any:

databaseChangeLog:
  - changeSet:
      id: 001
      author: cyberjar
      preConditions:

Then, you begin the list of actual operations, a.k.a. the change types. In this case, we will have two change types, createTable and addForeignKeyConstraint. Everything indented under the line with the change type configures that operation:

databaseChangeLog:
  - changeSet:
      id: 001
      author: cyberjar
      changes:
        - createTable:

The whole file:

databaseChangeLog:
  - changeSet:
      id: 001
      author: cyberjar
      changes:
        - createTable:
            tableName: civilian
            columns:
              - column:
                  name: id
                  type: BIGINT
                  autoIncrement: true
                  constraints:
                    primaryKey: true
              - column:
                  name: legal_name
                  type: VARCHAR(250)
              - column:
                  name: national_id
                  type: VARCHAR(50)
              - column:
                  name: birth_date
                  type: DATE
              - column:
                  name: criminal_record
                  type: BOOLEAN
              - column:
                  name: under_surveillance
                  type: BOOLEAN

        - createTable:
            tableName: cyberware
            columns:
              - column:
                  name: id
                  type: BIGINT
                  autoIncrement: true
                  constraints:
                    primaryKey: true
              - column:
                  name: name
                  type: VARCHAR(200)
              - column:
                  name: type
                  type: VARCHAR(50)
              - column:
                  name: version
                  type: VARCHAR(50)

        - createTable:
            tableName: implant_session
            columns:
              - column:
                  name: id
                  type: BIGINT
                  autoIncrement: true
                  constraints:
                    primaryKey: true
              - column:
                  name: civilian_id
                  type: BIGINT
              - column:
                  name: cyberware_id
                  type: BIGINT
              - column:
                  name: installed_at
                  type: DATE
              - column:
                  name: installed_by
                  type: VARCHAR(255)
        - addForeignKeyConstraint:
            baseTableName: implant_session
            baseColumnNames: civilian_id
            referencedTableName: civilian
            referencedColumnNames: id
            constraintName: fk_civilian
        - addForeignKeyConstraint:
            baseTableName: implant_session
            baseColumnNames: cyberware_id
            referencedTableName: cyberware
            referencedColumnNames: id
            constraintName: fk_cyberware

Let’s create another changelog 002-insert-sample-data.yaml to populate the DB with test data. Don’t forget to reference this file in the main changelog:

databaseChangeLog:
  - include:
      file: /src/main/resources/db/changelog/001-create-tables.yaml
  - include:
      file: /src/main/resources/db/changelog/002-insert-sample-data.yaml

In this case, we will use the change type “insert.” The whole file:

databaseChangeLog:
  - changeSet:
      id: 002
      author: cyberjar
      changes:
        - insert:
            tableName: civilian
            columns:
              - column: { name: legal_name, value: "Aelita Fang" }
              - column: { name: national_id, value: "IZ-0965437-BM" }
              - column: { name: birth_date, value: "1999-08-30" }
              - column: { name: criminal_record, value: "FALSE" }
              - column: { name: under_surveillance, value: "FALSE" }

        - insert:
            tableName: cyberware
            columns:
              - column: { name: name, value: "OptiSight X3" }
              - column: { name: type, value: "ocular" }
              - column: { name: version, value: "SZ 3000" }

        - insert:
            tableName: implant_session
            columns:
              - column: { name: civilian_id, valueNumeric: 1 }
              - column: { name: cyberware_id, valueNumeric: 1 }
              - column: { name: installed_at, valueDate: "2025-06-15" }
              - column: { name: installed_by, value: "MP-129854" }

Now, run the application. You should see in the console that Liquibase has successfully run two changesets. If you open your favorite database tool and connect to the database, you should see that the tables were created and populated.

DB schema

You will also see two additional tables that Liquibase creates automatically upon the first run: databasechangelog and databasechangeloglock.
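Since databasechangelog is an ordinary table, you can inspect the migration history with a plain query (column names as created by Liquibase):

```sql
-- Each row is one applied changeset, in execution order
SELECT id, author, filename, dateexecuted, exectype
FROM databasechangelog
ORDER BY orderexecuted;
```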

Write a Changelog File in SQL

If you work with one database, you don’t have to use database-agnostic formats and can write changelogs in SQL. Such a file differs little from the SQL scripts you are used to writing, apart from the formatted comments. The first non-empty line should be

--liquibase formatted sql

This header tells Liquibase to parse the file as a “formatted SQL” changelog rather than raw SQL, so it recognizes special comment directives like --changeset author:id, --rollback, --preconditions, --context, --labels, etc.

As for the changes, they are described in the Postgres-specific SQL, so nothing new here: 

--liquibase formatted sql
--changeset cyberjar:001

CREATE TABLE civilian (
  id BIGSERIAL PRIMARY KEY,
  legal_name VARCHAR(250),
  national_id VARCHAR(50),
  birth_date DATE,
  criminal_record BOOLEAN,
  under_surveillance BOOLEAN
);

CREATE TABLE cyberware (
  id BIGSERIAL PRIMARY KEY,
  name VARCHAR(200),
  type VARCHAR(50),
  version VARCHAR(50)
);

CREATE TABLE implant_session (
  id BIGSERIAL PRIMARY KEY,
  civilian_id BIGINT,
  cyberware_id BIGINT,
  installed_at DATE,
  installed_by VARCHAR(255)
);

ALTER TABLE implant_session
  ADD CONSTRAINT fk_civilian  FOREIGN KEY (civilian_id)  REFERENCES civilian(id);

ALTER TABLE implant_session
  ADD CONSTRAINT fk_cyberware FOREIGN KEY (cyberware_id) REFERENCES cyberware(id);
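The comment directives mentioned above also let you attach preconditions and rollback statements to a SQL changeset. A sketch using the formatted SQL syntax (the changeset itself is illustrative):

```sql
--liquibase formatted sql
--changeset cyberjar:002
--preconditions onFail:MARK_RAN
--precondition-sql-check expectedResult:0 SELECT COUNT(*) FROM cyberware WHERE name = 'OptiSight X3'
INSERT INTO cyberware (name, type, version) VALUES ('OptiSight X3', 'ocular', 'SZ 3000');
--rollback DELETE FROM cyberware WHERE name = 'OptiSight X3';
```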

Use Preconditions in Liquibase Changelogs

What are Preconditions?

Preconditions are special tags that enable you to specify requirements for updating the database. For instance, you can check that changes are applied to a specific database or by a particular user. Preconditions can be applied to the whole changelog or to individual changesets.

With XML, JSON, and YAML you can use all preconditions supported by Liquibase. The SQL-based changelogs support only some of them.

Preconditions Syntax

You can use one preConditions tag per changelog or changeset, but it may combine several requirements with the AND, OR, and NOT tags. If no logic tag is specified, AND is applied by default.

For instance, in the following changelog, we verify that the database type is Postgres and the user applying changes is dbadmin:

databaseChangeLog:
  - changeSet:
      id: 001
      author: cyberjar
      preConditions:
        - dbms:
            type: postgresql
        - runningAs:
            username: dbadmin
      changes:

In the following changelog, the database type should be PostgreSQL or MySQL:

databaseChangeLog:
  - changeSet:
      id: 001
      author: cyberjar
      preConditions:
        - or:
            - dbms:
                type: postgresql
            - dbms:
                type: mysql
      changes:

You can find the whole list of available preConditions in the documentation.

This is all very well, but how can we handle checks that don’t pass?

How to Handle Precondition Errors and Failures

A precondition check can go wrong in two ways:

  • Failures (handled by the onFail attribute) mean the check itself has failed. For instance, a column exists when it mustn’t.
  • Errors (handled by the onError attribute) are exceptions thrown during the execution of the check.

You can specify what to do in case of a failure or error by setting the corresponding attribute to one of the following values:

  • CONTINUE: Liquibase will not execute this changeset, but it will continue with the changelog and try to execute this changeset during the next run.
  • HALT: Liquibase stops the execution of the whole changelog.
  • MARK_RAN: Liquibase will not execute this changeset, but will mark it as executed.
  • WARN: Liquibase will continue executing the changeset/changelog, but will issue a warning.

Note that CONTINUE and MARK_RAN can only be applied to a changeset.

For instance, the changeset below includes a precondition that verifies the column doesn’t exist. If the check fails, this changeset will not be applied, but marked as run, meaning that Liquibase won’t attempt to execute it during the next update:

databaseChangeLog:
  - changeSet:
      id: 004
      author: cyberjar
      preConditions:
        - onFail: MARK_RAN
        - not:
            columnExists:
              tableName: cyberware
              columnName: type
      changes:
        - addColumn:
            tableName: cyberware
            columns:
              - column:
                  name: type
                  type: VARCHAR(50)

Perform Rollbacks with Liquibase

Suppose something went wrong and the changes you introduced to the database brought some issues. In this case, you can use the Liquibase rollback functionality supported in CLI and Maven.

A rollback returns the database to a specified point. There are three types of rollback:

  • A simple rollback performed with the rollback command reverts all changes applied after a specified tag.
  • A rollback to a point in time with the rollback-to-date command reverts changes applied after a specified date and time.
  • A rollback by count with the rollback-count command rolls back a specified number of changesets.
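With the Maven plugin, the three variants map to the rollback goal and its properties roughly like this (the tag and date values are illustrative):

```shell
# Roll back everything applied after the tag "v1.0"
mvn liquibase:rollback -Dliquibase.rollbackTag=v1.0

# Roll back changes applied after a given date and time
mvn liquibase:rollback -Dliquibase.rollbackDate=2025-06-01T00:00:00

# Roll back the last two changesets
mvn liquibase:rollback -Dliquibase.rollbackCount=2
```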

However, writing rollback scripts is difficult, and using rollbacks is associated with risks. Therefore, you should use this feature cautiously and always validate potential changes before they are applied. You can do that with the help of these commands:

  • update-testing-rollback tests the rollbacks by deploying pending changesets, rolling them back, and running the update again.
  • future-rollback-sql generates the SQL that would be used to perform the rollback, without executing it.
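Both safeguards are also available as Maven goals (goal names as in the liquibase-maven-plugin):

```shell
# Deploy pending changesets, roll them back, then re-apply them
mvn liquibase:updateTestingRollback

# Generate the rollback SQL for future changesets without running it
mvn liquibase:futureRollbackSQL
```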

Let’s see how we can perform a rollback using Maven. For that, we need to add the liquibase-maven-plugin and configure the database connection and the path to the main changelog in its configuration:

<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>4.27.0</version>
    <configuration>
        <changeLogFile>src/main/resources/db/changelog/db.changelog-master.yaml</changeLogFile>
        <driver>org.postgresql.Driver</driver>
        <url>jdbc:postgresql://localhost:5432/neurowatch</url>
        <defaultSchemaName />
        <username>neurowatch_user</username>
        <password>mypassword</password>
    </configuration>
</plugin>

Now, let’s create a new rollback changelog where we will add a new column to the civilian table. The rollback will drop this column:

databaseChangeLog:
  - changeSet:
      id: 005
      author: cyberjar
      preConditions:
        - dbms:
            type: postgresql
        - not:
            columnExists:
              tableName: civilian
              columnName: temp_notes
      changes:
        - addColumn:
            tableName: civilian
            columns:
              - column:
                  name: temp_notes
                  type: VARCHAR(255)
                  defaultValue: 'N/A'
      rollback:
        - dropColumn:
            columnName: temp_notes
            tableName: civilian

Run the application. The new column should have appeared in the civilian table. Now, let’s perform the rollback.

Run the liquibase:rollback Maven goal, specifying the number of changesets you want to roll back with rollbackCount:

mvn liquibase:rollback -Dliquibase.rollbackCount=1

Run the application again. In the console, you will see that the rollback was applied. Verify in your database tool that the column indeed disappeared.

Generate Changelogs with Liquibase diff

Comparing Two Database Versions with Liquibase diff

The Liquibase diff command allows you to compare two database schema versions, DB to DB or DB to JPA via Hibernate. When you run this command, Liquibase analyzes one or two database states and automatically generates a changelog with the differences.

The diff command can be very useful in some situations but harmful in others, so it should not be used for everything. Let’s briefly discuss when it is a sagacious advisor and when it is a little drunk elf that desperately wants to help but has no idea what it is doing.

Benefits and Drawbacks of Diffs

The Liquibase diff command can be helpful when:

  • You want to integrate Liquibase into an existing project. In this case, the diff command compares the existing database with an empty one and generates changesets describing how to recreate the current state of the database.
  • You want to detect changes that weren’t documented. Although the best practice is to introduce all changes through migrations rather than directly to the database, this command can still be used for a quick sanity check.

On the other hand, there are some caveats with diffs that you must take into consideration:

  • Diffs show syntactic differences but not semantic ones. For example, if a column was renamed, the generated changelog will tell you to drop one column and add another, leading to data loss.
  • Diffs can’t track data changes reliably because they can’t distinguish expected differences between environments from genuine drift.

Therefore, this command should be used with caution and a complete understanding of what and why one is doing. Always validate the generated changelogs before applying them and adjust them manually if needed. You can also use additional tools, such as JPA Buddy, to validate the potential changes.

Use the Liquibase Hibernate Plugin with Maven to Generate Diffs

Suppose you reworked your JPA entities a bit and want to generate a changelog describing all the changes you have made.

Let’s first add the dependency to Liquibase Hibernate that will enable you to generate the changelog file comparing the current database schema and JPA entities:

<dependency>
    <groupId>org.liquibase.ext</groupId>
    <artifactId>liquibase-hibernate6</artifactId>
    <version>4.27.0</version>
</dependency>

Next, we need to add some additional configurations to the Liquibase Maven plugin:

  • diffChangeLogFile points to the changelog file that will be generated as a result of the database comparison. The file name contains a timestamp for easier version tracking.
  • referenceUrl points to the package in the application where our JPA entities reside. The dialect parameter is required because the referenceUrl performs the package scanning.
  • The Hibernate physical and implicit naming strategies help avoid API incompatibilities.

<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>4.27.0</version>
    <configuration>
        <changeLogFile>src/main/resources/db/changelog/db.changelog-master.yaml</changeLogFile>
        <diffChangeLogFile>src/main/resources/db/changelog/${maven.build.timestamp}_changelog.yaml</diffChangeLogFile>
        <driver>org.postgresql.Driver</driver>
        <url>jdbc:postgresql://localhost:5432/neurowatch</url>
        <defaultSchemaName />
        <username>neurowatch_user</username>
        <password>mypassword</password>
        <referenceUrl>hibernate:spring:dev.cyberjar?dialect=org.hibernate.dialect.PostgreSQLDialect
            &amp;hibernate.physical_naming_strategy=org.hibernate.boot.model.naming.CamelCaseToUnderscoresNamingStrategy
            &amp;hibernate.implicit_naming_strategy=org.springframework.boot.orm.jpa.hibernate.SpringImplicitNamingStrategy
        </referenceUrl>
    </configuration>
</plugin>

Finally, we can run the liquibase:diff goal to generate a changelog. Note that it uses compiled classes from the target directory for comparison, so we need to rebuild the project:

mvn clean install liquibase:diff

As a result, the changelog will be generated in the db/changelog directory.

Run Database Migrations in CI/CD with Liquibase and GitHub Actions

Let’s build a GitHub Actions workflow that validates and applies migrations, and discuss it step by step.

The workflow YAML is configured to run whenever changes are pushed to the main branch of the application. It defines one job, “Verify and Migrate DB with Liquibase”:

name: Liquibase CI Pipeline

on:
  workflow_dispatch:
  push:
    branches:
      - main

jobs:
  liquibase-verify:
    name: Verify and Migrate DB with Liquibase

Before conducting any steps within the job, the workflow declares the services it needs to spin up, in this case, a PostgreSQL service. The database user and password are specified explicitly in the example, but in a real setup, you should store credentials as GitHub Secrets and reference them in the password fields. The service also exposes the required ports and includes health-check options, so the job continues only after the database reports itself as ready.

    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_DB: neurowatch
          POSTGRES_USER: neurowatch_user
          POSTGRES_PASSWORD: mypassword
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
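For instance, with a repository secret named DB_PASSWORD (the secret name here is an assumption), the hard-coded value above would become:

```yaml
        env:
          POSTGRES_DB: neurowatch
          POSTGRES_USER: neurowatch_user
          # The password comes from a GitHub Secret instead of being hard-coded
          POSTGRES_PASSWORD: ${{ secrets.DB_PASSWORD }}
```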

The steps begin by checking out the repository and running a resource processing phase with Maven as a precaution.

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Process Resources
        shell: bash
        run: |
          ./mvnw process-resources

The interesting part starts with validation. Here, the workflow uses the Liquibase GitHub Action to validate the change sets. It specifies the path to the main changelog file along with the URL, username, and password needed to reach the database. Note that the URL does not point to localhost. Instead, it targets the service name we brought up, postgres.

      - name: Validate changesets
        uses: liquibase-github-actions/[email protected]
        with:
          changelogFile: src/main/resources/db/changelog/db.changelog-master.yaml
          url: "jdbc:postgresql://postgres:5432/neurowatch"
          username: neurowatch_user
          password: mypassword
          headless: true
          logLevel: INFO

Next comes Preview Update SQL, a dry run of the SQL that would be executed. It uses the update Liquibase GitHub Action and repeats the same configuration details: the changelog file, the database URL, and the credentials.

      - name: Preview updateSQL (dry run)
        uses: liquibase-github-actions/[email protected]
        with:
          changelogFile: src/main/resources/db/changelog/db.changelog-master.yaml
          url: "jdbc:postgresql://postgres:5432/neurowatch"
          username: neurowatch_user
          password: mypassword
          headless: true
          logLevel: INFO

After that, the pipeline performs a Preview Rollback SQL step to verify that rollbacks behave correctly. Here, it uses a different Liquibase GitHub Action for rollbacks, the rollback count SQL in this example, but you can choose the rollback action that best fits your needs. The step includes the rollback count, i.e., how many changesets to roll back, and again passes the username, password, and other required parameters.

      - name: Preview rollbackSQL (dry run of last changeSet)
        uses: liquibase-github-actions/rollback-count-[email protected]
        with:
          changelogFile: src/main/resources/db/changelog/db.changelog-master.yaml
          count: 1
          url: "jdbc:postgresql://postgres:5432/neurowatch"
          username: neurowatch_user
          password: mypassword
          headless: true
          logLevel: INFO

If all those steps succeed, the pipeline applies the Liquibase migration for real using the update action, with the same set of parameters as before.

      - name: Apply Liquibase migration
        uses: liquibase-github-actions/[email protected]
        with:
          changelogFile: src/main/resources/db/changelog/db.changelog-master.yaml
          url: "jdbc:postgresql://postgres:5432/neurowatch"
          username: neurowatch_user
          password: mypassword
          headless: true
          logLevel: INFO

Once migrations are applied, the workflow runs the tests. If everything passes, the final stage deploys the application. In my demo, it is represented by a simple placeholder command, but in a real pipeline, you would replace it with your actual deployment process.

      - name: Run tests
        run: ./mvnw test

      - name: Deploy app
        if: success()
        run: echo "Deploy your app here..."

That’s it! The whole file can be found on GitHub.

Conclusion

In this article, we examined using Liquibase with Spring Boot and Maven for reliable database migrations both locally and in the CI/CD pipeline.

Don’t forget to subscribe to our newsletter for a monthly digest of our articles and videos on Java development and news in the Java world!

 
