1. Overview

When creating a persistence layer, we need to match our SQL database schema with the object model that we created in our code. This can be a lot of work to do manually.

In this tutorial, we’ll learn how to generate and export our database schema based on the entity models from our code.

First, we’ll cover the JPA configuration properties for schema generation. Then we’ll explore how to use these properties in Spring Data JPA.

Finally, we’ll discuss an alternative for DDL generation using Hibernate’s native API.

2. JPA Schema Generation

JPA 2.1 introduced a standard for database schema generation. Thus, starting with this release, we can control how to generate and export our database schema through a set of predefined configuration properties.

2.1. The Script Action

To control which DDL commands we’ll generate, JPA introduces the script action configuration option:

javax.persistence.schema-generation.scripts.action

We can choose from four different options:

  • none – doesn’t generate any DDL commands
  • create – generates only database create commands
  • drop – generates only database drop commands
  • drop-and-create – generates database drop commands followed by create commands

2.2. The Script Target

For each specified script action, we’ll need to define the corresponding target configuration:

javax.persistence.schema-generation.scripts.create-target
javax.persistence.schema-generation.scripts.drop-target

In essence, the script target defines the location of the file that contains the schema create or drop commands. So, for instance, if we choose drop-and-create as the script action, we’ll need to specify both targets.
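For example, if we pick drop-and-create, a matching configuration could look like this (the file names are just illustrative):

javax.persistence.schema-generation.scripts.action=drop-and-create
javax.persistence.schema-generation.scripts.create-target=create.sql
javax.persistence.schema-generation.scripts.drop-target=drop.sql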

2.3. The Schema Source

Finally, to generate the schema DDL commands from our entity models, we should include the schema source configurations with the metadata option selected:

javax.persistence.schema-generation.create-source=metadata
javax.persistence.schema-generation.drop-source=metadata
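
Outside of Spring, we can set these properties directly in persistence.xml. Here's a minimal sketch (the persistence unit name and file names are illustrative):

<persistence-unit name="my-pu">
    <properties>
        <property name="javax.persistence.schema-generation.scripts.action" value="drop-and-create"/>
        <property name="javax.persistence.schema-generation.scripts.create-target" value="create.sql"/>
        <property name="javax.persistence.schema-generation.scripts.drop-target" value="drop.sql"/>
        <property name="javax.persistence.schema-generation.create-source" value="metadata"/>
        <property name="javax.persistence.schema-generation.drop-source" value="metadata"/>
    </properties>
</persistence-unit>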

In the next section, we’ll see how we can use Spring Data JPA to automatically generate our database schema with the standard JPA properties.

3. Schema Generation With Spring Data JPA

3.1. The Models

Let’s imagine we’re implementing a user-account system with an entity called Account:

@Entity
@Table(name = "accounts")
public class Account {
    
    @Id
    @GeneratedValue
    private Long id;

    @Column(nullable = false, length = 100)
    private String name;

    @Column(name = "email_address")
    private String emailAddress;

    @OneToMany(mappedBy = "account", cascade = CascadeType.ALL)
    private List<AccountSetting> accountSettings = new ArrayList<>();

    // getters and setters
}

Each account can have multiple account settings, so here we’ll have a one-to-many mapping:

@Entity
@Table(name = "account_settings")
public class AccountSetting {

    @Id
    @GeneratedValue
    private Long id;

    @Column(name = "name", nullable = false)
    private String settingName;

    @Column(name = "value", nullable = false)
    private String settingValue;

    @ManyToOne
    @JoinColumn(name = "account_id", nullable = false)
    private Account account;

    // getters and setters
}

3.2. Spring Data JPA Configuration

To generate the database schema, we’ll need to pass the schema generation properties to the persistence provider in use. To do this, we’ll set the native JPA properties in our configuration file under the spring.jpa.properties prefix:

spring.jpa.properties.javax.persistence.schema-generation.scripts.action=create
spring.jpa.properties.javax.persistence.schema-generation.scripts.create-target=create.sql
spring.jpa.properties.javax.persistence.schema-generation.create-source=metadata

Then Spring Data JPA passes these properties to the persistence provider when it creates the EntityManagerFactory bean.
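
Similarly, if we want both drop and create scripts, we can switch the action to drop-and-create and add the drop target and drop source (the file names are illustrative):

spring.jpa.properties.javax.persistence.schema-generation.scripts.action=drop-and-create
spring.jpa.properties.javax.persistence.schema-generation.scripts.create-target=create.sql
spring.jpa.properties.javax.persistence.schema-generation.scripts.drop-target=drop.sql
spring.jpa.properties.javax.persistence.schema-generation.create-source=metadata
spring.jpa.properties.javax.persistence.schema-generation.drop-source=metadata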

3.3. The create.sql File

As a result, on application startup, the above configuration generates the database creation commands from the entity mapping metadata. Furthermore, the DDL commands are exported into the create.sql file, which is created in our main project folder:

create table account_settings (
    id bigint not null,
    name varchar(255) not null,
    value varchar(255) not null,
    account_id bigint not null,
    primary key (id)
)

create table accounts (
    id bigint not null,
    email_address varchar(255),
    name varchar(100) not null,
    primary key (id)
)

alter table account_settings
   add constraint FK54uo82jnot7ye32pyc8dcj2eh
   foreign key (account_id)
   references accounts (id)

4. Schema Generation With the Hibernate API

If we’re using Hibernate, we can use its native API, SchemaExport, to generate our schema DDL commands. Likewise, the Hibernate API uses our application entity models to generate and export the database schema.

With Hibernate’s SchemaExport, we can call the drop, createOnly, and create methods explicitly (create performs a drop followed by a create):

MetadataSources metadataSources = new MetadataSources(serviceRegistry);
metadataSources.addAnnotatedClass(Account.class);
metadataSources.addAnnotatedClass(AccountSetting.class);
Metadata metadata = metadataSources.buildMetadata();

SchemaExport schemaExport = new SchemaExport();
schemaExport.setFormat(true);
schemaExport.setOutputFile("create.sql");
schemaExport.createOnly(EnumSet.of(TargetType.SCRIPT), metadata);

When we run this code, our database creation commands are exported into the create.sql file in our main project folder.

SchemaExport is part of Hibernate’s native bootstrapping API.
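
If we also want a drop script, we can reuse the same Metadata instance and call the drop method with a separate output file (the file name here is illustrative):

schemaExport.setOutputFile("drop.sql");
schemaExport.drop(EnumSet.of(TargetType.SCRIPT), metadata);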

5. Schema Generation Options

Even though schema generation can save us time during development, we should only use it for basic scenarios.

For instance, we could use it to quickly spin up development or testing databases.

In contrast, for more complex scenarios like database migration, we should use more refined tooling, like Liquibase or Flyway.

6. Conclusion

In this article, we learned how to generate and export our database schema with the help of the JPA schema-generation properties. Then we discussed how to achieve the same result using Hibernate’s native API, SchemaExport.

As always, the example code for this article is available over on GitHub.