1. Introduction

In this tutorial, we’ll learn how to read JSON data from files and import it into MongoDB using Spring Boot. This can be useful for many reasons: restoring data, bulk inserting new data, or inserting default values. MongoDB represents its documents in a JSON-like binary format called BSON, so naturally, JSON is what we’ll use to store importable files. Being plain text, this strategy also has the advantage of being easily compressible.

Moreover, we’ll learn how to validate our input files against our custom types when necessary. Finally, we’ll expose an API so we can use it during runtime in our web app.

2. Dependencies

Let’s add these Spring Boot dependencies to our pom.xml:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-mongodb</artifactId>
</dependency>

We’re also going to need a running instance of MongoDB, which requires a properly configured application.properties file.
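For instance, assuming a local MongoDB instance on the default port, a minimal configuration might look like this (the database name is illustrative):

```properties
spring.data.mongodb.host=localhost
spring.data.mongodb.port=27017
spring.data.mongodb.database=test
```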

3. Importing JSON Strings

The simplest way to import JSON into MongoDB is to convert it into an “org.bson.Document” object first. This class represents a generic MongoDB document of no specific type. Therefore, we don’t have to worry about creating repositories for all the kinds of objects we might import.

Our strategy takes JSON (from a file, resource, or string), converts it into Documents, and saves them using MongoTemplate. Batch operations generally perform better than inserting each object individually since the number of round trips is reduced.

Most importantly, we’ll assume our input contains only one JSON object per line. That way, we can easily delimit our objects. We’ll encapsulate these functionalities into two classes that we’ll create: ImportUtils and ImportJsonService. Let’s start with our service class:

@Service
public class ImportJsonService {

    @Autowired
    private MongoTemplate mongo;
}

Next, let’s add a method that parses lines of JSON into documents:

private List<Document> generateMongoDocs(List<String> lines) {
    List<Document> docs = new ArrayList<>();
    for (String json : lines) {
        docs.add(Document.parse(json));
    }
    return docs;
}

Then we add a method that inserts a list of Document objects into the desired collection. It’s also possible for the batch operation to fail partially. In that case, we can return the number of inserted documents by checking the cause of the exception:

private int insertInto(String collection, List<Document> mongoDocs) {
    try {
        Collection<Document> inserts = mongo.insert(mongoDocs, collection);
        return inserts.size();
    } catch (DataIntegrityViolationException e) {
        if (e.getCause() instanceof MongoBulkWriteException) {
            return ((MongoBulkWriteException) e.getCause())
              .getWriteResult()
              .getInsertedCount();
        }
        return 0;
    }
}

Finally, let’s combine those methods. This one takes the input and returns a string showing how many lines were read vs. successfully inserted:

public String importTo(String collection, List<String> jsonLines) {
    List<Document> mongoDocs = generateMongoDocs(jsonLines);
    int inserts = insertInto(collection, mongoDocs);
    return inserts + "/" + jsonLines.size();
}

4. Use Cases

Now that we’re ready to process input, we can build some use cases. Let’s create the ImportUtils class to help us with that. This class will be responsible for converting input into lines of JSON. It will only contain static methods. Let’s start with the one for reading a simple String:

public static List<String> lines(String json) {
    String[] split = json.split("[\\r\\n]+");
    return Arrays.asList(split);
}
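To see the splitting in action, here’s a small self-contained sketch (the lines() helper is inlined for illustration):

```java
import java.util.Arrays;
import java.util.List;

public class LinesDemo {

    // same splitting logic as ImportUtils.lines(String)
    static List<String> lines(String json) {
        return Arrays.asList(json.split("[\\r\\n]+"));
    }

    public static void main(String[] args) {
        String input = "{\"name\":\"Book A\"}\r\n{\"name\":\"Book B\"}\n{\"name\":\"Book C\"}";
        List<String> result = lines(input);
        System.out.println(result.size());   // 3
        System.out.println(result.get(1));   // {"name":"Book B"}
    }
}
```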

Since we’re using line breaks as a delimiter, regex works great to break strings into multiple lines. This regex handles both Unix and Windows line endings. Next, a method to convert a File into a list of strings:

public static List<String> lines(File file) {
    try {
        return Files.readAllLines(file.toPath());
    } catch (IOException e) {
        // Files.readAllLines() throws a checked IOException, so we rethrow it unchecked
        throw new UncheckedIOException(e);
    }
}

Similarly, we finish up with a method to convert a classpath resource into a list. Note that ClassPathResource.getFile() only works when the application runs from an exploded directory, not from inside a packed jar:

public static List<String> linesFromResource(String resource) {
    try {
        Resource input = new ClassPathResource(resource);
        Path path = input.getFile().toPath();
        return Files.readAllLines(path);
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    }
}

4.1. Import File During Startup With a CLI

In our first use case, we’ll implement functionality for importing a file via application arguments. We’ll take advantage of the Spring Boot ApplicationRunner interface to do this at boot time. For instance, we can read command line parameters to define the file to import:

@SpringBootApplication
public class SpringBootJsonConvertFileApplication implements ApplicationRunner {

    private static final Logger log = LoggerFactory.getLogger(SpringBootJsonConvertFileApplication.class);
    private static final String RESOURCE_PREFIX = "classpath:";

    @Autowired
    private ImportJsonService importService;

    public static void main(String... args) {
        SpringApplication.run(SpringBootJsonConvertFileApplication.class, args);
    }

    @Override
    public void run(ApplicationArguments args) {
        if (args.containsOption("import")) {
            String collection = args.getOptionValues("collection")
              .get(0);

            List<String> sources = args.getOptionValues("import");
            for (String source : sources) {
                List<String> jsonLines;
                if (source.startsWith(RESOURCE_PREFIX)) {
                    String resource = source.substring(RESOURCE_PREFIX.length());
                    jsonLines = ImportUtils.linesFromResource(resource);
                } else {
                    jsonLines = ImportUtils.lines(new File(source));
                }

                String result = importService.importTo(collection, jsonLines);
                log.info("{} - result: {}", source, result);
            }
        }
    }
}

Using getOptionValues(), we can process one or more files. These files can come either from our classpath or from our file system; we differentiate them using the RESOURCE_PREFIX. Every argument starting with “classpath:” will be read from our resources folder instead of from the file system. After that, they will all be imported into the desired collection.

Let’s start using our application by creating a file under src/main/resources/data.json.log:

{"name":"Book A", "genre": "Comedy"}
{"name":"Book B", "genre": "Thriller"}
{"name":"Book C", "genre": "Drama"}

After building, we can use the following example to run it (line breaks added for readability). In our example, two files will be imported, one from the classpath, and one from the file system:

java -cp target/spring-boot-persistence-mongodb/WEB-INF/lib/*:target/spring-boot-persistence-mongodb/WEB-INF/classes \
  -Djdk.tls.client.protocols=TLSv1.2 \
  com.baeldung.SpringBootJsonConvertFileApplication \
  --import=classpath:data.json.log \
  --import=/tmp/data.json \
  --collection=books

4.2. JSON File From HTTP POST Upload

Additionally, if we create a REST Controller, we’ll have an endpoint to upload and import JSON files. For that, we’ll need a MultipartFile parameter:

@RestController
@RequestMapping("/import-json")
public class ImportJsonController {
    @Autowired
    private ImportJsonService service;

    @PostMapping("/file/{collection}")
    public String postJsonFile(@RequestPart("parts") MultipartFile jsonStringsFile,
      @PathVariable String collection) {
        List<String> jsonLines = ImportUtils.lines(jsonStringsFile);
        return service.importTo(collection, jsonLines);
    }
}
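Note that the controller relies on a lines(MultipartFile) overload that we haven’t added to ImportUtils yet. A minimal sketch, assuming the upload fits comfortably in memory:

```java
public static List<String> lines(MultipartFile file) {
    try {
        // read the uploaded bytes and reuse the String-based splitter
        return lines(new String(file.getBytes(), StandardCharsets.UTF_8));
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    }
}
```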

Now we can import files with a POST like this, where “/tmp/books.json” refers to an existing file:

curl -X POST http://localhost:8082/import-json/file/books -F "parts=@/tmp/books.json"

4.3. Mapping JSON to a Specific Java Type

We’ve been using only JSON, not bound to any type, which is one of the advantages of working with MongoDB. Now we want to validate our input. In this case, let’s add an ObjectMapper by making this change to our service:

private <T> List<Document> generateMongoDocs(List<String> lines, Class<T> type) {
    ObjectMapper mapper = new ObjectMapper();

    List<Document> docs = new ArrayList<>();
    for (String json : lines) {
        if (type != null) {
            try {
                // validate the JSON against the given type before converting it
                mapper.readValue(json, type);
            } catch (JsonProcessingException e) {
                throw new IllegalArgumentException(e);
            }
        }
        docs.add(Document.parse(json));
    }
    return docs;
}

That way, if the type parameter is specified, our mapper will try to parse our JSON string as that type. And, with the default configuration, it will throw an exception if any unknown properties are present. Here’s our simple bean definition for working with a MongoDB repository:

@Document("books")
public class Book {
    @Id
    private String id;
    private String name;
    private String genre;
    // getters and setters
}
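To illustrate the validation behavior, here’s a self-contained sketch with a minimal Book-like POJO. Jackson’s FAIL_ON_UNKNOWN_PROPERTIES feature is enabled by default, so a stray field rejects the document:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException;

public class ValidationDemo {

    // minimal stand-in for our Book bean
    public static class Book {
        public String id;
        public String name;
        public String genre;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        // valid input parses fine
        mapper.readValue("{\"name\":\"Book A\",\"genre\":\"Comedy\"}", Book.class);

        // an unknown property triggers an exception with the default configuration
        try {
            mapper.readValue("{\"name\":\"Book A\",\"price\":9.99}", Book.class);
        } catch (UnrecognizedPropertyException e) {
            System.out.println("rejected: " + e.getPropertyName()); // prints: rejected: price
        }
    }
}
```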

And now, to use the improved version of our Document generator, let’s change this method as well:

public String importTo(Class<?> type, List<String> jsonLines) {
    List<Document> mongoDocs = generateMongoDocs(jsonLines, type);
    String collection = type.getAnnotation(org.springframework.data.mongodb.core.mapping.Document.class)
      .value();
    int inserts = insertInto(collection, mongoDocs);
    return inserts + "/" + jsonLines.size();
}

Now, instead of passing the name of a collection, we pass a Class. We assume it has the Document annotation, as in our Book, so the collection name can be retrieved. However, since Spring Data’s Document annotation and Bson’s Document class share the same simple name, we have to reference the annotation by its fully qualified name.

5. Conclusion

In this article, we went through reading JSON input from files, resources, or plain strings, breaking it into lines, and importing it into MongoDB. We centralized this functionality in a service class and a utility class so we could reuse it anywhere. Our use cases included a CLI and a REST option, along with example commands showing how to use them.

And as always, the source code is available over on GitHub.