Spring Boot R2DBC INSERT batching (Reactive SQL)

Batching is the act of gathering multiple statements together and executing them over a single database connection. Batching has performance benefits, since the database can better optimize the batched queries; it also avoids opening an individual connection per statement, which reduces connection starvation. Connection starvation in Postgres ends up spitting out errors along the lines of "too many clients".

Unfortunately, in the context of Spring Boot R2DBC, batching is not currently supported by high-level abstractions such as R2DBC repositories:


Instead, we have to take a lower-level approach and use the DatabaseClient directly.
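To get a taste of what "directly" means here, the sketch below uses `DatabaseClient.inConnectionMany` to reach the raw `io.r2dbc.spi.Connection` and its `Batch` API. This is a sketch under assumptions: the table `t` and its column are hypothetical, and `client` is an injected Spring `DatabaseClient`.

```kotlin
import org.springframework.r2dbc.core.DatabaseClient
import reactor.core.publisher.Flux

// Sketch: inConnectionMany hands us the raw SPI Connection on a managed
// connection (acquired and released by DatabaseClient). Table "t" is a
// hypothetical example.
fun rawBatch(client: DatabaseClient): Flux<Long> =
    client.inConnectionMany { connection ->
        Flux.from(
            connection.createBatch()
                .add("INSERT INTO t(col) VALUES ('a')")
                .add("INSERT INTO t(col) VALUES ('b')")
                .execute()
        ).flatMap { result -> result.rowsUpdated } // rows affected per statement
    }
```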


Whether or not you use higher-level abstractions like R2DBC data repositories, the first step is to define a configuration bean which will act as your database connection factory. For example, using Postgres would result in something similar to:

@Configuration
class Database : AbstractR2dbcConfiguration() {

	override fun connectionFactory(): ConnectionFactory {
        return PostgresqlConnectionFactory(
            // host, database and credentials are placeholders
            PostgresqlConnectionConfiguration.builder()
                .host("localhost").database("mydb")
                .username("user").password("password")
                .build())
    }
}

Once that is done, you are free to use Spring Boot R2DBC connection primitives like templates or reactive repositories as described in:


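For illustration, here is a minimal sketch of that non-batched path (the `SomeObject` entity, its table, and the injected `R2dbcEntityTemplate` are assumptions): each call to `insert()` issues its own INSERT statement, which is exactly what makes batching attractive for bulk writes.

```kotlin
import org.springframework.data.relational.core.mapping.Table
import org.springframework.data.r2dbc.core.R2dbcEntityTemplate
import reactor.core.publisher.Mono

@Table("some_table")
data class SomeObject(val col1: String, val col2: String)

// Sketch: each insert() call runs its own INSERT statement --
// fine for single writes, but bulk writes are not batched.
class SomeObjectWriter(private val template: R2dbcEntityTemplate) {
    fun save(item: SomeObject): Mono<SomeObject> = template.insert(item)
}
```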
Actually batching

So far, so good, but as noted previously this won't support batching. If you were to have some Flux of objects you'd like to insert into the database, these insertions would not be batched. The GitHub issue linked above has a clue on how to do this:

However, the syntax is a bit old and doesn't offer much context on how to use the above in a flow made up of Flux and Mono.

The final approach which worked well resulted in the following code:

somethingThatReturnsAFlux()     // Flux<SomeObject>
            // error handling
            .onErrorResume { e ->
                // swallow upstream errors and insert nothing
                Flux.empty()
            }
            // collect flux to list... unfortunately this is necessary due to
            // the next step: the batch must be built from all items at once
            .collectList()
            .flatMapMany { items ->
                // databaseConnection is the connection factory built in the "pre-requisites"
                Mono.from(databaseConnection.create())
                    // create a batch
                    .map { connection -> connection.createBatch() }
                    // add operations in SQL to the batch
                    .map { batch ->
                        for (item in items) {
                            batch.add("INSERT INTO some_table(col1, col2) VALUES ('${item.col1}','${item.col2}')")
                        }
                        batch
                    }
                    // execute the batch
                    .flatMapMany { batch -> batch.execute() }
            }
            // optional: flow can be executed on separate thread pools
            .subscribeOn(Schedulers.boundedElastic())
            .subscribe()
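One caveat with the approach above: interpolating values straight into the SQL string is open to SQL injection. A hedged alternative sketch, using the R2DBC SPI's `Statement.add()` to batch bound parameter sets on a single prepared statement (`SomeObject` plus the table and column names are assumptions carried over from the example above):

```kotlin
import io.r2dbc.spi.Connection
import reactor.core.publisher.Flux

data class SomeObject(val col1: String, val col2: String)

// Sketch: one prepared statement with many bound parameter sets.
// The caller is responsible for obtaining and closing the connection.
fun insertAll(connection: Connection, items: List<SomeObject>): Flux<Long> {
    val statement = connection.createStatement(
        "INSERT INTO some_table(col1, col2) VALUES (\$1, \$2)"
    )
    items.forEachIndexed { i, item ->
        statement.bind(0, item.col1).bind(1, item.col2)
        // add() saves the current binding set and starts a new one;
        // it must not be called after the final set
        if (i < items.size - 1) statement.add()
    }
    return Flux.from(statement.execute())
        .flatMap { result -> result.rowsUpdated }
}
```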