How to bulk insert with a custom AsyncEventListener

Article ID: 294084
Products

VMware Tanzu GemFire

Issue/Introduction

This article provides two workarounds for an issue where a bulk insert does not fire a custom AsyncEventListener, so the inserted data is never asynchronously written to an external database (such as PostgreSQL).

Environment


Cause

A custom AsyncEventListener may fail to receive data loaded by a bulk insert (that is, a batch insert issued through a PreparedStatement) from a Java client application, so the data never reaches the external database. This behavior does not occur with single (non-batch) inserts, nor with the default, built-in DBSynchronizer.


The generic AsyncEventListener in SQLFire uses a slightly different invocation path, and only primary-key-based events are sent to the AsyncEventListener queue. A bulk insert generates an event of type BULK_INSERT, and a custom listener that merely implements the AsyncEventListener interface never receives BULK_INSERT events.

Resolution

This behavior will be addressed in a future release. Until then, there are two basic workarounds.

Workaround 1:

Use simple (non-batch) SQL inserts through a PreparedStatement instead of a bulk insert in your Java client application.

Example:

Statement.addBatch();
Statement.executeBatch();
-->
Statement.executeUpdate(insertSQL);
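As a fuller sketch of this pattern (the table and column names here are hypothetical, not from the original article), each row is inserted with executeUpdate() so that every insert arrives at the listener as an ordinary primary-key-based event rather than a single BULK_INSERT:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class SingleRowInserts {

    // Inserts each row individually with executeUpdate() instead of
    // accumulating rows with addBatch() and flushing with executeBatch().
    static void insertRows(Connection conn, Object[][] rows) throws SQLException {
        String insertSQL = "INSERT INTO APP.CUSTOMERS (ID, NAME) VALUES (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(insertSQL)) {
            for (Object[] row : rows) {
                ps.setInt(1, (Integer) row[0]);
                ps.setString(2, (String) row[1]);
                // One statement per row: each insert fires the
                // AsyncEventListener as a primary-key-based event.
                ps.executeUpdate();
            }
        }
    }
}
```

This trades some client-side throughput for correct listener delivery, since each row is a separate round trip instead of one batched call.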

Workaround 2:

Extend the built-in DBSynchronizer instead of implementing AsyncEventListener in the custom AsyncEventListener code.

Example:

public class customDBSynchronizer implements AsyncEventListener
-->
import com.vmware.sqlfire.callbacks.DBSynchronizer;
public class customDBSynchronizer extends DBSynchronizer
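A minimal sketch of this approach is shown below. It assumes the SQLFire callbacks API (com.vmware.sqlfire.callbacks.DBSynchronizer and Event) and the processEvents method signature; verify both against the SQLFire release you are running. Because DBSynchronizer already knows how to handle BULK_INSERT events, the subclass only adds its own logic and delegates the rest to the parent class:

```java
import java.util.List;

import com.vmware.sqlfire.callbacks.DBSynchronizer;
import com.vmware.sqlfire.callbacks.Event;

public class customDBSynchronizer extends DBSynchronizer {

    @Override
    public boolean processEvents(List<Event> events) {
        // Custom handling of the queued events could go here.
        // Delegating to DBSynchronizer preserves its built-in handling
        // of both primary-key-based and BULK_INSERT event types.
        return super.processEvents(events);
    }
}
```

Extending rather than implementing is the key point: the custom class inherits DBSynchronizer's bulk-insert handling instead of reimplementing it.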