By Tom Marthenal

2010-12-04 18:20:16 8 Comments

I want to insert multiple rows into a MySQL table at once using Java. The number of rows is dynamic. In the past I was doing...

for (String[] element : array) {
    myStatement.setString(1, element[0]);
    myStatement.setString(2, element[1]);
    myStatement.executeUpdate();
}


I'd like to optimize this to use the MySQL-supported syntax:

INSERT INTO table (col1, col2) VALUES ('val1', 'val2'), ('val1', 'val2')[, ...]

but with a PreparedStatement I don't know of any way to do this since I don't know beforehand how many elements array will contain. If it's not possible with a PreparedStatement, how else can I do it (and still escape the values in the array)?


@Ali Shakiba 2010-12-04 18:52:37

If you can create your SQL statement dynamically, you can use the following workaround:

String myArray[][] = { { "1-1", "1-2" }, { "2-1", "2-2" }, { "3-1", "3-2" } };

StringBuffer mySql = new StringBuffer("insert into MyTable (col1, col2) values (?, ?)");

for (int i = 0; i < myArray.length - 1; i++) {
    mySql.append(", (?, ?)");
}

myStatement = myConnection.prepareStatement(mySql.toString());

for (int i = 0; i < myArray.length; i++) {
    myStatement.setString(i, myArray[i][1]);
    myStatement.setString(i, myArray[i][2]);
}

@Ali Shakiba 2010-12-04 23:20:27

I believe the accepted answer is far better! I didn't know about batch updates, and when I started writing this answer, that answer had not been submitted yet! :)

@julian0zzx 2012-11-21 10:27:13

This approach is much faster than the accepted one. I tested it, but couldn't find out why. @JohnS do you know why?

@Ali Shakiba 2012-11-23 18:36:23

@julian0zzx no, but maybe because it is executed as a single SQL statement instead of multiple ones. But I'm not sure.

@MichalSv 2014-05-29 16:22:35

When the MySQL driver is used, you have to set the connection parameter rewriteBatchedStatements to true (jdbc:mysql://localhost:3306/TestDB?rewriteBatchedStatements=true).

With this parameter the statement is rewritten to a bulk insert, so the table is locked only once and the indexes are updated only once. It is therefore much faster.

Without this parameter the only advantage is cleaner source code.
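As a minimal sketch of enabling that flag (the helper name and base URL here are illustrative, not from the original comment), it is just a matter of appending the parameter to the JDBC URL before opening the connection:

```java
public class BatchUrlExample {
    // Appends rewriteBatchedStatements=true to a base JDBC URL,
    // using '?' or '&' depending on whether parameters already exist.
    static String withBatchRewrite(String baseUrl) {
        String sep = baseUrl.contains("?") ? "&" : "?";
        return baseUrl + sep + "rewriteBatchedStatements=true";
    }

    public static void main(String[] args) {
        // The resulting URL is then passed to DriverManager.getConnection(...).
        System.out.println(withBatchRewrite("jdbc:mysql://localhost:3306/TestDB"));
        // prints jdbc:mysql://localhost:3306/TestDB?rewriteBatchedStatements=true
    }
}
```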

@MichalSv 2014-05-29 16:27:09

This comment concerns the performance of this construction: statement.addBatch(); if ((i + 1) % 1000 == 0) { statement.executeBatch(); // Execute every 1000 items. }
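The flush condition in that construction can be isolated as a pure helper (a sketch; the class and method names are made up here) to show exactly when executeBatch() would fire:

```java
public class BatchFlushSketch {
    // True when the batch should be flushed: either a full group of
    // batchSize items has accumulated, or this is the last item.
    static boolean shouldFlush(int index, int total, int batchSize) {
        return (index + 1) % batchSize == 0 || (index + 1) == total;
    }

    public static void main(String[] args) {
        // With 2500 items and a batch size of 1000, flushes happen at
        // indices 999, 1999 and 2499 (the final partial batch).
        for (int i = 0; i < 2500; i++) {
            if (shouldFlush(i, 2500, 1000)) {
                System.out.println("flush at index " + i);
            }
        }
    }
}
```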

@Shailendra 2014-05-31 19:42:47

Apparently the MySQL driver has a bug here. This also causes issues for ORM frameworks like Hibernate.

@v.ladynev 2018-02-27 22:37:57

Yes, this is still correct, at least for MySQL Connector version 5.1.45.

@vincent mathew 2019-05-18 03:00:15

<artifactId>mysql-connector-java</artifactId> <version>8.0.14</version> Just checked; it is still correct as of 8.0.14. Without adding rewriteBatchedStatements=true there is no performance gain.

@vinay 2017-10-18 14:54:42

@Ali Shakiba your code needs some modification. Error part:

for (int i = 0; i < myArray.length; i++) {
     myStatement.setString(i, myArray[i][1]);
     myStatement.setString(i, myArray[i][2]);
}

Updated code:

String myArray[][] = {
    {"1-1", "1-2"},
    {"2-1", "2-2"},
    {"3-1", "3-2"}
};

StringBuffer mySql = new StringBuffer("insert into MyTable (col1, col2) values (?, ?)");

for (int i = 0; i < myArray.length - 1; i++) {
    mySql.append(", (?, ?)");
}

mySql.append(";"); // also add the terminator at the end of the sql statement
myStatement = myConnection.prepareStatement(mySql.toString());

for (int i = 0; i < myArray.length; i++) {
    myStatement.setString((2 * i) + 1, myArray[i][0]);
    myStatement.setString((2 * i) + 2, myArray[i][1]);
}


@Arun Shankar 2017-11-09 08:25:46

This is a much faster and better approach than the others here. This should be the accepted answer.

@Danny Bullis 2018-05-03 19:44:06

As mentioned in the accepted answer, some JDBC drivers / databases limit the number of rows you can include in an INSERT statement. In the case of the above example, if myArray has a greater length than that limit, you'll hit an exception. In my case I have a 1,000-row limit, which forces me to use batch execution, because I could potentially be updating more than 1,000 rows on any given run. This type of statement should theoretically work fine if you know you're inserting fewer than the maximum allowed. Something to keep in mind.

@Danny Bullis 2018-05-03 19:52:49

To clarify, the above answer mentions JDBC driver / database limitations on batch length, but there may also be limits on number of rows included in an insert statement, as I've seen in my case.
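One way to respect such a per-statement row limit is to split the rows into chunks and build one multi-row INSERT per chunk. The sketch below (the table, column names, and helper are illustrative, not from the answers above) only builds the placeholder SQL; each chunk's values would then be bound with setString as in the answer above:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedInsertSketch {
    // Builds one "INSERT ... VALUES (?, ?), (?, ?), ..." string per chunk of
    // at most maxRows rows, so a driver/database row limit is never exceeded.
    static List<String> buildChunkedSql(int totalRows, int maxRows, int colsPerRow) {
        List<String> statements = new ArrayList<>();
        for (int start = 0; start < totalRows; start += maxRows) {
            int rows = Math.min(maxRows, totalRows - start);
            StringBuilder sql = new StringBuilder("INSERT INTO MyTable (col1, col2) VALUES ");
            for (int r = 0; r < rows; r++) {
                if (r > 0) sql.append(", ");
                sql.append("(");
                for (int c = 0; c < colsPerRow; c++) {
                    sql.append(c > 0 ? ", ?" : "?");
                }
                sql.append(")");
            }
            statements.add(sql.toString());
        }
        return statements;
    }

    public static void main(String[] args) {
        // 2,500 rows with a 1,000-row limit -> 3 statements (1000 + 1000 + 500 rows).
        List<String> stmts = buildChunkedSql(2500, 1000, 2);
        System.out.println(stmts.size());
        // prints 3
    }
}
```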

@gladiator 2015-07-30 07:29:41

In case you have an auto-increment column in the table and need to access it, you can use the following approach. Do test before using it, because support for getGeneratedKeys() in Statement depends on the driver used. The code below was tested on MariaDB 10.0.12 with MariaDB JDBC driver 1.2.

Remember that increasing the batch size improves performance only to a certain extent; for my setup, increasing the batch size above 500 actually degraded performance.

public Connection getConnection(boolean autoCommit) throws SQLException {
    Connection conn = dataSource.getConnection();
    conn.setAutoCommit(autoCommit);
    return conn;
}

private void testBatchInsert(int count, int maxBatchSize) {
    String querySql = "insert into batch_test(keyword) values(?)";
    try {
        Connection connection = getConnection(false);
        PreparedStatement pstmt = null;
        ResultSet rs = null;
        boolean success = true;
        int[] executeResult = null;
        try {
            pstmt = connection.prepareStatement(querySql, Statement.RETURN_GENERATED_KEYS);
            for (int i = 0; i < count; i++) {
                pstmt.setString(1, UUID.randomUUID().toString());
                pstmt.addBatch();
                if ((i + 1) % maxBatchSize == 0 || (i + 1) == count) {
                    executeResult = pstmt.executeBatch();
                }
            }
            ResultSet ids = pstmt.getGeneratedKeys();
            for (int i = 0; i < executeResult.length; i++) {
                if (executeResult[i] == 1 && ids.next()) {
                    System.out.println("Execute Result: " + i + ", Update Count: " + executeResult[i]
                            + ", id: " + ids.getLong(1));
                }
            }
        } catch (Exception e) {
            success = false;
        } finally {
            if (rs != null) {
                rs.close();
            }
            if (pstmt != null) {
                pstmt.close();
            }
            if (connection != null) {
                if (success) {
                    connection.commit();
                } else {
                    connection.rollback();
                }
                connection.close();
            }
        }
    } catch (SQLException e) {
        e.printStackTrace();
    }
}

@BalusC 2010-12-04 18:30:44

You can create a batch by PreparedStatement#addBatch() and execute it by PreparedStatement#executeBatch().

Here's a kickoff example:

public void save(List<Entity> entities) throws SQLException {
    try (
        Connection connection = database.getConnection();
        PreparedStatement statement = connection.prepareStatement(SQL_INSERT);
    ) {
        int i = 0;

        for (Entity entity : entities) {
            statement.setString(1, entity.getSomeProperty());
            // ...

            statement.addBatch();
            i++;

            if (i % 1000 == 0 || i == entities.size()) {
                statement.executeBatch(); // Execute every 1000 items.
            }
        }
    }
}

It's executed every 1000 items because some JDBC drivers and/or DBs may have a limitation on batch length.


@Joshua Martell 2010-12-04 20:39:01

Your inserts will go faster if you put them in transactions, i.e. wrap them with connection.setAutoCommit(false); and connection.commit();.

@djechlin 2013-03-07 21:31:10

Looks like you can execute an empty batch if there are 999 items.

@Yohanes AI 2017-03-31 06:43:01

@electricalbah it will execute normally because i == entities.size()

@Danny Bullis 2018-05-03 19:54:16

Here's another good resource on putting batch jobs together using prepared statements.

@André 2018-06-04 10:11:36

In this case, what does SQL_INSERT look like? Something like this: INSERT INTO MyTable (value1, value2) VALUES (?,?)?

@BalusC 2018-06-04 10:13:54

@AndréPaulo: Just any SQL INSERT suitable for a prepared statement. Refer to the JDBC tutorial links for basic examples. This is not related to the concrete question.

@kapil das 2013-08-15 14:18:40

We can submit multiple updates together in JDBC as a batch update.

We can use Statement, PreparedStatement, and CallableStatement objects for batch updates, with auto-commit disabled.

The addBatch() and executeBatch() methods are available on all statement objects for batch updates.

The addBatch() method adds a set of statements or parameters to the current batch.
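As a sketch of the Statement flavour of batching (the class, table, and values below are made up for illustration): each literal SQL string is what would be passed to Statement.addBatch(sql). The pure helper here only builds those strings; note that with literal SQL you must escape the values yourself, which is why the PreparedStatement answers above are preferable.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class StatementBatchSketch {
    // Builds the literal INSERT statements that would each be passed to
    // Statement.addBatch(sql). Table and column names are illustrative.
    // NOTE: real values must be escaped; prefer PreparedStatement for that.
    static List<String> buildBatch(List<String[]> rows) {
        List<String> batch = new ArrayList<>();
        for (String[] row : rows) {
            batch.add("INSERT INTO MyTable (col1, col2) VALUES ('"
                    + row[0] + "', '" + row[1] + "')");
        }
        return batch;
    }

    public static void main(String[] args) {
        List<String[]> rows = Arrays.asList(new String[]{"a", "b"}, new String[]{"c", "d"});
        for (String sql : buildBatch(rows)) {
            // In real code: stmt.addBatch(sql); followed by stmt.executeBatch().
            System.out.println(sql);
        }
    }
}
```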
