MongoDB insertMany or bulkWrite throws duplicate key error in the Java API


When I try to use the Java API to insert many documents into MongoDB 4.0 (a replica set), it throws duplicate key errors. The amount of data is not large, only about 300,000 documents, inserted within 3-5 seconds.

First I searched the official documentation and website; about _id (ObjectId) generation it says:

Runs up to 256^3 per process (16777216)
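
(For context: that figure is the 3-byte counter inside each ObjectId, i.e. roughly how many distinct _id values one process can generate within the same second before the counter wraps. A minimal sketch, using the driver's org.bson.types.ObjectId directly in a throwaway ObjectIdDemo class, just to show that ids minted in-process come out distinct:)

    import org.bson.types.ObjectId;

    public class ObjectIdDemo {
        public static void main(String[] args) {
            // Each ObjectId combines a timestamp, a machine/process part and a
            // 3-byte counter, so one process can mint 256^3 ids before the
            // counter wraps within the same second.
            ObjectId a = new ObjectId();
            ObjectId b = new ObjectId();
            System.out.println(a.toHexString());
            System.out.println(b.toHexString());
            System.out.println("distinct: " + !a.equals(b));
        }
    }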

The data source is RocketMQ. Here is my code:

    consumer.subscribe(GetPropertyUtils.getTestTopic(), "*", new MessageListener() {
        long insertNums = 0L;
        List<Document> documentList = new ArrayList<Document>();

        @Override
        public Action consume(Message message, ConsumeContext context) {
            insertNums++;
            consumerlogger.info("now total size is " + insertNums);

            String body = new String(message.getBody());
            Document document = Document.parse(body);
            documentList.add(document);
            // flush the buffer as one bulk insert once 1000 documents are collected
            if (documentList.size() >= 1000) {
                try {
                    MongoInsert.insertData(documentList);
                    Thread.sleep(1000);
                } catch (Exception e) {
                    consumerlogger.error("insert or sleep failed: " + e);
                }
                documentList.clear();
            }
            return Action.CommitMessage;
        }
    });

Then insertData writes the batch into MongoDB:

    public static void insertData(List<Document> documents) {
        try {
            MongoInsertlogger.info("prepare to insert");
            //collection.insertMany(documents, new InsertManyOptions().ordered(false));

            List<WriteModel<Document>> requests = new ArrayList<WriteModel<Document>>();
            for (Document doc : documents) {
                requests.add(new InsertOneModel<Document>(doc));
            }
            BulkWriteResult bulkWriteResult = collection.bulkWrite(requests, new BulkWriteOptions().ordered(false));
            System.out.println(bulkWriteResult.toString());
        } catch (Exception e) {
            MongoInsertlogger.error("insert failed, caused by " + e);
        }
    }
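
With ordered(false) the driver still applies every write it can and reports the individual failures through com.mongodb.MongoBulkWriteException. A minimal sketch of catching that exception explicitly, in a hypothetical insertDataVerbose variant that reuses the collection and MongoInsertlogger fields from the class above:

    import com.mongodb.MongoBulkWriteException;
    import com.mongodb.bulk.BulkWriteError;
    import com.mongodb.client.model.BulkWriteOptions;
    import com.mongodb.client.model.WriteModel;
    import org.bson.Document;
    import java.util.List;

    public static void insertDataVerbose(List<WriteModel<Document>> requests) {
        try {
            collection.bulkWrite(requests, new BulkWriteOptions().ordered(false));
        } catch (MongoBulkWriteException e) {
            // ordered(false) keeps applying the inserts that can succeed;
            // the failed ones are listed here instead of aborting the batch.
            for (BulkWriteError err : e.getWriteErrors()) {
                if (err.getCode() == 11000) {
                    MongoInsertlogger.warn("duplicate _id at batch index " + err.getIndex());
                } else {
                    MongoInsertlogger.error("bulk write error: " + err.getMessage());
                }
            }
        }
    }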

But the error output shows:

  BulkWriteError{index=811, code=11000, message='E11000 duplicate key error collection: yyj2.accpay index: _id_ dup key: { : ObjectId('5bea843604de38d61ff4d1fd') }', details={ }}, BulkWriteError{index=812, code=11000, message='E11000 duplicate key error collection: yyj2.accpay index: _id_ dup key: { : ObjectId('5bea843604de38d61ff4d1fe') }', details={ }}, BulkWriteError{index=813, code=11000, message='E11000 duplicate key error collection: yyj2.accpay index: _id_ dup key: { : ObjectId('5bea843604de38d61ff4d1ff') }', details={ }}, BulkWriteError{index=814, code=11000, message='E11000 duplicate key error collection: yyj2.accpay index: _id_ dup key: { : ObjectId('5bea843604de38d61ff4d200') }', details={ }}, BulkWriteError{index=815, code=11000, message='E11000 duplicate key error collection: ......

Why does this happen with so little data? The _id ObjectId is created by MongoDB itself, and the data volume is well below what it supports. I am using mongo-java-driver 3.7.1.
Thanks in advance!

There is 1 answer below:
You get this error when a document with the same _id already exists in the collection: the _id field acts as the primary key and is backed by a unique index, so inserting a second document with the same ObjectId fails with E11000.
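
If the same messages can end up being written more than once (for example on redelivery), one way to avoid E11000 is to make the write idempotent: replace-with-upsert on _id instead of inserting. A minimal sketch in a hypothetical upsertData method, assuming the same collection field as in the question, that each parsed document already carries its _id, and a driver version (3.7+) where ReplaceOptions is available (older drivers can use UpdateOptions().upsert(true) instead):

    import com.mongodb.client.model.BulkWriteOptions;
    import com.mongodb.client.model.Filters;
    import com.mongodb.client.model.ReplaceOneModel;
    import com.mongodb.client.model.ReplaceOptions;
    import com.mongodb.client.model.WriteModel;
    import org.bson.Document;
    import java.util.ArrayList;
    import java.util.List;

    public static void upsertData(List<Document> documents) {
        List<WriteModel<Document>> requests = new ArrayList<WriteModel<Document>>();
        for (Document doc : documents) {
            // Match on _id and upsert: re-processing the same message simply
            // overwrites the existing document instead of throwing E11000.
            requests.add(new ReplaceOneModel<Document>(
                    Filters.eq("_id", doc.get("_id")),
                    doc,
                    new ReplaceOptions().upsert(true)));
        }
        collection.bulkWrite(requests, new BulkWriteOptions().ordered(false));
    }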