I've got a task that runs every n seconds, scanning the file system for new files and storing them in a collection.
I'm not sure of the best way to ensure I don't end up with the same file twice in the database. At the moment I'm taking the file's absolute path and checking it against the other absolute paths in the database:
public Movie findByAbsolutePath(String absolutePath) {
    // Returns the first Movie whose absolutePath matches exactly, or null if none exists
    return getDatastore().find(Movie.class, "absolutePath", absolutePath).limit(1).get();
}
If this returns null, I go ahead and save.
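In case it helps, the save step is roughly this (the Movie constructor shown here is just illustrative):

// Rough sketch of the check-then-save flow; the Movie constructor is assumed
public void saveIfNew(String fileName, String absolutePath) {
    if (findByAbsolutePath(absolutePath) == null) {
        // No record with this path yet, so persist a new Movie
        getDatastore().save(new Movie(fileName, absolutePath));
    }
}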
Is checking each file against the database like this the best way to do it? I've tried using the following index annotation, but it doesn't seem to work and I still end up with duplicates:
@Indexes(@Index(fields = {@Field("fileName"), @Field("absolutePath")}, options = @IndexOptions(unique = true, dropDups = true)))
Assuming I can get it to work, is that faster than the way I'm currently checking for duplicates?
Your index isn't working because the uniqueness constraint applies to the combination of fileName and absolutePath, not to absolutePath on its own. You'll need a separate index with a unique constraint on absolutePath alone.

To do an upsert in Morphia, you'd pass an UpdateOptions instance to datastore.update(), as documented here. UpdateOptions has an upsert(boolean) method that you simply pass true to.
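For example, the mapping could look something like this (a sketch assuming Morphia 1.x package names; note that dropDups was removed in MongoDB 3.0, so any existing duplicates have to be cleaned up before the unique index can be built, and the indexes are only created once you call datastore.ensureIndexes()):

import org.bson.types.ObjectId;
import org.mongodb.morphia.annotations.Entity;
import org.mongodb.morphia.annotations.Field;
import org.mongodb.morphia.annotations.Id;
import org.mongodb.morphia.annotations.Index;
import org.mongodb.morphia.annotations.IndexOptions;
import org.mongodb.morphia.annotations.Indexes;

// Unique index on absolutePath by itself; the original fileName/absolutePath
// compound index can stay as a plain, non-unique index if it's useful for queries.
@Entity
@Indexes(@Index(fields = @Field("absolutePath"), options = @IndexOptions(unique = true)))
public class Movie {
    @Id
    private ObjectId id;
    private String fileName;
    private String absolutePath;
}

And a rough sketch of the upsert (Morphia 1.3+ API; upsertMovie is just an illustrative helper name):

import org.mongodb.morphia.Datastore;
import org.mongodb.morphia.UpdateOptions;
import org.mongodb.morphia.query.Query;
import org.mongodb.morphia.query.UpdateOperations;

// Inserts a new document if nothing matches the absolutePath, otherwise
// updates the existing one, so the scan can never create duplicates.
public void upsertMovie(Datastore datastore, String fileName, String absolutePath) {
    Query<Movie> query = datastore.createQuery(Movie.class)
            .field("absolutePath").equal(absolutePath);

    UpdateOperations<Movie> ops = datastore.createUpdateOperations(Movie.class)
            .set("fileName", fileName)
            .set("absolutePath", absolutePath);

    datastore.update(query, ops, new UpdateOptions().upsert(true));
}

With the upsert approach you also skip the extra findByAbsolutePath round trip per file, since the check and the write happen in a single operation.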