Recently I used the Python module dataset to manipulate and store information. As a result, I have a collection of SQLite databases, say file1.db, file2.db, and so on. Moreover, each of the databases contains the same table.
With dataset I can easily connect to and inspect the databases with this code:
>>> db1 = dataset.connect('sqlite:////path/file1.db')
>>> table1 = db1[u'tweet']
Assuming I want to keep the databases separated across many files, what is the preferred way to inspect all of them with dataset?
I am looking for something better than this:
>>> db1 = dataset.connect('sqlite:////path/file1.db')
>>> db2 = dataset.connect('sqlite:////path/file2.db')
>>> tables = [db1[u'tweet'], db2[u'tweet']]
>>> for table in tables:
...     for tweet in table:
...         print(tweet['text'])
I don't know a clean solution to this, but it might be interesting to use an in-memory SQLite database for this scenario:
There's also an insert_many call, I believe, which may be faster for bulk transfer.