Recently I have been working with SQLite. Like many of you, I sometimes need to insert a large amount of data into a database. There may be thousands of records, or even more, waiting for you. Inserting them into an SQLite database can be a nightmare: it seems to take forever. Ask Google about this and you will probably be told to fine-tune the SQLite parameters. Yes, that is one way, but if you are not doing this too frequently, I have another, somewhat dirty, suggestion for you.
Here is what I did:
- Somehow you have prepared the insert statements in a text file (e.g. insert.sql). I got mine from a colleague who dumped a whole database with tons of records into SQL format using the .dump command.
- Use sqlite3.exe to open the database into which the data will be inserted:
C:\> sqlite3.exe mydb.db
- Back up the database into a file (e.g. original.db) using the .backup command:
sqlite> .backup original.db
- If the target tables are not empty, dump them into SQL format too, using the .dump command (e.g. into data.sql). Better also dump other related tables if you have set up certain kinds of constraints (e.g. foreign keys). Then you may quit the utility:
sqlite> .output data.sql
sqlite> .dump table1 table2
sqlite> .output stdout
- Run sqlite3.exe again without specifying a database file, so that an in-memory database will be used:
C:\> sqlite3.exe
- Restore from your database backup (original.db in the example) using the .restore command:
sqlite> .restore original.db
- If you dumped non-empty tables earlier, drop those target tables (and the related ones), then load and execute the table data backup with the .read command:
sqlite> DROP TABLE table1;
sqlite> DROP TABLE table2;
sqlite> .read data.sql
- Now load and execute the insert statements with the .read command. Hey, don't bother planning a tea break, it will finish shortly:
sqlite> .read insert.sql
- Back up the database using the .backup command:
sqlite> .backup new.db
- Now you can restore the backup anywhere you like. I restored it to a new database file and then replaced my existing one.
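If you prefer to script the same trick, the steps above can be sketched with Python's built-in sqlite3 module, whose Connection.backup API plays the role of the shell's .backup and .restore. This is just a sketch: the file names, the table t, and the row count are made-up stand-ins for mydb.db and insert.sql.

```python
import os
import sqlite3
import tempfile

# Hypothetical stand-ins for mydb.db and insert.sql.
db_path = os.path.join(tempfile.mkdtemp(), "mydb.db")
insert_sql = "INSERT INTO t(x) VALUES (1);\n" * 1000

# Prepare a small on-disk database to play the role of mydb.db.
disk = sqlite3.connect(db_path)
disk.execute("CREATE TABLE t(x INTEGER)")
disk.commit()

# Copy the disk database into memory (the .backup / .restore steps).
mem = sqlite3.connect(":memory:")
disk.backup(mem)
disk.close()

# Run the bulk inserts against the in-memory copy (.read insert.sql);
# this is the fast part, since no disk I/O happens per statement.
mem.executescript(insert_sql)

# Back the finished database up to a new file (.backup new.db).
new_path = os.path.join(os.path.dirname(db_path), "new.db")
new = sqlite3.connect(new_path)
mem.backup(new)
mem.close()

# The new file now holds all the inserted rows.
count = new.execute("SELECT COUNT(*) FROM t").fetchone()[0]
new.close()
```

The same pattern works for any SQL script: only the two backup calls touch the disk, once at the start and once at the end.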
BTW, I later found out that this may be related to transactions. Making the whole batch a single transaction may help, as suggested in the SQLite FAQ. The journal file may also take some time, so temporarily turning off the journal with the PRAGMA journal_mode = OFF; statement before inserting might actually help too. I haven't verified these... Feel free to try them out :p
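The transaction point is easy to demonstrate with Python's sqlite3 module: using the connection as a context manager wraps the whole batch in one transaction, so there is a single commit at the end instead of one per INSERT. The table name, file name, and row count below are made up for illustration, and the journal pragma is the one mentioned above.

```python
import os
import sqlite3
import tempfile

# Hypothetical throwaway database file for the demonstration.
db_path = os.path.join(tempfile.mkdtemp(), "bulk.db")
conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE records(id INTEGER, payload TEXT)")

# Skip the rollback journal for the bulk load, as PRAGMA journal_mode = OFF;
# suggests. Unsafe if the process crashes mid-write, so only use it for
# loads you can simply redo from scratch.
conn.execute("PRAGMA journal_mode = OFF")

rows = [(i, "row %d" % i) for i in range(10000)]

# One transaction for the whole batch: everything commits once when the
# "with" block exits, instead of once per INSERT in autocommit mode.
with conn:
    conn.executemany("INSERT INTO records VALUES (?, ?)", rows)

total = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
conn.close()
```

I haven't benchmarked this against the in-memory trick, but batching into one transaction is the standard first fix for slow bulk inserts.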