Hi folks,
In this post I’ll share my recent experience with importing a large SQLite table. By large, I mean a table with a very large number of rows.
Usually I use DB Browser for SQLite to manage the database.
So at first I thought I’d use the export and import functionality in DB Browser for SQLite.
I exported to an SQL file, made my changes to the dump, and then tried importing that dump to a new database using DB Browser for SQLite.
Lo and behold, it was super duper slow. Once I realized it would take forever, I decided there must be a faster way.
More research showed that SQLite ships with a command-line tool that can carry out imports. I learnt that a good approach is to export to CSV format, then import the file using the SQLite command-line tool. To export to CSV, use the following command:
sqlite3 -header -csv your_db.db "select * from your_table;" > your_csv_file.csv
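In case it helps, here is a tiny self-contained demo of that export command, using a throwaway database and table (all names below are made-up examples, not from my actual project):

```shell
# Create a small throwaway database to export from
# (database, table, and file names are examples only).
rm -f /tmp/src.db
sqlite3 /tmp/src.db "create table t(a,b); insert into t values(1,'x'),(2,'y');"

# Same export command as above: -header keeps the column names,
# -csv switches the output format to CSV.
sqlite3 -header -csv /tmp/src.db "select * from t;" > /tmp/t.csv

# /tmp/t.csv now holds the header row plus the two data rows.
```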
To import a CSV file, use the following commands:
sqlite3 your_db_file.db
.mode csv
.import /path/to/your/csv_file.csv your_table_name
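Two things worth knowing about how the sqlite3 shell behaves here: dot-commands can also be passed as command-line arguments, so the import can run non-interactively from a script, and if the target table does not exist yet, .import creates it using the CSV header row as the column names (if the table already exists, the header row gets imported as data). A minimal sketch with throwaway names:

```shell
# Throwaway demo: build a small CSV and import it in one command.
# File, database, and table names are examples only.
rm -f /tmp/demo.db
printf 'id,name\n1,alice\n2,bob\n' > /tmp/demo.csv

# Dot-commands given as arguments run in order, then sqlite3 exits.
sqlite3 /tmp/demo.db ".mode csv" ".import /tmp/demo.csv people"

# The 'people' table was created from the header row and holds two rows.
sqlite3 /tmp/demo.db "select count(*) from people;"   # prints 2
```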
So I tried it. Boy, was it much faster.
So, the takeaway is that if you must export and later import a large SQLite table, first export the database to a CSV file, then use the SQLite command-line utility to import the file(s) into the new database.
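Putting it all together, the whole round trip can be sketched as a small script: export each table of the source database to its own CSV file, then import them all into a fresh database. Everything below (paths, database, and table names) is a made-up example:

```shell
# Round-trip sketch; all names are throwaway examples.
src=/tmp/roundtrip_src.db
dst=/tmp/roundtrip_dst.db
rm -f "$src" "$dst"

# Stand-in for the real source database.
sqlite3 "$src" "create table users(id,name); insert into users values(1,'a'),(2,'b');"

# Export every table to its own CSV, then import it into the new database.
for t in $(sqlite3 "$src" "select name from sqlite_master where type='table';"); do
  sqlite3 -header -csv "$src" "select * from $t;" > "/tmp/$t.csv"
  sqlite3 "$dst" ".mode csv" ".import /tmp/$t.csv $t"
done

sqlite3 "$dst" "select count(*) from users;"   # prints 2
```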
That’s all for now. Happy software development.