Python CLI utility and library for manipulating SQLite databases
Apache-2.0 License
Published by simonw over 2 years ago
- Improved --help output - see CLI reference. (#384)

Published by simonw almost 3 years ago
- New CLI reference documentation page listing the --help for every one of the CLI commands. (#383)
- sqlite-utils rows now has --limit and --offset options for paginating through data. (#381)
- sqlite-utils rows now has --where and -p options for filtering the table using a WHERE query, see Returning all rows in a table. (#382)

Published by simonw almost 3 years ago
CLI and Python library improvements to help run ANALYZE after creating indexes or inserting rows, to gain better performance from the SQLite query planner when it runs against indexes.
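As background for these changes: ANALYZE stores statistics in the sqlite_stat1 table, which the query planner consults when deciding whether an index is worth using. A minimal sketch using only Python's stdlib sqlite3 module (the table and index names here are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plants (id INTEGER PRIMARY KEY, species TEXT)")
conn.executemany(
    "INSERT INTO plants (species) VALUES (?)",
    [("fern",)] * 90 + [("cactus",)] * 10,
)
conn.execute("CREATE INDEX idx_species ON plants(species)")

# ANALYZE writes statistics into sqlite_stat1; the query planner
# reads them when choosing whether to use an index
conn.execute("ANALYZE")

stats = conn.execute("SELECT tbl, idx, stat FROM sqlite_stat1").fetchall()
print(stats)
```

Running ANALYZE again after bulk inserts keeps these statistics current, which is what the new CLI and Python hooks automate.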
Three new CLI commands: create-database, analyze and bulk.
More details and examples can be found in the annotated release notes.
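The idea behind the bulk command - load records once, then execute the same parametrized SQL statement for each of them - can be sketched with the stdlib alone (the chapters table and records below are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE chapters (id INTEGER PRIMARY KEY, title TEXT)")

# Records as they might arrive from a JSON, CSV or TSV import
records = [{"id": 1, "title": "One"}, {"id": 2, "title": "Two"}]

# Execute one parametrized statement per record
conn.executemany(
    "INSERT INTO chapters (id, title) VALUES (:id, :title)",
    records,
)
conn.commit()

count = conn.execute("SELECT count(*) FROM chapters").fetchone()[0]
print(count)  # 2
```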
- New sqlite-utils create-database command for creating new empty database files. (#348)
- New Python methods for running ANALYZE against a database, table or index: db.analyze() and table.analyze(), see Optimizing index usage with ANALYZE. (#366)
- New sqlite-utils analyze command for running ANALYZE using the CLI. (#379)
- The create-index, insert and upsert commands now have a new --analyze option for running ANALYZE after the command has completed. (#379)
- New sqlite-utils bulk command, which can import records in the same way as sqlite-utils insert (from JSON, CSV or TSV) and use them to bulk execute a parametrized SQL query. (#375)
- The CLI tool can now also be run using python -m sqlite_utils. (#368)
- Using --fmt now implies --table, so you don't need to pass both options. (#374)
- The --convert function applied to rows can now modify the row in place. (#371)
- The insert-files command supports two new columns: stem and suffix. (#372)
- The --nl import option now ignores blank lines in the input. (#376)
- Fixed a bug where the insert command with --batch-size 1 would appear to only commit after several rows had been ingested, due to unnecessary input buffering. (#364)

Published by simonw almost 3 years ago
- New sqlite-utils insert ... --lines option to insert the lines from a file into a table with a single line column, see Inserting unstructured data with --lines and --text.
- New sqlite-utils insert ... --text option to insert the contents of the file into a table with a single text column and a single row.
- sqlite-utils insert ... --convert allows a Python function to be provided that will be used to convert each row that is being inserted into the database. See Applying conversions while inserting data, including details on special behavior when combined with --lines and --text. (#356)
- sqlite-utils convert now accepts a code value of - to read code from standard input. (#353)
- sqlite-utils convert also now accepts code that defines a named convert(value) function, see Converting data in columns.
- New db.supports_strict property showing if the database connection supports SQLite strict tables.
- New table.strict property (see .strict) indicating if the table uses strict mode. (#344)
- Fixed a bug where sqlite-utils upsert ... --detect-types ignored the --detect-types option. (#362)

Published by simonw almost 3 years ago
- New keyword arguments accepted by the table.insert() method: foreign_keys=, column_order=, not_null=, defaults=, extracts=, conversions= and columns=. You can also now pass pk= to specify a different column name to use for the primary key. (#342)

Published by simonw almost 3 years ago
- The table.lookup() method now has an optional second argument which can be used to populate columns only the first time the record is created, see Working with lookup tables. (#339)
- sqlite-utils memory now has a --flatten option for flattening nested JSON objects into separate columns, consistent with sqlite-utils insert. (#332)
- New table.create_index(..., find_unique_name=True) parameter, which finds an available name for the created index even if the default name has already been taken. This means that index-foreign-keys will work even if one of the indexes it tries to create clashes with an existing index name. (#335)
- Added py.typed to the module, so mypy should now correctly pick up the type annotations. Thanks, Andreas Longo. (#331)
- Now depends on python-dateutil instead of depending on dateutils. Thanks, Denys Pavlov. (#324)
- table.create() (see Explicitly creating a table) now handles dict, list and tuple types, mapping them to TEXT columns in SQLite so that they can be stored encoded as JSON. (#338)
- Columns with square braces in their names (for example item[price]) now have the braces converted to underscores: item_price_. Previously such columns would be rejected with an error. (#329)

Published by simonw about 3 years ago
- The CLI now outputs [] in JSON mode if no rows are returned. (#328)

Published by simonw about 3 years ago
- sqlite-utils memory now has a --analyze option, which runs the equivalent of the analyze-tables command directly against the in-memory database created from the incoming CSV or JSON data. (#320)
- sqlite-utils insert-files can now insert file contents into TEXT columns in addition to the default BLOB. Pass the --text option or use content_text as a column specifier. (#319)

Published by simonw about 3 years ago
- Type signatures added to more methods, including table.resolve_foreign_keys(), db.create_table_sql(), db.create_table() and table.create(). (#314)
- New db.quote_fts(value) method, see Quoting characters for use in search - thanks, Mark Neumann. (#246)
- table.search() now accepts an optional quote=True parameter. (#296)
- sqlite-utils search now accepts a --quote option. (#296)
- Fixed a bug where the --no-headers and --tsv options to sqlite-utils insert could not be used together. (#295)

Published by simonw about 3 years ago
- Fixed a bug where .add_foreign_keys() failed to raise an error if called against a View. (#313)
- Fixed a bug where .delete_where() returned a [] instead of returning self if called against a non-existent table. (#315)

Published by simonw about 3 years ago
- New sqlite-utils insert --flatten option for flattening nested JSON objects to create tables with column names like topkey_nestedkey. (#310)
- Errors raised by the sqlite-utils CLI tool now show the responsible SQL and query parameters, if possible. (#309)

Published by simonw about 3 years ago
This release introduces the new sqlite-utils convert command (#251) and corresponding table.convert(...) Python method (#302). These tools can be used to apply a Python conversion function to one or more columns of a table, either updating the column in place or using transformed data from that column to populate one or more other columns.
This command-line example uses the Python standard library textwrap module to wrap the content of the content column in the articles table to 100 characters:
$ sqlite-utils convert content.db articles content \
    '"\n".join(textwrap.wrap(value, 100))' \
    --import=textwrap
The same operation in Python code looks like this:
import sqlite_utils, textwrap
db = sqlite_utils.Database("content.db")
db["articles"].convert("content", lambda v: "\n".join(textwrap.wrap(v, 100)))
See the full documentation for the sqlite-utils convert command and the table.convert(...) Python method for more details.
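A conceptually equivalent version of that conversion, using only the standard library, registers the Python function with SQLite and applies it via UPDATE. This is an illustration of the idea, not how sqlite-utils actually implements table.convert():

```python
import sqlite3
import textwrap

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, content TEXT)")
conn.execute("INSERT INTO articles (content) VALUES (?)", ("word " * 40,))

# Expose the conversion function to SQL, then apply it with UPDATE
conn.create_function(
    "wrap100", 1, lambda value: "\n".join(textwrap.wrap(value, 100))
)
conn.execute("UPDATE articles SET content = wrap100(content)")

wrapped = conn.execute("SELECT content FROM articles").fetchone()[0]
print(wrapped)
```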
Also in this release:
- New table.count_where(...) method, for counting rows in a table that match a specific SQL WHERE clause. (#305)
- New --silent option for the sqlite-utils insert-files command to hide the terminal progress bar, consistent with the --silent option for sqlite-utils convert. (#301)

Published by simonw over 3 years ago
- The project now uses flake8 and has started to use mypy. (#291)

Published by simonw over 3 years ago
- New sqlite-utils memory data.csv --schema option, for outputting the schema of the in-memory database generated from one or more files. See --schema, --dump and --save. (#288)

Published by simonw over 3 years ago
This release introduces the sqlite-utils memory command, which can be used to load CSV or JSON data into a temporary in-memory database and run SQL queries (including joins across multiple files) directly against that data.
Also new: sqlite-utils insert --detect-types, sqlite-utils dump, table.use_rowid, plus some smaller fixes.
This example of sqlite-utils memory retrieves information about all of the repositories in the Dogsheep organization on GitHub using this JSON API, sorts them by their number of stars and outputs a table of the top five (using -t):
$ curl -s 'https://api.github.com/users/dogsheep/repos' \
| sqlite-utils memory - '
select full_name, forks_count, stargazers_count
from stdin order by stargazers_count desc limit 5
' -t
full_name forks_count stargazers_count
--------------------------------- ------------- ------------------
dogsheep/twitter-to-sqlite 12 225
dogsheep/github-to-sqlite 14 139
dogsheep/dogsheep-photos 5 116
dogsheep/dogsheep.github.io 7 90
dogsheep/healthkit-to-sqlite 4 85
The tool works against files on disk as well. This example joins data from two CSV files:
$ cat creatures.csv
species_id,name
1,Cleo
2,Bants
2,Dori
2,Azi
$ cat species.csv
id,species_name
1,Dog
2,Chicken
$ sqlite-utils memory species.csv creatures.csv '
select * from creatures join species on creatures.species_id = species.id
'
[{"species_id": 1, "name": "Cleo", "id": 1, "species_name": "Dog"},
{"species_id": 2, "name": "Bants", "id": 2, "species_name": "Chicken"},
{"species_id": 2, "name": "Dori", "id": 2, "species_name": "Chicken"},
{"species_id": 2, "name": "Azi", "id": 2, "species_name": "Chicken"}]
Here the species.csv file becomes the species table, the creatures.csv file becomes the creatures table, and the output is JSON, the default output format.
You can also use the --attach option to attach existing SQLite database files to the in-memory database, in order to join data from CSV or JSON directly against your existing tables.
Full documentation of this new feature is available in Querying data directly using an in-memory database. (#272)
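The core of what sqlite-utils memory does with CSV input can be approximated in a few lines of stdlib Python: parse each file into a table in an in-memory database, then run SQL across them. Column values are left as text here, unlike the real command's type handling, and the CSV contents mirror the example above:

```python
import csv
import io
import sqlite3

species_csv = "id,species_name\n1,Dog\n2,Chicken\n"
creatures_csv = "species_id,name\n1,Cleo\n2,Bants\n2,Dori\n2,Azi\n"

conn = sqlite3.connect(":memory:")
for table, text in [("species", species_csv), ("creatures", creatures_csv)]:
    rows = list(csv.DictReader(io.StringIO(text)))
    cols = list(rows[0].keys())
    # Table and column names come from trusted literals here, so
    # building SQL with f-strings is acceptable for this sketch
    conn.execute(f"CREATE TABLE {table} ({', '.join(cols)})")
    conn.executemany(
        f"INSERT INTO {table} VALUES ({', '.join('?' for _ in cols)})",
        [tuple(r.values()) for r in rows],
    )

results = conn.execute(
    "SELECT creatures.name, species.species_name "
    "FROM creatures JOIN species ON creatures.species_id = species.id"
).fetchall()
print(results)
```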
The sqlite-utils insert command can be used to insert data from JSON, CSV or TSV files into a SQLite database file. The new --detect-types option (shortcut -d), when used with a CSV or TSV import, will automatically detect whether columns in the file are integers or floating point numbers, rather than treating everything as text, and will create the new table with the corresponding schema. See Inserting CSV or TSV data for details. (#282)
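The kind of inference --detect-types performs can be sketched as follows. This is a simplification for illustration, not the library's actual implementation: try casting every value to int, then float, and fall back to text:

```python
def detect_type(values):
    """Pick the narrowest column type that fits every sampled value."""
    for candidate, sql_type in ((int, "INTEGER"), (float, "FLOAT")):
        try:
            for value in values:
                candidate(value)
            return sql_type
        except ValueError:
            continue
    return "TEXT"

print(detect_type(["1", "2", "3"]))   # INTEGER
print(detect_type(["1.5", "2"]))      # FLOAT
print(detect_type(["1", "mixed"]))    # TEXT
```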
- Fixed a bug where table.transform(), when run against a table without explicit primary keys, would incorrectly create a new version of the table with an explicit primary key column called rowid. (#284)
- New table.use_rowid introspection property, see .use_rowid. (#285)
- The new sqlite-utils dump file.db command outputs a SQL dump that can be used to recreate a database. (#274)
- -h now works as a shortcut for --help, thanks Loren McIntyre. (#276)
- Errors that occur while using sqlite-utils query are now displayed as CLI errors.

Published by simonw over 3 years ago
- New sqlite-utils schema command showing the full SQL schema for a database, see Showing the schema (CLI). (#268)
- New db.schema introspection property exposing the same feature to the Python library, see Showing the schema (Python library).

Published by simonw over 3 years ago
- New sqlite-utils indexes command to list indexes in a database, see Listing indexes. (#263)
- New table.xindexes introspection property returning more details about that table's indexes, see .xindexes. (#261)
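This kind of index introspection is built on SQLite's own pragmas. A stdlib sketch of listing a table's indexes and the columns they cover (table and index names here are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dogs (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE INDEX idx_dogs_name ON dogs(name)")

# PRAGMA index_list shows each index on a table;
# PRAGMA index_info lists the columns an index covers
indexes = [row[1] for row in conn.execute("PRAGMA index_list(dogs)")]
columns = [row[2] for row in conn.execute("PRAGMA index_info(idx_dogs_name)")]
print(indexes, columns)
```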