Releases: simonw/sqlite-utils
3.17
- The sqlite-utils memory command has a new --analyze option, which runs the equivalent of the analyze-tables command directly against the in-memory database created from the incoming CSV or JSON data. (#320)
- sqlite-utils insert-files now has the ability to insert file contents into TEXT columns in addition to the default BLOB. Pass the --text option or use content_text as a column specifier. (#319)
3.16
- Type signatures added to more methods, including table.resolve_foreign_keys(), db.create_table_sql(), db.create_table() and table.create(). (#314)
- New db.quote_fts(value) method, see Quoting characters for use in search - thanks, Mark Neumann. (#246)
- table.search() now accepts an optional quote=True parameter (see the sketch after this list). (#296)
- CLI command sqlite-utils search now accepts a --quote option. (#296)
- Fixed bug where the --no-headers and --tsv options to sqlite-utils insert could not be used together. (#295)
- Various small improvements to API reference documentation.
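A minimal sketch of the new quoting helpers, assuming a hypothetical content.db database with a full-text search enabled articles table:
import sqlite_utils

db = sqlite_utils.Database("content.db")  # hypothetical database with an FTS-enabled "articles" table
# Escape FTS query syntax characters (such as double quotes) in a user-supplied term
print(db.quote_fts('server.vm["windows"]'))
# Or let search() apply the same quoting automatically
rows = list(db["articles"].search('server.vm["windows"]', quote=True))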
3.15.1
- Python library now includes type annotations on almost all of the methods, plus detailed docstrings describing each one. (#311)
- New API Reference documentation page, powered by those docstrings.
- Fixed bug where .add_foreign_keys() failed to raise an error if called against a View. (#313)
- Fixed bug where .delete_where() returned [] instead of returning self if called against a non-existent table. (#315)
3.15
- sqlite-utils insert --flatten option for flattening nested JSON objects to create tables with column names like topkey_nestedkey (see the sketch after this list). (#310)
- Fixed several spelling mistakes in the documentation, spotted using codespell.
- Errors that occur while using the sqlite-utils CLI tool now show the responsible SQL and query parameters, if possible. (#309)
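As a rough illustration of the flattening behaviour (a sketch of the idea, not the library's own implementation), nested keys are joined with underscores to build the column names:
# Illustrative sketch only: shows how nested JSON keys map to topkey_nestedkey columns
def flatten_row(row, prefix=""):
    flat = {}
    for key, value in row.items():
        if isinstance(value, dict):
            flat.update(flatten_row(value, prefix + key + "_"))
        else:
            flat[prefix + key] = value
    return flat

print(flatten_row({"topkey": {"nestedkey": 1}, "name": "demo"}))
# {'topkey_nestedkey': 1, 'name': 'demo'}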
3.14
This release introduces the new sqlite-utils convert command (#251) and corresponding table.convert(...) Python method (#302). These tools can be used to apply a Python conversion function to one or more columns of a table, either updating the column in place or using transformed data from that column to populate one or more other columns.
This command-line example uses the Python standard library textwrap module to wrap the content of the content column in the articles table to 100 characters:
$ sqlite-utils convert content.db articles content \
    '"\n".join(textwrap.wrap(value, 100))' \
    --import=textwrap
The same operation in Python code looks like this:
import sqlite_utils, textwrap
db = sqlite_utils.Database("content.db")
db["articles"].convert("content", lambda v: "\n".join(textwrap.wrap(v, 100)))
See the full documentation for the sqlite-utils convert command and the table.convert(...) Python method for more details.
Also in this release:
- The new table.count_where(...) method, for counting rows in a table that match a specific SQL WHERE clause (see the sketch after this list). (#305)
- New --silent option for the sqlite-utils insert-files command to hide the terminal progress bar, consistent with the --silent option for sqlite-utils convert. (#301)
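A minimal sketch of count_where(), assuming a hypothetical content.db database with an articles table that has an author column:
import sqlite_utils

db = sqlite_utils.Database("content.db")  # hypothetical database
# Count only the rows matching a SQL WHERE clause, with optional parameters
matching = db["articles"].count_where("author = ?", ["Ann"])
print(matching)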
3.13
3.12
- New db.query(sql, params) method, which executes a SQL query and returns the results as an iterator over Python dictionaries (see the sketch after this list). (#290)
- This project now uses flake8 and has started to use mypy. (#291)
- New documentation on contributing to this project. (#292)
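A minimal sketch of db.query(), assuming a hypothetical dogs.db database containing a dogs table:
import sqlite_utils

db = sqlite_utils.Database("dogs.db")  # hypothetical database with a "dogs" table
# db.query() returns an iterator over dictionaries, one per row
for row in db.query("select * from dogs where age > :age", {"age": 3}):
    print(row)  # e.g. {"id": 1, "name": "Cleo", "age": 5}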
3.11
- New sqlite-utils memory data.csv --schema option, for outputting the schema of the in-memory database generated from one or more files. See --schema, --dump and --save. (#288)
- Added installation instructions. (#286)
3.10
This release introduces the sqlite-utils memory command, which can be used to load CSV or JSON data into a temporary in-memory database and run SQL queries (including joins across multiple files) directly against that data.
Also new: sqlite-utils insert --detect-types, sqlite-utils dump and table.use_rowid, plus some smaller fixes.
sqlite-utils memory
This example of sqlite-utils memory retrieves information about all of the repositories in the Dogsheep organization on GitHub using this JSON API, sorts them by their number of stars and outputs a table of the top five (using -t):
$ curl -s 'https://api.github.com/users/dogsheep/repos' \
  | sqlite-utils memory - '
      select full_name, forks_count, stargazers_count
      from stdin order by stargazers_count desc limit 5
    ' -t
full_name                            forks_count    stargazers_count
---------------------------------  -------------  ------------------
dogsheep/twitter-to-sqlite                    12                 225
dogsheep/github-to-sqlite                     14                 139
dogsheep/dogsheep-photos                       5                 116
dogsheep/dogsheep.github.io                    7                  90
dogsheep/healthkit-to-sqlite                   4                  85
The tool works against files on disk as well. This example joins data from two CSV files:
$ cat creatures.csv
species_id,name
1,Cleo
2,Bants
2,Dori
2,Azi
$ cat species.csv
id,species_name
1,Dog
2,Chicken
$ sqlite-utils memory species.csv creatures.csv '
select * from creatures join species on creatures.species_id = species.id
'
[{"species_id": 1, "name": "Cleo", "id": 1, "species_name": "Dog"},
{"species_id": 2, "name": "Bants", "id": 2, "species_name": "Chicken"},
{"species_id": 2, "name": "Dori", "id": 2, "species_name": "Chicken"},
{"species_id": 2, "name": "Azi", "id": 2, "species_name": "Chicken"}]
Here the species.csv file becomes the species table, the creatures.csv file becomes the creatures table, and the output is JSON, the default output format.
You can also use the --attach option to attach existing SQLite database files to the in-memory database, in order to join data from CSV or JSON directly against your existing tables.
Full documentation of this new feature is available in Querying data directly using an in-memory database. (#272)
sqlite-utils insert --detect-types
The sqlite-utils insert command can be used to insert data from JSON, CSV or TSV files into a SQLite database file. The new --detect-types option (shortcut -d), when used with a CSV or TSV import, will automatically detect whether columns in the file contain integers or floating point numbers, rather than treating everything as text, and will create the new table with the corresponding schema. See Inserting CSV or TSV data for details. (#282)
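As a rough illustration of the idea (a sketch only, not the library's own detection code), each value in a CSV column can be tested to decide whether the whole column fits as an integer or float rather than text:
# Illustrative sketch only: naive type detection for a column of CSV strings
def detect_type(values):
    for candidate in (int, float):
        try:
            for value in values:
                candidate(value)
            return candidate.__name__
        except ValueError:
            continue
    return "text"

print(detect_type(["1", "2", "3"]))    # int
print(detect_type(["1.5", "2"]))       # float
print(detect_type(["1", "chicken"]))   # text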
Other changes
- Bug fix: table.transform(), when run against a table without explicit primary keys, would incorrectly create a new version of the table with an explicit primary key column called rowid. (#284)
- New table.use_rowid introspection property, see .use_rowid and the sketch at the end of these notes. (#285)
- The new sqlite-utils dump file.db command outputs a SQL dump that can be used to recreate a database. (#274)
- -h now works as a shortcut for --help, thanks, Loren McIntyre. (#276)
- Now using pytest-cov and Codecov to track test coverage - currently at 96%. (#275)
- SQL errors that occur when using sqlite-utils query are now displayed as CLI errors.
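A minimal sketch of the use_rowid property, using a throwaway in-memory database with one table created without an explicit primary key and one with:
import sqlite_utils

db = sqlite_utils.Database(memory=True)
db["no_pk"].insert({"name": "Cleo"})
db["with_pk"].insert({"id": 1, "name": "Cleo"}, pk="id")
print(db["no_pk"].use_rowid)    # True - queries fall back to the implicit rowid
print(db["with_pk"].use_rowid)  # False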