Supported formats

|Name|Description|Dependencies|
|-|-|-|
|aac_frame|Advanced Audio Coding frame||
|adts|Audio Data Transport Stream|adts_frame|
|adts_frame|Audio Data Transport Stream frame|aac_frame|
|aiff|Audio Interchange File Format||
|amf0|Action Message Format 0||
|apev2|APEv2 metadata tag|image|
|apple_bookmark|Apple BookmarkData||
|ar|Unix archive|probe|
|asn1_ber|ASN1 BER (basic encoding rules, also CER and DER)||
|av1_ccr|AV1 Codec Configuration Record||
|av1_frame|AV1 frame|av1_obu|
|av1_obu|AV1 Open Bitstream Unit||
|avc_annexb|H.264/AVC Annex B|avc_nalu|
|avc_au|H.264/AVC Access Unit|avc_nalu|
|avc_dcr|H.264/AVC Decoder Configuration Record|avc_nalu|
|avc_nalu|H.264/AVC Network Access Layer Unit|avc_sps avc_pps avc_sei|
|avc_pps|H.264/AVC Picture Parameter Set||
|avc_sei|H.264/AVC Supplemental Enhancement Information||
|avc_sps|H.264/AVC Sequence Parameter Set||
|avi|Audio Video Interleaved|avc_au hevc_au mp3_frame flac_frame|
|avro_ocf|Avro object container file||
|bencode|BitTorrent bencoding||
|bitcoin_blkdat|Bitcoin blk.dat|bitcoin_block|
|bitcoin_block|Bitcoin block|bitcoin_transaction|
|bitcoin_script|Bitcoin script||
|bitcoin_transaction|Bitcoin transaction|bitcoin_script|
|bits|Raw bits||
|bplist|Apple Binary Property List||
|bsd_loopback_frame|BSD loopback frame|inet_packet|
|bson|Binary JSON||
|bytes|Raw bytes||
|bzip2|bzip2 compression|probe|
|caff|Live2D Cubism archive|probe|
|cbor|Concise Binary Object Representation||
|csv|Comma separated values||
|dns|DNS packet||
|dns_tcp|DNS packet (TCP)||
|elf|Executable and Linkable Format||
|ether8023_frame|Ethernet 802.3 frame|inet_packet|
|exif|Exchangeable Image File Format||
|fairplay_spc|FairPlay Server Playback Context||
|fit|Garmin Flexible and Interoperable Data Transfer||
|flac|Free Lossless Audio Codec file|flac_metadatablocks flac_frame|
|flac_frame|FLAC frame||
|flac_metadatablock|FLAC metadatablock|flac_streaminfo flac_picture vorbis_comment|
|flac_metadatablocks|FLAC metadatablocks|flac_metadatablock|
|flac_picture|FLAC metadatablock picture|image|
|flac_streaminfo|FLAC streaminfo||
|gif|Graphics Interchange Format||
|gzip|gzip compression|probe|
|hevc_annexb|H.265/HEVC Annex B|hevc_nalu|
|hevc_au|H.265/HEVC Access Unit|hevc_nalu|
|hevc_dcr|H.265/HEVC Decoder Configuration Record|hevc_nalu|
|hevc_nalu|H.265/HEVC Network Access Layer Unit|hevc_vps hevc_pps hevc_sps|
|hevc_pps|H.265/HEVC Picture Parameter Set||
|hevc_sps|H.265/HEVC Sequence Parameter Set||
|hevc_vps|H.265/HEVC Video Parameter Set||
|html|HyperText Markup Language||
|icc_profile|International Color Consortium profile||
|icmp|Internet Control Message Protocol||
|icmpv6|Internet Control Message Protocol v6||
|id3v1|ID3v1 metadata||
|id3v11|ID3v1.1 metadata||
|id3v2|ID3v2 metadata|image|
|ipv4_packet|Internet protocol v4 packet|ip_packet|
|ipv6_packet|Internet protocol v6 packet|ip_packet|
|jp2c|JPEG 2000 codestream||
|jpeg|Joint Photographic Experts Group file|exif icc_profile|
|json|JavaScript Object Notation||
|jsonl|JavaScript Object Notation Lines||
|leveldb_descriptor|LevelDB Descriptor||
|leveldb_log|LevelDB Log||
|leveldb_table|LevelDB Table||
|luajit|LuaJIT 2.0 bytecode||
|macho|Mach-O macOS executable||
|macho_fat|Fat Mach-O macOS executable (multi-architecture)|macho|
|markdown|Markdown||
|matroska|Matroska file|aac_frame av1_ccr av1_frame avc_au avc_dcr flac_frame flac_metadatablocks hevc_au hevc_dcr image mp3_frame mpeg_asc mpeg_pes_packet mpeg_spu opus_packet vorbis_packet vp8_frame vp9_cfm vp9_frame|
|moc3|MOC3 file||
|mp3|MP3 file|id3v2 id3v1 id3v11 apev2 mp3_frame|
|mp3_frame|MPEG audio layer 3 frame|mp3_frame_tags|
|mp3_frame_vbri|MP3 frame Fraunhofer encoder variable bitrate tag||
|mp3_frame_xing|MP3 frame Xing/Info tag||
|mp4|ISOBMFF, QuickTime and similar|aac_frame av1_ccr av1_frame avc_au avc_dcr flac_frame flac_metadatablocks hevc_au hevc_dcr icc_profile id3v2 image jp2c jpeg mp3_frame mpeg_es mpeg_pes_packet opus_packet png prores_frame protobuf_widevine pssh_playready vorbis_packet vp9_frame vpx_ccr|
|mpeg_asc|MPEG-4 Audio Specific Config||
|mpeg_es|MPEG Elementary Stream|mpeg_asc vorbis_packet|
|mpeg_pes|MPEG Packetized elementary stream|mpeg_pes_packet mpeg_spu|
|mpeg_pes_packet|MPEG Packetized elementary stream packet||
|mpeg_spu|Sub Picture Unit (DVD subtitle)||
|mpeg_ts|MPEG Transport Stream||
|msgpack|MessagePack||
|nes|iNES/NES 2.0 cartridge ROM format||
|ogg|OGG file|ogg_page vorbis_packet opus_packet flac_metadatablock flac_frame|
|ogg_page|OGG page||
|opentimestamps|OpenTimestamps file||
|opus_packet|Opus packet|vorbis_comment|
|pcap|PCAP packet capture|link_frame tcp_stream ipv4_packet|
|pcapng|PCAPNG packet capture|link_frame tcp_stream ipv4_packet|
|pg_btree|PostgreSQL btree index file||
|pg_control|PostgreSQL control file||
|pg_heap|PostgreSQL heap file||
|png|Portable Network Graphics file|icc_profile exif|
|prores_frame|Apple ProRes frame||
|protobuf|Protobuf||
|protobuf_widevine|Widevine protobuf|protobuf|
|pssh_playready|PlayReady PSSH||
|rtmp|Real-Time Messaging Protocol|amf0 mpeg_asc|
|sll2_packet|Linux cooked capture encapsulation v2|inet_packet|
|sll_packet|Linux cooked capture encapsulation|inet_packet|
|tar|Tar archive|probe|
|tcp_segment|Transmission control protocol segment||
|tiff|Tag Image File Format|icc_profile|
|tls|Transport layer security|asn1_ber|
|toml|Tom's Obvious, Minimal Language||
|tzif|Time Zone Information Format||
|udp_datagram|User datagram protocol|udp_payload|
|vorbis_comment|Vorbis comment|flac_picture|
|vorbis_packet|Vorbis packet|vorbis_comment|
|vp8_frame|VP8 frame||
|vp9_cfm|VP9 Codec Feature Metadata||
|vp9_frame|VP9 frame||
|vpx_ccr|VPX Codec Configuration Record||
|wasm|WebAssembly Binary Format||
|wav|WAV file|id3v2 id3v1 id3v11|
|webp|WebP image|exif vp8_frame icc_profile xml|
|xml|Extensible Markup Language||
|yaml|YAML Ain't Markup Language||
|zip|ZIP archive|probe|
|image|Group|gif jp2c jpeg mp4 png tiff webp|
|inet_packet|Group|ipv4_packet ipv6_packet|
|ip_packet|Group|icmp icmpv6 tcp_segment udp_datagram|
|link_frame|Group|bsd_loopback_frame ether8023_frame ipv4_packet ipv6_packet sll2_packet sll_packet|
|mp3_frame_tags|Group|mp3_frame_vbri mp3_frame_xing|
|probe|Group|adts aiff apple_bookmark ar avi avro_ocf bitcoin_blkdat bplist bzip2 caff elf fit flac gif gzip html jp2c jpeg json jsonl leveldb_table luajit macho macho_fat matroska moc3 mp3 mp4 mpeg_ts nes ogg opentimestamps pcap pcapng png tar tiff toml tzif wasm wav webp xml yaml zip|
|tcp_stream|Group|dns_tcp rtmp tls|
|udp_payload|Group|dns|

Global format options

Currently the only global option is force, which is used to ignore some format assertion errors. It can be passed as a decode option or as a CLI -o option:

fq -d mp4 -o force=true file.mp4
fq -d bytes 'mp4({force: true})' file.mp4

Format details

aac_frame

Options

|Name|Default|Description|
|-|-|-|
|object_type|1|Audio object type|

Examples

Decode file using aac_frame options

$ fq -d aac_frame -o object_type=1 . file

Decode value as aac_frame

... | aac_frame({object_type:1})

apple_bookmark

Apple's bookmarkData format is used to encode information that can be resolved into a URL object for a file even if the user moves or renames it. Can also contain security scoping information for App Sandbox support.

These bookmarkData blobs are often found encoded in data fields of Binary Property Lists. Notable examples include:

  • com.apple.finder.plist - contains an FXRecentFolders value, which is an array of ten objects, each of which consists of a name and a file-bookmark field; the latter is a bookmarkData object for a recently accessed folder location.

  • com.apple.LSSharedFileList.RecentApplications.sfl2 - sfl2 files are actually plist files of the NSKeyedArchiver format. They can be parsed the same as plist files, but they have a more complicated tree-like structure than would typically be found, which can make locating and retrieving specific values difficult, even once it has been converted to a JSON representation. For more information about these types of files, see Sarah Edwards' excellent research on the subject (link in references).

fq's grep_by function can be used to recursively descend through the decoded tree, probing for and selecting any bookmark blobs, then converting them to readable JSON with torepr:

fq 'grep_by(.type=="data" and .value[0:4] == "book") | .value | apple_bookmark | torepr' <sfl2 file>

Authors

References

asn1_ber

Supports decoding BER, CER and DER (X.690).

  • Currently no extra validation is done for CER and DER.
  • Does not support specifying a schema.
  • Supports torepr but without schema all sequences and sets will be arrays.

Can be used to decode certificates, etc.

$ fq -d bytes 'from_pem | asn1_ber | d' cert.pem

Can decode nested values

$ fq -d asn1_ber '.constructed[1].value | asn1_ber' file.ber

Manual schema

$ fq -d asn1_ber 'torepr as $r | ["version", "modulus", "public_exponent", "private_exponent", "prime1", "prime2", "exponent1", "exponent2", "coefficient"] | with_entries({key: .value, value: $r[.key]})' pkcs1.der

References

avc_au

Options

|Name|Default|Description|
|-|-|-|
|length_size|0|Length value size|

Examples

Decode file using avc_au options

$ fq -d avc_au -o length_size=0 . file

Decode value as avc_au

... | avc_au({length_size:0})

avi

Options

|Name|Default|Description|
|-|-|-|
|decode_extended_chunks|true|Decode extended chunks|
|decode_samples|true|Decode samples|

Examples

Decode file using avi options

$ fq -d avi -o decode_extended_chunks=true -o decode_samples=true . file

Decode value as avi

... | avi({decode_extended_chunks:true,decode_samples:true})

Samples

AVI has many redundant ways to index samples, so currently .streams[].samples will only include samples indexed in the most "modern" way used in the file, that is, in order of preference: stream super index, movi ix index, then idx1 index.

Extract samples for stream 1

$ fq '.streams[1].samples[] | tobytes' file.avi > stream01.mp3

Show stream summary

$ fq -o decode_samples=false '[.chunks[0] | grep_by(.id=="LIST" and .type=="strl") | grep_by(.id=="strh") as {$type} | grep_by(.id=="strf") as {$format_tag, $compression} | {$type,$format_tag,$compression}]' *.avi

Speed up decoding by disabling sample and extended chunks decoding

If you're not interested in sample details or extended chunks you can speed up decoding by using:

$ fq -o decode_samples=false -o decode_extended_chunks=false d file.avi

References

avro_ocf

Supports reading Avro Object Container Format (OCF) files based on the 1.11.0 specification.

Capable of handling null, deflate, and snappy codecs for data compression.

Limitations:

  • Schema does not support self-referential types, only built-in types.
  • Decimal logical types are not supported for decoding and will just be treated as their primitive type.
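
A minimal decode example (the file name is hypothetical); since avro_ocf is part of the probe group, the -d flag can usually be omitted:

$ fq -d avro_ocf d file.avro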

References

Authors

bencode

Convert represented value to JSON

$ fq -d bencode torepr file.torrent

References

bitcoin_block

Options

|Name|Default|Description|
|-|-|-|
|has_header|false|Has blkdat header|

Examples

Decode file using bitcoin_block options

$ fq -d bitcoin_block -o has_header=false . file

Decode value as bitcoin_block

... | bitcoin_block({has_header:false})

bits

Decode to a sliceable and indexable binary of bits.

Slice and decode bit range

$ echo 'some {"a":1} json' | fq -d bits '.[40:-48] | fromjson'
{
  "a": 1
}

Index bits

$ echo 'hello' | fq -d bits '.[4]'
1
$ echo 'hello' | fq -c -d bits '[.[range(8)]]'
[0,1,1,0,1,0,0,0]

bplist

Show full decoding

$ fq d Info.plist

Timestamps

Timestamps in Apple Binary Property Lists are encoded as Cocoa Core Data timestamps, where the raw value is the floating point number of seconds since January 1, 2001. By default, fq will render the raw floating point value. To get the raw value or a string description, use the tovalue and todescription functions respectively:

$ fq 'torepr.SomeTimeStamp | tovalue' Info.plist
685135328

$ fq 'torepr.SomeTimeStamp | todescription' Info.plist
"2022-09-17T19:22:08Z"

Get JSON representation

bplist files can be converted to a JSON representation using the torepr filter:

$ fq torepr com.apple.UIAutomation.plist
{
  "UIAutomationEnabled": true
}

Decoding NSKeyedArchiver serialized objects

A common way that Swift and Objective-C libraries on macOS serialize objects is through the NSKeyedArchiver API, which flattens objects into a list of elements and class descriptions that are reconstructed into an object graph using CFUID elements in the property list. fq includes a function, from_ns_keyed_archiver, which will rebuild this object graph into a friendly representation.

If no parameters are supplied, it will assume that there is a CFUID located at ."$top".root that specifies the root from which decoding should occur. If this is not present, an error will be produced, asking the user to specify a root object in the .$objects list from which to decode.

The following examples show how this might be used (in this case, within the fq REPL):

# Assume $top.root is present
bplist> from_ns_keyed_archiver

# Specify optional root
bplist> from_ns_keyed_archiver(1)
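
The same function can also be used outside the REPL (a sketch; the file name is hypothetical):

$ fq -d bplist 'from_ns_keyed_archiver' recent.sfl2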

Authors

References

bson

Limitations

  • The decimal128 type is not supported for decoding and will just be treated as binary.

Convert represented value to JSON

$ fq -d bson torepr file.bson

Filter represented value

$ fq -d bson 'torepr | select(.name=="bob")' file.bson

Authors

References

bytes

Decode to a sliceable and indexable binary of bytes.

Slice out byte ranges

$ echo -n 'hello' | fq -d bytes '.[-3:]' > last_3_bytes
$ echo -n 'hello' | fq -d bytes '[.[-2:], .[0:2]] | tobytes' > first_last_2_bytes_swapped

Slice and decode byte range

$ echo 'some {"a":1} json' | fq -d bytes '.[5:-6] | fromjson'
{
  "a": 1
}

Index bytes

$ echo 'hello' | fq -d bytes '.[1]'
101

caff

Options

|Name|Default|Description|
|-|-|-|
|uncompress|true|Uncompress and probe files|

Examples

Decode file using caff options

$ fq -d caff -o uncompress=true . file

Decode value as caff

... | caff({uncompress:true})

Authors

cbor

Convert represented value to JSON

$ fq -d cbor torepr file.cbor

References

csv

Options

|Name|Default|Description|
|-|-|-|
|comma|,|Separator character|
|comment|#|Comment line character|

Examples

Decode file using csv options

$ fq -d csv -o comma="," -o comment="#" . file

Decode value as csv

... | csv({comma:",",comment:"#"})

TSV to CSV

$ fq -d csv -o comma="\t" to_csv file.tsv

Convert rows to objects based on header row

$ fq -d csv '.[0] as $t | .[1:] | map(with_entries(.key = $t[.key]))' file.csv

fit

Limitations

  • Fields with subcomponents, such as the "compressed_speed_distance" field on globalMessageNumber 20, are not represented correctly. The field is read as 3 separate bytes, while the first 12 bits are actually speed and the last 12 bits distance (see the sketch after this list).
  • There are still lots of UNKNOWN fields due to gaps in Garmin's SDK Profile documentation (currently FIT SDK 21.126).
  • Compressed timestamp messages are not accumulated against last known full timestamp.
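
A sketch of how such a field could be split manually, assuming the three bytes form a little-endian 24-bit value with speed in the low 12 bits and distance in the high 12 bits (the byte values below are made up):

$ fq -n '[18, 52, 86] as [$b0, $b1, $b2] | ($b0 + $b1 * 256 + $b2 * 65536) as $v | {speed: ($v % 4096), distance: (($v / 4096) | floor)}'
{
  "speed": 1042,
  "distance": 1379
}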

Convert stream of data messages to JSON array

$ fq '[.data_records[] | select(.record_header.message_type == "data").data_message]' file.fit 

Authors

References

flac_frame

Options

|Name|Default|Description|
|-|-|-|
|bits_per_sample|16|Bits per sample|

Examples

Decode file using flac_frame options

$ fq -d flac_frame -o bits_per_sample=16 . file

Decode value as flac_frame

... | flac_frame({bits_per_sample:16})

hevc_au

Options

|Name|Default|Description|
|-|-|-|
|length_size|4|Length value size|

Examples

Decode file using hevc_au options

$ fq -d hevc_au -o length_size=4 . file

Decode value as hevc_au

... | hevc_au({length_size:4})

html

Options

|Name|Default|Description|
|-|-|-|
|array|false|Decode as nested arrays|
|attribute_prefix|@|Prefix for attribute keys|
|seq|false|Use seq attribute to preserve element order|

Examples

Decode file using html options

$ fq -d html -o array=false -o attribute_prefix="@" -o seq=false . file

Decode value as html

... | html({array:false,attribute_prefix:"@",seq:false})

HTML is decoded in HTML5 mode and will always include <html>, <body> and <head> elements.

See the xml format for more examples, including how to preserve element order and how to encode to XML.

There is no to_html function, see to_xml instead.
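
Since the decoded value has the same shape as the xml format, it can be re-encoded with to_xml (a sketch; the exact output layout may differ):

$ echo '<a href="url">text</a>' | fq -r -d html 'to_xml({indent:2})'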

Element as object

# decode as object is the default
$ echo '<a href="url">text</a>' | fq -d html
{
  "html": {
    "body": {
      "a": {
        "#text": "text",
        "@href": "url"
      }
    },
    "head": ""
  }
}

Element as array

$ echo '<a href="url">text</a>' | fq -d html -o array=true
[
  "html",
  null,
  [
    [
      "head",
      null,
      []
    ],
    [
      "body",
      null,
      [
        [
          "a",
          {
            "#text": "text",
            "href": "url"
          },
          []
        ]
      ]
    ]
  ]
]

# decode html files to a {file: "title", ...} object
$ fq -n -d html '[inputs | {key: input_filename, value: .html.head.title?}] | from_entries' *.html

# <a> href:s in file
$ fq -r -o array=true -d html '.. | select(.[0] == "a" and .[1].href)?.[1].href' file.html

leveldb_descriptor

Limitations

  • fragmented non-"full" records are not merged and decoded further.

Authors

References

leveldb_log

Limitations

  • fragmented non-"full" records are not merged and decoded further.

Authors

References

leveldb_table

Limitations

  • no Meta Blocks (like "filter") are decoded yet.
  • Zstandard uncompression is not implemented yet.

Authors

References

luajit

Authors

References

macho

Supports decoding vanilla and FAT Mach-O binaries.

Select 64bit load segments

$ fq '.load_commands[] | select(.cmd=="segment_64")' file

References

Authors

markdown

Array with all level 1 and 2 headers

$ fq -d markdown '[.. | select(.type=="heading" and .level<=2)?.children[0]]' file.md

matroska

Options

|Name|Default|Description|
|-|-|-|
|decode_samples|true|Decode samples|

Examples

Decode file using matroska options

$ fq -d matroska -o decode_samples=true . file

Decode value as matroska

... | matroska({decode_samples:true})

Lookup element using path

$ fq 'matroska_path(".Segment.Tracks[0]")' file.mkv

Get path to element

$ fq 'grep_by(.id == "Tracks") | matroska_path' file.mkv

References

moc3

Authors

mp3

Options

|Name|Default|Description|
|-|-|-|
|max_sync_seek|32768|Max byte distance to next sync|
|max_unique_header_configs|5|Max number of unique frame header configs allowed|
|max_unknown|50|Max percent (0-100) unknown bits|

Examples

Decode file using mp3 options

$ fq -d mp3 -o max_sync_seek=32768 -o max_unique_header_configs=5 -o max_unknown=50 . file

Decode value as mp3

... | mp3({max_sync_seek:32768,max_unique_header_configs:5,max_unknown:50})

mp4

Options

|Name|Default|Description|
|-|-|-|
|allow_truncated|false|Allow box to be truncated|
|decode_samples|true|Decode samples|

Examples

Decode file using mp4 options

$ fq -d mp4 -o allow_truncated=false -o decode_samples=true . file

Decode value as mp4

... | mp4({allow_truncated:false,decode_samples:true})

Speed up decoding by not decoding samples

# manually decode first sample as a aac_frame
$ fq -o decode_samples=false '.tracks[0].samples[0] | aac_frame | d' file.mp4

Entries for first edit list as values

$ fq 'first(grep_by(.type=="elst").entries) | tovalue' file.mp4

Whole box tree as JSON (exclude mdat data and tracks)

$ fq 'del(.tracks) | grep_by(.type=="mdat").data = "<excluded>" | tovalue' file.mp4

Force decode a single box

$ fq -n '"AAAAHGVsc3QAAAAAAAAAAQAAADIAAAQAAAEAAA==" | from_base64 | mp4({force:true}) | d'

Lookup mp4 box using a mp4 box path.

# <decode value box> | mp4_path($path) -> <decode value box>
$ fq 'mp4_path(".moov.trak[1]")' file.mp4

Get mp4 box path for a decode value box.

# <decode value box> | mp4_path -> string
$ fq 'grep_by(.type == "trak") | mp4_path' file.mp4

References

msgpack

Convert represented value to JSON

$ fq -d msgpack torepr file.msgpack

References

nes

Limitations

  • prg_rom, chr_rom and trainer fields may contain data that is just random junk from the memory chips, since they are of a fixed size.
  • The nes_toasm function outputs ALL opcodes, including the unofficial ones, which means that none of the regular assemblers can recompile it.
  • The nes_tokitty function works on tiles in chr_rom but only outputs a Kitty graphics compatible string. You need to manually printf that string to get Kitty (or another compatible terminal) to output the graphics.

Decompile PRG ROM

$ fq -r '.prg_rom[] | nes_toasm' file.nes

Print out first CHR ROM tile in Kitty (or Konsole, wayst, WezTerm) at size 5

$ printf $(fq -r -d nes '.chr_rom[0] | nes_tokitty(5)' file.nes)

Print out all CHR ROM tiles in Kitty (with Bash) at size 5

$ for line in $(fq -r '.chr_rom[] | nes_tokitty(5)' file.nes);do printf "%b%s" "$line";done

Authors

References

opentimestamps

View a full OpenTimestamps file

$ fq dd file.ots

List the names of the Calendar servers used

$ fq '.operations | map(select(.attestation_type == "calendar") | .url)' file.ots

Check if there are Bitcoin attestations present

$ fq '.operations | map(select(.attestation_type == "bitcoin")) | length > 0' file.ots

Authors

References

pcap

Build object with number of (reassembled) TCP bytes sent to/from client IP

# for a pcapng file you would use .[0].tcp_connections for first section
$ fq '.tcp_connections | group_by(.client.ip) | map({key: .[0].client.ip, value: map(.client.stream, .server.stream | tobytes.size) | add}) | from_entries' file.pcap
{
  "10.1.0.22": 15116,
  "10.99.12.136": 234,
  "10.99.12.150": 218
}

pg_btree

Options

|Name|Default|Description|
|-|-|-|
|page|0|First page number in file, default is 0|

Examples

Decode file using pg_btree options

$ fq -d pg_btree -o page=0 . file

Decode value as pg_btree

... | pg_btree({page:0})

Btree index meta page

$ fq -d pg_btree -o flavour=postgres14 ".[0] | d" 16404

Btree index page

$ fq -d pg_btree -o flavour=postgres14 ".[1]" 16404

Authors

References

pg_control

Options

|Name|Default|Description|
|-|-|-|
|flavour||PostgreSQL flavour: postgres14, pgproee14.., postgres10|

Examples

Decode file using pg_control options

$ fq -d pg_control -o flavour="" . file

Decode value as pg_control

... | pg_control({flavour:""})

Decode content of pg_control file

$ fq -d pg_control -o flavour=postgres14 d pg_control

Specific fields can be queried directly

$ fq -d pg_control -o flavour=postgres14 ".state, .check_point_copy.redo, .wal_level" pg_control

Authors

References

pg_heap

Options

|Name|Default|Description|
|-|-|-|
|flavour|postgres14|PostgreSQL flavour: postgres14, pgproee14.., postgres10|
|page|0|First page number in file, default is 0|
|segment|0|Segment file number (16790.1 is 1), default is 0|

Examples

Decode file using pg_heap options

$ fq -d pg_heap -o flavour="postgres14" -o page=0 -o segment=0 . file

Decode value as pg_heap

... | pg_heap({flavour:"postgres14",page:0,segment:0})

To see a heap page's content

$ fq -d pg_heap -o flavour=postgres14 ".[0]" 16994

To see a page's header

$ fq -d pg_heap -o flavour=postgres14 ".[0].page_header" 16994

First and last item pointers on first page

$ fq -d pg_heap -o flavour=postgres14 ".[0].pd_linp[0, -1]" 16994

First and last tuple on first page

$ fq -d pg_heap -o flavour=postgres14 ".[0].tuples[0, -1]" 16994

Authors

References

protobuf

Can decode sub messages

$ fq -d protobuf '.fields[6].wire_value | protobuf | d' file

References

rtmp

Currently only supports plain RTMP (not RTMPT or encrypted variants, etc.) with AMF0 (not AMF3).

Show rtmp streams in PCAP file

$ fq '.tcp_connections[] | select(.server.port=="rtmp") | d' file.cap

References

tls

Options

|Name|Default|Description|
|-|-|-|
|keylog||NSS Key Log content|

Examples

Decode file using tls options

$ fq -d tls -o keylog="" . file

Decode value as tls

... | tls({keylog:""})

Supports decoding of most standard records, messages and extensions. Can also decrypt most standard cipher suites in a PCAP with traffic in both directions if an NSS key log is provided.

Decode and decrypt providing a PCAP and key log

Write traffic to a PCAP file:

$ tcpdump -i <iface> -w traffic.pcap

Make sure your curl TLS backend supports SSLKEYLOGFILE and do:

$ SSLKEYLOGFILE=traffic.keylog curl --tls-max 1.2 https://host/path

Decode, decrypt and query. Uses keylog=@<path> to read option value from keylog file:

# decode and show whole tree
$ fq -o keylog=@traffic.keylog d traffic.pcap

# write unencrypted server response to a file.
# first .stream is the TCP stream, second .stream is TLS application data stream
#
# first TCP connections:
$ fq -o keylog=@traffic.keylog '.tcp_connections[0].server.stream.stream | tobytes' traffic.pcap > data
# first TLS connection:
$ fq -o keylog=@traffic.keylog 'first(grep_by(.server.stream | format == "tls")).server.stream.stream | tobytes' traffic.pcap > data

Supported cipher suites for decryption

TLS_DH_ANON_EXPORT_WITH_DES40_CBC_SHA, TLS_DH_ANON_EXPORT_WITH_RC4_40_MD5, TLS_DHE_DSS_EXPORT_WITH_DES40_CBC_SHA, TLS_DHE_DSS_WITH_3DES_EDE_CBC_SHA, TLS_DHE_DSS_WITH_AES_128_CBC_SHA, TLS_DHE_DSS_WITH_AES_128_CBC_SHA256, TLS_DHE_DSS_WITH_AES_128_GCM_SHA256, TLS_DHE_DSS_WITH_AES_256_CBC_SHA, TLS_DHE_DSS_WITH_AES_256_CBC_SHA256, TLS_DHE_DSS_WITH_AES_256_GCM_SHA384, TLS_DHE_DSS_WITH_DES_CBC_SHA, TLS_DHE_DSS_WITH_RC4_128_SHA, TLS_DHE_RSA_EXPORT_WITH_DES40_CBC_SHA, TLS_DHE_RSA_WITH_3DES_EDE_CBC_SHA, TLS_DHE_RSA_WITH_AES_128_CBC_SHA, TLS_DHE_RSA_WITH_AES_128_CBC_SHA256, TLS_DHE_RSA_WITH_AES_128_GCM_SHA256, TLS_DHE_RSA_WITH_AES_256_CBC_SHA, TLS_DHE_RSA_WITH_AES_256_CBC_SHA256, TLS_DHE_RSA_WITH_AES_256_GCM_SHA384, TLS_DHE_RSA_WITH_CHACHA20_POLY1305_SHA256, TLS_DHE_RSA_WITH_DES_CBC_SHA, TLS_ECDH_ECDSA_WITH_3DES_EDE_CBC_SHA, TLS_ECDH_ECDSA_WITH_AES_128_CBC_SHA, TLS_ECDH_ECDSA_WITH_AES_128_CBC_SHA256, TLS_ECDH_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_ECDSA_WITH_AES_256_CBC_SHA, TLS_ECDH_ECDSA_WITH_AES_256_CBC_SHA384, TLS_ECDH_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_ECDSA_WITH_RC4_128_SHA, TLS_ECDH_RSA_WITH_3DES_EDE_CBC_SHA, TLS_ECDH_RSA_WITH_AES_128_CBC_SHA, TLS_ECDH_RSA_WITH_AES_128_CBC_SHA256, TLS_ECDH_RSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_RSA_WITH_AES_256_CBC_SHA, TLS_ECDH_RSA_WITH_AES_256_CBC_SHA384, TLS_ECDH_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_RSA_WITH_RC4_128_SHA, TLS_ECDHE_ECDSA_WITH_3DES_EDE_CBC_SHA, TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA, TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA, TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256, TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA, TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA, TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384, TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256, TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305, TLS_ECDHE_ECDSA_WITH_RC4_128_SHA, TLS_ECDHE_ECDSA_WITH_RC4_128_SHA, TLS_ECDHE_PSK_WITH_AES_128_CBC_SHA, TLS_ECDHE_PSK_WITH_AES_128_GCM_SHA256, TLS_ECDHE_PSK_WITH_AES_256_CBC_SHA, TLS_ECDHE_RSA_WITH_3DES_EDE_CBC_SHA, TLS_ECDHE_RSA_WITH_3DES_EDE_CBC_SHA, TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA, TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA, TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256, TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384, TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256, TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305, TLS_ECDHE_RSA_WITH_RC4_128_SHA, TLS_ECDHE_RSA_WITH_RC4_128_SHA, TLS_PSK_WITH_AES_128_CBC_SHA, TLS_PSK_WITH_AES_256_CBC_SHA, TLS_PSK_WITH_RC4_128_SHA, TLS_RSA_EXPORT_WITH_DES40_CBC_SHA, TLS_RSA_EXPORT_WITH_RC4_40_MD5, TLS_RSA_WITH_3DES_EDE_CBC_SHA, TLS_RSA_WITH_3DES_EDE_CBC_SHA, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_RSA_WITH_AES_128_CBC_SHA256, TLS_RSA_WITH_AES_128_CBC_SHA256, TLS_RSA_WITH_AES_128_GCM_SHA256, TLS_RSA_WITH_AES_128_GCM_SHA256, TLS_RSA_WITH_AES_256_CBC_SHA, TLS_RSA_WITH_AES_256_CBC_SHA, TLS_RSA_WITH_AES_256_CBC_SHA256, TLS_RSA_WITH_AES_256_GCM_SHA384, TLS_RSA_WITH_AES_256_GCM_SHA384, TLS_RSA_WITH_DES_CBC_SHA, TLS_RSA_WITH_RC4_128_MD5, TLS_RSA_WITH_RC4_128_SHA, TLS_RSA_WITH_RC4_128_SHA

References

tzif

Get last transition time

$ fq '.v2plusdatablock.transition_times[-1] | tovalue' tziffile

Count leap second records

$ fq '.v2plusdatablock.leap_second_records | length' tziffile

Authors

References

wasm

Count opcode usage

$ fq '.sections[] | select(.id == "code_section") | [.. | .opcode? // empty] | count | map({key: .[0], value: .[1]}) | from_entries' file.wasm

List exports and imports

$ fq '.sections | {import: map(select(.id == "import_section").content.im.x[].nm.b), export: map(select(.id == "export_section").content.ex.x[].nm.b)}' file.wasm

Authors

References

xml

Options

|Name|Default|Description|
|-|-|-|
|array|false|Decode as nested arrays|
|attribute_prefix|@|Prefix for attribute keys|
|seq|false|Use seq attribute to preserve element order|

Examples

Decode file using xml options

$ fq -d xml -o array=false -o attribute_prefix="@" -o seq=false . file

Decode value as xml

... | xml({array:false,attribute_prefix:"@",seq:false})

XML can be decoded and encoded into jq values in two ways, with elements as objects or as arrays. Which variant to use depends a bit on what you want to do. The object variant might be easier to query for a specific value, while the array variant might be easier for generating XML or for querying all elements of some kind, etc.

Encoding is done using the to_xml function, which figures out which variant is used based on the input value. It has two optional options, indent and attribute_prefix.

Elements as object

Element can have different shapes depending on body text, attributes and children:

  • <a key="value">text</a> is {"a":{"#text":"text","@key":"value"}}, has text (#text) and attributes (@key)
  • <a>text</a> is {"a":"text"}
  • <a><b>text</b></a> is {"a":{"b":"text"}} one child with only text and no attributes
  • <a><b/><b>text</b></a> is {"a":{"b":["","text"]}} two children with same name end up in an array
  • <a><b/><b key="value">text</b></a> is {"a":{"b":["",{"#text":"text","@key":"value"}]}}

If there is a #seq attribute it encodes the child element order. Use -o seq=true to include sequence numbers when decoding, otherwise order might be lost.

# decode as object is the default
$ echo '<a><b/><b>bbb</b><c attr="value">ccc</c></a>' | fq -d xml -o seq=true
{
  "a": {
    "b": [
      {
        "#seq": 0
      },
      {
        "#seq": 1,
        "#text": "bbb"
      }
    ],
    "c": {
      "#seq": 2,
      "#text": "ccc",
      "@attr": "value"
    }
  }
}

# access text of the <c> element
$ echo '<a><b/><b>bbb</b><c attr="value">ccc</c></a>' | fq '.a.c["#text"]'
"ccc"

# decode to object and encode to xml
$ echo '<a><b/><b>bbb</b><c attr="value">ccc</c></a>' | fq -r -d xml -o seq=true 'to_xml({indent:2})'
<a>
  <b></b>
  <b>bbb</b>
  <c attr="value">ccc</c>
</a>

Elements as array

Elements are arrays of the shape ["name", {"#text": "body text", "attr_name": "attr value", ...} | null, [<child element>, ...]].

# decode as array
$ echo '<a><b/><b>bbb</b><c attr="value">ccc</c></a>' | fq -d xml -o array=true
[
  "a",
  null,
  [
    [
      "b",
      null,
      []
    ],
    [
      "b",
      {
        "#text": "bbb"
      },
      []
    ],
    [
      "c",
      {
        "#text": "ccc",
        "attr": "value"
      },
      []
    ]
  ]
]

# decode to array and encode to xml
$ echo '<a><b/><b>bbb</b><c attr="value">ccc</c></a>' | fq -r -d xml -o array=true -o seq=true 'to_xml({indent:2})'
<a>
  <b></b>
  <b>bbb</b>
  <c attr="value">ccc</c>
</a>

# access text of the <c> element, the object variant above is probably easier to use
$ echo '<a><b/><b>bbb</b><c attr="value">ccc</c></a>' | fq -o array=true '.[2][2][1]["#text"]'
"ccc"

References

zip

Options

|Name|Default|Description|
|-|-|-|
|uncompress|true|Uncompress and probe files|

Examples

Decode file using zip options

$ fq -d zip -o uncompress=true . file

Decode value as zip

... | zip({uncompress:true})

Supports ZIP64.

Timestamp and time zones

The timestamp accessed via .local_files[].last_modification is encoded in ZIP files using the MS-DOS representation, which lacks a known time zone; probably the local time/date at creation was used. The unix_guess field in last_modification is a guess assuming the local time zone was UTC at creation.
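
For example, to see the last modification fields, including the unix_guess value, for the first entry (a sketch; the file name is hypothetical):

$ fq '.local_files[0].last_modification | tovalue' file.zip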

References

Dependency graph

(format dependency graph image)