
create_reader always returns ReadError if the read operation of the underlying reader gives large output #57

@messense

Description


When using pcap-parser with a zstd streaming decoder, e.g. `create_reader(1024 * 1024, <zstd reader>)`, the initial read in

```rust
let mut buffer = Buffer::with_capacity(capacity);
let sz = reader.read(buffer.space()).or(Err(PcapError::ReadError))?;
if sz == 0 {
    return Err(PcapError::Eof);
}
buffer.fill(sz);
```

lets the zstd decoder pull 1024 * 1024 bytes into the buffer, which fills it completely. Then in

```rust
pub fn from_buffer(
    mut buffer: Buffer,
    mut reader: R,
) -> Result<LegacyPcapReader<R>, PcapError<&'static [u8]>> {
    let sz = reader.read(buffer.space()).or(Err(PcapError::ReadError))?;
    buffer.fill(sz);
    let (_rem, header) = match parse_pcap_header(buffer.data()) {
        // ...
```

pcap-parser tries to read again. But because there is no space left in the buffer, the zstd decoder's read operation now fails with an "Operation made no progress over multiple calls, due to output buffer being full" error, which is turned into `PcapError::ReadError`. (BTW, it would be nice if `PcapError::ReadError` carried the original `std::io::Error`.)
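As a rough illustration of that parenthetical (a hypothetical variant shape, not the current pcap-parser API), the mapping in the quoted code could preserve the source error if the variant carried it:

```rust
// Hypothetical: ReadError(std::io::Error) instead of a unit variant,
// so callers would see the original zstd error message.
let sz = reader
    .read(buffer.space())
    .map_err(PcapError::ReadError)?;
```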

This problem seems to apply to both legacy pcap and pcapng.
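For reference, a minimal sketch of the setup that triggers this (the file name is illustrative and error handling is simplified to show where the error surfaces):

```rust
use std::fs::File;

use pcap_parser::create_reader;

fn main() -> std::io::Result<()> {
    let file = File::open("capture.pcap.zst")?;
    // The zstd streaming decoder will gladly fill the entire 1 MiB buffer
    // in a single read() call.
    let decoder = zstd::stream::read::Decoder::new(file)?;
    match create_reader(1024 * 1024, decoder) {
        Ok(_reader) => println!("reader created"),
        // With a large enough compressed capture this arm is hit: the second
        // read() in from_buffer has no space left, the decoder reports
        // "Operation made no progress over multiple calls, due to output
        // buffer being full", and that is mapped to PcapError::ReadError.
        Err(e) => eprintln!("create_reader failed: {:?}", e),
    }
    Ok(())
}
```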

I think the fix could be to only call `read` when `buffer.available_data() < 24` (where 24 is the pcap header length); see the sketch below.
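A rough sketch of that idea inside `from_buffer` (assuming the circular `Buffer` API used in the snippets above):

```rust
// Only top up the buffer when it does not yet hold a full global header;
// if it already does, skip the read() so a full buffer never forces the
// underlying reader (e.g. the zstd decoder) to report "no progress".
if buffer.available_data() < 24 {
    let sz = reader.read(buffer.space()).or(Err(PcapError::ReadError))?;
    buffer.fill(sz);
}
let (_rem, header) = match parse_pcap_header(buffer.data()) {
    // ...unchanged from the current implementation...
```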
