Performance issues / memory core dumps with big files #65

Open
@melvinroest

Description

We're seeing the JSON-LD streaming parser run out of memory on large files.

To reproduce, save the following script as test.js in the project directory.

const fs = require("fs");
const zlib = require("zlib");
const { JsonLdParser } = require("./");

// Gunzip the dump and pipe it straight into the streaming parser.
fs.createReadStream("./test.jsonld.gz")
  .pipe(zlib.createGunzip())
  .pipe(
    new JsonLdParser({
      baseIRI: "http://base"
    })
  )
  .on("data", () => {}); // consume quads as fast as possible, discarding them
To run this file:

curl https://test.triply.cc/laurensrietveld/iconclass/assets/5eda510c6300450368fbd900 -L > test.jsonld.gz;
node test

Tested on version 2.0.2 with Node 12.18.0 and 14.4.0.
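To watch the heap grow while the repro runs, memory can be sampled with Node's built-in process.memoryUsage(). The snippet below is a minimal sketch, not part of the original report; the one-second interval and the MB formatting are arbitrary choices. It can be pasted at the top of test.js:

```javascript
// Periodically log resident heap size so the growth is visible before the crash.
const intervalMs = 1000; // assumption: 1 s sampling is frequent enough

function logHeapUsage() {
  const { heapUsed } = process.memoryUsage();
  console.log(`heapUsed: ${(heapUsed / 1024 / 1024).toFixed(1)} MB`);
  return heapUsed;
}

const timer = setInterval(logHeapUsage, intervalMs);
timer.unref(); // don't keep the process alive just for the sampler
```

If the parser is buffering the whole document, heapUsed climbs steadily until the process is killed; with true streaming it should plateau.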
