Idea: Allow Mammoth to serialize/deserialize values, for more precise customization #211
Comments
Just ran into another reason: "pg" doesn't correctly deserialize arrays of enums: brianc/node-pg-types#56. Mammoth has the type information to do it correctly.
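To illustrate what a schema-aware layer could do here, the sketch below parses the text form of a Postgres `enum[]` value (which arrives as e.g. `{red,"light blue"}`). Everything in it is hypothetical illustration, not Mammoth or "pg" API; the `Color` type and function name are made up:

```typescript
type Color = "red" | "green" | "blue" | "light blue";

// Hypothetical: parse the Postgres text representation of an array of enums.
function parseEnumArray<T extends string>(raw: string): T[] {
  const inner = raw.slice(1, -1); // drop the surrounding braces
  if (inner === "") return [];
  const out: T[] = [];
  // Elements are either bare labels or double-quoted strings that may
  // contain spaces, commas, or backslash-escaped characters.
  const re = /"((?:[^"\\]|\\.)*)"|([^,]+)/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(inner)) !== null) {
    out.push((m[1] !== undefined ? m[1].replace(/\\(.)/g, "$1") : m[2]) as T);
  }
  return out;
}
```

Because Mammoth knows the column is an array of a specific enum, it could return a correctly typed `T[]` instead of the raw string that "pg" hands back today.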
And would you suggest letting Mammoth handle the pool and everything "pg" is doing right now as well, or do you see another way?
The statement about "parsing the text/binary protocol directly" was a bit extreme. I think we might get acceptable performance just by passing in a custom value parser, e.g.:

```ts
import pg from "pg";

const client = new pg.Client({
  connectionString: "...",
  types: {
    getTypeParser(typeId: number, format: "text" | "binary"): any {
      // Relay every value untouched, regardless of its type.
      return (v: any) => v;
    },
  },
} as any);
```

This just relays the raw string/binary value, so Mammoth can parse the values depending on the Mammoth schema. (It can use "pg-types" to do the low-level work.)
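With "pg" reduced to relaying raw values, a query layer that knows each column's type could dispatch its own parsers per column. A minimal sketch of that idea, where the parser table and `ColumnType` union are illustrative, not Mammoth's actual API:

```typescript
// Illustrative only: a per-type parser table of the kind a schema-aware
// layer like Mammoth could consult, since it knows each column's type.
type ColumnType = "int4" | "int8" | "jsonb";

const parsers: Record<ColumnType, (raw: string) => unknown> = {
  int4: (raw) => parseInt(raw, 10),
  int8: (raw) => BigInt(raw), // exact, unlike Number for values beyond 2^53
  jsonb: (raw) => JSON.parse(raw),
};

// Parse one row of raw text values using the known column types.
function parseRow(types: ColumnType[], raw: string[]): unknown[] {
  return raw.map((value, i) => parsers[types[i]](value));
}
```

The design point is that the parser choice moves from "pg" (one parser per Postgres OID, globally) to the schema layer, which can decide per column.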
A problem I have: I want different `int8` columns to deserialize differently: `number` vs `BigInt` vs `Buffer`. Or `jsonb` columns differently. With "pg", you can only have a single serializer/deserializer per type.

Mammoth has richer type information, e.g. `jsonb<Article>`. If Mammoth allowed transforming values, I could get exactly the serialization/deserialization I want.

One downside is performance. But maybe there won't be a performance hit if Mammoth hooks into "pg" at a lower level, parsing the text/binary protocol directly. (Hopefully "pg-protocol" can do most of the work.)
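One shape such value transforming could take: each column definition carries a serialize/deserialize pair, so two `int8` columns can map to different JS types. The `Transform` interface and both transform objects below are hypothetical, not an existing Mammoth API:

```typescript
// Hypothetical per-column transform pairs; nothing here is Mammoth's real API.
interface Transform<Db, Js> {
  serialize(value: Js): Db;   // JS value -> text/wire representation
  deserialize(raw: Db): Js;   // text/wire representation -> JS value
}

// Two different deserializations for the same Postgres type (int8):
const int8AsNumber: Transform<string, number> = {
  serialize: (v) => String(v),
  deserialize: (raw) => Number(raw), // fine while values stay below 2^53
};

const int8AsBigInt: Transform<string, bigint> = {
  serialize: (v) => v.toString(),
  deserialize: (raw) => BigInt(raw), // exact for arbitrarily large values
};
```

Since the transform would hang off the column definition rather than the Postgres type OID, this is exactly the per-column precision that a single global "pg" type parser cannot provide.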