Although Wren allows you to create lists with up to UINT32_MAX (4,294,967,295) elements if you have enough memory, I noticed today that if you go above INT32_MAX (2,147,483,647) then `List.count` returns a negative value:
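A minimal sketch of the kind of script that shows it (the specific size is just one past INT32_MAX; `List.filled` and `System.print` are standard Wren, but building the list this way needs a very large amount of free memory):

```wren
// Build a list with one more element than INT32_MAX.
var big = List.filled(2147483648, null)

// Per the behaviour described above, this prints a negative number
// rather than 2147483648.
System.print(big.count)
```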
I tracked this down to `list_count` returning `list->elements.count`, which is an `int` and, on my system (Ubuntu 22.04, x64, GCC), that's a 32-bit signed integer.
On the face of it, the type of `count` (a field of `ValueBuffer`) should really be `uint32_t`, but as this is defined deep in the bowels of the VM code, I'm not sure what repercussions changing it would have. It might affect negative indexing, perhaps.
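For reference, Wren lists accept negative indices that count back from the end, which is the behaviour that would need to keep working if the signedness of `count` changed. A quick illustration:

```wren
var letters = ["a", "b", "c"]

// Negative indices count from the end of the list.
System.print(letters[-1])  // > c
System.print(letters[-3])  // > a
```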
Of course, it's not a big problem in practice, but it's something to bear in mind if you need to create huge lists for some purpose.
Ideally it should be `size_t`, unless we add some restriction on allocation sizes. In any case, there are barely any checks for allocation failures, so even `size_t` would run into other problems in such cases.