Replies: 1 comment
-
As a first guess, I'd suspect that there could be an issue in luma.gl with the mapping of that texture format to the three required WebGL parameters (which would be the parameters you used in your original example). This would be somewhere in the luma.gl code. Unfortunately, luma.gl doesn't yet have a lot of test cases for 16-bit formats, something we'd love to see addressed.
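If you want to rule that mapping out, one sketch (assuming the v9 `device.createTexture` API and the `onDeviceInitialized` callback work the way I expect; please verify the prop names against the v9 docs) is to create the `Texture` yourself and hand it to `BitmapLayer` as the `image`:

```typescript
// Sketch: bypass luma.gl's props-to-texture conversion by creating the
// texture directly with the v9 Device API and passing the Texture object
// as the BitmapLayer image. API names here are assumptions to verify.
import {Deck} from '@deck.gl/core';
import {BitmapLayer} from '@deck.gl/layers';
import type {Device} from '@luma.gl/core';

const width = 256;
const height = 256;

function make16BitLayer(device: Device) {
  // Placeholder 16-bit RGBA data; substitute the real image here.
  const texture = device.createTexture({
    data: new Uint16Array(width * height * 4),
    width,
    height,
    format: 'rgba16uint'
  });

  return new BitmapLayer({
    id: 'bitmap-16bit-texture',
    bounds: [-122.52, 37.7, -122.38, 37.82],
    image: texture // a ready-made luma.gl Texture instead of texture props
  });
}

const deck = new Deck({
  initialViewState: {longitude: -122.45, latitude: 37.76, zoom: 11},
  controller: true,
  onDeviceInitialized: device => {
    deck.setProps({layers: [make16BitLayer(device)]});
  }
});
```

If the texture renders when created this way but not via the plain props object, that would point at the props-to-texture mapping rather than at your shader.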
-
In my app written with deck.gl v8, I had extended `BitmapLayer` to crush a 16-bit image down to 8-bit for display within the fragment shader. To get a 16-bit image to render in deck.gl v8, I had to provide certain image props that are deprecated/no longer supported in v9. These props are as follows.
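Roughly, and with illustrative constants rather than my exact original values (using the base `BitmapLayer` here rather than my extension for brevity):

```typescript
// Hypothetical reconstruction of the v8-era image props (illustrative
// constants, not the original values): format/dataFormat/type map to the
// WebGL internalFormat, format and type parameters.
import GL from '@luma.gl/constants';
import {BitmapLayer} from '@deck.gl/layers';

const width = 256;
const height = 256;

const layer = new BitmapLayer({
  id: 'bitmap-16bit-v8',
  bounds: [-122.52, 37.7, -122.38, 37.82],
  image: {
    data: new Uint16Array(width * height * 4), // 16-bit RGBA pixels
    width,
    height,
    format: GL.RGBA16UI,         // internal format (assumption)
    dataFormat: GL.RGBA_INTEGER, // pixel data format -- deprecated in v9
    type: GL.UNSIGNED_SHORT      // pixel data type -- deprecated in v9
  }
});
```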
I am now trying to upgrade this to deck.gl v9. In v9, my understanding is that `type` and `dataFormat` are both deprecated. I have tried passing `rgba16uint`, `rgba16float`, and `rgba16unorm-webgl` as the `format` prop on the image, all of which result in no data making it to the `uniform sampler2D bitmapTexture` inside of the fragment shader.

When passing a `Uint8Array` as the `image.data` prop with a format of `rgba8unorm`, the custom shaders work as expected, i.e. something like the sketch below.
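Both cases, with placeholder data and bounds standing in for my real inputs:

```typescript
// Minimal sketch of the two v9 cases (placeholder data and bounds).
import {BitmapLayer} from '@deck.gl/layers';

const width = 256;
const height = 256;
const bounds: [number, number, number, number] = [-122.52, 37.7, -122.38, 37.82];

// Works: 8-bit data with the 'rgba8unorm' format.
const workingLayer = new BitmapLayer({
  id: 'bitmap-8bit',
  bounds,
  image: {
    data: new Uint8Array(width * height * 4),
    width,
    height,
    format: 'rgba8unorm'
  }
});

// No data reaches bitmapTexture: 16-bit data with any of the 16-bit formats.
const failingLayer = new BitmapLayer({
  id: 'bitmap-16bit',
  bounds,
  image: {
    data: new Uint16Array(width * height * 4),
    width,
    height,
    format: 'rgba16uint' // also tried 'rgba16float' and 'rgba16unorm-webgl'
  }
});
```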
Any help would be much appreciated.