Replies: 1 comment
-
Tried a few more scenarios, this time using the Iceberg connector. The catalog properties are below.
With the above, I used both the latest Trino Docker image and the 466 image. I was trying different ways to connect to Azure storage and thought to register an already-existing Iceberg table, so I pre-created the table in Azure using Spark in a Spark Docker container. After that I ran the registration below from the Trino CLI, which also fails with the same error.
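A minimal sketch of what such an Iceberg catalog file can look like with the native Azure filesystem; the metastore URI and credentials are placeholders, not the exact values from my setup:

```properties
# etc/catalog/iceberg.properties -- minimal sketch, placeholder values
connector.name=iceberg
iceberg.catalog.type=hive_metastore
hive.metastore.uri=thrift://hive-metastore:9083
# needed for CALL iceberg.system.register_table(...)
iceberg.register-table-procedure.enabled=true
# native Azure filesystem support
fs.native-azure.enabled=true
azure.auth-type=ACCESS_KEY
azure.access-key=<storage-account-key>
```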
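A sketch of such a registration call; schema, table, and location are hypothetical:

```sql
-- Register a pre-existing Iceberg table written by Spark; all names are placeholders.
CALL iceberg.system.register_table(
    schema_name => 'demo',
    table_name => 'events',
    table_location => 'abfss://container@account.dfs.core.windows.net/warehouse/demo/events'
);
```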
-
I'm trying to create an external table in Trino backed by Azure Data Lake Storage Gen2 using the abfss scheme, but I'm encountering an error.
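The statement is along these lines (a sketch with placeholder names; writes to such tables also need hive.non-managed-table-writes-enabled=true in the catalog, as in the properties sketch below):

```sql
-- Sketch of an external Hive table on ADLS Gen2; table, columns, and location are placeholders.
CREATE TABLE hive.demo.events (
    event_id   varchar,
    event_time timestamp
)
WITH (
    format = 'PARQUET',
    external_location = 'abfss://container@account.dfs.core.windows.net/data/events'
);
```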
Configuration Details:
catalog/hive.properties
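Roughly along these lines in legacy mode (a sketch; the metastore URI and paths are placeholders, and the abfs(s) settings are pulled in via hive.config.resources):

```properties
# etc/catalog/hive.properties -- minimal legacy-mode sketch, placeholder values
connector.name=hive
hive.metastore.uri=thrift://hive-metastore:9083
# allow CREATE TABLE ... WITH (external_location = ...) and writes to it
hive.non-managed-table-writes-enabled=true
# legacy Hadoop-based filesystem support must be enabled explicitly on recent releases
fs.hadoop.enabled=true
# pick up the abfs(s) settings from the Hadoop config below
hive.config.resources=/etc/hadoop/core-site.xml
```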
hadoop/core-site.xml
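The relevant abfs(s) and OAuth MSI settings look roughly like this (a sketch; tenant and client IDs are placeholders):

```xml
<!-- hadoop/core-site.xml -- sketch of the abfs(s) + OAuth MSI settings, placeholder values -->
<configuration>
  <property>
    <name>fs.abfs.impl</name>
    <value>org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem</value>
  </property>
  <property>
    <name>fs.abfss.impl</name>
    <value>org.apache.hadoop.fs.azurebfs.SecureAzureBlobFileSystem</value>
  </property>
  <property>
    <name>fs.azure.account.auth.type</name>
    <value>OAuth</value>
  </property>
  <property>
    <name>fs.azure.account.oauth.provider.type</name>
    <value>org.apache.hadoop.fs.azurebfs.oauth2.MsiTokenProvider</value>
  </property>
  <property>
    <name>fs.azure.account.oauth2.msi.tenant</name>
    <value>TENANT-ID-PLACEHOLDER</value>
  </property>
  <property>
    <name>fs.azure.account.oauth2.client.id</name>
    <value>MANAGED-IDENTITY-CLIENT-ID-PLACEHOLDER</value>
  </property>
</configuration>
```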
Docker Compose (Partial)
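And the relevant part of the compose file, roughly (image tag and mount paths are placeholders):

```yaml
# docker-compose.yml -- sketch of the Trino service only, placeholder paths
services:
  trino:
    image: trinodb/trino:latest   # also tried trinodb/trino:458
    ports:
      - "8080:8080"
    volumes:
      - ./catalog:/etc/trino/catalog   # hive.properties lives here
      - ./hadoop:/etc/hadoop           # core-site.xml referenced by hive.config.resources
```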
Notes
Tried with both Trino latest and version 458.
Hive connector is configured for external table writes.
Hadoop config includes abfs(s) implementations and correct OAuth MSI credentials.
This was tried in legacy mode because I'm facing similar issues with the native implementation as well (a sketch of the native variant follows these notes).
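For comparison, the native-filesystem variant of the catalog would look roughly like this (a sketch; the azure.* auth properties should be checked against the Trino Azure filesystem docs, and the key is a placeholder):

```properties
# etc/catalog/hive.properties -- sketch of the native Azure filesystem variant
connector.name=hive
hive.metastore.uri=thrift://hive-metastore:9083
hive.non-managed-table-writes-enabled=true
# native filesystem replaces the Hadoop-based abfs(s) support above
fs.native-azure.enabled=true
azure.auth-type=ACCESS_KEY
azure.access-key=<storage-account-key>
```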
Is this a bug in Trino or am I missing something in the configuration (possibly in Hadoop filesystem resolution or Trino Hive connector compatibility with abfss)?
Would appreciate any guidance from the community or maintainers. Thanks!
PS - while I don't want to include this here, I can't help but feel the need to mention that the Trino documentation is so poorly written that neither the legacy nor the native method is explained well or works as it should.