DVX-670: Added support for additional workflow packages (including custom packages) #402
base: main
Conversation
Added support for additional workflow packages (including custom packages):
- Added a method `TableauCrawler.offline()` to fetch metadata directly from the S3 bucket.
- Added support for a new custom package: `AssetImportPackage`
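For context, a minimal sketch of how the new offline flow might be invoked, following pyatlan's usual crawler-builder pattern. The `offline()` parameter names and the connection settings below are assumptions; only the method name comes from this PR.

from pyatlan.cache.role_cache import RoleCache
from pyatlan.client.atlan import AtlanClient
from pyatlan.model.packages import TableauCrawler

client = AtlanClient()

# Crawl Tableau metadata previously extracted to S3, instead of connecting
# to a live Tableau server.
crawler = (
    TableauCrawler(
        connection_name="tableau-offline",  # hypothetical connection name
        admin_roles=[RoleCache.get_id_for_name("$admin")],
    )
    .offline(
        bucket_name="my-metadata-bucket",  # assumed parameter name
        bucket_prefix="tableau/extracts",  # assumed parameter name
    )
    .to_workflow()
)

client.workflow.run(crawler)  # submit the crawler workflow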
Signed-off-by: Karanjot Singh <[email protected]>
@0xquark, you can use the following commands locally before pushing your commit:
Let's update __init__.py to include the newly added custom packages, specifying the names of the modules or classes we want to expose when the package is imported.
from .big_query_crawler import BigQueryCrawler
from .confluent_kafka_crawler import ConfluentKafkaCrawler
from .connection_delete import ConnectionDelete
from .dbt_crawler import DbtCrawler
from .dynamo_d_b_crawler import DynamoDBCrawler
from .glue_crawler import GlueCrawler
from .postgres_crawler import PostgresCrawler
from .powerbi_crawler import PowerBICrawler
from .s_q_l_server_crawler import SQLServerCrawler
from .sigma_crawler import SigmaCrawler
from .snowflake_crawler import SnowflakeCrawler
from .snowflake_miner import SnowflakeMiner
from .tableau_crawler import TableauCrawler
from .asset_import import AssetImport
from .asset_export_basic import AssetExportBasic
__all__ = [
"BigQueryCrawler",
"ConfluentKafkaCrawler",
"ConnectionDelete",
"DbtCrawler",
"DynamoDBCrawler",
"GlueCrawler",
"PostgresCrawler",
"PowerBICrawler",
"SQLServerCrawler",
"SigmaCrawler",
"SnowflakeCrawler",
"SnowflakeMiner",
"TableauCrawler",
"AssetImport",
"AssetExportBasic",
]
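With these re-exports in place, the newly added classes can be imported straight from the package root rather than from their submodules (assuming this __init__.py lives at pyatlan/model/packages):

from pyatlan.model.packages import AssetExportBasic, AssetImport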
Signed-off-by: Karanjot Singh <[email protected]>
Thanks @Aryamanz29 for the review 🙇
Signed-off-by: Karanjot Singh <[email protected]>
TODOs:
- Add support for the remaining custom packages: `asset-export-basic` and `relational-assets-builder`.
- Add unit tests for `TableauCrawler.offline()`, `AssetImportPackage`, `asset-export-basic`, and `relational-assets-builder` in unit/test_packages.py.
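For illustration, a rough sketch of how the asset import package might be driven once wired up. Every builder step and parameter below is an assumption modeled on pyatlan's other package builders; the thread only confirms the `AssetImport` class name.

from pyatlan.client.atlan import AtlanClient
from pyatlan.model.packages import AssetImport

client = AtlanClient()

# Hypothetical builder chain: object_store() and its parameters are
# assumptions, not a confirmed pyatlan API.
workflow = (
    AssetImport()
    .object_store(
        bucket="my-import-bucket",  # assumed parameter
        object_key="assets.csv",    # assumed parameter
    )
    .to_workflow()
)

client.workflow.run(workflow)  # submit the import workflow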