
DVX-670: Added support for additional workflow packages (including custom packages) #402

Draft · wants to merge 6 commits into main

Conversation

@Aryamanz29 (Member) commented Oct 14, 2024

  • Added a method `TableauCrawler.offline()` to fetch metadata directly from an S3 bucket (a hypothetical usage sketch follows below).
  • Added support for a new custom package: `AssetImportPackage`.
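
A hypothetical sketch of the new offline mode is shown below. The constructor arguments and the `bucket_name`/`bucket_prefix` parameters to `offline()` are assumptions inferred from the PR description, not a confirmed signature:

from pyatlan.client.atlan import AtlanClient
from pyatlan.model.packages import TableauCrawler

client = AtlanClient()  # base URL and API key are read from the environment

# Build a workflow that crawls Tableau metadata already extracted to S3,
# instead of connecting to a live Tableau host.
workflow = (
    TableauCrawler(
        client=client,
        connection_name="tableau-offline",
        admin_users=["jsmith"],  # illustrative admin assignment
    )
    .offline(
        bucket_name="my-metadata-bucket",   # assumed parameter name
        bucket_prefix="tableau/extracts/",  # assumed parameter name
    )
    .to_workflow()
)

client.workflows.run(workflow)  # run the crawler via the workflows client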

TODOs

@Aryamanz29 added the feature (New feature or request) label on Oct 21, 2024
@Aryamanz29 (Member, Author) commented:

@0xquark, you can run the following commands locally before pushing your commits:

  1. For code formatting: 🧹

    ./pyatlan-formatter
    
  2. For code QA checks (black, flake8, mypy): ✅

    ./qa-checks
    

@Aryamanz29 (Member, Author) left a comment:


Let's update __init__.py to include the newly added custom packages, specifying the names of the modules or classes we want to expose when the package is imported.

from .asset_export_basic import AssetExportBasic
from .asset_import import AssetImport
from .big_query_crawler import BigQueryCrawler
from .confluent_kafka_crawler import ConfluentKafkaCrawler
from .connection_delete import ConnectionDelete
from .dbt_crawler import DbtCrawler
from .dynamo_d_b_crawler import DynamoDBCrawler
from .glue_crawler import GlueCrawler
from .postgres_crawler import PostgresCrawler
from .powerbi_crawler import PowerBICrawler
from .s_q_l_server_crawler import SQLServerCrawler
from .sigma_crawler import SigmaCrawler
from .snowflake_crawler import SnowflakeCrawler
from .snowflake_miner import SnowflakeMiner
from .tableau_crawler import TableauCrawler

__all__ = [
    "AssetExportBasic",
    "AssetImport",
    "BigQueryCrawler",
    "ConfluentKafkaCrawler",
    "ConnectionDelete",
    "DbtCrawler",
    "DynamoDBCrawler",
    "GlueCrawler",
    "PostgresCrawler",
    "PowerBICrawler",
    "SQLServerCrawler",
    "SigmaCrawler",
    "SnowflakeCrawler",
    "SnowflakeMiner",
    "TableauCrawler",
]
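
With these exports in place, the new packages can be imported directly from pyatlan.model.packages (assuming this file is pyatlan/model/packages/__init__.py, as the existing entries suggest). A quick sanity check:

import pyatlan.model.packages as packages

# Both names resolve through the re-exports above.
from pyatlan.model.packages import AssetExportBasic, AssetImport

# And both are part of the package's public surface via __all__.
assert "AssetExportBasic" in packages.__all__
assert "AssetImport" in packages.__all__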

@0xquark (Collaborator) commented Oct 28, 2024

Thanks @Aryamanz29 for the review 🙇
