feat(mcp): add local dev script tool #711

Draft — wants to merge 64 commits into base: main (changes shown from 60 of 64 commits)

Commits
7491600
feat(mcp): Add MCP server for PyAirbyte connector management
devin-ai-integration[bot] Jun 9, 2025
98adeec
refactor(mcp): Use FastMCP framework and existing PyAirbyte methods
devin-ai-integration[bot] Jun 9, 2025
fce3abd
fix(mcp): Remove type annotations from FastMCP tool functions
devin-ai-integration[bot] Jun 9, 2025
8c8f8c1
fix(mcp): Add proper type annotations to resolve MyPy and Ruff lint e…
devin-ai-integration[bot] Jun 9, 2025
c416ef1
feat(mcp): Default get_config_spec to YAML output with optional forma…
devin-ai-integration[bot] Jun 9, 2025
9f77901
feat(mcp): Update experimental docstring, add output_format to list_c…
devin-ai-integration[bot] Jun 10, 2025
96591ac
fix(mcp): Add type annotations compatible with both FastMCP and CI re…
devin-ai-integration[bot] Jun 10, 2025
0411ccc
fix(mcp): Use modern type hints for FastMCP compatibility and resolve…
devin-ai-integration[bot] Jun 10, 2025
c38bebb
feat(mcp): Add example script demonstrating MCP server usage and fix …
devin-ai-integration[bot] Jun 10, 2025
2fd6092
fix(mcp): Remove type annotations from FastMCP tool functions for com…
devin-ai-integration[bot] Jun 10, 2025
fe3f255
fix(mcp): Add ruff noqa and mypy disable directives for FastMCP compa…
devin-ai-integration[bot] Jun 10, 2025
14a4b52
fix(mcp): Replace unsafe os.environ check to resolve CodeQL security …
devin-ai-integration[bot] Jun 10, 2025
1ccb06e
feat(mcp): Add pydantic-ai[mcp] as dev dependency for agent examples
devin-ai-integration[bot] Jun 11, 2025
f233d56
feat(examples): Add PydanticAI agent example with MCP server integration
devin-ai-integration[bot] Jun 11, 2025
60dc850
fix(mcp): Correct FastMCP framework usage to resolve transport compat…
devin-ai-integration[bot] Jun 11, 2025
02acd8e
fix(mcp): Remove unused mcp.server.stdio import after FastMCP fix
devin-ai-integration[bot] Jun 11, 2025
ddbcb91
fix(mcp): Add stream selection to run_sync to resolve 'No streams sel…
devin-ai-integration[bot] Jun 12, 2025
885beb1
fix(mcp): Add six dependency to resolve missing module error
devin-ai-integration[bot] Jun 12, 2025
ce64fd3
docs(mcp): Add comprehensive MCP server documentation with pdoc integ…
devin-ai-integration[bot] Jun 12, 2025
b1dc629
feat(mcp): Add 5 new MCP actions for manifest-only connector development
devin-ai-integration[bot] Jun 12, 2025
341966c
feat(mcp): Add get_manifest_schema tool for retrieving CDK validation…
devin-ai-integration[bot] Jun 13, 2025
77ec0ab
many improvements...
aaronsteers Jun 25, 2025
0b9a4ce
revert now-unused overload
aaronsteers Jun 25, 2025
ef413e4
expand the docstring
aaronsteers Jun 25, 2025
3a6e7ce
delete unnecessary examples
aaronsteers Jun 25, 2025
72cb487
refactor Connector._config > Connector._hydrated_config
aaronsteers Jun 25, 2025
b28d1ae
refactor get_config() calls to use _hydrated_config
aaronsteers Jun 25, 2025
2f72d04
update test to use public api
aaronsteers Jun 25, 2025
023ce3e
fix resolution
aaronsteers Jun 25, 2025
5f66122
fix install types filter
aaronsteers Jun 25, 2025
e355bfc
Merge remote-tracking branch 'origin/main' into aj/feat/add-mcp-server
aaronsteers Jun 25, 2025
8005e52
removing required future annotations
aaronsteers Jun 25, 2025
e58aa27
lint and other fixes
aaronsteers Jun 25, 2025
3ba5749
lint fix
aaronsteers Jun 25, 2025
6d95bc6
clean up docs
aaronsteers Jun 25, 2025
0ace9fc
update names
aaronsteers Jun 25, 2025
5a1e112
rename tool
aaronsteers Jun 25, 2025
4b63df2
update docstrings
aaronsteers Jun 25, 2025
9d9c45e
fix local ops
aaronsteers Jun 25, 2025
1547baa
add arg docs
aaronsteers Jun 25, 2025
8b80029
Update airbyte/mcp/__init__.py
aaronsteers Jun 25, 2025
0386e4a
Update airbyte/mcp/__init__.py
aaronsteers Jun 25, 2025
f5c45ce
Update airbyte/secrets/hydration.py
aaronsteers Jun 25, 2025
bc9532c
Auto-fix lint and format issues
Jun 25, 2025
3f8bf21
add suggested streams to metadata
aaronsteers Jun 25, 2025
d53fb33
Merge branch 'aj/feat/add-mcp-server' of https://github.com/airbytehq…
aaronsteers Jun 25, 2025
78eecf8
add handling for suggested streams
aaronsteers Jun 25, 2025
e0c0008
fix suggested streams ref
aaronsteers Jun 25, 2025
f79423c
fix line length
aaronsteers Jun 25, 2025
ae8ce5b
add type hint
aaronsteers Jun 25, 2025
0e6c6c6
add mcp to imports, adds pdoc docs
aaronsteers Jun 25, 2025
12aa3eb
Merge branch 'main' into aj/feat/add-mcp-server
aaronsteers Jun 26, 2025
bd299ee
poetry lock
aaronsteers Jun 26, 2025
eee61a6
feat(mcp): add pipeline generation tool to dev operations
devin-ai-integration[bot] Jun 26, 2025
61f68e5
fix: resolve ruff linting issues in _local_dev.py
devin-ai-integration[bot] Jun 26, 2025
041f0eb
tidy up script template
aaronsteers Jul 4, 2025
5815443
tidy up script
aaronsteers Jul 4, 2025
f1afcd7
Merge branch 'main' into aj/mcp/add-local-dev-script-tool
aaronsteers Jul 4, 2025
d94192c
delete unused function
aaronsteers Jul 4, 2025
23799e3
re-organize mcp modules
aaronsteers Jul 5, 2025
ed3d871
cherry-pick-me: ruff allow star-args for '*args', '**kwargs'
aaronsteers Jul 15, 2025
543d5e8
chore: poetry add openai
aaronsteers Jul 15, 2025
cf456a9
cherry-pick-me: improved secrets managers and secret handling
aaronsteers Jul 16, 2025
b520416
cherry-pick-me: llm token availability awareness
aaronsteers Jul 16, 2025
86 changes: 86 additions & 0 deletions airbyte/mcp/_coding.py
@@ -0,0 +1,86 @@
# Copyright (c) 2024 Airbyte, Inc., all rights reserved.
"""Local development MCP operations."""

from typing import Annotated

from fastmcp import FastMCP
from pydantic import Field

from airbyte.mcp._coding_templates import DOCS_TEMPLATE, SCRIPT_TEMPLATE
from airbyte.sources import get_available_connectors


def generate_pyairbyte_pipeline(
    source_connector_name: Annotated[
        str,
        Field(description="The name of the source connector (e.g., 'source-faker')."),
    ],
    destination_connector_name: Annotated[
        str,
        Field(description="The name of the destination connector (e.g., 'destination-duckdb')."),
    ],
    pipeline_name: Annotated[
        str | None,
        Field(
            description="A descriptive name for the pipeline. "
            "If not provided, a default name will be generated.",
        ),
    ] = None,
) -> dict[str, str]:
    """Generate a PyAirbyte pipeline script with setup instructions.

    This tool creates a complete PyAirbyte pipeline script that extracts data from
    a source connector and loads it to a destination connector, along with setup
    instructions for running the pipeline.

    Returns a dictionary with 'code' and 'instructions' keys containing the
    generated pipeline script and setup instructions respectively.
    """
    source_short_name = source_connector_name.replace("source-", "")
    destination_short_name = destination_connector_name.replace("destination-", "")
    if not pipeline_name:
        pipeline_name = f"{source_short_name}_to_{destination_short_name}_pipeline"

    pipeline_id = pipeline_name.lower().replace(" ", "_").replace("'", "")
    available_connectors: list[str] = get_available_connectors()
    if source_connector_name not in available_connectors:
        return {
            "error": (
                f"Source connector '{source_connector_name}' not found. "
                f"Available connectors: {', '.join(sorted(available_connectors))}"
            )
        }

    if destination_connector_name not in available_connectors:
        return {
            "error": (
                f"Destination connector '{destination_connector_name}' not found. "
                f"Available connectors: {', '.join(sorted(available_connectors))}"
            )
        }

    pipeline_code: str = SCRIPT_TEMPLATE.format(
        source_connector_name=source_connector_name,
        source_config_dict={},  # Placeholder for source config
        destination_connector_name=destination_connector_name,
        destination_config_dict={},  # Placeholder for destination config
    )
Comment on lines +62 to +67

💡 Verification agent

🧩 Analysis chain

Check template variable consistency?

I noticed the template is being formatted with source_connector_name and destination_connector_name, but looking at the template in _coding_templates.py, it uses {source_name} and {destination_name}. Should we verify these variable names match between the template and the formatting call? wdyt?


🏁 Script executed:

#!/bin/bash
# Check template variable consistency between files
echo "Variables used in _coding.py formatting:"
grep -n "\.format(" airbyte/mcp/_coding.py -A 5

echo -e "\nVariables expected in template:"
grep -n "{.*}" airbyte/mcp/_coding_templates.py | head -20

Length of output: 2307


Align pipeline template variables with SCRIPT_TEMPLATE placeholders
It looks like SCRIPT_TEMPLATE in _coding_templates.py uses {source_name} and {destination_name} (along with {source_config_dict} and {destination_config_dict}), but in airbyte/mcp/_coding.py (lines 62–67) we’re calling:

SCRIPT_TEMPLATE.format(
  source_connector_name=source_connector_name,
  source_config_dict={},
  destination_connector_name=destination_connector_name,
  destination_config_dict={},
)

This mismatch will raise a KeyError at runtime. Should we update it to:

- pipeline_code: str = SCRIPT_TEMPLATE.format(
-   source_connector_name=source_connector_name,
-   source_config_dict={},
-   destination_connector_name=destination_connector_name,
-   destination_config_dict={},
- )
+ pipeline_code: str = SCRIPT_TEMPLATE.format(
+   source_name=source_connector_name,
+   source_config_dict={},
+   destination_name=destination_connector_name,
+   destination_config_dict={},
+ )

wdyt?

🤖 Prompt for AI Agents
In airbyte/mcp/_coding.py around lines 62 to 67, the keys used in
SCRIPT_TEMPLATE.format do not match the placeholders defined in SCRIPT_TEMPLATE,
causing a KeyError. Update the keys in the format call to match the placeholders
exactly: replace source_connector_name with source_name and
destination_connector_name with destination_name, while keeping
source_config_dict and destination_config_dict as is.
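One way to keep these names from drifting again — purely illustrative, not part of this PR — is a small pytest-style check that compares the placeholders declared in the templates with the keyword arguments used at the call site. Note that, as written, the first assertion would fail on the current branch, which is exactly the mismatch this comment describes:

```python
# Illustrative only (not part of this PR): a regression test that keeps the
# .format(...) keyword arguments in _coding.py in sync with the template placeholders.
from string import Formatter

from airbyte.mcp._coding_templates import DOCS_TEMPLATE, SCRIPT_TEMPLATE


def placeholders(template: str) -> set[str]:
    """Collect {placeholder} field names, skipping literal text and {{escaped}} braces."""
    return {
        field_name
        for _, field_name, _, _ in Formatter().parse(template)
        if field_name  # field_name is None for literal chunks
    }


def test_templates_match_call_sites() -> None:
    # Expected names mirror the keyword arguments currently passed in _coding.py;
    # the assertion fails whenever the template and the call site drift apart.
    assert placeholders(SCRIPT_TEMPLATE) == {
        "source_connector_name",
        "source_config_dict",
        "destination_connector_name",
        "destination_config_dict",
    }
    assert placeholders(DOCS_TEMPLATE) == {
        "source_connector_name",
        "destination_connector_name",
        "pipeline_id",
        "source_short_name",
        "dest_short_name",
    }
```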


    setup_instructions: str = DOCS_TEMPLATE.format(
        source_connector_name=source_short_name,
        destination_connector_name=destination_short_name,
        pipeline_id=pipeline_id,
        source_short_name=source_short_name,
        dest_short_name=destination_short_name,
    )

    return {
        "code": pipeline_code,
        "instructions": setup_instructions,
        "filename": f"{pipeline_id}.py",
    }


def register_coding_tools(app: FastMCP) -> None:
    """Register development tools with the FastMCP app."""
    app.tool(generate_pyairbyte_pipeline)
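For orientation, here is a minimal sketch of exercising the generator directly in Python, outside of MCP. It assumes registry access for `get_available_connectors()` and that the placeholder mismatch flagged above has been resolved; the connector names are just examples:

```python
# Hypothetical local smoke run of the tool function (not part of the PR's test suite).
from airbyte.mcp._coding import generate_pyairbyte_pipeline

result = generate_pyairbyte_pipeline(
    source_connector_name="source-faker",
    destination_connector_name="destination-duckdb",
)
if "error" in result:
    print(result["error"])  # e.g. an unknown connector name
else:
    # Save the generated script under its suggested filename and show the setup docs.
    with open(result["filename"], "w", encoding="utf-8") as f:
        f.write(result["code"])
    print(result["instructions"])
```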
210 changes: 210 additions & 0 deletions airbyte/mcp/_coding_templates.py
@@ -0,0 +1,210 @@
# Copyright (c) 2025 Airbyte, Inc., all rights reserved.
"""Code templates for MCP local code generation."""

SCRIPT_TEMPLATE = """
#!/usr/bin/env python
# -*- coding: utf-8 -*-

# --- Generated by pyairbyte-mcp-server ---

import os
import sys
import logging

import airbyte as ab

from dotenv import load_dotenv

# Configure logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

# Load environment variables from .env file
if not load_dotenv():
    logging.warning("'.env' file not found. Please ensure it exists and contains the necessary credentials.")
    # Optionally exit if .env is strictly required
    # sys.exit("'.env' file is required. Please create it with the necessary credentials.")
Comment on lines +22 to +25

⚠️ Potential issue

Fix line length to satisfy linting rules?

The linter is flagging line 23 as too long (109 > 100 characters). Should we break this into multiple lines? wdyt?

-    logging.warning("'.env' file not found. Please ensure it exists and contains the necessary credentials.")
+    logging.warning(
+        "'.env' file not found. Please ensure it exists and contains the necessary credentials."
+    )
🧰 Tools
🪛 Ruff (0.11.9)

23-23: Line too long (109 > 100)

(E501)

🪛 GitHub Actions: Run Linters

[error] 23-23: Ruff E501: Line too long (109 > 100).

🤖 Prompt for AI Agents
In airbyte/mcp/_coding_templates.py around lines 22 to 25, the logging.warning
message on line 23 exceeds the 100 character limit set by the linter. To fix
this, break the long string into multiple concatenated strings or use implicit
string concatenation with parentheses to split the message across multiple
lines, ensuring each line stays within the character limit.


# --- Helper to get env vars ---
def get_required_env(var_name: str) -> str:
    value = os.getenv(var_name)
    if value is None:
        logging.error(f"Missing required environment variable: {{var_name}}")
        sys.exit(f"Error: Environment variable '{{var_name}}' not set. Please add it to your .env file.")
    return value
Comment on lines +31 to +33

⚠️ Potential issue

Fix line length for consistency with linting rules?

Similar to the previous issue, line 32 is flagged as too long (105 > 100 characters). Should we break this line as well? wdyt?

-        sys.exit(f"Error: Environment variable '{var_name}' not set. Please add it to your .env file.")
+        sys.exit(
+            f"Error: Environment variable '{var_name}' not set. Please add it to your .env file."
+        )
🧰 Tools
🪛 Ruff (0.11.9)

32-32: Line too long (105 > 100)

(E501)

🪛 GitHub Actions: Run Linters

[error] 32-32: Ruff E501: Line too long (105 > 100).

🤖 Prompt for AI Agents
In airbyte/mcp/_coding_templates.py around lines 31 to 33, the sys.exit call on
line 32 exceeds the 100 character limit. To fix this, break the long string into
multiple concatenated shorter strings or use implicit string concatenation to
keep each line within the limit, ensuring the message remains clear and
readable.


def get_optional_env(var_name: str, default_value: str = None) -> str:
    value = os.getenv(var_name)
    if value is None:
        if default_value is not None:
            logging.info(f"Using default value for optional environment variable: {{var_name}}")
            return default_value
        else:
            logging.info(f"Optional environment variable not set: {{var_name}}")
            return None
    return value

# --- Source Configuration ---
source_name = "{source_name}"
logging.info(f"Configuring source: {{source_name}}")
source_config = {{
{source_config_dict}
}}
Comment on lines +47 to +51

⚠️ Potential issue

Template variable mismatch detected.

I notice the template uses {source_name} here, but in _coding.py the formatting call uses source_connector_name. This mismatch will cause the template formatting to fail. Should we align these variable names? wdyt?

-source_name = "{source_name}"
+source_name = "{source_connector_name}"
🤖 Prompt for AI Agents
In airbyte/mcp/_coding_templates.py around lines 47 to 51, the template variable
`source_name` is used but the formatting call in _coding.py uses
`source_connector_name`, causing a mismatch. To fix this, rename the template
variable from `source_name` to `source_connector_name` to match the formatting
call, ensuring consistent variable names across the template and code.


# Optional: Add fixed configuration parameters here if needed
# source_config["some_other_parameter"] = "fixed_value"

try:
    source = ab.get_source(
        source_name,
        config=source_config,
        install_if_missing=True,
    )
except Exception as e:
    logging.error(f"Failed to initialize source '{{source_name}}': {{e}}")
    sys.exit(1)

# Verify the connection
logging.info("Checking source connection...")
try:
    source.check()
    logging.info("Source connection check successful.")
except Exception as e:
    logging.error(f"Source connection check failed: {{e}}")
    sys.exit(1)

# Select streams to sync (use select_all_streams() or specify)
logging.info("Selecting all streams from source.")
source.select_all_streams()
# Example for selecting specific streams:
# source.select_streams(["users", "products"])

# --- Read data into Cache and then Pandas DataFrame ---
logging.info("Reading data from source into cache...")
# By default, reads into a temporary DuckDB cache
# Specify a cache explicitly: cache = ab.get_cache(config=...)
try:
    results = source.read()
    logging.info("Finished reading data.")
except Exception as e:
    logging.error(f"Failed to read data from source: {{e}}")
    sys.exit(1)

# --- Process Streams into DataFrames ---
dataframes = {{}}
if results.streams:
    logging.info(f"Converting {{len(results.streams)}} streams to Pandas DataFrames...")
    for stream_name, stream_data in results.streams.items():
        try:
            df = stream_data.to_pandas()
            dataframes[stream_name] = df
            logging.info(
                f"Successfully converted stream '{{stream_name}}' to DataFrame ({{len(df)}} rows)."
            )
            # --- !! IMPORTANT !! ---
            # Add your data processing/analysis logic here!
            # Example: print(f"\\nDataFrame for stream '{{stream_name}}':")
            # print(df.head())
            # print("-" * 30)
        except Exception as e:
            logging.error(f"Failed to convert stream '{{stream_name}}' to DataFrame: {{e}}")
    logging.info("Finished processing streams.")
else:
    logging.info("No streams found in the read result.")

# Example: Access a specific dataframe
# if "users" in dataframes:
# users_df = dataframes["users"]
# print("\\nUsers DataFrame Head:")
# print(users_df.head())

# --- Destination Configuration ---
destination_name = "{destination_name}"
logging.info(f"Configuring destination: {{destination_name}}")
dest_config = {{
{destination_config_dict}
}}
Comment on lines +121 to +125

⚠️ Potential issue

Same template variable issue for destination.

Similar to the source configuration, the destination template uses {destination_name} but the formatting call uses destination_connector_name. Should we fix this consistency issue? wdyt?

-destination_name = "{destination_name}"
+destination_name = "{destination_connector_name}"

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In airbyte/mcp/_coding_templates.py around lines 121 to 125, the template
variable for the destination name is inconsistent: the string uses
{destination_name} but the formatting call uses destination_connector_name. To
fix this, ensure both the template string and the formatting call use the same
variable name, either rename the template variable to
{destination_connector_name} or change the formatting call to use
destination_name for consistency.
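For completeness, this thread and the earlier one point at the same fix. One way the call in `_coding.py` could end up — keeping the template's `{source_name}`/`{destination_name}` placeholders — is sketched below; the opposite resolution, renaming the placeholders instead, would work equally well:

```python
# Sketch of one resolution: align the keyword arguments in _coding.py with the
# placeholders that SCRIPT_TEMPLATE actually declares.
pipeline_code: str = SCRIPT_TEMPLATE.format(
    source_name=source_connector_name,
    source_config_dict={},  # Placeholder for source config
    destination_name=destination_connector_name,
    destination_config_dict={},  # Placeholder for destination config
)
```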


# Optional: Add fixed configuration parameters here if needed
# dest_config["some_other_parameter"] = "fixed_value"

try:
    destination = ab.get_destination(
        destination_name,
        config=dest_config,
        install_if_missing=True,
    )
except Exception as e:
    logging.error(f"Failed to initialize destination '{{destination_name}}': {{e}}")
    sys.exit(1)

# Verify the connection
logging.info("Checking destination connection...")
try:
    destination.check()
    logging.info("Destination connection check successful.")
except Exception as e:
    logging.error(f"Destination connection check failed: {{e}}")
    # Depending on the destination, a check might not be possible or fail spuriously.
    # Consider logging a warning instead of exiting for some destinations.
    # logging.warning(f"Destination connection check failed: {{e}} - Continuing cautiously.")
    sys.exit(1)  # Exit for safety by default

# --- Read data and Write to Destination ---
# This reads incrementally and writes to the destination.
# Data is processed in memory or using a temporary cache if needed by the destination connector.
logging.info("Starting data read from source and write to destination...")
try:
    # source.read() returns a result object even when writing directly
    # The write() method consumes this result
    read_result = source.read()  # Reads into default cache first usually
    logging.info(f"Finished reading data. Starting write to {{destination_name}}...")
    destination.write(read_result)
    logging.info("Successfully wrote data to destination.")
except Exception as e:
    logging.error(f"Failed during data read/write: {{e}}")
    sys.exit(1)

# --- Main execution ---
if __name__ == "__main__":
    logging.info("Starting PyAirbyte pipeline script.")
    # The core logic is executed when the script runs directly
    # If converting to dataframe, analysis happens within the 'if output_to_dataframe:' block above.
    # If writing to destination, the write operation is the main action.
    logging.info("PyAirbyte pipeline script finished.")

"""


DOCS_TEMPLATE = """# PyAirbyte Pipeline Setup Instructions

1. Install PyAirbyte:
```bash
pip install airbyte
```

2. Install the required connectors:
```bash
python -c "import airbyte as ab; ab.get_source('{source_connector_name}').install()"
python -c "import airbyte as ab; ab.get_destination('{destination_connector_name}').install()"
```

## Configuration
1. Update the source configuration in the pipeline script with your actual connection details
2. Update the destination configuration in the pipeline script with your actual connection details
3. Refer to the Airbyte documentation for each connector's required configuration fields

## Running the Pipeline
```bash
python {pipeline_id}.py
```

- Configure your source and destination connectors with actual credentials
- Add error handling and logging as needed
- Consider using environment variables for sensitive configuration
- Add stream selection if you only need specific data streams
- Set up scheduling using your preferred orchestration tool (Airflow, Dagster, etc.)

- Source connector docs: https://docs.airbyte.com/integrations/sources/{source_short_name}
- Destination connector docs: https://docs.airbyte.com/integrations/destinations/{dest_short_name}
- PyAirbyte docs: https://docs.airbyte.com/using-airbyte/pyairbyte/getting-started
"""
12 changes: 12 additions & 0 deletions airbyte/mcp/_connector_config.py
@@ -0,0 +1,12 @@
# Copyright (c) 2024 Airbyte, Inc., all rights reserved.
"""Local connector config MCP operations."""

from typing import Annotated

from fastmcp import FastMCP
from pydantic import Field


def register_connector_config_tools(app: FastMCP) -> None:
"""Register development tools with the FastMCP app."""
pass
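The module is deliberately left as a stub in this PR. For readers wondering why `Annotated` and `Field` are already imported, a hypothetical future shape — invented here purely for illustration, following the same pattern as `_coding.py` — might look like this:

```python
# Hypothetical illustration only — this PR registers no connector-config tools yet.
from typing import Annotated

from fastmcp import FastMCP
from pydantic import Field


def example_config_tool(
    connector_name: Annotated[str, Field(description="e.g. 'source-faker'")],
) -> dict[str, str]:
    """Invented placeholder tool; a real one might fetch or validate connector config."""
    return {"connector": connector_name, "status": "not implemented"}


def register_connector_config_tools(app: FastMCP) -> None:
    """Register connector config tools with the FastMCP app."""
    app.tool(example_config_tool)
```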
4 changes: 4 additions & 0 deletions airbyte/mcp/server.py
@@ -7,6 +7,8 @@
from fastmcp import FastMCP

from airbyte.mcp._cloud_ops import register_cloud_ops_tools
from airbyte.mcp._coding import register_coding_tools
from airbyte.mcp._connector_config import register_connector_config_tools
from airbyte.mcp._connector_registry import register_connector_registry_tools
from airbyte.mcp._local_ops import register_local_ops_tools
from airbyte.mcp._util import initialize_secrets
@@ -17,7 +19,9 @@
app: FastMCP = FastMCP("airbyte-mcp")
register_connector_registry_tools(app)
register_local_ops_tools(app)
register_coding_tools(app)
register_cloud_ops_tools(app)
register_connector_config_tools(app)


def main() -> None:
(remaining lines of server.py collapsed in the diff view)
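For context on how these registrations get exercised: the tool groups are registered at import time, and `main()` (whose body is collapsed in this diff) presumably starts the FastMCP app. A minimal launch sketch, assuming an environment with this branch of PyAirbyte and fastmcp installed:

```python
# Minimal sketch: start the MCP server locally so an MCP-capable client can attach.
# Whether main() blocks on a stdio transport is a FastMCP detail not shown in this diff.
from airbyte.mcp.server import main

if __name__ == "__main__":
    main()  # the module-level code above has already registered all tool groups
```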