Commit 4edcc09
chore: set streaming opt-in
Signed-off-by: Ion Koutsouris <[email protected]>
1 parent 0718510 commit 4edcc09

File tree

2 files changed: +26 −119 lines


python/deltalake/table.py

Lines changed: 2 additions & 3 deletions
@@ -976,6 +976,7 @@ def merge(
         error_on_type_mismatch: bool = True,
         writer_properties: Optional[WriterProperties] = None,
         large_dtypes: Optional[bool] = None,
+        streaming: bool = False,
         custom_metadata: Optional[Dict[str, str]] = None,
         post_commithook_properties: Optional[PostCommitHookProperties] = None,
         commit_properties: Optional[CommitProperties] = None,
@@ -993,6 +994,7 @@ def merge(
             error_on_type_mismatch: specify if merge will return error if data types are mismatching :default = True
             writer_properties: Pass writer properties to the Rust parquet writer
             large_dtypes: Deprecated, will be removed in 1.0
+            streaming: Will execute MERGE using a LazyMemoryExec plan
             arrow_schema_conversion_mode: Large converts all types of data schema into Large Arrow types, passthrough keeps string/binary/list types untouched
             custom_metadata: Deprecated and will be removed in future versions. Use commit_properties instead.
             post_commithook_properties: properties for the post commit hook. If None, default values are used.
@@ -1031,17 +1033,14 @@ def merge(
             convert_pyarrow_table,
         )

-        streaming = False
         if isinstance(source, pyarrow.RecordBatchReader):
             source = convert_pyarrow_recordbatchreader(source, conversion_mode)
-            streaming = True
         elif isinstance(source, pyarrow.RecordBatch):
             source = convert_pyarrow_recordbatch(source, conversion_mode)
         elif isinstance(source, pyarrow.Table):
             source = convert_pyarrow_table(source, conversion_mode)
         elif isinstance(source, ds.Dataset):
             source = convert_pyarrow_dataset(source, conversion_mode)
-            streaming = True
         elif _has_pandas and isinstance(source, pd.DataFrame):
             source = convert_pyarrow_table(
                 pyarrow.Table.from_pandas(source), conversion_mode
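The diff above turns streaming into an explicit opt-in: previously `merge` forced `streaming = True` whenever the source was a `pyarrow.RecordBatchReader` or `pyarrow.Dataset`; after this commit the caller must pass `streaming=True` regardless of source type. A minimal sketch of that before/after behavior, using hypothetical stand-in classes rather than the real pyarrow types:

```python
# ReaderSource / TableSource are hypothetical stand-ins for
# pyarrow.RecordBatchReader / pyarrow.Table; they only exist to
# illustrate the dispatch, not to model the real classes.

class ReaderSource:
    pass

class TableSource:
    pass

def streaming_before(source) -> bool:
    # Old behavior: reader-like sources forced streaming on implicitly.
    streaming = False
    if isinstance(source, ReaderSource):
        streaming = True
    return streaming

def streaming_after(source, streaming: bool = False) -> bool:
    # New behavior: streaming is whatever the caller opted into;
    # the source type no longer changes the flag.
    return streaming

print(streaming_before(ReaderSource()))                  # streaming was implicit
print(streaming_after(ReaderSource()))                   # now defaults to False
print(streaming_after(TableSource(), streaming=True))    # explicit opt-in
```

One practical consequence, per the diff: callers who passed a `RecordBatchReader` and relied on the implicit streaming execution now need to pass `streaming=True` to `merge` to keep the `LazyMemoryExec` plan.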

0 commit comments
