Commit 03e7cee

fix: PySparkProcessor args pass-through to parent
Update PySparkProcessor so that it forwards arguments to the parent ScriptProcessor.run by keyword. It previously relied on positional order for all arguments, which breaks when the parent's parameter order changes.
1 parent: 7eb4b8a

1 file changed: +9 −9 lines


src/sagemaker/spark/processing.py

Lines changed: 9 additions & 9 deletions
@@ -250,15 +250,15 @@ def run(
         self._current_job_name = self._generate_current_job_name(job_name=job_name)

         super().run(
-            submit_app,
-            inputs,
-            outputs,
-            arguments,
-            wait,
-            logs,
-            job_name,
-            experiment_config,
-            kms_key,
+            code=submit_app,
+            inputs=inputs,
+            outputs=outputs,
+            arguments=arguments,
+            wait=wait,
+            logs=logs,
+            job_name=job_name,
+            experiment_config=experiment_config,
+            kms_key=kms_key,
         )

     def _extend_processing_args(self, inputs, outputs, **kwargs):
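
A minimal sketch of why this change matters, using hypothetical stand-in classes rather than the actual SageMaker SDK: when a child forwards values positionally, a reordering or extension of the parent's signature silently misbinds them, while keyword arguments keep each value attached to the parameter it is meant for.

class ScriptProcessor:
    # Stand-in parent whose run() signature has gained/reordered parameters.
    def run(self, code, inputs=None, outputs=None, wait=True):
        print(f"code={code!r}, inputs={inputs!r}, outputs={outputs!r}, wait={wait}")


class PySparkProcessor(ScriptProcessor):
    def run_positional(self, submit_app, inputs, wait):
        # Fragile: the third positional value now lands in the parent's
        # 'outputs' slot instead of 'wait'.
        super().run(submit_app, inputs, wait)

    def run_keyword(self, submit_app, inputs, wait):
        # Robust: each value binds to the parameter it is named for,
        # regardless of the parent's parameter order.
        super().run(code=submit_app, inputs=inputs, wait=wait)


p = PySparkProcessor()
p.run_positional("app.py", ["s3://bucket/in"], False)  # wait mis-bound to outputs
p.run_keyword("app.py", ["s3://bucket/in"], False)     # wait bound correctly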
