A batch inference job can take a long time to finish. This example monitors progress by using a Jupyter widget. You can also track the job's progress by using:

- Azure Machine Learning Studio.
- Console output from the PipelineRun object.

from azureml.widgets import RunDetails
RunDetails(pipeline_run).show()

pipeline_run.wait_for_completion(show_output=True)

Reference: https://docs.microsoft.com/en-us/azure/machine-learning/how-to-use-parallel-run-step#monitor-the-parallel-run-job
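
For context, a minimal sketch of how these calls fit together, assuming an existing workspace configuration and a Pipeline object named pipeline that already contains the ParallelRunStep (the experiment name and variable names below are illustrative, not from the original example):

from azureml.core import Experiment, Workspace
from azureml.widgets import RunDetails

ws = Workspace.from_config()                    # assumes a config.json for the workspace is available
experiment = Experiment(ws, "batch-inference")  # experiment name is illustrative
pipeline_run = experiment.submit(pipeline)      # pipeline is assumed to contain the ParallelRunStep

# Option 1: interactive Jupyter widget that refreshes while the job runs
RunDetails(pipeline_run).show()

# Option 2: block and stream console output until the run completes
pipeline_run.wait_for_completion(show_output=True)

The widget is useful inside a notebook, while wait_for_completion(show_output=True) is the simpler choice when running the script non-interactively and you just want the console log.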