
Explanation:

Box 1: Install a self-hosted integration runtime
The integration runtime (IR) is the compute infrastructure that Azure Data Factory uses to provide data integration capabilities across different network environments. A self-hosted integration runtime is installed and managed by the customer inside the private network, which lets the data factory reach on-premises data stores.
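A minimal sketch of the first step, assuming the azure-mgmt-datafactory Python SDK and placeholder subscription, resource group, factory, and runtime names; the self-hosted IR software must still be installed on a machine inside the private network and registered with one of the returned keys:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeResource,
    SelfHostedIntegrationRuntime,
)

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-adf-demo"          # placeholder
FACTORY_NAME = "adf-demo-factory"       # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Create the logical self-hosted IR entry in the data factory.
client.integration_runtimes.create_or_update(
    RESOURCE_GROUP,
    FACTORY_NAME,
    "SelfHostedIR",
    IntegrationRuntimeResource(properties=SelfHostedIntegrationRuntime()),
)

# Retrieve the authentication keys used to register the on-premises node.
keys = client.integration_runtimes.list_auth_keys(
    RESOURCE_GROUP, FACTORY_NAME, "SelfHostedIR"
)
print(keys.auth_key1)
```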
Box 2: Create a pipeline
With ADF, existing data processing services can be composed into data pipelines that are highly available and managed in the cloud. These data pipelines can be scheduled to ingest, prepare, transform, analyze, and publish data, and ADF manages and orchestrates the complex data and processing dependencies.
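A minimal sketch of the second step, again assuming the azure-mgmt-datafactory Python SDK: a pipeline with a single Copy activity that moves data from an on-premises SQL Server table (reached through the self-hosted IR) into Azure Blob storage. The dataset names are assumptions; the datasets and their linked services must already exist in the factory.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource,
    CopyActivity,
    DatasetReference,
    SqlServerSource,
    BlobSink,
)

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-adf-demo"          # placeholder
FACTORY_NAME = "adf-demo-factory"       # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Copy activity: read from the on-premises dataset, write to the blob dataset.
copy_activity = CopyActivity(
    name="CopyOnPremToBlob",
    inputs=[DatasetReference(reference_name="OnPremSqlDataset")],    # placeholder dataset
    outputs=[DatasetReference(reference_name="BlobOutputDataset")],  # placeholder dataset
    source=SqlServerSource(),
    sink=BlobSink(),
)

client.pipelines.create_or_update(
    RESOURCE_GROUP,
    FACTORY_NAME,
    "CopyOnPremDataPipeline",
    PipelineResource(activities=[copy_activity]),
)

# Trigger a one-off run; a schedule trigger could be attached instead.
run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, "CopyOnPremDataPipeline")
print(run.run_id)
```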
References:
https://docs.microsoft.com/en-us/azure/machine-learning/team-data-science-process/move-sql-azure-adf
https://docs.microsoft.com/pl-pl/azure/data-factory/tutorial-hybrid-copy-data-tool
https://docs.microsoft.com/en-us/azure/data-factory/create-self-hosted-integration-runtime?tabs=data-factory
"A self-hosted integration runtime can run copy activities between a cloud data store and a data store in a private network"
https://docs.microsoft.com/en-us/azure/data-factory/introduction
"With Data Factory, you can use the Copy Activity in a data pipeline to move data from both on-premises and cloud source data stores to a centralization data store in the cloud for further analysis"