You need to schedule an Azure Data Factory pipeline to run whenever a new file arrives in an Azure Data Lake Storage Gen2 container. Which type of trigger should you use?
Correct Answer: D (an event-based trigger)
Event-driven architecture (EDA) is a common data integration pattern that involves the production, detection, consumption of, and reaction to events. Data integration scenarios often require Data Factory customers to trigger pipelines based on events happening in a storage account, such as the arrival or deletion of a file in an Azure Blob Storage account. Reference: https://docs.microsoft.com/en-us/azure/data-factory/how-to-create-event-trigger
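The referenced article describes storage event triggers of type `BlobEventsTrigger`. As a rough sketch of what such a trigger definition could look like for this scenario (all names, paths, and placeholder IDs below are hypothetical, not from the question):

```python
# Sketch of a Data Factory storage event trigger definition (BlobEventsTrigger).
# All resource names, paths, and <placeholder> IDs are hypothetical examples.
trigger_definition = {
    "name": "NewFileArrivedTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            # Fire only for blobs created under this container/folder
            "blobPathBeginsWith": "/mycontainer/blobs/incoming/",
            "blobPathEndsWith": ".csv",
            "ignoreEmptyBlobs": True,
            # React to file arrival; Microsoft.Storage.BlobDeleted also exists
            "events": ["Microsoft.Storage.BlobCreated"],
            # Resource ID of the ADLS Gen2 storage account being watched
            "scope": (
                "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
                "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
            ),
        },
        # Pipeline(s) started when the trigger fires
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "IngestNewFilePipeline",
                    "type": "PipelineReference",
                }
            }
        ],
    },
}

print(trigger_definition["properties"]["type"])  # BlobEventsTrigger
```

The key point for the exam question is the `"type": "BlobEventsTrigger"` and the `Microsoft.Storage.BlobCreated` event, which together make the pipeline run on file arrival rather than on a wall-clock schedule.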