Azure Data Factory Training--Continuous Integration/Delivery
- Offered by UDEMY
Azure Data Factory Training--Continuous Integration/Delivery at UDEMY Overview
Duration | 14 hours
Total fee | ₹455
Mode of learning | Online
Credential | Certificate
Azure Data Factory Training--Continuous Integration/Delivery at UDEMY Highlights
- Earn a Certificate of completion from Udemy
- Get a 30-day money-back guarantee on the course
- Learn from 4 articles and 39 downloadable resources
Azure Data Factory Training--Continuous Integration/Delivery at UDEMY Course details
- How to create pipelines in Azure Data Factory
- How to integrate Azure Data Factory with Azure DevOps
- How to integrate Azure Data Factory with GitHub
- Continuous integration and delivery (CI/CD) in Azure Data Factory
- Learn to use pipelines and activities in Azure Data Factory
- Azure Data Factory is a cloud-based ETL and data integration service that lets you create data-driven workflows for orchestrating data movement and transforming data at scale
- Using Azure Data Factory, students can create and schedule data-driven workflows (called pipelines) that can ingest data from disparate data stores (a minimal code sketch follows this list)
- Students can build complex ETL processes that transform data visually with data flows or by using compute services such as Azure HDInsight Hadoop, Azure Databricks, and Azure SQL Database
- Additionally, students can publish their transformed data to data stores such as Azure SQL Data Warehouse for business intelligence (BI) applications to consume
- Ultimately, through Azure Data Factory, raw data can be organized into meaningful data stores and data lakes for better business decisions
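To make the linked service → dataset → pipeline flow described above more concrete, here is a minimal sketch using the azure-mgmt-datafactory Python SDK. All resource names, connection strings, and table/blob paths are placeholders, and model class names can differ slightly between SDK versions; treat this as an illustration of the concepts, not the course's own solution.

```python
# Minimal sketch: copy a table from Azure SQL Database to Blob storage with
# Azure Data Factory via the azure-mgmt-datafactory SDK.
# All names, IDs, and connection strings below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    Factory, LinkedServiceResource, AzureSqlDatabaseLinkedService,
    AzureStorageLinkedService, DatasetResource, AzureSqlTableDataset,
    AzureBlobDataset, LinkedServiceReference, DatasetReference,
    CopyActivity, AzureSqlSource, BlobSink, PipelineResource,
)

SUBSCRIPTION_ID = "<subscription-id>"
RG, DF = "adf-demo-rg", "adf-demo-factory"

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# 1. Data factory instance (the resource group is assumed to exist already).
adf.factories.create_or_update(RG, DF, Factory(location="eastus"))

# 2. Linked services: connections to the source (Azure SQL) and sink (Storage).
adf.linked_services.create_or_update(
    RG, DF, "ls_sql",
    LinkedServiceResource(properties=AzureSqlDatabaseLinkedService(
        connection_string="<azure-sql-connection-string>")))
adf.linked_services.create_or_update(
    RG, DF, "ls_blob",
    LinkedServiceResource(properties=AzureStorageLinkedService(
        connection_string="<storage-account-connection-string>")))

# 3. Datasets: the SQL table to read and the blob file to write.
adf.datasets.create_or_update(
    RG, DF, "ds_sql_customers",
    DatasetResource(properties=AzureSqlTableDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="ls_sql"),
        table_name="dbo.Customers")))
adf.datasets.create_or_update(
    RG, DF, "ds_blob_customers",
    DatasetResource(properties=AzureBlobDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="ls_blob"),
        folder_path="raw/customers", file_name="customers.csv")))

# 4. Pipeline with a single copy activity.
copy = CopyActivity(
    name="CopyCustomersToBlob",
    inputs=[DatasetReference(type="DatasetReference",
                             reference_name="ds_sql_customers")],
    outputs=[DatasetReference(type="DatasetReference",
                              reference_name="ds_blob_customers")],
    source=AzureSqlSource(),
    sink=BlobSink())
adf.pipelines.create_or_update(RG, DF, "pl_copy_customers",
                               PipelineResource(activities=[copy]))

# 5. Trigger a run; status can then be polled with adf.pipeline_runs.get().
run = adf.pipelines.create_run(RG, DF, "pl_copy_customers", parameters={})
print("Started pipeline run:", run.run_id)
```

The same objects (linked services, datasets, pipelines, activities) are what the course builds through the Azure Data Factory portal UI; the SDK is simply another way to author them.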
Azure Data Factory Training--Continuous Integration/Delivery at UDEMY Curriculum
Introduction
Download Full Resources And Code Used In This Course
Create Azure Account With Credit Card
How To Use Azure Account -- After Free Trial, Without Spending Money
ETL Vs ELT
Key Components Of ADF & SSIS
Create SQL Databases in Azure
Create Azure Data Factory Instance In Azure
Create Linked Service (Connection To Source & Destinations)
Create Data Set
Copy Data From Azure SQL To Storage Account
Create New Resource Group Called ADF
Create A New Storage Account In Azure
Create Data Lake Storage In Azure
Create Azure SQL Database & Tables For This Demo Part 1
Create Tables and Insert Data into Azure SQL DB
Create Azure Data Factory Instances & Linked Services
Create Multiple Data Sets (To Access Azure SQL, Storage Account & Data Lake)
Create A Pipeline With Multiple Copy Activities
Clean Your Resources
Delete Activity In Azure Data Factory
Wait Activity In Azure Data Factory
Use Get Metadata Activity With Files And Folders (File Systems)
Lookup Activity In Azure Data Factory
ForEach Activity In Azure Data Factory
ForEach Activity In Azure Data Factory (Copy Data Dynamically With Lookup; see the sketch after this curriculum)
Until Activity In Azure Data Factory
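As a companion to the Lookup and ForEach lessons above, the snippet below sketches the "copy data dynamically" pattern: a Lookup activity reads a list of table names from a control table, and a ForEach activity copies each one. It assumes two parameterized datasets (ds_sql_table and ds_blob_table, each exposing a TableName parameter) and a control table dbo.TableList already exist in the factory; every name, query, and expression here is illustrative only and not the course's exact solution.

```python
# Sketch of the Lookup + ForEach "copy data dynamically" pattern with the
# azure-mgmt-datafactory SDK. Assumes the parameterized datasets
# "ds_sql_table" and "ds_blob_table" (each with a TableName parameter) and
# the control table dbo.TableList already exist; all names are hypothetical.
from azure.mgmt.datafactory.models import (
    LookupActivity, ForEachActivity, CopyActivity, ActivityDependency,
    DatasetReference, Expression, AzureSqlSource, BlobSink, PipelineResource,
)

# 1. Lookup: return every row of the control table (not just the first row).
lookup = LookupActivity(
    name="LookupTableList",
    dataset=DatasetReference(type="DatasetReference",
                             reference_name="ds_sql_table",
                             parameters={"TableName": "dbo.TableList"}),
    source=AzureSqlSource(
        sql_reader_query="SELECT TableName FROM dbo.TableList"),
    first_row_only=False)

# 2. Copy activity executed once per item; @item() is the current lookup row.
copy_each = CopyActivity(
    name="CopyOneTable",
    inputs=[DatasetReference(
        type="DatasetReference", reference_name="ds_sql_table",
        parameters={"TableName": {"value": "@item().TableName",
                                  "type": "Expression"}})],
    outputs=[DatasetReference(
        type="DatasetReference", reference_name="ds_blob_table",
        parameters={"TableName": {"value": "@item().TableName",
                                  "type": "Expression"}})],
    source=AzureSqlSource(),
    sink=BlobSink())

# 3. ForEach: iterate over the Lookup output once the Lookup has succeeded.
foreach = ForEachActivity(
    name="ForEachTable",
    items=Expression(value="@activity('LookupTableList').output.value"),
    activities=[copy_each],
    depends_on=[ActivityDependency(activity="LookupTableList",
                                   dependency_conditions=["Succeeded"])])

pipeline = PipelineResource(activities=[lookup, foreach])
# Publish with the client from the earlier sketch, e.g.:
# adf.pipelines.create_or_update(RG, DF, "pl_copy_all_tables", pipeline)
```

Setting first_row_only=False makes the Lookup return an array under output.value, which is exactly the shape the ForEach items expression consumes.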