
Training: Implement a data engineering solution with Azure Databricks (DP-3027)

Ref. DP-3027
Duration: 1 day
Exam: Not certifying
Level: Intermediate

Master data processing with Azure Databricks

Azure Databricks is becoming a key pillar of modern data engineering. Thanks to its power, flexibility, and integrated tools, the platform lets you process large volumes of data efficiently. This DP-3027 training builds practical mastery of cloud data pipelines through hands-on cases and progressive modules.

A practical course for today’s data engineers

This course has been designed to meet the real needs of data professionals. You will learn how to implement robust engineering solutions, automate workflows, and ensure data quality in production. The modules cover the entire pipeline lifecycle, from real-time processing with Spark Structured Streaming to implementing CI/CD workflows, including automation through Azure Data Factory.

Participant Profiles

  • Data Engineer
  • Big Data Developer
  • Cloud Architect
  • Data Consultant
  • Azure Administrator
  • BI Manager

Objectives

  • Create real-time data streams
  • Build architectures with Delta Live Tables
  • Improve data processing performance
  • Deploy CI/CD workflows in Databricks
  • Automate tasks with Azure Data Factory
  • Secure and govern data
  • Use SQL warehouses in Databricks
  • Run notebooks from Azure Data Factory

Prerequisites

  • Know how to use a computer
  • Understand the basics of cloud computing

Course Content

Module 1: Perform incremental processing with Spark Structured Streaming

  • Set up real-time data sources for incremental processing
  • Optimize Delta Lake for incremental processing in Azure Databricks
  • Handle late data and out-of-order events in incremental processing
  • Apply monitoring and performance tuning strategies for incremental processing in Azure Databricks
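To make the "late data and out-of-order events" topic above concrete, here is a minimal pure-Python sketch of how an event-time watermark behaves. This is a conceptual illustration only, not the Spark API: in Spark Structured Streaming you would call `withWatermark("ts", "10 minutes")` on a streaming DataFrame, and Spark tracks the threshold for you.

```python
# Conceptual sketch of event-time watermarking (NOT the Spark API).
# The rule it illustrates: track the maximum event time seen so far,
# and drop events that arrive more than `delay_threshold` behind it.

class Watermark:
    def __init__(self, delay_threshold: int):
        self.delay_threshold = delay_threshold  # allowed lateness, in seconds
        self.max_event_time = 0                 # highest event time observed so far

    def accept(self, event_time: int) -> bool:
        """Return True if the event is on time, False if it is too late."""
        self.max_event_time = max(self.max_event_time, event_time)
        watermark = self.max_event_time - self.delay_threshold
        return event_time >= watermark

wm = Watermark(delay_threshold=600)  # tolerate events up to 10 minutes late
print(wm.accept(1000))   # first event: True
print(wm.accept(500))    # 500 s behind, but within the threshold: True
print(wm.accept(2000))   # advances the watermark to 1400: True
print(wm.accept(100))    # far behind the watermark: False (dropped)
```

The same trade-off applies in Spark: a larger threshold tolerates later data but forces the engine to keep more state in memory.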

Module 2: Implement streaming architecture patterns with Delta Live Tables

  • Event-driven architectures with Delta Live Tables
  • Ingest data with structured streaming
  • Maintain data consistency and reliability with structured streaming
  • Scale streaming workloads with Delta Live Tables

Module 3: Optimize performance with Spark and Delta Live Tables

  • Perform cost-based optimization and query tuning
  • Use change data capture (CDC)
  • Use enhanced autoscaling
  • Implement observability and data quality metrics

Module 4: Implement CI/CD workflows in Azure Databricks

  • Implement version control and Git integration
  • Perform unit testing and integration testing
  • Manage and configure your environment
  • Implement rollback and roll-forward strategies

Module 5: Automate workloads with Azure Databricks Jobs

  • Implement job scheduling and automation
  • Optimize workflows with parameters
  • Handle dependency management
  • Implement error handling and retry mechanisms
  • Explore best practices and guidelines
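The "error handling and retry mechanisms" listed above can be sketched in a few lines of plain Python. Note that Databricks Jobs also offers declarative per-task retry settings; this generic sketch (all names are illustrative, not a Databricks API) shows the underlying pattern of exponential backoff between attempts.

```python
import time

def run_with_retries(task, max_retries=3, base_delay=1.0):
    """Run `task` (any zero-argument callable), retrying on failure with
    exponential backoff: base_delay, 2*base_delay, 4*base_delay, ..."""
    for attempt in range(max_retries + 1):
        try:
            return task()
        except Exception:
            if attempt == max_retries:
                raise  # out of retries: surface the error to the scheduler
            time.sleep(base_delay * (2 ** attempt))

# Example: a flaky task that succeeds on the third attempt.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(run_with_retries(flaky, max_retries=3, base_delay=0.01))  # prints "ok"
```

Re-raising after the final attempt matters: it lets the orchestrator (a Databricks Job or Azure Data Factory pipeline) see the failure and apply its own alerting or retry policy.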

Module 6: Manage data privacy and governance with Azure Databricks

  • Implement data encryption techniques in Azure Databricks
  • Manage access controls in Azure Databricks
  • Implement data masking and anonymization in Azure Databricks
  • Use compliance frameworks and secure data sharing in Azure Databricks
  • Use data lineage and metadata management
  • Implement governance automation in Azure Databricks

Module 7: Use SQL Warehouses in Azure Databricks

  • Get started with SQL Warehouses
  • Create databases and tables
  • Create queries and dashboards

Module 8: Run Azure Databricks Notebooks with Azure Data Factory

  • Understand Azure Databricks notebooks and pipelines
  • Create a linked service for Azure Databricks
  • Use a Notebook activity in a pipeline
  • Use parameters in a notebook
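As a preview of what the Notebook activity above looks like, here is a sketch of the activity definition inside an Azure Data Factory pipeline. The linked service name, notebook path, and parameter name are placeholders for illustration:

```json
{
  "name": "RunDatabricksNotebook",
  "type": "DatabricksNotebook",
  "linkedServiceName": {
    "referenceName": "AzureDatabricksLinkedService",
    "type": "LinkedServiceReference"
  },
  "typeProperties": {
    "notebookPath": "/Shared/ingest_daily",
    "baseParameters": {
      "run_date": "@{formatDateTime(utcNow(), 'yyyy-MM-dd')}"
    }
  }
}
```

Inside the notebook, each entry in `baseParameters` is read as a widget, e.g. `dbutils.widgets.get("run_date")`.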

Documentation

  • Access to Microsoft Learn, Microsoft’s online learning platform, offering interactive resources and educational content to deepen your knowledge and develop your technical skills.

Lab / Exercises

  • This course provides you with exclusive access to the official Microsoft lab, enabling you to practice your skills in a professional environment.

Additional Information

Automate and secure data workflows with Azure Databricks

In a context where data volumes are skyrocketing, mastering data engineering has become a strategic priority for any organization. Azure Databricks offers a robust and scalable solution, designed to orchestrate, process, and monitor data pipelines efficiently. This training supports you in building modern data architectures based on the platform’s best practices.

You will learn how to industrialize your data workflows using powerful tools such as Delta Lake, Delta Live Tables, and SQL Warehouses. The goal is simple: help your teams gain efficiency while ensuring the quality, traceability, and security of your processes. Whether you want to automate data flows, manage complex environments, or ensure data compliance, this course is structured to meet those needs.

From continuous integration to data governance

Beyond basic data ingestion, this course helps you integrate strong professional practices. You’ll learn how to establish a smooth development cycle with CI/CD workflows, version notebooks, automate testing, and manage multiple environments. This part is essential for any organization looking to deliver data projects in production confidently.

Security and governance are also key pillars of the program. With Unity Catalog, you will learn how to trace access, define privacy rules, encrypt sensitive data, and anonymize specific fields. These practices meet today’s compliance requirements and build trust in your analytical processes.

Leverage the power of real-time data processing

The ability to process data as it arrives is a major competitive advantage. Azure Databricks gives you the tools to build streaming pipelines that meet this challenge. You’ll learn how to configure real-time data sources, handle out-of-order or late-arriving events, and maintain consistency in your results.

Performance optimization is another central focus of this course. You’ll be able to monitor workloads, control execution costs, apply autoscaling, and implement change data capture to keep your systems in sync.
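Change data capture, mentioned above, amounts to replaying a feed of insert/update/delete events onto a target table. Here is a minimal pure-Python sketch of that semantics; in Azure Databricks you would typically use Delta Lake's `MERGE INTO` or Delta Live Tables' `APPLY CHANGES INTO` rather than anything hand-rolled like this.

```python
# Conceptual CDC apply: replay a change feed onto a target keyed by id.
# Dict-based illustration of the semantics only; in Databricks this is
# normally expressed as a Delta Lake MERGE INTO statement.

def apply_changes(target: dict, changes: list) -> dict:
    for change in changes:
        op, key = change["op"], change["id"]
        if op == "delete":
            target.pop(key, None)
        else:  # "insert" and "update" both upsert the latest row
            target[key] = change["row"]
    return target

table = {1: {"name": "alice"}, 2: {"name": "bob"}}
feed = [
    {"op": "update", "id": 1, "row": {"name": "alicia"}},
    {"op": "delete", "id": 2},
    {"op": "insert", "id": 3, "row": {"name": "carol"}},
]
print(apply_changes(table, feed))
# → {1: {'name': 'alicia'}, 3: {'name': 'carol'}}
```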

A concrete and progressive approach

This training has been designed to balance clarity, skill development, and practical application. Each module introduces new features while reinforcing the foundations covered earlier. Even without prior data engineering experience, you’ll follow the course smoothly thanks to guided demos, hands-on labs, and real-world use cases.

You’ll gain a broad and detailed understanding of what Azure Databricks can offer today: from batch processing to real-time streaming, from quality management to continuous delivery, including automation and security. These skills will allow you to design effective, sustainable solutions aligned with business needs.

FAQ

Do I need prior experience in data engineering?
No, this course is designed for motivated beginners. Basic cloud knowledge is a plus.

Can I follow the course without Spark knowledge?
Yes, the concepts are introduced progressively with practical examples.

Are there practical exercises included?
Yes, each module includes hands-on labs to help you practice independently.

What concrete skills will I acquire?
You’ll learn to build streaming pipelines, automate workflows, secure data, and deploy projects in production.

Is there a badge at the end of the course?
Yes, a completion badge is awarded to validate and showcase your achievements.

Registration fee
CHF 850.-
Included in this course
  • Training provided by a certified trainer
  • 180 days of access to Official Microsoft Labs
  • Official documentation in digital format
  • Official Microsoft achievement badge
Upcoming sessions

  • Mon 21 Jul, 09:00-17:00, Virtual (session tag: DP-3027)
  • Mon 21 Jul, 09:00-17:00, Geneva, Route des Jeunes 35, 1227 Carouge (session tag: DP-3027)
  • Mon 25 Aug, 09:00-17:00, Virtual (session tag: DP-3027)
  • Mon 25 Aug, 09:00-17:00, Lausanne, Avenue Mon repos 24, 1005 Lausanne (session tag: DP-3027)
  • Mon 29 Sep, 09:00-17:00, Virtual (session tag: DP-3027)
  • Mon 29 Sep, 09:00-17:00, Geneva, Route des Jeunes 35, 1227 Carouge (session tag: DP-3027)
  • Mon 3 Nov, 09:00-17:00, Virtual (session tag: DP-3027)
  • Mon 3 Nov, 09:00-17:00, Lausanne, Avenue Mon repos 24, 1005 Lausanne (session tag: DP-3027)

Contact

ITTA
Route des jeunes 35
1227 Carouge, Suisse

Opening hours

Monday to Friday
8:30 AM to 6:00 PM
Tel. 058 307 73 00
