Exam DP-200: Implementing an Azure Data Solution
Documentation
Overview
Candidates for this exam are Microsoft Azure data engineers who collaborate with business stakeholders to identify and meet the data requirements to implement data solutions that use Azure data services.
Azure data engineers are responsible for data-related implementation tasks that include provisioning data storage services, ingesting streaming and batch data, transforming data, implementing security requirements, implementing data retention policies, identifying performance bottlenecks, and accessing external data sources.
Candidates for this exam must be able to implement data solutions that use the following Azure services: Azure Cosmos DB, Azure SQL Database, Azure Synapse Analytics (formerly Azure SQL DW), Azure Data Lake Storage, Azure Data Factory, Azure Stream Analytics, Azure Databricks, and Azure Blob storage.
Skills Measured
- Implement data storage solutions (40-45%)
- Manage and develop data processing (25-30%)
- Monitor and optimize data solutions (30-35%)
Getting Started
- Exam DP-200: Skills Outline
- This Certification Exam Prep session is designed for people experienced with data engineering who are interested in taking the DP-200 exam. Attendees of this session can...
Learning Paths
This learning path gets you started with the basics of storage management in Azure, Storage Account creation, and choices for data storage.
Levels: Beginner
Roles: Developer, Data Engineer
Modules
Azure Synapse Analytics provides a relational big data store that can scale to petabytes of data. In this learning path, you will learn how Azure Synapse Analytics achieves this scale with its Massively Parallel Processing (MPP) architecture. Create a data warehouse in minutes, use a familiar query language to build reports, load massive amounts of data in minutes, and ensure that your data warehouse is secure.
Levels: Beginner
Roles: Data Engineer
Modules
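The MPP scale-out described above works by hashing a distribution column so that rows spread across a fixed set of distributions that can be scanned in parallel. A minimal Python sketch of the idea follows; the 60-distribution count matches Synapse dedicated SQL pools, but the hash function here is illustrative, not the engine's actual algorithm.

```python
# Sketch of hash distribution, the core of an MPP architecture:
# each row is routed to one of NUM_DISTRIBUTIONS buckets by hashing
# its distribution-column value. Illustrative only.
import hashlib

NUM_DISTRIBUTIONS = 60  # Synapse dedicated SQL pools use 60 distributions

def distribution_for(key: str) -> int:
    """Map a distribution-column value to one of the distributions."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_DISTRIBUTIONS

# Simulate loading rows keyed by a high-cardinality column.
rows = [{"customer_id": f"C{i:04d}", "amount": i * 10} for i in range(1000)]
buckets: dict[int, list] = {}
for row in rows:
    d = distribution_for(row["customer_id"])
    buckets.setdefault(d, []).append(row)

# A high-cardinality key lands rows fairly evenly, so all distributions
# share the work of a scan or aggregation.
print(len(buckets))
print(max(len(b) for b in buckets.values()))
```

Choosing a low-cardinality or skewed column would concentrate rows in a few distributions and serialize the work, which is why distribution-column choice is a key design decision in Synapse.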
This learning path guides data engineers through the evolving benefits your organization gains with Azure cloud data platform technologies.
Levels: Beginner
Roles: Data Engineer
Modules
In this learning path, discover streaming data with Azure Stream Analytics, the concepts of event processing, and the integration of Internet of Things (IoT) devices.
Levels: Beginner
Roles: Data Engineer
Modules
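Event processing in Stream Analytics commonly aggregates events over time windows; a tumbling window splits time into fixed, non-overlapping intervals and aggregates each one. The sketch below is a pure-Python stand-in for what Stream Analytics expresses with `TumblingWindow()` in its SQL-like query language; event timestamps here are assumed to be plain seconds.

```python
# Hedged sketch of a tumbling-window count: time is cut into fixed,
# non-overlapping windows and events are counted per window.
from collections import Counter

def tumbling_window_counts(events, window_seconds):
    """Count events per non-overlapping window of fixed length."""
    counts = Counter()
    for ts, _payload in events:
        # Every timestamp maps to exactly one window start.
        window_start = (ts // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(sorted(counts.items()))

# (timestamp_seconds, payload) pairs, e.g. IoT telemetry readings.
events = [(1, "a"), (3, "b"), (7, "c"), (12, "d"), (14, "e")]
print(tumbling_window_counts(events, 5))
# Windows: [0,5) holds 2 events, [5,10) holds 1, [10,15) holds 2.
```

Stream Analytics also offers hopping, sliding, and session windows, which overlap or adapt to event gaps; the tumbling window is the simplest case.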
This learning path introduces you to Azure-supported SQL-based enterprise data solutions, enabling you to store and retrieve your app's data in the cloud.
Levels: Intermediate
Roles: Developer, Data Engineer
Modules
- Provision an Azure SQL database to store application data
- Migrate your relational data stored in SQL Server to Azure SQL Database
- Create an Azure Database for PostgreSQL server
- Scale multiple Azure SQL Databases with SQL elastic pools
- Secure your Azure SQL Database
- Develop and configure an ASP.NET application that queries an Azure SQL database
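Applications that query Azure SQL Database are expected to handle transient errors (for example, error 40613, database temporarily unavailable) by retrying with backoff. The sketch below shows that general pattern in plain Python; `flaky_query`, `TransientSqlError`, and the specific error codes are illustrative stand-ins, not a real driver API.

```python
# Hedged sketch of retry-with-exponential-backoff for transient
# database errors. The exception class and error codes are stand-ins
# for what a real driver (e.g. pyodbc) would surface.
import time

TRANSIENT_ERRORS = {40613, 40501, 49918}  # a few documented transient codes

class TransientSqlError(Exception):
    def __init__(self, code):
        super().__init__(f"transient SQL error {code}")
        self.code = code

def with_retries(operation, max_attempts=4, base_delay=0.01):
    """Retry `operation` on transient errors, doubling the delay each time."""
    for attempt in range(max_attempts):
        try:
            return operation()
        except TransientSqlError as exc:
            # Re-raise immediately for non-transient codes or on the
            # final attempt; otherwise back off and try again.
            if exc.code not in TRANSIENT_ERRORS or attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Simulated flaky operation: fails twice with a transient code, then succeeds.
calls = {"n": 0}
def flaky_query():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TransientSqlError(40613)
    return [("row1",)]

print(with_retries(flaky_query))
```

In production this logic usually lives in the driver or a library (e.g. connection-resiliency features in ADO.NET or SQLAlchemy) rather than hand-rolled code, but the backoff shape is the same.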
In this learning path, find out how to customize NoSQL data storage and distribution with the Azure portal, Cosmos DB extension, and Cosmos DB .NET Core SDK.
Levels: Beginner
Roles: Developer, Data Engineer
Modules
- Create an Azure Cosmos DB database built to scale
- Choose the appropriate API for Azure Cosmos DB
- Insert and query data in your Azure Cosmos DB database
- Store and access graph data in Azure Cosmos DB with the Graph API
- Store and access NoSQL data with Azure Cosmos DB and the Table API
- Build a .NET Core app for Azure Cosmos DB in Visual Studio Code
- Optimize the performance of Azure Cosmos DB by using partitioning and indexing strategies
- Distribute your data globally with Azure Cosmos DB
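The partitioning module above turns on one idea: items sharing a partition-key value form a logical partition, so a skewed key concentrates storage and throughput in a few partitions. The sketch below compares two candidate keys on a toy dataset; the field names and data are invented for illustration.

```python
# Minimal sketch of why partition-key choice matters in Azure Cosmos DB:
# counting items per logical partition for two candidate keys.
from collections import Counter

# Hypothetical order documents.
orders = [
    {"id": 1, "country": "US", "customer_id": "c1"},
    {"id": 2, "country": "US", "customer_id": "c2"},
    {"id": 3, "country": "US", "customer_id": "c3"},
    {"id": 4, "country": "FR", "customer_id": "c4"},
]

def logical_partition_sizes(items, key):
    """Count items per logical partition for a candidate partition key."""
    return Counter(item[key] for item in items)

# /country is skewed (3 of 4 items in one partition);
# /customer_id spreads items evenly.
print(logical_partition_sizes(orders, "country"))
print(logical_partition_sizes(orders, "customer_id"))
```

A high-cardinality key that also appears in common query filters is the usual recommendation, since it balances load while keeping most reads scoped to a single partition.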
With this learning path, you'll use Azure Data Lake Storage to process big data efficiently and easily, protected by Azure's built-in security features.
Levels: Beginner
Roles: Data Engineer
Modules