DP-200 Dumps 2020-02

[Free] EnsurePass Microsoft DP-200 Real Exam Dumps Questions 31-40

February 7, 2020

Get Full Version of the Exam
http://www.EnsurePass.com/DP-200.html

Question No.31

A company plans to use Azure Storage for file storage purposes. Compliance rules require:

- A single storage account to store all operations, including reads, writes, and deletes
- Retention of an on-premises copy of historical operations

You need to configure the storage account.

Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

A. Configure the storage account to log read, write and delete operations for service type Blob

B. Use the AzCopy tool to download log data from $logs/blob

C. Configure the storage account to log read, write and delete operations for service type Table

D. Use the storage client to download log data from $logs/table

E. Configure the storage account to log read, write and delete operations for service type Queue

Correct Answer: AB

Explanation:

Storage Logging logs request data in a set of blobs in a blob container named $logs in your storage account. This container does not show up if you list all the blob containers in your account but you can see its contents if you access it directly.

To view and analyze your log data, you should download the blobs that contain the log data you are interested in to a local machine. Many storage-browsing tools enable you to download blobs from your storage account; you can also use AzCopy, the command-line copy tool provided by the Azure Storage team, to download your log data.
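For reference, a minimal Python sketch (using the azure-storage-blob package rather than AzCopy) that pulls the Blob-service log blobs from the hidden $logs container down to a local folder; the storage account name, key, and output directory are placeholders.

```python
# Minimal sketch: download Storage Analytics logs from the hidden $logs container.
# Assumes pip install azure-storage-blob and that Blob-service logging is already enabled.
# The account URL, account key, and local directory below are placeholders.
import os
from azure.storage.blob import BlobServiceClient

ACCOUNT_URL = "https://<storage-account>.blob.core.windows.net"
ACCOUNT_KEY = "<account-key>"
LOCAL_DIR = "./storage-logs"

service = BlobServiceClient(account_url=ACCOUNT_URL, credential=ACCOUNT_KEY)
logs = service.get_container_client("$logs")  # hidden container; not returned when listing containers

# Blob-service logs are written under the "blob/" prefix ($logs/blob/YYYY/MM/DD/hhmm/*.log).
for blob in logs.list_blobs(name_starts_with="blob/"):
    target = os.path.join(LOCAL_DIR, blob.name.replace("/", os.sep))
    os.makedirs(os.path.dirname(target), exist_ok=True)
    with open(target, "wb") as f:
        f.write(logs.download_blob(blob.name).readall())
    print(f"downloaded {blob.name}")
```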

References:

https://docs.microsoft.com/en-us/rest/api/storageservices/enabling-storage-logging-and-accessing-log-data

Question No.32

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution. Determine whether the solution meets the stated goals.

You develop a data ingestion process that will import data to a Microsoft Azure SQL Data Warehouse.

The data to be ingested resides in Parquet files stored in an Azure Data Lake Gen2 storage account.

You need to load the data from the Azure Data Lake Gen2 storage account into the Azure SQL Data Warehouse.

Solution:

  1. Create an external data source pointing to the Azure storage account

  2. Create a workload group using the Azure storage account name as the pool name

  3. Load the data using the INSERT…SELECT statement

Does the solution meet the goal?

A. Yes

B. No

Correct Answer: B

Explanation:

You need to create an external file format and external table using the external data source. You then load the data using the CREATE TABLE AS SELECT statement.
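A hedged sketch of that external-table route, with the T-SQL submitted from Python through pyodbc; every object name, the abfss:// path, and the database-scoped credential (assumed to already exist) are placeholders, not values taken from the exam scenario.

```python
# Sketch of the external-table load path described above, submitted with pyodbc.
# Connection details, object names, the ABFSS path, and the credential are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=<sqldw>;"
    "UID=<user>;PWD=<password>",
    autocommit=True,
)
cur = conn.cursor()

statements = [
    # External data source pointing at the Data Lake Gen2 account
    # (the database-scoped credential ADLSCredential is assumed to exist).
    """CREATE EXTERNAL DATA SOURCE AzureDataLakeGen2
       WITH (TYPE = HADOOP,
             LOCATION = 'abfss://<filesystem>@<account>.dfs.core.windows.net',
             CREDENTIAL = ADLSCredential);""",
    # File format matching the source Parquet files.
    "CREATE EXTERNAL FILE FORMAT ParquetFormat WITH (FORMAT_TYPE = PARQUET);",
    # External table over the files, using the data source and file format above.
    """CREATE EXTERNAL TABLE dbo.ExtSales (SaleId INT, Amount DECIMAL(18, 2))
       WITH (LOCATION = '/sales/', DATA_SOURCE = AzureDataLakeGen2, FILE_FORMAT = ParquetFormat);""",
    # CTAS performs the actual parallel load into the warehouse.
    """CREATE TABLE dbo.Sales WITH (DISTRIBUTION = ROUND_ROBIN)
       AS SELECT * FROM dbo.ExtSales;""",
]
for stmt in statements:
    cur.execute(stmt)
```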

References:

https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-load-from-azure-data-lake-store

Question No.33

HOTSPOT

A company plans to develop solutions to perform batch processing of multiple sets of geospatial data. You need to implement the solutions.

Which Azure services should you use? To answer, select the appropriate configuration in the answer area.

NOTE: Each correct selection is worth one point.

[Image: answer area omitted]

Correct Answer:

[Image: correct selections omitted]

Question No.34

A company runs Microsoft SQL Server in an on-premises virtual machine (VM).

You must migrate the database to Azure SQL Database. You synchronize users from Active Directory to Azure Active Directory (Azure AD).

You need to configure Azure SQL Database to use an Azure AD user as administrator. What should you configure?

A. For each Azure SQL Database, set the Access Control to administrator.

B. For the Azure SQL Database server, set the Active Directory to administrator.

C. For each Azure SQL Database, set the Active Directory administrator role.

D. For the Azure SQL Database server, set the Access Control to administrator.

Correct Answer: C

Question No.35

You develop data engineering solutions for a company.

You must integrate the company's on-premises Microsoft SQL Server data with Microsoft Azure SQL Database. Data must be transformed incrementally.

You need to implement the data integration solution.

Which tool should you use to configure a pipeline to copy data?

A. Use the Copy Data tool with Blob storage linked service as the source

B. Use Azure PowerShell with SQL Server linked service as a source

C. Use Azure Data Factory UI with Blob storage linked service as a source

D. Use the .NET Data Factory API with Blob storage linked service as the source

Correct Answer: C

Explanation:

The Integration Runtime is a customer managed data integration infrastructure used by Azure Data Factory to provide data integration capabilities across different network environments.

A linked service defines the information needed for Azure Data Factory to connect to a data resource. We have three resources in this scenario for which linked services are needed:

- On-premises SQL Server
- Azure Blob Storage
- Azure SQL Database

Note:

Azure Data Factory is a fully managed cloud-based data integration service that orchestrates and automates the movement and transformation of data. The key concept in the ADF model is pipeline. A pipeline is a logical grouping of Activities, each of which defines the actions to perform on the data contained in Datasets. Linked services are used to define the information needed for Data Factory to connect to the data resources.
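As an illustration only (the answer above refers to the Data Factory UI, not code), here is a hedged Python sketch that registers the three linked services listed above with the azure-mgmt-datafactory SDK; the subscription, resource group, factory name, connection strings, and the self-hosted integration runtime name are placeholders, and model names can differ slightly between SDK versions.

```python
# Hedged sketch: creating the three linked services with the azure-mgmt-datafactory SDK
# (recent SDK versions accept azure.identity credentials). All names/strings are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    LinkedServiceResource, SecureString, IntegrationRuntimeReference,
    SqlServerLinkedService, AzureBlobStorageLinkedService, AzureSqlDatabaseLinkedService,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg, factory = "<resource-group>", "<data-factory>"

# On-premises SQL Server, reached through a self-hosted integration runtime.
adf.linked_services.create_or_update(rg, factory, "OnPremSqlLS", LinkedServiceResource(
    properties=SqlServerLinkedService(
        connection_string=SecureString(value="Server=onprem-sql;Database=crm;Integrated Security=True;"),
        connect_via=IntegrationRuntimeReference(reference_name="SelfHostedIR"))))

# Azure Blob Storage used for staging.
adf.linked_services.create_or_update(rg, factory, "BlobStagingLS", LinkedServiceResource(
    properties=AzureBlobStorageLinkedService(
        connection_string=SecureString(value="DefaultEndpointsProtocol=https;AccountName=<acct>;AccountKey=<key>"))))

# Azure SQL Database sink.
adf.linked_services.create_or_update(rg, factory, "AzureSqlLS", LinkedServiceResource(
    properties=AzureSqlDatabaseLinkedService(
        connection_string=SecureString(value="Server=tcp:<server>.database.windows.net;Database=<db>;User ID=<user>;Password=<pwd>;"))))
```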

References:

https://docs.microsoft.com/en-us/azure/machine-learning/team-data-science-process/move-sql-azure-adf

Question No.36

You manage a solution that uses Azure HDInsight clusters.

You need to implement a solution to monitor cluster performance and status. Which technology should you use?

A. Azure HDInsight .NET SDK

B. Azure HDInsight REST API

C. Ambari REST API

D. Azure Log Analytics

E. Ambari Web UI

Correct Answer: E

Explanation:

Ambari is the recommended tool for monitoring utilization across the whole cluster. The Ambari dashboard shows easily glanceable widgets that display metrics such as CPU, network, YARN memory, and HDFS disk usage. The specific metrics shown depend on cluster type. The "Hosts" tab shows metrics for individual nodes so you can ensure the load on your cluster is evenly distributed.

The Apache Ambari project is aimed at making Hadoop management simpler by developing software for provisioning, managing, and monitoring Apache Hadoop clusters. Ambari provides an intuitive, easy-to-use Hadoop management web UI backed by its RESTful APIs.
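To make the REST side concrete, here is a small Python sketch that queries the Ambari API exposed by an HDInsight cluster; the cluster name and the cluster login password are placeholders.

```python
# Small sketch: query the Ambari REST API of an HDInsight cluster.
# HDInsight exposes Ambari at https://<clustername>.azurehdinsight.net behind the cluster
# login credentials; the cluster name and password below are placeholders.
import requests

CLUSTER = "<clustername>"
AUTH = ("admin", "<cluster-login-password>")
BASE = f"https://{CLUSTER}.azurehdinsight.net/api/v1/clusters/{CLUSTER}"

# Overall cluster health report.
health = requests.get(f"{BASE}?fields=Clusters/health_report", auth=AUTH)
print(health.json())

# Per-host status and metrics, to check that load is evenly distributed.
hosts = requests.get(f"{BASE}/hosts?fields=Hosts/host_status,metrics/cpu,metrics/memory", auth=AUTH)
for host in hosts.json().get("items", []):
    print(host["Hosts"]["host_name"], host["Hosts"].get("host_status"))
```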

References:

https://azure.microsoft.com/en-us/blog/monitoring-on-hdinsight-part-1-an-overview/

https://ambari.apache.org/

Question No.37

A company plans to use Azure SQL Database to support a mission-critical application.

The application must be highly available without performance degradation during maintenance windows.

You need to implement the solution.

Which three technologies should you implement? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

A. Premium service tier

B. Virtual machine Scale Sets

C. Basic service tier

D. SQL Data Sync

E. Always On availability groups

F. Zone-redundant configuration

Correct Answer: AEF

Explanation:

The Premium/Business Critical service tier model is based on a cluster of database engine processes. This architectural model relies on the fact that there is always a quorum of available database engine nodes, and it has minimal performance impact on your workload even during maintenance activities.

In the Premium model, Azure SQL Database integrates compute and storage on a single node. High availability in this architectural model is achieved by replicating compute (the SQL Server Database Engine process) and storage (locally attached SSD) within a four-node cluster, using technology similar to SQL Server Always On availability groups.

Zone redundant configuration

By default, the quorum-set replicas for the local storage configurations are created in the same datacenter.

With the introduction of Azure Availability Zones, you can place the different replicas in the quorum sets into different availability zones in the same region. To eliminate a single point of failure, the control ring is also duplicated across multiple zones as three gateway rings (GW).
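A hedged sketch of how the two configurable pieces, the Premium service tier and zone redundancy, could be provisioned with the azure-mgmt-sql Python SDK; all resource names and the region are placeholders, and the begin_ method prefix applies to recent SDK versions.

```python
# Hedged sketch: create (or update) the database in the Premium tier with zone redundancy
# using azure-mgmt-sql. Subscription, resource group, server, database, and region are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.sql.models import Database, Sku

client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

client.databases.begin_create_or_update(
    resource_group_name="<resource-group>",
    server_name="<sql-server>",
    database_name="mission-critical-db",
    parameters=Database(
        location="<region>",
        sku=Sku(name="P1", tier="Premium"),  # Premium service tier (answer A)
        zone_redundant=True,                 # spread replicas across availability zones (answer F)
    ),
).result()
```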

References:

https://docs.microsoft.com/en-us/azure/sql-database/sql-database-high-availability

Question No.38

Your company uses several Azure HDInsight clusters.

The data engineering team reports several errors with some applications that use these clusters. You need to recommend a solution to review the health of the clusters.

What should you include in your recommendation?

A. Azure Automation

B. Log Analytics

C. Application Insights

D. None of the above

Correct Answer: B
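For context, a hedged Python sketch of querying a Log Analytics workspace that the HDInsight clusters report into (via the Azure Monitor integration), using the azure-monitor-query package; the workspace ID is a placeholder and the Heartbeat query is just an illustrative health check.

```python
# Hedged sketch: query a Log Analytics workspace for cluster-node health.
# Assumes the HDInsight clusters already send logs to this workspace; the workspace ID is a placeholder.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Latest heartbeat per node over the past day.
query = "Heartbeat | summarize LastSeen = max(TimeGenerated) by Computer | order by LastSeen desc"
response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",
    query=query,
    timespan=timedelta(days=1),
)
for table in response.tables:
    for row in table.rows:
        print(list(row))
```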

Question No.39

A company has a Microsoft Azure HDInsight solution that uses different cluster types to process and analyze data. Operations are continuous.

Reports indicate slowdowns during a specific time window.

You need to determine a monitoring solution to track down the issue in the least amount of time. What should you use?

A. Azure Log Analytics log search query

B. Ambari REST API

C. Azure Monitor Metrics

D. HDInsight .NET SDK

E. Azure Log Analytics alert rule query

Correct Answer: B

Explanation:

Ambari is the recommended tool for monitoring the health of any given HDInsight cluster.

Note:

Azure HDInsight is a high-availability service that has redundant gateway nodes, head nodes, and ZooKeeper nodes to keep your HDInsight clusters running smoothly. While this ensures that a single failure will not affect the functionality of a cluster, you may still want to monitor cluster health so you are alerted when an issue does arise. Monitoring cluster health refers to monitoring whether all nodes in your cluster and the components that run on them are available and functioning correctly.

Ambari is the recommended tool for monitoring utilization across the whole cluster. The Ambari dashboard shows easily glanceable widgets that display metrics such as CPU, network, YARN memory, and HDFS disk usage. The specific metrics shown depend on cluster type. The "Hosts" tab shows metrics for individual nodes so you can ensure the load on your cluster is evenly distributed.

References:

https://azure.microsoft.com/en-us/blog/monitoring-on-hdinsight-part-1-an-overview/

Question No.40

HOTSPOT

A company runs Microsoft Dynamics CRM with Microsoft SQL Server on-premises. SQL Server Integration Services (SSIS) packages extract data from Dynamics CRM APIs, and load the data into a SQL Server data warehouse.

The datacenter is running out of capacity. Because of the network configuration, you must extract on-premises data to the cloud over HTTPS. You cannot open any additional ports. The solution must require the least amount of implementation effort.

You need to create the pipeline system.

Which component should you use? To answer, select the appropriate technology in the dialog box in the answer area.

NOTE: Each correct selection is worth one point.

[Image: answer area omitted]

Correct Answer:

[Image: correct selection omitted]

Get Full Version of the Exam
DP-200 Dumps
DP-200 VCE and PDF