Exam Questions Flashcards
You have an Azure subscription that contains a custom application named Application1. Application1 was developed by an external company named Fabrikam,
Ltd. Developers at Fabrikam were assigned role-based access control (RBAC) permissions to the Application1 components. All users are licensed for the
Microsoft 365 E5 plan.
You need to recommend a solution to verify whether the Fabrikam developers still require permissions to Application1. The solution must meet the following requirements:
✑ To the manager of the developers, send a monthly email message that lists the access permissions to Application1.
✑ If the manager does not verify an access permission, automatically revoke that permission.
✑ Minimize development effort.
What should you recommend?
A. In Azure Active Directory (Azure AD), create an access review of Application1.
B. Create an Azure Automation runbook that runs the Get-AzRoleAssignment cmdlet.
C. In Azure Active Directory (Azure AD) Privileged Identity Management, create a custom role assignment for the Application1 resources.
D. Create an Azure Automation runbook that runs the Get-AzureADUserAppRoleAssignment cmdlet.
Correct Answer: A
Recommendation: A. In Azure Active Directory (Azure AD), create an access review of Application1.
Explanation:
* Access reviews are designed specifically for this purpose: periodically evaluating access permissions and requiring approval to maintain them.
* Automatic revocation: Access reviews can be configured to automatically revoke permissions if not verified by the manager.
* Minimal development effort: Access reviews are a built-in Azure AD feature, requiring minimal configuration and no custom development.
* Monthly email reports: Access reviews can be scheduled to send email notifications to the manager with a list of permissions to review.
Comparison to other options:
* B. Azure Automation runbook: While this option could technically be used, it would require significant development effort to create the script, send emails, and manage access revocations.
* C. Privileged Identity Management (PIM): PIM is primarily for managing privileged roles and doesn’t fit the requirement of reviewing all access permissions.
* D. Get-AzureADUserAppRoleAssignment cmdlet: Similar to option B, this would require custom scripting and development effort.
Therefore, creating an access review in Azure AD is the most efficient and effective solution to meet the given requirements.
You have an Azure subscription. The subscription has a blob container that contains multiple blobs.
Ten users in the finance department of your company plan to access the blobs during the month of April.
You need to recommend a solution to enable access to the blobs during the month of April only.
Which security solution should you include in the recommendation?
A. shared access signatures (SAS)
B. Conditional Access policies
C. certificates
D. access keys
Correct Answer: A
Shared access signatures (SAS) provide limited-time, fine-grained access control to resources. You can generate a URL, set its validity period to the month of April, and distribute the URL to the ten team members. On May 1, the SAS token is automatically invalidated, denying the team members continued access.
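A minimal sketch of generating such a SAS with the Azure SDK for Python (azure-storage-blob); the account, key, and container names are placeholders:

```python
# Minimal sketch: generate a container-level SAS that is valid only for April.
from datetime import datetime, timezone
from azure.storage.blob import generate_container_sas, ContainerSasPermissions

ACCOUNT_NAME = "mystorageaccount"   # hypothetical
ACCOUNT_KEY = "<account-key>"       # hypothetical
CONTAINER = "finance-blobs"         # hypothetical

sas_token = generate_container_sas(
    account_name=ACCOUNT_NAME,
    container_name=CONTAINER,
    account_key=ACCOUNT_KEY,
    permission=ContainerSasPermissions(read=True, list=True),
    start=datetime(2024, 4, 1, tzinfo=timezone.utc),
    expiry=datetime(2024, 5, 1, tzinfo=timezone.utc),  # invalid from May 1
)

# Distribute this URL to the ten finance users.
url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER}?{sas_token}"
print(url)
```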
You have an Azure Active Directory (Azure AD) tenant that syncs with an on-premises Active Directory domain.
You have an internal web app named WebApp1 that is hosted on-premises. WebApp1 uses Integrated Windows authentication.
Some users work remotely and do NOT have VPN access to the on-premises network.
You need to provide the remote users with single sign-on (SSO) access to WebApp1.
Which two features should you include in the solution? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Azure AD Application Proxy
B. Azure AD Privileged Identity Management (PIM)
C. Conditional Access policies
D. Azure Arc
E. Azure AD enterprise applications
F. Azure Application Gateway
Correct Answer: AE
A: Application Proxy is a feature of Azure AD that enables users to access on-premises web applications from a remote client. Application Proxy includes both the
Application Proxy service which runs in the cloud, and the Application Proxy connector which runs on an on-premises server.
You can configure single sign-on to an Application Proxy application.
E: Add an on-premises app to Azure AD
Now that you’ve prepared your environment and installed a connector, you’re ready to add on-premises applications to Azure AD.
1. Sign in as an administrator in the Azure portal.
2. In the left navigation panel, select Azure Active Directory.
3. Select Enterprise applications, and then select New application.
4. Select the Add an on-premises application button, which appears about halfway down the page in the On-premises applications section. Alternatively, you can select Create your own application at the top of the page and then select Configure Application Proxy for secure remote access to an on-premises application.
5. In the Add your own on-premises application section, provide the following information about your application.
6. Etc.
Incorrect:
Not C: Conditional Access policies are not required.
You have an Azure Active Directory (Azure AD) tenant named contoso.com that has a security group named Group1. Group1 is configured for assigned membership. Group1 has 50 members, including 20 guest users.
You need to recommend a solution for evaluating the membership of Group1. The solution must meet the following requirements:
✑ The evaluation must be repeated automatically every three months.
✑ Every member must be able to report whether they need to be in Group1.
✑ Users who report that they do not need to be in Group1 must be removed from Group1 automatically.
✑ Users who do not report whether they need to be in Group1 must be removed from Group1 automatically.
What should you include in the recommendation?
A. Implement Azure AD Identity Protection.
B. Change the Membership type of Group1 to Dynamic User.
C. Create an access review.
D. Implement Azure AD Privileged Identity Management (PIM).
Correct Answer: C
Azure Active Directory (Azure AD) access reviews enable organizations to efficiently manage group memberships, access to enterprise applications, and role assignments. Users' access can be reviewed on a regular basis to make sure only the right people have continued access.
You plan to deploy Azure Databricks to support a machine learning application. Data engineers will mount an Azure Data Lake Storage account to the Databricks file system. Permissions to folders are granted directly to the data engineers.
You need to recommend a design for the planned Databricks deployment. The solution must meet the following requirements:
✑ Ensure that the data engineers can only access folders to which they have permissions.
✑ Minimize development effort.
✑ Minimize costs.
What should you include in the recommendation? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Box 1: Premium -
A premium Databricks SKU is required for credential passthrough.
Box 2: Credential passthrough - (Note: credential passthrough is deprecated in favor of Unity Catalog; in that case, the Standard Databricks SKU can be used.)
Authenticate automatically to Azure Data Lake Storage Gen1 (ADLS Gen1) and Azure Data Lake Storage Gen2 (ADLS Gen2) from Azure Databricks clusters using the same Azure Active Directory (Azure AD) identity that you use to log in to Azure Databricks. When you enable Azure Data Lake Storage credential passthrough for your cluster, commands that you run on that cluster can read and write data in Azure Data Lake Storage without requiring you to configure service principal credentials for access to storage.
A company named Contoso, Ltd. has an Azure Active Directory (Azure AD) tenant that is integrated with Microsoft 365 and an Azure subscription.
Contoso has an on-premises identity infrastructure. The infrastructure includes servers that run Active Directory Domain Services (AD DS) and Azure AD Connect.
Contoso has a partnership with a company named Fabrikam, Inc. Fabrikam has an Active Directory forest and a Microsoft 365 tenant. Fabrikam has the same on-premises identity infrastructure components as Contoso.
A team of 10 developers from Fabrikam will work on an Azure solution that will be hosted in the Azure subscription of Contoso. The developers must be added to the Contributor role for a resource group in the Contoso subscription.
You need to recommend a solution to ensure that Contoso can assign the role to the 10 Fabrikam developers. The solution must ensure that the Fabrikam developers use their existing credentials to access resources.
What should you recommend?
A. In the Azure AD tenant of Contoso, create cloud-only user accounts for the Fabrikam developers.
B. Configure a forest trust between the on-premises Active Directory forests of Contoso and Fabrikam.
C. Configure an organization relationship between the Microsoft 365 tenants of Fabrikam and Contoso.
D. In the Azure AD tenant of Contoso, create guest accounts for the Fabrikam developers.
Correct Answer: D
You can use the capabilities in Azure Active Directory B2B to collaborate with external guest users and you can use Azure RBAC to grant just the permissions that guest users need in your environment.
Incorrect:
Not B: A forest trust links on-premises Active Directory forests; it does not enable Azure AD authentication or Azure RBAC role assignments.
You plan to deploy an Azure web app named App1 that will use Azure Active Directory (Azure AD) authentication.
App1 will be accessed from the internet by the users at your company. All the users have computers that run Windows 10 and are joined to Azure AD.
You need to recommend a solution to ensure that the users can connect to App1 without being prompted for authentication and can access App1 only from company-owned computers.
What should you recommend for each requirement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Box 1: An Azure AD app registration
Azure Active Directory (Azure AD) provides cloud-based directory and identity management services. You can use Azure AD to manage the users of your application and to authenticate access to your applications.
You register your application with your Azure AD tenant.
Box 2: A conditional access policy
Conditional Access policies at their simplest are if-then statements, if a user wants to access a resource, then they must complete an action.
By using Conditional Access policies, you can apply the right access controls when needed to keep your organization secure and stay out of your users' way when not needed.
Your company deploys several virtual machines on-premises and to Azure. ExpressRoute is deployed and configured for on-premises to Azure connectivity.
Several virtual machines exhibit network connectivity issues.
You need to analyze the network traffic to identify whether packets are being allowed or denied to the virtual machines.
Solution: Use Azure Traffic Analytics in Azure Network Watcher to analyze the network traffic.
Does this meet the goal?
A. Yes
B. No
Correct Answer: B
Instead use Azure Network Watcher IP Flow Verify, which allows you to detect traffic filtering issues at a VM level.
Note: IP flow verify checks if a packet is allowed or denied to or from a virtual machine. The information consists of direction, protocol, local IP, remote IP, local port, and remote port. If the packet is denied by a security group, the name of the rule that denied the packet is returned. While any source or destination IP can be chosen, IP flow verify helps administrators quickly diagnose connectivity issues from or to the internet and from or to the on-premises environment.
Your company deploys several virtual machines on-premises and to Azure. ExpressRoute is deployed and configured for on-premises to Azure connectivity.
Several virtual machines exhibit network connectivity issues.
You need to analyze the network traffic to identify whether packets are being allowed or denied to the virtual machines.
Solution: Use Azure Advisor to analyze the network traffic.
Does this meet the goal?
A. Yes
B. No
Correct Answer: B
Instead use Azure Network Watcher IP Flow Verify, which allows you to detect traffic filtering issues at a VM level.
Note: IP flow verify checks if a packet is allowed or denied to or from a virtual machine. The information consists of direction, protocol, local IP, remote IP, local port, and remote port. If the packet is denied by a security group, the name of the rule that denied the packet is returned. While any source or destination IP can be chosen, IP flow verify helps administrators quickly diagnose connectivity issues from or to the internet and from or to the on-premises environment.
Your company deploys several virtual machines on-premises and to Azure. ExpressRoute is deployed and configured for on-premises to Azure connectivity.
Several virtual machines exhibit network connectivity issues.
You need to analyze the network traffic to identify whether packets are being allowed or denied to the virtual machines.
Solution: Use Azure Network Watcher to run IP flow verify to analyze the network traffic.
Does this meet the goal?
A. Yes
B. No
Correct Answer: A
Azure Network Watcher IP Flow Verify allows you to detect traffic filtering issues at a VM level.
IP flow verify checks if a packet is allowed or denied to or from a virtual machine. The information consists of direction, protocol, local IP, remote IP, local port, and remote port. If the packet is denied by a security group, the name of the rule that denied the packet is returned. While any source or destination IP can be chosen,
IP flow verify helps administrators quickly diagnose connectivity issues from or to the internet and from or to the on-premises environment.
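A minimal sketch of running IP flow verify programmatically with the azure-mgmt-network SDK; every resource name, ID, and IP below is a placeholder:

```python
# Minimal sketch: run IP flow verify against a VM via the Network Watcher API.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import VerificationIPFlowParameters

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

result = client.network_watchers.begin_verify_ip_flow(
    "NetworkWatcherRG",            # resource group of the Network Watcher (hypothetical)
    "NetworkWatcher_eastus",       # Network Watcher instance name (hypothetical)
    VerificationIPFlowParameters(
        target_resource_id="/subscriptions/<sub>/resourceGroups/rg1/providers/Microsoft.Compute/virtualMachines/vm1",
        direction="Inbound",
        protocol="TCP",
        local_ip_address="10.0.0.4",       # the VM's private IP
        local_port="3389",
        remote_ip_address="203.0.113.10",  # the on-premises source
        remote_port="60000",
    ),
).result()

print(result.access, result.rule_name)  # e.g. "Deny" plus the NSG rule that denied it
```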
You have an Azure subscription. The subscription contains Azure virtual machines that run Windows Server 2016 and Linux.
You need to use Azure Monitor to design an alerting strategy for security-related events.
Which Azure Monitor Logs tables should you query? To answer, drag the appropriate tables to the correct log types. Each table may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:
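A minimal sketch of querying the tables commonly cited as the answer here (the Event table for Windows event-log records, the Syslog table for Linux), using azure-monitor-query; the workspace ID is a placeholder:

```python
# Minimal sketch, assuming the commonly cited mapping: Event holds Windows
# event-log records and Syslog holds Linux syslog records.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

for table in ("Event", "Syslog"):
    # Pull a sample of recent records from each table.
    response = client.query_workspace(
        "<workspace-id>",
        f"{table} | take 10",
        timespan=timedelta(days=1),
    )
    print(table, len(response.tables[0].rows))
```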
You are designing a large Azure environment that will contain many subscriptions.
You plan to use Azure Policy as part of a governance solution.
To which three scopes can you assign Azure Policy definitions? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. Azure Active Directory (Azure AD) administrative units
B. Azure Active Directory (Azure AD) tenants
C. subscriptions
D. compute resources
E. resource groups
F. management groups
Correct Answer: CEF
Azure Policy evaluates resources in Azure by comparing the properties of those resources to business rules. Once your business rules have been formed, the policy definition or initiative is assigned to any scope of resources that Azure supports, such as management groups, subscriptions, resource groups, or individual resources.
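A minimal sketch of assigning one policy definition at all three valid scopes with the azure-mgmt-resource SDK; the subscription, management group, and definition IDs are placeholders:

```python
# Minimal sketch: the same policy definition assigned at a management group,
# a subscription, and a resource group.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import PolicyClient
from azure.mgmt.resource.policy.models import PolicyAssignment

client = PolicyClient(DefaultAzureCredential(), "<subscription-id>")

scopes = [
    "/providers/Microsoft.Management/managementGroups/mg1",
    "/subscriptions/<subscription-id>",
    "/subscriptions/<subscription-id>/resourceGroups/rg1",
]
for i, scope in enumerate(scopes):
    client.policy_assignments.create(
        scope=scope,
        policy_assignment_name=f"require-tags-{i}",
        parameters=PolicyAssignment(
            policy_definition_id="/providers/Microsoft.Authorization/policyDefinitions/<definition-id>",
        ),
    )
```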
Your on-premises network contains a server named Server1 that runs an ASP.NET application named App1.
You have a hybrid deployment of Azure Active Directory (Azure AD).
You need to recommend a solution to ensure that users sign in by using their Azure AD account and Azure Multi-Factor Authentication (MFA) when they connect to App1 from the internet.
Which three features should you recommend be deployed and configured in sequence? To answer, move the appropriate features from the list of features to the answer area and arrange them in the correct order.
Select and Place: Answer Area
Step 1: Azure AD Application Proxy
Start by enabling communication to Azure data centers to prepare your environment for Azure AD Application Proxy.
Step 2: an Azure AD enterprise application
Add an on-premises app to Azure AD.
Now that you’ve prepared your environment and installed a connector, you’re ready to add on-premises applications to Azure AD.
1. Sign in as an administrator in the Azure portal.
2. In the left navigation panel, select Azure Active Directory.
3. Select Enterprise applications, and then select New application.
4. Etc.
Step 3: Set up a Conditional Access policy to enforce MFA.
You need to recommend a solution to generate a monthly report of all the new Azure Resource Manager (ARM) resource deployments in your Azure subscription.
What should you include in the recommendation?
A. Azure Activity Log
B. Azure Advisor
C. Azure Analysis Services
D. Azure Monitor action groups
Correct Answer: A
Activity logs are kept for 90 days. You can query for any range of dates, as long as the starting date isn’t more than 90 days in the past.
Through activity logs, you can determine:
✑ what operations were taken on the resources in your subscription
✑ who started the operation
✑ when the operation occurred
✑ the status of the operation
✑ the values of other properties that might help you research the operation
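A minimal sketch of the monthly report query using the azure-mgmt-monitor SDK; the subscription ID and date window are placeholders:

```python
# Minimal sketch: pull one month of Activity Log events and keep successful
# resource-creation ("write") operations.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient

client = MonitorManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Activity logs are retained for 90 days, so a one-month window is queryable.
flt = ("eventTimestamp ge '2024-04-01T00:00:00Z' and "
       "eventTimestamp le '2024-04-30T23:59:59Z'")

for event in client.activity_logs.list(filter=flt):
    if event.operation_name.value.endswith("/write") and event.status.value == "Succeeded":
        print(event.event_timestamp, event.resource_id, event.caller)
```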
Your company deploys several virtual machines on-premises and to Azure. ExpressRoute is deployed and configured for on-premises to Azure connectivity.
Several virtual machines exhibit network connectivity issues.
You need to analyze the network traffic to identify whether packets are being allowed or denied to the virtual machines.
Solution: Install and configure the Azure Monitoring agent and the Dependency Agent on all the virtual machines. Use VM insights in Azure Monitor to analyze the network traffic.
Does this meet the goal?
A. Yes
B. No
Correct Answer: B
Use the Azure Monitor agent if you need to:
Collect guest logs and metrics from any machine in Azure, in other clouds, or on-premises.
Use the Dependency agent if you need to:
Use the Map feature VM insights or the Service Map solution.
Note: Instead, use Azure Network Watcher IP flow verify, which allows you to detect traffic filtering issues at a VM level.
IP flow verify checks if a packet is allowed or denied to or from a virtual machine. The information consists of direction, protocol, local IP, remote IP, local port, and remote port. If the packet is denied by a security group, the name of the rule that denied the packet is returned. While any source or destination IP can be chosen,
IP flow verify helps administrators quickly diagnose connectivity issues from or to the internet and from or to the on-premises environment.
DRAG DROP -
You need to design an architecture to capture the creation of users and the assignment of roles. The captured data must be stored in Azure Cosmos DB.
Which services should you include in the design? To answer, drag the appropriate services to the correct targets. Each service may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Box 1: Azure Event Hubs -
You can route Azure Active Directory (Azure AD) activity logs to several endpoints for long term retention and data insights.
The Event Hub is used for streaming.
Box 2: Azure Function -
Use an Azure Function with an Event Hubs trigger to process the streamed log events and store the data in Cosmos DB, as sketched below.
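A minimal sketch of that function, assuming the Azure Functions v2 Python programming model and the azure-cosmos SDK; the hub, database, container, and connection names are placeholders:

```python
# Minimal sketch (not the definitive design): an Event Hubs-triggered function
# that stores streamed activity-log records in Cosmos DB.
import json
import uuid

import azure.functions as func
from azure.cosmos import CosmosClient

app = func.FunctionApp()
container = (
    CosmosClient.from_connection_string("<cosmos-connection-string>")
    .get_database_client("auditlogs")
    .get_container_client("events")
)

@app.event_hub_message_trigger(
    arg_name="event",
    event_hub_name="insights-activity-logs",  # hub the Azure AD logs stream to (hypothetical)
    connection="EVENTHUB_CONNECTION",         # app setting holding the connection string
)
def store_activity_log(event: func.EventHubEvent):
    payload = json.loads(event.get_body())
    # Exported logs arrive as a JSON envelope with a "records" array.
    for record in payload.get("records", [payload]):
        record.setdefault("id", str(uuid.uuid4()))  # Cosmos DB items need an "id"
        container.upsert_item(record)
```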
Your company, named Contoso, Ltd., implements several Azure logic apps that have HTTP triggers. The logic apps provide access to an on-premises web service.
Contoso establishes a partnership with another company named Fabrikam, Inc.
Fabrikam does not have an existing Azure Active Directory (Azure AD) tenant and uses third-party OAuth 2.0 identity management to authenticate its users.
Developers at Fabrikam plan to use a subset of the logic apps to build applications that will integrate with the on-premises web service of Contoso.
You need to design a solution to provide the Fabrikam developers with access to the logic apps. The solution must meet the following requirements:
✑ Requests to the logic apps from the developers must be limited to lower rates than the requests from the users at Contoso.
✑ The developers must be able to rely on their existing OAuth 2.0 provider to gain access to the logic apps.
✑ The solution must NOT require changes to the logic apps.
✑ The solution must NOT use Azure AD guest accounts.
What should you include in the solution?
A. Azure Front Door
B. Azure AD Application Proxy
C. Azure AD business-to-business (B2B)
D. Azure API Management
Correct Answer: D
The best solution to provide Fabrikam developers with access to the logic apps while meeting the given requirements is:
D. Azure API Management
Here’s why:
- Rate limiting: Azure API Management allows you to set rate limits for different API consumers, ensuring that requests from Fabrikam developers are limited to lower rates than those from Contoso users.
- OAuth 2.0 integration: Azure API Management supports integration with various identity providers, including third-party OAuth 2.0 providers. This means Fabrikam developers can use their existing OAuth 2.0 provider to authenticate and gain access to the logic apps.
- No changes to logic apps: Azure API Management acts as a gateway, handling authentication, authorization, and rate limiting without requiring any modifications to the existing logic apps.
- No Azure AD guest accounts: The solution relies on the existing OAuth 2.0 provider, eliminating the need for Azure AD guest accounts.
While Azure Front Door and Azure AD Application Proxy can be used for other purposes, they do not directly address the specific requirements of this scenario. Azure AD B2B is not suitable because it involves creating guest accounts in Azure AD, which is explicitly prohibited in the requirements.
Therefore, Azure API Management is the most appropriate solution to provide Fabrikam developers with access to the logic apps while meeting the given constraints.
You have an Azure subscription that contains 300 virtual machines that run Windows Server 2019.
You need to centrally monitor all warning events in the System logs of the virtual machines.
What should you include in the solution? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Box 1: A Log Analytics workspace
Send resource logs to a Log Analytics workspace to enable the features of Azure Monitor Logs.
You must create a diagnostic setting for each Azure resource to send its resource logs to a Log Analytics workspace to use with Azure Monitor Logs.
Box 2: Install the Azure Monitor agent
Use the Azure Monitor agent if you need to:
Collect guest logs and metrics from any machine in Azure, in other clouds, or on-premises.
Manage data collection configuration centrally
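A minimal sketch of the kind of query such an alert rule would run, using azure-monitor-query; the workspace ID is a placeholder:

```python
# Minimal sketch: warning-level entries from the System log in the Event table.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
Event
| where EventLog == "System" and EventLevelName == "Warning"
| summarize count() by Computer
"""
response = client.query_workspace("<workspace-id>", query, timespan=timedelta(hours=1))
for row in response.tables[0].rows:
    print(row)
```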
You have several Azure App Service web apps that use Azure Key Vault to store data encryption keys.
Several departments have the following requests to support the web apps:
Which service should you recommend for each department’s request? To answer, configure the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Box 1: Azure AD Privileged Identity Management
Privileged Identity Management provides time-based and approval-based role activation to mitigate the risks of excessive, unnecessary, or misused access permissions on resources that you care about. Here are some of the key features of Privileged Identity Management:
Provide just-in-time privileged access to Azure AD and Azure resources
Assign time-bound access to resources using start and end dates
Require approval to activate privileged roles
Enforce multi-factor authentication to activate any role
Use justification to understand why users activate
Get notifications when privileged roles are activated
Conduct access reviews to ensure users still need roles
Download audit history for internal or external audit
Prevent removal of the last active Global Administrator role assignment
Box 2: Azure Managed Identity -
Managed identities provide an identity for applications to use when connecting to resources that support Azure Active Directory (Azure AD) authentication.
Applications may use the managed identity to obtain Azure AD tokens. With Azure Key Vault, developers can use managed identities to access resources. Key
Vault stores credentials in a secure manner and gives access to storage accounts.
Box 3: Azure AD Privileged Identity Management
Privileged Identity Management provides time-based and approval-based role activation to mitigate the risks of excessive, unnecessary, or misused access permissions on resources that you care about. Here are some of the key features of Privileged Identity Management:
Provide just-in-time privileged access to Azure AD and Azure resources
Assign time-bound access to resources using start and end dates
Your company has the divisions shown in the following table
You plan to deploy a custom application to each subscription. The application will contain the following:
✑ A resource group
✑ An Azure web app
✑ Custom role assignments
✑ An Azure Cosmos DB account
You need to use Azure Blueprints to deploy the application to each subscription.
What is the minimum number of objects required to deploy the application? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Box 1: 2 -
One management group for each Azure AD tenant
Azure management groups provide a level of scope above subscriptions.
All subscriptions within a management group automatically inherit the conditions applied to the management group.
All subscriptions within a single management group must trust the same Azure Active Directory tenant.
Box 2: 1 -
One single blueprint definition can be assigned to different existing management groups or subscriptions.
When creating a blueprint definition, you’ll define where the blueprint is saved. Blueprints can be saved to a management group or subscription that you have
Contributor access to. If the location is a management group, the blueprint is available to assign to any child subscription of that management group.
Box 3: 2 -
Blueprint assignment -
Each Published Version of a blueprint can be assigned (with a max name length of 90 characters) to an existing management group or subscription.
Assigning a blueprint definition to a management group means the assignment object exists at the management group. The deployment of artifacts still targets a subscription.
You need to design an Azure policy that will implement the following functionality:
✑ For new resources, assign tags and values that match the tags and values of the resource group to which the resources are deployed.
✑ For existing resources, identify whether the tags and values match the tags and values of the resource group that contains the resources.
✑ For any non-compliant resources, trigger auto-generated remediation tasks to create missing tags and values.
The solution must use the principle of least privilege.
What should you include in the design? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Box 1: Modify -
Modify is used to add, update, or remove properties or tags on a subscription or resource during creation or update. A common example is updating tags on resources such as costCenter. Existing non-compliant resources can be remediated with a remediation task. A single Modify rule can have any number of operations. Policy assignments with effect set as Modify require a managed identity to do remediation.
Incorrect:
* The following effects are deprecated: EnforceOPAConstraint EnforceRegoPolicy
* Append is used to add additional fields to the requested resource during creation or update. A common example is specifying allowed IPs for a storage resource.
Append is intended for use with non-tag properties. While Append can add tags to a resource during a create or update request, it’s recommended to use the
Modify effect for tags instead.
Box 2: A managed identity with the Contributor role
The managed identity must be granted the roles required to remediate the non-compliant resources.
Contributor - Can create and manage all types of Azure resources but can’t grant access to others.
Incorrect:
User Access Administrator: lets you manage user access to Azure resources.
You have an Azure subscription that contains the resources shown in the following table
You create an Azure SQL database named DB1 that is hosted in the East US Azure region.
To DB1, you add a diagnostic setting named Settings1. Settings1 archives SQLInsights to storage1 and sends SQLInsights to Workspace1.
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
Analyzing the Statements
Given the information provided, here’s the breakdown of the statements:
1. You can add a new diagnostic setting that archives SQLInsights logs to storage2.
- Yes. You can create a new diagnostic setting for DB1 that archives SQLInsights logs to storage2, in addition to the existing setting that archives to storage1.
2. You can add a new diagnostic setting that sends SQLInsights logs to Workspace2.
- Yes. You can create a new diagnostic setting for DB1 that sends SQLInsights logs to Workspace2, in addition to the existing setting that sends to Workspace1.
3. You can add a new diagnostic setting that sends SQLInsights logs to Hub1.
- No. Hub1 is an Azure event hub, which is primarily designed for streaming data. It is not directly suitable for storing and analyzing log data like SQLInsights. While you might be able to configure a custom pipeline to send SQLInsights data to Hub1, it is not a straightforward or recommended approach.
In summary:
- You can configure multiple diagnostic settings for a single Azure SQL database.
- You can choose different storage accounts and Log Analytics workspaces for archiving and analyzing SQLInsights logs.
- Sending SQLInsights data to an event hub (like Hub1) is not directly supported and would require custom configuration.
You plan to deploy an Azure SQL database that will store Personally Identifiable Information (PII).
You need to ensure that only privileged users can view the PII.
What should you include in the solution?
A. dynamic data masking
B. role-based access control (RBAC)
C. Data Discovery & Classification
D. Transparent Data Encryption (TDE)
Correct Answer: A
Dynamic Data Masking (DDM) is a feature in Azure SQL Database that helps you protect sensitive data by obfuscating it from non-privileged users. DDM allows you to define masking rules on specific columns, so that the data in those columns is automatically replaced with a masked value when queried by users without the appropriate permissions. This ensures that only privileged users can view the actual Personally Identifiable Information (PII), while other users will see the masked data.
You plan to deploy an app that will use an Azure Storage account.
You need to deploy the storage account. The storage account must meet the following requirements:
✑ Store the data for multiple users.
✑ Encrypt each user’s data by using a separate key.
✑ Encrypt all the data in the storage account by using customer-managed keys.
What should you deploy?
A. files in a premium file share storage account
B. blobs in a general purpose v2 storage account
C. blobs in an Azure Data Lake Storage Gen2 account
D. files in a general purpose v2 storage account
Correct Answer: B
You can specify a customer-provided key on Blob storage operations. A client making a read or write request against Blob storage can include an encryption key on the request for granular control over how blob data is encrypted and decrypted.
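A minimal sketch of per-user encryption with customer-provided keys on blob operations, using azure-storage-blob; the connection string and key handling are placeholders (a real app would fetch each user's key from a secure store):

```python
# Minimal sketch: encrypt each user's blobs with a per-user customer-provided
# key, while the account itself can use customer-managed keys.
import base64, hashlib, os
from azure.storage.blob import BlobServiceClient, CustomerProvidedEncryptionKey

service = BlobServiceClient.from_connection_string("<connection-string>")

def key_for_user(raw_key: bytes) -> CustomerProvidedEncryptionKey:
    # The service expects the AES-256 key and its SHA-256 hash, base64-encoded.
    return CustomerProvidedEncryptionKey(
        key_value=base64.b64encode(raw_key).decode(),
        key_hash=base64.b64encode(hashlib.sha256(raw_key).digest()).decode(),
    )

user_key = key_for_user(os.urandom(32))  # per-user key (persist it securely!)
blob = service.get_blob_client("data", "user1/profile.json")
blob.upload_blob(b'{"name": "user1"}', cpk=user_key, overwrite=True)
print(blob.download_blob(cpk=user_key).readall())
```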
You have an Azure App Service web app that uses a system-assigned managed identity.
You need to recommend a solution to store the settings of the web app as secrets in an Azure key vault. The solution must meet the following requirements:
✑ Minimize changes to the app code.
✑ Use the principle of least privilege.
What should you include in the recommendation? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Box 1: Key Vault references in Application settings
Source Application Settings from Key Vault.
Key Vault references can be used as values for Application Settings, allowing you to keep secrets in Key Vault instead of the site config. Application Settings are securely encrypted at rest, but if you need secret management capabilities, they should go into Key Vault.
To use a Key Vault reference for an app setting, set the reference as the value of the setting. Your app can reference the secret through its key as normal. No code changes are required.
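For example, an app setting value using the documented Key Vault reference syntax resolves to the secret at runtime (the vault and secret names below are placeholders):

```
@Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/MySecret/)
```

Your app then reads the setting as an ordinary environment variable.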
Box 2: Secrets: Get -
In order to read secrets from Key Vault, you need to have a vault created and give your app permission to access it.
1. Create a key vault by following the Key Vault quickstart.
2. Create a managed identity for your application.
3. Key Vault references will use the app’s system assigned identity by default, but you can specify a user-assigned identity.
4. Create an access policy in Key Vault for the application identity you created earlier. Enable the “Get” secret permission on this policy.
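If you prefer to read secrets in code rather than through references, a minimal sketch using azure-identity and azure-keyvault-secrets with the app's system-assigned identity (the vault and secret names are placeholders):

```python
# Minimal sketch: read one secret with the app's system-assigned identity.
from azure.identity import ManagedIdentityCredential
from azure.keyvault.secrets import SecretClient

client = SecretClient(
    vault_url="https://myvault.vault.azure.net",
    credential=ManagedIdentityCredential(),  # uses the system-assigned identity
)
secret = client.get_secret("MySecret")  # requires the "Get" secret permission
print(secret.value)
```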
You plan to deploy an application named App1 that will run on five Azure virtual machines. Additional virtual machines will be deployed later to run App1.
You need to recommend a solution to meet the following requirements for the virtual machines that will run App1:
✑ Ensure that the virtual machines can authenticate to Azure Active Directory (Azure AD) to gain access to an Azure key vault, Azure Logic Apps instances, and an Azure SQL database.
✑ Avoid assigning new roles and permissions for Azure services when you deploy additional virtual machines.
✑ Avoid storing secrets and certificates on the virtual machines.
✑ Minimize administrative effort for managing identities.
Which type of identity should you include in the recommendation?
A. a system-assigned managed identity
B. a service principal that is configured to use a certificate
C. a service principal that is configured to use a client secret
D. a user-assigned managed identity
Correct Answer: D
Managed identities provide an identity for applications to use when connecting to resources that support Azure Active Directory (Azure AD) authentication.
A user-assigned managed identity:
Can be shared.
The same user-assigned managed identity can be associated with more than one Azure resource.
Common usage:
Workloads that run on multiple resources and can share a single identity.
For example, a workload where multiple virtual machines need to access the same resource.
Incorrect:
Not A: A system-assigned managed identity can’t be shared. It can only be associated with a single Azure resource.
Typical usage:
Workloads that are contained within a single Azure resource.
Workloads for which you need independent identities.
For example, an application that runs on a single virtual machine.
You have the resources shown in the following table
CDB1 hosts a container that stores continuously updated operational data.
You are designing a solution that will use AS1 to analyze the operational data daily.
You need to recommend a solution to analyze the data without affecting the performance of the operational data store.
What should you include in the recommendation?
A. Azure Cosmos DB change feed
B. Azure Data Factory with Azure Cosmos DB and Azure Synapse Analytics connectors
C. Azure Synapse Link for Azure Cosmos DB
D. Azure Synapse Analytics with PolyBase data loading
Correct Answer: C
Azure Synapse Link for Azure Cosmos DB creates a tight integration between Azure Cosmos DB and Azure Synapse Analytics. It enables customers to run near real-time analytics over their operational data with full performance isolation from their transactional workloads and without an ETL pipeline.
You deploy several Azure SQL Database instances.
You plan to configure the Diagnostics settings on the databases as shown in the following exhibit
Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.
NOTE: Each correct selection is worth one point.
You have an application that is used by 6,000 users to validate their vacation requests. The application manages its own credential store.
Users must enter a username and password to access the application. The application does NOT support identity providers.
You plan to upgrade the application to use single sign-on (SSO) authentication by using an Azure Active Directory (Azure AD) application registration.
Which SSO method should you use?
A. header-based
B. SAML
C. password-based
D. OpenID Connect
Correct Answer: C
Password - On-premises applications can use a password-based method for SSO. This choice works when applications are configured for Application Proxy.
With password-based SSO, users sign in to the application with a username and password the first time they access it. After the first sign-on, Azure AD provides the username and password to the application. Password-based SSO enables secure application password storage and replay using a web browser extension or mobile app. This option uses the existing sign-in process provided by the application, enables an administrator to manage the passwords, and doesn’t require the user to know the password.
Incorrect:
Choosing an SSO method depends on how the application is configured for authentication. Cloud applications can use federation-based options, such as OpenID
Connect, OAuth, and SAML.
Federation - When you set up SSO to work between multiple identity providers, it’s called federation.
You have an Azure subscription that contains a virtual network named VNET1 and 10 virtual machines. The virtual machines are connected to VNET1.
You need to design a solution to manage the virtual machines from the internet. The solution must meet the following requirements:
✑ Incoming connections to the virtual machines must be authenticated by using Azure Multi-Factor Authentication (MFA) before network connectivity is allowed.
✑ Incoming connections must use TLS and connect to TCP port 443.
✑ The solution must support RDP and SSH.
What should you include in the solution? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Box 1: Just-in-time (JIT) VM access
Lock down inbound traffic to your Azure Virtual Machines with Microsoft Defender for Cloud’s just-in-time (JIT) virtual machine (VM) access feature. This reduces exposure to attacks while providing easy access when you need to connect to a VM.
Note: Threat actors actively hunt accessible machines with open management ports, like RDP or SSH. Your legitimate users also use these ports, so it’s not practical to keep them closed.
When you enable just-in-time VM access, you can select the ports on the VM to which inbound traffic will be blocked.
To solve this dilemma, Microsoft Defender for Cloud offers JIT. With JIT, you can lock down the inbound traffic to your VMs, reducing exposure to attacks while providing easy access to connect to VMs when needed.
Box 2: A conditional Access policy that has Cloud Apps assignment set to Azure Windows VM Sign-In
You can enforce Conditional Access policies such as multi-factor authentication or user sign-in risk check before authorizing access to Windows VMs in Azure that are enabled with Azure AD sign in. To apply Conditional Access policy, you must select the “Azure Windows VM Sign-In” app from the cloud apps or actions assignment option and then use Sign-in risk as a condition and/or require multi-factor authentication as a grant access control.
You are designing an Azure governance solution.
All Azure resources must be easily identifiable based on the following operational information: environment, owner, department and cost center.
You need to ensure that you can use the operational information when you generate reports for the Azure resources.
What should you include in the solution?
A. an Azure data catalog that uses the Azure REST API as a data source
B. an Azure management group that uses parent groups to create a hierarchy
C. an Azure policy that enforces tagging rules
D. Azure Active Directory (Azure AD) administrative units
Correct Answer: C
You apply tags to your Azure resources, resource groups, and subscriptions to logically organize them into a taxonomy. Each tag consists of a name and a value pair.
You use Azure Policy to enforce tagging rules and conventions. By creating a policy, you avoid the scenario of resources being deployed to your subscription that don’t have the expected tags for your organization. Instead of manually applying tags or searching for resources that aren’t compliant, you create a policy that automatically applies the needed tags during deployment.
Your company has the divisions shown in the following table
Sub1 contains an Azure App Service web app named App1. App1 uses Azure AD for single-tenant user authentication. Users from contoso.com can authenticate to App1.
You need to recommend a solution to enable users in the fabrikam.com tenant to authenticate to App1.
What should you recommend?
A. Configure the Azure AD provisioning service.
B. Enable Azure AD pass-through authentication and update the sign-in endpoint.
C. Use Azure AD entitlement management to govern external users.
D. Configure Azure AD join.
Correct Answer: C
Your company has 20 web APIs that were developed in-house.
The company is developing 10 web apps that will use the web APIs. The web apps and the APIs are registered in the company's Azure Active Directory (Azure AD) tenant. The web APIs are published by using Azure API Management.
You need to recommend a solution to block unauthorized requests originating from the web apps from reaching the web APIs. The solution must meet the following requirements:
✑ Use Azure AD-generated claims.
✑ Minimize configuration and management effort.
What should you include in the recommendation? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
The correct options to select are:
- Grant permissions to allow the web apps to access the web APIs by using: Azure AD
- Configure a JSON Web Token (JWT) validation policy by using: Azure API Management
Here’s why:
- Azure AD is the most appropriate choice for granting permissions to the web apps to access the web APIs. Azure AD provides a robust and secure mechanism for managing access control and authorization. By using Azure AD, you can leverage the built-in features and capabilities of the platform to ensure that only authorized web apps can access the web APIs.
- Azure API Management is the best option for configuring a JSON Web Token (JWT) validation policy. Azure API Management provides a centralized platform for managing and securing APIs. By configuring a JWT validation policy in Azure API Management, you can enforce authorization rules and prevent unauthorized access to the web APIs. This approach minimizes configuration and management effort, as you can manage the policy centrally rather than having to configure it in each individual web API.
The other options are not as suitable:
- Azure API Management and The web APIs are not appropriate for granting permissions to the web apps. While Azure API Management can be used to manage access control for APIs, it is not the best choice for granting permissions to web apps. The web APIs themselves are not responsible for granting permissions.
- The web APIs is not appropriate for configuring a JWT validation policy. The web APIs are designed to provide functionality, not to enforce security policies.
You need to recommend a solution to generate a monthly report of all the new Azure Resource Manager (ARM) resource deployments in your Azure subscription.
What should you include in the recommendation?
A. Azure Log Analytics
B. Azure Arc
C. Azure Analysis Services
D. Application Insights
Correct Answer: A
The Activity log is a platform log in Azure that provides insight into subscription-level events. Activity log includes such information as when a resource is modified or when a virtual machine is started.
Activity log events are retained in Azure for 90 days and then deleted.
For more functionality, you should create a diagnostic setting to send the Activity log to one or more of these locations for the following reasons:
✑ to Azure Monitor Logs for more complex querying and alerting, and longer retention (up to two years)
✑ to Azure Event Hubs to forward outside of Azure
✑ to Azure Storage for cheaper, long-term archiving
Note: Azure Monitor builds on top of Log Analytics, the platform service that gathers log and metrics data from all your resources. The easiest way to think about it is that Azure Monitor is the marketing name, whereas Log Analytics is the technology that powers it.
Your company has the divisions shown in the following table
Sub1 contains an Azure App Service web app named App1. App1 uses Azure AD for single-tenant user authentication. Users from contoso.com can authenticate to App1.
You need to recommend a solution to enable users in the fabrikam.com tenant to authenticate to App1.
What should you recommend?
A. Configure the Azure AD provisioning service.
B. Configure assignments for the fabrikam.com users by using Azure AD Privileged Identity Management (PIM).
C. Use Azure AD entitlement management to govern external users.
D. Configure Azure AD Identity Protection.
Correct Answer: C
Entitlement management is an identity governance capability that enables organizations to manage identity and access lifecycle at scale by automating access request workflows, access assignments, reviews, and expiration. Entitlement management allows delegated non-admins to create access packages that external users from other organizations can request access to. One and multi-stage approval workflows can be configured to evaluate requests, and provision users for time-limited access with recurring reviews. Entitlement management enables policy-based provisioning and deprovisioning of external accounts.
Note: Access Packages -
An access package is the foundation of entitlement management. Access packages are groupings of policy-governed resources a user needs to collaborate on a project or do other tasks. For example, an access package might include:
✑ access to specific SharePoint sites
✑ enterprise applications, including your custom in-house and SaaS apps like Salesforce
✑ Microsoft Teams
✑ Microsoft 365 Groups
Incorrect:
Not A: Automatic provisioning refers to creating user identities and roles in the cloud applications that users need access to. In addition to creating user identities, automatic provisioning includes the maintenance and removal of user identities as status or roles change.
Not B: Privileged Identity Management provides time-based and approval-based role activation to mitigate the risks of excessive, unnecessary, or misused access permissions on resources that you care about. Here are some of the key features of Privileged Identity Management:
Provide just-in-time privileged access to Azure AD and Azure resources
Assign time-bound access to resources using start and end dates
Etc.
You are developing an app that will read activity logs for an Azure subscription by using Azure Functions.
You need to recommend an authentication solution for Azure Functions. The solution must minimize administrative effort.
What should you include in the recommendation?
A. an enterprise application in Azure AD
B. system-assigned managed identities
C. shared access signatures (SAS)
D. application registration in Azure AD
Correct Answer: B
Recommendation: B. system-assigned managed identities
Explanation:
* Minimal administrative effort: Managed identities are automatically created and managed by Azure, requiring minimal configuration.
* Strong security: Managed identities provide a secure way for Azure resources to authenticate to other Azure services without exposing credentials.
* Ideal for Azure Functions: Azure Functions seamlessly integrates with managed identities, making it easy to access resources like Azure Monitor logs.
Additional Considerations:
* Enterprise application in Azure AD (A): While this option can be used for authentication, it involves more administrative overhead in creating and managing the application and assigning permissions.
* Shared access signatures (SAS) (C): SAS tokens provide temporary access to resources, but they require careful management and rotation to maintain security.
* Application registration in Azure AD (D): Similar to enterprise applications, this option also involves additional administrative tasks.
Therefore, system-assigned managed identities offer the best balance of security and minimal administrative effort for this scenario.
Your company has the divisions shown in the following table
Sub1 contains an Azure App Service web app named App1. App1 uses Azure AD for single-tenant user authentication. Users from contoso.com can authenticate to App1.
You need to recommend a solution to enable users in the fabrikam.com tenant to authenticate to App1.
What should you recommend?
A. Configure Azure AD join.
B. Use Azure AD entitlement management to govern external users.
C. Enable Azure AD pass-through authentication and update the sign-in endpoint.
D. Configure assignments for the fabrikam.com users by using Azure AD Privileged Identity Management (PIM).
Correct Answer: B
What can I do with entitlement management?
Here are some of the capabilities of entitlement management:
- Select connected organizations whose users can request access. When a user who isn’t yet in your directory requests access, and is approved, they’re automatically invited into your directory and assigned access. When their access expires, if they have no other access package assignments, their B2B account in your directory can be automatically removed.
Your company has the divisions shown in the following table
Sub1 contains an Azure App Service web app named App1. App1 uses Azure AD for single-tenant user authentication. Users from contoso.com can authenticate to App1.
You need to recommend a solution to enable users in the fabrikam.com tenant to authenticate to App1.
What should you recommend?
A. Configure Azure AD join.
B. Configure Azure AD Identity Protection.
C. Use Azure AD entitlement management to govern external users.
D. Configure assignments for the fabrikam.com users by using Azure AD Privileged Identity Management (PIM).
Correct Answer: C
What can I do with entitlement management?
Here are some of the capabilities of entitlement management:
- Select connected organizations whose users can request access. When a user who isn’t yet in your directory requests access, and is approved, they’re automatically invited into your directory and assigned access. When their access expires, if they have no other access package assignments, their B2B account in your directory can be automatically removed.
You need to recommend a solution to generate a monthly report of all the new Azure Resource Manager (ARM) resource deployments in your Azure subscription.
What should you include in the recommendation?
A. Azure Activity Log
B. Azure Arc
C. Azure Analysis Services
D. Azure Monitor metrics
Correct Answer: A
This question has two variants up to this point:
If you don't see Log Analytics workspace in the answer choices, choose Azure Activity Log. If you don't see Activity Log, choose Log Analytics.
Azure Activity Log provides insights into subscription-level events that have occurred in your Azure account. It includes information about resource creation, deletion, and modification events, making it an excellent choice for monitoring new ARM resource deployments in your Azure subscription. You can export the Activity Log data to a storage account, Event Hubs, or Log Analytics workspace for further analysis and reporting. By creating a custom query or using the built-in tools for filtering and visualization, you can generate a monthly report of all the new ARM resource deployments in your Azure subscription.
You have an Azure subscription that contains an Azure key vault named KV1 and a virtual machine named VM1. VM1 runs Windows Server 2022: Azure Edition.
You plan to deploy an ASP.NET Core-based application named App1 to VM1.
You need to configure App1 to use a system-assigned managed identity to retrieve secrets from KV1. The solution must minimize development effort.
What should you do? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
- Client credentials grant flows
- Azure Instance Metadata Service (IMDS) endpoint
Some published answer keys give a different second answer, but it is not correct, for the following reason:
The key difference in this scenario is that we are using a Managed Identity, which is a feature of Azure AD, and in that case, access tokens are obtained through the Azure Instance Metadata Service (IMDS) API. The managed identity is responsible for managing the lifecycle of these credentials.
Therefore, for the case of an application in an Azure VM that uses a managed identity to authenticate with Key Vault, the IMDS would be used, not an OAuth 2.0 endpoint directly.
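A minimal sketch of that flow, calling IMDS directly with the requests library (the vault and secret names are placeholders; in practice the Azure SDK performs these calls for you):

```python
# Minimal sketch: request a Key Vault token from the IMDS endpoint on the VM,
# then call the Key Vault REST API with it.
import requests

IMDS = "http://169.254.169.254/metadata/identity/oauth2/token"
token = requests.get(
    IMDS,
    params={"api-version": "2018-02-01", "resource": "https://vault.azure.net"},
    headers={"Metadata": "true"},  # required header for IMDS requests
).json()["access_token"]

secret = requests.get(
    "https://myvault.vault.azure.net/secrets/MySecret",
    params={"api-version": "7.4"},
    headers={"Authorization": f"Bearer {token}"},
).json()
print(secret["value"])
```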
Your company has the divisions shown in the following table
Sub1 contains an Azure App Service web app named App1. App1 uses Azure AD for single-tenant user authentication. Users from contoso.com can authenticate to App1.
You need to recommend a solution to enable users in the fabrikam.com tenant to authenticate to App1.
What should you recommend?
A. Configure Azure AD join.
B. Configure Azure AD Identity Protection.
C. Configure a Conditional Access policy.
D. Configure Supported account types in the application registration and update the sign-in endpoint.
Correct Answer: D
To enable users in the fabrikam.com tenant to authenticate to App1, you need to configure the application registration for App1 in Azure AD to support users from both contoso.com and fabrikam.com. This can be done by updating the “Supported account types” in the application registration to allow users from any organizational directory (Any Azure AD directory - Multitenant). Once this is done, you need to update the sign-in endpoint for the application to include the fabrikam.com tenant.
This will allow users from the fabrikam.com tenant to authenticate to App1 using their Azure AD credentials.
You have an Azure AD tenant named contoso.com that has a security group named Group1. Group1 is configured for assigned memberships. Group1 has 50 members, including 20 guest users.
You need to recommend a solution for evaluating the membership of Group1. The solution must meet the following requirements:
- The evaluation must be repeated automatically every three months.
- Every member must be able to report whether they need to be in Group1.
- Users who report that they do not need to be in Group1 must be removed from Group1 automatically.
- Users who do not report whether they need to be in Group1 must be removed from Group1 automatically.
What should you include in the recommendation?
A. Implement Azure AD Identity Protection.
B. Change the Membership type of Group1 to Dynamic User.
C. Create an access review.
D. Implement Azure AD Privileged Identity Management (PIM).
Correct Answer: C
Based on the requirements below:
* The evaluation must be repeated automatically every three months.
* Every member must be able to report whether they need to be in Group1.
* Users who report that they do not need to be in Group1 must be removed from Group1 automatically.
* Users who do not report whether they need to be in Group1 must be removed from Group1 automatically.
The correct answer should be: Create an access review
You have an Azure subscription named Sub1 that is linked to an Azure AD tenant named contoso.com.
You plan to implement two ASP.NET Core apps named App1 and App2 that will be deployed to 100 virtual machines in Sub1. Users will sign in to App1 and App2 by using their contoso.com credentials.
App1 requires read permissions to access the calendar of the signed-in user. App2 requires write permissions to access the calendar of the signed-in user.
You need to recommend an authentication and authorization solution for the apps. The solution must meet the following requirements:
- Use the principle of least privilege.
- Minimize administrative effort.
What should you include in the recommendation? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
- Application registration
- Delegated permissions
The important point here is that both apps are deployed to the same virtual machines, so managed identities would violate the principle of least privilege: a single system-assigned or user-assigned identity on a VM would have to be granted both read and write permissions to users' calendars, and it would be shared by both apps.
An app registration provides a service principal per app, so each app can be granted only the permission it actually requires.
Delegated permissions let each app access a user's data only on behalf of, and with the consent of, the signed-in user, as sketched below.
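As a hedged sketch, each registration could request only the Microsoft Graph delegated scope it needs. The Graph application ID below is the well-known value; the app IDs and permission GUIDs are placeholders to look up in your tenant:

```
# Microsoft Graph's well-known application ID.
GRAPH=00000003-0000-0000-c000-000000000000

# App1: request only the Calendars.Read delegated scope.
az ad app permission add --id <app1-id> --api $GRAPH \
  --api-permissions <calendars-read-guid>=Scope

# App2: request only the Calendars.ReadWrite delegated scope.
az ad app permission add --id <app2-id> --api $GRAPH \
  --api-permissions <calendars-readwrite-guid>=Scope
```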
Your company has the divisions shown in the following table
Sub1 contains an Azure App Service web app named App1. App1 uses Azure AD for single-tenant user authentication. Users from contoso.com can authenticate to App1.
You need to recommend a solution to enable users in the fabrikam.com tenant to authenticate to App1.
What should you recommend?
A. Enable Azure AD pass-through authentication and update the sign-in endpoint.
B. Use Azure AD entitlement management to govern external users.
C. Configure assignments for the fabrikam.com users by using Azure AD Privileged Identity Management (PIM).
D. Configure Azure AD Identity Protection.
Correct Answer: B
What can I do with entitlement management
Here are some of the capabilities of entitlement management:
- Select connected organizations whose users can request access. When a user who isn’t yet in your directory requests access, and is approved, they’re automatically invited into your directory and assigned access. When their access expires, if they have no other access package assignments, their B2B account in your directory can be automatically removed.
Your company has the divisions shown in the following table
Sub1 contains an Azure App Service web app named App1. App1 uses Azure AD for single-tenant user authentication. Users from contoso.com can authenticate to App1.
You need to recommend a solution to enable users in the fabrikam.com tenant to authenticate to App1.
What should you recommend?
A. Configure the Azure AD provisioning service.
B. Enable Azure AD pass-through authentication and update the sign-in endpoint.
C. Configure Supported account types in the application registration and update the sign-in endpoint.
D. Configure Azure AD join.
Correct Answer: C
This question appears with two different option sets, and the two valid answers never appear together. Depending on the options shown, the correct answer is one of the following:
1. Use Azure AD entitlement management to govern external users
2. Configure Supported account types in the application registration and update the sign-in endpoint
You have an Azure AD tenant that contains a management group named MG1.
You have the Azure subscriptions shown in the following table
The subscriptions contain the resource groups shown in the following table
The subscription contains the Azure AD security groups shown in the following table
The subscription contains the user accounts shown in the following table
You perform the following actions:
Assign User3 the Contributor role for Sub1.
Assign Group1 the Virtual Machine Contributor role for MG1.
Assign Group3 the Contributor role for the Tenant Root Group.
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.
Since Group1 is assigned the Virtual Machine Contributor role for MG1, its members can create a new VM in RG1.
User2 cannot grant permissions to Group2, because User2 only holds the Contributor role, which does not include the right to assign roles.
Since Group3 has the Contributor role for the Tenant Root Group, User3 can create a storage account in RG2.
You can add an existing Security group to another Security group (also known as nested groups). Depending on the group types, you can add a group as a member of another group, just like a user, which applies settings like roles and access to the nested groups.
You have an Azure subscription that contains 1,000 resources.
You need to generate compliance reports for the subscription. The solution must ensure that the resources can be grouped by department.
What should you use to organize the resources?
A. application groups and quotas
B. Azure Policy and tags
C. administrative units and Azure Lighthouse
D. resource groups and role assignments
Correct Answer: B
Azure Policy allows you to define and enforce rules and regulations for your resources, ensuring compliance with organizational standards and industry regulations. You can create policies that specify the required tags for resources, such as department, and enforce their usage across the subscription. This will help you categorize and group resources based on departments.
Tags, on the other hand, are key-value pairs that you can assign to resources. By assigning tags to resources with the department information, you can easily filter and group resources based on departments when generating compliance reports.
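As a sketch (resource IDs and values are placeholders, and the built-in policy GUID shown is the commonly documented one for "Require a tag and its value on resources", so verify it in your environment), a department tag can be applied and then enforced:

```
# Tag an existing resource with its department.
az tag update --resource-id <resource-id> \
  --operation merge --tags department=Finance

# Assign a built-in policy so that new resources must carry the tag.
az policy assignment create \
  --name require-department-tag \
  --policy 1e30110a-5ceb-460c-a204-c1c3969c6d62 \
  --params '{"tagName":{"value":"department"},"tagValue":{"value":"Finance"}}'
```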
You need to recommend a solution to generate a monthly report of all the new Azure Resource Manager (ARM) resource deployments in your Azure subscription.
What should you include in the recommendation?
A. Azure Arc
B. Azure Monitor metrics
C. Azure Advisor
D. Azure Log Analytics
Correct Answer: D
Azure Log Analytics is a service that collects and analyzes data from various sources, including Azure resources, applications, and operating systems. It provides a centralized location for storing and querying log data, making it an ideal solution for monitoring and analyzing resource deployments.
By configuring Log Analytics to collect and store the deployment logs, you can easily query and filter the data to generate a report of all the new ARM resource deployments within a specific time frame, such as a month.
Therefore, the correct answer is D. Azure Log Analytics.
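Assuming the Activity log is routed to the workspace through a diagnostic setting (the workspace GUID is a placeholder), a query along these lines could drive the monthly report:

```
# List successful ARM deployment writes from the last 30 days.
az monitor log-analytics query \
  --workspace <workspace-guid> \
  --analytics-query "AzureActivity
    | where TimeGenerated > ago(30d)
    | where OperationNameValue =~ 'Microsoft.Resources/deployments/write'
    | project TimeGenerated, ResourceGroup, Caller, _ResourceId"
```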
You need to recommend a solution to generate a monthly report of all the new Azure Resource Manager (ARM) resource deployments in your Azure subscription.
What should you include in the recommendation?
A. Azure Monitor action groups
B. Azure Arc
C. Azure Monitor metrics
D. Azure Activity Log
Correct Answer: D
D. Azure Activity Log
Explanation:
* Azure Activity Log captures all actions performed on Azure resources.
* It provides detailed information about when, who, and what changes were made.
* You can query the Activity Log for specific resource types, operations, and timeframes.
* By filtering for resource creation events within a specific month, you can generate a report of new resource deployments.
Additional Considerations:
* Azure Monitor metrics are primarily for numerical data points and wouldn’t capture resource creation events.
* Azure Monitor action groups are used for alerting based on specific conditions, not for generating reports.
* Azure Arc is for managing on-premises and multi-cloud resources, which is not relevant to this scenario.
Therefore, Azure Activity Log is the most suitable option for generating a monthly report of new Azure Resource Manager resource deployments.
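A quick way to pull the raw events is the Azure CLI; the 30-day offset below approximates one month, and the output can be exported for the report:

```
# List ARM deployment events from the Activity Log for roughly the past month.
az monitor activity-log list \
  --offset 30d --max-events 1000 \
  --query "[?operationName.value=='Microsoft.Resources/deployments/write'].{time:eventTimestamp, rg:resourceGroupName, caller:caller}" \
  --output table
```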
You have an Azure AD tenant that contains an administrative unit named MarketingAU. MarketingAU contains 100 users.
You create two users named User1 and User2.
You need to ensure that the users can perform the following actions in MarketingAU:
- User1 must be able to create user accounts.
- User2 must be able to reset user passwords.
Which role should you assign to each user? To answer, drag the appropriate roles to the correct users. Each role may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Here’s an explanation:
The roles that you need to assign are:
User1: User Administrator for the MarketingAU administrative unit.
User2: Password Administrator or Helpdesk Administrator for the MarketingAU administrative unit.
The User Administrator role provides permissions to manage user accounts, including creating new users. The Password Administrator and Helpdesk Administrator roles provide permissions to reset user passwords.
Therefore, User1 needs the User Administrator role for the MarketingAU administrative unit to be able to create new user accounts.
User2 needs either the Password Administrator or Helpdesk Administrator role for the MarketingAU administrative unit to be able to reset user passwords.
Note that assigning Helpdesk Administrator for the tenant role to User2 would provide permissions to reset passwords for all users in the Azure AD tenant, not just in the MarketingAU administrative unit.
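A sketch of the scoped assignment through Microsoft Graph; the object IDs are placeholders, and the roleDefinitionId shown is the commonly documented User Administrator role template ID:

```
# Assign the User Administrator role to User1, scoped to MarketingAU only.
az rest --method post \
  --url https://graph.microsoft.com/v1.0/roleManagement/directory/roleAssignments \
  --body '{
    "principalId": "<user1-object-id>",
    "roleDefinitionId": "fe930be7-5e62-47db-91af-98c3a49a38b1",
    "directoryScopeId": "/administrativeUnits/<marketingau-id>"
  }'
```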
You are designing an app that will be hosted on Azure virtual machines that run Ubuntu. The app will use a third-party email service to send email messages to users. The third-party email service requires that the app authenticate by using an API key.
You need to recommend an Azure Key Vault solution for storing and accessing the API key. The solution must minimize administrative effort.
What should you recommend using to store and access the key? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
- Storage: Secret.
API keys are typically stored as secrets in Azure Key Vault. The key vault can store and manage secrets such as API keys, passwords, and database connection strings.
- Access: Service principal.
A service principal is indeed the more appropriate choice for accessing a third-party email service using an API key.
Here’s a breakdown of why:
Managed Service Identity (MSI) is primarily designed for accessing other Azure resources. While it can be used for external resources, it’s often more complex to set up and manage.
Service Principal is specifically designed for applications to authenticate to other services, including external ones. It provides a clear separation of concerns and simplifies the authentication process.
To summarize:
Store the API key as a secret
in Azure Key Vault.
Use a service principal to authenticate to the third-party email service using the API key.
By following these steps, you’ll ensure secure storage of the API key and efficient authentication to the external service.
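A minimal sketch (the vault name, secret name, and key value are placeholders):

```
# Store the third-party email service API key as a Key Vault secret.
az keyvault secret set --vault-name <vault-name> \
  --name EmailServiceApiKey --value "<api-key>"

# The app, authenticating as its service principal, reads the secret at runtime.
az keyvault secret show --vault-name <vault-name> \
  --name EmailServiceApiKey --query value --output tsv
```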
You have two app registrations named App1 and App2 in Azure AD. App1 supports role-based access control (RBAC) and includes a role named Writer.
You need to ensure that when App2 authenticates to access App1, the tokens issued by Azure AD include the Writer role claim.
Which blade should you use to modify each app registration? To answer, drag the appropriate blades to the correct app registrations. Each blade may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
- App1: b. App roles
- App2: c. Token configuration
This assumes the exam expects you to know that an application requesting a token (App2) needs the roles claim added via Token configuration. In practice this is not the exact place to assign a role to an application, but given the choices provided, it is the most appropriate answer.
This is because token configuration does indeed impact the claims present in a token, and since no other suitable choice is available (API Permissions would not be used to assign a role to the application), it seems this would be the expected answer.
However, please note this is not entirely accurate based on the full capabilities of Azure AD, but it’s the best choice given the options. Normally, you would assign the app role to the service principal of App2 in the context of Enterprise Applications, which is not an option here.
You have an Azure subscription.
You plan to deploy a monitoring solution that will include the following:
- Azure Monitor Network Insights
- Application Insights
- Microsoft Sentinel
- VM insights
The monitoring solution will be managed by a single team.
What is the minimum number of Azure Monitor workspaces required?
A. 1
B. 2
C. 3
D. 4
Correct Answer: A
You only need a single Azure Monitor Log Analytics workspace for all these monitoring solutions.
Here’s why:
- Azure Monitor Network Insights, Application Insights, Microsoft Sentinel, and VM insights can all send their data to the same Log Analytics workspace.
- The workspace is a unique environment for Azure Monitor log data. Each workspace has its own data repository and configuration, and data sources and solutions are configured to store their data in a workspace.
Therefore, a single Azure Monitor Log Analytics workspace can be utilized to collect and analyze data from all the components of the monitoring solution. This will also enable a unified management and analysis of the collected data.
You need to recommend a solution to generate a monthly report of all the new Azure Resource Manager (ARM) resource deployments in your Azure subscription.
What should you include in the recommendation?
A. Application Insights
B. Azure Analysis Services
C. Azure Advisor
D. Azure Activity Log
Correct Answer: D
The Azure Activity Log records all ARM resource deployments in your Azure subscription, making it the appropriate choice for generating a monthly report of new resource deployments.
You have an Azure subscription that contains 10 web apps. The apps are integrated with Azure AD and are accessed by users on different project teams.
The users frequently move between projects.
You need to recommend an access management solution for the web apps. The solution must meet the following requirements:
- The users must only have access to the app of the project to which they are assigned currently.
- Project managers must verify which users have access to their project’s app and remove users that are no longer assigned to their project.
- Once every 30 days, the project managers must be prompted automatically to verify which users are assigned to their projects.
What should you include in the recommendation?
A. Azure AD Identity Protection
B. Microsoft Defender for Identity
C. Microsoft Entra Permissions Management
D. Microsoft Entra ID Governance
Correct Answer: D
Azure AD Identity Governance (now Microsoft Entra ID Governance) allows you to balance your organization's need for security and employee productivity with the right processes and visibility. It provides you with capabilities to ensure that the right people have the right access to the right resources.
You have an Azure subscription that contains 50 Azure SQL databases.
You create an Azure Resource Manager (ARM) template named Template1 that enables Transparent Data Encryption (TDE).
You need to create an Azure Policy definition named Policy1 that will use Template1 to enable TDE for any noncompliant Azure SQL databases.
How should you configure Policy1? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Box 1: DeployIfNotExists
DeployIfNotExists policy definition executes a template deployment when the condition is met. Policy assignments with effect set as DeployIfNotExists require a managed identity to do remediation.
Box 2: The role-based access control (RBAC) roles required to perform the remediation task
The question is what you have to “Include in the definition:” of the policy.
Refer to list of DeployIfNotExists properties, among them is roleDefinitionIds (required) - This property must include an array of strings that match role-based access control role ID accessible by the subscription.
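A trimmed sketch of what Policy1 could look like; it mirrors the built-in TDE deployment policy, the role GUID shown is the commonly documented SQL DB Contributor role, and the embedded template stands in for Template1:

```
# Write the policy rule to a file to avoid shell-quoting issues.
cat > tde-rule.json <<'EOF'
{
  "if": {
    "allOf": [
      { "field": "type", "equals": "Microsoft.Sql/servers/databases" },
      { "field": "name", "notEquals": "master" }
    ]
  },
  "then": {
    "effect": "deployIfNotExists",
    "details": {
      "type": "Microsoft.Sql/servers/databases/transparentDataEncryption",
      "name": "current",
      "existenceCondition": {
        "field": "Microsoft.Sql/transparentDataEncryption.status",
        "equals": "Enabled"
      },
      "roleDefinitionIds": [
        "/providers/Microsoft.Authorization/roleDefinitions/9b7fa17d-e63e-47b0-bb0a-15c516ac86ec"
      ],
      "deployment": {
        "properties": {
          "mode": "incremental",
          "template": {
            "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
            "contentVersion": "1.0.0.0",
            "parameters": { "fullDbName": { "type": "string" } },
            "resources": [ {
              "name": "[concat(parameters('fullDbName'), '/current')]",
              "type": "Microsoft.Sql/servers/databases/transparentDataEncryption",
              "apiVersion": "2014-04-01",
              "properties": { "status": "Enabled" }
            } ]
          },
          "parameters": { "fullDbName": { "value": "[field('fullName')]" } }
        }
      }
    }
  }
}
EOF

# --rules accepts a path to a JSON file containing the policy rule.
az policy definition create --name Policy1 --mode Indexed --rules tde-rule.json
```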
You have an Azure subscription. The subscription contains a tiered app named App1 that is distributed across multiple containers hosted in Azure Container Instances.
You need to deploy an Azure Monitor monitoring solution for App1. The solution must meet the following requirements:
- Support using synthetic transaction monitoring to monitor traffic between the App1 components.
- Minimize development effort.
What should you include in the solution?
A. Network insights
B. Application Insights
C. Container insights
D. Log Analytics Workspace insights
Correct Answer: B
B. Application Insights
Explanation:
* Application Insights is the ideal choice for monitoring a distributed application like App1 running on Azure Container Instances.
* It provides comprehensive application performance monitoring (APM) capabilities, including:
* Performance monitoring
* Dependency tracking
* Availability testing
* Synthetic transaction monitoring (essential for your requirement)
* Log management
* It integrates seamlessly with Azure Container Instances, making it easy to set up and use.
* It offers a rich set of features and visualizations, minimizing development effort.
Why not other options:
* Network Insights: Focuses on network connectivity and troubleshooting, not application performance monitoring.
* Container insights: Primarily for monitoring container health and resource utilization; it does not provide application-level synthetic transaction monitoring.
* Log Analytics Workspace insights: While capable of collecting and analyzing logs, it lacks the built-in features and visualizations for application performance monitoring.
By choosing Application Insights, you get a powerful and comprehensive monitoring solution that meets all your requirements.
You have an Azure subscription that contains the resources shown in the following table:
Log files from App1 are ingested into App1Logs. An average of 120 GB of log data is ingested per day.
You configure an Azure Monitor alert that will be triggered if the App1 logs contain error messages.
You need to minimize the Log Analytics costs associated with App1. The solution must meet the following requirements:
* Ensure that all the log files from App1 are ingested to App1Logs.
* Minimize the impact on the Azure Monitor alert.
Which resource should you modify, and which modification should you perform? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Workspace1: This is the Log Analytics workspace where the logs are ingested. Modifying this resource helps manage costs associated with log ingestion.
Change to a commitment pricing tier: Commitment tiers offer discounted rates for log ingestion based on a fixed commitment, which can significantly reduce costs compared to the pay-as-you-go pricing tier, especially when dealing with large volumes of data like 120 GB per day. This change ensures that all log files are ingested while minimizing costs and impact on the Azure Monitor alert.
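A sketch of the tier change; the names are placeholders, and at roughly 120 GB/day the 100 GB/day commitment level is the nearest tier, which should be validated against current pricing:

```
# Move Workspace1 from pay-as-you-go to a commitment (capacity reservation) tier.
az monitor log-analytics workspace update \
  --resource-group <rg-name> \
  --workspace-name Workspace1 \
  --sku CapacityReservation \
  --capacity-reservation-level 100
```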
You have 12 Azure subscriptions and three projects. Each project uses resources across multiple subscriptions.
You need to use Microsoft Cost Management to monitor costs on a per project basis. The solution must minimize administrative effort.
Which two components should you include in the solution? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. budgets
B. resource tags
C. custom role-based access control (RBAC) roles
D. management groups
E. Azure boards
Correct Answer: AB
B. Resource tags
Why: Tags allow categorizing and tracking costs for resources by project across multiple subscriptions. This enables detailed cost analysis and reporting for each project.Use tags to assign metadata to resources (e.g., project name), making it easier to filter and analyze costs per project.
A. Budgets
Why: Budgets enable setting cost limits and alerts for each project. When combined with tags, budgets can help track and control costs effectively, ensuring each project stays within its allocated budget.Set up budgets for each project to monitor spending, receive alerts, and enforce cost controls based on tagged resources.
You have an Azure subscription that contains multiple storage accounts.
You assign Azure Policy definitions to the storage accounts.
You need to recommend a solution to meet the following requirements:
- Trigger on-demand Azure Policy compliance scans.
- Raise Azure Monitor non-compliance alerts by querying logs collected by Log Analytics.
What should you recommend for each requirement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
The provided answers are correct:
To trigger the compliance scans, use Azure CLI
An evaluation scan for a subscription or a resource group can be started with Azure CLI, Azure PowerShell, a call to the REST API, or by using the Azure Policy Compliance Scan GitHub Action. This scan is an asynchronous process.
To generate alerts, configure diagnostic settings for the Azure activity logs
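For the first requirement, a sketch of triggering the scan on demand (the resource group name is a placeholder):

```
# Trigger an on-demand Azure Policy compliance evaluation for the subscription.
az policy state trigger-scan

# Or scope the scan to a single resource group.
az policy state trigger-scan --resource-group <rg-name>
```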
You have an Azure subscription.
You plan to deploy five storage accounts that will store block blobs and five storage accounts that will host file shares. The file shares will be accessed by using the SMB protocol.
You need to recommend an access authorization solution for the storage accounts. The solution must meet the following requirements:
- Maximize security.
- Prevent the use of shared keys.
- Whenever possible, support time-limited access.
What should you include in the solution? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
The correct answer is:
For the blobs:
- A user delegation shared access signature (SAS) and a stored access policy
For the file shares:
- Azure AD credentials
Explanation:
For the blobs:
- A user delegation shared access signature (SAS) provides fine-grained control over access to individual blobs or containers within a storage account.
- A stored access policy allows you to define access rules that can be applied to multiple SAS tokens, simplifying management.
- Combining a user delegation SAS and a stored access policy offers the best security and flexibility by enabling you to grant time-limited access to specific blobs or containers while centralizing access control.
For the file shares:
- Azure AD credentials are the most secure option for accessing file shares over SMB. They provide strong authentication and authorization, eliminating the need for shared keys.
- Azure AD credentials also support time-limited access through features like conditional access policies.
- Using SAS tokens for the file shares is not an option here: service SAS tokens are signed with the shared account keys, which the scenario prohibits, and SMB access to Azure file shares does not use SAS in any case.
By following these recommendations, you can ensure that your storage accounts are protected against unauthorized access and that access is granted only to authorized users for specific time periods.
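A sketch of issuing a user delegation SAS for a blob container (the account, container, and expiry are placeholders). Note that --auth-mode login makes the SAS Azure AD-signed rather than account-key-signed, and a user delegation SAS can be valid for at most seven days:

```
# Generate a read-only user delegation SAS; no storage account key is used.
az storage container generate-sas \
  --account-name <storage-account> \
  --name <container> \
  --permissions r \
  --expiry <expiry-utc> \
  --auth-mode login \
  --as-user
```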
You have an Azure subscription. The subscription contains 100 virtual machines that run Windows Server 2022 and have the Azure Monitor Agent installed.
You need to recommend a solution that meets the following requirements:
- Forwards JSON-formatted logs from the virtual machines to a Log Analytics workspace
- Transforms the logs and stores the data in a table in the Log Analytics workspace
What should you include in the recommendation? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Answer Area
You have five Azure subscriptions. Each subscription is linked to a separate Azure AD tenant and contains virtual machines that run Windows Server 2022.
You plan to collect Windows security events from the virtual machines and send them to a single Log Analytics workspace.
You need to recommend a solution that meets the following requirements:
- Collects event logs from multiple subscriptions
- Supports the use of data collection rules (DCRs) to define which events to collect
What should you recommend for each requirement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Final Answer:
- To collect the event logs: Azure Lighthouse
- Azure Lighthouse provides a centralized management experience across multiple subscriptions and tenants. Customer tenants delegate administrative access to your managing tenant, enabling you to manage resources in their subscriptions as if they were your own.
- To support DCRs: The Azure Monitor agent
- The Azure Monitor agent is the core component for collecting and sending data to Azure Monitor. It supports DCRs, allowing you to define which events to collect and send to Log Analytics.
Explanation:
* Azure Lighthouse is essential for managing resources across multiple subscriptions and tenants.
* Azure Monitor agent is the primary tool for collecting and filtering data from virtual machines. DCRs are a powerful feature of the Azure Monitor agent for customizing data collection.
By combining Azure Lighthouse and Azure Monitor agents with DCRs, you can effectively collect Windows security events from multiple subscriptions and send them to a single Log Analytics workspace for centralized monitoring and analysis.
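As a sketch, once the Lighthouse delegation is in place, a single DCR in the managing tenant can be associated with VMs in each delegated subscription. Both resource IDs are placeholders, and the command comes from the monitor-control-service CLI extension:

```
# Associate an existing data collection rule with a VM in a delegated subscription.
az monitor data-collection rule association create \
  --name dcr-security-events \
  --rule-id "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Insights/dataCollectionRules/<dcr-name>" \
  --resource "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Compute/virtualMachines/<vm-name>"
```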
You have 100 servers that run Windows Server 2012 R2 and host Microsoft SQL Server 2014 instances. The instances host databases that have the following characteristics:
✑ Stored procedures are implemented by using CLR.
✑ The largest database is currently 3 TB. None of the databases will ever exceed 4 TB.
You plan to move all the data from SQL Server to Azure.
You need to recommend a service to host the databases. The solution must meet the following requirements:
✑ Whenever possible, minimize management overhead for the migrated databases.
✑ Ensure that users can authenticate by using Azure Active Directory (Azure AD) credentials.
✑ Minimize the number of database changes required to facilitate the migration.
What should you include in the recommendation?
A. Azure SQL Database elastic pools
B. Azure SQL Managed Instance
C. Azure SQL Database single databases
D. SQL Server 2016 on Azure virtual machines
Correct Answer: B
SQL Managed Instance allows existing SQL Server customers to lift and shift their on-premises applications to the cloud with minimal application and database changes. At the same time, SQL Managed Instance preserves all PaaS capabilities (automatic patching and version updates, automated backups, high availability) that drastically reduce management overhead and TCO.
You have an Azure subscription that contains an Azure Blob Storage account named store1.
You have an on-premises file server named Server1 that runs Windows Server 2016. Server1 stores 500 GB of company files.
You need to store a copy of the company files from Server1 in store1.
Which two possible Azure services achieve this goal? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. an Azure Logic Apps integration account
B. an Azure Import/Export job
C. Azure Data Factory
D. an Azure Analysis services On-premises data gateway
E. an Azure Batch account
Correct Answer: BC
B: You can use the Azure Import/Export service to securely transfer large amounts of data to Azure Blob storage. The service requires you to ship drives containing your data to an Azure datacenter, where the data is copied into your storage account.
C: Big data requires a service that can orchestrate and operationalize processes to refine these enormous stores of raw data into actionable business insights.
Azure Data Factory is a managed cloud service that’s built for these complex hybrid extract-transform-load (ETL), extract-load-transform (ELT), and data integration projects.
You have an Azure subscription that contains two applications named App1 and App2. App1 is a sales processing application. When a transaction in App1 requires shipping, a message is added to an Azure Storage account queue, and then App2 listens to the queue for relevant transactions.
In the future, additional applications will be added that will process some of the shipping requests based on the specific details of the transactions.
You need to recommend a replacement for the storage account queue to ensure that each additional application will be able to read the relevant transactions.
What should you recommend?
A. one Azure Data Factory pipeline
B. multiple storage account queues
C. one Azure Service Bus queue
D. one Azure Service Bus topic
Correct Answer: D
A queue allows processing of a message by a single consumer. In contrast to queues, topics and subscriptions provide a one-to-many form of communication in a publish and subscribe pattern. It’s useful for scaling to large numbers of recipients. Each published message is made available to each subscription registered with the topic. Publisher sends a message to a topic and one or more subscribers receive a copy of the message, depending on filter rules set on these subscriptions.
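A sketch of the topic-and-subscription layout; the resource names are placeholders, and the filter property stands in for whatever transaction detail the apps key on:

```
# One topic receives every shipping message.
az servicebus topic create --resource-group <rg> \
  --namespace-name <namespace> --name shipping

# Each processing app gets its own subscription.
az servicebus topic subscription create --resource-group <rg> \
  --namespace-name <namespace> --topic-name shipping --name app2

# A SQL filter limits the subscription to the relevant transactions.
az servicebus topic subscription rule create --resource-group <rg> \
  --namespace-name <namespace> --topic-name shipping \
  --subscription-name app2 --name express-only \
  --filter-sql-expression "ShippingMethod = 'Express'"
```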
You need to design a storage solution for an app that will store large amounts of frequently used data. The solution must meet the following requirements:
✑ Maximize data throughput.
✑ Prevent the modification of data for one year.
✑ Minimize latency for read and write operations.
Which Azure Storage account type and storage service should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Box 1: BlockBlobStorage -
BlockBlobStorage is a premium storage account type for block blobs and append blobs. It is recommended for scenarios with high transaction rates, scenarios that use smaller objects, or scenarios that require consistently low storage latency.
Box 2: Blob -
Blob storage is the right service here: block blobs in a premium block blob account deliver high throughput and consistently low read/write latency, and immutable blob storage (a time-based retention policy) can prevent modification of the data for one year.
You have an Azure subscription that contains the storage accounts shown in the following table
You plan to implement two new apps that have the requirements shown in the following table
Which storage accounts should you recommend using for each app? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Choosing Storage Accounts for App1 and App2
For App1:
* Storage1 and Storage2 only
Explanation:
- Lifecycle management is a feature that allows you to automatically transition blobs between storage tiers based on defined policies.
- To utilize this feature, you need at least two storage tiers: one for hot storage (Storage2: Premium) and one for cold storage (Storage1: Standard).
- Storage3 (BlobStorage) is not suitable for lifecycle management as it’s specifically designed for block blobs.
- Storage4 (FileStorage) is not relevant for storing blobs.
For App2:
* Storage4 only
Explanation:
- Azure file shares are used for storing files and are accessible through the SMB protocol.
- Storage4 is the only file storage account among the given options, making it the ideal choice for App2.
- Storage1, Storage2, and Storage3 are not designed for file storage.
In summary:
- App1 should use Storage1 (Standard) and Storage2 (Premium) for lifecycle management.
- App2 should use Storage4 (Premium File Storage) for storing app data in a file share.
By selecting these storage accounts, you ensure optimal performance, cost-efficiency, and data management for both applications.
You are designing an application that will be hosted in Azure.
The application will host video files that range from 50 MB to 12 GB. The application will use certificate-based authentication and will be available to users on the internet.
You need to recommend a storage option for the video files. The solution must provide the fastest read performance and must minimize storage costs.
What should you recommend?
A. Azure Files
B. Azure Data Lake Storage Gen2
C. Azure Blob Storage
D. Azure SQL Database
Correct Answer: C
Blob Storage: Stores large amounts of unstructured data, such as text or binary data, that can be accessed from anywhere in the world via HTTP or HTTPS. You can use Blob storage to expose data publicly to the world, or to store application data privately.
The maximum size of a single blob in Blob Storage is about 4.77 TB.
You are designing a SQL database solution. The solution will include 20 databases that will be 20 GB each and have varying usage patterns.
You need to recommend a database platform to host the databases. The solution must meet the following requirements:
✑ The solution must meet a Service Level Agreement (SLA) of 99.99% uptime.
✑ The compute resources allocated to the databases must scale dynamically.
✑ The solution must have reserved capacity.
Compute charges must be minimized.
What should you include in the recommendation?
A. an elastic pool that contains 20 Azure SQL databases
B. 20 databases on a Microsoft SQL server that runs on an Azure virtual machine in an availability set
C. 20 databases on a Microsoft SQL server that runs on an Azure virtual machine
D. 20 instances of Azure SQL Database serverless
Correct Answer: A
The compute and storage redundancy is built in for Business Critical databases and elastic pools, with an SLA of 99.99%.
Reserved capacity provides you with the flexibility to temporarily move your hot databases in and out of elastic pools (within the same region and performance tier) as part of your normal operations without losing the reserved capacity benefit.
You have an on-premises database that you plan to migrate to Azure.
You need to design the database architecture to meet the following requirements:
✑ Support scaling up and down.
✑ Support geo-redundant backups.
✑ Support a database of up to 75 TB.
✑ Be optimized for online transaction processing (OLTP).
What should you include in the design? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area
Box 1: Azure SQL Database -
Azure SQL Database:
Database size always depends on the underlying service tiers (e.g. Basic, Business Critical, Hyperscale).
It supports databases of up to 100 TB with the Hyperscale service tier.
Active geo-replication is a feature that lets you create a continuously synchronized readable secondary database for a primary database. The readable secondary database may be in the same Azure region as the primary or, more commonly, in a different region. These readable secondary databases are also known as geo-secondaries, or geo-replicas.
Azure SQL Database and SQL Managed Instance enable you to dynamically add more resources to your database with minimal downtime.
Box 2: Hyperscale -
Incorrect Answers:
✑ SQL Server on Azure VM: geo-replication not supported.
✑ Azure Synapse Analytics is not optimized for online transaction processing (OLTP).
✑ Azure SQL Managed Instance max database size is up to currently available instance size (depending on the number of vCores).
Max instance storage size (reserved) - 2 TB for 4 vCores
- 8 TB for 8 vCores
- 16 TB for other sizes
You are planning an Azure IoT Hub solution that will include 50,000 IoT devices.
Each device will stream data, including temperature, device ID, and time data. Approximately 50,000 records will be written every second. The data will be visualized in near real time.
You need to recommend a service to store and query the data.
Which two services can you recommend? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. Azure Table Storage
B. Azure Event Grid
C. Azure Cosmos DB SQL API
D. Azure Time Series Insights
Correct Answer: CD
D: Time Series Insights is a fully managed service for time series data. In this architecture, Time Series Insights performs the roles of stream processing, data store, and analytics and reporting. It accepts streaming data from either IoT Hub or Event Hubs and stores, processes, analyzes, and displays the data in near real time.
C: The processed data is stored in an analytical data store, such as Azure Data Explorer, HBase, Azure Cosmos DB, Azure Data Lake, or Blob Storage.
You are designing an application that will aggregate content for users.
You need to recommend a database solution for the application. The solution must meet the following requirements:
✑ Support SQL commands.
✑ Support multi-master writes.
✑ Guarantee low latency read operations.
What should you include in the recommendation?
A. Azure Cosmos DB SQL API
B. Azure SQL Database that uses active geo-replication
C. Azure SQL Database Hyperscale
D. Azure Database for PostgreSQL
Correct Answer: A
With Cosmos DB’s novel multi-region (multi-master) writes replication protocol, every region supports both writes and reads. The multi-region writes capability also enables:
Unlimited elastic write and read scalability.
99.999% read and write availability all around the world.
Guaranteed reads and writes served in less than 10 milliseconds at the 99th percentile.
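A sketch of a multi-region-write account (the account name, resource group, and regions are placeholders):

```
# Create a Cosmos DB account with two writable regions.
az cosmosdb create --name <account-name> --resource-group <rg> \
  --locations regionName=eastus failoverPriority=0 \
  --locations regionName=westeurope failoverPriority=1 \
  --enable-multiple-write-locations true
```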
You have an Azure subscription that contains the SQL servers on Azure shown in the following table
The subscription contains the storage accounts shown in the following table
You create the Azure SQL databases shown in the following table
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.
Storage
You plan to import data from your on-premises environment to Azure. The data is shown in the following table
What should you recommend using to migrate the data? To answer, drag the appropriate tools to the correct data sources. Each tool may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place
Box 1: Data Migration Assistant -
The Data Migration Assistant (DMA) helps you upgrade to a modern data platform by detecting compatibility issues that can impact database functionality in your new version of SQL Server or Azure SQL Database. DMA recommends performance and reliability improvements for your target environment and allows you to move your schema, data, and uncontained objects from your source server to your target server.
Incorrect:
AzCopy is a command-line utility that you can use to copy blobs or files to or from a storage account.
Box 2: Azure Cosmos DB Data Migration Tool
The Azure Cosmos DB Data Migration Tool can be used to migrate a SQL Server database table to Azure Cosmos DB.
You store web access logs data in Azure Blob Storage.
You plan to generate monthly reports from the access logs.
You need to recommend an automated process to upload the data to Azure SQL Database every month.
What should you include in the recommendation?
A. Microsoft SQL Server Migration Assistant (SSMA)
B. Data Migration Assistant (DMA)
C. AzCopy
D. Azure Data Factory
Correct Answer: D
You can create Data Factory pipelines that copy data from Azure Blob Storage to Azure SQL Database. The configuration pattern applies to copying from a file-based data store to a relational data store.
Required steps:
Create a data factory.
Create Azure Storage and Azure SQL Database linked services.
Create Azure Blob and Azure SQL Database datasets.
Create a pipeline that contains a Copy activity.
Start a pipeline run.
Monitor the pipeline and activity runs.
You have an Azure subscription.
Your on-premises network contains a file server named Server1. Server1 stores 5 TB of company files that are accessed rarely.
You plan to copy the files to Azure Storage.
You need to implement a storage solution for the files that meets the following requirements:
✑ The files must be available within 24 hours of being requested.
✑ Storage costs must be minimized.
Which two possible storage solutions achieve this goal? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. Create an Azure Blob Storage account that is configured for the Cool default access tier. Create a blob container, copy the files to the blob container, and set each file to the Archive access tier.
B. Create a general-purpose v1 storage account. Create a blob container and copy the files to the blob container.
C. Create a general-purpose v2 storage account that is configured for the Cool default access tier. Create a file share in the storage account and copy the files to the file share.
D. Create a general-purpose v2 storage account that is configured for the Hot default access tier. Create a blob container, copy the files to the blob container, and set each file to the Archive access tier.
E. Create a general-purpose v1 storage account. Create a file share in the storage account and copy the files to the file share.
Correct Answer: AD
To minimize costs: The Archive tier is optimized for storing data that is rarely accessed and stored for at least 180 days with flexible latency requirements (on the order of hours).
You have an app named App1 that uses two on-premises Microsoft SQL Server databases named DB1 and DB2.
You plan to migrate DB1 and DB2 to Azure
You need to recommend an Azure solution to host DB1 and DB2. The solution must meet the following requirements:
✑ Support server-side transactions across DB1 and DB2.
✑ Minimize administrative effort to update the solution.
What should you recommend?
A. two Azure SQL databases in an elastic pool
B. two databases on the same Azure SQL managed instance
C. two databases on the same SQL Server instance on an Azure virtual machine
D. two Azure SQL databases on different Azure SQL Database servers
Correct Answer: B
Elastic database transactions for Azure SQL Database and Azure SQL Managed Instance allow you to run transactions that span several databases.
SQL Managed Instance enables system administrators to spend less time on administrative tasks because the service either performs them for you or greatly simplifies those tasks.
You need to design a highly available Azure SQL database that meets the following requirements:
✑ Failover between replicas of the database must occur without any data loss.
✑ The database must remain available in the event of a zone outage.
✑ Costs must be minimized.
Which deployment option should you use?
A. Azure SQL Database Hyperscale
B. Azure SQL Database Premium
C. Azure SQL Database Basic
D. Azure SQL Managed Instance General Purpose
Correct Answer: B
Azure SQL Database Premium tier supports multiple redundant replicas for each database that are automatically provisioned in the same datacenter within a region. This design leverages the SQL Server AlwaysON technology and provides resilience to server failures with 99.99% availability SLA and RPO=0.
With the introduction of Azure Availability Zones, SQL Database offers built-in support for Availability Zones in its Premium service tier.
Incorrect:
Not A: Hyperscale is more expensive than Premium.
Not C: Need Premium for Availability Zones.
Not D: Zone redundant configuration that is free on Azure SQL Premium is not available on Azure SQL Managed Instance.
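A sketch of the deployment (resource names are placeholders):

```
# Create a zone-redundant Premium database: replicas are placed in different
# availability zones, so a zone outage fails over with no data loss (RPO = 0).
az sql db create --resource-group <rg> --server <server-name> \
  --name <db-name> --edition Premium --zone-redundant true
```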
You are designing a data storage solution to support reporting.
The solution will ingest high volumes of data in the JSON format by using Azure Event Hubs. As the data arrives, Event Hubs will write the data to storage. The solution must meet the following requirements:
✑ Organize data in directories by date and time.
✑ Allow stored data to be queried directly, transformed into summarized tables, and then stored in a data warehouse.
✑ Ensure that the data warehouse can store 50 TB of relational data and support between 200 and 300 concurrent read operations.
Which service should you recommend for each type of data store? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area
Box 1: Azure Data Lake Storage Gen2
Azure Data Explorer integrates with Azure Blob Storage and Azure Data Lake Storage (Gen1 and Gen2), providing fast, cached, and indexed access to data stored in external storage. You can analyze and query data without prior ingestion into Azure Data Explorer. You can also query across ingested and uningested external data simultaneously.
Azure Data Lake Storage is optimized storage for big data analytics workloads.
Use cases: Batch, interactive, streaming analytics and machine learning data such as log files, IoT data, click streams, large datasets
Box 2: Azure SQL Database Hyperscale
Azure SQL Database Hyperscale is optimized for OLTP and high throughput analytics workloads with storage up to 100TB.
A Hyperscale database supports up to 100 TB of data and provides high throughput and performance, as well as rapid scaling to adapt to the workload requirements. Connectivity, query processing, database engine features, etc. work like any other database in Azure SQL Database.
Hyperscale is a multi-tiered architecture with caching at multiple levels. Effective IOPS will depend on the workload.
Compare to:
General purpose: 500 IOPS per vCore with 7,000 maximum IOPS
Business critical: 5,000 IOPS with 200,000 maximum IOPS
Incorrect:
* Azure Synapse Analytics dedicated SQL pool: the maximum database size is 240 TB, and a maximum of 128 concurrent queries will execute, with remaining queries queued, so it cannot support 200 to 300 concurrent read operations.
You have an app named App1 that uses an on-premises Microsoft SQL Server database named DB1.
You plan to migrate DB1 to an Azure SQL managed instance.
You need to enable customer managed Transparent Data Encryption (TDE) for the instance. The solution must maximize encryption strength.
Which type of encryption algorithm and key length should you use for the TDE protector?
A. RSA 3072
B. AES 256
C. RSA 4096
D. RSA 2048
Correct Answer: A
A. RSA 3072
RSA 3072 provides a higher level of encryption strength compared to RSA 2048. While RSA 4096 offers even stronger encryption, it is not supported by Azure SQL Database and Azure SQL Managed Instance for TDE protectors.
By choosing RSA 3072 for the TDE protector, you ensure strong encryption for your Azure SQL Managed Instance while complying with the platform’s requirements. This will help protect sensitive data and maintain compliance with relevant security standards and regulations.
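A sketch of the key and protector setup (vault, key, and instance names are placeholders):

```
# Create an RSA 3072 key in Key Vault to act as the TDE protector.
az keyvault key create --vault-name <vault-name> \
  --name tde-protector --kty RSA --size 3072

# Point the managed instance at the customer-managed key.
# <kid> is the full Key Vault key identifier URL of the key created above.
az sql mi tde-key set --resource-group <rg> \
  --managed-instance <mi-name> \
  --server-key-type AzureKeyVault --kid <kid>
```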
You are planning an Azure IoT Hub solution that will include 50,000 IoT devices.
Each device will stream data, including temperature, device ID, and time data. Approximately 50,000 records will be written every second. The data will be visualized in near real time.
You need to recommend a service to store and query the data.
Which two services can you recommend? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. Azure Table Storage
B. Azure Event Grid
C. Azure Cosmos DB for NoSQL
D. Azure Time Series Insights
Correct Answer: CD
A. Azure Table Storage -> throughput scalability limit of 20,000 operations per second, which is not enough for the 50,000 records per second in this question.
B. Azure Event Grid -> an event broker only, not a storage solution.
Therefore, C and D are correct.
You are planning an Azure Storage solution for sensitive data. The data will be accessed daily. The dataset is less than 10 GB.
You need to recommend a storage solution that meets the following requirements:
- All the data written to storage must be retained for five years.
- Once the data is written, the data can only be read. Modifications and deletion must be prevented.
- After five years, the data can be deleted, but never modified.
- Data access charges must be minimized.
What should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
- Correct: General-purpose v2 with the Hot access tier for blobs. The data is accessed daily, so the Hot tier minimizes data access charges.
- The second answer should be a container access policy (a time-based retention immutability policy), which makes the stored data write-once, read-many. A resource lock is not sufficient: it prevents deletion of the resource itself (for example, resources inside a resource group) but does not prevent files inside the account from being modified or deleted.
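A sketch of the container-level time-based retention (account and container names are placeholders; 1,825 days is roughly five years):

```
# Blobs in the container can be read but not modified or deleted for 1825 days.
az storage container immutability-policy create \
  --resource-group <rg> \
  --account-name <storage-account> \
  --container-name <container> \
  --period 1825

# Locking the policy makes the retention interval itself tamper-proof;
# <etag> is the etag returned by the create call.
az storage container immutability-policy lock \
  --resource-group <rg> \
  --account-name <storage-account> \
  --container-name <container> \
  --if-match "<etag>"
```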
You are designing a data analytics solution that will use Azure Synapse and Azure Data Lake Storage Gen2.
You need to recommend Azure Synapse pools to meet the following requirements:
- Ingest data from Data Lake Storage into hash-distributed tables.
- Implement, query, and update data in Delta Lake.
What should you recommend for each requirement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Recommended Azure Synapse Pools
Box 1: To ingest data from Data Lake Storage into hash-distributed tables:
* A dedicated SQL pool
Explanation:
* Hash-distributed tables are a feature of dedicated SQL pools; they do not exist in serverless pools.
* Data can be loaded from Data Lake Storage into hash-distributed tables by using COPY INTO or PolyBase.
Box 2: To implement, query, and update data in Delta Lake:
* A serverless Apache Spark pool
Explanation:
* Creating and updating Delta Lake tables (insert, update, merge, delete) requires Apache Spark.
* A serverless SQL pool can only read Delta Lake tables, and a dedicated SQL pool does not support the Delta Lake format, so a Spark pool is the only option that can both query and update the data.
By pairing a dedicated SQL pool for ingestion into hash-distributed tables with a serverless Apache Spark pool for Delta Lake workloads, you satisfy both requirements.
You have an on-premises storage solution.
You need to migrate the solution to Azure. The solution must support Hadoop Distributed File System (HDFS).
What should you use?
A. Azure Data Lake Storage Gen2
B. Azure NetApp Files
C. Azure Data Share
D. Azure Table storage
Correct Answer: A
A. Azure Data Lake Storage Gen2
Azure Data Lake Storage Gen2 is the best choice for migrating your on-premises storage solution to Azure with support for Hadoop Distributed File System (HDFS). It is a highly scalable and cost-effective storage service designed for big data analytics, providing integration with Azure HDInsight, Azure Databricks, and other Azure services. It is built on Azure Blob Storage and combines the advantages of HDFS with Blob Storage, offering a hierarchical file system, fine-grained security, and high-performance analytics.
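A sketch of provisioning such an account (names are placeholders); the hierarchical namespace flag is what turns a v2 storage account into Data Lake Storage Gen2:

```
# Create a Data Lake Storage Gen2 account: StorageV2 with the hierarchical
# namespace enabled, which provides HDFS-style directory semantics.
az storage account create --name <account-name> --resource-group <rg> \
  --kind StorageV2 --hns true --sku Standard_LRS
```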
You have an on-premises app named App1.
Customers use App1 to manage digital images.
You plan to migrate App1 to Azure.
You need to recommend a data storage solution for App1. The solution must meet the following image storage requirements:
- Encrypt images at rest.
- Allow files up to 50 MB.
- Manage access to the images by using Azure Web Application Firewall (WAF) on Azure Front Door.
The solution must meet the following customer account requirements:
- Support automatic scale out of the storage.
- Maintain the availability of App1 if a datacenter fails.
- Support reading and writing data from multiple Azure regions.
Which service should you include in the recommendation for each type of data? To answer, drag the appropriate services to the correct type of data. Each service may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct answer is worth one point.
Drag Drop
Box 1 - Image storage: A. Azure Blob Storage
Azure Blob Storage is a suitable choice for storing digital images, as it supports encryption at rest, handles large file sizes (up to 50 MB or even larger), and can be used in conjunction with Azure Web Application Firewall (WAF) on Azure Front Door.
Box 2 - Customer accounts: B. Azure Cosmos DB
Azure Cosmos DB is a highly scalable, globally distributed, multi-model database service that supports automatic scale-out, ensures high availability even in the event of a datacenter failure, and allows for reading and writing data from multiple Azure regions. This makes it an ideal choice for storing customer account data in your scenario.
You are designing an application that will aggregate content for users.
You need to recommend a database solution for the application. The solution must meet the following requirements:
- Support SQL commands.
- Support multi-master writes.
- Guarantee low latency read operations.
What should you include in the recommendation?
A. Azure Cosmos DB for NoSQL
B. Azure SQL Database that uses active geo-replication
C. Azure SQL Database Hyperscale
D. Azure Cosmos DB for PostgreSQL
Correct Answer: A
A. Azure Cosmos DB for NoSQL
Azure Cosmos DB is a globally distributed, multi-model database service that supports SQL commands, multi-master writes, and guarantees low latency read operations. It supports a variety of NoSQL data models including document, key-value, graph, and column-family. Azure Cosmos DB provides automatic and instant scalability, high availability, and low latency globally by replicating and synchronizing data across multiple Azure regions.
On the other hand, Azure SQL Database and Azure SQL Database Hyperscale are traditional relational database services that do not natively support multi-master writes.
You plan to migrate on-premises MySQL databases to Azure Database for MySQL Flexible Server.
You need to recommend a solution for the Azure Database for MySQL Flexible Server configuration. The solution must meet the following requirements:
- The databases must be accessible if a datacenter fails.
- Costs must be minimized.
Which compute tier should you recommend?
A. Burstable
B. General Purpose
C. Memory Optimized
Correct Answer: B
B. General Purpose
The General Purpose compute tier provides a balance between performance and cost. It is suitable for most common workloads and offers a good combination of CPU and memory resources. It provides high availability and fault tolerance by utilizing Azure’s infrastructure across multiple datacenters. This ensures that the databases remain accessible even if a datacenter fails.
The Burstable compute tier (option A) is designed for workloads with variable or unpredictable usage patterns. It provides burstable CPU performance but may not be the optimal choice for ensuring availability during a datacenter failure.
The Memory Optimized compute tier (option C) is designed for memory-intensive workloads that require high memory capacity. While it provides excellent performance for memory-bound workloads, it may not be necessary for minimizing costs or meeting the specified requirements.
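A sketch of the recommended configuration (the server name, resource group, and SKU are placeholders):

```
# General Purpose tier with zone-redundant high availability: a hot standby in
# another availability zone keeps the databases reachable if a datacenter fails.
az mysql flexible-server create --resource-group <rg> --name <server-name> \
  --tier GeneralPurpose --sku-name Standard_D2ds_v4 \
  --high-availability ZoneRedundant
```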
You are designing an app that will use Azure Cosmos DB to collate sales from multiple countries.
You need to recommend an API for the app. The solution must meet the following requirements:
- Support SQL queries.
- Support geo-replication.
- Store and access data relationally.
Which API should you recommend?
A. Apache Cassandra
B. PostgreSQL
C. MongoDB
D. NoSQL
Correct Answer: B
Store and access data relationally:
- The API for NoSQL stores data in document format, not relationally.
- MongoDB stores data in a document structure (BSON format).
Support SQL queries:
- Apache Cassandra uses Cassandra Query Language (CQL), not SQL.
If you are looking for a managed open-source relational database with high performance and geo-replication, Azure Cosmos DB for PostgreSQL is the recommended choice.
You have an app that generates 50,000 events daily.
You plan to stream the events to an Azure event hub and use Event Hubs Capture to implement cold path processing of the events. The output of Event Hubs Capture will be consumed by a reporting system.
You need to identify which type of Azure storage must be provisioned to support Event Hubs Capture, and which inbound data format the reporting system must support.
What should you identify? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Answer Area
Storage Type: Event Hubs Capture allows captured data to be written to either Azure Blob Storage or Azure Data Lake Storage Gen2. However, for cold path processing scenarios, which involve analyzing historical data, Azure Data Lake Storage Gen2 is the more suitable choice. It’s designed for big data analytics workloads and offers better performance and scalability for working with large datasets captured from event hubs.
Inbound Data Format: Event Hubs Capture uses Avro format for the captured data. Avro is a widely used open-source data format specifically designed for data exchange. It’s a row-oriented, binary format that provides rich data structures with inline schema definition. This makes it efficient for storage and easy for various analytics tools and reporting systems to understand and process the captured event data.
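A sketch of enabling Capture on an existing event hub (the names and the storage account resource ID are placeholders); the captured files are written in Avro:

```
# Enable Event Hubs Capture to a storage container every 300 seconds.
az eventhubs eventhub update --resource-group <rg> \
  --namespace-name <namespace> --name <hub-name> \
  --enable-capture true \
  --capture-interval 300 \
  --destination-name EventHubArchive.AzureBlockBlob \
  --storage-account <storage-account-resource-id> \
  --blob-container capture
```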
You have the resources shown in the following table.
CDB1 hosts a container that stores continuously updated operational data.
You are designing a solution that will use AS1 to analyze the operational data daily.
You need to recommend a solution to analyze the data without affecting the performance of the operational data store.
What should you include in the recommendation?
A. Azure Data Factory with Azure Cosmos DB and Azure Synapse Analytics connectors
B. Azure Synapse Analytics with PolyBase data loading
C. Azure Synapse Link for Azure Cosmos DB
D. Azure Cosmos DB change feed
The correct answer is C. Azure Synapse Link for Azure Cosmos DB.
Azure Synapse Link for Azure Cosmos DB creates a tight integration between Azure Cosmos DB and Azure Synapse Analytics, allowing you to run near real-time analytics over operational data in Azure Cosmos DB. It creates a “no-ETL” (Extract, Transform, Load) environment that allows you to analyze data directly without affecting the performance of the transactional workload, which is exactly what is required in this scenario.
A. Azure Data Factory with Azure Cosmos DB and Azure Synapse Analytics connectors would require ETL operations which might impact the performance of the operational data store.
B. Azure Synapse Analytics with PolyBase data loading is more appropriate for loading data from external data sources such as Azure Blob Storage or Azure Data Lake Storage.
D. Azure Cosmos DB change feed exposes a stream of changes for downstream processing, but consuming it still reads from the transactional store and requires custom processing logic, so it doesn't address the need for analytics without affecting the performance of the operational data store.
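A minimal sketch of preparing a container for Synapse Link with the azure-cosmos Python SDK; the account endpoint, key, and names are placeholders, and Synapse Link must also be enabled at the account level:

```python
from azure.cosmos import CosmosClient, PartitionKey

# Placeholder endpoint and key for illustration.
client = CosmosClient("https://cdb1.documents.azure.com:443/", credential="<key>")
db = client.create_database_if_not_exists("ops")

# analytical_storage_ttl=-1 retains analytical-store data indefinitely, so
# Synapse queries the column store instead of the transactional store.
container = db.create_container_if_not_exists(
    id="operational-data",
    partition_key=PartitionKey(path="/region"),
    analytical_storage_ttl=-1,
)
```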
You have an Azure subscription. The subscription contains an Azure SQL managed instance that stores employee details, including social security numbers and phone numbers.
You need to configure the managed instance to meet the following requirements:
- The helpdesk team must see only the last four digits of an employee’s phone number.
- Cloud administrators must be prevented from seeing the employee’s social security numbers.
What should you enable for each column in the managed instance? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Phone number column: dynamic data masking. Dynamic data masking helps prevent unauthorized access to sensitive data by enabling customers to designate how much of the sensitive data to reveal, with minimal effect on the application layer. A partial mask can expose only the last four digits of the phone number to the helpdesk team.
Social security number column: Always Encrypted. Always Encrypted is a feature designed to protect sensitive data, such as credit card numbers or national/regional identification numbers (for example, U.S. social security numbers), stored in Azure SQL Database, Azure SQL Managed Instance, and SQL Server databases. Because the data is encrypted on the client side and the keys are never revealed to the database engine, even high-privileged cloud administrators cannot view the plaintext values.
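A minimal sketch of the masking half, running the documented T-SQL partial() masking function through pyodbc against a hypothetical Employees table; the connection string is a placeholder:

```python
import pyodbc

# Placeholder connection string for the managed instance.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=mi1.public.abc123.database.windows.net,3342;"
    "DATABASE=hr;UID=admin;PWD=<password>;Encrypt=yes"
)

with conn.cursor() as cur:
    # Mask all but the last four digits of the phone number for users
    # who lack the UNMASK permission (e.g., the helpdesk team).
    cur.execute(
        "ALTER TABLE dbo.Employees ALTER COLUMN PhoneNumber "
        "ADD MASKED WITH (FUNCTION = 'partial(0,\"XXX-XXX-\",4)')"
    )
    conn.commit()
```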
You plan to use an Azure Storage account to store data assets.
You need to recommend a solution that meets the following requirements:
- Supports immutable storage
- Disables anonymous access to the storage account
- Supports access control list (ACL)-based Azure AD permissions
What should you include in the recommendation?
A. Azure Files
B. Azure Data Lake Storage
C. Azure NetApp Files
D. Azure Blob Storage
Correct Answer: B
Here's a breakdown of why Azure Data Lake Storage is the best fit for the given requirements:
Supports immutable storage:
* Azure Data Lake Storage Gen2 offers immutable storage through:
* Time-based retention policies: Data can be kept in a write-once, read-many (WORM) state for a specified duration, preventing modification or deletion.
* Legal hold: Data can be placed on legal hold, restricting any changes or deletions until the hold is cleared.
* Protected append writes: A time-based policy can optionally permit appending new blocks while still preventing modification or deletion of existing data.
Disables anonymous access to the storage account:
* Azure Data Lake Storage Gen2 allows you to configure network rules and access control lists (ACLs) to strictly control who can access the storage account. You can disable public access entirely, ensuring that only authorized users can interact with the data.
Supports access control list (ACL)-based Azure AD permissions:
* Azure Data Lake Storage Gen2 integrates with Azure Active Directory (AD) to provide granular access control. You can use ACLs to assign permissions to individual users, groups, or service principals, allowing fine-grained control over who can access and modify data within the storage account.
Additional considerations:
* Azure Files: While Azure Files supports ACL-based permissions, it doesn’t offer immutable storage or the same level of granular access control as Azure Data Lake Storage Gen2.
* Azure NetApp Files: Azure NetApp Files is primarily designed for enterprise-grade file shares and doesn’t offer immutable storage or the same level of integration with Azure AD as Azure Data Lake Storage Gen2.
* Azure Blob Storage: Blob Storage does support immutable storage and can disable anonymous access, but without a hierarchical namespace it lacks the POSIX-style, ACL-based Azure AD permissions that Azure Data Lake Storage Gen2 provides, so it cannot meet the third requirement.
By choosing Azure Data Lake Storage, you can ensure that your data assets are stored securely, with strict control over who can access and modify them, while also benefiting from the immutability features to protect against accidental or malicious data changes.
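A minimal sketch of the ACL requirement, using the azure-storage-file-datalake package to grant a single Azure AD user read and execute permissions on a directory; the account, file system, directory, and object ID are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholder ADLS Gen2 account (hierarchical namespace enabled).
service = DataLakeServiceClient(
    account_url="https://datalake1.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)

fs = service.get_file_system_client("assets")
directory = fs.get_directory_client("curated")

# POSIX-style ACL: base entries plus one Azure AD user (by object ID)
# granted read + execute on this directory.
directory.set_access_control(
    acl="user::rwx,group::r-x,other::---,"
        "user:00000000-0000-0000-0000-000000000000:r-x"
)
```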
You are designing a storage solution that will ingest, store, and analyze petabytes (PBs) of structured, semi-structured, and unstructured text data. The analyzed data will be offloaded to Azure Data Lake Storage Gen2 for long-term retention.
You need to recommend a storage and analytics solution that meets the following requirements:
* Stores the processed data
* Provides interactive analytics
* Supports manual scaling, built-in autoscaling, and custom autoscaling
What should you include in the recommendation? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Recommendation:
- For storage and interactive analytics: Azure Data Explorer
- Azure Data Explorer is optimized for rapid ingestion and querying of large volumes of data, making it suitable for petabyte-scale data processing.
- It supports a variety of data formats, including structured, semi-structured, and unstructured text data.
- It offers interactive query capabilities, allowing for rapid exploration and analysis of data.
- It provides built-in autoscaling and supports manual scaling to handle varying workloads.
- Query language: KQL (Kusto Query Language)
- KQL is specifically designed for Azure Data Explorer and offers powerful capabilities for querying and analyzing large datasets.
- It provides a rich set of functions and operators for data manipulation and exploration.
Explanation:
- Azure Data Explorer’s high performance, scalability, and support for various data formats make it an ideal choice for storing and analyzing petabytes of text data.
- KQL provides the necessary tools for efficient and interactive data exploration within Azure Data Explorer.
By combining Azure Data Explorer and KQL, you can effectively ingest, store, analyze, and offload petabytes of text data to Azure Data Lake Storage Gen2 for long-term retention.
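A minimal sketch of an interactive query against the cluster with the azure-kusto-data package, authenticating via the Azure CLI; the cluster URI, database, and TextData table are placeholders:

```python
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Placeholder cluster and database for illustration.
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://adx1.westeurope.kusto.windows.net"
)
client = KustoClient(kcsb)

# A typical interactive KQL query: recent documents summarized by source.
query = """
TextData
| where ingestion_time() > ago(1d)
| summarize docs = count() by Source
| order by docs desc
"""
response = client.execute("analytics", query)
for row in response.primary_results[0]:
    print(row["Source"], row["docs"])
```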
You plan to use Azure SQL as a database platform.
You need to recommend an Azure SQL product and service tier that meets the following requirements:
* Automatically scales compute resources based on the workload demand
* Provides per second billing
What should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
The correct options to select are:
- Azure SQL product: A single Azure SQL database
- Service tier: Hyperscale
Here’s why:
- A single Azure SQL database is the most appropriate choice for this scenario, as it provides a fully managed database service that can be scaled to meet your specific needs.
- Hyperscale is the best service tier here when combined with the serverless compute model: compute automatically scales up and down within configured minimum and maximum vCore limits to match workload demand. This ensures optimal performance and cost-efficiency.
- Serverless compute is also billed per second of usage, which means you only pay for the resources you use, resulting in more accurate and granular billing.
The other options are not as suitable:
- An Azure SQL Database elastic pool is not the best choice for this scenario, as it is designed to share resources across multiple databases. While it can provide some level of scalability, it may not be as flexible or efficient as a single Azure SQL database.
- Azure SQL Managed Instance is a more complex option that is better suited for migrating on-premises SQL Server workloads to Azure. It may not be the best choice for a new, cloud-native application.
- The Basic, Standard, and Business Critical service tiers do not offer automatic compute scaling. General Purpose can also run serverless, but Hyperscale adds rapid, near-limitless storage growth alongside the serverless compute model.
Q34 T2
You have an Azure subscription.
You need to deploy a solution that will provide point-in-time restore for blobs in storage accounts that have blob versioning and blob soft delete enabled.
Which type of blob should you create, and what should you enable for the accounts? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Answer Area
Blob Type: Block
Enable: The change feed
Explanation:
Point-in-time restore applies only to block blobs, and it has three prerequisites on the storage account: blob versioning, blob soft delete, and the change feed. Versioning and soft delete are already enabled in this scenario, so the remaining setting to enable is the change feed, which records every create, update, and delete so the account's containers can be rolled back to an earlier state.
The other options (Immutable blob storage, Object replication, and A stored access policy) are not prerequisites for point-in-time restore and are configured separately for other use cases.
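A minimal sketch of enabling the prerequisites and the restore policy together through the azure-mgmt-storage management SDK, as I understand its models; all resource names are placeholders, and note that the restore window must be shorter than the soft-delete retention:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    BlobServiceProperties, ChangeFeed, DeleteRetentionPolicy, RestorePolicyProperties,
)

# Placeholder subscription, resource group, and account names.
client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

client.blob_services.set_service_properties(
    "rg1", "storacct1",
    BlobServiceProperties(
        is_versioning_enabled=True,                                            # prerequisite
        delete_retention_policy=DeleteRetentionPolicy(enabled=True, days=35),  # soft delete
        change_feed=ChangeFeed(enabled=True),                                  # prerequisite
        # The restore window must be shorter than the soft-delete retention.
        restore_policy=RestorePolicyProperties(enabled=True, days=30),
    ),
)
```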
Q35 T2
Your company, named Contoso, Ltd., has an Azure subscription that contains the following resources:
- An Azure Synapse Analytics workspace named contosoworkspace1
- An Azure Data Lake Storage account named contosolake1
- An Azure SQL database named contososql1
The product data of Contoso is copied from contososql1 to contosolake1.
Contoso has a partner company named Fabrikam Inc. Fabrikam has an Azure subscription that contains the following resources:
- A virtual machine named FabrikamVM1 that runs Microsoft SQL Server 2019
- An Azure Storage account named fabrikamsa1
Contoso plans to upload the research data on FabrikamVM1 to contosolake1. During the upload, the research data must be transformed to the data formats used by Contoso.
The data in contosolake1 will be analyzed by using contosoworkspace1.
You need to recommend a solution that meets the following requirements:
- Upload and transform the FabrikamVM1 research data.
- Provide Fabrikam with restricted access to snapshots of the data in contosoworkspace1.
What should you recommend for each requirement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
For restricted access, use Azure Data Share:
Azure Data Share enables organizations to securely share data with multiple customers and partners. Data providers are always in control of the data they've shared, and Azure Data Share makes it simple to manage and monitor what data was shared, when, and by whom.
In this case, snapshot-based sharing should be used, because Fabrikam requires restricted access to snapshots of the data rather than to the live workspace.
Q36 T2
You are designing a data pipeline that will integrate large amounts of data from multiple on-premises Microsoft SQL Server databases into an analytics platform in Azure. The pipeline will include the following actions:
- Database updates will be exported periodically into a staging area in Azure Blob storage.
- Data from the blob storage will be cleansed and transformed by using a highly parallelized load process.
- The transformed data will be loaded to a data warehouse.
- Each batch of updates will be used to refresh an online analytical processing (OLAP) model in a managed serving layer.
- The managed serving layer will be used by thousands of end users.
You need to implement the data warehouse and serving layers.
What should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Data Warehouse: An Azure Synapse Analytics dedicated SQL pool
Serving Layer: Azure Analysis Services
The prompt asks us to select the appropriate options for implementing the data warehouse and serving layer in an Azure data pipeline.
Data Warehouse:
- An Azure Synapse Analytics dedicated SQL pool: The best fit. It is a massively parallel processing (MPP) engine optimized for SQL-based analytical (OLAP) workloads, supports highly parallelized loading from Blob storage (for example, with PolyBase or COPY INTO), and scales to meet the needs of large organizations.
- An Apache Spark pool in Azure Synapse Analytics: A distributed computing framework suited to code-first data engineering and machine learning, but not a SQL-serving data warehouse.
- Azure Data Lake Analytics: A pay-as-you-go, job-based service for ad-hoc analysis of data in place; it is not a data warehouse.
Serving Layer:
- Azure Analysis Services: This is a good choice for serving OLAP models to thousands of end users. It is a fully managed service that provides high performance and scalability.
- An Apache Spark pool in Azure Synapse Analytics: While Apache Spark can be used for serving OLAP models, it is not as optimized for this purpose as Azure Analysis Services.
- An Azure Synapse Analytics dedicated SQL pool: While a dedicated SQL pool can be used for serving OLAP models, it is not as optimized for this purpose as Azure Analysis Services.
This combination will provide high performance, scalability, and flexibility for both the data warehouse and serving layer.
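A minimal sketch of the highly parallelized load step into the dedicated SQL pool, issuing a T-SQL COPY INTO statement through pyodbc; the server, staging container, and table are placeholders:

```python
import pyodbc

# Placeholder connection to the dedicated SQL pool.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=synapse1.sql.azuresynapse.net;DATABASE=dw;"
    "UID=loader;PWD=<password>;Encrypt=yes"
)

with conn.cursor() as cur:
    # COPY INTO reads all matching staged files from Blob storage in parallel.
    cur.execute(
        """
        COPY INTO dbo.StagedUpdates
        FROM 'https://staging1.blob.core.windows.net/exports/updates/*.csv'
        WITH (
            FILE_TYPE = 'CSV',
            FIRSTROW = 2,
            CREDENTIAL = (IDENTITY = 'Managed Identity')
        )
        """
    )
    conn.commit()
```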
Q37 T2
You have an Azure subscription.
You need to deploy a relational database. The solution must meet the following requirements:
- Support multiple read-only replicas.
- Automatically load balance read-only requests across all the read-only replicas.
- Minimize administrative effort
What should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
As part of the requirement to support multiple read-only replicas with automatic load balancing,
Hyperscale is the right choice: it supports multiple readable secondary replicas, and read-intent connections are distributed across them automatically. The Business Critical tier exposes only one readable secondary through read scale-out, so it cannot satisfy the multiple-replica requirement.
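Read-only requests reach the replicas simply by marking the connection as read-intent; the service then balances those sessions across the readable replicas with no extra administration. A minimal pyodbc sketch with placeholder server and database names:

```python
import pyodbc

# ApplicationIntent=ReadOnly routes this session to a read-only replica;
# the service distributes such sessions across the readable replicas.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sqlsrv1.database.windows.net;DATABASE=hyperdb;"
    "UID=reader;PWD=<password>;Encrypt=yes;ApplicationIntent=ReadOnly"
)

with conn.cursor() as cur:
    # Confirm which kind of replica served the connection.
    cur.execute("SELECT DATABASEPROPERTYEX(DB_NAME(), 'Updateability')")
    print(cur.fetchone()[0])  # 'READ_ONLY' on a replica
```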
You have an app named App1 that uses an Azure Blob Storage container named app1data.
App1 uploads a cumulative transaction log file named File1.txt to a block blob in app1data once every hour. File1.txt only stores transaction data from the current day.
You need to ensure that you can restore the last uploaded version of File1.txt from any day for up to 30 days after the file was overwritten. The solution must minimize storage space.
What should you include in the solution?
A. container soft delete
B. blob snapshots
C. blob soft delete
D. blob versioning
Correct Answer: D
Justification:
Blob versioning: Automatically keeps the previous version of a blob each time it is overwritten, enabling you to restore the last uploaded version from any day (a lifecycle management rule can delete versions older than 30 days to bound retention).
Storage efficiency: For block blobs, versions share unchanged blocks, so only the changed data consumes additional storage, minimizing the space required.
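A minimal sketch of the restore with the azure-storage-blob package: list the retained versions of File1.txt, pick one, and copy it over the current blob. The connection string is a placeholder:

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string for the storage account.
service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("app1data")

# List every retained version of File1.txt (requires versioning enabled).
versions = [
    b for b in container.list_blobs(name_starts_with="File1.txt", include=["versions"])
    if b.name == "File1.txt"
]
# Version IDs are timestamps, so sorting gives chronological order;
# [-2] picks the previous version (assumes at least two exist).
target = sorted(versions, key=lambda b: b.version_id)[-2]

# Promote the selected version by copying it over the current blob.
blob = container.get_blob_client("File1.txt")
blob.start_copy_from_url(f"{blob.url}?versionid={target.version_id}")
```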
You have 12 on-premises data sources that contain customer information and consist of Microsoft SQL Server, MySQL, and Oracle databases.
You have an Azure subscription.
You plan to create an Azure Data Lake Storage account that will consolidate the customer information for analysis and reporting.
You need to recommend a solution to automatically copy new information from the data sources to the Data Lake Storage account by using extract, transform and load (ETL). The solution must minimize administrative effort.
What should you include in the recommendation?
A. Azure Data Factory
B. Azure Data Explorer
C. Azure Data Share
D. Azure Data Studio
Correct Answer: A
Here's why Azure Data Factory is the most suitable choice:
* Azure Data Factory is a fully managed cloud-based ETL service that allows you to automate data movement and transformation between on-premises and cloud data sources.
* It provides a drag-and-drop interface for creating and managing data pipelines, making it easy to design and automate the ETL process.
* Azure Data Factory supports various data sources, including Microsoft SQL Server, MySQL, and Oracle databases, making it compatible with your on-premises data sources.
* It offers built-in data transformation capabilities, allowing you to clean, filter, and aggregate data before loading it into the Data Lake Storage account.
* Azure Data Factory also supports scheduling and monitoring of data pipelines, ensuring that data is copied consistently and reliably.
Here’s a breakdown of why the other options are not as suitable:
* B. Azure Data Explorer: It’s a fast and highly scalable data exploration service that is optimized for ad-hoc queries on large datasets. While it can be used to analyze data in the Data Lake Storage account, it doesn’t provide the ETL capabilities needed to automatically copy data from on-premises sources.
* C. Azure Data Share: It’s a service that allows you to share data between Azure subscriptions and external parties. While it can be used to share data from the Data Lake Storage account, it doesn’t provide the ETL capabilities needed to copy data from on-premises sources.
* D. Azure Data Studio: It’s a SQL Server management tool that provides a graphical interface for managing and querying SQL Server databases. While it can be used to extract data from SQL Server databases, it doesn’t provide the ETL capabilities needed to automate data movement and transformation between different data sources.
Therefore, based on the requirements of automating data copying from on-premises data sources to the Data Lake Storage account, Azure Data Factory is the most suitable solution. It offers a comprehensive set of features for ETL, including data source support, data transformation, scheduling, and monitoring, while minimizing administrative effort.
Q1 T3
You have SQL Server on an Azure virtual machine. The databases are written to nightly as part of a batch process.
You need to recommend a disaster recovery solution for the data. The solution must meet the following requirements:
✑ Provide the ability to recover in the event of a regional outage.
✑ Support a recovery time objective (RTO) of 15 minutes.
✑ Support a recovery point objective (RPO) of 24 hours.
✑ Support automated recovery.
✑ Minimize costs.
What should you include in the recommendation?
A. Azure virtual machine availability sets
B. Azure Disk Backup
C. an Always On availability group
D. Azure Site Recovery
Correct Answer: D
Replication with Azure Site Recovery:
✑ RTO is typically less than 15 minutes.
✑ RPO: one hour for application consistency and five minutes for crash consistency.
Incorrect Answers:
B: Restoring from Azure Disk Backup is too slow to meet the 15-minute RTO, and recovery is not automated.
C: An Always On availability group replicates to the secondary replica asynchronously across regions, so some data loss is possible; it also requires a second SQL Server VM running continuously, which does not minimize costs.
Q3 T3
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You need to deploy resources to host a stateless web app in an Azure subscription. The solution must meet the following requirements:
✑ Provide access to the full .NET framework.
✑ Provide redundancy if an Azure region fails.
✑ Grant administrators access to the operating system to install custom application dependencies.
Solution: You deploy two Azure virtual machines to two Azure regions, and you create an Azure Traffic Manager profile.
Does this meet the goal?
A. Yes
B. No
Correct Answer: A
Azure Traffic Manager is a DNS-based traffic load balancer that enables you to distribute traffic optimally to services across global Azure regions, while providing high availability and responsiveness.
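A minimal sketch of fronting the two regional VMs with a Traffic Manager profile via the azure-mgmt-trafficmanager package, as I understand its models; the subscription, resource names, and endpoint resource IDs are placeholders, and priority routing gives automatic failover to the second region:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.trafficmanager import TrafficManagerManagementClient
from azure.mgmt.trafficmanager.models import (
    DnsConfig, Endpoint, MonitorConfig, Profile,
)

client = TrafficManagerManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Priority routing: region 2 only receives traffic if region 1 fails health probes.
client.profiles.create_or_update(
    "rg1", "webapp-tm",
    Profile(
        location="global",
        traffic_routing_method="Priority",
        dns_config=DnsConfig(relative_name="webapp-tm", ttl=60),
        monitor_config=MonitorConfig(protocol="HTTPS", port=443, path="/health"),
        endpoints=[
            Endpoint(
                name="region1",
                type="Microsoft.Network/trafficManagerProfiles/azureEndpoints",
                target_resource_id="<vm1-public-ip-resource-id>",
                priority=1,
            ),
            Endpoint(
                name="region2",
                type="Microsoft.Network/trafficManagerProfiles/azureEndpoints",
                target_resource_id="<vm2-public-ip-resource-id>",
                priority=2,
            ),
        ],
    ),
)
```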
Q4 T3
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You need to deploy resources to host a stateless web app in an Azure subscription. The solution must meet the following requirements:
✑ Provide access to the full .NET framework.
✑ Provide redundancy if an Azure region fails.
✑ Grant administrators access to the operating system to install custom application dependencies.
Solution: You deploy two Azure virtual machines to two Azure regions, and you deploy an Azure Application Gateway.
Does this meet the goal?
A. Yes
B. No
Correct Answer: B
App Gateway will balance the traffic between VMs deployed in the same region. Create an Azure Traffic Manager profile instead.
While Azure Application Gateway is a powerful tool for handling application traffic at the application layer and can assist with routing, load balancing, and other functions, it operates within a single region. It doesn’t automatically provide geo-redundancy across multiple Azure regions.
For redundancy across regions, Azure Traffic Manager or Azure Front Door would be more suitable. They operate at the DNS level and are designed to route traffic across different regions for high availability and failover purposes.
So, in this case, deploying two Azure virtual machines to two Azure regions and deploying an Azure Application Gateway would not fully meet the stated goals due to the lack of a regional failover strategy.