test8 Flashcards
Your organization plans to utilize Azure Blueprints to make sure that sets of Azure resources meet company standards, patterns, and requirements. The blueprints in development consist of: * Role Assignments * Policy Assignments * ARM Templates * Resource Groups
You need to determine the features and capabilities of Azure Blueprints.
For the statement below, choose Yes if it is correct. If not, choose No.
STATEMENT: Deploying a resource by using a blueprint creates a persistent link between the blueprint and the resource.
Yes
No
Yes
Explanation
When you deploy a resource by using a blueprint, Azure maintains a connection between the blueprint assignment and the deployed resource. This differs from deploying an ARM template on its own, and the link improves tracking and auditing of deployments.
Your organization plans to utilize Azure Blueprints to make sure that sets of Azure resources meet company standards, patterns, and requirements. The blueprints in development consist of: * Role Assignments * Policy Assignments * ARM Templates * Resource Groups You need to determine the features and capabilities of Azure Blueprints.
For the statement below, choose Yes if it is correct. If not, choose No. STATEMENT: A subscription owner can bypass the blueprint’s Read Only and Do Not Delete resource locking settings.
Yes
No
No
Explanation
A subscription owner cannot override the Read Only and Do Not Delete locks applied by Azure Blueprints. Blueprint resource locks are enforced through a system-level deny assignment, so even subscription owners cannot modify or delete the protected resources while the blueprint assignment exists. These locks enforce security and prevent accidental changes or deletions, ensuring adherence to organizational policies.
Your organization plans to utilize Azure Blueprints to make sure that sets of Azure resources meet company standards, patterns, and requirements. The blueprints in development consist of: * Role Assignments * Policy Assignments * ARM Templates * Resource Groups You need to determine the features and capabilities of Azure Blueprints.
For the statement below, choose Yes if it is correct. If not, choose No.
STATEMENT: RBAC roles can be assigned at the subscription level or for specific resource groups within a blueprint.
Yes
No
Yes
Explanation
Role-based access control (RBAC) roles can be set up for either a subscription or specific resource groups within a blueprint. These RBAC role assignments can be included as artifacts in an Azure Blueprint, with specified roles and permissions.
Company A uses an Azure Logic App called LApp1 to handle some HR workflows. LApp1 requires access to employee data stored in a SQL database named DB1, which is hosted on-premises (DC). Company policy restricts DB1 from accessing the internet. You need to create a solution that allows LApp1 to query data from DB1. Which two resources should be part of your design? Each correct answer is a component of the solution.
An on-premises data gateway
An Azure connection gateway resource
Microsoft Entra Application Proxy
Azure application registration
An on-premises data gateway
Explanation
For your design, you should include both an on-premises data gateway and an Azure connection gateway resource. The on-premises data gateway enables Azure cloud services to securely connect to on-premises data sources like SQL databases, even if those sources do not have internet access. This gateway is used by various Azure services, including Power BI, Power Apps, Power Automate, Azure Analysis Services, and Azure Logic Apps, and acts as a connection broker between cloud and on-premises resources.
An Azure connection gateway resource
Explanation
Additionally, you should create an Azure connection gateway resource and link it to the on-premises gateway installation. The gateway resource is the cloud-side representation of the installed on-premises data gateway, and the Logic App's SQL Server connection references it to reach DB1.
Your company operates an Active Directory Domain Services (AD DS) forest with a single domain, named company.com, and has several geographic locations. The forest functional level is Windows Server 2016. The company has legacy on-premises applications that depend on AD DS and Kerberos authentication, and there are no plans to rewrite or replace these applications. The company is exploring options to migrate its applications to the cloud and has acquired an Azure subscription, but Microsoft Entra ID has not been configured. For testing purposes, you need to move the legacy applications to Azure and set up a test environment that: - Supports Kerberos authentication from the cloud without needing access to the on-premises network. - Minimizes changes to the applications. - Reduces additional AD management tasks. - Requires minimal effort to configure. You have already set up a single virtual network (VNet) in Azure and a site-to-site (S2S) VPN Gateway connection between the VNet and the on-premises network. To complete the setup and test the applications, what steps should you take?
Create a new VNet and create a new AD DS forest root domain on the new VNet.
Configure a Microsoft Entra domain named company.com.
Deploy one or more domain controllers for company.com on the existing VNet.
Create a child domain of company.com on the existing VNet.
Deploy one or more domain controllers for company.com on the existing VNet.
Explanation
You should install one or more domain controllers for company.com within the existing VNet. By replicating your Active Directory Domain Services (AD DS) environment in the cloud, you ensure support for Kerberos authentication. Because the existing VNet already connects to the on-premises network through the S2S VPN, the new domain controllers can replicate from the on-premises domain; once replication completes, cloud-hosted applications can authenticate with Kerberos without depending on the on-premises link. This approach also minimizes changes to the applications, as they continue to operate within the familiar AD DS environment.
You manage a sophisticated inventory management application named App1, which is running on Azure. When a new client joins, you must add a new data connector to App1. However, as additional connectors are added, App1 suffers from performance slowdowns. To ensure that the addition of new data connectors does not impact App1’s performance, you need to come up with a solution. Both App1 and the data connectors for each client are implemented as microservices. STATEMENT: Use Azure Kubernetes Service (AKS) to automatically scale your App1. Select Yes if the statement is true. Otherwise, select No.
Yes
No
Yes
Explanation
You should utilize Azure Kubernetes Service (AKS) to enable automatic scaling for your App1. AKS provides strong orchestration capabilities that let you deploy additional pods based on compute metrics such as CPU or memory usage. It can automatically scale according to the demands from the data connectors.
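As a minimal sketch (all names are illustrative, not from the scenario), a HorizontalPodAutoscaler in AKS could scale the connector pods based on CPU utilization:

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: app1-connector-hpa      # illustrative name
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: app1-connectors       # hypothetical Deployment hosting the connectors
  minReplicas: 2
  maxReplicas: 20
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70  # add pods when average CPU exceeds 70%
```

AKS can additionally scale the node pool itself with the cluster autoscaler, so both pods and underlying compute grow with connector demand.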
You manage a sophisticated inventory management application named App1, which is running on Azure. When a new client joins, you must add a new data connector to App1. However, as additional connectors are added, App1 suffers from performance slowdowns. To ensure that the addition of new data connectors does not impact App1’s performance, you need to come up with a solution. Both App1 and the data connectors for each client are implemented as microservices. STATEMENT: Deploy additional Docker containers using virtual machines (VMs).
Yes
No
No
Explanation
You should avoid using VMs to deploy additional Docker containers. Simply adding more containers won’t provide auto-scaling capabilities. These containers operate independently and require significant operational expertise to enable communication between them. Instead, you should use a container orchestrator like Kubernetes to manage the interactions between applications.
You have an on-premises network and an Azure subscription. The on-premises network has several branch offices.
A branch office in Toronto contains a virtual machine named VM1 that is configured as a file server. Users access the shared files on VM1 from all the offices.
You need to recommend a solution to ensure that the users can access the shared files as quickly as possible if the Toronto branch office is inaccessible.
What should you include in the recommendation?
a Recovery Services vault and Windows Server Backup
Azure blob containers and Azure File Sync
an Azure file share and Azure File Sync
an Azure file share and Azure File Sync
Explanation
Use Azure File Sync to centralize your organization’s file shares in Azure Files, while keeping the flexibility, performance, and compatibility of an on-premises file server. Azure File Sync transforms Windows Server into a quick cache of your Azure file share.
You have an Azure subscription that contains a custom application named Application1. Application1 was developed by an external company named Fabrikam,
Ltd. Developers at Fabrikam were assigned role-based access control (RBAC) permissions to the Application1 components. All users are licensed for the Microsoft 365 E5 plan.
You need to recommend a solution to verify whether the Fabrikam developers still require permissions to Application1. The solution must meet the following requirements:
To the manager of the developers, send a monthly email message that lists the access permissions to Application1.
If the manager does not verify an access permission, automatically revoke that permission.
Minimize development effort.
What should you recommend?
In Microsoft Entra ID, create an access review of Application1.
In Microsoft Entra ID Privileged Identity Management, create a custom role assignment for the Application1 resources.
Create an Azure Automation runbook that runs the Get-AzureADUserAppRoleAssignment cmdlet.
In Microsoft Entra ID, create an access review of Application1.
Explanation
Access reviews in Microsoft Entra ID let you schedule recurring (for example, monthly) reviews of who has access to an application, notify a designated reviewer such as the developers' manager by email, and automatically remove any access that the reviewer does not approve. All of this is configured in the portal with no code, which minimizes development effort.
You have an Azure virtual machine named VM1 that runs Windows Server 2019 and contains 500 GB of data files. You are designing a solution that will use Azure Data Factory to transform the data files, and then load the files to Azure Data Lake Storage. What should you deploy on VM1 to support the design?
the on-premises data gateway
the Azure Pipelines agent
the self-hosted integration runtime
the Azure File Sync agent
the self-hosted integration runtime
Explanation
The integration runtime (IR) is the compute infrastructure that Azure Data Factory and Synapse pipelines use to provide data-integration capabilities across different network environments. A self-hosted integration runtime can run copy activities between a cloud data store and a data store in a private network. It also can dispatch transform activities against compute resources in an on-premises network or an Azure virtual network. The installation of a self-hosted integration runtime needs an on-premises machine or a virtual machine inside a private network.
You need to recommend a solution to generate a monthly report of all the new Azure Resource Manager (ARM) resource deployments in your Azure subscription.
What should you include in the recommendation?
Azure Arc
Azure Monitor metrics
Azure Advisor
Azure Log Analytics
Azure Log Analytics
Explanation
Azure Log Analytics is a service that collects and analyzes data from various sources, including Azure resources, applications, and operating systems. It provides a centralized location for storing and querying log data, making it an ideal solution for monitoring and analyzing resource deployments. By configuring Log Analytics to collect and store the deployment logs, you can easily query and filter the data to generate a report of all the new ARM resource deployments within a specific time frame, such as a month.
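Assuming the subscription's activity log is routed to a Log Analytics workspace, a query along these lines over the AzureActivity table could feed the monthly report (column names follow the current AzureActivity schema; verify against your workspace):

```kusto
AzureActivity
| where TimeGenerated > ago(30d)
| where OperationNameValue =~ "MICROSOFT.RESOURCES/DEPLOYMENTS/WRITE"
| where ActivityStatusValue == "Success"
| project TimeGenerated, ResourceGroup, Caller, _ResourceId
| order by TimeGenerated desc
```

Such a query can be pinned to a workbook or attached to a scheduled export to produce the report each month.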
You need to recommend the most effective method to stream monitoring data to a third-party tool.
What should you recommend?
Azure Logic Apps
Azure Event Grid
Azure Service Bus
Azure Event Hubs
Azure Event Hubs
Explanation
Azure Event Hubs is a big data streaming platform and event ingestion service that can handle millions of events per second. It is designed for high-throughput, real-time event streaming, and Azure Monitor diagnostic settings natively support streaming monitoring data to an event hub, making it the most effective method for delivering that data to a third-party tool.
You are planning to move an online shop application to Azure and deploy it in the East US and West US regions.
Your solution must meet the following requirements:
> East US is the primary region where all customer requests should go.
> If the East US region fails, the web application in the West US region must take over automatically.
You need to recommend a solution.
Routing Configuration: Azure Load Balancer
Routing Method: Priority routing
Routing Configuration: Azure Traffic Manager
Routing Method: Priority routing
Routing Configuration: Azure Traffic Manager
Routing Method: Weighted routing
Routing Configuration: Azure Traffic Manager
Routing Method: Priority routing
Explanation
Azure Traffic Manager is the correct choice for this scenario as it provides DNS-based traffic routing to different Azure regions or global datacenters. By using the Priority routing method, you can specify the order in which endpoints should be used, ensuring that customer requests are directed to the primary region (East US) first. In case of a failure in the primary region, traffic will automatically failover to the secondary region (West US).
https://learn.microsoft.com/en-us/azure/traffic-manager/traffic-manager-routing-methods
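The failover behavior of priority routing can be illustrated with a small sketch: endpoints are tried in priority order (lowest number first), and the first healthy one receives all traffic.

```python
# Illustrative model of Traffic Manager priority routing: all traffic goes
# to the healthy endpoint with the lowest priority number.
endpoints = [
    {"name": "east-us", "priority": 1, "healthy": True},
    {"name": "west-us", "priority": 2, "healthy": True},
]

def select_endpoint(endpoints):
    """Return the name of the highest-priority healthy endpoint, or None."""
    for ep in sorted(endpoints, key=lambda e: e["priority"]):
        if ep["healthy"]:
            return ep["name"]
    return None

print(select_endpoint(endpoints))   # east-us while the primary is up
endpoints[0]["healthy"] = False     # simulate an East US outage
print(select_endpoint(endpoints))   # traffic fails over to west-us
```

In the real service, health is determined by Traffic Manager endpoint monitoring probes, and the "routing" happens at the DNS layer rather than per request.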
Which compute service should you recommend for the following scenario?
Scenario: Migrate an on-premises, cloud-optimized, HPC workload application.
Azure Batch
Azure Kubernetes Service
Azure Functions
Azure Batch
Explanation
Azure Batch is the correct choice for migrating an on-premises, cloud-optimized, HPC workload application. Azure Batch allows you to run large-scale parallel and high-performance computing (HPC) applications efficiently in the cloud. It provides job scheduling, auto-scaling of compute resources, and management of tasks, making it ideal for HPC workloads.
https://learn.microsoft.com/en-us/azure/batch/batch-technical-overview
You are designing a database solution for databases hosted in an on-premises environment. Currently, there are 15 databases hosted on individual VMs. Each database has varying usage patterns, and the resources on the VMs are underutilized. You need to recommend a database solution that minimizes costs and reduces administrative overhead. The solution should meet the following requirements:
The solution should have a 99.99% uptime SLA.
The resources in the solution can scale dynamically.
The solution should have geo-replication capabilities.
Which solution should you recommend?
15 Azure SQL databases
15 Azure SQL Managed Instances
15 databases running in SQL Server VMs
An Azure SQL Database elastic pool hosting 15 databases
An Azure SQL Database elastic pool hosting 15 databases
Explanation
An Azure SQL Database elastic pool allows you to share resources among multiple databases, providing cost savings and reducing administrative overhead. The elastic pool can automatically scale resources based on the workload of the databases, ensuring optimal performance and cost efficiency. Additionally, geo-replication capabilities can be configured for individual databases, simplifying disaster recovery and ensuring high availability.
As shown in the architecture below, there is a virtual network named VNET1 with a subnet named subnet1.
The organization has two groups of servers: Web Servers and Management Servers.
You should be able to RDP into the Management Servers, but not the Web Servers.
The Web Servers should display the IIS web page when accessed from the internet.
The private IP addresses of the VMs change frequently.
How should you group the virtual machines into Web Servers and Management Servers?
Network Rule
Network security groups (NSGs)
Application security groups (ASGs)
Azure Firewall
Application security groups (ASGs)
Explanation
Application Security Groups (ASGs) are used to group virtual machines based on their application roles or functions. By assigning virtual machines to ASGs, you can define network security rules based on group membership, controlling access to different groups of servers such as Web Servers and Management Servers. Because ASG membership is attached to the VM's network interface rather than to an IP address, the rules continue to apply even though the private IP addresses change frequently.
Your company has offices in North America and Europe.
You plan to migrate to Azure.
You need to recommend a networking solution for the new Azure infrastructure. The solution must meet the following requirements:
- The Point-to-Site (P2S) VPN connections of mobile users must connect automatically to the closest Azure region.
- The offices in each region must connect to their local Azure region by using an ExpressRoute circuit.
- Transitive routing between virtual networks and on-premises networks must be supported.
- The network traffic between virtual networks must be filtered by using FQDNs.
What should you include in the recommendation?
Azure Virtual WAN with a secured virtual hub
Virtual network peering and application security groups
Azure Route Server
Azure Network Function Manager
Azure Virtual WAN with a secured virtual hub
Explanation
Azure Virtual WAN with a secured virtual hub is the correct choice as it provides a centralized and scalable solution for connecting multiple Azure regions. It allows for automatic connection of mobile users to the closest Azure region through Point-to-Site VPN connections. Additionally, it supports ExpressRoute circuits for connecting offices in each region to their local Azure region. Transitive routing between virtual networks and on-premises networks is also supported, along with the ability to filter network traffic between virtual networks using FQDNs.
You are designing a point of sale (POS) solution that will be deployed across multiple locations and will use an Azure Databricks workspace in the Standard tier. The solution will include multiple apps deployed to the on-premises network of each location.
You need to configure the authentication method that will be used by the app to access the workspace. The solution must minimize the administrative effort associated with staff turnover and credential management.
What should you configure?
a managed identity
a service principal
a personal access token
a service principal
Explanation
Configuring a service principal is the correct choice for configuring authentication for the app to access the Azure Databricks workspace. A service principal is a security identity used by applications or services to access specific Azure resources. It allows for secure authentication without the need for storing credentials in the app, minimizing administrative effort and improving security.
A managed identity can be used only by resources running in Azure. Here, the apps run on-premises and communicate with Azure resources, so a service principal is required.
Given below is the hierarchy
You need to assign role-based access control (RBAC) permissions to the IT assistant. Your solution must require minimal maintenance.
Which scope should you apply the permissions to?
Resource Group
Management Group
Resource
Subscription
Management Group
Explanation
Applying RBAC permissions at the Management Group level allows you to centrally manage permissions for multiple subscriptions, resource groups, and resources. This approach reduces the need for individual permission assignments and simplifies maintenance tasks, making it the ideal choice for minimal maintenance.
You are developing an auditing application, implemented as multiple Azure functions, that will:
Subscribe to events like virtual machine creation.
Receive events from resources as soon as they happen.
Deploy all infrastructure resources related to the application using Infrastructure as Code (IaC).
Which two Azure services should you recommend?
Azure Service Bus
Azure Event Grid
Azure Bicep
Azure Batch
Azure Event Grid
Explanation
Azure Event Grid is a fully managed event routing service that reacts to events from Azure services in near real time. You can create an event subscription that filters for events such as virtual machine creation, and the Azure Functions handlers execute as soon as those events arrive, making Event Grid a perfect fit for the auditing application's requirements.
Azure Bicep
Explanation
Azure Bicep is a domain-specific language for deploying Azure resources declaratively using Infrastructure as Code (IaC). It allows you to define the infrastructure resources related to the application in a structured and repeatable manner, ensuring consistency and efficiency in deployment. Using Azure Bicep aligns with the requirement to deploy all infrastructure resources related to the application using IaC.
You need to create a Custom Azure Policy for the below requirements:
Restrict the types of VMs that can be deployed.
Deny the creation of resources without an environment tag when they are created.
What is the minimum number of custom policies required?
1
2
3
4
1
Explanation
Only one custom policy definition is required. Because both requirements result in the same Deny effect, a single policy can combine the two conditions with an anyOf block: deny the request if the VM size is not in the allowed list, or if the environment tag is missing.
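As a hedged sketch, a single policy definition could combine both Deny conditions in one anyOf block (the SKU alias, tag name, and parameter name here are illustrative; check them against the policy alias list for your environment):

```json
{
  "mode": "All",
  "parameters": {
    "allowedVmSkus": {
      "type": "Array",
      "metadata": { "displayName": "Allowed VM SKUs" }
    }
  },
  "policyRule": {
    "if": {
      "anyOf": [
        {
          "allOf": [
            { "field": "type", "equals": "Microsoft.Compute/virtualMachines" },
            { "field": "Microsoft.Compute/virtualMachines/sku.name", "notIn": "[parameters('allowedVmSkus')]" }
          ]
        },
        { "field": "tags['environment']", "exists": "false" }
      ]
    },
    "then": { "effect": "deny" }
  }
}
```

Either condition matching triggers the single Deny effect, which is why one definition suffices.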
You need to create a Custom Azure Policy for the below requirements:
Restrict the types of VMs that can be deployed.
Append an automatic tag when they are created.
What is the minimum number of custom policies required?
1
2
3
4
2
Explanation
A single policy definition produces a single effect, such as Deny or Append. This scenario requires two different effects, Deny (restrict VM types) and Append (add the tag), so two custom policy definitions are needed: one with the Deny effect and one with the Append effect.
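As an illustrative sketch, the second definition could use the Append effect to add the tag (the tag name and default value are assumptions; current guidance often prefers the Modify effect for tags, but Append also works):

```json
{
  "mode": "Indexed",
  "policyRule": {
    "if": {
      "field": "tags['environment']",
      "exists": "false"
    },
    "then": {
      "effect": "append",
      "details": [
        {
          "field": "tags['environment']",
          "value": "dev"
        }
      ]
    }
  }
}
```

The Deny definition for VM types would be a separate document with its own policyRule, which is exactly why the minimum count is two.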
Your company has an on-premises Active Directory (AD) forest and a Microsoft Entra ID P1 tenant. All Microsoft Entra users are assigned a P1 license. You plan to have users use the same usernames and passwords for on-premises and Microsoft Entra authentication. Password changes are currently managed through the helpdesk to ensure that passwords remain synchronized.
Your company is considering deploying Microsoft Entra Connect on the on-premises network. You need to identify the features enabled by this deployment that will reduce the management overhead for your network infrastructure and helpdesk personnel.
Which two features could you use?
Password writeback
Self-service password reset
Identity Protection
Privileged Identity Management (PIM)
Password writeback
Explanation
The password writeback feature of Microsoft Entra Connect allows password changes made in the cloud to be written back to the on-premises Active Directory. This keeps passwords synchronized between the on-premises and cloud environments without manual intervention from the helpdesk.
Self-service password reset
Explanation
Self-service password reset (SSPR) lets users reset their own passwords without contacting the helpdesk. Combined with password writeback, a reset performed in the cloud is written back to the on-premises Active Directory, reducing the workload on helpdesk personnel and the overall management overhead for the network infrastructure.
Your organization, ABC, has a contract with another organization, XYZ. XYZ does not have accounts in ABC's Microsoft Entra tenant. ABC needs to provide access to the images stored in ABC's blob storage account.
You need to design a strategy to allow XYZ secure access to the images.
Provide the Primary Access Keys
Provide the Secondary Access Keys
Create a Shared access signature (SAS)
Create a Shared access signature (SAS)
Explanation
Creating a Shared Access Signature (SAS) is the correct choice as it allows you to grant limited access to specific resources in the storage account for a specific period of time. This way, you can securely share access to the images stored in the blob storage account with XYZ organization without exposing the account keys.
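The key idea behind a SAS is that the token carries its own proof of authorization: an HMAC-SHA256 signature computed with the account key over the granted permissions and expiry, so the key itself never leaves ABC. The sketch below is a simplified illustration of that principle; real Azure SAS tokens sign a much richer "string to sign" defined in the service SAS documentation, and the parameter names here only mimic the real ones.

```python
import base64
import hashlib
import hmac
from datetime import datetime, timedelta, timezone
from urllib.parse import urlencode

def make_sas(account_key_b64: str, permissions: str, expiry: str, resource: str) -> str:
    # Simplified string-to-sign; the real service SAS covers many more fields.
    string_to_sign = "\n".join([permissions, expiry, resource])
    key = base64.b64decode(account_key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    # sp = permissions, se = expiry, sr = resource type, sig = signature
    return urlencode({"sp": permissions, "se": expiry, "sr": resource, "sig": sig})

expiry = (datetime.now(timezone.utc) + timedelta(hours=24)).strftime("%Y-%m-%dT%H:%M:%SZ")
demo_key = base64.b64encode(b"demo-key").decode()  # placeholder, not a real account key
token = make_sas(demo_key, "r", expiry, "b")       # read-only, blob, 24-hour window
print(token)
```

Because the storage service recomputes the same HMAC on each request, tampering with the permissions or expiry invalidates the signature, which is what makes a time-limited, least-privilege grant to XYZ safe.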
You are developing a sales application that will contain several Azure cloud services and handle different components of a transaction. Different cloud services will process customer orders, billing, payment, inventory, and shipping.
You need to recommend a solution to enable the cloud services to asynchronously communicate transaction information by using XML messages.
What should you include in the recommendation?
Azure Data Lake
Azure Notification Hubs
Azure Service Bus
Azure Blob Storage
Azure Service Bus
Explanation
Azure Service Bus is a messaging service that enables reliable and secure asynchronous communication between applications and services. It supports various messaging patterns, including publish/subscribe and message queues, making it suitable for communicating transaction information using XML messages between different cloud services in a decoupled manner.
Azure Service Bus queues are well suited for transferring commands from producers to consumers.
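For illustration, the kind of XML payload one cloud service might place on a Service Bus queue for another to consume could be built like this (element names are hypothetical, not from the scenario):

```python
import xml.etree.ElementTree as ET

# Hypothetical order message exchanged between the ordering and billing services.
order = ET.Element("Order", id="1001")
ET.SubElement(order, "Customer").text = "Contoso"
ET.SubElement(order, "Total").text = "249.99"
payload = ET.tostring(order, encoding="unicode")
print(payload)
```

The producer would send this string as the message body and the consumer would parse it back with the same library, with Service Bus guaranteeing reliable, asynchronous delivery in between.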
Which three of the following can you store in Azure Key Vault?
Secrets
Policies
Tags
Blueprints
Keys
Certificates
Secrets
Keys
Certificates
You have deployed multiple instances of an e-commerce application and load balanced them by using Azure Application Gateway.
Select Yes if the statement is true. Otherwise, select No.
Statement: If an incoming request to the e-commerce website includes “/images” in the path, you can route it to a backend pool that has been optimized for storing images.
Yes
No
Yes
Explanation
Yes, this statement is correct. By configuring routing rules in Azure Application Gateway, you can route incoming requests based on specific path-based rules. In this case, if the incoming request includes “/images” in the URL, you can direct it to a backend pool optimized for storing images, ensuring efficient handling and delivery of image content.
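The routing decision itself is simple to model: the gateway compares the request path against its path rules and sends the request to the first matching pool, falling back to a default pool. A minimal sketch (pool and path names are illustrative):

```python
# Simplified model of Application Gateway URL path-based routing:
# first matching path prefix wins; unmatched requests go to the default pool.
path_rules = [
    ("/images", "image-pool"),
    ("/video", "video-pool"),
]

def route(path: str, default_pool: str = "web-pool") -> str:
    for prefix, pool in path_rules:
        if path.startswith(prefix):
            return pool
    return default_pool

print(route("/images/logo.png"))  # image-pool
print(route("/checkout"))         # web-pool
```

In the real gateway this mapping is configured as a URL path map on a path-based routing rule rather than in code.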
You are hosting a business critical application in the cloud. For the application, you expect low I/O latency from the database.
You need to meet the below requirements:
The solution should be able to scale compute and storage independently.
The solution must provide a secondary read-only replica for running analytics queries.
Which database tier should you use?
DTU-based Premium
vCore-based Business Critical
vCore-based General Purpose
DTU-based Standard
vCore-based Business Critical
Explanation
vCore-based Business Critical tier in Azure SQL Database offers high-performance storage and compute resources with the ability to scale them individually. It is designed for business-critical applications that require low I/O latency and the ability to scale compute and storage independently. Additionally, it supports read-only replicas for running analytics queries, making it a suitable choice for the given requirements.
You have data coming from various sources that exhibits schema drift. You need to recommend a data preparation solution to prepare the data for analysis. The solution must minimize development effort.
Which Azure service should you recommend?
Azure Data Factory
Azure Functions
Azure Logic apps
Azure Stream Analytics
Azure Data Factory
Explanation
Azure Data Factory is the correct choice for this scenario. It is a cloud-based data integration service that lets you create, schedule, and manage data pipelines for data movement and transformation. Its visual interface builds data workflows without extensive code, and its mapping data flows can handle schema drift natively, making it an ideal low-effort solution for preparing data with drift issues.
Your organization has 500 TB of files in its on-premises environment.
You need to transfer these files to Azure Blob Storage. The solution must require minimal effort.
Which solution should you recommend?
Azure Import/Export
Azure Data Box Heavy
Azure Data Box
Azure Resource Mover
Azure Data Box Heavy
Explanation
Azure Data Box Heavy is designed for transferring large amounts of data to Azure storage by shipping a ruggedized device to the organization’s location. It is a suitable solution for transferring 500 TB of files to Azure Blob Storage with minimal effort, as it simplifies the data transfer process and minimizes the organization’s involvement.
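A quick capacity check shows why Data Box Heavy is the lower-effort option. The usable capacities below are approximate and should be verified against the current Azure documentation (standard Data Box roughly 80 TB usable, Data Box Heavy roughly 770 TB usable):

```python
import math

DATA_TB = 500           # data to migrate, from the scenario
DATA_BOX_TB = 80        # approximate usable capacity, standard Data Box
DATA_BOX_HEAVY_TB = 770 # approximate usable capacity, Data Box Heavy

data_box_devices = math.ceil(DATA_TB / DATA_BOX_TB)          # multiple shipments
heavy_devices = math.ceil(DATA_TB / DATA_BOX_HEAVY_TB)       # a single shipment
print(data_box_devices, heavy_devices)
```

Under these assumptions, standard Data Box would need seven device shipments while one Data Box Heavy handles the entire 500 TB, which is what "minimal effort" comes down to here.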
You have an Azure subscription that contains a storage account. An application sometimes writes duplicate files to the storage account. You have a PowerShell script that identifies and deletes duplicate files in the storage account. Currently, the script is run manually after approval from the operations manager. You need to recommend a serverless solution that performs the following actions
✑ Runs the script once an hour to identify whether duplicate files exist
✑ Sends an email notification to the operations manager requesting approval to delete the duplicate files
✑ Processes an email response from the operations manager specifying whether the deletion was approved
✑ Runs the script if the deletion was approved
What should you include in the recommendation?
Azure Logic Apps and Azure Event Grid
Azure Logic Apps and Azure Functions
Azure Pipelines and Azure Service Fabric
Azure Functions and Azure Batch
Azure Logic Apps and Azure Functions
Explanation
Azure Logic Apps provides the hourly Recurrence trigger and the built-in approval email action that sends a message to the operations manager and waits for the response. Azure Functions hosts the PowerShell script, and the logic app calls the function to run the script, both for the hourly duplicate check and again if the manager approves the deletion. Together the two services meet all four requirements without building a complete app or managing any servers.
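For reference, the hourly schedule in the logic app's workflow definition could look like this fragment (the trigger name is illustrative; this is only the triggers section, not a complete definition):

```json
"triggers": {
  "RunHourly": {
    "type": "Recurrence",
    "recurrence": {
      "frequency": "Hour",
      "interval": 1
    }
  }
}
```

Subsequent actions in the definition would call the function, send the approval email, and branch on the manager's response.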
Your company is using Microsoft Entra ID to control access to applications. A recent audit shows that the Global Administrator role is populated with people who do not need such broad access to resources.
You need to restrict this access; in addition, elevated access should be granted only for a specific period of time, such as an hour, when a person needs access.
Which Microsoft Entra ID service can you use?
Add Conditional Access policies
Use managed identities
Use Privileged Identity Management (PIM) to create additional rules.
Use Privileged Identity Management (PIM) to create additional rules.
Explanation
Privileged Identity Management (PIM) allows you to manage, control, and monitor access within your organization. With PIM, you can create additional rules to restrict access to Global Administrator roles and provide time-limited elevated access to users when needed. This aligns with the requirements stated in the question.
A company has set up a storage account in Azure. They have the following storage requirements.
Ensure that administrators can recover any BLOB data if it has been accidentally deleted.
Have the ability to recover data over a period of 14 days after the deletion has occurred.
Which of the following features of Azure storage could be used for this requirement?
CORS
TDE
Soft Delete
Soft Delete
Explanation
Soft delete is a feature of Azure Blob Storage that allows administrators to recover blob data that has been accidentally deleted, as long as the recovery happens within the configured retention period. The retention period is configurable from 1 to 365 days, so setting it to 14 days (or more) meets the stated requirement.
You plan to deploy an Azure SQL database that will store Personally Identifiable Information (PII).
You need to ensure that only privileged users can view the PII.
What should you include in the solution?
dynamic data masking
role-based access control (RBAC)
Data Discovery & Classification
Transparent Data Encryption (TDE)
dynamic data masking
Explanation
Dynamic data masking is the correct choice as it allows you to define masking rules to protect sensitive data in the database. This feature ensures that only privileged users can view the actual PII data while masking it for other users who do not have the necessary permissions.
You are planning to move data to an Azure Blob Storage account for long-term storage. Data will be retained for at least 180 days, and storage costs should be minimized.
You need to determine the type of storage account that you should create.
General-purpose-v1
General-purpose-v2
BlockBlobStorage
FileStorage
General-purpose-v2
Explanation
General-purpose v2 supports blob storage and lets you choose an access tier (hot, cool, or archive), so data retained for at least 180 days can be kept in a low-cost tier. General-purpose v1 does not support access tiers, and BlockBlobStorage (premium) does not offer the cool and archive tiers.
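The tier decision can be sketched as a simple rule based on the minimum-retention periods of each tier (archive requires 180 days, cool 30 days). The thresholds here reflect those minimums, but the function itself is only an illustrative heuristic, not an Azure API.

```python
def pick_access_tier(retention_days: int, access_frequency: str = "rare") -> str:
    # Archive is cheapest but has a 180-day minimum retention;
    # cool has a 30-day minimum; otherwise fall back to hot.
    if access_frequency == "rare" and retention_days >= 180:
        return "archive"
    if retention_days >= 30:
        return "cool"
    return "hot"

print(pick_access_tier(180))  # archive
```

For the 180-day requirement in this question, rarely accessed data lands in the archive tier, which minimizes storage cost.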
You need to recommend a solution that identifies sign-in risks and provides remediation strategies in your Microsoft Entra ID environment.
Microsoft Entra Privileged Identity Management
Azure Policies
Microsoft Entra ID Protection
Microsoft Entra ID Protection
Explanation
Microsoft Entra ID Protection is designed to help organizations secure their identities, detect potential vulnerabilities, and provide remediation strategies to mitigate risks. It includes features like risk-based conditional access, identity protection reports, and automated remediation options to enhance the security of the Microsoft Entra ID environment.
You have an Azure web app that uses an Azure key vault named KeyVault1 in the West US Azure region. You are designing a disaster recovery plan for KeyVault1. You plan to back up the keys in KeyVault1. You need to identify to where you can restore the backup.
What should you identify?
any region worldwide
the same region only
the same geography only
KeyVault1 only
the same geography only
Explanation
Key vault backups can be restored only to a key vault in the same Azure subscription and the same Azure geography, so a backup of KeyVault1 in West US can be restored within the same geography only. Reference: Relocate Azure Key Vault to another region | Microsoft Learn
Which Load Balancing solution would you recommend for the below requirements:
Support for HTTP/HTTPS and layer 7 routing
SSL termination
Autoscaling based on traffic patterns
Protection from common vulnerabilities.
Azure Load Balancer
Azure Application Gateway
Azure Traffic Manager
Azure Application Gateway
Explanation
Azure Application Gateway is a Layer 7 load balancer that supports HTTP/HTTPS protocols and provides advanced routing capabilities, including URL-based routing and SSL termination. It also offers autoscaling based on traffic patterns through autoscaling rules and provides protection from common vulnerabilities through Web Application Firewall (WAF) features. This makes it the recommended choice for the given requirements.
You need to design a highly available Azure SQL database that meets the following requirements:
✑ Failover between replicas of the database must occur without any data loss.
✑ The database must remain available in the event of a zone outage.
✑ Costs must be minimized.
Which deployment option should you use?
Azure SQL Database Hyperscale
Azure SQL Database Premium
Azure SQL Database Basic
Azure SQL Managed Instance General Purpose
Azure SQL Database Premium
Explanation
Azure SQL Database Premium provides high availability with automatic failover between local replicas and no data loss, and it supports a zone-redundant configuration that places the replicas across availability zones, keeping the database available during a zone outage. Of the options that meet both availability requirements, it is the lowest-cost choice.
You are designing a data storage solution to support reporting. The solution will ingest high volumes of data in the JSON format by using Azure Event Hubs. As the data arrives, Event Hubs will write the data to storage. The solution must meet the following requirements:
✑ Organize data in directories by date and time.
✑ Allow stored data to be queried directly, transformed into summarized tables, and then stored in a data warehouse.
✑ Ensure that the data warehouse can store 50 TB of relational data and support between 200 and 300 concurrent read operations.
Which service should you recommend for each type of data store?
Data Store for the Ingested data _______________.
Data Store for data Warehouse _______________ .
Azure Data lake Storage Gen2 , Azure SQL Database Hyperscale
Azure Data lake Storage Gen2 , Azure Synapse Analytics dedicated SQL pools.
Azure Data lake Storage Gen2 , Azure Cosmos DB SQL API
Azure Data lake Storage Gen2 , Azure SQL Database Hyperscale
Explanation
Azure SQL Database Hyperscale supports relational databases well beyond 50 TB and can serve 200-300 concurrent read operations, whereas Azure Synapse Analytics dedicated SQL pools cap concurrent queries at 128. The documentation discusses this trade-off under "How can I choose between Azure Synapse Analytics and Azure SQL Database Hyperscale?"
You are planning to move 10 web applications with SQL databases to Azure. You must be able to change connection strings and passwords and rotate secrets across all the applications and SQL databases, and users must stay connected to the web applications.
You need to recommend a solution. Administrative and development effort must be minimized.
Use Azure Key Vault
Use Azure App Configuration
Update Web.config
Use Azure App Configuration
Explanation
Azure App Configuration centrally manages application settings and feature flags. Because all 10 applications read their settings from one place, connection strings and secrets can be updated across every application and database without redeploying the apps, so users stay connected.
Which Azure SQL Database service tier provides the fastest recovery time for a database?
General purpose
Business critical
Hyperscale
Business critical
Explanation
The Business Critical tier in Azure SQL Database is specifically optimized for high-performance OLTP workloads that require the fastest recovery time in case of failures. It offers features like in-memory technologies and accelerated database recovery to ensure minimal downtime and quick recovery.
You need to recommend an Azure SQL Database service tier that supports a 40-TB database. The solution must provide the fastest failover time.
Which service tier should you recommend?
General Purpose
Business critical
Hyperscale
Hyperscale
Explanation
The Hyperscale service tier is specifically designed for large databases that require high performance, scalability, and fast failover. It supports databases up to 100 TB and provides automatic scaling, high availability, and accelerated database recovery. The General Purpose and Business Critical tiers cap database size at 4 TB, so Hyperscale is the only tier that can host a 40-TB database while also delivering fast failover.
You are tasked with developing a business continuity plan for three Azure SQL databases that are critical to a high-transaction application. Each database is in a separate subscription. The application must meet the following availability requirements:
Maximum downtime: 45 mins
Data must be no older than 10 seconds
Considering that your estimated costs for geo-replication are lower than the potential financial liability, you need to recommend a recovery method for the Azure SQL databases that meets the specified RPO and RTO.
Which recovery method should you recommend?
Point-in-time database restoration
Geo-restore from geo-replicated backups
Manual database failover
Auto-failover groups
Auto-failover groups
Explanation
Auto-failover groups provide automatic failover capabilities for Azure SQL databases, ensuring high availability and minimal downtime in case of a disaster. This method meets the specified RPO and RTO requirements by automatically failing over to a secondary replica within seconds, ensuring that data is no older than 10 seconds and downtime is minimized to meet the 45-minute maximum requirement.
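The elimination logic can be made concrete by checking each candidate method against the stated RPO (data no older than 10 seconds) and RTO (at most 45 minutes). The per-method figures below are illustrative assumptions for the sketch, not official SLA numbers.

```python
REQUIRED_RPO_S = 10        # data must be no older than 10 seconds
REQUIRED_RTO_S = 45 * 60   # at most 45 minutes of downtime

# Illustrative recovery characteristics (assumed values, not SLAs).
methods = {
    "point-in-time restore": {"rpo_s": 600,  "rto_s": 12 * 3600},
    "geo-restore":           {"rpo_s": 3600, "rto_s": 12 * 3600},
    "auto-failover group":   {"rpo_s": 5,    "rto_s": 5 * 60},
}

viable = [name for name, m in methods.items()
          if m["rpo_s"] <= REQUIRED_RPO_S and m["rto_s"] <= REQUIRED_RTO_S]
print(viable)  # ['auto-failover group']
```

Only the continuously replicated option satisfies both constraints; restore-based methods fail the 10-second RPO outright.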
Will there be a penalty if data is deleted before 180 days in the archive tier?
Yes
No
Yes
Explanation
Yes. An early deletion fee, prorated for the days remaining in the 180-day minimum retention period, will be charged.
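The proration works out as the archive storage price applied to the unmet remainder of the 180-day minimum. The sketch below uses a placeholder per-GB-per-month price; actual archive rates vary by region, and the function name is hypothetical.

```python
def early_deletion_fee(days_stored: int, size_gb: float,
                       archive_price_per_gb_month: float = 0.002,
                       min_days: int = 180) -> float:
    # Charge for the days remaining in the 180-day minimum,
    # converted from a monthly price (30-day month assumed).
    remaining_days = max(0, min_days - days_stored)
    return size_gb * archive_price_per_gb_month * remaining_days / 30

# Deleting 1000 GB after 60 days leaves 120 days unmet:
print(round(early_deletion_fee(60, 1000), 2))  # 8.0
```

Data that has already aged past 180 days incurs no fee, since `remaining_days` is clamped to zero.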
You are planning to do migration from on-prem to cloud using Cloud Adoption Framework.
What are the three main phases in migration adoption?
Plan
Assess
Deploy
Ready
Release
Assess
Explanation
Assessing the current on-premises environment, workloads, applications, and data is a crucial phase in the migration adoption process. This phase helps identify dependencies, risks, and requirements for a successful migration to the cloud.
Deploy
Explanation
Deploying the migrated workloads, applications, and data to the cloud infrastructure is a key phase in the migration adoption process. This phase involves executing the migration plan, testing the migrated resources, and ensuring a smooth transition to the cloud environment.
Release
Explanation
Releasing the migrated workloads, applications, and data to production is a critical phase in the migration adoption process. This phase involves finalizing the migration, verifying the functionality of the migrated resources, and officially transitioning to the cloud environment.
You have an app that generates 50,000 events daily. You plan to stream the events to an Azure event hub and use Event Hubs Capture to implement cold path processing of the events. The output of Event Hubs Capture will be consumed by a reporting system. You need to identify which type of Azure storage must be provisioned to support Event Hubs Capture, and which inbound data format the reporting system must support.
What should you identify? To answer, select the appropriate options in the answer area
Storage Type: Azure Data Lake Storage Gen2
Data format: Avro
Storage Type: Azure Data Lake Storage Gen2
Data format: Apache Parquet
Storage Type: Premium Block blobs
Data format: Avro
Storage Type: Premium Block blobs
Data format: JSON
Storage Type: Azure Data Lake Storage Gen2
Data format: Avro
Explanation
Event Hubs Capture supports both Apache Avro and Apache Parquet formats for output event serialization; however, Parquet is only supported via the Azure Stream Analytics integration. Therefore, if you use Event Hubs Capture to write captured data to your own Azure Blob Storage or Azure Data Lake Storage Gen2 account, the data is serialized in Apache Avro format, and the reporting system must support it.
You are designing an app that will be hosted on Azure virtual machines that run Ubuntu. The app will use a third-party email service to send email messages to users. The third-party email service requires that the app authenticate by using an API key.
You need to recommend an Azure Key Vault solution for storing and accessing the API key. The solution must minimize administrative effort.
What should you recommend using to store and access the key
Storage : Certificate
Access: Managed Identity
Storage : Key
Access: Service Principal
Storage : Key
Access: Managed Identity
Storage : Secret
Access: Managed Identity
Storage : Secret
Access: Managed Identity
Explanation
Storing the API key as a secret in Azure Key Vault and accessing it using a Managed Identity is the most suitable solution for this scenario. Secrets in Azure Key Vault are specifically designed for storing sensitive information like API keys, and using a Managed Identity for access ensures secure and seamless retrieval of the API key without the need for additional credentials or keys. This solution minimizes administrative effort and aligns with the requirement to securely store and access the API key for the third-party email service.
Add the API key as a secret in the Key Vault.
You need to recommend a solution to generate a monthly report of all the new Azure Resource Manager (ARM) resource deployments in your Azure subscription.
What should you include in the recommendation?
Application Insights
Azure Analysis Services
Azure Advisor
Azure Activity Log
Azure Activity Log
Explanation
Azure Activity Log is the correct choice for tracking Azure Resource Manager resource deployments. It records all control-plane activities in an Azure subscription, including resource creations, updates, and deletions, so you can filter it by time range and operation type to generate a monthly report of new resource deployments.
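The filtering step can be sketched over records shaped loosely like Activity Log entries (the field names mirror the log's `operationName`, `status`, and `eventTimestamp` properties, but the sample data is invented for illustration).

```python
from datetime import datetime

# Hypothetical records shaped loosely like Activity Log entries.
events = [
    {"operationName": "Microsoft.Compute/virtualMachines/write",
     "status": "Succeeded", "eventTimestamp": datetime(2024, 3, 5)},
    {"operationName": "Microsoft.Storage/storageAccounts/delete",
     "status": "Succeeded", "eventTimestamp": datetime(2024, 3, 9)},
    {"operationName": "Microsoft.Web/sites/write",
     "status": "Succeeded", "eventTimestamp": datetime(2024, 2, 20)},
]

def monthly_deployments(events, year, month):
    # "/write" operations correspond to ARM create/update deployments.
    return [e for e in events
            if e["eventTimestamp"].year == year
            and e["eventTimestamp"].month == month
            and e["operationName"].endswith("/write")
            and e["status"] == "Succeeded"]

print(len(monthly_deployments(events, 2024, 3)))  # 1
```

Deletions and events from other months are excluded, leaving only the March 2024 resource deployment.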
You have an Azure subscription that contains 10 web apps. The apps are integrated with Microsoft Entra ID and are accessed by users on different project teams.
The users frequently move between projects.
You need to recommend an access management solution for the web apps. The solution must meet the following requirements:
- The users must only have access to the app of the project to which they are assigned currently.
- Project managers must verify which users have access to their project’s app and remove users that are no longer assigned to their project.
- Once every 30 days, the project managers must be prompted automatically to verify which users are assigned to their projects.
What should you include in the recommendation?
Microsoft Entra ID Protection
Microsoft Entra Permissions Management
Microsoft Entra ID Governance
Microsoft Entra ID Governance
Explanation
Microsoft Entra ID Governance is the correct choice because it provides capabilities for managing and governing user identities and access within Microsoft Entra ID. Entitlement management can grant users access only to the app of the project they are currently assigned to, and recurring access reviews let project managers verify which users have access to their project's app, remove users who are no longer assigned, and be prompted automatically every 30 days.