AZ-305 Flashcards
Which of the following would you use to restrict access to Key Vault?
Access policies for Key Vault
An Azure Policy
RBAC
Azure AD Multi-Factor Authentication
Access policies for Key Vault
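A minimal Python sketch (assuming the azure-identity and azure-keyvault-secrets packages; the vault and secret names are hypothetical) of why access policies matter: the read below succeeds only if the caller has been granted the Get secret permission through a vault access policy (or an RBAC role on RBAC-enabled vaults).
```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Authenticate as the current user/app and open a client against the vault.
credential = DefaultAzureCredential()
client = SecretClient(vault_url="https://contoso-kv.vault.azure.net", credential=credential)

# Fails with a 403 unless an access policy (or RBAC role) grants "Get" on secrets.
secret = client.get_secret("db-password")
print(secret.name)
```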
Requirement: All data in the storage account is encrypted at rest
Azure Storage Encryption
Azure Disk Encryption
Always Encrypted
Transparent Data Encryption
Azure Storage Encryption
To the manager of the developers, send a monthly email message that lists the access permissions to Application1.
If the manager does not verify an access permission, automatically revoke that permission.
Minimize development effort
A. In Azure Active Directory (Azure AD), create an access review of Application1.
B. Create an Azure Automation runbook that runs the Get-AzRoleAssignment cmdlet.
C. In Azure Active Directory (Azure AD) Privileged Identity Management, create a custom role assignment for the Application1 resources.
D. Create an Azure Automation runbook that runs the Get-AzureADUserAppRoleAssignment cmdlet.
A. In Azure Active Directory (Azure AD), create an access review of Application1.
Some users work remotely and do NOT have VPN access to the on-premises network. You need to provide the remote users with single sign-on (SSO) access to WebApp1. Select two.
A. Azure AD Application Proxy
B. Azure AD Privileged Identity Management (PIM)
C. Conditional Access policies
D. Azure Arc
E. Azure AD enterprise applications
F. Azure Application Gateway
A,E
✑ The evaluation must be repeated automatically every three months.
✑ Every member must be able to report whether they need to be in Group1.
✑ Users who report that they do not need to be in Group1 must be removed from Group1 automatically.
✑ Users who do not report whether they need to be in Group1 must be removed from Group1 automatically.
What should you include in the recommendation?
A. Implement Azure AD Identity Protection.
B. Change the Membership type of Group1 to Dynamic User.
C. Create an access review.
D. Implement Azure AD Privileged Identity Management (PIM).
C. Create an access review.
You need to recommend a design for the planned Databricks deployment. The solution must meet the following requirements:
✑ Ensure that the data engineers can only access folders to which they have permissions.
✑ Minimize development effort.
✑ Minimize costs.
Databricks SKU: Premium or Standard
Cluster Config:
Credential Passthrough
Managed Identities
MLFlow
Secret Scope
Premium, Credential Passthrough
You need to analyze the network traffic to identify whether packets are being allowed or denied to the virtual machines.
Solution: Use Azure Traffic Analytics in Azure Network Watcher to analyze the network traffic.
Does this meet the goal?
No. Instead, use Azure Network Watcher IP Flow Verify, which allows you to detect traffic-filtering issues at the VM level.
Users can connect to the app without being prompted for authentication:
Azure AD App registration
Azure AD Managed identity
Azure AD App Proxy
Users can only access apps from company-owned computers:
A conditional access policy
Azure AD administrative unit
Azure Application Gateway
Azure Blueprints
Azure Policy
Azure AD App registration
A conditional access policy
You need to use Azure Monitor to design an alerting strategy for security-related events.
Which Azure Monitor Logs tables should you query?
Select one table for Windows and one for Linux.
Azure Activity
Azure Diagnostics
Event
Syslog
Windows: Event; Linux: Syslog
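A small sketch (assuming the azure-monitor-query package; the workspace ID is hypothetical) of querying the two tables named in the answer: Event for Windows warning events and Syslog for Linux warnings.
```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Event holds Windows event log records; Syslog holds Linux syslog records.
queries = [
    'Event | where EventLevelName == "Warning"',
    'Syslog | where SeverityLevel == "warning"',
]

for query in queries:
    response = client.query_workspace(
        workspace_id="<workspace-guid>",  # hypothetical Log Analytics workspace ID
        query=query,
        timespan=timedelta(hours=24),
    )
    for table in response.tables:
        print(f"{len(table.rows)} rows returned by: {query}")
```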
To which three scopes can you assign Azure Policy definitions?
A. Azure Active Directory (Azure AD) administrative units
B. Azure Active Directory (Azure AD) tenants
C. subscriptions
D. compute resources
E. resource groups
F. management groups
C,E,F
Your on-premises network contains a server named Server1 that runs an ASP.NET application named App1.
You have a hybrid deployment of Azure Active Directory (Azure AD).
You need to recommend a solution to ensure that users sign in by using their Azure AD account and Azure Multi-Factor Authentication (MFA) when they connect to App1 from the internet.
Which three features should you recommend be deployed and configured in sequence?
A public load balancer
A managed identity
An internal Azure load balancer
A conditional access policy
An Azure App Service plan
Azure AD Application Proxy
An Azure AD enterprise application
- Application Proxy
- Enterprise Application
- Conditional Access
You need to recommend a solution to generate a monthly report of all the new Azure Resource Manager (ARM) resource deployments in your Azure subscription.
What should you include in the recommendation?
A. Azure Activity Log
B. Azure Advisor
C. Azure Analysis Services
D. Azure Monitor action groups
A. Azure Activity Log
Your company, named Contoso, Ltd., implements several Azure logic apps that have HTTP triggers. The logic apps provide access to an on-premises web service.
Contoso establishes a partnership with another company named Fabrikam, Inc.
Fabrikam does not have an existing Azure Active Directory (Azure AD) tenant and uses third-party OAuth 2.0 identity management to authenticate its users.
Developers at Fabrikam plan to use a subset of the logic apps to build applications that will integrate with the on-premises web service of Contoso.
You need to design a solution to provide the Fabrikam developers with access to the logic apps. The solution must meet the following requirements:
✑ Requests to the logic apps from the developers must be limited to lower rates than the requests from the users at Contoso.
✑ The developers must be able to rely on their existing OAuth 2.0 provider to gain access to the logic apps.
✑ The solution must NOT require changes to the logic apps.
✑ The solution must NOT use Azure AD guest accounts.
What should you include in the solution?
A. Azure Front Door
B. Azure AD Application Proxy
C. Azure AD business-to-business (B2B)
D. Azure API Management
Many APIs support OAuth 2.0 to secure the API and ensure that only valid users have access, and they can only access resources to which they’re entitled. To use Azure API Management’s interactive developer console with such APIs, the service allows you to configure your service instance to work with your OAuth 2.0 enabled API.
Incorrect:
Azure AD business-to-business (B2B) uses guest accounts.
Azure AD Application Proxy is for on-premises scenarios.
You have an Azure subscription that contains 300 virtual machines that run Windows Server 2019.
You need to centrally monitor all warning events in the System logs of the virtual machines.
Resources to create:
Event Hub
Log Analytics workspace
Search engine
Storage account
Configuration on the VMs:
Create event subscriptions
Configure CD
Install the Azure Monitor agent
Modify the membership of the Event Log Readers group
Resources to create: a Log Analytics workspace
Configuration on the VMs: Install the Azure Monitor agent
Security:
Get alerts about changes in administrator assignments
Development:
Enable Key Vault access
Quality:
Require temporary admin roles
Azure AD Privileged Identity Management
Azure Managed Identity
Azure AD connect
Azure AD Identity Protection
Security: PIM
Development: MI
Quality: PIM
East region: Sub1, Sub2 / Tenant1
West region: Sub3, Sub4 / Tenant2
# of management groups = ? (1, 2, 3, 4)
# of blueprint definitions = ? (1, 2, 3, 4)
# of blueprint assignments = ? (1, 2, 3, 4)
# of management groups = 2
# of blueprint definitions = 2
# of blueprint assignments = 2
✑ For new resources, assign tags and values that match the tags and values of the resource group to which the resources are deployed.
✑ For existing resources, identify whether the tags and values match the tags and values of the resource group that contains the resources.
✑ For any non-compliant resources, trigger auto-generated remediation tasks to create missing tags and values.
The solution must use the principle of least privilege.
What should you include in the design?
Azure Policy Effect to use:
Append
EnforceOPAConstraint
EnforceRegoPolicy
Modify
RBAC for remediation tasks:
Managed Identity with Contributor
Managed Identity with User Access Administrator
Service Principal with Contributor
Service Principal with User Access Administrator
Azure Policy Effect to use:
Modify
RBAC for remediation tasks:
Managed Identity with Contributor
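A rough sketch of what the then-block of such a policy definition can look like, written here as a Python dict that mirrors the policy JSON (the tag name is hypothetical). The Modify effect lists the role(s) the remediation managed identity needs; Contributor's built-in role definition ID is shown.
```python
# Sketch of an Azure Policy "modify" effect that copies a tag from the resource group.
modify_effect = {
    "effect": "modify",
    "details": {
        "roleDefinitionIds": [
            # Built-in Contributor role, granted to the policy's managed identity
            "/providers/Microsoft.Authorization/roleDefinitions/b24988ac-6180-42a0-ab88-20f7382dd24c"
        ],
        "operations": [
            {
                "operation": "addOrReplace",
                "field": "tags['costCenter']",  # hypothetical tag name
                "value": "[resourceGroup().tags['costCenter']]",
            }
        ],
    },
}
```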
To DB1, you add a diagnostic setting named Settings1. Settings1 archives SQLInsights to storage1 and sends SQLInsights to Workspace1 (an Azure Log Analytics workspace).
T/F
You can add a new diagnostic setting that archives SQLInsights logs to storage2.
You can add a new diagnostic setting that sends SQLInsights logs to Workspace2.
You can add a new diagnostic setting that sends SQLInsights logs to EventHub1.
T,T,T
You plan to deploy an Azure SQL database that will store Personally Identifiable Information (PII).
You need to ensure that only privileged users can view the PII.
What should you include in the solution?
A. dynamic data masking
B. role-based access control (RBAC)
C. Data Discovery & Classification
D. Transparent Data Encryption (TDE)
A. dynamic data masking
Dynamic data masking limits sensitive data exposure by masking it to non-privileged users.
Store data for multiple users
Encrypt each user's data by using a separate key
Encrypt all the data in the storage account by using customer-managed keys
A. files in a premium file share storage account
B. blobs in a general purpose v2 storage account
C. blobs in an Azure Data Lake Storage Gen2 account
D. files in a general purpose v2 storage account
B. blobs in a general purpose v2 storage account
You have an Azure App Service web app that uses a system-assigned managed identity.
You need to recommend a solution to store the settings of the web app as secrets in an Azure key vault. The solution must meet the following requirements:
✑ Minimize changes to the app code.
✑ Use the principle of least privilege.
KeyVault Integration method: ?
KeyVault permission for the managed identity: ?
KeyVault Integration method: Application settings
KeyVault permission for the managed identity: Secrets: Get
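A short sketch of why this minimizes code changes (the setting and secret names are hypothetical): App Service resolves a Key Vault reference in an application setting by using the web app's managed identity, which only needs the Get secret permission, while the app keeps reading an ordinary setting.
```python
import os

# In App Service, an application setting can hold a Key Vault reference such as:
#   @Microsoft.KeyVault(SecretUri=https://contoso-kv.vault.azure.net/secrets/DbPassword/)
# The platform resolves it with the system-assigned managed identity, so the code
# just reads the setting as an environment variable, unchanged:
db_password = os.environ.get("DbPassword")  # hypothetical setting name
```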
You need to recommend a solution to meet the following requirements for the virtual machines that will run App1:
✑ Ensure that the virtual machines can authenticate to Azure Active Directory (Azure AD) to gain access to an Azure key vault, Azure Logic Apps instances, and an Azure SQL database.
✑ Avoid assigning new roles and permissions for Azure services
✑ Avoid storing secrets and certificates on the virtual machines.
✑ Minimize administrative effort for managing identities.
Which type of identity should you include in the recommendation?
A. a system-assigned managed identity
B. a service principal that is configured to use a certificate
C. a service principal that is configured to use a client secret
D. a user-assigned managed identity
D. a user-assigned managed identity
A user-assigned managed identity can be shared across more than one Azure resource.
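A sketch (assuming the azure-identity package; the client ID is hypothetical) of how code on each VM can authenticate with the shared user-assigned identity, whose existing role assignments cover Key Vault, Logic Apps, and Azure SQL without storing any secrets on the VMs.
```python
from azure.identity import ManagedIdentityCredential

# Select the shared user-assigned identity by its client ID (hypothetical value).
credential = ManagedIdentityCredential(client_id="<user-assigned-identity-client-id>")

# Example: acquire a token for Azure SQL Database using that identity.
token = credential.get_token("https://database.windows.net/.default")
print(token.expires_on)
```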
Azure Cosmos DB hosts a container that stores continuously updated operational data.
You are designing a solution that will use AS1 to analyze the operational data daily.
You need to recommend a solution to analyze the data without affecting the performance of the operational data store.
What should you include in the recommendation?
A. Azure Cosmos DB change feed
B. Azure Data Factory with Azure Cosmos DB and Azure Synapse Analytics connectors
C. Azure Synapse Link for Azure Cosmos DB
D. Azure Synapse Analytics with PolyBase data loading
C. Azure Synapse Link for Azure Cosmos DB
The maximum amount of time that the SQL Insights data can be stored in Azure Log Analytics is
30
90
730
indefinite
730
OpenID Connect and OAuth - Choose OpenID Connect and OAuth 2.0 if the application you’re connecting to supports it.
SAML - Choose SAML whenever possible for existing applications that do not use OpenID Connect or OAuth.
Password-based - Choose password-based when the application has an HTML sign-in page. Password-based SSO is also known as password vaulting. Password-based SSO enables you to manage user access and passwords to web applications that don’t support identity federation. It’s also useful where several users need to share a single account, such as to your organization’s social media app accounts. Password-based SSO supports applications that require multiple sign-in fields for applications that require more than just username and password fields to sign in.
The application manages its own credential store.
Users must enter a username and password to access the application. The application does NOT support identity providers.
You plan to upgrade the application to use single sign-on (SSO) authentication by using an Azure Active Directory (Azure AD) application registration.
Which SSO method should you use?
A. header-based
B. SAML
C. password-based
D. OpenID Connect
C. password-based
Connect to VMs from the internet.
✑ Incoming connections to the virtual machines must be authenticated by using Azure Multi-Factor Authentication (MFA) before network connectivity is allowed.
✑ Incoming connections must use TLS and connect to TCP port 443.
✑ The solution must support RDP and SSH.
To access the VMs on the VNet, use:
- Azure Bastion
- JIT VM access
- Azure Web Application Firewall (WAF) in Azure Front Door
To enforce MFA, use:
- Azure Identity Governance access package
- Conditional Access policy that has the Cloud apps assignment set to Azure Windows VM Sign-in
- Conditional Access policy that has the Cloud apps assignment set to Microsoft Azure Management
To access the VMs on the VNet, use:
- Azure Bastion (uses port 443)
To enforce MFA, use:
- Conditional Access policy that has the Cloud apps assignment set to Azure Windows VM Sign-in
All Azure resources must be easily identifiable based on the following operational information: environment, owner, department and cost center.
You need to ensure that you can use the operational information when you generate reports for the Azure resources.
A. an Azure data catalog that uses the Azure REST API as a data source
B. an Azure management group that uses parent groups to create a hierarchy
C. an Azure policy that enforces tagging rules
D. Azure Active Directory (Azure AD) administrative units
C. an Azure policy that enforces tagging rules
There are two companies that use Azure AD. Company A (Contoso) wants to give Contributor access to 10 Company B (Fabrikam) employees.
The employees should use their own credentials.
A. In the Azure AD tenant of Contoso, create cloud-only user accounts for the Fabrikam developers.
B. Configure a forest trust between the on-premises Active Directory forests of Contoso and Fabrikam.
C. Configure an organization relationship between the Microsoft 365 tenants of Fabrikam and Contoso.
D. In the Azure AD tenant of Contoso, create guest accounts for the Fabrikam developers.
D. In the Azure AD tenant of Contoso, create guest accounts for the Fabrikam developers.
B is incorrect because a forest trust is used for internal security, not for external access.
Sub1 contains an Azure App Service web app named App1. App1 uses Azure AD for single-tenant user authentication. Users from contoso.com can authenticate to App1.
You need to recommend a solution to enable users in the fabrikam.com tenant to authenticate to App1.
A. Configure the Azure AD provisioning service.
B. Enable Azure AD pass-through authentication and update the sign-in endpoint.
C. Use Azure AD entitlement management to govern external users.
D. Configure Azure AD join.
C is correct
https://docs.microsoft.com/en-us/azure/active-directory/governance/entitlement-management-external-users
If App1 is a multi-tenant application, A might also be correct, since you could provision users from the other tenant to App1 and configure App1 for SSO with other tenants.
Grants permissions to allow web apps to access the web APIs:
- Azure AD
- Azure API Management
- The web APIs
Configures a JSON Web Token (JWT) validation policy:
- Azure AD
- Azure API Management
- The web APIs
1: Azure AD
2: Azure API Management
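For context, a conceptual sketch (using the third-party PyJWT package; the tenant ID and audience are hypothetical) of the checks a JWT validation policy performs: signature against the tenant's published signing keys, plus audience and issuer. In this flashcard's scenario the validation is done by Azure API Management's policy rather than by your own code.
```python
import jwt
from jwt import PyJWKClient

TENANT_ID = "<tenant-guid>"         # hypothetical Azure AD tenant
AUDIENCE = "api://contoso-web-api"  # hypothetical app ID URI of the web API

# Azure AD publishes its signing keys at the tenant's JWKS endpoint.
jwks_client = PyJWKClient(
    f"https://login.microsoftonline.com/{TENANT_ID}/discovery/v2.0/keys"
)

def validate(token: str) -> dict:
    signing_key = jwks_client.get_signing_key_from_jwt(token)
    # Verifies signature, expiry, audience, and issuer; raises on failure.
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=AUDIENCE,
        issuer=f"https://login.microsoftonline.com/{TENANT_ID}/v2.0",
    )
```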
You have 100 servers that run Windows Server 2012 R2 and host Microsoft SQL Server 2014 instances. The instances host databases that have the following characteristics:
✑ Stored procedures are implemented by using CLR.
✑ The largest database is currently 3 TB. None of the databases will ever exceed 4 TB.
You plan to move all the data from SQL Server to Azure.
You need to recommend a service to host the databases. The solution must meet the following requirements:
✑ Whenever possible, minimize management overhead for the migrated databases.
✑ Ensure that users can authenticate by using Azure Active Directory (Azure AD) credentials.
✑ Minimize the number of database changes required to facilitate the migration.
A. Azure SQL Database elastic pools
B. Azure SQL Managed Instance
C. Azure SQL Database single databases
D. SQL Server 2016 on Azure virtual machines
B. Azure SQL Managed Instance
Azure SQL Database (single or elastic) does not support CLR
You have an Azure subscription that contains an Azure Blob Storage account named store1.
You have an on-premises file server named Server1 that runs Windows Server 2016. Server1 stores 500 GB of company files.
You need to store a copy of the company files from Server1 in store1.
Which two possible Azure services achieve this goal?
A. an Azure Logic Apps integration account
B. an Azure Import/Export job
C. Azure Data Factory
D. an Azure Analysis services On-premises data gateway
E. an Azure Batch account
B. an Azure Import/Export job
C. Azure Data Factory
Multiple applications must be able to read the information about each transaction.
A. one Azure Data Factory pipeline
B. multiple storage account queues
C. one Azure Service Bus queue
D. one Azure Service Bus topic
D. one Azure Service Bus topic
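A sketch (assuming the azure-servicebus package; the topic, subscription, and connection-string values are hypothetical) of why a topic fits: each application gets its own subscription, and every subscription receives its own copy of each transaction message.
```python
from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONN_STR = "<service-bus-connection-string>"  # hypothetical

with ServiceBusClient.from_connection_string(CONN_STR) as client:
    # Publisher: send one transaction message to the topic.
    with client.get_topic_sender(topic_name="transactions") as sender:
        sender.send_messages(ServiceBusMessage('{"id": 42, "amount": 19.99}'))

    # Each consuming application reads from its own subscription on that topic.
    with client.get_subscription_receiver(
        topic_name="transactions", subscription_name="billing-app"
    ) as receiver:
        for msg in receiver.receive_messages(max_message_count=1, max_wait_time=5):
            print(str(msg))
            receiver.complete_message(msg)
```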
✑ Maximize data throughput.
✑ Prevent the modification of data for one year.
✑ Minimize latency for read and write operations.
Storage account type:
BlobStorage
BlockBlobStorage
FileStorage
StorageV2 with Premium perf
StorageV2 with Standard perf
Storage service:
Blob
File
Table
Storage account type: BlockBlobStorage (provides very low latency)
Storage service: Blob
S1: StorageV2 - Standard
S2: StorageV2 - Premium
S3: BlobStorage - Standard
S4: FileStorage - Premium
App1: Uses lifecycle management
App2: Stores app data in an Azure file share
App1:
S1,S2
S1,S3
S1,S2,S3
S1,S2,S3,S4
App2:
S4
S1,S4
S1,S2,S4
S1,S2,S3,S4
App1: Storage1 and storage3 only
App2: Storage1 and storage4 only
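For reference, a rough sketch of the kind of blob lifecycle management rule App1 relies on, written as a Python dict that mirrors the policy JSON (the rule name, prefix, and day thresholds are hypothetical). Lifecycle management operates on blob data in the storage account.
```python
# Sketch of a lifecycle management policy: age blobs to Cool, Archive, then delete.
lifecycle_policy = {
    "rules": [
        {
            "name": "age-out-app1-data",  # hypothetical rule name
            "enabled": True,
            "type": "Lifecycle",
            "definition": {
                "filters": {"blobTypes": ["blockBlob"], "prefixMatch": ["app1/"]},
                "actions": {
                    "baseBlob": {
                        "tierToCool": {"daysAfterModificationGreaterThan": 30},
                        "tierToArchive": {"daysAfterModificationGreaterThan": 180},
                        "delete": {"daysAfterModificationGreaterThan": 365},
                    }
                },
            },
        }
    ]
}
```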
The application will host video files that range from 50 MB to 12 GB. The application will use certificate-based authentication and will be available to users on the internet.
- The solution must provide the fastest read performance and must minimize storage costs.
A. Azure Files
B. Azure Data Lake Storage Gen2
C. Azure Blob Storage
D. Azure SQL Database
C. Azure Blob Storage
Stores large amounts of unstructured data, such as text or binary data, that can be accessed from anywhere in the world
You need to recommend a database platform to host the databases. The solution must meet the following requirements:
✑ The solution must meet a Service Level Agreement (SLA) of 99.99% uptime.
✑ The compute resources allocated to the databases must scale dynamically.
✑ The solution must have reserved capacity.
Compute charges must be minimized.
A. an elastic pool that contains 20 Azure SQL databases
B. 20 databases on a Microsoft SQL server that runs on an Azure virtual machine in an availability set
C. 20 databases on a Microsoft SQL server that runs on an Azure virtual machine
D. 20 instances of Azure SQL Database serverless
A is correct. An elastic pool shares compute across the 20 databases (minimizing compute charges), scales dynamically, supports reserved capacity, and meets the 99.99% SLA.
You need to design the database architecture to meet the following requirements:
✑ Support scaling up and down.
✑ Support geo-redundant backups.
✑ Support a database of up to 75 TB.
✑ Be optimized for online transaction processing (OLTP).
Azure SQL Database Hyperscale (supports databases of up to 100 TB)
You are planning an Azure IoT Hub solution that will include 50,000 IoT devices.
Each device will stream data, including temperature, device ID, and time data. Approximately 50,000 records will be written every second. The data will be visualized in near real time.Which two services can you recommend?
A. Azure Table Storage
B. Azure Event Grid
C. Azure Cosmos DB SQL API
D. Azure Time Series Insights
C. Azure Cosmos DB SQL API
D. Azure Time Series Insights
✑ Support SQL commands.
✑ Support multi-master writes.
✑ Guarantee low latency read operations.
A. Azure Cosmos DB SQL API
B. Azure SQL Database that uses active geo-replication
C. Azure SQL Database Hyperscale
D. Azure Database for PostgreSQL
A. Azure Cosmos DB SQL API
Only Cosmos DB supports multi-master writes:
How should you migrate the data? You can select more than one option.
Microsoft SQL Server 2012 -> an Azure SQL database
A table in Microsoft SQL Server 2014 -> an Azure Cosmos DB account that uses the SQL API
AzCopy
Azure Cosmos DB Data Migration Tool
Data Management Gateway
Data Migration Assistant
1: Data Migration Assistant
2: Azure Cosmos DB data migration tool
You store web access logs data in Azure Blob Storage.
You plan to generate monthly reports from the access logs.
You need to recommend an automated process to upload the data to Azure SQL Database every month.
A. Microsoft SQL Server Migration Assistant (SSMA)
B. Data Migration Assistant (DMA)
C. AzCopy
D. Azure Data Factory
D. Azure Data Factory
Your on-premises network contains a file server named Server1. Server1 stores 5 TB of company files that are accessed rarely.
You plan to copy the files to Azure Storage.
You need to implement a storage solution for the files that meets the following requirements:
✑ The files must be available within 24 hours of being requested.
✑ Storage costs must be minimized.
Which two possible storage solutions achieve this goal?
A. Create an Azure Blob Storage account that is configured for the Cool default access tier. Create a blob container, copy the files to the blob container, and set each file to the Archive access tier.
B. Create a general-purpose v1 storage account. Create a blob container and copy the files to the blob container.
C. Create a general-purpose v2 storage account that is configured for the Cool default access tier. Create a file share in the storage account and copy the files to the file share.
D. Create a general-purpose v2 storage account that is configured for the Hot default access tier. Create a blob container, copy the files to the blob container, and set each file to the Archive access tier.
E. Create a general-purpose v1 storage account. Create a file share in the storage account and copy the files to the file share.
A, D
Archive-tier rehydration can take up to about 15 hours, which meets the 24-hour availability requirement at the lowest cost.
You have an app named App1 that uses two on-premises Microsoft SQL Server databases named DB1 and DB2.
You plan to migrate DB1 and DB2 to Azure
You need to recommend an Azure solution to host DB1 and DB2. The solution must meet the following requirements:
✑ Support server-side transactions across DB1 and DB2.
✑ Minimize administrative effort to update the solution.
A. two Azure SQL databases in an elastic pool
B. two databases on the same Azure SQL managed instance
C. two databases on the same SQL Server instance on an Azure virtual machine
D. two Azure SQL databases on different Azure SQL Database servers
B. two databases on the same Azure SQL managed instance
✑ Failover between replicas of the database must occur without any data loss.
✑ The database must remain available in the event of a zone outage.
✑ Costs must be minimized.
A. Azure SQL Database Hyperscale
B. Azure SQL Database Premium
C. Azure SQL Database Basic
D. Azure SQL Managed Instance General Purpose
B. Azure SQL Database Premium
Not A: Hyperscale is more expensive than Premium.
Not C: Need Premium for Availability Zones.
Not D: Zone redundant configuration that is free on Azure SQL Premium is not available on Azure SQL Managed Instance.
You need to recommend a storage solution that meets the following requirements:
✑ All the data written to storage must be retained for five years.
✑ Once the data is written, the data can only be read. Modifications and deletion must be prevented.
✑ After five years, the data can be deleted, but never modified.
✑ Data access charges must be minimized.
Storage access tier: Archive, Cool, or Hot
Prevent modification and deletion by:
Container Access Level,
Container Access Policy,
Storage Account Access Lock
Hot (lowest data access charges)
Container access policy
The solution will ingest high volumes of data in the JSON format by using Azure Event Hubs. As the data arrives, Event Hubs will write the data to storage. The solution must meet the following requirements:
✑ Organize data in directories by date and time.
✑ Allow stored data to be queried directly, transformed into summarized tables, and then stored in a data warehouse.
✑ Ensure that the data warehouse can store 50 TB of relational data and support between 200 and 300 concurrent read operations.
Which service should you recommend for each type of data store?
Datastore for the ingested data:
Azure Blob Storage
Azure Datalake Storage Gen2
Azure Files
Azure Netapp files
Datastore for the warehouse:
CosmosDb Cassandra API
CosmosDb SQL API
SQL Database Hyperscale
Synapse Analytics dedicated SQL pools
Azure Datalake Storage Gen2
SQL Database Hyperscale
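A sketch (assuming the azure-storage-file-datalake package; the account, filesystem, and path are hypothetical) of the "organize data in directories by date and time" requirement, which Data Lake Storage Gen2 supports with a true hierarchical namespace.
```python
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://contosodatalake.dfs.core.windows.net",  # hypothetical account
    credential=DefaultAzureCredential(),
)
filesystem = service.get_file_system_client("telemetry")  # hypothetical filesystem

# Create a directory path keyed by date and hour, e.g. 2024/05/17/09
now = datetime.now(timezone.utc)
directory = filesystem.get_directory_client(now.strftime("%Y/%m/%d/%H"))
directory.create_directory()

# Land a JSON payload in that directory.
file_client = directory.create_file("events.json")
file_client.upload_data(b'{"deviceId": "sensor-1"}', overwrite=True)
```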
You need to recommend a disaster recovery solution for the data. The solution must meet the following requirements:
✑ Provide the ability to recover in the event of a regional outage.
✑ Support a recovery time objective (RTO) of 15 minutes.
✑ Support a recovery point objective (RPO) of 24 hours.
✑ Support automated recovery.
✑ Minimize costs.
A. Azure virtual machine availability sets
B. Azure Disk Backup
C. an Always On availability group
D. Azure Site Recovery
D. Azure Site Recovery
You need to deploy resources to host a stateless web app in an Azure subscription. The solution must meet the following requirements:
✑ Provide access to the full .NET framework.
Provide redundancy if an Azure region fails.
✑ Grant administrators access to the operating system to install custom application dependencies.
Solution: You deploy two Azure virtual machines to two Azure regions, and you create an Azure Traffic Manager profile.
Does this meet the goal?
Yes
Azure Traffic Manager is a DNS-based traffic load balancer that enables you to distribute traffic optimally to services across global Azure regions, while providing high availability and responsiveness.
You need to deploy resources to host a stateless web app in an Azure subscription. The solution must meet the following requirements:
✑ Provide access to the full .NET framework.
✑ Provide redundancy if an Azure region fails.
✑ Grant administrators access to the operating system to install custom application dependencies.
Solution: You deploy two Azure virtual machines to two Azure regions, and you deploy an Azure Application Gateway.
Does this meet the goal?
No
Application Gateway balances traffic only between VMs in the same region. Create an Azure Traffic Manager profile instead.
You plan to create an Azure Storage account that will host file shares.
The shares will be accessed from on-premises applications that are transaction intensive.
You need to recommend a solution to minimize latency when accessing the file shares.
The solution must provide the highest-level of resiliency for the selected storage tier.
What should you include in the recommendation?
Storage Tier: Hot, Premium, Transaction optimized
Redundancy: Geo-Redundant, Zone-Redundant, Locally-Redundant
Storage tier: Premium
Hot is offered for general-purpose file sharing.
Transaction optimized does not provide the lowest latency.
Redundancy: Zone-Redundant (ZRS)
Premium file shares support only LRS and ZRS.
You need to deploy resources to host a stateless web app in an Azure subscription. The solution must meet the following requirements:
✑ Provide access to the full .NET framework.
✑ Provide redundancy if an Azure region fails.
✑ Grant administrators access to the operating system to install custom application dependencies.
Solution: You deploy an Azure virtual machine scale set that uses autoscaling.
Does this meet the goal?
No
Instead, deploy two Azure virtual machines to two Azure regions and create a Traffic Manager profile.