Az Cloud Academy Certification test Flashcards

1
Q

ASP.NET applications that run in Azure web app can create which of the following kinds of logs?

Application tracing, Web server, Detailed error message, Failed request tracing
Application tracing, Web server, Detailed error message, Access request tracing
Application tracing, Web server, Error message, Access request tracing
Application tracing, Web server, Error message, Successful request tracing

A

ASP.NET applications running in Azure web apps can create the following types of logs:

Application tracing
Web server
Detailed error message
Failed request tracing.

2
Q

An Azure subscription named Subscription 1 contains three resource groups named Development, Test, and Production. Thomas, Logan, and Guy have been assigned roles via role-based access control (RBAC) to access Subscription 1 resources. Logan can perform all read and write operations on all compute and storage resources within the Development and Test resource groups. Guy is an owner of the Development and Test resource groups. Thomas is an owner of Subscription 1. If necessary, who would be able to delete the entire Development resource group and all resources within it?

A

Both Guy and Thomas

3
Q

The junior database administrator at your organization is experimenting with an Azure Stream Analytics parallel job. The query is designed to be embarrassingly parallel. The job input is from an Event Hub with eight partitions. Which of the following would be feasible for the job output?

An Event Hub with 0 partitions
An Event Hub with 16 partitions
A Blob Output
A Blob Output with 8 partitions

A

A Blob Output
The number of input partitions must equal the number of output partitions in order to avoid a partition-count mismatch. Blob output does not currently support partitions; however, it inherits the partitioning scheme of the upstream query. If an Event Hub is used as the output, it must have eight partitions.

4
Q

You and your database administrator are brainstorming ways to monitor memory pressure on a newly installed Azure Redis Cache Premium tier instance. Your database administrator insists that the Cache Misses metric in the Azure Portal is the best way to monitor memory pressure. Why do you advise against using cache misses for monitoring memory pressure?

Cache misses are normal and do not always reflect memory pressure.
Cache misses are more a reflection of server CPU utilization issues and latency issues.
Cache misses result from client/server regional variances and request/response timeouts.
Cache misses can only measure timeout issues resulting from low network bandwidth availability.

A

Cache misses are normal and do not always reflect memory pressure.

Cache misses are not necessarily a bad thing. Not all data can be in the cache at once. When using the cache-aside programming pattern, an application looks first in the cache for an item. If the item is not there (a cache miss), the item is retrieved from the database and added to the cache for next time. Cache misses are normal behavior for the cache-aside programming pattern. Higher-than-expected cache misses may be caused by the application logic that populates and reads from the cache. If items are being evicted from the cache due to memory pressure, there may be some cache misses, but better metrics to monitor for memory pressure are Used Memory and Evicted Keys.
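Below is a minimal cache-aside sketch using the redis-py client; the cache host name, access key, and the load_item_from_database() helper are hypothetical placeholders.

import json
import redis

# Connection details are placeholders for an Azure Cache for Redis instance.
r = redis.StrictRedis(
    host="mycache.redis.cache.windows.net",
    port=6380,
    password="<access-key>",
    ssl=True,
)

def load_item_from_database(item_id):
    # Placeholder for the real database lookup.
    return {"id": item_id, "name": "example"}

def get_item(item_id):
    cached = r.get(f"item:{item_id}")
    if cached is not None:                      # cache hit
        return json.loads(cached)
    item = load_item_from_database(item_id)     # cache miss: normal, go to the database
    r.set(f"item:{item_id}", json.dumps(item), ex=3600)  # populate for next time
    return item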

5
Q

Which of the following Azure PowerShell cmdlets can be used to verify VM encryption status of a Linux VM?

Get-AzVmDiskEncryptionStatus
Get-AzureLinuxVmDiskEncryptionStatus
Get-VmEncryptionStatus
Get-AzureRmLinuxDiskEncryptionStatus

A

Get-AzVmDiskEncryptionStatus

Use the Get-AzVmDiskEncryptionStatus cmdlet to verify the encryption status of a Linux VM.

6
Q

You are designing several message queue services for clients. Service 1 is a delivery system for online invitations with the following specifications: first-in, first-out (FIFO) support is required to ensure messages are delivered in order, and messages must have unlimited time to live (TTL). Service 2 is a billing reminder delivery service with the following specifications: duplicate messages must be detected and removed from the queue automatically, and the messages will average 150 KB in size. Service 3 is a data delivery system for weather data from numerous IoT producers to a central data warehouse for batch processing and eventual data analysis. Its specifications are: the messages will be 10 KB in size, and the service will have to process thousands of messages per second. The data analysis application used with Service 3 performs idempotent operations. Which service(s) would be ideal for Azure Storage Queue?

Service 1 and 3
Service 2 only
Service 3 only
Service 1 and 2.

A

Service 3 only

Explanation
Azure Storage Queues and Azure Service Bus Queues have several similar use cases, but their service limitations make them ideal for specific services.

Storage Queues cannot guarantee FIFO delivery, while Service Bus Queues can.
Storage Queues cannot detect duplicate messages in a queue.
Storage Queues have a maximum message TTL of 7 days, while Service Bus Queues TTL can be unlimited.
Storage Queues have a maximum message size of 64 KB, and although a message can hold a pointer to a larger file stored elsewhere, doing so decreases the speed of the service. Service Bus Queues are capable of delivering larger messages (up to 256 KB on the Standard tier).
Storage Queues are generally recommended for large, asynchronous workflows while Service Bus Queues are ideal for medium-scale transaction workflows.

7
Q

Which Microsoft PowerShell Security Cmdlet converts a secure string to an encrypted standard string?

ConvertTo-EncryptedString
ConvertFrom-EncryptedString
ConvertTo-SecureString
ConvertFrom-SecureString

A

ConvertFrom-SecureString

Explanation
PowerShell has a Security module that consists of cmdlets and providers that manage the basic security features of Windows. To convert a secure string to an encrypted standard string, use the ConvertFrom-SecureString cmdlet.

8
Q

What does Microsoft recommend when choosing an Azure Cosmos DB partition key?

Select partition keys that have high volumes of data for the same value.
Set the same partition key for all your documents.
Select a unique partition key for each document.
Select a partition key that prevents “hot spots” within your application.

A

Select a partition key that prevents “hot spots” within your application.

Explanation
Your choice of partition key should balance the need to enable the use of transactions against the requirement to distribute your entities across multiple partition keys to ensure a scalable solution. It is important to pick a property that allows writes to be distributed across a number of distinct values. Requests to the same partition key cannot exceed the throughput of a single partition, and will be throttled.
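As a rough illustration, this sketch uses the azure-cosmos Python SDK to create a container partitioned on a high-cardinality property; the account endpoint, key, and names are placeholders.

from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient("https://myaccount.documents.azure.com:443/", credential="<account-key>")
database = client.create_database_if_not_exists("appdb")

# Partitioning on a high-cardinality property (e.g. /deviceId) spreads writes across
# many logical partitions and helps avoid "hot spots" on a single partition key value.
container = database.create_container_if_not_exists(
    id="telemetry",
    partition_key=PartitionKey(path="/deviceId"),
)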

9
Q

A blob can be leased to limit write and delete permissions for that specific blob to which of the following scopes?

A single user

A single Azure AD tenant

A single resource group

A single Azure AD group

A

A single user

Explanation
Blob leases limit write and delete permissions to the specific user who has leased the object.

10
Q

You are developing an app that uses Azure Functions and need to write a trigger function that runs immediately on startup, and then every two hours thereafter. How would you code the TimerTrigger attributes to accomplish this task? Assume that you will use these six fields for the scheduling string: {second} {minute} {hour} {day} {month} {day of the week}.

TimerTrigger("0 0 */2 * * *", RunOnStartup = true)
TimerTrigger("2 0 * 0 * * *", TimerInfo = RunOnStartup)
TimerTrigger("2 0 * 0 * * *", RunOnStartup = true)
TimerTrigger("0 0 */2 * * *", TimerInfo = RunOnStartup)

A

TimerTrigger("0 0 */2 * * *", RunOnStartup = true)

Explanation
The TimerTrigger is a fully featured timer trigger for scheduled jobs that supports cron expressions as well as other schedule expressions. The first parameter is a cron expression that declares the schedule: "0 0 */2 * * *" fires at second 0, minute 0 of every second hour. The second parameter, RunOnStartup = true, tells the timer to fire immediately when the function host starts.
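The answer shows the C# attribute; as a hedged sketch, the same schedule in the Azure Functions Python (v2) programming model would look roughly like this (names are illustrative):

import azure.functions as func

app = func.FunctionApp()

# Six-field cron expression: second, minute, hour, day, month, day-of-week.
# run_on_startup=True fires the function once when the host starts.
@app.timer_trigger(schedule="0 0 */2 * * *", arg_name="timer", run_on_startup=True)
def every_two_hours(timer: func.TimerRequest) -> None:
    # Runs at startup and then at second 0, minute 0 of every second hour.
    pass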

11
Q

When configuring Azure Notification Hub push notifications for your Azure App Service mobile app, which credential type is required to allow your mobile backend to connect to your notification hub?

Access policy connection strings

OAuth 2.0 authentication

Managed Service Identity authentication

HubTriggers

A

Access policy connection strings

Explanation
You will need to get the connection string from the Access Policies page. This is the credential that will let your mobile backend actually connect to your hub for pushing messages. It will be part of your mobile backend code.

12
Q

Before you deploy a new application to its production environment, you need to integrate a monitoring solution that sends messages to the development team’s mobile devices. The key requirements for this messaging solution are: It can be deployed with minimal customization or administration required. It can deliver messages to mobile devices running Android and iOS operating systems. Which Azure solution is optimal for this scenario?

Azure Service Bus
Azure Event Hub
Azure Notification Hub
Azure Event Grid

A

Azure Notification Hub

Explanation
This is where Azure Notification Hubs comes in. It is a ready-made smart device notification solution. Need to send push notifications to iPhones, Android phones, or tablets? Notification Hubs is your answer. The great thing about it is that it takes away a lot of the pain involved in supporting a variety of mobile devices. If you have experience as a mobile developer, then you’ll know what I am talking about. Unlike other forms of messaging, push notifications often have tricky platform-dependent logic. Scaling, managing tokens, and routing messages to different segments of users on different hardware and different versions of Android is non-trivial work for even an experienced tech team.

Notification Hubs takes away most of that pain. It lets you broadcast to all platforms with a single interface. It can work both in the cloud and on-premises, and it includes security features like SAS, shared access secrets, and federated authentication.

13
Q

Jeremy will manage security for all applications within two subscriptions, named Subscription 1 and Subscription 2. Jeremy needs to be assigned the appropriate role to manage these resources. This new role has the following requirements: Jeremy needs to be able to assign employees he manages permanent roles within PIM. With his potential ability to assign other employees resource access in PIM, his role assignment will need administrative review. Before management activates his assignment, they would like Jeremy to complete MFA. What Azure resource role assignment within PIM will meet these requirements?

Permanent eligible assignment
Permanent active assignment
An eligible assignment with expiration
An active assignment with expiration

A

Permanent eligible assignment

Explanation
Permanent assignments allow users to assign other users permanent roles within PIM. Eligible assignments require the user to complete an action, which could be a justification for the role or MFA, before activating the role. Active role assignments do not need to be justified or require MFA.

14
Q

You have built a Web App application that keeps returning a 500 error when it is called, and you’re scrambling to understand the underlying issue. What’s your first line of defense?

Turn on Web server diagnostic logs, collect and analyze
Turn on Application server diagnostic logs, collect and analyze
Open a support request to Azure Helpdesk asking for assistance
Open a Kudu console and watch application log stream

A

Turn on Web server diagnostic logs, collect and analyze

Explanation
Azure provides built-in diagnostics to assist with debugging an App Service web app. App Service web apps provide diagnostic functionality for logging information from both the web server and the web application. These are logically separated into web server diagnostics and application diagnostics. In order to enable logging for the web server diagnostics, you simply change the setting on the Azure Portal.

15
Q

You are auditing and updating a small number of critical blobs within an Azure Blob Storage account, and those updates are recorded in a separate on-premises database. The entire update process for each blob takes roughly 30-50 seconds because the on-premises update can lag occasionally. The process has never taken longer than 50 seconds. During this update, you plan to lease each blob individually as you audit the account, to limit the potential effects on ongoing business. You want to lease the blob from the time you begin your update until the time the update is recorded in the on-premises database. Which lease operations should you perform?

Lease the blob for 60 seconds, perform the update, and break the lease.

Lease the blob for 60 seconds, perform the manual update, and release the lease.

Lease the blob indefinitely, perform the manual update and then break the lease.

Lease the blob indefinitely, perform the manual update and, then release the lease.

A

Lease the blob for 60 seconds, perform the update, and break the lease.

Explanation
The key to answering this question correctly is understanding how timed and indefinite (or infinite) leases operate.

Timed and indefinite leases, when released, end immediately.
Timed leases, when broken, last for the remaining time of the lease period and then end.
Indefinite leases, when broken, end immediately.
Therefore, the correct answer is to select a timed lease of 60 seconds, and break it once you’ve completed the manual update. This way, the lease will extend the full 60 seconds while the on-premises database is updated.
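A minimal sketch of that timed-lease flow with the azure-storage-blob (v12) Python SDK, assuming placeholder connection details and a hypothetical perform_update() step:

from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    "<connection-string>", container_name="audit", blob_name="critical.json"
)

def perform_update(blob_client):
    # Placeholder for the 30-50 second update and the on-premises record.
    pass

lease = blob.acquire_lease(lease_duration=60)   # timed lease (15-60 seconds allowed)
try:
    perform_update(blob)
finally:
    # Breaking a timed lease lets the remaining lease time run out before the blob
    # can be leased or modified again; releasing it would end the lease immediately.
    lease.break_lease()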

16
Q

You are a start-up company currently hosting two small web applications, Web App 1 and Web App 2, on Azure Web Apps. Your Web Apps run on three instances on a Basic app service plan. You need to manage both web apps to meet the following requirements: Allow Web App 1 to scale from 5-8 instances based on application workload, as traffic for this web app is growing. Maintain Web App 2 on three separate instances, as this application is also growing more popular. However, Web App 2 does not require scaling capabilities yet. What steps would be most cost-effective and meet your application requirements?

Move Web App 1 to a separate Standard app service plan. Configure auto scaling for Web App 1 between a range of 5 to 8 instances based on application metrics. Keep your existing Basic app service plan for Web App 2.

Scale up to a Premium app service plan. Leave Web App 2 as it is currently configured. Configure auto scaling for Web App 1 between a range of 5 to 8 instances based on application metrics.

Move Web App 1 to a separate Premium app service plan. Configure auto scaling for Web App 1 between a range of 5 to 8 instances based on application metrics. Scale your Basic app service plan down to a Shared service plan for Web App 2.

Move Web App 1 to a separate Premium app service plan. Configure auto scaling for Web App 1 between a range of 5 to 8 instances based on application metrics. Scale up your existing service plan from Basic to Standard for Web App 2.

A

Move Web App 1 to a separate Standard app service plan. Configure auto scaling for Web App 1 between a range of 5 to 8 instances based on application metrics. Keep your existing Basic app service plan for Web App 2.

Explanation
App Service plans are containers for the apps that you deploy in App Service. App Service plans are offered in different tiers, with more functionality provided by higher, more expensive tiers. The following list highlights some of the distinctions between the available tiers:
Free (Windows only): Run a small number of apps for free
Shared (Windows only): Run more apps and provides support for custom domains
Basic: Run unlimited apps and scale up to three instances with built-in load balancing
Standard: The first tier recommended for production workloads. It scales up to ten (10) instances with Autoscaling support and VNet integration to access resources in your Azure virtual networks without exposing them to the internet
Premium: Scale up to 20 instances and additional storage over the standard tier
Isolated: Scale up to 100 instances, runs inside of an Azure Virtual Network isolated from other customers, and supports private access use cases

17
Q

Which PowerShell command will create a new deployment slot for a web app?

New-AzWebAppSlot -ResourceGroupName [resource group name] -Name [web app name] -Slot [deployment slot name] -AppServicePlan [app service plan name]

New-AzDeploymentSlot -ResourceGroupName [resource group name] -Name [web app name] -Slot [deployment slot name] -AppServicePlan [app service plan name]

New-AzWebAppSlot -Name [web app name] -Slot [deployment slot name] -AppServicePlan [app service plan name]

New-AzWebAppDeploymentSlot -ResourceGroupName [resource group name] -Name [web app name] -Slot [deployment slot name] -AppServicePlan [app service plan name]

A

New-AzWebAppSlot -ResourceGroupName [resource group name] -Name [web app name] -Slot [deployment slot name] -AppServicePlan [app service plan name]

Explanation
The correct answer is:

New-AzWebAppSlot -ResourceGroupName [resource group name] -Name [web app name] -Slot [deployment slot name] -AppServicePlan [app service plan name].

The other answers either use cmdlet names that do not exist (New-AzDeploymentSlot, New-AzWebAppDeploymentSlot) or omit the required -ResourceGroupName parameter.

18
Q

Your team develops multiple mobile finance APIs for an online banking service. You need to mitigate potential abuse of a single online product, a business travel expense submission service. You need to set policies within Azure API Management to control the character types within data strings submitted to the backend via all of the product’s APIs. Which stage and scope would you need to set for this API policy in Azure API Management?

Inbound stage and Product scope

Backend stage and Specific API scope

Frontend stage and Individual Operation scope

Inbound stage and Global scope

A

Inbound stage and Product scope

Explanation
This policy belongs at the inbound stage and the product scope, because it modifies or validates request content before it reaches the backend and it applies to all of a product’s APIs.

19
Q

You are managing an Azure Cosmos DB environment that is logging time-series data. Which example of a partition key would be a good choice for this architecture?

A user ID
A process ID
A device ID
A tenant ID

A

A process ID

Explanation
In Cosmos DB, different scenarios require a different type of partition key for optimized performance. If you’re using Cosmos DB for logging time-series data, then the hostname or process ID is a good choice for the partition key. A process ID is a system-generated unique identifier of the process that is referenced by a running instance of an application.

20
Q

You are configuring security settings for your Azure Data Lake, and want to integrate a Data Lake service endpoint within an existing VNet. Which steps should you implement to configure this? (Choose 2 answers)

Configure your Azure Data Lake in the same resource group as your VNet

Configure a Microsoft Azure Active Directory Service endpoint

Deploy the endpoint in your selected VNET

Disable connectivity from Azure services outside of the selected VNET

A

Configure a Microsoft Azure Active Directory Service endpoint
Deploy the endpoint in your selected VNET

Explanation
To use virtual network integration with data lake storage gen1, you must create a virtual network in the same region as your data lake storage account. You need to configure a service endpoint with the Microsoft Azure Active Directory as the service. After creating your virtual network in the same region as your data lake, you need to go to your data lake and click on Firewall and virtual networks. Choose the Selected network radio button and then Add existing virtual network. In the Add networks blade, select your virtual network and the subnet and click Add. Below the firewall section under exceptions, you can enable connectivity from Azure services outside of your selected network.

21
Q

When using the Azure Monitoring service for Web Apps in Azure, which of the below logging facilities is not an available option?

Application Logging (File System)
Application Logging (Table Storage)
Application Logging (Blob Storage)
Application Logging (Queue Storage)

A

Application Logging (Queue Storage)

Explanation
By default, the following Application Diagnostics options are disabled for an App Service web app, but they can be enabled whenever required:
Application Logging (File System): The logs are collected by the file system of the web app.
Application Logging (Table Storage): The logs are collected in the Table storage that is specified under Manage Table Storage.
Application Logging (Blob Storage): The logs are collected in the Blob container that is specified under Manage Blob Storage.

22
Q

Your team is spending too much time recovering from unplanned events, specifically when small resource updates occur that disrupt service operations, or noncompliant resources are created. You want to automate a process to review log data related to resource updates, to detect anomalies within the updates. You would like to utilize live dashboards to evaluate the log data quickly. What type of logs would you analyze, and with what Azure service?

Process activity logs with Azure Event Hub.
Process diagnostic logs with Log Analytics.
Process application logs with tables in Azure Storage.
Process diagnostic logs with Power BI.

A

Process activity logs with Azure Event Hub.

Explanation
Azure offers activity logs to help you track subscription-level operations on resources, such as creating or updating resources. Azure Event Hubs allows you to receive thousands of log events per second and detect anomalies, and it also provides live dashboards.
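As an illustrative sketch (not the only way to consume routed activity logs), the azure-eventhub Python SDK can read the log events from the event hub; the connection string and hub name below are placeholders:

from azure.eventhub import EventHubConsumerClient

def on_event(partition_context, event):
    # Each event carries activity-log records in JSON form.
    print(partition_context.partition_id, event.body_as_str())

client = EventHubConsumerClient.from_connection_string(
    "<event-hub-connection-string>",
    consumer_group="$Default",
    eventhub_name="insights-activity-logs",   # hypothetical hub name
)

with client:
    client.receive(on_event=on_event, starting_position="-1")  # "-1" = read from the beginning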

23
Q

What operation does the following command in AzCopy perform? azcopy copy 'https://mysourceaccount.blob.core.windows.net/mycontainer/myTextFile.txt?sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-07-04T05:30:08Z&st=2019-07-03T21:30:08Z&spr=https&sig=CAfhgnc9gdGktvB=ska7bAiqIddM845yiyFwdMH481QA8%3D' 'https://mydestinationaccount.blob.core.windows.net/mycontainer/myTextFile.txt'

Copies the blob “myTextFile.txt” in one Azure storage container to another container in the same Azure Storage account

Copies the blob “myTextFile.txt” from one Azure storage account to another Azure storage account

Moves the blob “myTextFile.txt” in one Azure storage container to another container in the same Azure Storage account

Copies the blob “myTextFile.txt” from one Azure Storage account to another Azure storage account and deletes the blob from the source Azure Storage account

A

Copies the blob “myTextFile.txt” from one Azure storage account to another Azure storage account

Explanation
The command copies the blob "myTextFile.txt" from one Azure Storage account to another, but it does not delete the blob from the source account. The azcopy copy operation only copies files; it does not move or migrate them. The difference between copying and moving is that copying simply duplicates the blob in the other storage account, while moving would also remove it from the source account and place it in the other account.
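For comparison only, and not AzCopy itself, here is a rough SDK analogue of the same server-side copy using the azure-storage-blob Python SDK; the URL, SAS token, and connection string are placeholders:

from azure.storage.blob import BlobClient

source_url = (
    "https://mysourceaccount.blob.core.windows.net/mycontainer/myTextFile.txt"
    "?<sas-token>"
)

dest = BlobClient.from_connection_string(
    "<destination-connection-string>",
    container_name="mycontainer",
    blob_name="myTextFile.txt",
)

# Starts an asynchronous, server-side copy; the source blob is left untouched.
dest.start_copy_from_url(source_url)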

24
Q

You have successfully containerized your application within an Azure Container Registry, created an image of your application and pushed it into the container registry. You have also created an AKS cluster. Now you want to deploy the containerized application onto your AKS cluster. Which three steps do you need to complete? (Choose 3 answers)

Get credentials to authenticate kubectl commands sent to the Kubernetes cluster.

Create a manifest file declaring the required Kubernetes resources.

Create the resources in the cluster

Create a service principal to allow your cluster to interact with Azure resources

A

Get credentials to authenticate kubectl commands sent to the Kubernetes cluster.
Create a manifest file declaring the required Kubernetes resources.
Create the resources in the cluster.

Explanation
You would need to complete all of the listed steps to deploy your application to an AKS cluster except for creating a service principal. That step must already have been completed for your AKS cluster to be provisioned and ready to host your application. You can also have AKS create a service principal for you using the Azure CLI or the Azure Portal.

25
Q

Your manager has asked for advice on how best to fire off a console app that will run nightly to pick up files that are uploaded to a Web App hosted on App Service and add them to Blob Storage. Cost and management effort are a concern. Given what you know, which service would work best?

WebJobs
Azure Logic Apps

Azure Functions
Azure Automation

A

WebJobs

Explanation
While there are multiple answers that would work, the answer that would be considered the “best” is the use of WebJobs.

WebJobs will have access to the files on the servers without any additional configuration. That will keep management and cost down.

26
Q

You are designing a messaging solution with the following requirements:Your application must store over 80 GB of messages in a queue, and the messages have a lifetime shorter than 7 days.Your application wants to track progress for processing a message inside of the queue.You require server-side logs of all of the transactions executed against your queues.Which of the following services will be included in your design?

Azure Storage Queues
Service Bus Queues
Service Bus Topics
Event Hubs

A

Azure Storage Queues

Explanation
This can be a difficult question, because Storage Queues and Service Bus Queues offer similar options. However, Storage Queues have a maximum queue size of 500 TB (versus 80 GB for Service Bus Queues), allowing large amounts of data to be stored. Keep in mind that this is the maximum queue size, not the message size, which is 64 KB (48 KB when using Base64 encoding).
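A brief sketch of the Storage Queue behaviors the requirements rely on (a message TTL under the 7-day maximum and updating an in-flight message to track progress), using the azure-storage-queue (v12) Python SDK with placeholder names:

from azure.storage.queue import QueueClient

queue = QueueClient.from_connection_string("<connection-string>", "orders")  # queue assumed to exist

# The 7-day maximum TTL noted above; here a 3-day TTL is set explicitly.
queue.send_message("order-42:received", time_to_live=3 * 24 * 3600)

# Track progress of an in-flight message: update its content and extend its
# invisibility window while it is being processed.
for msg in queue.receive_messages(visibility_timeout=30):
    updated = queue.update_message(msg, visibility_timeout=60, content="order-42:step-2")
    queue.delete_message(updated)   # remove once processing completes
    break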

27
Q

You have configured an Azure Stream Analytics job and want to check its progress periodically using the metric graphing feature available in the Azure Portal. You need to monitor the following metrics: Streaming Units (percentage), Late Input Events (count), Early Input Events (count), Input Event Bytes (bytes), Runtime Errors (count), and Out-of-Order Events (count). You prefer to create the minimum number of graphs, for optimal efficiency. How many graphs will you need to create?

5

1

3

2

A

3

Explanation
All of the metrics on the same graph have to be the same unit of measure. There are three units of measure in the collection of metrics listed in this question - count, percentage, and number of bytes. Therefore, the correct answer is three.

28
Q

You have just launched an update for your multi-language translation mobile app, hosted on App Service. You receive multiple complaints that customer submissions of text translations are not being processed, and customers did not receive HTTP 4xx or 5xx error code responses. You want to know which App Service components may have caused the issue. What log type should you enable?

Failed Request Tracing

Web Server Logging

Detailed Error Messaging

Application Logging

A

Failed Request Tracing

Explanation
Failed Request Tracing captures detailed information on failed requests, including a trace of the IIS components used to process the request and the time taken in each component. It’s useful if you want to improve site performance or isolate a specific HTTP error.

29
Q

When using the Azure portal to view Azure App Service logs, what prerequisite is required within your code to ensure that the portal receives your logs?

Log entries
Error codes
Traces
Checkpoints

A

Traces

Explanation
The Azure portal provides an integrated streaming log viewer that lets you view tracing events from your App Service apps in real time. Setting up this feature requires a few simple steps:

Write traces in your code
Enable Application Diagnostic Logs for your app
View the stream from the built-in Streaming Logs UI in the Azure portal.
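The steps above assume a .NET app writing trace statements; as a rough Python analogue (assuming a Python web app with application logging enabled), anything written through the standard logging module shows up in the streaming log viewer:

import logging

logging.basicConfig(level=logging.INFO)

logging.info("Order received")            # appears in the streaming log viewer
logging.warning("Inventory running low")  # log levels are filtered per the app's logging settings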

30
Q

Which tool can copy blobs from one Azure Storage container to another container programmatically, and delete data from the source container once the copy is complete?

AzCopy

Azure Storage Data Movement Library

Azure Migrate

Azure Storage Explorer

A

Azure Storage Data Movement Library

Explanation
It is possible to effectively move blobs between containers programmatically using the Microsoft Azure storage data movement library. This library contains methods that can be added to a C# project that can copy data between containers as well as delete the blobs after the copy process has completed. To learn more about the Microsoft Azure storage data movement library refer to this URL (https://docs.microsoft.com/azure/storage/common/storage-use-data-movement-library).

31
Q

As the network engineer for a large investment firm, you have been asked to set up an Azure Redis Cache with one critical need. The application developers know that all database elements will likely be accessed with the same probability. Because it is important to select the right eviction policy depending on the access pattern of the application, what Maxmemory eviction policy setting should you choose?

noeviction
allkeys-lru
allkeys-random
volatile-random

A

allkeys-random

Explanation
The Azure Redis Maxmemory policy setting on the Azure Portal Advanced settings blade configures the memory policy for the cache. The exact behavior Redis follows when the maxmemory limit is reached is configured using the maxmemory-policy configuration directive. There are several directives available: noeviction, allkeys-lru, volatile-lru, allkeys-random, volatile-random, and volatile-ttl. Redis recommends using the allkeys-random value if you have cyclic access where all the keys are scanned continuously, or when you expect the distribution to be uniform (all elements likely to be accessed with the same probability).

32
Q

Which of the following is a recommendation when designing a solution that uses Azure Notification Hubs?

Use the same notification hub for production and test environments to save on costs.
For multi-tenant, each tenant should use the same hub.
For multi-tenant, each tenant should have a separate hub.
You can use the same notification hub for multiple mobile apps.

A

For multi-tenant, each tenant should have a separate hub.

Explanation
Below are the recommendations when it comes to designing a solution that uses notification hubs:

Never share the same notification hub between production and test environments. This practice might cause problems when sending notifications.
Use one notification hub per mobile app, per environment.

33
Q

You have recently launched a Python application with an Azure Cache for Redis. You want to store a string within your Python app titled Reference that reads “Filename: Critical_Doc; Last update 8/19/2019”. Which script will perform this operation?

result = r.set(“Reference”, “Filename: Critical_Doc; Last update 8/19/2019”)

print = r.insert(“String: Reference”, “Filename: Critical_Doc; Last update 8/19/2019”)

result = r.get(“String”, “Filename: Critical_Doc; Last update 8/19/2019”)

print = r.add(“Reference”, “Filename: Critical_Doc; Last update 8/19/2019”)

A

result = r.set(“Reference”, “Filename: Critical_Doc; Last update 8/19/2019”)

Explanation
The script result = r.set("Reference", "Filename: Critical_Doc; Last update 8/19/2019") will store the string in your Azure Cache for Redis. You can retrieve it using the short script result = r.get("Reference").
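For context, a slightly fuller sketch including the connection setup the answer assumes; the host name and access key are placeholders:

import redis

r = redis.StrictRedis(
    host="mycache.redis.cache.windows.net",   # placeholder cache host
    port=6380,
    password="<access-key>",
    ssl=True,
)

result = r.set("Reference", "Filename: Critical_Doc; Last update 8/19/2019")
print(r.get("Reference"))   # b'Filename: Critical_Doc; Last update 8/19/2019'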

34
Q

You have a microservice application hosted on Azure App Service named Azure Service Environment 1. The application communicates with on-premises database servers and data analysis applications. You need to find an effective monitoring solution to do the following: Monitor performance of Azure Service Environment 1 and the on-premises database servers. Provide alerts when communication between the on-premises database and Azure Service Environment 1 is disrupted. Provide quantitative data regarding customer usage. What Azure services or features within Azure App Service can meet all your requirements?

Azure Application Insights
Azure Monitor
Azure App Service Diagnostic Logs
Azure App Service Metrics

A

Azure Application Insights

Explanation
Application Insights can collect data from applications running in Azure, on-premises, or on other clouds. The integration with Azure Web Apps makes it exceptionally easy to use in Azure.

35
Q

As a feature of Azure Active Directory, Identity Protection offers each of the following capabilities except which one?

Reporting to help remove/reduce security risks

Automated detection of compromised user IDs

Enforce multi-factor authentication policy

Enable “just-in-time” role assignments

A

Enable “just-in-time” role assignments

Explanation
Identity Protection allows you to enforce MFA policy, automate detection of potentially compromised user credentials, and create reports to help you identify and remove or mitigate security risks. However, Identity Protection does not include ‘just-in-time’ role assignments; that is a feature of Privileged Identity Management, a separate service offered through Azure Active Directory.

36
Q

You need to design and implement a function using Azure Functions to initiate order processing for your online website. Online orders from the client are processed via an Azure Storage Queue, and these order details need to be written into a CosmosDB database table. To create the necessary function in Azure Functions to complete this task, which function components will you need to configure? (Choose 2 answers)

An Azure App Service account connection

An Azure Storage Queue trigger

An Azure Cosmos DB output binding

An HTTP input binding

A

An Azure Storage Queue trigger
An Azure Cosmos DB output binding

Explanation
The Azure Queue trigger includes an input binding, so the HTTP input binding is not necessary. The data received from the queue is being written to the Cosmos DB table, so modifying a boilerplate Cosmos DB output binding accomplishes that task. When configuring the function, you need to create connections between Azure Functions and the other services and resources that the function will interact with, so connections must be created for the Azure Storage account that delivers the message via a queue, and for the Cosmos DB account that contains the table.
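As a hedged sketch only, the same trigger and output-binding pairing in the Azure Functions Python (v2) programming model might look roughly like this; binding parameter names vary by extension version, and the queue, database, container, and connection-setting names are placeholders:

import json
import azure.functions as func

app = func.FunctionApp()

@app.queue_trigger(arg_name="msg", queue_name="orders", connection="AzureWebJobsStorage")
@app.cosmos_db_output(arg_name="doc", database_name="store",
                      container_name="orders", connection="CosmosDbConnection")
def process_order(msg: func.QueueMessage, doc: func.Out[func.Document]) -> None:
    order = json.loads(msg.get_body().decode())
    doc.set(func.Document.from_dict(order))   # written to Cosmos DB via the output binding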

37
Q

You are configuring how your application responds to transient failures. Which set of statements regarding transient failure is correct?

An exponential backoff retry strategy is ideal for background operations. A regular interval retry strategy is ideal for customer-facing operations. Store retry strategy files and code centrally.

An immediate retry strategy is ideal for background operations. Use a finite number of retries instead of an endless number of retries. Fewer retries should be attempted for critical processes.

Implement duplicated layers of retries in your code. Minimize randomization in your retry strategy. Use a continuous number of retries instead of a finite number.

Log transient failures as errors. Hard code retry strategy into each layer of your application. Use a passive approach to retrying critical processes.

A

An exponential backoff retry strategy is ideal for background operations. A regular interval retry strategy is ideal for customer-facing operations. Store retry strategy files and code centrally.

Explanation
The correct choice is an exponential backoff strategy for background operations, a regular interval strategy for customer-facing operations, and storing retry strategy files and code centrally.
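A minimal exponential-backoff-with-jitter sketch for a background operation; call_service, TransientError, and the retry limits are illustrative:

import random
import time

class TransientError(Exception):
    """Illustrative stand-in for a transient failure (timeout, throttling, etc.)."""

def call_with_retries(call_service, max_attempts=5, base_delay=1.0, max_delay=30.0):
    for attempt in range(max_attempts):
        try:
            return call_service()
        except TransientError:
            if attempt == max_attempts - 1:
                raise                                   # finite retries: give up eventually
            delay = min(max_delay, base_delay * (2 ** attempt))
            time.sleep(delay + random.uniform(0, delay / 2))  # jitter avoids synchronized retries

A customer-facing path would instead retry a small, fixed number of times at a short, regular interval.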

38
Q

Which feature within SQL Database would allow a user to group multiple databases with variable usage demands together while limiting the cost to the customer?

Shards

Elastic Clusters

Containers

Elastic Pools

A

Elastic Pools

Explanation
SQL Database elastic pools are a simple, cost-effective solution for managing and scaling multiple databases that have varying and unpredictable usage demands. The databases in an elastic pool are on a single Azure SQL Database server and share a set number of resources (elastic Database Transaction Units (eDTUs)) at a set price. Elastic pools in Azure SQL Database enable SaaS developers to optimize the price performance for a group of databases within a prescribed budget while delivering performance elasticity for each database.

39
Q

You have an Azure App Service plan hosting three Azure Web Apps, named Azure Web App 1, Azure Web App 2, and Azure Web App 3. Web App 1 is suddenly experiencing a complete outage that is affecting multiple deployment slots. You would like to stop the entire application. What effect can executing a stop command for the Web App 1 application have in Azure App Service? (Choose 2 answers)

It can stop the VMs hosting the application.
It can stop all Web App 1 deployment slots.
It can stop Web App 1 entirely.
It can stop all Web Apps running on your App Service Plan.

A

It can stop all Web App 1 deployment slots.
It can stop Web App 1 entirely.

Explanation
There are commands to Stop and Restart the application. The underlying virtual machines are not stopped or restarted, so these commands do not impact other apps in the same App Service plan.

40
Q

Your company includes over 300 office employees and several employees who work remotely. All employees are registered within the company’s Azure Active Directory tenant. You want to enable MFA for all employees, but allow them to skip MFA when logging in under normal circumstances. Normal circumstances are as follows: Office employees log in through Azure Active Directory over the company intranet. Remote employees log in through point-to-site VPNs from devices registered with Azure AD Join. You completed the following configurations: With ADFS configured, Azure AD will recognize all users logging in from the office as federated users allowed to skip MFA. A conditional access policy allows remote users logging in from Azure AD joined devices to skip MFA. Will this allow on-premise and remote employees to bypass MFA under normal circumstances?

Yes, it will allow on-premise and remote employees to bypass MFA.
No, remote employees will still need to complete MFA.
No, office employees will still need to complete MFA.
No, both office and remote employees will still need to complete MFA.

A

Yes, it will allow on-premise and remote employees to bypass MFA.

Explanation
Federated Trusted IPs would allow office workers to bypass MFA, and the conditional access policy will allow remote workers to log in via devices joined to Azure AD.

41
Q

The following is a subsection of an ARM template to deploy a Windows VM. In order to create the network interface, you need a public IP address and a virtual network. Which of the answers below belongs in the dependsOn array to accomplish that objective?

...
{
  "apiVersion": "2016-03-30",
  "type": "Microsoft.Network/networkInterfaces",
  "name": "[variables('nicName')]",
  "location": "[resourceGroup().location]",
  "dependsOn": [
    ____FILL_IN_THE_BLANK____
    "[resourceId('Microsoft.Network/virtualNetworks/', variables('virtualNetworkName'))]"
  ],
...

"[resourceId('Microsoft.Network/publicIPAddresses/', variables('publicIPAddressName'))]",
"[resourceId('Microsoft.Network/networkInterfaces/', variables('nicName'))]"
"[reference(variables('publicIPAddressName')).dnsSettings.fqdn]"
"[resourceId('Microsoft.Storage/storageAccounts/', variables('storageAccountName'))]",

A

"[resourceId('Microsoft.Network/publicIPAddresses/', variables('publicIPAddressName'))]",

Explanation
The dependsOn property of a resource will allow you to delay the creation of a resource until another exists.

42
Q

Below is a small section from an ARM template which creates a virtual machine in Azure. This section is used to assign a public IP address to the network NIC associated with the Azure virtual machine.

"type": "Microsoft.Network/publicIPAddresses",
"name": "Demonw",
"properties": { }

Which of the choices below completes the properties section to assign a dynamic IP address to the instance?

"publicIPAllocationMethod": "Dynamic"
"IPAllocationMethod": "Dynamic"
"StaticIPAllocationMethod": "Dynamic"
"DynamicIPAllocationMethod": "Dynamic"

A

"publicIPAllocationMethod": "Dynamic"

Explanation
The publicIPAllocationMethod property can be used to assign either a dynamic or a static IP address to a virtual machine via the ARM template. If you set the property to "publicIPAllocationMethod": "Dynamic", you will get a dynamically allocated IP address. If you set it to "publicIPAllocationMethod": "Static", you will get a statically allocated IP address.

43
Q

You want to assign a role-based access control (RBAC) role to a user in the Azure Portal. Consider the following steps listed below:
1. Select the user
2. Open Access Control (IAM) and select ‘Add Role Assignment’
3. Open Azure Resource Manager and select ‘Add Role Assignment’
4. Provide Reason for Assignment
5. Select the role
6. Save
7. Select Eligible or Permanent

Assuming you have the necessary permissions, which answer lists the necessary steps to assign an RBAC role to a user in the correct order?

2 - 1 - 5 - 6
3 - 1 - 5 - 6
2 - 1 - 5 - 7 - 6
2 - 1 - 5 - 7 - 4 - 6

A

2 - 1 - 5 - 6

Explanation
In Access control (IAM), you can Add permissions to the resources. To assign a role to a user, you simply select the desired Role, Assign access to an Azure AD user, group, or application, Select the user from the list, and click Save.

44
Q

When configuring Azure API Management, what are two specific benefits of authorization code OAuth grants? (Choose 2 answers)

Applications do not receive client credentials.

Access tokens are not transferred through the client’s browser.

The client’s access token is returned immediately.

Clients can obtain access tokens from browserless devices.

A

Applications do not receive client credentials.
Access tokens are not transferred through the client’s browser.

Explanation
The method of transmitting requests between a browser, client, authorization server, and resource server allows the client to submit credentials to a trusted third party rather than the specific application handling the authorization request. It also avoids transmitting the access token through the client browser, which limits the potential risk of the client’s token being stolen/copied by a malicious user.

45
Q

You need to design an IoT solution that will allow you to collect data from thousands of IoT devices, at different times. You need to be able to hold the messages from the different devices for up to 5 days. Which of the following is the best option?

Azure Event Hubs (Basic)
Azure Event Hubs (Standard)
Azure Storage Queues
Service Bus Relay

A

Azure Event Hubs (Standard)

Explanation
The Standard tier retains messages for 1 day by default; however, you can keep a message for up to 7 days for a fee.