AZ-204 questions Flashcards

1
Q

You have two Hyper-V hosts named Host1 and Host2. Host1 has an Azure virtual machine
named VM1 that was deployed by using a custom Azure Resource Manager template.
You need to move VM1 to Host2.
What should you do?
A. From the Update management blade, click Enable.
B. From the Overview blade, move VM1 to a different subscription.
C. From the Redeploy blade, click Redeploy.
D. From the Profile blade, modify the usage location.

A

Answer: C
Explanation:
When you redeploy a VM, it moves the VM to a new node within the Azure infrastructure and
then powers it back on, retaining all your configuration options and associated resources.

2
Q

Your company has an Azure Kubernetes Service (AKS) cluster that you manage from an Azure
AD-joined device. The cluster is located in a resource group.
Developers have created an application named MyApp. MyApp was packaged into a container image.
You need to deploy the YAML manifest file for the application.
Solution: You install the Azure CLI on the device and run the
kubectl apply -f myapp.yaml command.
Does this meet the goal?
A. Yes
B. No

A

Answer: A
Explanation:
kubectl apply -f myapp.yaml applies a configuration change to a resource from a file or stdin.

3
Q

Your company has an Azure Kubernetes Service (AKS) cluster that you manage from an Azure
AD-joined device. The cluster is located in a resource group.
Developers have created an application named MyApp. MyApp was packaged into a container
image.
You need to deploy the YAML manifest file for the application.
Solution: You install the docker client on the device and run the docker run -it
microsoft/azure-cli:0.10.17 command.
Does this meet the goal?
A. Yes
B. No

A

Answer: B
Explanation:
Running the Azure CLI in a Docker container does not deploy the YAML manifest; you still need kubectl (with credentials for the AKS cluster) to run kubectl apply and deploy MyApp.

4
Q

Your company has a web app named WebApp1.
You use the WebJobs SDK to design a triggered App Service background task that automatically
invokes a function in the code every time new data is received in a queue.
You are preparing to configure the service that processes a queue data item.
Which of the following is the service you should use?
A. Logic Apps
B. WebJobs
C. Flow
D. Functions

A

Answer: B
Usually you’ll host the WebJobs SDK in Azure WebJobs, but you can also run your jobs in a Worker Role. The Azure WebJobs feature of Azure Web Apps provides an easy way for you to run programs such as services or background tasks in a Web App…

5
Q

Your company has an Azure subscription.
You need to deploy a number of Azure virtual machines to the subscription by using Azure
Resource Manager (ARM) templates. The virtual machines will be included in a single
availability set.
You need to ensure that the ARM template allows for as many virtual machines as possible to
remain accessible in the event of fabric failure or maintenance.
Which of the following is the value that you should configure for the
platformFaultDomainCount property?
A. 10
B. 30
C. Min Value
D. Max Value

A

Answer: D
The platform fault domain count is limited to 2 or 3 depending on the region, so you should configure the maximum value supported.

6
Q

Your company has an Azure subscription.
You need to deploy a number of Azure virtual machines to the subscription by using Azure
Resource Manager (ARM) templates. The virtual machines will be included in a single
availability set.
You need to ensure that the ARM template allows for as many virtual machines as possible to
remain accessible in the event of fabric failure or maintenance.
Which of the following is the value that you should configure for the
platformUpdateDomainCount property?
A. 10
B. 20
C. 30
D. 40

A

Answer: B
Each availability set can be configured with up to three fault domains and twenty update domains.

7
Q

This question requires that you evaluate the underlined text to determine if it is correct.
Your company has an on-premises deployment of MongoDB, and an Azure Cosmos DB account
that makes use of the MongoDB API.
You need to devise a strategy to migrate MongoDB to the Azure Cosmos DB account.
You include the Data Management Gateway tool in your migration strategy.
Instructions: Review the underlined text. If it makes the statement correct, select “No change required.”
If the statement is incorrect, select the answer choice that makes the statement correct.
A. No change required
B. mongorestore
C. Azure Storage Explorer
D. AzCopy

A

Answer: B
Explanation:
The Data Management Gateway is used by Azure Data Factory to access on-premises data sources; it is not a MongoDB migration tool. The native mongorestore utility (together with mongodump) can be used to migrate the on-premises MongoDB data to the Azure Cosmos DB API for MongoDB.

8
Q

You are developing an e-Commerce Web App.
You want to use Azure Key Vault to ensure that sign-ins to the e-Commerce Web App are secured by using Azure App Service authentication and Azure Active Directory (AAD).
What should you do on the e-Commerce Web App?
A. Run the az keyvault secret command.
B. Enable Azure AD Connect.
C. Enable Managed Service Identity (MSI).
D. Create an Azure AD service principal.

A

Answer: C
Explanation:
A managed identity from Azure Active Directory allows your app to easily access other AAD-protected resources such as Azure Key Vault.
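Below is a minimal sketch, not part of the original question, of how a web app with a managed identity can read a secret from Key Vault. It assumes the azure-identity and azure-keyvault-secrets Python packages; the vault URL and secret name are hypothetical.

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# DefaultAzureCredential picks up the app's managed identity when running in Azure.
credential = DefaultAzureCredential()
client = SecretClient(vault_url="https://contoso-vault.vault.azure.net", credential=credential)

# No client secret or connection string is stored in the app's configuration.
secret = client.get_secret("SignInSetting")
print(secret.name)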

9
Q

This question requires that you evaluate the underlined text to determine if it is correct.
Your Azure Active Directory (Azure AD) tenant has an Azure subscription linked to it.
Your developer has created a mobile application that obtains Azure AD access tokens using the OAuth 2 implicit grant type.
The mobile application must be registered in Azure AD.
You require a redirect URI from the developer for registration purposes.
Instructions: Review the underlined text. If it makes the statement correct, select “No change is needed.”
If the statement is incorrect, select the answer choice that makes the statement correct.
A. No change required.
B. a secret
C. a login hint
D. a client ID

A

Answer: A
Explanation:
For Native Applications you need to provide a Redirect URI, which Azure AD will use to return token responses.

10
Q

You are creating an Azure key vault using PowerShell. Objects deleted from the key vault must be kept for a set period of 90 days.
Which two of the following parameters must be used in conjunction to meet the requirement?
(Choose two.)
A. EnabledForDeployment
B. EnablePurgeProtection
C. EnabledForTemplateDeployment
D. EnableSoftDelete

A

Answer: BD
You need to enable soft delete first, and then purge protection, to ensure that soft-deleted objects cannot be purged before the retention period expires.

11
Q

You manage an Azure SQL database that allows for Azure AD authentication.
You need to make sure that database developers can connect to the SQL database via Microsoft SQL Server Management Studio (SSMS). You also need to make sure the developers use their on-premises Active Directory account for authentication.
Your strategy should allow for authentication prompts to be kept to a minimum.
Which of the following should you implement?
A. Azure AD token.
B. Azure Multi-Factor authentication.
C. Active Directory integrated authentication.
D. OATH software tokens.

A

Answer: C
Explanation:
Azure AD can be the initial Azure AD managed domain, or it can be an on-premises Active Directory Domain Services environment that is federated with Azure AD. With Active Directory integrated authentication, developers who are signed in with their federated on-premises accounts can connect to the database from SSMS without being prompted again for credentials.

12
Q

You are developing an application to transfer data between on-premises file servers and Azure
Blob storage. The application stores keys, secrets, and certificates in Azure Key Vault and makes
use of the Azure Key Vault APIs.
You want to configure the application to allow recovery of an accidental deletion of the key
vault or key vault objects for 90 days after deletion.
What should you do?
A. Run the Add-AzKeyVaultKey cmdlet.
B. Run the az keyvault update --enable-soft-delete true --enable-purge-protection true CLI.
C. Implement virtual network service endpoints for Azure Key Vault.
D. Run the az keyvault update --enable-soft-delete false CLI.

A

Answer: B
Explanation:
When soft-delete is enabled, resources marked as deleted resources are retained for a specified
period (90 days by default). The service further provides a mechanism for recovering the deleted
object, essentially undoing the deletion.
Purge protection is an optional Key Vault behavior and is not enabled by default. Purge
protection can only be enabled once soft-delete is enabled.
When purge protection is on, a vault or an object in the deleted state cannot be purged until the
retention period has passed. Soft-deleted vaults and objects can still be recovered, ensuring that
the retention policy will be followed.
The default retention period is 90 days, but it is possible to set the retention policy interval to a
value from 7 to 90 days through the Azure portal. Once the retention policy interval is set and
saved it cannot be changed for that vault.
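As an illustration only (assuming the azure-identity and azure-keyvault-secrets Python packages, plus a hypothetical vault and secret name), a soft-deleted object can be recovered within the retention period like this:

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

client = SecretClient(vault_url="https://contoso-vault.vault.azure.net",
                      credential=DefaultAzureCredential())

# Works only while soft delete still retains the deleted secret (7-90 days).
poller = client.begin_recover_deleted_secret("storage-connection-string")
poller.wait()  # the secret is usable again once recovery completes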

13
Q

You are configuring a web app that delivers streaming video to users. The application makes use
of continuous integration and deployment.
You need to ensure that the application is highly available and that the users’ streaming
experience is constant. You also want to configure the application to store data in a geographic
location that is nearest to the user.
Solution: You include the use of Azure Redis Cache in your design.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B

14
Q

You are configuring a web app that delivers streaming video to users. The application makes use
of continuous integration and deployment.
You need to ensure that the application is highly available and that the users’ streaming
experience is constant. You also want to configure the application to store data in a geographic
location that is nearest to the user.
Solution: You include the use of an Azure Content Delivery Network (CDN) in your design.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: A

15
Q

You are configuring a web app that delivers streaming video to users. The application makes use
of continuous integration and deployment.
You need to ensure that the application is highly available and that the users’ streaming
experience is constant. You also want to configure the application to store data in a geographic
location that is nearest to the user.
Solution: You include the use of a Storage Area Network (SAN) in your design.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B

16
Q

You develop a Web App on a tier D1 app service plan.
You notice that page load times increase during periods of peak traffic.
You want to implement automatic scaling when CPU load is above 80 percent. Your solution
must minimize costs.
What should you do first?
A. Enable autoscaling on the Web App.
B. Switch to the Premium App Service tier plan.
C. Switch to the Standard App Service tier plan.
D. Switch to the Azure App Services consumption plan.

A

Answer: C
Explanation:
Configure the web app to use the Standard App Service tier, which is the lowest-cost tier that supports autoscaling. You can then enable autoscaling on the web app, add a scale rule, and add a scale condition.

17
Q

Your company’s Azure subscription includes an Azure Log Analytics workspace.
Your company has a hundred on-premises servers that run either Windows Server 2012 R2 or
Windows Server 2016 and are linked to the Azure Log Analytics workspace. The Azure Log
Analytics workspace is set up to gather performance counters associated with security from these
linked servers.
You must configure alerts based on the information gathered by the Azure Log Analytics
workspace.
You have to make sure that alert rules allow for dimensions, and that alert creation time should
be kept to a minimum. Furthermore, a single alert notification must be created when the alert is
created and when the alert is resolved.
You need to make use of the necessary signal type when creating the alert rules.
Which of the following is the option you should use?
A. The Activity log signal type.
B. The Application Log signal type.
C. The Metric signal type.
D. The Audit Log signal type.

A

Answer: C
Explanation:
Metric alerts in Azure Monitor provide a way to get notified when one of your metrics crosses a
threshold. Metric alerts work on a range of multi-dimensional platform metrics, custom metrics,
and Application Insights standard and custom metrics.
Note: Signals are emitted by the target resource and can be of several types: Metric, Activity log,
Application Insights, and Log.

18
Q

You are developing a .NET Core MVC application that allows customers to research
independent holiday accommodation providers.
You want to implement Azure Search to allow the application to search the index by using
various criteria to locate documents related to accommodation.
You want the application to allow customers to search the index by using regular expressions.
What should you do?
A. Configure the SearchMode property of the SearchParameters class.
B. Configure the QueryType property of the SearchParameters class.
C. Configure the Facets property of the SearchParameters class.
D. Configure the Filter property of the SearchParameters class.

A

Answer: B
Explanation:
The SearchParameters.QueryType Property gets or sets a value that specifies the syntax of the
search query. The default is ‘simple’. Use ‘full’ if your query uses the Lucene query syntax.
You can write queries against Azure Search based on the rich Lucene Query Parser syntax for
specialized query forms: wildcard, fuzzy search, proximity search, regular expressions are a few
examples.
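The question refers to the .NET SearchParameters class; as a hedged sketch of the same idea with the azure-search-documents Python SDK (the index name, field, and key are assumptions), setting the query type to full enables the Lucene syntax, including regular expressions:

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

client = SearchClient(endpoint="https://contoso-search.search.windows.net",
                      index_name="accommodations",
                      credential=AzureKeyCredential("<query-key>"))

# query_type="full" selects the Lucene query syntax, which supports /regex/ queries.
results = client.search(search_text="/sea.*view/", query_type="full")
for doc in results:
    print(doc)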

19
Q

You are a developer at your company.
You need to update the definitions for an existing Logic App.
What should you use?
A. the Enterprise Integration Pack (EIP)
B. the Logic App Code View
C. the API Connections
D. the Logic Apps Designer

A

Answer: B
Explanation:
Edit JSON - Azure portal
Sign in to the Azure portal.
From the left menu, choose All services. In the search box, find “logic apps”, and then from the
results, select your logic app.
On your logic app’s menu, under Development Tools, select Logic App Code View.
The Code View editor opens and shows your logic app definition in JSON format.

20
Q

You are developing a solution for a public facing API.
The API back end is hosted in an Azure App Service instance. You have implemented a
RESTful service for the API back end.
You must configure back-end authentication for the API Management service instance.
Solution: You configure Basic gateway credentials for the Azure resource.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Explanation:
API Management lets you secure access to the back-end service of an API by using gateway credentials such as Basic authentication or client certificates, but these credentials are configured for the HTTP(s) endpoint, not for the Azure resource.

21
Q

You are developing a solution for a public facing API.
The API back end is hosted in an Azure App Service instance. You have implemented a
RESTful service for the API back end.
You must configure back-end authentication for the API Management service instance.
Solution: You configure Client cert gateway credentials for the HTTP(s) endpoint.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: A
This is one of a set of scenario questions. If the back end accepts HTTP(S), then either Basic authentication or a client certificate configured as gateway credentials will work.
Client certificate + HTTP(s) endpoint = Yes.

22
Q

You are developing a solution for a public facing API.
The API back end is hosted in an Azure App Service instance. You have implemented a
RESTful service for the API back end.
You must configure back-end authentication for the API Management service instance.
Solution: You configure Basic gateway credentials for the HTTP(s) endpoint.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: A
This is one of a set of scenario questions. If the back end accepts HTTP(S), then either Basic authentication or a client certificate configured as gateway credentials will work.
Basic + HTTP(s) endpoint = Yes.

23
Q

You are developing a solution for a public facing API.
The API back end is hosted in an Azure App Service instance. You have implemented a
RESTful service for the API back end.
You must configure back-end authentication for the API Management service instance.
Solution: You configure Client cert gateway credentials for the Azure resource.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
This is one of a set of scenario questions. If the back end accepts HTTP(S), then either Basic authentication or a client certificate configured as gateway credentials will work.
Client certificate + Azure resource = No.

24
Q

You are developing a .NET Core MVC application that allows customers to research
independent holiday accommodation providers.
You want to implement Azure Search to allow the application to search the index by using
various criteria to locate documents related to accommodation venues.
You want the application to list holiday accommodation venues that fall within a specific price
range and are within a specified distance to an airport.
What should you do?
A. Configure the SearchMode property of the SearchParameters class.
B. Configure the QueryType property of the SearchParameters class.
C. Configure the Facets property of the SearchParameters class.
D. Configure the Filter property of the SearchParameters class.

A

Answer: D
Explanation:
The Filter property gets or sets the OData $filter expression to apply to the search query.
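The question refers to the .NET SearchParameters class; the sketch below shows the same idea with the azure-search-documents Python SDK (an assumption, as are the index and field names): an OData $filter combining a price range with a geo.distance limit.

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

client = SearchClient(endpoint="https://contoso-search.search.windows.net",
                      index_name="venues",
                      credential=AzureKeyCredential("<query-key>"))

# Venues priced 100-200 and within 10 km of a (hypothetical) airport location.
results = client.search(
    search_text="*",
    filter="pricePerNight ge 100 and pricePerNight le 200 "
           "and geo.distance(location, geography'POINT(-122.131577 47.678581)') le 10")
for doc in results:
    print(doc)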

25
Q

You are a developer at your company.
You need to edit the workflows for an existing Logic App.
What should you use?
A. the Enterprise Integration Pack (EIP)
B. the Logic App Code View
C. the API Connections
D. the Logic Apps Designer

A

Answer: D
Use the Logic App Code View to edit definitions; use the Logic Apps Designer to edit workflows.

26
Q

You are developing an application that applies a set of governance policies for internal and
external services, as well as for applications.
You develop a stateful ASP.NET Core 2.1 web application named PolicyApp and deploy it to an
Azure App Service Web App. The PolicyApp reacts to events from Azure Event Grid and
performs policy actions based on those events.
You have the following requirements:
Authentication events must be used to monitor users when they sign in and sign out.
All authentication events must be processed by PolicyApp.
Sign outs must be processed as fast as possible.
What should you do?
A. Create a new Azure Event Grid subscription for all authentication events. Use the
subscription to process sign-out events.
B. Create a separate Azure Event Grid handler for sign-in and sign-out events.
C. Create separate Azure Event Grid topics and subscriptions for sign-in and sign-out events.
D. Add a subject prefix to sign-out events. Create an Azure Event Grid subscription. Configure
the subscription to use the subjectBeginsWith filter.

A

Answer: C
Only option C mentions both topics and subscriptions, which are the two critical parts of Event Grid.

27
Q

You develop a software as a service (SaaS) offering to manage photographs. Users upload
photos to a web service which then stores the photos in Azure Storage Blob storage. The storage
account type is General-purpose V2.
When photos are uploaded, they must be processed to produce and save a mobile-friendly
version of the image. The process to produce a mobile-friendly version of the image must start in
less than one minute.
You need to design the process that starts the photo processing.
Solution: Trigger the photo processing from Blob storage events.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
The answer (B) is correct; the trick is in the "less than one minute" detail. The possible "10-minute delay in processing new blobs" is described under the "Minimizing latency" scenario below.
Microsoft says: "Use Event Grid instead of the Blob storage trigger for the following scenarios:"
1. Blob-only storage accounts: Blob-only storage accounts are supported for blob input and output bindings but not for blob triggers.
2. High scale: High scale can be loosely defined as containers that have more than 100,000 blobs in them or storage accounts that have more than 100 blob updates per second.
3. Minimizing latency: If your function app is on the Consumption plan, there can be up to a 10-minute delay in processing new blobs if a function app has gone idle. To avoid this latency, you can switch to an App Service plan with Always On enabled. You can also use an Event Grid trigger with your Blob storage account. For an example, see the Event Grid tutorial.

28
Q

You develop and deploy an Azure App Service API app to a Windows-hosted deployment slot
named Development. You create additional deployment slots named Testing and Production.
You enable auto swap on the Production deployment slot.
You need to ensure that scripts run and resources are available before a swap operation occurs.
Solution: Update the web.config file to include the applicationInitialization configuration
element. Specify custom initialization actions to run the scripts.
Does the solution meet the goal?
A. No
B. Yes

A

Answer: B
The correct answer is B (Yes): the applicationInitialization element is how you implement custom warm-up actions that run before the swap completes.

29
Q

You develop and deploy an Azure App Service API app to a Windows-hosted deployment slot
named Development. You create additional deployment slots named Testing and Production.
You enable auto swap on the Production deployment slot.
You need to ensure that scripts run and resources are available before a swap operation occurs.
Solution: Enable auto swap for the Testing slot. Deploy the app to the Testing slot.
Does the solution meet the goal?
A. No
B. Yes

A

Answer: A
The answer is A (No). Enabling auto swap on the Testing slot does not run the scripts; the solution is to update the web.config file to include the applicationInitialization configuration element.

30
Q

You develop and deploy an Azure App Service API app to a Windows-hosted deployment slot
named Development. You create additional deployment slots named Testing and Production.
You enable auto swap on the Production deployment slot.
You need to ensure that scripts run and resources are available before a swap operation occurs.
Solution: Disable auto swap. Update the app with a method named statuscheck to run the scripts.
Re-enable auto swap and deploy the app to the Production slot.
Does the solution meet the goal?
A. No
B. Yes

A

Answer: A
Instead, use the applicationInitialization configuration element in web.config.

31
Q

You develop a software as a service (SaaS) offering to manage photographs. Users upload
photos to a web service which then stores the photos in Azure Storage Blob storage. The storage
account type is General-purpose V2.
When photos are uploaded, they must be processed to produce and save a mobile-friendly
version of the image. The process to produce a mobile-friendly version of the image must start in
less than one minute.
You need to design the process that starts the photo processing.
Solution: Convert the Azure Storage account to a BlockBlobStorage storage account.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Explanation:
It is not necessary to convert the account; instead, move the photo processing to an Azure Function triggered from the blob upload.

32
Q

You develop a website. You plan to host the website in Azure. You expect the website to
experience high traffic volumes after it is published.
You must ensure that the website remains available and responsive while minimizing cost.
You need to deploy the website.
What should you do?
A. Deploy the website to a virtual machine. Configure the virtual machine to automatically scale
when the CPU load is high.
B. Deploy the website to an App Service that uses the Shared service tier. Configure the App
Service plan to automatically scale when the CPU load is high.
C. Deploy the website to a virtual machine. Configure a Scale Set to increase the virtual machine
instance count when the CPU load is high.
D. Deploy the website to an App Service that uses the Standard service tier. Configure the App
Service plan to automatically scale when the CPU load is high.

A

Answer: D
Explanation:
Windows Azure Web Sites (WAWS) offers 3 modes: Standard, Free, and Shared.
Standard mode carries an enterprise-grade SLA (Service Level Agreement) of 99.9% monthly,
even for sites with just one instance.
Standard mode runs on dedicated instances, making it different from the other ways to buy
Windows Azure Web Sites.
Incorrect Answers:
B: Shared and Free modes do not offer the scaling flexibility of Standard, and they have some
important limits.
Shared mode, just as the name states, uses shared compute resources and also has a CPU limit,
so neither Free nor Shared is likely to be the best choice for a production environment.

33
Q

You develop an HTTP triggered Azure Function app to process Azure Storage blob data. The
app is triggered using an output binding on the blob.
The app continues to time out after four minutes. The app must process the blob data.
You need to ensure the app does not time out and processes the blob data.
Solution: Use the Durable Function async pattern to process the blob data.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: A
“230 seconds is the maximum amount of time[…] For longer processing times, consider using the DURABLE FUNCTIONS ASYNC PATTERN[…]”

34
Q

You develop an HTTP triggered Azure Function app to process Azure Storage blob data. The
app is triggered using an output binding on the blob.
The app continues to time out after four minutes. The app must process the blob data.
You need to ensure the app does not time out and processes the blob data.
Solution: Pass the HTTP trigger payload into an Azure Service Bus queue to be processed by a
queue trigger function and return an immediate HTTP success response.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: A
Yes, the solution meets the goal. By passing the HTTP trigger payload into an Azure Service Bus queue to be processed by a queue trigger function and returning an immediate HTTP success response, you can address the timeout issue and ensure that the blob data is processed without timing out.

35
Q

You develop an HTTP triggered Azure Function app to process Azure Storage blob data. The
app is triggered using an output binding on the blob.
The app continues to time out after four minutes. The app must process the blob data.
You need to ensure the app does not time out and processes the blob data.
Solution: Configure the app to use an App Service hosting plan and enable the Always On
setting.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Explanation:
Instead pass the HTTP trigger payload into an Azure Service Bus queue to be processed by a
queue trigger function and return an immediate HTTP success response.
Note: Large, long-running functions can cause unexpected timeout issues. General best practices
include:
Whenever possible, refactor large functions into smaller function sets that work together and
return responses fast. For example, a webhook or HTTP trigger function might require an
acknowledgment response within a certain time limit; it’s common for webhooks to require an
immediate response. You can pass the HTTP trigger payload into a queue to be processed by a
queue trigger function. This approach lets you defer the actual work and return an immediate
response.
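A rough sketch of the pattern described above, assuming the azure-functions and azure-servicebus Python packages, a queue named blob-work, a connection string in app settings, and the usual function.json binding configuration (omitted): the HTTP trigger only enqueues the payload and returns immediately, while a separate queue-triggered function does the long-running blob processing.

import os
import azure.functions as func
from azure.servicebus import ServiceBusClient, ServiceBusMessage

def main(req: func.HttpRequest) -> func.HttpResponse:
    payload = req.get_body()
    # Hand the work off to a Service Bus queue instead of processing it here.
    with ServiceBusClient.from_connection_string(os.environ["SERVICEBUS_CONNECTION"]) as sb:
        with sb.get_queue_sender(queue_name="blob-work") as sender:
            sender.send_messages(ServiceBusMessage(payload))
    # Return an immediate success response; a queue trigger function processes the data later.
    return func.HttpResponse(status_code=202)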

36
Q

You develop a software as a service (SaaS) offering to manage photographs. Users upload
photos to a web service which then stores the photos in Azure Storage Blob storage. The storage
account type is General-purpose V2.
When photos are uploaded, they must be processed to produce and save a mobile-friendly
version of the image. The process to produce a mobile-friendly version of the image must start in
less than one minute.
You need to design the process that starts the photo processing.
Solution: Move photo processing to an Azure Function triggered from the blob upload.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Blob storage events do not guarantee an SLA; you cannot guarantee that the event will arrive in less than a minute.

37
Q

You are developing an application that uses Azure Blob storage.
The application must read the transaction logs of all the changes that occur to the blobs and the
blob metadata in the storage account for auditing purposes. The changes must be in the order in
which they occurred, include only create, update, delete, and copy operations and be retained for
compliance reasons.
You need to process the transaction logs asynchronously.
What should you do?
A. Process all Azure Blob storage events by using Azure Event Grid with a subscriber Azure
Function app.
B. Enable the change feed on the storage account and process all changes for available events.
C. Process all Azure Storage Analytics logs for successful blob events.
D. Use the Azure Monitor HTTP Data Collector API and scan the request body for successful
blob events.

A

Answer: B
Explanation:
Change feed support in Azure Blob Storage
The purpose of the change feed is to provide transaction logs of all the changes that occur to the
blobs and the blob metadata in your storage account. The change feed provides ordered,
guaranteed, durable, immutable, read-only log of these changes. Client applications can read
these logs at any time, either in streaming or in batch mode. The change feed enables you to
build efficient and scalable solutions that process change events that occur in your Blob Storage
account at a low cost.
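A hedged sketch of reading the change feed in batch mode, assuming the azure-storage-blob-changefeed Python package, a placeholder connection string, and the documented event schema fields:

from azure.storage.blob.changefeed import ChangeFeedClient

cf_client = ChangeFeedClient.from_connection_string("<storage-connection-string>")

# Events are ordered, durable, and read-only; iterate them in batch for auditing.
for event in cf_client.list_changes():
    print(event["eventType"], event["subject"])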

38
Q

You are developing an Azure Function App that processes images that are uploaded to an Azure
Blob container.
Images must be processed as quickly as possible after they are uploaded, and the solution must
minimize latency. You create code to process images when the Function App is triggered.
You need to configure the Function App.
What should you do?
A. Use an App Service plan. Configure the Function App to use an Azure Blob Storage input
trigger.
B. Use a Consumption plan. Configure the Function App to use an Azure Blob Storage trigger.
C. Use a Consumption plan. Configure the Function App to use a Timer trigger.
D. Use an App Service plan. Configure the Function App to use an Azure Blob Storage trigger.
E. Use a Consumption plan. Configure the Function App to use an Azure Blob Storage input
trigger.

A

Answer: D
The answer is D: use an App Service plan and configure the Function App to use an Azure Blob Storage trigger.
The Consumption plan can cause up to a 10-minute delay in processing new blobs if a function app has gone idle. To avoid this latency, you can switch to an App Service plan with Always On enabled.

39
Q

You are preparing to deploy a website to an Azure Web App from a GitHub repository. The
website includes static content generated by a script.
You plan to use the Azure Web App continuous deployment feature.
You need to run the static generation script before the website starts serving traffic.
What are two possible ways to achieve this goal? Each correct answer presents a complete
solution.
NOTE: Each correct selection is worth one point.
A. Add the path to the static content generation tool to WEBSITE_RUN_FROM_PACKAGE
setting in the host.json file.
B. Add a PreBuild target in the websites csproj project file that runs the static content generation
script.
C. Create a file named run.cmd in the folder /run that calls a script which generates the static
content and deploys the website.
D. Create a file named .deployment in the root of the repository that calls a script which
generates the static content and deploys the website.

A

Answer: B,D
Option B is correct because you can use the PreBuild target in the csproj file to execute a custom command or script before the project is built. This way, you can run the static content generation script and include the generated files in the project output.
Option D is correct because you can use the .deployment file in the root of the repository to customize the deployment process and specify a custom deployment script. This way, you can run the static content generation script and deploy the website using the custom script.

40
Q

You develop a software as a service (SaaS) offering to manage photographs. Users upload
photos to a web service which then stores the photos in Azure Storage Blob storage. The storage
account type is General-purpose V2.
When photos are uploaded, they must be processed to produce and save a mobile-friendly
version of the image. The process to produce a mobile-friendly version of the image must start in
less than one minute.
You need to design the process that starts the photo processing.
Solution: Create an Azure Function app that uses the Consumption hosting model and that is
triggered from the blob upload.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
The answer should be "No". The Consumption plan can take up to several minutes to trigger the function.
“When your function app runs in the default Consumption plan, there may be a delay of up to several minutes between the blob being added or updated and the function being triggered. If you need low latency in your blob triggered functions, consider running your function app in an App Service plan.”

41
Q

You develop and deploy an Azure App Service API app to a Windows-hosted deployment slot
named Development. You create additional deployment slots named Testing and Production.
You enable auto swap on the Production deployment slot.
You need to ensure that scripts run and resources are available before a swap operation occurs.
Solution: Update the app with a method named statuscheck to run the scripts. Update the app
settings for the app. Set the WEBSITE_SWAP_WARMUP_PING_PATH and
WEBSITE_SWAP_WARMUP_PING_STATUSES with a path to the new method and
appropriate response codes.
Does the solution meet the goal?
A. No
B. Yes

A

Answer: B
The answer is Yes. You can customize the warm-up behavior with one or both of the following app settings:
WEBSITE_SWAP_WARMUP_PING_PATH: The path to ping to warm up your site. Add this app setting by specifying a custom path that begins with a slash as the value. An example is /statuscheck. The default value is /.
WEBSITE_SWAP_WARMUP_PING_STATUSES: Valid HTTP response codes for the warm-up operation. Add this app setting with a comma-separated list of HTTP codes. An example is 200,202 . If the returned status code isn’t in the list, the warmup and swap operations are stopped. By default, all response codes are valid.
WEBSITE_WARMUP_PATH: A relative path on the site that should be pinged whenever the site restarts (not only during slot swaps). Example values include /statuscheck or the root path, /.

42
Q

You are developing a web app that is protected by Azure Web Application Firewall (WAF). All
traffic to the web app is routed through an Azure Application Gateway instance that is used by
multiple web apps. The web app address is contoso.azurewebsites.net.
All traffic must be secured with SSL.
You need to configure the Azure Application Gateway for the web app.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. In the Azure Application Gateway’s HTTP setting, enable the Use for App service setting.
B. Convert the web app to run in an Azure App service environment (ASE).
C. Add an authentication certificate for contoso.azurewebsites.net to the Azure Application
Gateway.
D. In the Azure Application Gateway’s HTTP setting, set the value of the Override backend path
option to contoso22.azurewebsites.net.

A

Answer: AD
Explanation:
D: The ability to specify a host override is defined in the HTTP settings and can be applied to
any back-end pool during rule creation.
The ability to derive the host name from the IP or FQDN of the back-end pool members. HTTP
settings also provide an option to dynamically pick the host name from a back-end pool
member’s FQDN if configured with the option to derive host name from an individual back-end
pool member.
A (not C): SSL termination and end to end SSL with multi-tenant services.
In case of end to end SSL, trusted Azure services such as Azure App service web apps do not
require whitelisting the backends in the application gateway. Therefore, there is no need to add
any authentication certificates.

43
Q

You develop a software as a service (SaaS) offering to manage photographs. Users upload
photos to a web service which then stores the photos in Azure Storage Blob storage. The storage
account type is General-purpose V2.
When photos are uploaded, they must be processed to produce and save a mobile-friendly
version of the image. The process to produce a mobile-friendly version of the image must start in
less than one minute.
You need to design the process that starts the photo processing.
Solution: Use the Azure Blob Storage change feed to trigger photo processing.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Explanation:
The change feed is a log of changes that are organized into hourly segments but appended to and
updated every few minutes. These segments are created only when there are blob change events
that occur in that hour.
Instead catch the triggered event, so move the photo processing to an Azure Function triggered
from the blob upload.

44
Q

You are developing a web application that runs as an Azure Web App. The web application
stores data in Azure SQL Database and stores files in an Azure Storage account. The web
application makes HTTP requests to external services as part of normal operations.
The web application is instrumented with Application Insights. The external services are
OpenTelemetry compliant.
You need to ensure that the customer ID of the signed in user is associated with all operations
throughout the overall system.
What should you do?
A. Add the customer ID for the signed in user to the CorrelationContext in the web application
B. On the current SpanContext, set the TraceId to the customer ID for the signed in user
C. Set the header Ocp-Apim-Trace to the customer ID for the signed in user
D. Create a new SpanContext with the TraceFlags value set to the customer ID for the signed in
user

A

Answer: A
Choose option A: add the customer ID for the signed in user to the CorrelationContext in the web application.

The CorrelationContext is a way to associate contextual information with a request as it flows through the system. It allows you to track a request as it passes through different components of the system, and to identify related log entries and telemetry data. By adding the customer ID to the CorrelationContext in the web application, you can ensure that it is associated with all operations throughout the overall system. This will allow you to track the request and identify related log entries and telemetry data for a specific customer.

45
Q

You develop an HTTP triggered Azure Function app to process Azure Storage blob data. The
app is triggered using an output binding on the blob.
The app continues to time out after four minutes. The app must process the blob data.
You need to ensure the app does not time out and processes the blob data.
Solution: Update the functionTimeout property of the host.json project file to 10 minutes.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Explanation:
Instead pass the HTTP trigger payload into an Azure Service Bus queue to be processed by a
queue trigger function and return an immediate HTTP success response.
Note: Large, long-running functions can cause unexpected timeout issues. General best practices
include:
Whenever possible, refactor large functions into smaller function sets that work together and
return responses fast. For example, a webhook or HTTP trigger function might require an
acknowledgment response within a certain time limit; it’s common for webhooks to require an
immediate response. You can pass the HTTP trigger payload into a queue to be processed by a
queue trigger function. This approach lets you defer the actual work and return an immediate
response.

46
Q

You are developing an Azure Durable Function to manage an online ordering process.
The process must call an external API to gather product discount information.
You need to implement the Azure Durable Function.
Which Azure Durable Function types should you use? Each correct answer presents part of the
solution.
NOTE: Each correct selection is worth one point.
A. Orchestrator
B. Entity
C. Client
D. Activity

A

Answer: A, D
An orchestrator function defines the workflow, and the call to the external discount API is performed from an activity function; entity functions manage small pieces of state and are not required here. The right answer is Orchestrator and Activity.
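A minimal sketch (not the question's code) of that split using the azure-functions-durable Python package; the activity name, input shape, and the activity's own implementation (where the external API call would live) are assumptions.

import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    order = context.get_input()
    # The external discount API is called inside the "GetProductDiscount" activity function.
    discount = yield context.call_activity("GetProductDiscount", order["productId"])
    return {"productId": order["productId"], "discount": discount}

main = df.Orchestrator.create(orchestrator_function)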

47
Q

You develop Azure Durable Functions to manage vehicle loans.
The loan process includes multiple actions that must be run in a specified order. One of the
actions includes a customer credit check process, which may require multiple days to process.
You need to implement Azure Durable Functions for the loan process.
Which Azure Durable Functions type should you use?
A. orchestrator
B. client
C. entity
D. activity

A

Answer: A
Explanation:
Durable Functions is an extension of Azure Functions. You can use an orchestrator function to
orchestrate the execution of other Durable functions within a function app. Orchestrator
functions have the following characteristics:
Orchestrator functions define function workflows using procedural code. No declarative schemas
or designers are needed.
Orchestrator functions can call other durable functions synchronously and asynchronously.
Output from called functions can be reliably saved to local variables.
Orchestrator functions are durable and reliable. Execution progress is automatically
checkpointed when the function “awaits” or “yields”. Local state is never lost when the process
recycles or the VM reboots.
Orchestrator functions can be long-running. The total lifespan of an orchestration instance can be
seconds, days, months, or never-ending.

48
Q

You develop Azure Web Apps for a commercial diving company. Regulations require that all
divers fill out a health questionnaire every 15 days after each diving job starts.
You need to configure the Azure Web Apps so that the instance count scales up when divers are
filling out the questionnaire and scales down after they are complete.
You need to configure autoscaling.
What are two possible auto scaling configurations to achieve this goal? Each correct answer
presents a complete solution.
NOTE: Each correct selection is worth one point.
A. Recurrence profile
B. CPU usage-based autoscaling
C. Fixed date profile
D. Predictive autoscaling

A

Answer: B,D
The answer should be:
B. CPU usage-based autoscaling
D. Predictive autoscaling

A. Recurrence profile is used to schedule the scaling of resources at specific times or dates, but it does not meet the requirement to scale up when divers are filling out the questionnaire and scale down after they are complete. It only triggers scaling based on a set schedule, not based on actual usage.

C. Fixed date profile is used to specify the number of instances at a specific date and time, but it also does not meet the requirement to dynamically scale based on actual usage. It only sets a fixed number of instances and does not adjust based on changing workloads.

49
Q

You are building a website that uses Azure Blob storage for data storage. You configure Azure
Blob storage lifecycle to move all blobs to the archive tier after 30 days.
Customers have requested a service-level agreement (SLA) for viewing data older than 30 days.
You need to document the minimum SLA for data recovery.
Which SLA should you use?
A. at least two days
B. between one and 15 hours
C. at least one day
D. between zero and 60 minutes

A

Answer: B
Explanation:
The archive access tier has the lowest storage cost. But it has higher data retrieval costs
compared to the hot and cool tiers. Data in the archive tier can take several hours to retrieve
depending on the priority of the rehydration. For small objects, a high priority rehydrate may
retrieve the object from archive in under 1 hour.

50
Q

You are developing an Azure solution to collect point-of-sale (POS) device data from
2,000 stores located throughout the world. A single device can produce 2 megabytes (MB) of
data every 24 hours. Each store location has one to five devices that send data.
You must store the device data in Azure Blob storage. Device data must be correlated based on a
device identifier. Additional stores are expected to open in the future.
You need to implement a solution to receive the device data.
Solution: Provision an Azure Event Grid. Configure the machine identifier as the partition key
and enable capture.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: A
Partitions: Event Hubs Dedicated supports 2,000 partitions per capacity unit (CU), so 2,000+ partitions are available.
Size: the data volume per device is within the allowed limits.

51
Q

You develop Azure solutions.
A .NET application needs to receive a message each time an Azure virtual machine finishes
processing data. The messages must NOT persist after being processed by the receiving
application.
You need to implement the .NET object that will receive the messages.
Which object should you use?
A. QueueClient
B. SubscriptionClient
C. TopicClient
D. CloudQueueClient

A

Answer: A

Azure.Storage.Queues.QueueClient: .NET v12
Microsoft.Azure.Storage.Queue.CloudQueueClient: .NET v11 (Legacy)

So the question is really about which queue client you should use, and the key phrase is that the "messages must NOT persist after being processed".

Azure.Storage.Queues.QueueClient supports "At-Most-Once" delivery mode, while the legacy CloudQueueClient doesn't.
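The question targets the .NET client, but as a hedged Python illustration of the same receive-then-delete behaviour (assuming the azure-storage-queue package and a hypothetical queue name), the message is removed once it has been processed, so it does not persist:

from azure.storage.queue import QueueClient

queue = QueueClient.from_connection_string("<storage-connection-string>", "vm-processing-done")

for message in queue.receive_messages():
    print("processing notification:", message.content)
    queue.delete_message(message)  # remove the message so it does not persist after processing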

52
Q

You develop Azure solutions.
You must connect to a No-SQL globally-distributed database by using the .NET API.
You need to create an object to configure and execute requests in the database.
Which code segment should you use?
A. new Container(EndpointUri, PrimaryKey);
B. new Database(EndpointUri, PrimaryKey);
C. new CosmosClient(EndpointUri, PrimaryKey);

A

Answer: C
Explanation:
Example:
// Create a new instance of the Cosmos Client
this.cosmosClient = new CosmosClient(EndpointUri, PrimaryKey)
//ADD THIS PART TO YOUR CODE
await this.CreateDatabaseAsync();

53
Q

You have an existing Azure storage account that stores large volumes of data across multiple
containers.
You need to copy all data from the existing storage account to a new storage account. The copy
process must meet the following requirements:
- Automate data movement.
- Minimize user input required to perform the operation.
- Ensure that the data movement process is recoverable.
What should you use?
A. AzCopy
B. Azure Storage Explorer
C. Azure portal
D. .NET Storage Client Library

A

Answer: A
Explanation:
You can copy blobs, directories, and containers between storage accounts by using the AzCopy
v10 command-line utility.
The copy operation is synchronous so when the command returns, that indicates that all files
have been copied.

54
Q

You are developing an Azure Cosmos DB solution by using the Azure Cosmos DB SQL API.
The data includes millions of documents. Each document may contain hundreds of properties.
The properties of the documents do not contain distinct values for partitioning. Azure Cosmos
DB must scale individual containers in the database to meet the performance needs of the
application by spreading the workload evenly across all partitions over time.
You need to select a partition key.
Which two partition keys can you use? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. a single property value that does not appear frequently in the documents
B. a value containing the collection name
C. a single property value that appears frequently in the documents
D. a concatenation of multiple property values with a random suffix appended
E. a hash suffix appended to a property value

A

Answer: D,E
Explanation:
You can form a partition key by concatenating multiple property values into a single artificial
partitionKey property. These keys are referred to as synthetic keys.
Another possible strategy to distribute the workload more evenly is to append a random number
at the end of the partition key value. When you distribute items in this way, you can perform
parallel write operations across partitions.
Note: It’s the best practice to have a partition key with many distinct values, such as hundreds or
thousands. The goal is to distribute your data and workload evenly across the items associated
with these partition key values. If such a property doesn’t exist in your data, you can construct a
synthetic partition key.
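A small, self-contained sketch (property names are hypothetical) of the two strategies the explanation describes: concatenating several properties into a synthetic key and appending a random suffix to spread writes across partitions.

import random

def synthetic_partition_key(doc: dict) -> str:
    # Concatenate multiple property values into one artificial partition key value...
    base = f"{doc['country']}-{doc['city']}"
    # ...and append a random suffix so writes are distributed over more partitions.
    return f"{base}-{random.randint(0, 99)}"

item = {"country": "US", "city": "Seattle"}
item["partitionKey"] = synthetic_partition_key(item)
print(item["partitionKey"])  # e.g. US-Seattle-42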

55
Q

You develop and deploy a web application to Azure App Service. The application accesses data
stored in an Azure Storage account. The account contains several containers with several blobs
with large amounts of data. You deploy all Azure resources to a single region.
You need to move the Azure Storage account to the new region. You must copy all data to the
new region.
What should you do first?
A. Export the Azure Storage account Azure Resource Manager template
B. Initiate a storage account failover
C. Configure object replication for all blobs
D. Use the AzCopy command line tool
E. Create a new Azure Storage account in the current region
F. Create a new subscription in the current region

A

Answer: A
Explanation:
To move a storage account, create a copy of your storage account in another region. Then, move
your data to that account by using AzCopy, or another tool of your choice and finally, delete the
resources in the source region.
To get started, export and then modify a Resource Manager template.

56
Q

An organization deploys Azure Cosmos DB.
You need to ensure that the index is updated as items are created, updated, or deleted.
What should you do?
A. Set the indexing mode to Lazy.
B. Set the value of the automatic property of the indexing policy to False.
C. Set the value of the EnableScanInQuery option to True.
D. Set the indexing mode to Consistent.

A

Answer: D
Explanation:
Azure Cosmos DB supports two indexing modes:
Consistent: The index is updated synchronously as you create, update or delete items. This
means that the consistency of your read queries will be the consistency configured for the
account.
None: Indexing is disabled on the container.
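A hedged sketch, assuming the azure-cosmos Python package and hypothetical database, container, and key names, of creating a container whose indexing mode is consistent so the index is updated synchronously as items change:

from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient("<account-endpoint>", "<account-key>")
database = client.create_database_if_not_exists(id="SalesDb")

container = database.create_container_if_not_exists(
    id="Orders",
    partition_key=PartitionKey(path="/customerId"),
    # "consistent" keeps the index in sync with create, update, and delete operations.
    indexing_policy={"indexingMode": "consistent", "automatic": True})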

57
Q

You are developing a .Net web application that stores data in Azure Cosmos DB. The application
must use the Core API and allow millions of reads and writes. The Azure Cosmos DB account
has been created with multiple write regions enabled. The application has been deployed to the
East US2 and Central US regions.
You need to update the application to support multi-region writes.
What are two possible ways to achieve this goal? Each correct answer presents part of the
solution.
NOTE: Each correct selection is worth one point.
A. Update the ConnectionPolicy class for the Cosmos client and populate the PreferredLocations
property based on the geo-proximity of the application.
B. Update Azure Cosmos DB to use the Strong consistency level. Add indexed properties to the
container to indicate region.
C. Update the ConnectionPolicy class for the Cosmos client and set the
UseMultipleWriteLocations property to true.
D. Create and deploy a custom conflict resolution policy.
E. Update Azure Cosmos DB to use the Session consistency level. Send the SessionToken
property value from the FeedResponse object of the write action to the end-user by using a
cookie.

A

Answer: A,C
The goal is "You need to update the application to support multi-region writes": enable multi-region writes (option C, UseMultipleWriteLocations = true) and add the preferred regions (option A, PreferredLocations).
You then have to apply a conflict resolution policy. This can be Last Writer Wins (the default, not mentioned here) or a custom policy (option D).
Hence there is only one way to support multi-region writes (apply both C and A), and there are two ways to apply conflict resolution policies to resolve write, update, and delete conflicts, of which one is listed in the answer choices (D).
To support multi-region writes the answer is A and C, and both have to be set together, not one or the other.

58
Q

You are developing an application to store business-critical data in Azure Blob storage.
The application must meet the following requirements:
- Data must not be modified or deleted for a user-specified interval.
- Data must be protected from overwrites and deletes.
- Data must be written once and allowed to be read many times.
You need to protect the data in the Azure Blob storage account.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Configure a time-based retention policy for the storage account.
B. Create an account shared-access signature (SAS).
C. Enable the blob change feed for the storage account.
D. Enable version-level immutability support for the storage account.
E. Enable point-in-time restore for containers in the storage account.
F. Create a service shared-access signature (SAS).

A

Answer: A,D
A. Configure a time-based retention policy for the storage account
- A time-based retention policy stores blob data in a Write-Once, Read-Many (WORM) format for a specified interval. When a time-based retention policy is set, clients can create and read blobs, but can’t modify or delete them. After the retention interval has expired, blobs can be deleted but not overwritten.
D. Before you can apply a time-based retention policy to a blob version, you must enable support for version-level immutability.

59
Q

You are updating an application that stores data on Azure and uses Azure Cosmos DB for
storage. The application stores data in multiple documents associated with a single username.
The application requires the ability to update multiple documents for a username in a single
ACID operation.
You need to configure Azure Cosmos DB.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Create a collection sharded on username to store documents.
B. Configure Azure Cosmos DB to use the Gremlin API.
C. Create an unsharded collection to store documents.
D. Configure Azure Cosmos DB to use the MongoDB API.

A

Answer: CD
Explanation:
C: Multi-document transactions are supported within an unsharded collection in API version 4.0. They are not supported across collections or in sharded collections in 4.0.
D: In Azure Cosmos DB for MongoDB, operations on a single document are atomic. Multi-document transactions enable applications to execute atomic operations across multiple documents. They offer "all-or-nothing" semantics: on commit, the changes made inside the transaction are persisted, and if the transaction fails, all changes inside the transaction are discarded.
Multi-document transactions follow ACID semantics:
Atomicity: All operations treated as one
Consistency: Data committed is valid
Isolation: Isolated from other operations
Durability: Transaction data is persisted when client is told so

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
60
Q

You develop Azure solutions.
You must connect to a No-SQL globally-distributed database by using the .NET API.
You need to create an object to configure and execute requests in the database.
Which code segment should you use?
A.
database_name = 'MyDatabase'
database = client.create_database_if_not_exists(id=database_name)
B.
client = CosmosClient(endpoint, key)
C.
container_name = 'MyContainer'
container = database.create_container_if_not_exists(
    id=container_name, partition_key=PartitionKey(path="/lastName"),
    offer_throughput=400 )

A

Answer: B
CosmosClient has to be created before the operations in options A and C (creating a database and a container) can run. Putting the pieces together:

from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(endpoint, key)
database_name = 'MyDatabase'
database = client.create_database_if_not_exists(id=database_name)
container_name = 'MyContainer'
container = database.create_container_if_not_exists(
    id=container_name,
    partition_key=PartitionKey(path="/lastName"),
    offer_throughput=400)

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
61
Q

You develop a web application that provides access to legal documents that are stored on Azure
Blob Storage with version-level immutability policies. Documents are protected with both time-based policies and legal hold policies. All time-based retention policies have the
AllowProtectedAppendWrites property enabled.
You have a requirement to prevent the user from attempting to perform operations that would
fail only when a legal hold is in effect and when all other policies are expired.
You need to meet the requirement.
Which two operations should you prevent? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. adding data to documents
B. deleting documents
C. creating documents
D. overwriting existing documents

A

Answer: BD
Explanation:
The Append Block operation is permitted only for policies with the
allowProtectedAppendWrites or allowProtectedAppendWritesAll property enabled.
The AllowProtectedAppendWrites property setting allows for writing new blocks to an append
blob while maintaining immutability protection and compliance. If this setting is enabled, you
can create an append blob directly in the policy-protected container, and then continue to add
new blocks of data to the end of the append blob with the Append Block operation. Only new
blocks can be added; any existing blocks can’t be modified or deleted. Enabling this setting
doesn’t affect the immutability behavior of block blobs or page blobs.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
62
Q

You are developing a Java application that uses Cassandra to store key and value data. You plan
to use a new Azure Cosmos DB resource and the Cassandra API in the application. You create
an Azure Active Directory (Azure AD) group named Cosmos DB Creators to enable
provisioning of Azure Cosmos accounts, databases, and containers.
The Azure AD group must not be able to access the keys that are required to access the data.
You need to restrict access to the Azure AD group.
Which role-based access control should you use?
A. DocumentDB Accounts Contributor
B. Cosmos Backup Operator
C. Cosmos DB Operator
D. Cosmos DB Account Reader

A

Answer: C
Explanation:
Azure Cosmos DB now provides a new RBAC role, Cosmos DB Operator. This new role lets
you provision Azure Cosmos accounts, databases, and containers, but can’t access the keys that
are required to access the data. This role is intended for use in scenarios where the ability to
grant access to Azure Active Directory service principals to manage deployment operations for
Cosmos DB is needed, including the account, database, and containers.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
63
Q

You are developing a website that will run as an Azure Web App. Users will authenticate by
using their Azure Active Directory (Azure AD) credentials.
You plan to assign users one of the following permission levels for the website: admin, normal,
and reader. A user’s Azure AD group membership must be used to determine the permission
level.
You need to configure authorization.
Solution: Configure the Azure Web App for the website to allow only authenticated requests and
require Azure AD log on.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Explanation:
Instead, in the Azure AD application's manifest, set the value of the groupMembershipClaims option to All. In the website, use the value of the groups claim from the JWT for the user to determine permissions.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
64
Q

You are developing a website that will run as an Azure Web App. Users will authenticate by
using their Azure Active Directory (Azure AD) credentials.
You plan to assign users one of the following permission levels for the website: admin, normal,
and reader. A user’s Azure AD group membership must be used to determine the permission
level.
You need to configure authorization.
Solution:
 Create a new Azure AD application. In the application’s manifest, set value of the
groupMembershipClaims option to All.
 In the website, use the value of the groups claim from the JWT for the user to determine
permissions.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: A
Explanation:
To configure Manifest to include Group Claims in Auth Token
1. Go to Azure Active Directory to configure the Manifest. Click on Azure Active
Directory, and go to App registrations to find your application:
2. Click on your application (or search for it if you have a lot of apps) and edit the Manifest
by clicking on it.
3. Locate the “groupMembershipClaims” setting. Set its value to either “SecurityGroup” or
“All”. To help you decide which:
 “SecurityGroup” - groups claim will contain the identifiers of all security groups
of which the user is a member.
 “All” - groups claim will contain the identifiers of all security groups and all
distribution lists of which the user is a member
The tokens issued for your application will now include group claims, and you can use them in your code, as sketched below.
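
A minimal sketch of consuming the groups claim in an ASP.NET Core app (the group object IDs are placeholders, and it assumes the Azure AD middleware has already validated the sign-in):

using System.Linq;
using System.Security.Claims;

public static class PermissionHelper
{
    // Object IDs of the Azure AD groups mapped to each permission level (placeholder values)
    private const string AdminGroupId = "11111111-1111-1111-1111-111111111111";
    private const string NormalGroupId = "22222222-2222-2222-2222-222222222222";

    public static string GetPermissionLevel(ClaimsPrincipal user)
    {
        // The "groups" claim carries the object IDs of the user's security groups
        var groupIds = user.Claims
            .Where(c => c.Type == "groups")
            .Select(c => c.Value)
            .ToHashSet();

        if (groupIds.Contains(AdminGroupId)) return "admin";
        if (groupIds.Contains(NormalGroupId)) return "normal";
        return "reader";
    }
}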

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
65
Q

You are developing a website that will run as an Azure Web App. Users will authenticate by
using their Azure Active Directory (Azure AD) credentials.
You plan to assign users one of the following permission levels for the website: admin, normal,
and reader. A user’s Azure AD group membership must be used to determine the permission
level.
You need to configure authorization.
Solution:
 Create a new Azure AD application. In the application’s manifest, define application
roles that match the required permission levels for the application.
 Assign the appropriate Azure AD group to each role. In the website, use the value of the
roles claim from the JWT for the user to determine permissions.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: A
The roles are assigned through Azure AD groups, so the requirement that "A user's Azure AD group membership must be used to determine the permission level" is met.

This solution should therefore be answered with Yes.

This scenario has two working solutions, since the groupMembershipClaims approach works as well. That is allowed: "Some question sets might have more than one correct solution, while others might not have a correct solution."
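
For illustration, once app roles named admin, normal, and reader are defined in the manifest and Azure AD groups are assigned to them, the roles claim can be checked like this in ASP.NET Core (a sketch that assumes the authentication middleware maps the roles claim to role claims, as Microsoft.Identity.Web does by default):

using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;

public class ReportsController : Controller
{
    // Only users whose token carries the "admin" app role reach this action
    [Authorize(Roles = "admin")]
    public IActionResult AdminDashboard() => View();

    // Roles can also be checked imperatively
    public IActionResult Edit()
    {
        if (!User.IsInRole("admin") && !User.IsInRole("normal"))
        {
            return Forbid();
        }
        return View();
    }
}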

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
66
Q

You provide an Azure API Management managed web service to clients. The back-end web
service implements HTTP Strict Transport Security (HSTS).
Every request to the backend service must include a valid HTTP authorization header.
You need to configure the Azure API Management instance with an authentication policy.
Which two policies can you use? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. Basic Authentication
B. Digest Authentication
C. Certificate Authentication
D. OAuth Client Credential Grant

A

Answer: A,C
The API Management documentation lists only three authentication policies:
Authentication policies
Authenticate with Basic - Authenticate with a backend service using Basic authentication.
Authenticate with client certificate - Authenticate with a backend service using client certificates.
Authenticate with managed identity - Authenticate with the managed identity for the API Management service.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
67
Q

You have an application that includes an Azure Web app and several Azure Function apps.
Application secrets including connection strings and certificates are stored in Azure Key Vault.
Secrets must not be stored in the application or application runtime environment. Changes to
Azure Active Directory (Azure AD) must be minimized.
You need to design the approach to loading application secrets.
What should you do?
A. Create a single user-assigned Managed Identity with permission to access Key Vault and
configure each App Service to use that Managed Identity.
B. Create a single Azure AD Service Principal with permission to access Key Vault and use a
client secret from within the App Services to access Key Vault.
C. Create a system assigned Managed Identity in each App Service with permission to access
Key Vault.
D. Create an Azure AD Service Principal with Permissions to access Key Vault for each App
Service and use a certificate from within the App Services to access Key Vault.

A

Answer: A
Because there is more than one app (a web app plus several function apps), a managed identity is needed; the question is whether to create one identity per app or one shared identity.
A system-assigned managed identity would mean one identity per app, while a single user-assigned managed identity can be reused by every app, which keeps changes to Azure AD to a minimum.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
68
Q

You are developing a medical records document management website. The website is used to
store scanned copies of patient intake forms.
If the stored intake forms are downloaded from storage by a third party, the contents of the forms
must not be compromised.
You need to store the intake forms according to the requirements.
Solution:
1. Create an Azure Key Vault key named skey.
2. Encrypt the intake forms using the public key portion of skey.
3. Store the encrypted data in Azure Blob storage.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: A
Encrypting the forms with the public portion of an asymmetric Key Vault key before writing them to Blob storage means a third party who downloads the blobs cannot read the contents without the private key, so the requirement is met.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
69
Q

You are developing a medical records document management website. The website is used to
store scanned copies of patient intake forms.
If the stored intake forms are downloaded from storage by a third party, the contents of the forms
must not be compromised.
You need to store the intake forms according to the requirements.
Solution:
1. Create an Azure Cosmos DB database with Storage Service Encryption enabled.
2. Store the intake forms in the Azure Cosmos DB database.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Explanation:
Instead, use an Azure Key Vault key and public key encryption, and store the encrypted forms in Azure Blob storage. Storage Service Encryption only protects data at rest; it does not prevent a party who downloads the documents from reading them.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
70
Q

Your company is developing an Azure API hosted in Azure.
You need to implement authentication for the Azure API to access other Azure resources. You
have the following requirements:
 All API calls must be authenticated.
 Callers to the API must not send credentials to the API.
Which authentication mechanism should you use?
A. Basic
B. Anonymous
C. Managed identity
D. Client certificate

A

Answer: C
Explanation:
Azure Active Directory Managed Service Identity (MSI) gives your code an automatically
managed identity for authenticating to Azure services, so that you can keep credentials out of
your code.
Note: Use the authentication-managed-identity policy to authenticate with a backend service
using the managed identity. This policy essentially uses the managed identity to obtain an access
token from Azure Active Directory for accessing the specified resource. After successfully
obtaining the token, the policy will set the value of the token in the Authorization header using
the Bearer scheme.
Incorrect Answers:
A: Use the authentication-basic policy to authenticate with a backend service using Basic
authentication. This policy effectively sets the HTTP Authorization header to the value
corresponding to the credentials provided in the policy.
B: Anonymous is no authentication at all.
D: Your code needs credentials to authenticate to cloud services, but you want to limit the
visibility of those credentials as much as possible. Ideally, they never appear on a developer’s
workstation or get checked-in to source control. Azure Key Vault can store credentials securely
so they aren’t in your code, but to retrieve them you need to authenticate to Azure Key Vault. To
authenticate to Key Vault, you need a credential! A classic bootstrap problem.
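
For illustration (not part of the question), code running with a managed identity can obtain an Azure Resource Manager token through the Azure.Identity library; the scope below is the standard ARM scope, and the rest is a placeholder sketch:

using System;
using System.Threading.Tasks;
using Azure.Core;
using Azure.Identity;

class Program
{
    static async Task Main()
    {
        // DefaultAzureCredential resolves to the managed identity when the code runs in Azure
        var credential = new DefaultAzureCredential();

        AccessToken token = await credential.GetTokenAsync(
            new TokenRequestContext(new[] { "https://management.azure.com/.default" }));

        // The token is then sent as "Authorization: Bearer <token>" on ARM requests
        Console.WriteLine($"Token expires on {token.ExpiresOn}");
    }
}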

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
71
Q

You develop Azure solutions.
You must grant a virtual machine (VM) access to specific resource groups in Azure Resource
Manager.
You need to obtain an Azure Resource Manager access token.
Solution: Use an X.509 certificate to authenticate the VM with Azure Resource Manager.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Explanation:
Instead run the Invoke-RestMethod cmdlet to make a request to the local managed identity for
Azure resources endpoint.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
72
Q

You are developing a website that will run as an Azure Web App. Users will authenticate by
using their Azure Active Directory (Azure AD) credentials.
You plan to assign users one of the following permission levels for the website: admin, normal,
and reader. A user’s Azure AD group membership must be used to determine the permission
level.
You need to configure authorization.
Solution:
 Configure and use Integrated Windows Authentication in the website.
 In the website, query Microsoft Graph API to load the groups to which the user is a
member.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Explanation:
Microsoft Graph is a RESTful web API that enables you to access Microsoft Cloud service
resources.
Instead, in the Azure AD application's manifest, set the value of the groupMembershipClaims option to All. In the website, use the value of the groups claim from the JWT for the user to determine permissions.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
73
Q

You develop Azure solutions.
You must grant a virtual machine (VM) access to specific resource groups in Azure Resource
Manager.
You need to obtain an Azure Resource Manager access token.
Solution: Run the Invoke-RestMethod cmdlet to make a request to the local managed identity for
Azure resources endpoint.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: A
Explanation:
Get an access token using the VM’s system-assigned managed identity and use it to call Azure
Resource Manager
You will need to use PowerShell in this portion.
1. In the portal, navigate to Virtual Machines and go to your Windows virtual machine and
in the Overview, click Connect.
2. Enter in your Username and Password for which you added when you created the
Windows VM.
3. Now that you have created a Remote Desktop Connection with the virtual machine, open
PowerShell in the remote session.
4. Using the Invoke-WebRequest cmdlet, make a request to the local managed identity for
Azure resources endpoint to get an access token for Azure Resource Manager.
Example:
$response = Invoke-WebRequest -Uri 'http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https://management.azure.com/' -Method GET -Headers @{Metadata="true"}

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
74
Q

You develop an app that allows users to upload photos and videos to Azure storage. The app
uses a storage REST API call to upload the media to a blob storage account named Account1.
You have blob storage containers named Container1 and Container2.
Uploading of videos occurs on an irregular basis.
You need to copy specific blobs from Container1 to Container2 when a new video is uploaded.
What should you do?
A. Copy blobs to Container2 by using the Put Blob operation of the Blob Service REST API
B. Create an Event Grid topic that uses the Start-AzureStorageBlobCopy cmdlet
C. Use AzCopy with the Snapshot switch to copy blobs to Container2
D. Download the blob to a virtual machine and then upload the blob to Container2

A

Answer: B
Explanation:
The Start-AzureStorageBlobCopy cmdlet starts to copy a blob.
Example 1: Copy a named blob
C:\PS> Start-AzureStorageBlobCopy -SrcBlob "ContosoPlanning2015" -DestContainer "ContosoArchives" -SrcContainer "ContosoUploads"
This command starts the copy operation of the blob named ContosoPlanning2015 from the
container named ContosoUploads to the container named ContosoArchives.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
75
Q

You are developing an ASP.NET Core website that uses Azure FrontDoor. The website is used
to build custom weather data sets for researchers. Data sets are downloaded by users as Comma
Separated Value (CSV) files. The data is refreshed every 10 hours.
Specific files must be purged from the FrontDoor cache based upon Response Header values.
You need to purge individual assets from the Front Door cache.
Which type of cache purge should you use?
A. single path
B. wildcard
C. root domain

A

Answer: A
Explanation:
These formats are supported in the lists of paths to purge:
 Single path purge: Purge individual assets by specifying the full path of the asset (without the protocol and domain), with the file extension, for example, /pictures/strasbourg.png.
 Wildcard purge: An asterisk (*) may be used as a wildcard. Purge all folders, subfolders, and files under an endpoint with /* in the path, or purge all subfolders and files under a specific folder by specifying the folder followed by /*, for example, /pictures/*.
 Root domain purge: Purge the root of the endpoint with "/" in the path.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
76
Q

Your company is developing an Azure API.
You need to implement authentication for the Azure API. You have the following requirements:
 All API calls must be secure.
 Callers to the API must not send credentials to the API.
Which authentication mechanism should you use?
A. Basic
B. Anonymous
C. Managed identity
D. Client certificate

A

Answer: C
Explanation:
Use the authentication-managed-identity policy to authenticate with a backend service using the
managed identity of the API Management service. This policy essentially uses the managed
identity to obtain an access token from Azure Active Directory for accessing the specified
resource. After successfully obtaining the token, the policy will set the value of the token in the
Authorization header using the Bearer scheme.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
77
Q

You are a developer for a SaaS company that offers many web services.
All web services for the company must meet the following requirements:
 Use API Management to access the services
 Use OpenID Connect for authentication
 Prevent anonymous usage
A recent security audit found that several web services can be called without any authentication.
Which API Management policy should you implement?
A. jsonp
B. authentication-certificate
C. check-header
D. validate-jwt

A

Answer: D
Explanation:
Add the validate-jwt policy to validate the OAuth token for every incoming request.
Incorrect Answers:
A: The jsonp policy adds JSON with padding (JSONP) support to an operation or an API to
allow cross-domain calls from JavaScript browser-based clients. JSONP is a method used in
JavaScript programs to request data from a server in a different domain. JSONP bypasses the
limitation enforced by most web browsers where access to web pages must be in the same
domain.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
78
Q

You are developing an Azure App Service REST API.
The API must be called by an Azure App Service web app. The API must retrieve and update
user profile information stored in Azure Active Directory (Azure AD).
You need to configure the API to make the updates.
Which two tools should you use? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Microsoft Graph API
B. Microsoft Authentication Library (MSAL)
C. Azure API Management
D. Microsoft Azure Security Center
E. Microsoft Azure Key Vault SDK

A

Answer: A,B
To configure the Azure App Service REST API to retrieve and update user profile information stored in Azure Active Directory (Azure AD), you should use the following tools:
A. Microsoft Graph API: The Microsoft Graph API allows you to interact with data in Azure AD, including retrieving and updating user profile information.
B. Microsoft Authentication Library (MSAL): MSAL is used for handling authentication in your application. It helps you authenticate users and acquire access tokens, which are necessary when making requests to the Microsoft Graph API.
Therefore, the correct answers are A (Microsoft Graph API) and B (Microsoft Authentication Library).

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
79
Q

You develop a REST API. You implement a user delegation SAS token to communicate with
Azure Blob storage.
The token is compromised.
You need to revoke the token.
What are two possible ways to achieve this goal? Each correct answer presents a complete
solution.
NOTE: Each correct selection is worth one point.
A. Revoke the delegation key.
B. Delete the stored access policy.
C. Regenerate the account key.
D. Remove the role assignment for the security principal.

A

Answer: A,D
There are two ways to create a SAS:
(1) The "standard" way is to sign the SAS token with the storage account key.
(2) A "user delegation" SAS is signed with Azure AD credentials (a user delegation key) instead of the storage account key.
This question is about (2), so the token is revoked by revoking the user delegation key (A) or by removing the RBAC role assignment for the security principal (D); regenerating the account key only revokes tokens signed with that key.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
80
Q

You develop and deploy an Azure Logic app that calls an Azure Function app. The Azure
Function app includes an OpenAPI (Swagger) definition and uses an Azure Blob storage
account. All resources are secured by using Azure Active Directory (Azure AD).
The Azure Logic app must securely access the Azure Blob storage account. Azure AD resources
must remain if the Azure Logic app is deleted.
You need to secure the Azure Logic app.
What should you do?
A. Create a user-assigned managed identity and assign role-based access controls.
B. Create an Azure AD custom role and assign the role to the Azure Blob storage account.
C. Create an Azure Key Vault and issue a client certificate.
D. Create a system-assigned managed identity and issue a client certificate.
E. Create an Azure AD custom role and assign role-based access controls.

A

Answer: A
Explanation:
To give a managed identity access to an Azure resource, you need to add a role to the target
resource for that identity.
Note: To easily authenticate access to other resources that are protected by Azure Active
Directory (Azure AD) without having to sign in and provide credentials or secrets, your logic
app can use a managed identity (formerly known as Managed Service Identity or MSI). Azure
manages this identity for you and helps secure your credentials because you don’t have to
provide or rotate secrets.
If you set up your logic app to use the system-assigned identity or a manually created, user-assigned identity, the function in your logic app can also use that same identity for authentication.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
81
Q

You are developing a solution that will use a multi-partitioned Azure Cosmos DB database. You
plan to use the latest Azure Cosmos DB SDK for development.
The solution must meet the following requirements:
 Send insert and update operations to an Azure Blob storage account.
 Process changes to all partitions immediately.
 Allow parallelization of change processing.
You need to process the Azure Cosmos DB operations.
What are two possible ways to achieve this goal? Each correct answer presents a complete
solution.
NOTE: Each correct selection is worth one point.
A. Create an Azure App Service API and implement the change feed estimator of the SDK.
Scale the API by using multiple Azure App Service instances.
B. Create a background job in an Azure Kubernetes Service and implement the change feed
feature of the SDK.
C. Create an Azure Function to use a trigger for Azure Cosmos DB. Configure the trigger to
connect to the container.
D. Create an Azure Function that uses a FeedIterator object that processes the change feed by
using the pull model on the container. Use a FeedRange object to parallelize the processing of
the change feed across multiple functions.

A

Answer: C,D
C: “Because Azure Functions uses the change feed processor behind the scenes, it automatically parallelizes change processing across your container’s partitions.”
D: “You can use the change feed pull model to consume the Azure Cosmos DB change feed at your own pace. Similar to the change feed processor, you can use the change feed pull model to parallelize the processing of changes across multiple change feed consumers.”
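
A rough sketch of option D with the .NET SDK v3 pull model (treat the exact type and enum names as assumptions, since they vary slightly between SDK versions; the Item type and the Blob upload step are placeholders):

using System.Net;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public class Item { public string id { get; set; } }

public static class ChangeFeedPull
{
    // One FeedIterator per FeedRange lets several consumers work in parallel;
    // the ranges come from: var ranges = await container.GetFeedRangesAsync();
    public static async Task ProcessFeedRangeAsync(Container container, FeedRange range)
    {
        FeedIterator<Item> iterator = container.GetChangeFeedIterator<Item>(
            ChangeFeedStartFrom.Beginning(range),
            ChangeFeedMode.Incremental);   // may be named LatestVersion in newer SDK versions

        while (iterator.HasMoreResults)
        {
            FeedResponse<Item> response = await iterator.ReadNextAsync();

            // 304 means the feed is currently drained; back off and poll again later
            if (response.StatusCode == HttpStatusCode.NotModified)
            {
                break;
            }

            foreach (Item item in response)
            {
                // Write the insert/update to the Azure Blob storage account here
            }
        }
    }
}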

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
82
Q

You deploy an Azure App Service web app. You create an app registration for the app in Azure
Active Directory (Azure AD) and Twitter.
The app must authenticate users and must use SSL for all communications. The app must use
Twitter as the identity provider.
You need to validate the Azure AD request in the app code.
What should you validate?
A. ID token header
B. ID token signature
C. HTTP response code
D. Tenant ID

A

Answer: B
To validate the Azure AD request in the app code when using Twitter as the identity provider, you should validate the ID token signature (option B).

The ID token is a JSON Web Token (JWT) that contains claims about the user. It is signed by Azure AD using a private key, and the signature can be verified using the corresponding public key. Validating the ID token signature ensures that the token was issued by a trusted source and that it has not been tampered with in transit.

Option A, validating the ID token header, is not sufficient for validating the entire ID token. The header only contains metadata about the token, such as the algorithm used for signing.

Option C, validating the HTTP response code, is unrelated to validating the ID token.

Option D, validating the tenant ID, is important for ensuring that the app is only accepting tokens from a trusted Azure AD tenant, but it does not ensure the integrity of the token itself.
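
For illustration, signature validation is normally delegated to a JWT library rather than hand-rolled; a minimal sketch with System.IdentityModel.Tokens.Jwt (the tenant ID, client ID, and the signing keys downloaded from the Azure AD OpenID Connect metadata endpoint are assumed inputs):

using System.Collections.Generic;
using System.IdentityModel.Tokens.Jwt;
using Microsoft.IdentityModel.Tokens;

public static class IdTokenValidator
{
    public static bool IsValid(string idToken, string tenantId, string clientId,
        IEnumerable<SecurityKey> signingKeys)
    {
        var parameters = new TokenValidationParameters
        {
            ValidIssuer = $"https://login.microsoftonline.com/{tenantId}/v2.0",
            ValidAudience = clientId,
            // Public keys published at the Azure AD OpenID Connect metadata endpoint
            IssuerSigningKeys = signingKeys
        };

        var handler = new JwtSecurityTokenHandler();
        try
        {
            // Throws if the signature (or issuer, audience, lifetime) is invalid
            handler.ValidateToken(idToken, parameters, out SecurityToken _);
            return true;
        }
        catch (SecurityTokenException)
        {
            return false;
        }
    }
}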

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
83
Q

A development team is creating a new REST API. The API will store data in Azure Blob
storage. You plan to deploy the API to Azure App Service.
Developers must access the Azure Blob storage account to develop the API for the next two months. The Azure Blob storage account must not be accessible by the developers after the two-month time period.
You need to grant developers access to the Azure Blob storage account.
What should you do?
A. Generate a shared access signature (SAS) for the Azure Blob storage account and provide the
SAS to all developers.
B. Create and apply a new lifecycle management policy to include a last accessed date value.
Apply the policy to the Azure Blob storage account.
C. Provide all developers with the access key for the Azure Blob storage account. Update the
API to include the Coordinated Universal Time (UTC) timestamp for the request header.
D. Grant all developers access to the Azure Blob storage account by assigning role-based access
control (RBAC) roles

A

Answer: A
A. Generate a shared access signature (SAS) for the Azure Blob storage account and provide the SAS to all developers.

A shared access signature (SAS) is a secure token that can be used to grant temporary and revocable access to a blob container or individual blobs. You can specify an expiration time for the SAS, so it will automatically expire after the two-month time period, making the blob storage account no longer accessible to the developers.
This approach allows you to grant the developers the necessary access to the Azure Blob storage account while still maintaining control over the access, and it also allows you to revoke access easily after the two-month time period.
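
A sketch of generating such a time-limited SAS with the Azure.Storage.Sas types (the account name, key, and the chosen permissions are placeholders; a narrower service SAS could be built the same way):

using System;
using Azure.Storage;
using Azure.Storage.Sas;

public static class DevSas
{
    public static string CreateTwoMonthSas(string accountName, string accountKey)
    {
        var sasBuilder = new AccountSasBuilder
        {
            Services = AccountSasServices.Blobs,
            ResourceTypes = AccountSasResourceTypes.Container | AccountSasResourceTypes.Object,
            // Access automatically stops working after two months
            ExpiresOn = DateTimeOffset.UtcNow.AddMonths(2)
        };
        sasBuilder.SetPermissions(AccountSasPermissions.Read | AccountSasPermissions.Write | AccountSasPermissions.List);

        var credential = new StorageSharedKeyCredential(accountName, accountKey);
        return sasBuilder.ToSasQueryParameters(credential).ToString();
    }
}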

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
84
Q

You have a new Azure subscription. You are developing an internal website for employees to
view sensitive data. The website uses Azure Active Directory (Azure AD) for authentication.
You need to implement multifactor authentication for the website.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Configure the website to use Azure AD B2C.
B. In Azure AD, create a new conditional access policy.
C. Upgrade to Azure AD Premium.
D. In Azure AD, enable application proxy.
E. In Azure AD conditional access, enable the baseline policy.

A

Answer: B, C
Explanation:
B: MFA Enabled by conditional access policy. It is the most flexible means to enable two-step
verification for your users. Enabling using conditional access policy only works for Azure MFA
in the cloud and is a premium feature of Azure AD.
C: Multi-Factor Authentication comes as part of the following offerings:
 Azure Active Directory Premium licenses - Full featured use of Azure Multi-Factor Authentication Service (Cloud) or Azure Multi-Factor Authentication Server (On-premises).
 Multi-Factor Authentication for Office 365
 Azure Active Directory Global Administrators

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
85
Q

You manage a data processing application that receives requests from an Azure Storage queue.
You need to manage access to the queue. You have the following requirements:
 Provide other applications access to the Azure queue.
 Ensure that you can revoke access to the queue without having to regenerate the storage
account keys.
 Specify access at the queue level and not at the storage account level.
Which type of shared access signature (SAS) should you use?
A. Service SAS with a stored access policy
B. Account SAS
C. User Delegation SAS
D. Service SAS with ad hoc SAS

A

Answer: A
Explanation:
A service SAS is secured with the storage account key. A service SAS delegates access to a
resource in only one of the Azure Storage services: Blob storage, Queue storage, Table storage,
or Azure Files.
Stored access policies give you the option to revoke permissions for a service SAS without
having to regenerate the storage account keys.
Incorrect Answers:
B: Account SAS is specified at the account level. It is secured with the storage account key.
C: A user delegation SAS applies to Blob storage only.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
86
Q

You are building a web application that uses the Microsoft identity platform for user
authentication.
You are implementing user identification for the web application.
You need to retrieve a claim to uniquely identify a user.
Which claim type should you use?
A. aud
B. nonce
C. oid
D. idp

A

Answer: C
Explanation:
oid -The object identifier for the user in Azure AD. This value is the immutable and non-reusable
identifier of the user. Use this value, not email, as a unique identifier for users; email addresses
can change. If you use the Azure AD Graph API in your app, object ID is that value used to
query profile information.
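
A small sketch of reading that identifier in ASP.NET Core (which claim type you see depends on whether the middleware's inbound claim mapping is enabled, so both forms are checked):

using System.Security.Claims;

public static class UserIdentity
{
    // The default inbound claim mapping exposes "oid" under this claim type
    private const string ObjectIdClaimType =
        "http://schemas.microsoft.com/identity/claims/objectidentifier";

    public static string GetUniqueUserId(ClaimsPrincipal user) =>
        user.FindFirst(ObjectIdClaimType)?.Value
        ?? user.FindFirst("oid")?.Value;   // raw claim name when mapping is disabled
}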

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
87
Q

You are developing an Azure Function that calls external APIs by providing an access token for
the API. The access token is stored in a secret named token in an Azure Key Vault named
mykeyvault.
You need to ensure the Azure Function can access to the token. Which value should you store in
the Azure Function App configuration?
A. KeyVault:mykeyvault;Secret:token
B. App:Settings:Secret:mykeyvault:token
C. AZUREKVCONNSTR_
https://mykeyveult.vault.ezure.net/secrets/token/
D.
@Microsoft.KeyVault(SecretUri=https://mykeyvault.vault.azure.net
/secrets/token/)

A

Answer: D
Explanation:
Add Key Vault secrets reference in the Function App configuration.
Syntax: @Microsoft.KeyVault(SecretUri={copied identifier for the username secret})

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
88
Q

A company maintains multiple web and mobile applications. Each application uses custom inhouse identity providers as well as social identity providers.
You need to implement single sign-on (SSO) for all the applications.
What should you do?
A. Use Azure Active Directory B2C (Azure AD B2C) with custom policies.
B. Use Azure Active Directory B2B (Azure AD B2B) and enable external collaboration.
C. Use Azure Active Directory B2C (Azure AD B2C) with user flows.
D. Use Azure Active Directory B2B (Azure AD B2B).

A

Answer: A
Azure AD B2B external collaboration settings are about inviting guest users into your own Azure AD tenant, which is not what customer-facing web and mobile applications need.

Azure AD B2C is the right service for SSO across consumer applications. Both custom policies and user flows support external identity providers, but because custom in-house identity providers must also be supported, custom policies (option A) are required rather than the built-in user flows.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
89
Q

You develop a Python application for image rendering that uses GPU resources to optimize
rendering processes. You deploy the application to an Azure Container Instances (ACI) Linux
container.
The application requires a secret value to be passed when the container is started. The value must
only be accessed from within the container.
You need to pass the secret value.
What are two possible ways to achieve this goal? Each correct answer presents a complete
solution.
NOTE: Each correct selection is worth one point.
A. Create an environment variable Set the secureValue property to the secret value.
B. Add the secret value to the container image. Use a managed identity.
C. Add the secret value to the application code Set the container startup command.
D. Add the secret value to an Azure Blob storage account. Generate a SAS token.
E. Mount a secret volume containing the secret value in a secrets file.

A

Answer: AE
Explanation:
A: Secure environment variables
Another method (other than a secret volume) for providing sensitive information to containers
(including Windows containers) is through the use of secure environment variables.
E: Use a secret volume to supply sensitive information to the containers in a container group.
The secret volume stores your secrets in files within the volume, accessible by the containers in
the container group. By storing secrets in a secret volume, you can avoid adding sensitive data
like SSH keys or database credentials to your application code.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
90
Q

You are developing a user portal for a company.
You need to create a report for the portal that lists information about employees who are subject
matter experts for a specific topic. You must ensure that administrators have full control and
consent over the data.
Which technology should you use?
A. Microsoft Graph data connect
B. Microsoft Graph API
C. Microsoft Graph connectors

A

Answer: A
Explanation:
Data Connect grants a more granular control and consent model: you can manage data, see who
is accessing it, and request specific properties of an entity. This enhances the Microsoft Graph
model, which grants or denies applications access to entire entities.
Microsoft Graph Data Connect augments Microsoft Graph’s transactional model with an
intelligent way to access rich data at scale. The data covers how workers communicate,
collaborate, and manage their time across all the applications and services in Microsoft 365.
Incorrect:
Not B: The Microsoft Graph API is a RESTful web API that enables you to access Microsoft
Cloud service resources. After you register your app and get authentication tokens for a user or
service, you can make requests to the Microsoft Graph API.
A simplistic definition of a Graph API is an API that models the data in terms of nodes and
edges (objects and relationships) and allows the client to interact with multiple nodes in a single
request.
Not C: With Microsoft Graph connectors, your organization can index third-party data so that it appears in Microsoft Search results.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
91
Q

You are developing a web application that uses the Microsoft identity platform for user and
resource authentication. The web application calls several REST APIs.
A REST API call must read the user’s calendar. The web application requires permission to send
an email as the user.
You need to authorize the web application and the API.
Which parameter should you use?
A. tenant
B. code_challenge
C. state
D. client_id
E. scope

A

Answer: E
Explanation:
Microsoft identity platform and OAuth 2.0 authorization code flow, Request an authorization
code
https://login.microsoftonline.com/{tenant}/oauth2/v2.0/authorize?
The authorization code flow begins with the client directing the user to the /authorize endpoint.
In this request, the client requests the openid, offline_access, and
https://graph.microsoft.com/mail.read permissions from the user.
Parameters include:
* scope required
A space-separated list of scopes that you want the user to consent to. For the /authorize leg of the
request, this parameter can cover multiple resources. This value allows your app to get consent
for multiple web APIs you want to call.
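
For example, with MSAL.NET in the web app, the scope parameter carries both Graph permissions (the client ID, secret, tenant, redirect URI, and authorization code below are placeholders):

using Microsoft.Identity.Client;

var app = ConfidentialClientApplicationBuilder.Create("<client-id>")
    .WithClientSecret("<client-secret>")
    .WithAuthority(AzureCloudInstance.AzurePublic, "<tenant-id>")
    .WithRedirectUri("https://localhost/signin-oidc")
    .Build();

// scope: read the user's calendar and send mail as the user
string[] scopes =
{
    "https://graph.microsoft.com/Calendars.Read",
    "https://graph.microsoft.com/Mail.Send"
};

AuthenticationResult result = await app
    .AcquireTokenByAuthorizationCode(scopes, "<authorization-code>")
    .ExecuteAsync();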

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
92
Q

You develop and deploy an Azure App Service web app named App1. You create a new Azure
Key Vault named Vault1. You import several API keys, passwords, certificates, and
cryptographic keys into Vault1.
You need to grant App1 access to Vault1 and automatically rotate credentials. Credentials must
not be stored in code.
What should you do?
A. Enable App Service authentication for Appl. Assign a custom RBAC role to Vault1.
B. Add a TLS/SSL binding to App1.
C. Upload a self-signed client certificate to Vault1. Update App1 to use the client certificate.
D. Assign a managed identity to App1.

A

Answer: D
Explanation:
Assigning a managed identity to App1 lets the app authenticate to Key Vault without any credentials stored in code. An Azure Function with a managed identity can also rotate service principal keys; as long as the app checks Key Vault for the new secret version before the old one expires, it can refresh its cache and transition smoothly to the new version.
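
A minimal sketch of App1 reading a secret through its managed identity with the Azure SDK (the vault URI and secret name are placeholders):

using System;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

// DefaultAzureCredential resolves to the managed identity assigned to App1 in App Service
var client = new SecretClient(new Uri("https://vault1.vault.azure.net/"), new DefaultAzureCredential());

// No credentials in code: the current (possibly rotated) secret version is fetched at runtime
KeyVaultSecret secret = await client.GetSecretAsync("StorageConnectionString");
Console.WriteLine(secret.Name);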

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
93
Q

You are developing a Java application to be deployed in Azure. The application stores sensitive
data in Azure Cosmos DB.
You need to configure Always Encrypted to encrypt the sensitive data inside the application.
What should you do first?
A. Create a new container to include an encryption policy with the JSON properties to be
encrypted.
B. Create a customer-managed key (CMK) and store the key in a new Azure Key Vault instance.
C. Create a data encryption key (DEK) by using the Azure Cosmos DB SDK and store the key in
Azure Cosmos DB.
D. Create an Azure AD managed identity and assign the identity to a new Azure Key Vault
instance.

A

Answer: B
Explanation:
Encryption keys
Customer-managed keys
Before DEKs get stored in Azure Cosmos DB, they are wrapped by a customer-managed key
(CMK). By controlling the wrapping and unwrapping of DEKs, CMKs effectively control the
access to the data that's encrypted with their corresponding DEKs. CMK storage is designed to be extensible, with a default implementation that expects them to be stored in Azure Key Vault.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
94
Q

You are developing several microservices to deploy to a new Azure Kubernetes Service cluster.
The microservices manage data stored in Azure Cosmos DB and Azure Blob storage. The data is
secured by using customer-managed keys stored in Azure Key Vault.
You must automate key rotation for all Azure Key Vault keys and allow for manual key rotation.
Keys must rotate every three months. Notifications of expiring keys must be sent before key
expiry.
You need to configure key rotation and enable key expiry notifications.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Create and configure a new Azure Event Grid instance.
B. Configure Azure Key Vault alerts.
C. Create and assign an Azure Key Vault access policy.
D. Create and configure a key rotation policy during key creation

A

Answer: A,D
You can use a key rotation policy in Azure Key Vault combined with Event Grid to send a notification when a key in the vault is about to expire.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
95
Q

You are developing and deploying several ASP.NET web applications to Azure App Service.
You plan to save session state information and HTML output.
You must use a storage mechanism with the following requirements:
 Share session state across all ASP.NET web applications.
 Support controlled, concurrent access to the same session state data for multiple readers
and a single writer.
 Save full HTTP responses for concurrent requests.
You need to store the information.
Proposed Solution: Enable Application Request Routing (ARR).
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Explanation:
Instead deploy and configure Azure Cache for Redis. Update the web applications.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
96
Q

You are developing and deploying several ASP.NET web applications to Azure App Service.
You plan to save session state information and HTML output.
You must use a storage mechanism with the following requirements:
 Share session state across all ASP.NET web applications.
 Support controlled, concurrent access to the same session state data for multiple readers
and a single writer.
 Save full HTTP responses for concurrent requests.
You need to store the information.
Proposed Solution: Deploy and configure an Azure Database for PostgreSQL. Update the web
applications.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Share Session State Across Applications: While PostgreSQL can store session state, sharing session state across multiple ASP.NET applications is not its primary use case. It requires additional configuration and programming to handle session state management.
Controlled, Concurrent Access for Multiple Readers and a Single Writer: PostgreSQL supports concurrent access, but managing this for session state data would require additional programming effort.
Save Full HTTP Responses for Concurrent Requests: PostgreSQL is not designed for caching full HTTP responses. It is primarily a relational database for structured data, not a cache for HTTP responses.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
97
Q

You develop a gateway solution for a public facing news API. The news API back end is
implemented as a RESTful service and uses an OpenAPI specification.
You need to ensure that you can access the news API by using an Azure API Management
service instance.
Which Azure PowerShell command should you run?
A. Import-AzureRmApiManagementApi -Context $ApiMgmtContext
-SpecificationFormat “Swagger” -SpecificationPath $SwaggerPath
-Path $Path
B. New-AzureRmApiManagementBackend -Context $ApiMgmtContext -Url
$Url -Protocol http
C. New-AzureRmApiManagement -ResourceGroupName $ResourceGroup
-Name $Name –Location $Location -Organization $Org
-AdminEmail $AdminEmail
D. New-AzureRmApiManagementBackendProxy -Url $ApiUrl

A

Answer: A
A is correct because Import-AzureRmApiManagementApi imports the back end's OpenAPI (Swagger) specification into the API Management instance; the other cmdlets create the service or configure back ends without importing the API definition.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
98
Q

You are creating a hazard notification system that has a single signaling server which triggers
audio and visual alarms to start and stop.
You implement Azure Service Bus to publish alarms. Each alarm controller uses Azure Service
Bus to receive alarm signals as part of a transaction. Alarm events must be recorded for audit
purposes. Each transaction record must include information about the alarm type that was
activated.
You need to implement a reply trail auditing solution.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Assign the value of the hazard message SessionID property to the ReplyToSessionId
property.
B. Assign the value of the hazard message MessageId property to the DeliveryCount property.
C. Assign the value of the hazard message SessionID property to the SequenceNumber property.
D. Assign the value of the hazard message MessageId property to the CorrelationId property.
E. Assign the value of the hazard message SequenceNumber property to the DeliveryCount
property.
F. Assign the value of the hazard message MessageId property to the SequenceNumber property.

A

Answer: A, D
Explanation:
D: CorrelationId: Enables an application to specify a context for the message for the purposes of
correlation; for example, reflecting the MessageId of a message that is being replied to.
A: ReplyToSessionId: This value augments the ReplyTo information and specifies which
SessionId should be set for the reply when sent to the reply entity.
Incorrect Answers:
B, E: DeliveryCount
Number of deliveries that have been attempted for this message. The count is incremented when
a message lock expires, or the message is explicitly abandoned by the receiver. This property is
read-only.
C, E: SequenceNumber
The sequence number is a unique 64-bit integer assigned to a message as it is accepted and
stored by the broker and functions as its true identifier. For partitioned entities, the topmost 16
bits reflect the partition identifier. Sequence numbers monotonically increase and are gapless.
They roll over to 0 when the 48-64 bit range is exhausted. This property is read-only.
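
A sketch of the two assignments with the Azure.Messaging.ServiceBus types (the method names and message bodies here are illustrative, not from the question):

using System;
using Azure.Messaging.ServiceBus;

public static class AlarmAudit
{
    // Signaling server side: stamp the alarm so replies return on the same session
    public static ServiceBusMessage BuildAlarm(BinaryData alarmBody, string sessionId) =>
        new ServiceBusMessage(alarmBody)
        {
            SessionId = sessionId,
            ReplyToSessionId = sessionId            // A: hazard SessionId copied to ReplyToSessionId
        };

    // Alarm controller side: record which alarm this audit/reply message refers to
    public static ServiceBusMessage BuildAuditReply(ServiceBusReceivedMessage alarm, BinaryData auditBody) =>
        new ServiceBusMessage(auditBody)
        {
            CorrelationId = alarm.MessageId,        // D: hazard MessageId copied to CorrelationId
            SessionId = alarm.ReplyToSessionId
        };
}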

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
99
Q

You are developing an Azure function that connects to an Azure SQL Database instance. The
function is triggered by an Azure Storage queue.
You receive reports of numerous System.InvalidOperationExceptions with the following
message:
“Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool.
This may have occurred because all pooled connections were in use and max pool size was
reached.”
You need to prevent the exception.
What should you do?
A. In the host.json file, decrease the value of the batchSize option
B. Convert the trigger to Azure Event Hub
C. Convert the Azure Function to the Premium plan
D. In the function.json file, change the value of the type option to queueScaling

A

Answer: A
The error message shows that the SQL connection pool is being exhausted because too many queue messages are processed in parallel. Decreasing the batchSize value (under the queues settings in host.json) lowers how many messages each instance processes concurrently, which prevents the pool from running out of connections.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
100
Q

You are developing and deploying several ASP.NET web applications to Azure App Service.
You plan to save session state information and HTML output.
You must use a storage mechanism with the following requirements:
 Share session state across all ASP.NET web applications.
 Support controlled, concurrent access to the same session state data for multiple readers
and a single writer.
 Save full HTTP responses for concurrent requests.
You need to store the information.
Proposed Solution: Deploy and configure Azure Cache for Redis. Update the web applications.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: A
Explanation:
The session state provider for Azure Cache for Redis enables you to share session information
between different instances of an ASP.NET web application.
The same connection can be used by multiple concurrent threads.
Redis supports both read and write operations.
The output cache provider for Azure Cache for Redis enables you to save the HTTP responses
generated by an ASP.NET web application.
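
For ASP.NET Core apps the wiring looks roughly like this (a sketch; the connection string is a placeholder, and classic ASP.NET apps would instead use the RedisSessionStateProvider and Redis output cache provider packages described above):

using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// Back the distributed cache (and therefore session state) with Azure Cache for Redis
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "contoso.redis.cache.windows.net:6380,password=<key>,ssl=True,abortConnect=False";
    options.InstanceName = "SharedSession";
});
builder.Services.AddSession();

var app = builder.Build();
app.UseSession();   // session data is now shared by every web app pointing at the same cache
app.Run();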

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
101
Q

You develop and deploy an ASP.NET web app to Azure App Service. You use Application
Insights telemetry to monitor the app.
You must test the app to ensure that the app is available and responsive from various points
around the world and at regular intervals. If the app is not responding, you must send an alert to
support staff.
You need to configure a test for the web app.
Which two test types can you use? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. integration
B. multi-step web
C. URL ping
D. unit
E. load

A

Answer: B, C
Explanation:
There are three types of availability tests:
 URL ping test: a simple test that you can create in the Azure portal.
 Multi-step web test: A recording of a sequence of web requests, which can be played
back to test more complex scenarios. Multi-step web tests are created in Visual Studio
Enterprise and uploaded to the portal for execution.
 Custom Track Availability Tests: If you decide to create a custom application to run
availability tests, the TrackAvailability() method can be used to send the results to
Application Insights.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
102
Q

You develop and add several functions to an Azure Function app that uses the latest runtime
host. The functions contain several REST API endpoints secured by using SSL. The Azure
Function app runs in a Consumption plan.
You must send an alert when any of the function endpoints are unavailable or responding too
slowly.
You need to monitor the availability and responsiveness of the functions.
What should you do?
A. Create a URL ping test.
B. Create a timer triggered function that calls TrackAvailability() and send the results to
Application Insights.
C. Create a timer triggered function that calls GetMetric(“Request Size”) and send the
results to Application Insights.
D. Add a new diagnostic setting to the Azure Function app. Enable the FunctionAppLogs and
Send to Log Analytics options.

A

Answer: B
Explanation:
You can create an Azure Function with TrackAvailability() that will run periodically according
to the configuration given in TimerTrigger function with your own business logic. The results of
this test will be sent to your Application Insights resource, where you will be able to query for
and alert on the availability results data. This allows you to create customized tests similar to
what you can do via Availability Monitoring in the portal. Customized tests will allow you to
write more complex availability tests than is possible using the portal UI, monitor an app inside
of your Azure VNET, change the endpoint address, or create an availability test even if this
feature is not available in your region.
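
A rough sketch of such a timer-triggered function (in-process model; the test name, endpoint URL, and the direct construction of the TelemetryClient are simplifications, since a real Function app would normally obtain the client through dependency injection):

using System;
using System.Diagnostics;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.Azure.WebJobs;

public static class EndpointAvailabilityTest
{
    private static readonly HttpClient Http = new HttpClient();
    private static readonly TelemetryClient Telemetry =
        new TelemetryClient(TelemetryConfiguration.CreateDefault());

    [FunctionName("EndpointAvailabilityTest")]
    public static async Task Run([TimerTrigger("0 */5 * * * *")] TimerInfo timer)
    {
        var result = new AvailabilityTelemetry
        {
            Name = "orders-endpoint",
            RunLocation = "timer-function",
            Timestamp = DateTimeOffset.UtcNow
        };

        var stopwatch = Stopwatch.StartNew();
        try
        {
            var response = await Http.GetAsync("https://myfunctionapp.azurewebsites.net/api/orders");
            result.Success = response.IsSuccessStatusCode;
        }
        catch (Exception ex)
        {
            result.Success = false;
            result.Message = ex.Message;
        }
        finally
        {
            result.Duration = stopwatch.Elapsed;
            // Send the result to Application Insights, where an alert can fire on failures
            Telemetry.TrackAvailability(result);
        }
    }
}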

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
103
Q

You develop and deploy an Azure App Service web app. The app is deployed to multiple regions
and uses Azure Traffic Manager. Application Insights is enabled for the app.
You need to analyze app uptime for each month.
Which two solutions will achieve the goal? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. Azure Monitor logs
B. Application Insights alerts
C. Azure Monitor metrics
D. Application Insights web tests

A

Answer: A,C
Metrics will give you the uptime figures, and logs will let you query the causes of any downtime over the month.
Wrong B: alerts notify on conditions; they are not a tool for analyzing monthly uptime.
Wrong D: web tests generate availability data rather than analyze it.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
104
Q

You are developing an ASP.NET Core Web API web service. The web service uses Azure
Application Insights for all telemetry and dependency tracking. The web service reads and writes
data to a database other than Microsoft SQL Server.
You need to ensure that dependency tracking works for calls to the third-party database.
Which two dependency telemetry properties should you use? Each correct answer presents part
of the solution.
NOTE: Each correct selection is worth one point.
A. Telemetry.Context.Cloud.RoleInstance
B. Telemetry.Id
C. Telemetry.Name
D. Telemetry.Context.Operation.Id
E. Telemetry.Context.Session.Id

A

Answer: BD
Explanation:
Example:
public async Task Enqueue(string payload)
{
    // StartOperation is a helper method that initializes the telemetry item
    // and allows correlation of this operation with its parent and children.
    var operation = telemetryClient.StartOperation<DependencyTelemetry>("enqueue " + queueName);

    operation.Telemetry.Type = "Azure Service Bus";
    operation.Telemetry.Data = "Enqueue " + queueName;

    var message = new BrokeredMessage(payload);
    // Service Bus queue allows the property bag to pass along with the message.
    // We will use them to pass our correlation identifiers (and other context)
    // to the consumer.
    message.Properties.Add("ParentId", operation.Telemetry.Id);
    message.Properties.Add("RootId", operation.Telemetry.Context.Operation.Id);

    // ... send the message and complete the operation ...
}

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
105
Q

You are developing a web application that uses Azure Cache for Redis. You anticipate that the
cache will frequently fill and that you will need to evict keys.
You must configure Azure Cache for Redis based on the following predicted usage pattern: A
small subset of elements will be accessed much more often than the rest.
You need to configure the Azure Cache for Redis to optimize performance for the predicted
usage pattern.
Which two eviction policies will achieve the goal?
NOTE: Each correct selection is worth one point.
A. noeviction
B. allkeys-lru
C. volatile-lru
D. allkeys-random
E. volatile-ttl
F. volatile-random

A

Answer: B, C
Explanation:
B: The allkeys-lru policy evict keys by trying to remove the less recently used (LRU) keys first,
in order to make space for the new data added. Use the allkeys-lru policy when you expect a
power-law distribution in the popularity of your requests, that is, you expect that a subset of
elements will be accessed far more often than the rest.
C: volatile-lru: evict keys by trying to remove the less recently used (LRU) keys first, but only
among keys that have an expire set, in order to make space for the new data added.
Note: The allkeys-lru policy is more memory efficient since there is no need to set an expire for
the key to be evicted under memory pressure.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
106
Q

An organization hosts web apps in Azure. The organization uses Azure Monitor.
You discover that configuration changes were made to some of the web apps.
You need to identify the configuration changes.
Which Azure Monitor log should you review?
A. AppServiceAppLogs
B. AppServiceEnvironmentPlatformlogs
C. AppServiceConsoleLogs
D. AppServiceAuditLogs

A

Answer: B
Explanation:
The log type AppServiceEnvironmentPlatformLogs handles the App Service Environment:
scaling, configuration changes, and status logs.
Incorrect:
AppServiceAppLogs contains logs generated through your application.
AppServiceAuditLogs contains logs generated when publishing users successfully log on via one
of the App Service publishing protocols.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
107
Q

You develop and deploy an Azure App Service web app to a production environment. You
enable the Always On setting and the Application Insights site extensions.
You deploy a code update and receive multiple failed requests and exceptions in the web app.
You need to validate the performance and failure counts of the web app in near real time.
Which Application Insights tool should you use?
A. Profiler
B. Smart Detection
C. Live Metrics Stream
D. Application Map
E. Snapshot Debugger

A

Answer: C
Explanation:
Live Metrics Stream
Deploying the latest build can be an anxious experience. If there are any problems, you want to
know about them right away, so that you can back out if necessary. Live Metrics Stream gives
you key metrics with a latency of about one second.
With Live Metrics Stream, you can:
* Validate a fix while it’s released, by watching performance and failure counts.
* Etc.
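
For context, a minimal ASP.NET Core sketch of turning this on with the Application Insights SDK, assuming the connection string is already present in configuration; the Live Metrics module ships enabled by default with the SDK:

using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// Registers Application Insights telemetry; the Live Metrics (QuickPulse) module
// is included by default, so the portal's Live Metrics Stream starts receiving data.
builder.Services.AddApplicationInsightsTelemetry();

var app = builder.Build();
app.Run();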

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
108
Q

You are building a web application that performs image analysis on user photos and returns
metadata containing objects identified. The image analysis is very costly in terms of time and
compute resources. You are planning to use Azure Redis Cache so duplicate uploads do not need
to be reprocessed.
In case of an Azure data center outage, metadata loss must be kept to a minimum.
You need to configure the Azure Redis cache instance.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Configure Azure Redis with AOF persistence.
B. Configure Azure Redis with RDB persistence.
C. Configure second storage account for persistence.
D. Set backup frequency to the minimum value.

A

Answer: A, C
The key here is “In case of an Azure data center outage, metadata loss must be kept to a minimum.”

So the correct answer is AC.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
109
Q

You are developing an Azure-based web application. The application goes offline periodically to
perform offline data processing. While the application is offline, numerous Azure Monitor alerts
fire which result in the on-call developer being paged.
The application must always log when the application is offline for any reason.
You need to ensure that the on-call developer is not paged during offline processing.
What should you do?
A. Add Azure Monitor alert processing rules to suppress notifications.
B. Disable Azure Monitor Service Health Alerts during offline processing.
C. Create an Azure Monitor Metric Alert.
D. Build an Azure Monitor action group that suppresses the alerts.

A

Answer: A
A. Add Azure Monitor alert processing rules to suppress notifications: Correct. This allows suppression of notifications during offline processing.
B. Disable Azure Monitor Service Health Alerts during offline processing: Incorrect. This would stop all alerts, not just the ones related to offline processing.
C. Create an Azure Monitor Metric Alert: Incorrect. This would still trigger alerts during offline processing.
D. Build an Azure Monitor action group that suppresses the alerts: Incorrect. This requires additional configuration and may not specifically target the offline processing alerts.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
110
Q

You are developing an online game that includes a feature that allows players to interact with
other players on the same team within a certain distance. The calculation to determine the
players in range occurs when players move and are cached in an Azure Cache for Redis instance.
The system should prioritize players based on how recently they have moved and should not
prioritize players who have logged out of the game.
You need to select an eviction policy.
Which eviction policy should you use?
A. allkeys-lru
B. volatile-lru
C. allkeys-lfu
D. volatile-ttl

A

Answer: B
There must be a way to tell Redis that logged-off players should not be prioritized.
Sample: player A moves and then automatically logs off. With allkeys-lru we can't distinguish this case. With volatile-lru we can tell Redis which keys are good candidates for removal by using different TTL values.

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
111
Q

You develop an Azure App Service web app and deploy to a production environment. You
enable Application Insights for the web app.
The web app is throwing multiple exceptions in the environment.
You need to examine the state of the source code and variables when the exceptions are thrown.
Which Application Insights feature should you configure?
A. Smart detection
B. Profiler
C. Snapshot Debugger
D. Standard test

A

Answer: C
Explanation:
Exceptions in web applications can be reported with Application Insights. You can correlate
failed requests with exceptions and other events on both the client and server so that you can
quickly diagnose the causes.
When an exception occurs, you can automatically collect a debug snapshot from your live web
application. The debug snapshot shows the state of source code and variables at the moment the
exception was thrown. The Snapshot Debugger in Azure Application Insights:
Monitors system-generated logs from your web app.
Collects snapshots on your top-throwing exceptions.
Provides information you need to diagnose issues in production.
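
A minimal sketch of wiring up the Snapshot Debugger in an ASP.NET Core app, assuming the Microsoft.ApplicationInsights.SnapshotCollector NuGet package is referenced:

using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// Application Insights must be enabled first; snapshots attach to its exception telemetry.
builder.Services.AddApplicationInsightsTelemetry();

// Registers the Snapshot Collector so debug snapshots are captured
// for the top-throwing exceptions in production.
builder.Services.AddSnapshotCollector();

var app = builder.Build();
app.Run();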

How well did you know this?
1
Not at all
2
3
4
5
Perfectly
112
Q

You develop an ASP.NET Core app that uses Azure App Configuration. You also create an App
Configuration containing 100 settings.
The app must meet the following requirements:
• Ensure the consistency of all configuration data when changes to individual settings
occur.
• Handle configuration data changes dynamically without causing the application to restart.
• Reduce the overall number of requests made to App Configuration APIs.
You must implement dynamic configuration updates in the app.
What are two ways to achieve this goal? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Create and register a sentinel key in the App Configuration store. Set the refreshAll parameter
of the Register method to true.
B. Increase the App Configuration cache expiration from the default value.
C. Decrease the App Configuration cache expiration from the default value.
D. Create and configure Azure Key Vault. Implement the Azure Key Vault configuration
provider.
E. Register all keys in the App Configuration store. Set the refreshAll parameter of the Register
method to false.
F. Create and implement environment variables for each App Configuration store setting

A

Answer: AB
Explanation:
The App Configuration .NET provider library supports updating configuration on demand
without causing an application to restart.
A: Request-driven configuration refresh
The configuration refresh is triggered by the incoming requests to your web app. No refresh will
occur if your app is idle. When your app is active, the App Configuration middleware monitors
the sentinel key, or any other keys you registered for refreshing in the ConfigureRefresh call.
The middleware is triggered upon every incoming request to your app. However, the middleware
will only send requests to check the value in App Configuration when the cache expiration time
you set has passed.
B, not C: The SetCacheExpiration method specifies the minimum time that must elapse before a
new request is made to App Configuration to check for any configuration changes. Default
expiration time is 30 seconds. Adjust to a higher value if you need to reduce the number of
requests made to your App Configuration store.
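
A minimal sketch of A and B with the Microsoft.Extensions.Configuration.AzureAppConfiguration provider; the sentinel key name and connection string setting are assumptions:

using System;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.Configuration;

var builder = WebApplication.CreateBuilder(args);

builder.Configuration.AddAzureAppConfiguration(options =>
{
    options.Connect(builder.Configuration["ConnectionStrings:AppConfig"])
           .ConfigureRefresh(refresh =>
           {
               // A: watch a single sentinel key; when it changes, refresh every setting
               // so the configuration stays consistent.
               refresh.Register("Sentinel", refreshAll: true)
                      // B: raise the cache expiration above the 30-second default
                      // to reduce the number of requests to App Configuration.
                      .SetCacheExpiration(TimeSpan.FromMinutes(5));
           });
});

var app = builder.Build();
app.Run();

In a web app you would also register the App Configuration middleware (services.AddAzureAppConfiguration() and app.UseAzureAppConfiguration()) so that incoming requests trigger the refresh check described above.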

113
Q

You develop and deploy a web app to Azure App Service. The Azure App Service uses a Basic
plan in a single region.
Users report that the web app is responding slowly. You must capture the complete call stack to
help identify performance issues in the code. Call stack data must be correlated across app
instances. You must minimize cost and impact to users on the web app.
You need to capture the telemetry.
Which three actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Restart all apps in the App Service plan.
B. Enable Application Insights site extensions.
C. Upgrade the Azure App Service plan to Premium.
D. Enable Profiler.
E. Enable the Always On setting for the app service.
F. Enable Snapshot debugger.
G. Enable remote debugging.

A

Answer: B, D, E

You have to enable the Application Insights site extension.
Always On is required so the app stays loaded and Profiler can collect traces.
Profiler captures the call stacks needed to find performance issues.
You can eliminate the Premium upgrade because the requirements say to minimize cost; Snapshot Debugger is for debugging exceptions, not performance issues; and restarting the apps impacts users, which the requirements also say to minimize.

114
Q

You are building an application to track cell towers that are available to phones in near real time.
A phone will send information to the application by using the Azure Web PubSub service. The
data will be processed by using an Azure Functions app. Traffic will be transmitted by using a
content delivery network (CDN).
The Azure function must be protected against misconfigured or unauthorized invocations.
You need to ensure that the CDN allows for the Azure function protection.
Which HTTP header should be on the allowed list?
A. Authorization
B. WebHook-Request-Callback
C. Resource
D. WebHook-Request-Origin

A

Answer: A
A. Authorization - Correct The Authorization header is used to authenticate the client/user in the application by including authorization credentials. It’s crucial for protecting the Azure function against misconfigured invocations.

B. WebHook-Request-Callback - Incorrect The WebHook-Request-Callback is not a standard HTTP header and it’s not typically used for authorization or protection against misconfigured invocations.

C. Resource - Incorrect The Resource is not a standard HTTP header and it’s not typically used for authorization or protection against misconfigured invocations.

D. WebHook-Request-Origin - Incorrect The WebHook-Request-Origin is not a standard HTTP header and it’s not typically used for authorization or protection against misconfigured invocations.

115
Q

You are developing an Azure solution to collect point-of-sale (POS) device data from
2,000 stores located throughout the world. A single device can produce 2 megabytes (MB) of
data every 24 hours. Each store location has one to five devices that send data.
You must store the device data in Azure Blob storage. Device data must be correlated based on a
device identifier. Additional stores are expected to open in the future.
You need to implement a solution to receive the device data.
Solution: Provision an Azure Service Bus. Configure a topic to receive the device data by using a
correlation filter.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B

Explanation:
- Topics allow multiple subscribers, and here we need to process each event once.
- A correlation filter is defined on subscriptions, not topics.
- Even assuming there is a typo in the question and the correlation filter is defined at the subscription level, it is still not a valid solution, because new stores can open in the future with many new device identifiers that you can't know in advance. Besides that, filters make no sense in this scenario at all; you just need to save the data in the storage account and partition it by device identifier.

116
Q

You are developing an Azure solution to collect point-of-sale (POS) device data from
2,000 stores located throughout the world. A single device can produce 2 megabytes (MB) of
data every 24 hours. Each store location has one to five devices that send data.
You must store the device data in Azure Blob storage. Device data must be correlated based on a
device identifier. Additional stores are expected to open in the future.
You need to implement a solution to receive the device data.
Solution: Provision an Azure Event Grid. Configure event filtering to evaluate the device
identifier.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Sensors do not send events; they send messages containing specific data that has been gathered. That automatically makes the solution incorrect, because you need a Service Bus to collect them. Event Grid and Event Hubs won't do the job here.

117
Q

You are developing an Azure Service application that processes queue data when it receives a
message from a mobile application. Messages may not be sent to the service consistently.
You have the following requirements:
• Queue size must not grow larger than 80 gigabytes (GB).
• Use first-in-first-out (FIFO) ordering of messages.
• Minimize Azure costs.
You need to implement the messaging solution.
Solution: Use the .Net API to add a message to an Azure Storage Queue from the mobile
application. Create an Azure Function App that uses an Azure Storage Queue trigger.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Azure Storage queues don't guarantee FIFO ordering!

118
Q

You are developing an Azure messaging solution.
You need to ensure that the solution meets the following requirements:
• Provide transactional support.
• Provide duplicate detection.
• Store the messages for an unlimited period of time.
Which two technologies will meet the requirements? Each correct answer presents a complete
solution.
NOTE: Each correct selection is worth one point.
A. Azure Service Bus Topic
B. Azure Service Bus Queue
C. Azure Storage Queue
D. Azure Event Hub

A

Answer: A, B
Queue Storage does not provide transactional support, and Event Hubs can't be configured to store events for an unlimited period of time.

119
Q

You are developing a solution that will use Azure messaging services.
You need to ensure that the solution uses a publish-subscribe model and eliminates the need for
constant polling.
What are two possible ways to achieve the goal? Each correct answer presents a complete
solution.
NOTE: Each correct selection is worth one point.
A. Service Bus
B. Event Hub
C. Event Grid
D. Queue

A

Answer: AC
Explanation:
It is strongly recommended to use available messaging products and services that support a
publish-subscribe model, rather than building your own. In Azure, consider using Service Bus or
Event Grid. Other technologies that can be used for pub/sub messaging include Redis,
RabbitMQ, and Apache Kafka

120
Q

A company is implementing a publish-subscribe (Pub/Sub) messaging component by using
Azure Service Bus. You are developing the first subscription application.
In the Azure portal you see that messages are being sent to the subscription for each topic. You
create and initialize a subscription client object by supplying the correct details, but the
subscription application is still not consuming the messages.
You need to ensure that the subscription client processes all messages.
Which code segment should you use?
A. await subscriptionClient.AddRuleAsync(
new RuleDescription(RuleDescription.DefaultRuleName,
new TrueFilter()));
B. subscriptionClient = new
SubscriptionClient(ServiceBusConnectionString, TopicName,
SubscriptionName);
C. await subscriptionClient.CloseAsync();
D. subscriptionClient.RegisterMessageHandler(
ProcessMessagesAsync, messageHandlerOptions);

A

Answer: D
Explanation:
Using the subscription client, call RegisterMessageHandler, which is used to receive messages continuously
from the entity. It registers a message handler and begins a new thread to receive messages. This
handler is awaited every time a new message is received by the receiver.
subscriptionClient.RegisterMessageHandler(ReceiveMessagesAsync, messageHandlerOptions);
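
A fuller sketch with the Microsoft.Azure.ServiceBus SDK that these code options appear to come from; the connection string, topic, and subscription names are hypothetical:

using System;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;

class SubscriptionProcessor
{
    static readonly ISubscriptionClient subscriptionClient =
        new SubscriptionClient("<connection-string>", "orders-topic", "billing-subscription");

    static void Main()
    {
        // The options carry the exception handler and control concurrency and auto-completion.
        var messageHandlerOptions = new MessageHandlerOptions(ExceptionReceivedHandler)
        {
            MaxConcurrentCalls = 1,
            AutoComplete = false
        };

        // Registers the pump that continuously receives messages from the subscription.
        subscriptionClient.RegisterMessageHandler(ProcessMessagesAsync, messageHandlerOptions);

        Console.ReadKey();
    }

    static async Task ProcessMessagesAsync(Message message, CancellationToken token)
    {
        Console.WriteLine($"Received: {Encoding.UTF8.GetString(message.Body)}");
        await subscriptionClient.CompleteAsync(message.SystemProperties.LockToken);
    }

    static Task ExceptionReceivedHandler(ExceptionReceivedEventArgs args)
    {
        Console.WriteLine($"Message handler error: {args.Exception.Message}");
        return Task.CompletedTask;
    }
}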

121
Q

You are developing an Azure Service application that processes queue data when it receives a
message from a mobile application. Messages may not be sent to the service consistently.
You have the following requirements:
• Queue size must not grow larger than 80 gigabytes (GB).
• Use first-in-first-out (FIFO) ordering of messages.
• Minimize Azure costs.
You need to implement the messaging solution.
Solution: Use the .Net API to add a message to an Azure Storage Queue from the mobile
application. Create an Azure VM that is triggered from Azure Storage Queue events.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Explanation:
Don't use a VM; instead, create an Azure Function App that uses an Azure Service Bus Queue
trigger.

122
Q

You are developing an Azure Service application that processes queue data when it receives a
message from a mobile application. Messages may not be sent to the service consistently.
You have the following requirements:
 Queue size must not grow larger than 80 gigabytes (GB).
 Use first-in-first-out (FIFO) ordering of messages.
 Minimize Azure costs.
You need to implement the messaging solution.
Solution: Use the .Net API to add a message to an Azure Service Bus Queue from the mobile
application. Create an Azure Windows VM that is triggered from Azure Service Bus Queue.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Queue size must not grow larger than 80 gigabytes (GB) - yes, that's fine.
Use first-in-first-out (FIFO) ordering of messages - yes, you get that with Service Bus.
Minimize Azure costs & create an Azure Windows VM that is triggered from the Service Bus queue - first, there is no such thing as triggering an Azure VM (Azure Functions are triggered instead); second, using a VM with the queue is more expensive, although it depends on multiple factors.

123
Q

You are developing an Azure solution to collect point-of-sale (POS) device data from
2,000 stores located throughout the world. A single device can produce 2 megabytes (MB) of
data every 24 hours. Each store location has one to five devices that send data.
You must store the device data in Azure Blob storage. Device data must be correlated based on a
device identifier. Additional stores are expected to open in the future.
You need to implement a solution to receive the device data.
Solution: Provision an Azure Event Hub. Configure the machine identifier as the partition key
and enable capture.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: A

124
Q

You are creating an app that will use CosmosDB for data storage. The app will process batches
of relational data.
You need to select an API for the app.
Which API should you use?
A. MongoDB API
B. Table API
C. SQL API
D. Cassandra API

A

Answer: C
Explanation:
For relational data you will need the SQL API
Incorrect Answers:
A: The MongoDB API is not used for relational data.
B: The Table API only supports data in the key/value format
D: The Cassandra API only supports OLTP (Online Transactional Processing) and not batch
processing.

125
Q

You are developing an e-commerce solution that uses a microservice architecture.
You need to design a communication backplane for communicating transactional messages
between various parts of the solution. Messages must be communicated in first-in-first-out
(FIFO) order.
What should you use?
A. Azure Storage Queue
B. Azure Event Hub
C. Azure Service Bus
D. Azure Event Grid

A

Answer: C
Use Service Bus when your solution requires transactional behavior and atomicity when sending or receiving multiple messages from a queue.

126
Q

You are developing an Azure Service application that processes queue data when it receives a
message from a mobile application. Messages may not be sent to the service consistently.
You have the following requirements:
• Queue size must not grow larger than 80 gigabytes (GB).
• Use first-in-first-out (FIFO) ordering of messages.
• Minimize Azure costs.
You need to implement the messaging solution.
Solution: Use the .Net API to add a message to an Azure Service Bus Queue from the mobile
application. Create an Azure Function App that uses an Azure Service Bus Queue trigger.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: A
Explanation:
You can create a function that is triggered when messages are submitted to an Azure Service Bus
queue.
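
A minimal sketch of the Function App side, assuming the in-process model with the Microsoft.Azure.WebJobs.Extensions.ServiceBus package; the queue name and connection setting are hypothetical:

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessQueueMessage
{
    // Runs whenever a message arrives on the Service Bus queue; on the Consumption plan
    // you only pay per execution, which helps minimize cost for inconsistent traffic.
    [FunctionName("ProcessQueueMessage")]
    public static void Run(
        [ServiceBusTrigger("mobile-messages", Connection = "ServiceBusConnection")] string message,
        ILogger log)
    {
        log.LogInformation($"Processing queue message: {message}");
    }
}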

127
Q

You are developing an Azure solution to collect point-of-sale (POS) device data from
2,000 stores located throughout the world. A single device can produce 2 megabytes (MB) of
data every 24 hours. Each store location has one to five devices that send data.
You must store the device data in Azure Blob storage. Device data must be correlated based on a
device identifier. Additional stores are expected to open in the future.
You need to implement a solution to receive the device data.
Solution: Provision an Azure Notification Hub. Register all devices with the hub.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Explanation:
Instead use an Azure Service Bus, which is used for order processing and financial transactions.

128
Q

You are building a loyalty program for a major snack producer. When customers buy a snack at
any of 100 participating retailers the event is recorded in Azure Event Hub. Each retailer is given
a unique identifier that is used as the primary identifier for the loyalty program.
Retailers must be able to be added or removed at any time. Retailers must only be able to record
sales for themselves.
You need to ensure that retailers can record sales.
What should you do?
A. Use publisher policies for retailers.
B. Create a partition for each retailer.
C. Define a namespace for each retailer.

A

Answer: A
Explanation:
Event Hubs enables granular control over event publishers through publisher policies. Publisher
policies are run-time features designed to facilitate large numbers of independent event
publishers. With publisher policies, each publisher uses its own unique identifier when
publishing events to an event hub.
Incorrect:
Not C: An Event Hubs namespace is a management container for event hubs (or topics, in Kafka
parlance). It provides DNS-integrated network endpoints and a range of access control and
network integration management features such as IP filtering, virtual network service endpoint,
and Private Link.

129
Q

You develop a solution that uses Azure Virtual Machines (VMs).
The VMs contain code that must access resources in an Azure resource group. You grant the VM
access to the resource group in Resource Manager.
You need to obtain an access token that uses the VM’s system-assigned managed identity.
Which two actions should you perform? Each correct answer presents part of the solution.
A. From the code on the VM, call Azure Resource Manager using an access token.
B. Use PowerShell on a remote machine to make a request to the local managed identity for
Azure resources endpoint.
C. Use PowerShell on the VM to make a request to the local managed identity for Azure
resources endpoint.
D. From the code on the VM, call Azure Resource Manager using a SAS token.
E. From the code on the VM, generate a user delegation SAS token.

A

Answer: BD

130
Q

You are developing a road tollway tracking application that sends tracking events by using
Azure Event Hubs using premium tier.
Each road must have a throttling policy uniquely assigned.
You need to configure the event hub to allow for per-road throttling.
What should you do?
A. Use a unique consumer group for each road.
B. Ensure each road stores events in a different partition.
C. Ensure each road has a unique connection string.
D. Use a unique application group for each road.

A

Answer: B

131
Q

You develop and deploy an ASP.NET Core application that connects to an Azure Database for
MySQL instance.
Connections to the database appear to drop intermittently and the application code does not
handle the connection failure.
You need to handle the transient connection errors in code by implementing retries.
What are three possible ways to achieve this goal? Each correct answer presents part of the
solution.
NOTE: Each correct selection is worth one point.
A. Close the database connection and immediately report an error.
B. Disable connection pooling and configure a second Azure Database for MySQL instance.
C. Wait five seconds before repeating the connection attempt to the database.
D. Set a maximum number of connection attempts to 10 and report an error on subsequent
connections.
E. Increase connection repeat attempts exponentially up to 120 seconds.

A

Answer: C,D,E
The first and second cases are fairly straightforward to handle: try to open the connection again. When you succeed, the transient error has been mitigated by the system and you can use your Azure Database for MySQL again. We recommend waiting before retrying the connection, and backing off if the initial retries fail, so the system can use all available resources to overcome the error situation. A good pattern to follow is:

Wait for 5 seconds before your first retry.
For each following retry, increase the wait exponentially, up to 60 seconds.
Set a maximum number of retries, at which point your application considers the operation failed.
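
A rough sketch of that pattern in C# (the MySqlConnector client, connection handling, and retry limits here are assumptions for illustration, not part of the question):

using System;
using System.Threading.Tasks;
using MySqlConnector;

public static class ResilientDb
{
    public static async Task<MySqlConnection> OpenWithRetriesAsync(string connectionString)
    {
        const int maxAttempts = 10;               // D: cap the attempts, then report an error
        TimeSpan delay = TimeSpan.FromSeconds(5); // C: wait 5 seconds before the first retry

        for (int attempt = 1; attempt <= maxAttempts; attempt++)
        {
            try
            {
                var connection = new MySqlConnection(connectionString);
                await connection.OpenAsync();
                return connection;
            }
            catch (MySqlException) when (attempt < maxAttempts)
            {
                await Task.Delay(delay);
                // E: back off exponentially, capped so a single wait never exceeds two minutes.
                delay = TimeSpan.FromSeconds(Math.Min(delay.TotalSeconds * 2, 120));
            }
        }

        throw new InvalidOperationException("Could not connect to the database after repeated retries.");
    }
}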

132
Q

You are building a B2B web application that uses Azure B2B collaboration for authentication.
Paying customers authenticate to Azure B2B using federation.
The application allows users to sign up for trial accounts using any email address.
When a user converts to a paying customer, the data associated with the trial should be kept, but the user must authenticate using federation.
You need to update the user in Azure Active Directory (Azure AD) when they convert to a paying customer.
Which Graph API parameter is used to change authentication from one-time passcodes to federation?
A. resetRedemption
B. Status
C. userFlowType
D. invitedUser

A

Answer: A
“When a user redeems a one-time passcode and later obtains an MSA, Azure AD account, or other federated account, they’ll continue to be authenticated using a one-time passcode. If you want to update the user’s authentication method, you can reset their redemption status.”

133
Q
A
134
Q

You are creating an Azure Cosmos DB account that makes use of the SQL API. Data will be
added to the account every day by a web application.
You need to ensure that an email notification is sent when information is received from IoT
devices, and that compute cost is reduced.
You decide to deploy a function app.
Which of the following should you configure the function app to use? Answer by dragging the
correct options from the list to the answer area.

A
135
Q

You have an Azure Active Directory (Azure AD) tenant.
You want to implement multi-factor authentication by making use of a conditional access policy.
The conditional access policy must be applied to all users when they access the Azure portal.
Which three settings should you configure? To answer, select the appropriate settings in the
answer area.

A
136
Q
A
137
Q

You are a developer for a company that provides a bookings management service in the tourism
industry. You are implementing Azure Search for the tour agencies listed in your company’s
solution.
You create the index in Azure Search. You now need to use the Azure Search .NET SDK to
import the relevant data into the Azure Search service.
Which three actions should you perform in sequence? To answer, move the appropriate actions
from the list of actions from left to right and arrange them in the correct order.

(Azure Search is out of scope for AZ-204; not sure why this question is in the PDF, but I added it regardless, just in case.)

A
138
Q

You are developing a C++ application that compiles to a native application named process.exe.
The application accepts images as input and returns images in one of the following image
formats: GIF, PNG, or JPEG.
You must deploy the application as an Azure Function.
You need to configure the function and host json files.
How should you complete the json files? To answer, select the appropriate options in the answer
area.

A

HTTP, Custom handler, True are correct

139
Q

You are developing an Azure Static Web app that contains training materials for a tool company.
Each tool’s training material is contained in a static web page that is linked from the tool’s
publicly available description page.
A user must be authenticated using Azure AD prior to viewing training.
You need to ensure that the user can view training material pages after authentication.
How should you complete the configuration file? To answer, select the appropriate options in the
answer area.

A
140
Q

You are authoring a set of nested Azure Resource Manager templates to deploy Azure resources. You
author an Azure Resource Manager template named mainTemplate.json that contains the following linked
templates: linkedTemplate1.json, linkedTemplate2.json.
You add parameters to a parameters template file named mainTemplate.parameters,json. You save all
templates on a local device in the C:\templates\ folder.
You have the following requirements:
• Store the templates in Azure for later deployment.
• Enable versioning of the templates.
• Manage access to the templates by using Azure RBAC.
• Ensure that users have read-only access to the templates.
• Allow users to deploy the templates.
You need to store the templates in Azure.
How should you complete the command? To answer, select the appropriate options in the answer area.

A
141
Q

You are developing a service where customers can report news events from a browser using Azure Web
PubSub. The service is implemented as an Azure Function App that uses the JSON WebSocket
subprotocol to receive news events.
You need to implement the bindings for the Azure Function App.
How should you configure the binding? To answer, select the appropriate options in the answer area.

A
142
Q

You are building a software-as-a-service (SaaS) application that analyzes DNA data that will run on
Azure virtual machines (VMs) in an availability zone. The data is stored on managed disks attached to the
VM. The performance of the analysis is determined by the speed of the disk attached to the VM.
You have the following requirements:
• The application must be able to quickly revert to the previous day’s data if a systemic error is detected.
• The application must minimize downtime in the case of an Azure datacenter outage.
You need to provision the managed disk for the VM to maximize performance while meeting the
requirements.
Which type of Azure Managed Disk should you use? To answer, select the appropriate options in the
answer area.

A

Ans: Premium SSD and ZRS
They are asking for high-performance workloads, which the Premium tier supports, and zone-redundant storage (ZRS) keeps the disk available during a datacenter outage.

143
Q

You are developing an application that includes two Docker containers.
The application must meet the following requirements:
• The containers must not run as root.
• The containers must be deployed to Azure Container Instances by using a YAML file.
• The containers must share a lifecycle, resources, local network, and storage volume.
• The storage volume must persist through container crashes.
• The storage volume must be deployed on stop or restart of the containers.
You need to configure Azure Container Instances for the application.
Which configuration values should you use? To answer, select the appropriate options in the answer area.

A

Ans: Container group, EmptyDir

Container group is the only logical answer that can have a shared lifecycle.
Azure Files needs root permissions.
Secret volumes are for secrets and are read-only.
EmptyDir can persist through a crash and is redeployed on stop and restart.
A cloned Git repo would also do the job, but it needs more details, such as the Git URL, which the question does not say are available.

144
Q

You are implementing a software as a service (SaaS) ASP.NET Core web service that will run as
an Azure Web App. The web service will use an on-premises SQL Server database for storage.
The web service also includes a WebJob that processes data updates. Four customers will use the
web service.
• Each instance of the WebJob processes data for a single customer and must run as a
singleton instance.
• Each deployment must be tested by using deployment slots prior to serving production
data.
• Azure costs must be minimized.
• Azure resources must be located in an isolated network.
You need to configure the App Service plan for the Web App.
How should you configure the App Service plan? To answer, select the appropriate settings in
the answer area.

A

Box 1: 4
There are four customers that use this service, and each instance of the WebJob processes data for a single customer and must run as a singleton instance, so the number of instances should be 4. WebJobs is a feature of Azure App Service that enables you to run a program or script in the same instance as a web app, such as background tasks.

Box 2: Isolated
Azure resources must be located in an isolated network.
In the Isolated tier, the App Service Environment defines the number of isolated workers that run your apps, and each worker is charged. In addition, there's a flat stamp fee for running the App Service Environment itself. Isolated: this tier runs dedicated Azure VMs on dedicated Azure Virtual Networks. It provides network isolation on top of compute isolation for your apps, as well as the maximum scale-out capabilities.

145
Q
A

Box 1: Deployment
To deploy Azure Functions to Kubernetes, use the func kubernetes deploy command, which has several attributes that directly control how our app scales once it is deployed to Kubernetes.

Box 2: ScaledObject
With –polling-interval, we can control the interval used by KEDA to check Azure Service Bus Queue for messages.

Box 3: Secret
Store connection strings in Kubernetes Secrets.

146
Q
A
147
Q

You are developing an Azure Web App. You configure TLS mutual authentication for the web
app.
You need to validate the client certificate in the web app. To answer, select the appropriate
options in the answer area.

A

Box 1: HTTP request header
If you are using ASP.NET and configure your app to use client certificate authentication, the certificate will be available through the HttpRequest.ClientCertificate property.

Box 2: Base64
For other application stacks, the client cert will be available in your app through a base64 encoded value in the “X-ARR-ClientCert” request header. Your application can create a certificate from this value and then use it for authentication and authorization purposes in your application.
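
A small sketch of reading the forwarded certificate in ASP.NET Core; the validation shown is illustrative only (a real app would also check the chain, issuer, and expected thumbprint):

using System;
using System.Security.Cryptography.X509Certificates;
using Microsoft.AspNetCore.Http;

public static class ClientCertificateReader
{
    public static X509Certificate2? ReadFromHeader(HttpRequest request)
    {
        // App Service forwards the client certificate as a base64-encoded value
        // in the X-ARR-ClientCert request header.
        string? header = request.Headers["X-ARR-ClientCert"];
        if (string.IsNullOrEmpty(header))
        {
            return null;
        }

        var certificate = new X509Certificate2(Convert.FromBase64String(header));

        // Minimal validity-window check; reject certificates outside their valid dates.
        DateTime now = DateTime.UtcNow;
        if (now < certificate.NotBefore.ToUniversalTime() || now > certificate.NotAfter.ToUniversalTime())
        {
            return null;
        }

        return certificate;
    }
}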

148
Q
A
149
Q

Fourth Coffee has an ASP.NET Core web app that runs in Docker. The app is mapped to the
www.fourthcoffee.com domain.
Fourth Coffee is migrating this application to Azure.
You need to provision an App Service Web App to host this docker image and map the custom
domain to the App Service web app.
A resource group named FourthCoffeePublicWebResourceGroup has been created in the
WestUS region that contains an App Service Plan named AppServiceLinuxDockerPlan.
Which order should the CLI commands be used to develop the solution? To answer, move all of
the Azure CLI commands from the list of commands to the answer area and arrange them in the
correct order.

A
  1. /bin/bash
  2. az webapp create
  3. ~ config container set
  4. ~ config hostname add
150
Q
A
  1. create ~Premium plan Type (Consumption X)
  2. create system-assigned ~ (user-assigned X)
  3. create an access policy in Azure Key Vault~
151
Q

A company is developing a Java web app. The web app code is hosted in a GitHub repository
located at https://github.com/Contoso/webapp.
The web app must be evaluated before it is moved to production. You must deploy the initial
code release to a deployment slot named staging.
You need to create the web app and deploy the code.
How should you complete the commands? To answer, select the appropriate options in the
answer area.

A
152
Q

You have a web service that is used to pay for food deliveries. The web service uses Azure
Cosmos DB as the data store.
You plan to add a new feature that allows users to set a tip amount. The new feature requires that
a property named tip on the document in Cosmos DB must be present and contain a numeric value.
There are many existing websites and mobile apps that use the web service that will not be updated to set the tip property for some time.
How should you complete the trigger?

A
  1. getRequest
  2. (!"tip" in i)
  3. setBody
153
Q
A
  • FROM
  • WORKDIR
  • COPY
  • RUN
  • CMD
154
Q

You are configuring a new development environment for a Java application.
The environment requires a Virtual Machine Scale Set (VMSS), several storage accounts, and networking components.
The VMSS must not be created until the storage accounts have been successfully created and an associated load balancer and virtual network is configured.
How should you complete the Azure Resource Manager template? To answer, select the appropriate options in the answer area.

A
155
Q
A

Box 1: No
It logs the following:
- ExpirationTime - The time that the message expires.
- InsertionTime - The time that the message was added to the queue.

Box 2: Yes
maxDequeueCount: The number of times to try processing a message before moving it to the poison queue. Default value is 5.

Box 3: Yes
When there are multiple queue messages waiting, the queue trigger retrieves a batch of messages and invokes function instances concurrently to process them. By default, the batch size is 16. When the number being processed gets down to 8, the runtime gets another batch and starts processing those messages. So the maximum number of concurrent messages being processed per function on one virtual machine (VM) is 24.

Box 4: Yes
[Table("Orders")] ICollector<Order> tableBindings
And in the code it adds the order:
tableBindings.Add(JsonConvert.DeserializeObject<Order>(myQueueItem.AsString));

156
Q

You are developing a solution for a hospital to support the following use cases:
• The most recent patient status details must be retrieved even if multiple users in different locations have updated the patient record.
• Patient health monitoring data retrieved must be the current version or the prior version.
• After a patient is discharged and all charges have been assessed, the patient billing record contains the final charges.
You provision a Cosmos DB NoSQL database and set the default consistency level for the database account to Strong.
You set the value for Indexing Mode to Consistent.
You need to minimize latency and any impact to the availability of the solution.
You must override the default consistency level at the query level to meet the required consistency guarantees for the scenarios.
Which consistency levels should you implement?
To answer, drag the appropriate consistency levels to the correct requirements. Each consistency level may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

A
157
Q

You are configuring a development environment for your team. You deploy the latest Visual Studio image from the Azure Marketplace to your Azure subscription.
The development environment requires several software development kits (SDKs) and thirdparty components to support application development across the organization.
You install and customize the deployed virtual machine (VM) for your development team.
The customized VM must be saved to allow provisioning of a new team member development environment.
You need to save the customized VM for future provisioning.
Which tools or services should you use?
To answer, select the appropriate options in the answer area.

A
158
Q

You are developing an application to use Azure Blob storage. You have configured Azure Blob storage to include change feeds.
A copy of your storage account must be created in another region. Data must be copied from the current storage account to the new storage account directly between the storage servers.
You need to create a copy of the storage account in another region and copy the data.
In which order should you perform the actions?
To answer, move all actions from the list of actions to the answer area and arrange them in the correct order.

A
159
Q
A
  1. Run Command
  2. Custom Script Extension
160
Q
A
161
Q
A

"IdentityId" should actually be "IdentityType"

162
Q
A
163
Q
A

Box 1: Custom handler
Custom handlers can be used to create functions in any language or runtime by running an HTTP server process, for example Go or Rust.

Box 2: Extension bundles
An extension bundle is needed to support the bindings and triggers that you use.

164
Q
A

WEBSITES_ENABLE_APP_SERVICE_STORAGE=true
DIAGDATA=/home

165
Q
A

> Code
Custom Handler
custom (only option when you pick Custom Handler)

166
Q
A
167
Q
A
168
Q
A
169
Q
A
170
Q
A
171
Q
A
172
Q
A
173
Q
A

--sku B1 --is-linux
--deployment-container-image-name images.azurecr.io/website:v1.0.0
~ config container set --docker-registry-server-url https://images.azurecr.io -u admin -p admin

174
Q

You are developing a back-end Azure App Service that scales based on the number of messages contained in a Service Bus queue.
A rule already exists to scale up the App Service when the average queue length of unprocessed and valid queue messages is greater than 1000.
You need to add a new rule that will continuously scale down the App Service as long as the scale up condition is not met.
How should you configure the Scale rule? To answer, select the appropriate options in the answer area.

A

The correct answers are
1) Service bus queue
2) Active message count
3) Average
4) Less than or equal to
5) Decrease count by

175
Q
A

Since we're talking about updating the metadata (a sketch follows below):
- First we need to fetch it, to populate the blob's properties and metadata (we want to update it; without fetching, we would just overwrite it with the new metadata): FetchAttributesAsync
- Second, we need to manipulate the metadata to update it, and the best fit is Metadata.Add
- Third, we have to persist our changes. We can use the method that initiates an asynchronous operation to update the blob's metadata, which is SetMetadataAsync
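
A small sketch of that sequence using the Microsoft.Azure.Storage.Blob client these method names come from; the container, blob, and metadata values are hypothetical:

using System.Threading.Tasks;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;

public static class BlobMetadataUpdater
{
    public static async Task AddDocTypeAsync(string connectionString)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlockBlob blob = client.GetContainerReference("documents")
                                    .GetBlockBlobReference("report.pdf");

        // 1. Populate the blob's current properties and metadata.
        await blob.FetchAttributesAsync();

        // 2. Modify the in-memory metadata collection.
        blob.Metadata.Add("docType", "financial");

        // 3. Persist the updated metadata back to the service.
        await blob.SetMetadataAsync();
    }
}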

176
Q
A
  • Upgrade the existing one to GPv2
  • Create a new GPv2 standard account with the default access tier set to Cool
  • Then copy the archive data to the new GPv2 account and delete the data from the original storage account.
177
Q
A
178
Q
A
179
Q
A
180
Q
A
181
Q
A

- Correlation filter (with a non-existent value for a field, so that it never matches any message)
- SQL filter (we need all orders that are high priority AND international; a correlation filter only matches when an arriving message's property value is equal to the value specified in the filter, which is not enough here)
- SQL filter
- SQL filter
- No filter

182
Q
A
183
Q

You are developing an Azure-hosted e-commerce web application. The application will use Azure Cosmos DB to store sales orders. You are using the latest SDK to manage the sales orders in the database.
You create a new Azure Cosmos DB instance. You include a valid endpoint and valid authorization key to an appSettings.json file in the code project.
You are evaluating the following application code: (Line number are included for reference only.)

A
184
Q

You develop an Azure solution that uses Cosmos DB.
The current Cosmos DB container must be replicated and must use a partition key that is optimized for queries.
You need to implement a change feed processor solution.
Which change feed processor components should you use? To answer, drag the appropriate components to the correct requirements. Each component may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view the content.

A
185
Q
A
186
Q
A

"Core (SQL)" is now called "API for NoSQL".

187
Q

You are developing an application that uses a premium block blob storage account. The application will process a large volume of transactions daily. You enable Blob storage versioning.
You are optimizing costs by automating Azure Blob Storage access tiers. You apply the following policy rules to the storage account. (Line numbers are included for reference only.)

A

Solution is:
- No
- No
- Yes
- No
I guess the third statement (the policy rule tiers...) is Yes, because the container name "transactions" is in prefixMatch.

188
Q
A
189
Q

You are developing a solution to store documents in Azure Blob storage. Customers upload documents to multiple containers. Documents consist of PDF, CSV, Microsoft Office format and plain text files.
The solution must process millions of documents across hundreds of containers. The solution must meet the following requirements:
• Documents must be categorized by a customer identifier as they are uploaded to the storage account.
• Allow filtering by the customer identifier.
• Allow searching of information contained within a document.
• Minimize costs.
You create and configure a standard general-purpose v2 storage account to support the solution.
You need to implement the solution.
What should you implement? To answer, select the appropriate options in the answer area.

A
190
Q
A

- ETag - the server returns this tag for a resource so we can ensure we operate on the same version of the resource in subsequent API calls
- If-Match - the update is processed by the server only if the ETag provided matches the latest resource version's ETag

The reason for that is we want to make sure we update the latest version of a resource:
"Update operations must use the latest data changes when writing."
So, when using Last-Modified with If-Modified-Since, the operation would execute only when another client had modified the resource between our READ and WRITE operations, which is the opposite of what we want. To use Last-Modified here, we would need If-Unmodified-Since instead.
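
As an illustration of the pattern with the Azure Cosmos DB .NET SDK (the container, item type, and id here are hypothetical; the same ETag/If-Match idea applies to any API that honors conditional requests):

using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public record Order(string id, string customerId, decimal total);

public static class OptimisticUpdate
{
    public static async Task AddShippingFeeAsync(Container container, string id, string customerId)
    {
        // Read the current item; the response carries the ETag of this version.
        ItemResponse<Order> read = await container.ReadItemAsync<Order>(id, new PartitionKey(customerId));
        Order updated = read.Resource with { total = read.Resource.total + 5m };

        // The replace succeeds only if the stored ETag still matches the one we read;
        // otherwise the service returns 412 Precondition Failed and we can re-read and retry.
        await container.ReplaceItemAsync(
            updated,
            id,
            new PartitionKey(customerId),
            new ItemRequestOptions { IfMatchEtag = read.ETag });
    }
}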

191
Q

An organization deploys a blob storage account. Users take multiple snapshots of the blob storage account over time.
You need to delete all snapshots of the blob storage account. You must not delete the blob storage account itself.
How should you complete the code segment? To answer, select the appropriate options in the answer area.

A
192
Q
A
193
Q
A

1. changeFeedClient.GetChanges(x).AsPages()
2. x = c.ContinuationToken;

  Box 1:
    - GetChanges() - wrong - var c in the foreach would be a BlobChangeFeedEvent, which doesn't contain the Values property used in the ProcessChanges(c.Values) line below
    - GetChangesAsync - wrong - the code won't compile because it would require an await foreach loop instead
    - GetChanges(x).AsPages() - correct - it's the only option that even makes this code compile
    - GetChanges(x).GetEnumerator() - wrong - you cannot use an IEnumerator as a foreach source
  Box 2:
    - x = c.ContinuationToken - correct - the variable x was used as the continuationToken parameter in changeFeedClient.GetChanges(x).AsPages() above
    - c.GetRawResponse().ReasonPhrase - wrong - it makes no sense to use this value as a continuation token
    - x = c.Values.Min - wrong - the continuation token is not derived from the event values
    - x = c.Values.Max - wrong - as above
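
Putting those choices together, a compilable sketch with the Azure.Storage.Blobs.ChangeFeed package might look like this (the connection string, starting continuation token, and ProcessChanges are hypothetical placeholders):

using System.Collections.Generic;
using Azure;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.ChangeFeed;

public static class ChangeFeedReader
{
    public static string? ReadAll(string connectionString, string? continuationToken)
    {
        var serviceClient = new BlobServiceClient(connectionString);
        BlobChangeFeedClient changeFeedClient = serviceClient.GetChangeFeedClient();

        string? x = continuationToken;
        foreach (Page<BlobChangeFeedEvent> c in changeFeedClient.GetChanges(x).AsPages())
        {
            ProcessChanges(c.Values);
            // Persist the continuation token so the next run resumes where this one stopped.
            x = c.ContinuationToken;
        }
        return x;
    }

    static void ProcessChanges(IReadOnlyList<BlobChangeFeedEvent> events)
    {
        // Hypothetical placeholder for the processing referenced in the question.
    }
}
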
194
Q

You develop an application that sells AI generated images based on user input. You recently started a marketing campaign that displays unique ads every second day.
Sales data is stored in Azure Cosmos DB with the date of each sale being stored in a property named ‘whenFinished’.
The marketing department requires a view that shows the number of sales for each unique ad.
You need to implement the query for the view.
How should you complete the query? To answer, select the appropriate options in the answer area.

A
195
Q

You implement an Azure solution to include Azure Cosmos DB, the latest Azure Cosmos DB SDK, and the Core (SQL) API. You also implement a change feed processor on a new container instance by using the Azure Functions trigger for Azure Cosmos DB.
A large batch of documents continues to fail when reading one of the documents in the batch.
The same batch of documents is continuously retried by the triggered function and a new batch of documents must be read.
You need to implement the change feed processor to read the documents.
Which feature should you implement? To answer, select the appropriate features in the answer area.

A
196
Q
A

There is a chance that both answers are "constant prefix".

197
Q
A

Box 1: companymedia
List Blobs
The List Blobs operation returns a list of the blobs under the specified container.
Request
You can construct the List Blobs request as follows. HTTPS is recommended. Replace
myaccount with the name of your storage account.
Method, Request URI
GET https://myaccount.blob.core.windows.net/mycontainer?restype=container&comp=list
Box 2: companyimages
Box 3: Status=’Final’

198
Q

You are developing an application to securely transfer data between on-premises file systems and Azure Blob storage. The application stores keys, secrets, and certificates in Azure Key Vault. The application uses the Azure Key Vault APIs.
The application must allow recovery of an accidental deletion of the key vault or key vault objects. Key vault objects must be retained for 90 days after deletion.
You need to protect the key vault and key vault objects.
Which Azure Key Vault feature should you use? To answer, drag the appropriate features to the correct actions. Each feature may be used once, more than once, or not at all.
You may need to
drag the split bar between panes or scroll to view content.

A

Box 1: Soft delete
When soft-delete is enabled, resources marked as deleted resources are retained for a specified period (90 days by default). The service further provides a mechanism for recovering the deleted object, essentially undoing the deletion.
This can be achieved with the help of the soft-delete feature of the key vault.

Box 2: Purge protection
Purge protection is an optional Key Vault behavior and is not enabled by default. Purge protection can only be enabled once soft-delete is enabled.
When purge protection is on, a vault or an object in the deleted state cannot be purged until the retention period has passed. Soft-deleted vaults and objects can still be recovered, ensuring that the retention policy will be followed.
This can be achieved with the help of the purge protection feature of the key vault.

199
Q

You are developing an ASP.NET Core website that can be used to manage photographs which are stored in Azure Blob Storage containers.
Users of the website authenticate by using their Azure Active Directory (Azure AD) credentials.
You implement role-based access control (RBAC) role permissions on the containers that store photographs. You assign users to RBAC roles.
You need to configure the website’s Azure AD Application so that user’s permissions can be used with the Azure Blob containers.
How should you configure the application?
To answer, drag the appropriate setting to the correct location. Each setting can be used once, more than once, or not at all. You may need to drag the
split bar between panes or scroll to view content.

A
200
Q

You are developing an ASP.NET Core app that includes feature flags which are managed by Azure App Configuration. You create an Azure App Configuration store named AppFeatureFlagStore that contains a feature flag named Export.
You need to update the app to meet the following requirements:
• Use the Export feature in the app without requiring a restart of the app.
• Validate users before users are allowed access to secure resources.
• Permit users to access secure resources.
How should you complete the code segment? To answer, select the appropriate options in the answer area.

A
201
Q

You plan to deploy a new application to a Linux virtual machine (VM) that is hosted in Azure.
The entire VM must be secured at rest by using industry-standard encryption technology to address organizational security and compliance requirements.
You need to configure Azure Disk Encryption for the VM.
How should you complete the Azure CLI commands? To answer, select the appropriate options in the answer area.

A
202
Q

You are developing an application. You have an Azure user account that has access to two subscriptions.
You need to retrieve a storage account key secret from Azure Key Vault.
In which order should you arrange the PowerShell commands to develop the solution? To answer, move all commands from the list of commands to the answer area and arrange them in the correct order.

A
  1. Get-AzSubscription
  2. Set-AzContext -SubscriptionId $subscriptionID
  3. Get-AzKeyVaultSecret -VaultName $vaultName
  4. Get-AzStorageAccountKey -ResourceGroupName $resGroup -Name $storAcct
  5. $secretvalue = ConvertTo-SecureString $storAcctkey -AsPlainText -Force
    Set-AzKeyVaultSecret -VaultName $vaultName -Name $secretName -SecretValue $secretvalue
203
Q

You are building a website that is used to review restaurants. The website will use an Azure CDN to improve performance and add functionality to requests.
You build and deploy a mobile app for Apple iPhones. Whenever a user accesses the website from an iPhone, the user must be redirected to the app store.
You need to implement an Azure CDN rule that ensures that iPhone users are redirected to the app store.
How should you complete the Azure Resource Manager template? To answer, select the appropriate options in the answer area.

A

red arrows mark correct answers

204
Q
A

1) groupMembershipClaims
2) oauth2AllowImplicitFlow

205
Q
A

Inbound
Inbound
Inbound
Outbound

206
Q
A
207
Q
A
208
Q

You are developing an application that uses a premium block blob storage account. You are optimizing costs by automating Azure Blob Storage access tiers. You apply the following policy rules to the storage account. You must determine the implications of applying the rules to the data. (Line numbers are included for reference only.)

A
  1. Yes
  2. Yes
  3. Yes
  4. No
209
Q
A
210
Q
A
211
Q
A

1 Azure AD instance
2 In App Registration, select new registration
3 Create a new application and provide the name

212
Q

You are developing an ASP.NET Core app that includes feature flags which are managed by Azure App Configuration. You create an Azure App Configuration store named AppFeatureflagStore as shown in the exhibit:

A
213
Q
A
214
Q

You are developing an application to store and retrieve data in Azure Blob storage. The application will be hosted in an on-premisesvirtual machine (VM). The VM is connected to Azure by using a Site-to-Site VPN gateway connection. The application is secured by using Azure Active Directory (Azure AD) credentials.
The application must be granted access to the Azure Blob storage account with a start time, expiry time, and read permissions. The Azure Blob storage account access must use the Azure AD credentials of the application to secure data access. Data access must be able to be revoked if the client application security is breached.
You need to secure the application access to Azure Blob storage.
Which security features should you use? To answer select the appropriate options in the answer area.

A
215
Q

You are a developer building a web site using a web app. The web site stores configuration data in Azure App Configuration.
Access to Azure App Configuration has been configured to use the identity of the web app for authentication. Security requirements specify that no other authentication systems must be used. You need to load configuration data from Azure App Configuration.
How should you complete the code? To answer, select the appropriate options in the answer area.

A
216
Q

You are building an application that stores sensitive customer data in Azure Blob storage. The data must be encrypted with a key that is unique for each customer.
If the encryption key has been corrupted it must not be used for encryption.
You need to ensure that the blob is encrypted.
How should you complete the code segment? To answer, select the appropriate options in the answer area.

A
217
Q

You are developing a web application that uses the Microsoft Identity platform for user and resource authentication. The web application called several REST APIs.
You are implementing various authentication and authorization flows for the web application.
You need to validate the claims in the authentication token.
Which token type should you use? To answer, select the appropriate options in the answer area.

A
218
Q

You are developing a content management application for technical manuals. The application is deployed as an Azure Static Web app.
Authenticated users can view pages under/manuals but only contributors can access the page
/manuals/new.html.
You need to configure the routing for the web app.
How should you complete the configuration? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

A
219
Q

You develop and deploy a web app to Azure App service. The web app allows users to authenticate by using social identity providers through the Azure B2C service. All user profile information is stored in Azure B2C.
You must update the web app to display common user properties from Azure B2C to include the
following information:
 Email address
 Job title
 First name
 Last name
 Office location

You need to implement the user properties in the web app.
Which code library and API should you use? To answer, select the appropriate options in the
answer area.

A
220
Q

You develop and deploy the following staticwebapp.config.json file to the app_location value specified in the workflow file of an Azure Static Web app:

For each of the following statements, select Yes if the statement is true. Otherwise, select No.

A

1. No
- In the referenced example, going to the login route redirects to GitHub, just as in the question.
- When a request receives a 401 (Unauthorized) response, the user is redirected to the login route (with a 302 response), which then redirects to the GitHub login.

In this question, however, a 401 (Unauthorized) response redirects to the AAD route (with a 302 response), not to GitHub, so this part of the configuration differs from the example.

2. Yes
This part of the configuration matches the example: a non-existent file in the /images/ folder returns a 404 error.

3. Yes
This part also matches the example: GET requests from authenticated users in the registeredusers role are sent to the API. Authenticated users who are not in the registeredusers role, as well as unauthenticated users, receive a 401 error.

4. No
The configuration overrides the 401 response with a 302 and redirects the user to the AAD login URL.
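
For reference, the override described in item 4 is expressed in staticwebapp.config.json with a responseOverrides block. A minimal hedged sketch (the exact routes in the card's exhibit are not shown here, so the login path is illustrative):

{
  "responseOverrides": {
    "401": {
      "statusCode": 302,
      "redirect": "/.auth/login/aad"
    }
  }
}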

221
Q
A

1) SecretClient
2) DefaultAzureCredential
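
Assuming this card is the usual Key Vault secret-retrieval item (the question text is not included here), a minimal C# sketch of how the two answer types fit together, using the Azure.Identity and Azure.Security.KeyVault.Secrets packages; the vault URI and secret name are illustrative:

using System;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

// DefaultAzureCredential resolves to the app's managed identity in Azure
// (or developer credentials locally), so no secrets are embedded in code.
var client = new SecretClient(
    new Uri("https://myvault.vault.azure.net/"),   // illustrative vault URI
    new DefaultAzureCredential());

KeyVaultSecret secret = await client.GetSecretAsync("MySecret");   // illustrative secret name
Console.WriteLine(secret.Value);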

222
Q
A

1) Configure the web app to the Standard App Service Tier
2) Enable autoscaling on the web app
3) Add a Scale condition
4) Add a scale rule

People argue about the order of steps 3 and 4.
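
A hedged Azure CLI equivalent of steps 2-4 (the card itself describes the portal flow; resource names and the CPU threshold are illustrative):

az monitor autoscale create --resource-group myRG --resource myPlan --resource-type Microsoft.Web/serverfarms --min-count 1 --max-count 5 --count 1 --name webapp-autoscale
az monitor autoscale rule create --resource-group myRG --autoscale-name webapp-autoscale --condition "CpuPercentage > 70 avg 5m" --scale out 1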

223
Q
A

1) IDatabase cache = Connection.GetDatabase();
2) cache.KeyDelete("teams");
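
For context, a minimal StackExchange.Redis sketch showing where these two lines fit; the connection string is illustrative, and the "teams" key is taken as given by the card:

using StackExchange.Redis;

// Connect once and reuse the multiplexer.
ConnectionMultiplexer Connection = ConnectionMultiplexer.Connect(
    "mycache.redis.cache.windows.net:6380,password=<access-key>,ssl=True,abortConnect=False");

IDatabase cache = Connection.GetDatabase();

// Remove the cached "teams" entry so the next request repopulates it.
cache.KeyDelete("teams");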

224
Q

A company has multiple warehouses. Each warehouse contains IoT temperature devices which deliver temperature data to an Azure Service Bus queue.
You need to send email alerts to facility supervisors immediately if the temperature at a warehouse goes above or below specified threshold temperatures.
Which five actions should you perform in sequence? To answer, move the appropriate actions
from the list of actions to the answer area and arrange them in the correct order.

A
225
Q
A

1. Funnels
2. Impact
3. Retention
4. User flow

226
Q

You are debugging an application that is running on Azure Kubernetes cluster named cluster1.
The cluster uses Azure Monitor for containers to monitor the cluster.
The application has sticky sessions enabled on the ingress controller.
Some customers report a large number of errors in the application over the last 24 hours.
You need to determine on which virtual machines (VMs) the errors are occurring.
How should you complete the Azure Monitor query? To answer, select the appropriate options in
the answer area.

A
227
Q
A

1) config
2) docker-container-logging
3) webapp
4) tail
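
Assembled into full commands (app and resource group names are placeholders), the answer corresponds to something like:

az webapp log config --name <app-name> --resource-group <rg> --docker-container-logging filesystem
az webapp log tail --name <app-name> --resource-group <rg>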

228
Q
A

1) External
2) Private
3) Authorization

229
Q
A
230
Q
A
231
Q
A
  1. Create a Log Analytics workspace (in the Azure portal)
  2. Create an Application Insights resource (in Monitor, Application Insights with the workspace attached)
  3. Add the VMInsights solution (i.e., activate VM insights in Monitor, choose the VMs, attach the workspace)
  4. Install agents on the VMs
232
Q
A
233
Q
A

1. Yes
2. No
3. Yes

234
Q

You are developing an Azure App Service hosted ASP.NET Core web app to deliver video-on-demand streaming media. You enable an Azure Content Delivery Network (CDN) Standard for the web endpoint. Customer videos are downloaded from the web app by using the following
example URL: http://www.contoso.com/content.mp4?quality=1.
All media content must expire from the cache after one hour. Customer videos with varying quality must be delivered to the closest regional point of presence (POP) node.
You need to configure Azure CDN caching rules.
Which options should you use? To answer, select the appropriate options in the answer area.

A
235
Q
A
236
Q
A
237
Q
A
238
Q
A
239
Q
A
240
Q
A
241
Q

A company is developing a solution that allows smart refrigerators to send temperature information to a central location.
The solution must receive and store messages until they can be processed. You create an Azure Service Bus instance by providing a name, pricing tier, subscription, resource group, and location.
You need to complete the configuration.
Which Azure CLI or PowerShell command should you run?

A

I think the Service Bus namespace has already been created, and you are now asked to complete the configuration. The next step is creating the queue.
In fact, all the steps are shown:
B. Create the resource group.
C. Create the Service Bus namespace.
A. Create the queue. <- Correct answer.
D. Get the connection string.
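
A hedged example of the queue-creation step with the Azure CLI (all names are placeholders):

az servicebus queue create --resource-group <rg> --namespace-name <namespace> --name <queue-name>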

242
Q
A
243
Q
A

Answer: C
The question is poorly worded. I think the Service Bus instance has already been created and you now need to complete the configuration to start using the bus. In that case, you need to create the queue, hence the correct answer is C.

244
Q

You develop a news and blog content app for Windows devices.
A notification must arrive on a user’s device when there is a new article available for them to view.
You need to implement push notifications.
How should you complete the code segment? To answer, select the appropriate options in the answer area.

A
245
Q
A
  1. HTTP(S) endpoint
  2. Client certificate
246
Q
A

People report that the boxes may be switched on the actual exam.

247
Q
A
248
Q
A
249
Q
A

Box 1: windows
Box 2: application/octet-stream

250
Q
A

Source -> Blob storage
Receiver -> Event Grid
Handler -> Logic App

251
Q
A
252
Q

A software as a service (SaaS) company provides document management services. The company has a service that consists of several Azure web apps. All Azure web apps run in an Azure App Service Plan named PrimaryASP.
You are developing a new web service by using a web app named ExcelParser. The web app contains a third-party library for processing Microsoft Excel files. The license for the third-party library stipulates that you can only run a single instance of the library.
You need to configure the service.
How should you complete the script? To answer, select the appropriate options in the answer area.

A
253
Q
A

Inbound
Outbound
Backend

254
Q

A company backs up all manufacturing data to Azure Blob Storage. Admins move blobs from hot storage to archive tier storage every month.
You must automatically move blobs to the Archive tier after they have not been modified for 180 days. The path for any item that is not archived must be placed in an existing queue. This operation must be performed automatically once a month. You set the value of TierAgeInDays to -180.
How should you configure the Logic App? To answer, drag the appropriate triggers or action blocks to the correct trigger or action slots. Each trigger or action block may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:

A

Box 1: Recurrence
Box 2: Condition
Box 3: Tier blob (item last modified < current date - 180 days)
Box 4: Put a message on a queue
Box 5: Insert Entity

People note that Logic Apps have not been part of the AZ-204 exam for some time.

255
Q
A
256
Q
A
257
Q
A

Answer: B
Explanation:
A service bus instance has already been created. Next is step 3, Create a Service Bus queue.

258
Q
A

Answer: C
Explanation:
A service bus instance has already been created. Next is step 3, Create a Service Bus queue.

259
Q
A

Answer: C
Explanation:
A service bus instance has already been created. Next is step 3, Create a Service Bus queue.

260
Q

You are developing a solution that uses several Azure Service Bus queues. You create an Azure Event Grid subscription for the Azure Service Bus namespace. You use Azure Functions as subscribers to process the messages.
You need to emit events to Azure Event Grid from the queues. You must use the principle of least privilege and minimize costs.
Which Azure Service Bus values should you use? To answer, select the appropriate options in the answer area.

A
261
Q

You develop a web application that provides access to legal documents that are stored on Azure
Blob Storage with version-level immutability policies. Documents are protected with both time-based policies and legal hold policies. All time-based retention policies have the
AllowProtectedAppendWrites property enabled.
You have a requirement to prevent the user from attempting to perform operations that would
fail only when a legal hold is in effect and when all other policies are expired.
You need to meet the requirement.
Which two operations should you prevent? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. adding data to documents
B. deleting documents
C. creating documents
D. overwriting existing documents

A

Answer: BD
Explanation:
The Append Block operation is permitted only for policies with the
allowProtectedAppendWrites or allowProtectedAppendWritesAll property enabled.
The AllowProtectedAppendWrites property setting allows for writing new blocks to an append
blob while maintaining immutability protection and compliance. If this setting is enabled, you
can create an append blob directly in the policy-protected container, and then continue to add
new blocks of data to the end of the append blob with the Append Block operation. Only new
blocks can be added; any existing blocks can’t be modified or deleted. Enabling this setting
doesn’t affect the immutability behavior of block blobs or page blobs.

262
Q

You develop Azure solutions.
You must grant a virtual machine (VM) access to specific resource groups in Azure Resource
Manager.
You need to obtain an Azure Resource Manager access token.
Solution: Use the Reader role-based access control (RBAC) role to authenticate the VM with
Azure Resource Manager.
Does the solution meet the goal?
A. Yes
B. No

A

Answer: B
Explanation:
Instead, run the Invoke-RestMethod cmdlet to make a request to the local managed identity for Azure resources endpoint.
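
The explanation refers to calling the local managed identity endpoint with Invoke-RestMethod. An equivalent sketch using the Azure.Identity library (not the wording of the official answer) looks roughly like this:

using System;
using Azure.Core;
using Azure.Identity;

// Acquire an Azure Resource Manager access token from the VM's managed identity,
// rather than relying on the Reader RBAC role for authentication.
var credential = new ManagedIdentityCredential();
AccessToken token = await credential.GetTokenAsync(
    new TokenRequestContext(new[] { "https://management.azure.com/.default" }));

Console.WriteLine(token.Token);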

263
Q
A
264
Q
A
265
Q
A
266
Q

You implement an Azure solution to include Azure Cosmos DB, the latest Azure Cosmos DB SDK, and the Core (SQL) API. You also implement a change feed processor on a new container instance by using the Azure Functions trigger for Azure Cosmos DB.
A large batch of documents continues to fail when reading one of the documents in the batch.
The same batch of documents is continuously retried by the triggered function and a new batch of documents must be read.
You need to implement the change feed processor to read the documents.
Which feature should you implement? To answer, select the appropriate features in the answer area.

A
267
Q
A
268
Q
A
269
Q
A
270
Q
A
271
Q

You develop new functionality in a web application for a company that provides access to seismic data from around the world. The seismic data is stored in Redis Streams within an Azure Cache for Redis instance.
The new functionality includes a real-time display of seismic events as they occur.
You need to implement the Azure Cache for Redis command to receive seismic data.
How should you complete the command? To answer, select the appropriate options in the answer area.

A

Box 1: XREAD
Data is consumed from a stream using the XREAD command.
Box 2: BLOCK 0
Apart from the fact that XREAD can access multiple streams at once, and that we are able to
specify the last ID we own to just get newer messages, in this simple form the command is not
doing something so different compared to XRANGE. However, the interesting part is that we
can turn XREAD into a blocking command easily, by specifying the BLOCK argument:
> XREAD BLOCK 0 STREAMS mystream $
BLOCK 0 is a timeout of 0 milliseconds, which means to never timeout.
Box 3: $
The $ operator indicates to start reading the stream from the end. Effectively, this example will
only receive new entries instead of historical entries.
The special ID $ means that XREAD should use as last ID the maximum ID already stored in the
stream mystream, so that we will receive only new messages, starting from the time we started
listening.

272
Q
A
273
Q

A software as a service (SaaS) company provides document management services. The company has a service that consists of several Azure web apps. All Azure web apps run in an Azure App Service Plan named PrimaryASP.
You are developing a new web service by using a web app named ExcelParser. The web app contains a third-party library for processing Microsoft Excel files. The license for the third-party library stipulates that you can only run a single instance of the library.
You need to configure the service.
How should you complete the script? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

A
274
Q

You develop several Azure Functions app functions to process JSON documents from a third-party system. The third-party system publishes events to Azure Event Grid to include hundreds of event types, such as billing, inventory, and shipping updates.
Events must be sent to a single endpoint for the Azure Functions app to process. The events must be filtered by event type before processing. You must have authorization and authentication control to partition your tenants to receive the event data.
You need to configure Azure Event Grid.
Which configuration should you use? To answer, select the appropriate values in the answer area.

A
275
Q
A

Box 1: inbound
Include a fragment in a policy definition
Configure the include-fragment policy to insert a policy fragment in a policy definition.
You may include a fragment at any scope and in any policy section, as long as the underlying
policy or policies in the fragment support that usage.
You may include multiple policy fragments in a policy definition.
For example, insert the policy fragment named ForwardContext in the inbound policy section:

<policies>
  <inbound>
    <include-fragment fragment-id="ForwardContext" />
    <base />
  </inbound>
  [...]
</policies>

Box 2: include-fragment
Box 3: fragment-id
Box 4: inbound

276
Q

You are developing an Azure Durable Function based application that processes a list of input values. The application is monitored using a console application that retrieves JSON data from an Azure Function diagnostic endpoint.
During processing a single instance of invalid input does not cause the function to fail. Invalid input must be available to the monitoring application.
You need to implement the Azure Durable Function and the monitoring console application.
How should you complete the code segments? To answer, select the appropriate options in the answer area.

A
277
Q

You develop and deploy an Azure Logic app that calls an Azure Function app. The Azure Function app includes an OpenAPI (Swagger) definition and uses an Azure Blob storage account. All resources are secured by using Azure Active Directory (Azure AD).
The Azure Logic app must securely access the Azure Blob storage account. Azure AD resources must remain if the Azure Logic app is deleted.
You need to secure the Azure Logic app.
What should you do?
A. Create a user-assigned managed identity and assign role-based access controls.
B. Create an Azure AD custom role and assign the role to the Azure Blob storage account.
C. Create an Azure Key Vault and issue a client certificate.
D. Create a system-assigned managed identity and issue a client certificate.
E. Create an Azure AD custom role and assign role-based access controls.

A

Answer: A
Explanation:
To give a managed identity access to an Azure resource, you need to add a role to the target resource for that identity.
Note: To easily authenticate access to other resources that are protected by Azure Active Directory (Azure AD) without having to sign in and provide credentials or secrets, your logic app can use a managed identity (formerly known as Managed Service Identity or MSI). Azure manages this identity for you and helps secure your credentials because you don’t have to provide or rotate secrets.
If you set up your logic app to use the system-assigned identity or a manually created, user-assigned identity, the function in your logic app can also use that same identity for authentication.
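
A hedged CLI sketch of option A, creating a user-assigned identity (which survives deletion of the logic app) and granting it access to the storage account; the names and the role shown here are illustrative:

az identity create --resource-group <rg> --name logicapp-identity
az role assignment create --assignee <principal-id-of-identity> --role "Storage Blob Data Reader" --scope <storage-account-resource-id>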

278
Q

You are developing applications for a company. You plan to host the applications on Azure App Services.
The company has the following requirements:
 Every five minutes verify that the websites are responsive.
 Verify that the websites respond within a specified time threshold. Dependent requests such as images and JavaScript files must load properly.
 Generate alerts if a website is experiencing issues.
 If a website fails to load, the system must attempt to reload the site three more times.
You need to implement this process with the least amount of effort.
What should you do?
A. Create a Selenium web test and configure it to run from your workstation as a scheduled task.
B. Set up a URL ping test to query the home page.
C. Create an Azure function to query the home page.
D. Create a multi-step web test to query the home page.
E. Create a Custom Track Availability Test to query the home page

A

Answer: D
Explanation:
You can monitor a recorded sequence of URLs and interactions with a website via multi-step web tests.
Incorrect Answers:
A: Selenium is an umbrella project for a range of tools and libraries that enable and support the automation of web browsers.
It provides extensions to emulate user interaction with browsers, a distribution server for scaling browser allocation, and the infrastructure for implementations of the W3C WebDriver
specification that lets you write interchangeable code for all major web browsers.