Practice Set 2 Flashcards
You use Azure Table storage to store customer information for an application. The data contains customer details and is partitioned by last name.
You need to create a query that returns all customers with the last name Smith.
Which code segment should you use?
A. TableQuery.GenerateFilterCondition("PartitionKey", Equals, "Smith")
B. TableQuery.GenerateFilterCondition("LastName", Equals, "Smith")
C. TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "Smith")
D. TableQuery.GenerateFilterCondition("LastName", QueryComparisons.Equal, "Smith")
Correct Answer: C
Retrieve all entities in a partition. The following code example specifies a filter for entities where "Smith" is the partition key. This example prints the fields of each entity in the query results to the console.
Construct the query operation for all customer entities where PartitionKey="Smith":
TableQuery<CustomerEntity> query = new TableQuery<CustomerEntity>().Where(
    TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "Smith"));
References:
https://docs.microsoft.com/en-us/azure/cosmos-db/table-storage-how-to-use-dotnet
You are developing an app that manages users for a video game. You plan to store the region, email address, and phone number for the player. Some players may not have a phone number. The player’s region will be used to load-balance data.
Data for the app must be stored in Azure Table Storage.
You need to develop code to retrieve data for an individual player.
How should you complete the code? To answer, select the appropriate options in the answer area.
Correct Answer: Explanation
Box 1: region -
The player’s region will be used to load-balance data.
Choosing the PartitionKey.
The core of any table's design is based on its scalability, the queries used to access it, and storage operation requirements. The PartitionKey values you choose dictate how a table will be partitioned and the type of queries that can be used. Storage operations, in particular inserts, can also affect your choice of PartitionKey values.
Box 2: email -
Not phone number: some players may not have a phone number, so it cannot reliably identify an entity.
Box 3: CloudTable -
Box 4: TableOperation -
Box 5: TableResult -
References:
https://docs.microsoft.com/en-us/rest/api/storageservices/designing-a-scalable-partitioning-strategy-for-azure-table-storage
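The boxes above describe a point query: the fastest Table Storage read specifies both the PartitionKey and the RowKey. As an illustrative sketch only (Python rather than the .NET SDK code the question uses, with region as the partition key and email as the row key, both assumptions from this scenario), this is the OData filter such a lookup sends:

```python
def point_query_filter(partition_key: str, row_key: str) -> str:
    """Build the OData filter string for a Table Storage point query.

    Single quotes inside values must be doubled per OData escaping rules.
    """
    def quote(value: str) -> str:
        return "'" + value.replace("'", "''") + "'"

    return (f"PartitionKey eq {quote(partition_key)} "
            f"and RowKey eq {quote(row_key)}")

# A player lookup keyed by region (partition) and email (row):
flt = point_query_filter("emea", "player@example.com")
print(flt)  # PartitionKey eq 'emea' and RowKey eq 'player@example.com'
```

Because both keys are specified, the service can locate the single entity directly instead of scanning a partition.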
You are developing a data storage solution for a social networking app.
The solution requires a mobile app that stores user information using Azure Table Storage.
You need to develop code that can insert multiple sets of user information.
How should you complete the code? To answer, select the appropriate options in the answer area.
Correct Answer: Explanation
Box 1, Box 2: TableBatchOperation
Create the batch operation.
TableBatchOperation op = new TableBatchOperation();
Box 3: ExecuteBatch -
// Execute the batch operation.
table.ExecuteBatch(op);
Note: You can insert a batch of entities into a table in one write operation. Some other notes on batch operations:
You can perform updates, deletes, and inserts in the same single batch operation.
A single batch operation can include up to 100 entities.
All entities in a single batch operation must have the same partition key.
While it is possible to perform a query as a batch operation, it must be the only operation in the batch.
References:
https://docs.microsoft.com/en-us/azure/cosmos-db/table-storage-how-to-use-dotnet
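The batch rules above (same partition key throughout, at most 100 entities per batch) can be sketched as a chunking plan. This is a hypothetical Python helper to illustrate the constraints, not SDK code:

```python
from collections import defaultdict

MAX_BATCH = 100  # Table Storage allows up to 100 entities per batch operation

def plan_batches(entities):
    """Group entities by PartitionKey, then split each group into
    chunks of at most MAX_BATCH, mirroring the batch-operation rules."""
    groups = defaultdict(list)
    for entity in entities:
        groups[entity["PartitionKey"]].append(entity)

    batches = []
    for group in groups.values():
        for i in range(0, len(group), MAX_BATCH):
            batches.append(group[i:i + MAX_BATCH])
    return batches

# 250 users spread over two partitions -> 125 per partition -> 100 + 25 each
users = [{"PartitionKey": f"region{i % 2}", "RowKey": str(i)} for i in range(250)]
batches = plan_batches(users)
print(len(batches))  # 4
```

Each resulting chunk would map to one TableBatchOperation executed with ExecuteBatch.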
You must implement Application Insights instrumentation capabilities by using the Azure Mobile Apps SDK to provide meaningful analysis of user interactions with a mobile app.
You need to capture the data required to implement the Usage Analytics feature of Application Insights.
Which three data values should you capture? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Trace
B. Session Id
C. Exception
D. User Id
E. Events
Correct Answer: ADE
Application Insights is a service for monitoring the performance and usage of your apps. This module allows you to send telemetry of various kinds (events, traces, etc.) to the Application Insights service where your data can be visualized in the Azure Portal.
Application Insights manages the ID of a session for you.
References:
https://github.com/microsoft/ApplicationInsights-Android
Your company has several websites that use a company logo image. You use Azure Content Delivery Network (CDN) to store the static image.
You need to determine the correct process of how the CDN and the Point of Presence (POP) server will distribute the image and list the items in the correct order.
In which order do the actions occur? To answer, move all actions from the list of actions to the answer area and arrange them in the correct order.
Correct Answer: Explanation
Step 1: A user requests the image.
A user requests a file (also called an asset) by using a URL with a special domain name, such as <endpoint>.azureedge.net. This name can be an endpoint hostname or a custom domain. The DNS routes the request to the best-performing POP location, which is usually the POP that is geographically closest to the user.
Step 2: If no edge servers in the POP have the file in their cache.
If no edge servers in the POP have the file in their cache, the POP requests the file from the origin server. The origin server can be an Azure Web App, Azure Cloud Service, Azure Storage account, or any publicly accessible web server.
Step 3: The origin server returns the file.
The origin server returns the file to an edge server in the POP. An edge server in the POP caches the file and returns the file to the original requestor. The file remains cached on the edge server in the POP until the time-to-live (TTL) specified by its HTTP headers expires. If the origin server didn't specify a TTL, the default TTL is seven days.
Step 4: Subsequent requests are served from the cache.
Additional users can then request the same file by using the same URL that the original user used, and can also be directed to the same POP. If the TTL for the file hasn't expired, the POP edge server returns the file directly from the cache. This process results in a faster, more responsive user experience.
References:
https://docs.microsoft.com/en-us/azure/cdn/cdn-overview
You develop a solution that uses an Azure SQL Database to store user information for a mobile app.
The app stores sensitive information about users.
You need to hide sensitive information from developers that query the data for the mobile app.
Which three items must you identify when configuring dynamic data masking? Each correct answer presents a part of the solution.
NOTE: Each correct selection is worth one point.
A. Column
B. Table
C. Trigger
D. Index
E. Schema
Correct Answer: ABE
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-dynamic-data-masking-get-started-portal
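Dynamic data masking is configured on specific columns, within specific tables, within specific schemas, which is why those three items must be identified. As a rough illustration only (Python, not the SQL feature itself), the built-in email masking method exposes just the first character and a constant suffix:

```python
def mask_email(email: str) -> str:
    """Approximate the built-in email mask: keep the first character,
    replace everything else with a constant aXX@XXXX.com-style suffix."""
    first = email[0] if email else ""
    return f"{first}XX@XXXX.com"

print(mask_email("alice@contoso.com"))  # aXX@XXXX.com
```

The real feature applies such a rule at query time for non-privileged users; the underlying data is never changed.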
You develop an app that allows users to upload photos and videos to Azure storage. The app uses a storage REST API call to upload the media to a blob storage account named Account1. You have blob storage containers named Container1 and Container2.
Uploading of videos occurs on an irregular basis.
You need to copy specific blobs from Container1 to Container2 in real time when specific requirements are met, excluding backup blob copies.
What should you do?
A. Download the blob to a virtual machine and then upload the blob to Container2.
B. Run the Azure PowerShell command Start-AzureStorageBlobCopy.
C. Copy blobs to Container2 by using the Put Blob operation of the Blob Service REST API.
D. Use AzCopy with the Snapshot switch to copy blobs to Container2.
Correct Answer: B
The Start-AzureStorageBlobCopy cmdlet starts to copy a blob.
Example 1: Copy a named blob -
C:\PS> Start-AzureStorageBlobCopy -SrcBlob "ContosoPlanning2015" -DestContainer "ContosoArchives" -SrcContainer "ContosoUploads"
This command starts the copy operation of the blob named ContosoPlanning2015 from the container named ContosoUploads to the container named
ContosoArchives.
References:
https://docs.microsoft.com/en-us/powershell/module/azure.storage/start-azurestorageblobcopy?view=azurermps-6.13.0
You are developing and deploying several ASP.Net web applications to Azure App Service. You plan to save session state information and HTML output. You must use a storage mechanism with the following requirements:
✑ Share session state across all ASP.NET web applications
✑ Support controlled, concurrent access to the same session state data for multiple readers and a single writer
✑ Save full HTTP responses for concurrent requests
You need to store the information.
1. Proposed solution: Deploy and configure an Azure Database for PostgreSQL. Update the web applications.
Does the solution meet the goal? No
2. Proposed solution: Deploy and configure Azure Cache for Redis. Update the web applications.
Does the solution meet the goal? Yes
References:
https://docs.microsoft.com/en-us/azure/architecture/best-practices/caching?source=docs#caching-session-state-and-html-output
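Azure Cache for Redis fits because it is a shared, external store that every web app instance can reach, and it supports caching full HTTP responses as well as session state. A minimal cache-aside sketch for the HTML-output case, using a plain Python dict as a stand-in for Redis (no Redis server involved; all names here are illustrative):

```python
import time

class FakeRedis:
    """In-memory stand-in for Redis, used only for this sketch."""
    def __init__(self):
        self._store = {}

    def get(self, key):
        value, expires = self._store.get(key, (None, 0))
        if value is not None and time.monotonic() < expires:
            return value
        return None

    def setex(self, key, ttl, value):
        # Store the value together with its absolute expiry time.
        self._store[key] = (value, time.monotonic() + ttl)

def render_page(path):
    return f"<html>{path}</html>"  # stand-in for the real renderer

cache = FakeRedis()

def get_html(path, ttl=60):
    """Cache-aside: try the cache first; render and store on a miss."""
    html = cache.get(f"html:{path}")
    if html is None:
        html = render_page(path)
        cache.setex(f"html:{path}", ttl, html)
    return html

print(get_html("/home"))  # <html>/home</html>
```

With real Redis the same pattern uses GET and SETEX, and concurrent readers share one entry while a single writer refreshes it on expiry.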
You are developing an Azure solution to collect point-of-sale (POS) device data from 2,000 stores located throughout the world. A single device can produce 2 megabytes (MB) of data every 24 hours. Each store location has one to five devices that send data.
You must store the device data in Azure Blob storage. Device data must be correlated based on a device identifier. Additional stores are expected to open in the future.
You need to implement a solution to receive the device data.
Solution 1: Provision an Azure Event Hub. Configure the machine identifier as the partition key and enable capture.
Does the solution meet the goal? Yes
https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-programming-guide
Solution 2: Provision an Azure Event Grid. Configure event filtering to evaluate the device identifier.
Does the solution meet the goal? No
https://docs.microsoft.com/en-us/azure/event-grid/event-filtering
Solution 3: Provision an Azure Notification Hub. Register all devices with the hub.
Does the solution meet the goal? No
Instead, provision an Azure Event Hub. Configure the machine identifier as the partition key and enable capture.
https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-programming-guide
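Why the device identifier works as a partition key: events that share a partition key always land in the same partition, so each device's readings stay together and in order. Event Hubs uses its own internal hashing; this hypothetical Python sketch only illustrates the stable-assignment property, and the partition count is an assumption:

```python
import hashlib

PARTITION_COUNT = 4  # assumption for this sketch

def partition_for(device_id: str) -> int:
    """Stable hash of the partition key -> partition index."""
    digest = hashlib.sha256(device_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % PARTITION_COUNT

# The same device always maps to the same partition:
assert partition_for("store42-pos1") == partition_for("store42-pos1")
print(0 <= partition_for("store42-pos1") < PARTITION_COUNT)  # True
```

Enabling capture then writes each partition's event stream to Blob storage automatically, which satisfies the storage requirement.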
You develop and deploy a Java RESTful API to Azure App Service.
You open a browser and navigate to the URL for the API. You receive the following error message:
You need to resolve the error.
What should you do?
A. Bind an SSL certificate
B. Enable authentication
C. Enable CORS
D. Map a custom domain
E. Add a CDN
Correct Answer: C
We need to enable Cross-Origin Resource Sharing (CORS).
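A CORS error occurs when a browser page served from one origin calls the API on another origin and the response lacks an Access-Control-Allow-Origin header. A minimal sketch of the server-side check (illustrative only; once CORS is enabled, App Service adds this header for you, and the allowed origin below is a hypothetical example):

```python
ALLOWED_ORIGINS = {"https://www.contoso.com"}  # hypothetical allowed origin

def cors_headers(request_origin: str) -> dict:
    """Return the CORS response header for an allowed origin, or an
    empty dict -- in which case the browser blocks the response."""
    if request_origin in ALLOWED_ORIGINS:
        return {"Access-Control-Allow-Origin": request_origin}
    return {}

print(cors_headers("https://www.contoso.com"))
# {'Access-Control-Allow-Origin': 'https://www.contoso.com'}
print(cors_headers("https://evil.example"))  # {}
```

In App Service, the equivalent configuration is the CORS blade (or `az webapp cors add`) listing the origins allowed to call the API.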
You are developing an internal website for employees to view sensitive data. The website uses Azure Active Directory (AAD) for authentication.
You need to implement multifactor authentication for the website.
What should you do? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Upgrade to Azure AD Premium.
B. In Azure AD conditional access, enable the baseline policy.
C. In Azure AD, create a new conditional access policy.
D. In Azure AD, enable application proxy.
E. Configure the website to use Azure AD B2C.
Correct Answer: AC
A: Multi-Factor Authentication comes as part of the following offerings:
✑ Azure Active Directory Premium licenses - Full featured use of Azure Multi-Factor Authentication Service (Cloud) or Azure Multi-Factor Authentication Server (On-premises).
✑ Multi-Factor Authentication for Office 365
✑ Azure Active Directory Global Administrators
C: MFA enabled by a conditional access policy is the most flexible means to enable two-step verification for your users. Enabling MFA by using a conditional access policy works only for Azure MFA in the cloud and is a premium feature of Azure AD.
References:
https://docs.microsoft.com/en-us/azure/active-directory/authentication/howto-mfa-getstarted
Contoso, Ltd. provides an API to customers by using Azure API Management (APIM). The API authorizes users with a JWT token.
You must implement response caching for the APIM gateway. The caching mechanism must detect the user ID of the client that accesses data for a given location and cache the response for that user ID.
You need to add the following policies to the policies file:
✑ a set-variable policy to store the detected user identity
✑ a cache-lookup-value policy
✑ a cache-store-value policy
✑ a find-and-replace policy to update the response body with the user profile information
To which policy section should you add the policies? To answer, drag the appropriate sections to the correct policies. Each section may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
Correct Answer: Explanation
Box 1: Inbound.
A set-variable policy to store the detected user identity.
Example:
<policies>
  <inbound>
    <!-- How you determine user identity is application dependent -->
    <set-variable
      name="enduserid"
      value="@(context.Request.Headers.GetValueOrDefault("Authorization","").Split(' ')[1].AsJwt()?.Subject)" />
Etc.

Box 2: Inbound -
A cache-lookup-value policy.
Example:
<inbound>
  <base />
  <cache-lookup>
    <vary-by-query-parameter>parameter name</vary-by-query-parameter> <!-- optional, can be repeated several times -->
  </cache-lookup>
</inbound>

Box 3: Outbound -
A cache-store-value policy.
Example:
<outbound>
  <base />
  <cache-store />
</outbound>

Box 4: Outbound -
A find-and-replace policy to update the response body with the user profile information.
Example:
<outbound>
  <!-- Update response body with user profile -->
  <find-and-replace
    from='"$userprofile$"'
    to="@((string)context.Variables["userprofile"])" />
  <base />
</outbound>

References:
https://docs.microsoft.com/en-us/azure/api-management/api-management-caching-policies
https://docs.microsoft.com/en-us/azure/api-management/api-management-sample-cache-by-key
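The set-variable expression pulls the user ID out of the bearer token: it splits the Authorization header on the space and reads the JWT's subject claim. The same parsing can be sketched in Python with a toy unsigned token (illustration only; no signature verification, which APIM's expression also skips):

```python
import base64
import json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def jwt_subject(authorization_header: str) -> str:
    """Mimic Split(' ')[1].AsJwt()?.Subject: take the token after
    'Bearer', decode its payload segment, and return the 'sub' claim."""
    token = authorization_header.split(" ")[1]
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    payload = json.loads(base64.urlsafe_b64decode(payload_b64))
    return payload["sub"]

# Build a toy unsigned JWT for the demo:
header = b64url(json.dumps({"alg": "none"}).encode())
payload = b64url(json.dumps({"sub": "user-123"}).encode())
token = f"{header}.{payload}."

print(jwt_subject(f"Bearer {token}"))  # user-123
```

That extracted subject is what the cache-lookup-value and cache-store-value policies use as part of the cache key, so each user gets their own cached response.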
You plan to deploy a new application to a Linux virtual machine (VM) that is hosted in Azure.
The entire VM must be secured at rest by using industry-standard encryption technology to address organizational security and compliance requirements.
You need to configure Azure Disk Encryption for the VM.
How should you complete the Azure CLI commands? To answer, select the appropriate options in the answer area.
Correct Answer: Explanation
Box 1: keyvault -
Create an Azure Key Vault with az keyvault create and enable the Key Vault for use with disk encryption. Specify a unique Key Vault name for keyvault_name as follows:
keyvault_name=myvaultname$RANDOM
az keyvault create \
  --name $keyvault_name \
  --resource-group $resourcegroup \
  --location eastus \
  --enabled-for-disk-encryption True
Box 2: keyvault key -
The Azure platform needs to be granted access to request the cryptographic keys when the VM boots to decrypt the virtual disks. Create a cryptographic key in your Key Vault with az keyvault key create. The following example creates a key named myKey:
az keyvault key create \
  --vault-name $keyvault_name \
  --name myKey \
  --protection software
Box 3: vm -
Create a VM with az vm create. Only certain marketplace images support disk encryption. The following example creates a VM named myVM using an Ubuntu 16.04 LTS image:
az vm create \
  --resource-group $resourcegroup \
  --name myVM \
  --image Canonical:UbuntuServer:16.04-LTS:latest \
  --admin-username azureuser \
  --generate-ssh-keys
Box 4: vm encryption -
Encrypt your VM with az vm encryption enable:
az vm encryption enable \
  --resource-group $resourcegroup \
  --name myVM \
  --disk-encryption-keyvault $keyvault_name \
  --key-encryption-key myKey \
  --volume-type all
Note: There seems to be an error in the question; the command should be az vm encryption enable rather than create.
Box 5: all -
Encrypt both data and operating system.
References:
https://docs.microsoft.com/bs-latn-ba/azure/virtual-machines/linux/encrypt-disks
You are developing an Azure App Service hosted ASP.NET Core web app to deliver video on-demand streaming media. You enable an Azure Content Delivery Network (CDN) Standard for the web endpoint. Customer videos are downloaded from the web app by using the following example URL: http://www.contoso.com/content.mp4?quality=1
All media content must expire from the cache after one hour. Customer videos with varying quality must be delivered to the closest regional point of presence (POP) node.
You need to configure Azure CDN caching rules.
Which options should you use? To answer, select the appropriate options in the answer area.
Correct Answer: Explanation
Box 1: Override -
Override: Ignore origin-provided cache duration; use the provided cache duration instead. This will not override cache-control: no-cache.
Set if missing: Honor origin-provided cache-directive headers, if they exist; otherwise, use the provided cache duration.
Incorrect:
Bypass cache: Do not cache and ignore origin-provided cache-directive headers.
Box 2: 1 hour -
All media content must expire from the cache after one hour.
Box 3: Cache every unique URL -
Cache every unique URL: In this mode, each request with a unique URL, including the query string, is treated as a unique asset with its own cache. For example, the response from the origin server for a request for example.ashx?q=test1 is cached at the POP node and returned for subsequent requests with the same query string. A request for example.ashx?q=test2 is cached as a separate asset with its own time-to-live setting.
Incorrect Answers:
Bypass caching for query strings: In this mode, requests with query strings are not cached at the CDN POP node. The POP node retrieves the asset directly from the origin server and passes it to the requestor with each request.
Ignore query strings: Default mode. In this mode, the CDN point-of-presence (POP) node passes the query strings from the requestor to the origin server on the first request and caches the asset. All subsequent requests for the asset that are served from the POP ignore the query strings until the cached asset expires.
References:
https://docs.microsoft.com/en-us/azure/cdn/cdn-query-string
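The three query-string caching modes differ only in how the cache key is derived from the URL. A hypothetical sketch of that rule (the mode names and helper are this sketch's own, not CDN internals):

```python
from urllib.parse import urlsplit

def cache_key(url: str, mode: str):
    """Derive a cache key per CDN query-string mode.

    ignore  - strip the query string; all variants share one cache entry
    unique  - keep the full URL; each query string is its own asset
    bypass  - return None; the request is never cached
    """
    parts = urlsplit(url)
    if mode == "bypass":
        return None
    if mode == "ignore":
        return f"{parts.netloc}{parts.path}"
    if mode == "unique":
        return f"{parts.netloc}{parts.path}?{parts.query}"
    raise ValueError(mode)

u1 = "http://contoso.azureedge.net/example.ashx?q=test1"
u2 = "http://contoso.azureedge.net/example.ashx?q=test2"
print(cache_key(u1, "ignore") == cache_key(u2, "ignore"))  # True
print(cache_key(u1, "unique") == cache_key(u2, "unique"))  # False
```

This is why "Cache every unique URL" is required here: the quality query parameter must produce distinct cached assets.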
You have an Azure App Service web app, an Azure SQL Database instance, an Azure Storage account, and an Azure Redis Cache instance in a resource group.
A developer must be able to publish code to the web app. You must grant the developer the Contributor role on the web app.
You need to grant the role.
Which two commands can you use? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. az role assignment create
B. az role definition create
C. New-AzureRmRoleAssignment
D. New-AzureRmRoleDefinition
Correct Answer: AC
A: The az role assignment create command creates a new role assignment for a user, group, or service principal.
Example: Create role assignment for an assignee.
az role assignment create --assignee sp_name --role a_role
C: The New-AzureRmRoleAssignment command assigns the specified RBAC role to the specified principal, at the specified scope.
Incorrect Answers:
B, D: Creates a custom role in Azure RBAC.
References:
https://docs.microsoft.com/en-us/cli/azure/role/assignment?view=azure-cli-latest#az-role-assignment-create
https://docs.microsoft.com/en-us/powershell/module/azurerm.resources/new-azurermroleassignment?view=azurermps-6.13.0