Azure Events Queues and Processing Flashcards

1
Q

What set of solutions can Azure Event Hubs provide?

How does it differ from IoT Hub?

A

Both solutions are designed for data ingestion at a massive scale

Azure Event Hubs is a big data streaming platform and event ingestion service.

It can receive and process millions of events per second.

Event Hubs uses a partitioned consumer model, enabling multiple applications to process the stream concurrently

IoT Hub uses Event Hubs for its telemetry flow path.
But it is a separate service tailored for IoT devices - it enables bi-directional communication, so you can send commands and policies back to devices.

Microsoft recommends using Azure IoT Hub to connect IoT devices to Azure
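The partitioned consumer model mentioned above can be illustrated with a minimal sketch (plain Python, not the Event Hubs SDK): each event's partition key maps to exactly one partition, and each consumer owns a partition, so per-device ordering is preserved while consumers scale out.

```python
# Minimal sketch of the partitioned consumer model (plain Python,
# not the Event Hubs SDK): each event key maps to one partition,
# and each consumer owns one partition, so per-device ordering holds.
from collections import defaultdict

PARTITION_COUNT = 4

def partition_for(key: str) -> int:
    # Event Hubs hashes the partition key; a simple stable hash suffices here.
    return sum(key.encode()) % PARTITION_COUNT

def ingest(events):
    partitions = defaultdict(list)
    for key, payload in events:
        partitions[partition_for(key)].append((key, payload))
    return partitions

events = [("device-a", 1), ("device-b", 2), ("device-a", 3)]
partitions = ingest(events)

# All events for device-a land in the same partition, in send order.
pa = partitions[partition_for("device-a")]
print([payload for key, payload in pa if key == "device-a"])  # [1, 3]
```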

2
Q

If you want to process incoming data 24/7 from a device such as a weather station, what solution would be ideal, and what pricing plan would you recommend?

A

An Azure Function with an IoT Hub trigger.

You would need to make sure it is configured to be “Always On” so it stays warmed up.

Always On is not available on the Consumption plan; you get it on the Premium plan.

If you need to audit the data, specify a storage account as the output binding.

If you need to read related data from other services or storage, you would request these dependencies as parameters using an input binding.
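As a sketch, the trigger and output binding described above might be declared in a function.json similar to the following (the binding names, hub name, blob path, and connection setting names are illustrative; IoT Hub telemetry is consumed through the eventHubTrigger type via the hub's Event Hub-compatible endpoint):

```json
{
  "bindings": [
    {
      "type": "eventHubTrigger",
      "direction": "in",
      "name": "message",
      "eventHubName": "weather-telemetry",
      "connection": "IoTHubEventHubCompatibleConnection",
      "consumerGroup": "$Default"
    },
    {
      "type": "blob",
      "direction": "out",
      "name": "auditBlob",
      "path": "audit/{rand-guid}.json",
      "connection": "AuditStorageConnection"
    }
  ]
}
```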

3
Q

You need to react to events happening in the system's components. Which is the correct technology?

Event Grid
Event Hub
Azure Notification Hub
IoT Hub

A

Event Grid

You can subscribe to Event Grid events from the following services that are referenced in the AZ-204 textbook:

Azure App Configuration
Azure Blob Storage
Azure Container Registry
Azure Event Hubs
Azure IoT Hub
Azure Key Vault
Azure Resource groups
Azure Service Bus
As well as:
Azure SignalR
Azure subscriptions
Azure Communication Services
Azure Media Services
Azure Machine Learning
Azure Maps
4
Q

True or False:

To trigger a function from an event you MUST configure Azure Event Grid

A

False

You can configure triggers based on events in some services directly

Azure Event Grid
Azure Blob Storage
Azure Event Hubs
Azure Cosmos DB
Azure Service Bus
Azure Queue storage

https://docs.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings?tabs=csharp

5
Q

A series of functions should execute in a specific order, with the output of one function applied to the input of the next function.

What is this pattern known as?

A

Function chaining

https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview?tabs=csharp#chaining
https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview?tabs=csharp
https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-types-features-overview
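The chaining pattern can be sketched with plain Python coroutines (not the Durable Functions SDK; in a real orchestrator each step would be a CallActivityAsync call):

```python
# Minimal sketch of function chaining (plain Python coroutines, not the
# Durable Functions SDK): each step's output feeds the next step's input.
import asyncio

async def f1(x): return x + 1
async def f2(x): return x * 2
async def f3(x): return f"result:{x}"

async def orchestrator(value):
    # In Durable Functions the orchestrator would await
    # context.CallActivityAsync for each step; here we chain directly.
    a = await f1(value)
    b = await f2(a)
    return await f3(b)

print(asyncio.run(orchestrator(3)))  # result:8
```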

6
Q

Multiple functions execute in parallel, and the workflow then waits for all of them to finish; aggregation work is performed on the results from the functions.

What is this pattern known as?

A

Fan-out/fan-in

https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview?tabs=csharp#fan-in-out
https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-orchestrations?tabs=csharp

Example use case for Backups

https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-cloud-backup?tabs=csharp
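The pattern can be sketched with plain Python asyncio (not the Durable Functions SDK):

```python
# Minimal sketch of fan-out/fan-in (plain Python asyncio, not the Durable
# Functions SDK): run work items in parallel, then aggregate all results.
import asyncio

async def process(item: int) -> int:
    await asyncio.sleep(0)   # stand-in for real work
    return item * item

async def fan_out_fan_in(items):
    # Fan out: one task per item; fan in: wait for all, then aggregate.
    results = await asyncio.gather(*(process(i) for i in items))
    return sum(results)

print(asyncio.run(fan_out_fan_in([1, 2, 3])))  # 14
```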

7
Q

For Azure Functions
Event data provided in batches by multiple sources over a period of time must be combined into a single, addressable entity. What is this pattern known as?

A

Aggregation

You can use Durable entities

https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview?tabs=csharp#aggregator

8
Q

What are Durable Functions?

A

Durable Functions is an extension of Azure Functions. You can use Durable Functions for stateful orchestration of function execution.

A durable function app is a solution that’s made up of different Azure functions. Functions can play different roles in a durable function orchestration.

Stateless functions present problems of concurrency control

Not only do you need to worry about multiple threads modifying the same data at the same time, you also need to worry about ensuring that the aggregator only runs on a single VM at a time.

You can use Durable entities to preserve state in a Function App

9
Q

What is an Entity Function?

A

Entity functions define operations for reading and updating small pieces of state, known as durable entities.

Like orchestrator functions, entity functions are functions with a special trigger type, the entity trigger.

Entity functions and related functionality are only available in Durable Functions 2.0 and above.
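A durable entity's operation dispatch can be sketched in plain Python (not the Durable Functions SDK; the operation names are illustrative):

```python
# Minimal sketch of a durable entity (plain Python, not the Durable
# Functions SDK): a small piece of keyed state plus named operations.
class CounterEntity:
    def __init__(self):
        self.state = 0

    def operate(self, operation: str, amount: int = 0) -> int:
        # Durable entities dispatch on an operation name in much this way.
        if operation == "add":
            self.state += amount
        elif operation == "reset":
            self.state = 0
        elif operation != "get":
            raise ValueError(f"unknown operation: {operation}")
        return self.state

counter = CounterEntity()
counter.operate("add", 5)
counter.operate("add", 2)
print(counter.operate("get"))  # 7
```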

10
Q

What are Durable Orchestrations?

A

You can use an orchestrator function to orchestrate the execution of other Durable functions within a function app. Orchestrator functions have the following characteristics:

Define function workflows using procedural code. No declarative schemas or designers are needed.

Can call other durable functions synchronously and asynchronously. Output from called functions can be reliably saved to local variables.

Durable and reliable. Execution progress is automatically checkpointed when the function “awaits” or “yields”. Local state is never lost when the process recycles or the VM reboots.

Can be long-running. The total lifespan of an orchestration instance can be seconds, days, months, or never-ending.

https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-orchestrations?tabs=csharp

11
Q

If you haven’t previously used Event Grid in your Azure subscription, you might need to register the Event Grid resource provider as the first step to completing your integration.

What is the Azure cli command to enable Event Grid functionality within a subscription?

A

az provider register --namespace Microsoft.EventGrid

You can check the status with:
az provider show --namespace Microsoft.EventGrid --query "registrationState"

https://docs.microsoft.com/en-us/azure/event-grid/custom-event-quickstart

12
Q

A client in the Marketing department wants to review uploaded files submitted to Azure Storage Blobs, so that they can sign them off for use in marketing campaigns

You have configured the application to upload the files and a trigger to begin the process.

The advert review process should be managed by members of the creative team, because it will need to change regularly. The creative team would prefer not to have to wait for a developer to become available whenever a change is needed.

You have hired a small team of developers to do the work and you prefer a design-first approach.

Which technology should you use to automate the advert review process?

A Microsoft Power Automate
B Azure Logic Apps
C Azure Functions
D Azure App Service Web Jobs

A

A Microsoft Power Automate

Implement the workflow using Microsoft Power Automate because this allows the creative team, who are not developers, to manage the flow.

None of the other technologies allow the creative team, who are not developers, to manage the flow.

https://docs.microsoft.com/en-us/learn/modules/choose-azure-service-to-integrate-and-automate-business-processes/6-knowledge-check
https://docs.microsoft.com/en-us/learn/modules/choose-azure-service-to-integrate-and-automate-business-processes/

13
Q

A client in the Marketing department wants to review uploaded files submitted to Azure Storage Blobs, so that they can sign them off for use in marketing campaigns

You have configured the application to upload the files and a trigger to begin the process.

The feedback collection process calls an on-premises SharePoint server. Because this server is not as reliable as a cloud-based server would be, developers want to carefully control the way the workflow retries this connection, if there is a failure.

You have hired a small team of developers to do the work and you prefer a design-first approach.

Which technology should you use to automate the feedback collection process?

A Microsoft Power Automate
B Azure Logic Apps
C Azure Functions
D Azure App Service Web Jobs

A

Azure App Service Web Jobs

This is the only technology that allows developers to control retry policies.

Therefore, all other answers are incorrect.

https://docs.microsoft.com/en-us/learn/modules/choose-azure-service-to-integrate-and-automate-business-processes/6-knowledge-check
https://docs.microsoft.com/en-us/learn/modules/choose-azure-service-to-integrate-and-automate-business-processes/

14
Q

You work for a company that makes digital cameras. The company has recently acquired a smaller company that makes lenses. You want to ensure that the same procedures are in use throughout the company for the following processes:

Lens quality control. The company you acquired has a good reputation for lens reliability because of its quality control procedure. You want to implement this procedure across the merged company and integrate it with your parts ordering system, which includes a REST API.

Ordering and dispatch. The company you acquired had no formal order and dispatch procedure, so you want to ensure its employees use your original business procedure. The ordering system has a user interface that is built as an Azure App service web app but you want to manage the order and dispatch workflow as a separate project.

You have hired a small team of developers to do the work and you prefer a design-first approach.

Which technology would you use for the lens quality control procedure?

A Microsoft Power Automate
B Azure Logic Apps
C Azure Functions
D Azure App Service Web Jobs

A

B: Azure Logic Apps

Azure Logic Apps is the only one of the four technologies that provides a design-first approach intended for developers.

https://docs.microsoft.com/en-us/learn/modules/choose-azure-service-to-integrate-and-automate-business-processes/6-knowledge-check
https://docs.microsoft.com/en-us/learn/modules/choose-azure-service-to-integrate-and-automate-business-processes/

15
Q

What is the design-first approach in Azure Logic Apps?

A

In the design-first approach, you can visualize the workflow and therefore easily understand the business processes.

Because the diagram is the workflow definition itself rather than a separate document, it cannot fall out of date when the process changes.

Logic Apps is the technology provided by Microsoft to enable this.

16
Q

Choose queue storage if:

A

If your requirements are simple, if you want to send each message to only one destination, or if you want to write code as quickly as possible, a storage queue may be the best option

You need a simple queue with no particular additional requirements
You need an audit trail of all messages that pass through the queue
You expect the queue to exceed 80 GB in size
You want to track progress for processing a message inside of the queue

17
Q

Choose Service Bus queues if:

A

You need an at-most-once delivery guarantee
You need a FIFO guarantee
You need to group messages into transactions
You want to receive messages without polling the queue
You need to provide role-based access to the queues
You need to handle messages larger than 64 KB but smaller than 256 KB
Your queue size will not grow larger than 80 GB
You would like to be able to publish and consume batches of messages

18
Q

What are the three types of Service Bus Subscription Filters?

A

Boolean Filters.
The TrueFilter ensures that all messages sent to the topic are delivered to the current subscription.

The FalseFilter ensures that none of the messages are delivered to the current subscription. (This effectively blocks or switches off the subscription.)

SQL Filters.
A SQL filter specifies a condition by using the same syntax as a WHERE clause in a SQL query. Only messages that return True when evaluated against this subscription will be delivered to the subscribers.

Correlation Filters.
A correlation filter holds a set of conditions that are matched against the properties of each message. If the property in the filter and the property on the message have the same value, it is considered a match.

19
Q

Which of the following queues should you use if you need first-in-first-out order and support for transactions?

A: Azure Service Bus queues
B: Azure Storage queues

A

A: Azure Service Bus Queues
Azure Service Bus queues handle messages in the same order they’re added and also support transactions. This means that if one message in a transaction fails to be added to the queue, all messages in the transaction will not be added.

Even though a queue is a first-in-first-out data structure, Azure Storage queues do not guarantee ordering.

20
Q

Suppose you’re sending a message with Azure Service Bus and you want multiple components to receive it. Which Azure Service Bus exchange feature should you use?

Queue
Topic
Relay

A

B: Topic
A topic allows multiple destination components to subscribe. This means that each message can be delivered to multiple receivers.

A queue can only have one destination component at a time, which means that each message in the queue is delivered to only one receiver.

A relay is used for two-way communication and it provides bidirectional connections across network boundaries.

21
Q

True or false: you can add a message to an Azure Service Bus queue that is 2 MB in size.

A

False

An Azure Service Bus queue message can be at most 256 KB in size (in the Standard tier), so a 2 MB message is too large.

22
Q

How do you enforce a FIFO guarantee in Service Bus?

A

To realize a FIFO guarantee in Service Bus, use sessions.

Service Bus isn’t prescriptive about the nature of the relationship between the messages, and also doesn’t define a particular model for determining where a message sequence starts or ends.

An example of how to delineate a sequence for transferring a file is to set the Label property for the first message to start, for intermediate messages to content, and for the last message to end. The relative position of the content messages can be computed as the current message SequenceNumber delta from the start message SequenceNumber.

The session feature in Service Bus enables a specific receive operation, in the form of MessageSession in the C# and Java APIs. You enable the feature by setting the requiresSession property on the queue or subscription via Azure Resource Manager

When sessions are enabled on a queue or a subscription, the client applications can no longer send or receive regular messages. All messages must be sent as part of a session (by setting the session ID) and received by accepting the session.

Sessions provide concurrent de-multiplexing of interleaved message streams while preserving and guaranteeing ordered delivery.

https://docs.microsoft.com/en-us/dotnet/api/microsoft.servicebus.messaging.messagesession?view=azure-dotnet
https://github.com/Azure/azure-service-bus/tree/master/samples/DotNet/Microsoft.Azure.ServiceBus/Sessions
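The Label/SequenceNumber convention described above can be sketched as follows (plain Python dictionaries, not the Service Bus SDK; the property names mirror the ones named in the text):

```python
# Minimal sketch of the Label/SequenceNumber convention described above
# (plain Python dicts, not the Service Bus SDK).
def relative_position(message: dict, start_sequence_number: int) -> int:
    # Position of a "content" message within the session's file transfer.
    return message["SequenceNumber"] - start_sequence_number

session = [
    {"Label": "start",   "SequenceNumber": 100},
    {"Label": "content", "SequenceNumber": 101},
    {"Label": "content", "SequenceNumber": 102},
    {"Label": "end",     "SequenceNumber": 103},
]

start = next(m for m in session if m["Label"] == "start")
positions = [relative_position(m, start["SequenceNumber"])
             for m in session if m["Label"] == "content"]
print(positions)  # [1, 2]
```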

23
Q

Suppose you work for a government agency that plans the long-term expansion of the highway system. You receive traffic data from thousands of sensors and analyze it to make your recommendations.

The amount of incoming data varies throughout the day; for example, it spikes during the morning and evening commuting hours.

True or false: a server-side architecture consisting of an Azure Queue connected to a single virtual machine is a reasonable choice for this workload?

A

True

This data is used for long-term planning, so there is no need to process it in real time. A queue connected to a single VM is likely to handle the workload and be a cost-efficient solution.

24
Q

What information uniquely identifies a queue?

A: Queue name
B: Account key
C: Storage account name and queue name

A

C: Storage account name and queue name

Storage account names must be globally unique. Queue names must be unique within their containing storage account. This means the combination of storage account name and queue name uniquely identifies a queue.

Queue names must only be unique within their containing storage account; they do not need to be globally unique.

Account keys are associated with a storage account, not a queue.
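A short sketch of why the combination is unique: the two names together form the queue's globally unique URL (the standard Queue service endpoint format).

```python
# Sketch of why account name + queue name uniquely identifies a queue:
# together they form the queue's globally unique URL.
def queue_url(account_name: str, queue_name: str) -> str:
    return f"https://{account_name}.queue.core.windows.net/{queue_name}"

print(queue_url("company1", "orders"))
# https://company1.queue.core.windows.net/orders
```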

25
Q

True or false: when a client programmatically retrieves a message from a queue, the message is automatically deleted from the queue?

A

False
By design, messages are not automatically deleted from a queue after they are retrieved for processing. This helps ensure that every message is processed to completion. If a consumer application crashes during processing, the message is still available to be processed by a different instance of the consumer app.

The operations ‘get message’ and ‘delete message’ are separate.
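A minimal sketch of the get/delete separation (plain Python, not the Storage SDK): a retrieved message is only hidden for a visibility timeout, and it reappears if the consumer never deletes it.

```python
# Minimal sketch of the separate get/delete steps (plain Python, not the
# Storage SDK): getting a message hides it for a visibility timeout; only
# an explicit delete removes it, so a crashed consumer's message reappears.
import itertools

class SketchQueue:
    def __init__(self):
        self._messages = {}          # id -> body
        self._invisible = set()      # ids currently leased to a consumer
        self._ids = itertools.count()

    def put(self, body):
        self._messages[next(self._ids)] = body

    def get(self):
        for mid in self._messages:
            if mid not in self._invisible:
                self._invisible.add(mid)
                return mid, self._messages[mid]
        return None

    def visibility_timeout_expired(self, mid):
        self._invisible.discard(mid)  # message becomes retrievable again

    def delete(self, mid):
        self._messages.pop(mid, None)
        self._invisible.discard(mid)

q = SketchQueue()
q.put("order-1")
mid, body = q.get()                  # hidden, not deleted
q.visibility_timeout_expired(mid)    # consumer crashed; message reappears
print(q.get()[1])  # order-1
```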

26
Q

What is Durable Functions?

A: Durable Functions is an extension of Azure Functions that allows you to simplify complex stateful executions in a serverless environment.

B: Durable Functions is a logical container for a single workflow that you define using triggers and actions.

C: Durable Functions is a serverless compute service that enables you to run code on-demand without having to explicitly provision or manage infrastructure.

A

A is the correct answer.
Durable Functions is an extension of Azure Functions that allows you to simplify complex stateful executions in a serverless environment.

B is incorrect; this is a description of Logic Apps. Durable Functions is an extension of Azure Functions.

C is incorrect; this is a description of Azure Functions.

27
Q

Which of the following best describes the role of the Orchestrator function in a workflow?

A: It’s used as the basic unit of work (actions and tasks) in a durable function orchestration.

B: It’s the entry point for creating an instance of a Durable Functions orchestration.

C: It’s used for describing how actions are executed and the order in which actions are executed.

A

C:
The Orchestrator Function is written in code. The function is used for describing how actions are executed and the order in which actions are executed.

A is incorrect. The basic unit of work is the Activity function.
B is incorrect: The client function serves as the entry point for a Durable Functions orchestration.

28
Q

Which of the following best explains why the Human Interaction application pattern benefits from Durable Functions?

A: A manual process within an automated process because people aren’t as highly available and as responsive as computers.

B: It addresses the problem of coordinating the state of long-running operations with external clients.

C: It allows the output from one function to be applied to input of the next function in a series of function calls.

A

A: Human interaction can be incorporated using timeouts and compensation logic.

B is incorrect. You can use the Async HTTP APIs application pattern to handle this situation.

C is incorrect. Use the Function Chaining application pattern for this situation.

29
Q

Text description of An illustration of a social-media monitoring workflow.

This workflow triggers when a user posts a new tweet that mentions a specific product. It sends the text of the tweet through Text Analytics to determine sentiment. If the sentiment score is greater than 0.7, then a row containing the tweet is added to a database. Else, an email will be sent to customer support.

What would the behaviour be if the tweet was rated exactly 0.7?

A: Send email
B: Insert row
C: Both send email and insert row

A

A Send an email

The workflow sends an email any time the sentiment is less than or equal to 0.7.
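The branch can be sketched as a tiny routing function (the comparison is strictly greater-than, which is why a score of exactly 0.7 goes to email):

```python
# Sketch of the branch described above: the "insert row" path requires a
# score strictly greater than 0.7; everything else emails support.
def route(sentiment_score: float) -> str:
    return "insert row" if sentiment_score > 0.7 else "send email"

print(route(0.7))   # send email
print(route(0.71))  # insert row
```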

30
Q

An illustration of an email attachment processing workflow.

This workflow is triggered when a new email arrives. Next, there is a if statement that checks if the email has an attachment. If there are no attachments on the email, the workflow ends.
If there are attachments, the workflow creates a blob for the email body.
Next, a foreach loop creates a blob for every attachment.
Finally, an email is sent for review.

Examine the illustration of the email attachment workflow to answer this question. How many blobs would be created for an email with 6 attachments?

6
7
8

A

Answer: 7

One blob is created for the body of the email and one for each attachment.
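As a sketch of the count (the zero-attachment case ends the workflow before any blob is created):

```python
# Sketch of the blob count: one blob for the email body plus one per
# attachment; with no attachments the workflow ends without creating blobs.
def blob_count(attachment_count: int) -> int:
    if attachment_count == 0:
        return 0
    return 1 + attachment_count

print(blob_count(6))  # 7
```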

31
Q

When filtering message subscriptions in Service Bus,
given an order where the CorrelationId is used to store the priority of an order, which of the following would you implement to filter high-priority orders?

A: SQL Filter
B: False Filter
C: True Filter
D: Correlation Filter
E: No Filter
A

D Correlation Filter

Because the priority is stored in the CorrelationId property.

A SqlFilter is evaluated in the broker against the arriving messages’ user-defined properties and system properties, but the documentation says:

“Whenever possible, applications should choose correlation filters over SQL-like filters because they’re much more efficient in processing and have less impact on throughput.”

The documentation also says:

A CorrelationFilter holds a set of conditions that are matched against one or more of an arriving message’s user and system properties. A common use is to match against the CorrelationId property, but the application can also choose to match against the following properties:

    ContentType
    Label
    MessageId
    ReplyTo
    ReplyToSessionId
    SessionId
    To
    any user-defined properties.
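The matching rule can be sketched as follows (plain Python, not the Service Bus broker): a message matches a correlation filter only if every condition equals the corresponding message property.

```python
# Minimal sketch of correlation-filter matching (plain Python, not the
# Service Bus broker): every condition in the filter must equal the
# corresponding message property.
def correlation_match(filter_conditions: dict, message_properties: dict) -> bool:
    return all(message_properties.get(name) == value
               for name, value in filter_conditions.items())

high_priority = {"CorrelationId": "high"}

print(correlation_match(high_priority, {"CorrelationId": "high", "To": "orders"}))  # True
print(correlation_match(high_priority, {"CorrelationId": "low"}))                   # False
```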
32
Q

Your team has to develop an application that will be used to capture events from multiple IoT enabled devices. Your team is planning on using Azure Event Hubs for the ingestion of the events. As part of the requirement, you have to ensure that the events are persisted to Azure Blob Storage.

Which of the following Azure Event Hubs features would you use to persist the data onto Azure Blob Storage?

A. Throughput Units
B. Partition Keys
C. Event Hubs Capture
D. Event Streams

A

Answer – C : Event Hubs Capture

Data can be persisted from Azure Event Hubs onto Azure Blob Storage with the help of Event Hubs Capture.

The Microsoft documentation mentions the following:

Azure Event Hubs enables you to automatically capture the streaming data in Event Hubs in an Azure Blob storage or Azure Data Lake Storage Gen 1 or Gen 2 account of your choice, with the added flexibility of specifying a time or size interval.

Setting up Capture is fast, there are no administrative costs to run it, and it scales automatically with Event Hubs throughput units.

Event Hubs Capture is the easiest way to load streaming data into Azure, and enables you to focus on data processing rather than on data capture.

33
Q

You use Visual Studio to add an empty file named LogicApp.json to a project. You want to use this file to design a custom Logic App template.

You want to open this file in the Logic App Designer. However, the Open with Logic App Designer menu item is not available on the context menu.

You open the file in text mode and view the following:

{
	"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
	"contentVersion": "1.0.0.0",
	"parameters": {},
	"variables": {},
	"resources": [],
	"outputs": {}
}

You need to solve this problem.

What should you do?

A: Add the following content to the variables section
{
"type": "Microsoft.Logic/workflows"
}

B: Change deploymentTemplate.json to LogicAppTemplate.json

C: Add the following content to the resources section
{
"type": "Microsoft.Logic/workflows"
}

D: Remove the contents of the JSON file

A

C Add the content to the resources section

This creates a Logic App resource, allowing Visual Studio to recognize the deployment template as one that contains a Logic App.

You should not change deploymentTemplate.json - this is what allows Visual Studio to recognize that the data in the file is a deployment template.

You should not remove the contents from the JSON file - Visual Studio examines the resources section to determine which type of designer to display.

You should not add the content to the variables section. The variables section defines variables that can be passed to a deployment template; Visual Studio does not use this section to determine whether or not it can display the Logic App Designer.
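With option C applied, the template might look like the following sketch (the workflow name, apiVersion, location expression, and properties are illustrative):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {},
  "variables": {},
  "resources": [
    {
      "type": "Microsoft.Logic/workflows",
      "apiVersion": "2019-05-01",
      "name": "AdvertReviewWorkflow",
      "location": "[resourceGroup().location]",
      "properties": {
        "definition": {}
      }
    }
  ],
  "outputs": {}
}
```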

34
Q
What should we use for Parameter 1?
Heartbeat
| where TimeGenerated > ago(2d)
| [Parameter 1]
| where isnotempty(Computer)
| where LastHeartbeat < ago(1h)

A summarize LastHeartbeat = max(TimeGenerated) by Computer
B select Computer = MachineName

A

A
summarize LastHeartbeat = max(TimeGenerated) by Computer

Search the Heartbeat table for all events generated in the last two days.

Summarize the events by the last time each computer sent a heartbeat.

It then filters out empty results using isnotempty(Computer).
Finally, it uses the LastHeartbeat summary to keep only computers whose last heartbeat was more than one hour ago.

https://docs.microsoft.com/en-us/azure/azure-monitor/log-query/examples#find-stale-computer
https://docs.microsoft.com/en-us/azure/azure-monitor/log-query/log-query-overview

35
Q

What data does Azure Monitor collect?

A: Data from a variety of sources, such as the application event log, the operating system (Windows and Linux), Azure resources, and custom data sources
B: Azure billing details
C: Backups of database transaction logs

A

A - Data from a variety of sources

Azure Monitor does not collect information related to billing. You can use Azure Cost Management to track this information.

Azure Monitor does not perform backups of database transaction logs. You can use Azure Backup for backups of your systems.

36
Q

What two fundamental types of data does Azure Monitor collect?

A: Metrics and logs
B: Username and password
C: Email notifications and errors

A

A: Metrics and logs
Azure Monitor collects two types of data: metrics and logs. Metrics are numerical values that describe some aspect of a system at a particular time. Logs contain different kinds of data, such as event information, organized into records.

Username and password is incorrect.
You might be able to track which users are connecting to your system, but items such as passwords are not recorded.

Email notifications and errors is incorrect. Alert rules can send email notifications, but these are an output of monitoring, not one of the fundamental data types that Azure Monitor collects.

37
Q

You create a backend application that sends push notifications to mobile devices in Spain. You deploy both a production backend application and a test backend application. You want to test push notifications through both backend applications. You want to use Azure Notification Hubs to implement push notifications.

You need to determine the minimum number of hubs and namespaces to create.

How many namespaces and hubs should you create?

A: Two hubs and one namespace
B: One hub and two namespaces
C: Two hubs and two namespaces
D: One hub and one namespace

A

You should create two hubs and one namespace. A hub represents a push resource for one app.

In this scenario, one hub should represent the production backend application, while the other hub should represent the test backend application.

A namespace represents a collection of hubs for a specific region.

https://docs.microsoft.com/en-us/azure/notification-hubs/notification-hubs-push-notification-faq

38
Q

You have a script that issues the following HTTP request:

PUT https://company1.blob.core.windows.net/taxreturns/2019.pdf?comp=lease HTTP/1.1

x-ms-version: 2018-03-28
x-ms-lease-action: acquire
x-ms-lease-duration: 60
x-ms-proposed-lease-id: 18f12371-b41f-43e6-a153-e4b542f851c5
x-ms-date: Tue, 12 Mar 2019 10:23:27 GMT
Authorization: SharedKey company1:esSLMOYaK4o+xGTuKyeOLBI+xrnqi6aBniE4XI499+o=

You need to determine what this request does.

A: It sets properties on a blob named 2019.pdf that causes the blob to automatically be deleted after one hour.
B: It prevents the 2019.pdf blob from being deleted or overwritten for one minute.
C: It uploads a blob named 2019.pdf to the taxreturns container and times out after one minute.
D: It sets metadata on a blob named 2019.pdf that causes the blob to be read-write for only one hour.

A

Explanation

The request prevents the 2019.pdf blob from being deleted or overwritten for one minute. This is referred to as blob leasing. The comp query string of the PUT request URL is set to lease, which represents a blob lease request. Setting the x-ms-lease-action header to acquire causes a new lease to be requested or an existing lease to be extended. To extend an existing lease if it has not already expired, the x-ms-proposed-lease-id header is set to the existing lease ID, which is 18f12371-b41f-43e6-a153-e4b542f851c5. The x-ms-lease-duration header is set to 60, which leases the blob for 60 seconds (one minute).

The request does not set properties on a blob named 2019.pdf that causes the blob to automatically be deleted after one hour. You must write custom code to delete blobs after certain time periods.

The request does not upload a blob named 2019.pdf to the taxreturns container and time out after one minute. To upload a blob, you must specify content in the request body. In this scenario, no request body is present.

The request does not set metadata on a blob named 2019.pdf that causes the blob to be read-write for only one hour. You must write custom code to cause a blob to be modifiable for a specific time period.

39
Q

You are creating an order processing application. You want to allow multiple applications to be notified whenever a new order is placed. When one application processes an order, the order must be removed so that other applications will not attempt to process it.

You need to complete the code.

static async Task SendMessage(string connectionString, string entityPath, byte[] message)
{
  var client = new [Client Option](connectionString, entityPath);
  await client.SendAsync([Message Option]);
}

How should you complete the code?

Client Option:

A: QueueClient
B: EventHubClient
C: TopicClient

Message Option
A: message
B: new Message(message)
C: new EventData(message)

A

You should use the following code:

static async Task SendMessage(string connectionString, string entityPath, byte[] message)
{
  var client = new QueueClient(connectionString, entityPath);
  await client.SendAsync(new Message(message));
}

This code uses the QueueClient class to send a message to a Service Bus queue. A Service Bus queue allows a single client to receive messages sent to the queue. The SendAsync method of the QueueClient class accepts a Message instance, which represents the message being sent. The Message constructor accepts a byte array representing the message.

You should not create an instance of the TopicClient class. This class represents a Service Bus topic, which allows multiple subscribers to receive messages sent to a topic. A message is not automatically removed after one subscriber receives it.

You should not create an instance of the EventHubClient class. This class represents a connection to an Event Hub, not a Service Bus queue.

You should not pass a byte array as a parameter to the SendAsync method. You must wrap the byte array in a Message instance.

You should not pass an EventData instance to the SendAsync method. The EventData instance represents an event sent to an Event Hub, not a Service Bus queue.

https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-queues-topics-subscriptions
https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-dotnet-how-to-use-topics-subscriptions
https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-features

40
Q

You are creating an order processing application. You want to allow multiple applications to be notified whenever a new order is placed.

You need to complete the code.

static async Task SendMessage(string connectionString, string entityPath, byte[] message)
{
  var client = new [ClientParameter](connectionString, entityPath);
  await client.SendAsync([Message Parameter]);
}

How should you complete the code? To answer, select the appropriate code segments from the drop-down menus.

Client Option:

A: QueueClient
B: EventHubClient
C: TopicClient

Message Option
A: message
B: new Message(message)
C: new EventData(message)

A

You should use the following code:

static async Task SendMessage(string connectionString, string entityPath, byte[] message)
{
  var client = new TopicClient(connectionString, entityPath);
  await client.SendAsync(new Message(message));
}

This code uses the TopicClient class to send a message to a Service Bus topic. A Service Bus topic allows multiple applications to create subscriptions for receiving messages sent to the topic. The SendAsync method of the TopicClient class accepts a Message instance, which represents the message being sent. The Message constructor accepts a byte array representing the message.

You should not create an instance of the QueueClient class. This class represents a Service Bus queue, which allows only one client to retrieve a message from the queue. Once the message is retrieved, it is removed from the queue.

You should not create an instance of the EventHubClient class. This class represents a connection to an Event Hub, not a Service Bus topic. Also, to create an EventHubClient, you should use the CreateFromConnectionString method, not the class constructor.

You should not pass a byte array as a parameter to the SendAsync method. You must wrap the byte array in a Message instance.

You should not pass an EventData instance to the SendAsync method. The EventData instance represents an event sent to an Event Hub, not a Service Bus topic.
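
As a contrast with the queue example in the previous card, each receiving application would create its own subscription on the topic. A minimal receiving sketch using the same Microsoft.Azure.ServiceBus SDK (the connection string, topic path, and subscription name are placeholders):

```csharp
using System;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;

static void ReceiveOrders(string connectionString, string topicPath, string subscriptionName)
{
    // Each application uses its own subscription, so every subscriber
    // receives a copy of each message sent to the topic.
    var client = new SubscriptionClient(connectionString, topicPath, subscriptionName);

    client.RegisterMessageHandler(
        async (Message msg, CancellationToken ct) =>
        {
            Console.WriteLine(Encoding.UTF8.GetString(msg.Body));
            // AutoComplete is false, so the handler completes each message
            // explicitly once it has been processed.
            await client.CompleteAsync(msg.SystemProperties.LockToken);
        },
        new MessageHandlerOptions(args => Task.CompletedTask) { AutoComplete = false });
}
```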

41
Q

True or False

The following Logic App sequence is triggered when an email arrives.
It selects all attachments and stores them in storage in the same order in which they were added:

When a new email arrives:
Attachments
ForEach
Create Blob

A

False

You should use the When new email arrives trigger as the start of the Logic App. A trigger defines how the Logic App starts. The When new email arrives trigger monitors an email account for new messages.

You should use a For each action after that trigger. This allows the Logic App to loop through every attachment in every new message.

Within the For each action, you should select Attachments as the output from the previous step. This makes each retrieved attachment available for use in later steps of the Logic App.

Next you should use the Put a message on a queue action. This allows you to add the attachment to an Azure queue. Azure queues provide first-in, first-out (FIFO) storage, which meets the requirement in this scenario.

You should not use the Create a blob action. This creates an Azure blob, which also allows you to store binary data such as email attachments. However, blobs do not provide FIFO storage.

42
Q

True or False

You can store the instrumentation key for App Insights in a Key Vault secret and use it to record telemetry in an Azure Function App

A

False

Azure Function Apps do not read secrets from Key Vault directly; instead, you store a reference to the secret in the function app's application settings
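
As one illustration of the app-settings approach, on newer Function App runtimes an application setting can hold a Key Vault reference that the platform resolves for you, while the function code still reads a plain app setting. The vault name and secret name below are hypothetical:

```json
{
  "APPINSIGHTS_INSTRUMENTATIONKEY": "@Microsoft.KeyVault(SecretUri=https://contoso-vault.vault.azure.net/secrets/AppInsightsKey/)"
}
```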

43
Q

True or False

If you want to record metrics from an Azure Function App in App Insights, you should call TrackEvent on the TelemetryClient

A

False

You should call the Info method of the TraceWriter class. An instance of this class can be passed as a parameter to each function in a function app.

TrackEvent is used to log custom data from an ASP.NET or console application.
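
A minimal sketch of the TraceWriter pattern on the v1 Functions runtime, which injects the log parameter (the queue name here is hypothetical, and later runtime versions replace TraceWriter with ILogger):

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class OrderLogger
{
    // The v1 Functions runtime passes a TraceWriter instance to the function.
    public static void Run([QueueTrigger("orders")] string myQueueItem, TraceWriter log)
    {
        log.Info($"Processed order: {myQueueItem}");
    }
}
```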

44
Q

Which property in an Event Hub stream ensures messages from devices are delivered in order?

A

The partition key

Events that share a partition key are delivered to the same partition, and ordering is preserved within a partition.
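
A minimal sending sketch with the Microsoft.Azure.EventHubs SDK (the connection string, device ID, and payload are placeholders); using the device ID as the partition key keeps each device's events in order:

```csharp
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;

static async Task SendReading(string connectionString, string deviceId, string payload)
{
    // CreateFromConnectionString is the factory method noted in card 40.
    var client = EventHubClient.CreateFromConnectionString(connectionString);

    // Events sharing a partition key go to the same partition,
    // so each device's readings keep their relative order.
    await client.SendAsync(
        new EventData(Encoding.UTF8.GetBytes(payload)),
        partitionKey: deviceId);
}
```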