AZ-204: Developer Associate Flashcards

Exam question prep

1
Q

You need to deploy a containerized application to Azure. The application consists of multiple containers that need to communicate with each other. Which Azure service should you use?

A. Azure Container Instances (ACI)
B. Azure App Service
C. Azure Kubernetes Service (AKS)
D. Azure Functions

A

Answer: C. Azure Kubernetes Service (AKS)

Explanation: Azure Kubernetes Service (AKS) is designed for deploying, managing, and scaling containerized applications using Kubernetes. It’s ideal for multi-container applications that need container orchestration, scaling, and communication between containers.

2
Q

You are developing an application that processes images uploaded by users. The processing is triggered by new uploads to a blob storage container. Which Azure service should you use to implement this event-driven processing?

A. Azure Logic Apps
B. Azure Functions
C. Azure Service Bus
D. Azure Event Grid

A

Answer: B. Azure Functions

Explanation: Azure Functions is ideal for this scenario as it provides serverless compute with native triggers for blob storage events. When a new image is uploaded to the blob container, an Azure Function can be automatically triggered to process the image.

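A minimal sketch of such a function using the Azure Functions Python v2 programming model; the `images` container, connection setting name, and processing step are placeholder assumptions:

```python
import logging
import azure.functions as func

app = func.FunctionApp()

# Fires when a new blob lands in the "images" container
# ("images" and the connection setting name are placeholders).
@app.blob_trigger(arg_name="blob", path="images/{name}",
                  connection="AzureWebJobsStorage")
def process_image(blob: func.InputStream):
    logging.info("Processing uploaded image %s (%d bytes)", blob.name, blob.length)
    data = blob.read()  # image bytes; hand off to your image-processing code here
```
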
3
Q

You are developing an Azure Function App with multiple functions that share code. What is the best way to share code between these functions?

A. Use Azure DevOps Artifacts to manage shared libraries
B. Create a separate function for shared code and call it from other functions
C. Use shared code files in a common directory within the Function App
D. Store shared code in Azure Blob Storage and download it when needed

A

Answer: C. Use shared code files in a common directory within the Function App

Explanation: Azure Functions supports sharing code through common files placed in directories such as a ‘shared’ folder within the Function App. Functions within the same Function App can reference and use this code.

4
Q

You are deploying a web application to Azure App Service. The application needs to automatically scale based on CPU usage. Which feature should you configure?

A. Deployment slots
B. Auto-scaling
C. Always On
D. Application Insights

A

Answer: B. Auto-scaling

Explanation: Auto-scaling in Azure App Service allows you to automatically adjust the number of instances based on metrics like CPU usage. Deployment slots are for staging environments, Always On prevents the app from being unloaded due to inactivity, and Application Insights is for monitoring and diagnostics.

5
Q

You need to implement a microservice architecture where services communicate asynchronously. Which Azure service is most appropriate for this communication pattern?

A. Azure Storage Queues
B. Azure Redis Cache
C. Azure Service Bus
D. Azure API Management

A

Answer: C. Azure Service Bus

Explanation: Azure Service Bus is designed for enterprise messaging scenarios and supports asynchronous communication between microservices with features like message sessions, transactions, duplicate detection, and FIFO guarantees.

6
Q

You are developing a containerized application and need to store container images securely. Which Azure service should you use?

A. Azure Blob Storage
B. Azure Container Registry (ACR)
C. Azure Kubernetes Service (AKS)
D. Azure DevOps Artifacts

A

Answer: B. Azure Container Registry (ACR)

Explanation: Azure Container Registry is specifically designed for storing and managing container images with features like geo-replication, image scanning, and integration with Azure security features.

7
Q

Your company has a web application hosted in Azure App Service. You need to implement a staging environment for testing before deploying to production. Which App Service feature should you use?

A. Deployment slots
B. App Service Environment
C. Azure DevOps Pipelines
D. Scale-out

A

Answer: A. Deployment slots

Explanation: Deployment slots in Azure App Service provide a staging environment for your web app. You can deploy changes to a slot, test them, and then swap the slot with production, minimizing downtime and risk.

8
Q

You are developing a .NET application that will be deployed to Azure App Service. The application needs to run a background task that performs data processing. How should you implement this background task?

A. Use Azure Functions with a Timer trigger
B. Implement IHostedService in the application
C. Use WebJobs SDK
D. Any of the above, depending on specific requirements

A

Answer: D. Any of the above, depending on specific requirements

Explanation: All three options are valid ways to implement background processing in Azure. Azure Functions with a Timer trigger is good for serverless scenarios, IHostedService is a native .NET Core way to implement background tasks within the web app process, and WebJobs provide a way to run background tasks in the same context as the App Service.

9
Q

You are developing an Azure Function that will be triggered when a message appears in an Azure Service Bus queue. The function should process the message and then create an entry in Azure Cosmos DB. What is the recommended way to handle a Cosmos DB failure in the function?

A. Let the function fail and rely on Azure Functions’ automatic retry for Service Bus triggers
B. Implement a try-catch block and ignore the error
C. Implement a custom retry policy using Polly
D. Use Azure Logic Apps instead of Functions

A

Answer: A. Let the function fail and rely on Azure Functions’ automatic retry for Service Bus triggers

Explanation: Azure Functions with Service Bus triggers have built-in retry behavior. If the function throws (for example, because of a Cosmos DB error), the message is abandoned instead of completed, so Service Bus redelivers it until the queue’s maximum delivery count is reached, after which the message is moved to the dead-letter queue.

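A rough sketch of this pattern in a Python function app; the `orders` queue, database, container, and setting names are hypothetical, and the key point is that the exception is allowed to propagate:

```python
import json
import os
import azure.functions as func
from azure.cosmos import CosmosClient

app = func.FunctionApp()
cosmos = CosmosClient(os.environ["COSMOS_ENDPOINT"], os.environ["COSMOS_KEY"])
container = cosmos.get_database_client("orders-db").get_container_client("orders")

@app.service_bus_queue_trigger(arg_name="msg", queue_name="orders",
                               connection="ServiceBusConnection")
def process_order(msg: func.ServiceBusMessage):
    order = json.loads(msg.get_body().decode("utf-8"))
    # No try/except here on purpose: if this write throws, the runtime abandons
    # the Service Bus message and it is redelivered until the queue's max
    # delivery count is reached, then dead-lettered.
    container.upsert_item(order)
```
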
10
Q

You are developing a new API using Azure Functions. You need to ensure that the API can be managed and secured properly. Which service should you use?

A. Azure Front Door
B. Azure API Management
C. Azure Application Gateway
D. Azure Traffic Manager

A

Answer: B. Azure API Management

Explanation: Azure API Management is designed for managing APIs, providing features such as security, rate limiting, quotas, analytics, and a developer portal.

11
Q

You need to deploy a Docker container to Azure with minimal management overhead. The container doesn’t need to scale out to multiple instances. Which service should you use?

A. Azure Kubernetes Service (AKS)
B. Azure Container Instances (ACI)
C. Azure App Service with container support
D. Azure Container Registry (ACR)

A

Answer: B. Azure Container Instances (ACI)

Explanation: Azure Container Instances provides the simplest way to run a container in Azure without having to manage virtual machines or adopt a higher-level service. ACI is ideal for simple applications, batch jobs, or task automation.

12
Q

You are deploying a CPU-intensive application to Azure App Service. You want to ensure the application has enough resources during peak usage times. Which action should you take?

A. Implement connection pooling
B. Scale up to a higher pricing tier
C. Enable auto-scaling
D. Enable local cache

A

Answer: B. Scale up to a higher pricing tier

Explanation: Scaling up (vertical scaling) to a higher pricing tier provides more CPU, memory, and disk resources per instance, which is beneficial for CPU-intensive applications.

13
Q

You are developing a new API with Azure Functions and need to allow cross-origin resource sharing (CORS). Where should you configure CORS settings?

A. In the function code using response headers
B. In the Azure portal under Function App CORS settings
C. In the host.json file
D. In the function.json file

A

Answer: B. In the Azure portal under Function App CORS settings

Explanation: For Azure Functions, CORS can be configured at the Function App level in the Azure portal under the ‘CORS’ section of the Function App settings. This is the recommended approach for enabling cross-origin requests to your function API.

14
Q

You need to implement a solution that runs a container whenever a new message is received in an Azure Storage Queue. The solution should minimize management overhead. What should you use?

A. Azure Container Instances with Event Grid
B. Azure Kubernetes Service with KEDA
C. Azure Functions with a Docker container
D. Azure Logic Apps with a container action

A

Answer: C. Azure Functions with a Docker container

Explanation: Azure Functions supports custom containers and provides native triggers for Azure Storage Queues. By using a container-based Azure Function with a Queue trigger, you get serverless scaling with minimal management overhead.

15
Q

You have a web application that needs to process background tasks. The tasks are CPU-intensive and can take up to 10 minutes to complete. Which Azure service is best suited for these background tasks?

A. Azure Functions on Consumption Plan
B. Azure Functions on Premium Plan
C. Azure Logic Apps
D. Azure App Service WebJobs

A

Answer: B. Azure Functions on Premium Plan

Explanation: Azure Functions on the Premium plan supports much longer executions (a 30-minute default timeout that can be extended, versus a 10-minute maximum on the Consumption plan) and runs on more powerful, pre-warmed instances better suited to CPU-intensive workloads.

16
Q

You are developing an Azure Function App that needs to connect to an on-premises SQL Server database. Which networking solution should you use?

A. Public endpoint with IP restrictions
B. VNet Integration
C. Hybrid Connections
D. VPN Gateway

A

Answer: C. Hybrid Connections

Explanation: Hybrid Connections is a feature of Azure App Service (including Function Apps) that enables secure communication with on-premises resources without requiring changes to your corporate network infrastructure.

17
Q

You are developing an Azure Function that needs to be triggered when a new user is created in Azure Active Directory. Which trigger should you use?

A. HTTP trigger with a webhook
B. Queue trigger
C. Event Grid trigger
D. Timer trigger with Microsoft Graph API calls

A

Answer: C. Event Grid trigger

Explanation: Azure Active Directory events, such as user creation, can be delivered to Azure Event Grid through Microsoft Graph API change notifications surfaced as Event Grid partner events. Using an Event Grid trigger in your Azure Function lets you react to these events directly instead of polling the directory.

18
Q

You are implementing a solution that needs to run a batch process every hour. The process must be executed exactly once per hour, even if the previous run is still in progress. Which Azure service is best suited for this requirement?

A. Azure Functions with a Timer trigger
B. Azure Logic Apps with a recurrence trigger
C. Azure Batch
D. Azure Automation

A

Answer: D. Azure Automation

Explanation: Azure Automation with scheduled runbooks provides job scheduling with the ability to ensure that new job instances start regardless of whether previous jobs have completed.

19
Q

You are developing an Azure Function that will be triggered by an HTTP request. You need to secure the function so that only authenticated users can access it. Which authentication method should you use?

A. Function Keys
B. Azure Active Directory
C. Client Certificates
D. Any of the above is valid, depending on requirements

A

Answer: D. Any of the above is valid, depending on requirements

Explanation: Azure Functions support multiple authentication methods, and the best choice depends on specific requirements. Function Keys provide a simple API key approach, Azure Active Directory provides identity-based auth with support for users and service principals, and Client Certificates provide certificate-based authentication.

20
Q

You are developing a stateful application that will be deployed to Azure. The application needs to maintain session state across multiple instances. Which service should you use to store the session state?

A. Azure Blob Storage
B. Azure Redis Cache
C. In-memory session state with sticky sessions
D. Azure Cosmos DB

A

Answer: B. Azure Redis Cache

Explanation: Azure Redis Cache is specifically designed for scenarios like distributed session management. It provides high-performance, in-memory storage with features designed for session state scenarios.

21
Q

You are developing an application that needs to store large amounts of unstructured data. The data will be accessed infrequently but needs to be retained for compliance reasons. Which storage tier should you use?

A. Hot
B. Cool
C. Archive
D. Premium

A

Answer: C. Archive

Explanation: Archive storage tier is designed for data that will be rarely accessed and stored for at least 180 days, with lower storage costs and higher retrieval costs. It’s ideal for long-term retention for compliance reasons.

22
Q

You need to implement a solution that allows multiple applications to read and process messages in order. Which Azure storage service should you use?

A. Azure Table Storage
B. Azure Cosmos DB
C. Azure Queue Storage
D. Azure Blob Storage

A

Answer: C. Azure Queue Storage

Explanation: Azure Queue Storage provides a messaging solution for communication between application components. While basic Queue Storage doesn’t guarantee strict ordering by default, you can implement patterns to ensure ordered processing.

23
Q

You need to migrate a MongoDB database to Azure with minimal code changes. Which Azure service should you use?

A. Azure SQL Database
B. Azure Database for PostgreSQL
C. Azure Cosmos DB API for MongoDB
D. Azure Table Storage

A

Answer: C. Azure Cosmos DB API for MongoDB

Explanation: Azure Cosmos DB API for MongoDB provides MongoDB compatibility, allowing applications to connect using the MongoDB protocol with minimal code changes.

24
Q

You are developing an application that needs to read and write data to Azure Blob Storage. Which of the following is the correct approach to authenticate the application?

A. Use a connection string with Shared Key authentication
B. Use Azure Active Directory (AAD) with a Managed Identity
C. Use a SAS (Shared Access Signature) token
D. All of the above are valid approaches

A

Answer: D. All of the above are valid approaches

Explanation: All three authentication methods are valid depending on the scenario. Shared Key authentication uses account keys, Managed Identities provide AAD-based authentication without secrets, and SAS tokens provide limited access with specific permissions and expiration times.

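A short Python sketch of the three approaches with the `azure-storage-blob` and `azure-identity` SDKs; account, container, blob, and key values are placeholders:

```python
from datetime import datetime, timedelta, timezone
from azure.identity import DefaultAzureCredential
from azure.storage.blob import (BlobSasPermissions, BlobServiceClient,
                                generate_blob_sas)

# 1) Shared Key via connection string (simple, but the key grants full access).
svc = BlobServiceClient.from_connection_string("<connection-string>")

# 2) Azure AD with a managed identity or developer credential (no secrets in code).
svc_aad = BlobServiceClient("https://<account>.blob.core.windows.net",
                            credential=DefaultAzureCredential())

# 3) A SAS token scoped to one blob, read-only, expiring in one hour.
sas = generate_blob_sas(account_name="<account>", container_name="reports",
                        blob_name="summary.pdf", account_key="<account-key>",
                        permission=BlobSasPermissions(read=True),
                        expiry=datetime.now(timezone.utc) + timedelta(hours=1))
```
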
25
Q

You need to implement optimistic concurrency when updating data in Azure Blob Storage. Which property should you use?

A. ETag
B. ContentMD5
C. LastModified
D. BlobTier

A

Answer: A. ETag

Explanation: ETag (Entity Tag) is used for optimistic concurrency in Azure Blob Storage. When you retrieve a blob, you get its ETag. When updating, you can specify the ETag in the request condition, and the update will only succeed if the blob's current ETag matches the one specified.

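A minimal sketch of an ETag-conditional write with the Python `azure-storage-blob` SDK; the blob name and new content are illustrative placeholders:

```python
from azure.core import MatchConditions
from azure.core.exceptions import ResourceModifiedError
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string("<connection-string>",
                                         container_name="docs",
                                         blob_name="settings.json")

props = blob.get_blob_properties()      # capture the current ETag
new_content = b'{"version": 2}'         # placeholder for the edited content

try:
    # The write succeeds only if the blob's ETag still matches what we read.
    blob.upload_blob(new_content, overwrite=True,
                     etag=props.etag,
                     match_condition=MatchConditions.IfNotModified)
except ResourceModifiedError:
    print("Blob was changed by someone else; re-read it and retry the update.")
```
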
26
Q

You are developing an application that uses Azure Cosmos DB. You need to ensure that all writes to the database are automatically expired after 30 days. Which feature should you use?

A. Stored Procedures
B. Time-to-Live (TTL)
C. Change Feed
D. Indexing Policy

A

Answer: B. Time-to-Live (TTL)

Explanation: Time-to-Live (TTL) in Azure Cosmos DB allows you to set an expiration time for items in a container. After the specified period elapses, the items are automatically deleted.

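For illustration, a sketch of enabling a 30-day TTL when creating a container with the Python `azure-cosmos` SDK; the database, container, and partition key names are assumptions:

```python
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient("<account-endpoint>", credential="<account-key>")
db = client.create_database_if_not_exists("telemetry-db")

# default_ttl is in seconds; 30 days = 2,592,000 seconds. Items are removed
# automatically that long after their last write unless they set their own ttl.
container = db.create_container_if_not_exists(
    id="events",
    partition_key=PartitionKey(path="/deviceId"),
    default_ttl=30 * 24 * 60 * 60,
)
```
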
27

Q

Your application needs to store structured data with strong consistency guarantees and support for SQL queries. Which Azure storage service should you use?

A. Azure Cosmos DB with SQL API
B. Azure Blob Storage
C. Azure Table Storage
D. Azure Files

A

Answer: A. Azure Cosmos DB with SQL API

Explanation: Azure Cosmos DB with SQL API provides structured data storage with customizable consistency levels (including strong consistency) and support for SQL-like queries.

28

Q

You are developing an application that needs to perform point reads and range queries on a large dataset. You need to optimize for read performance with the lowest latency. Which Azure Cosmos DB consistency level should you choose?

A. Strong
B. Bounded staleness
C. Session
D. Eventual

A

Answer: D. Eventual

Explanation: Eventual consistency provides the lowest latency and highest throughput for reads at the cost of possibly returning stale data. For scenarios where absolute consistency is not critical and read performance is paramount, eventual consistency is the optimal choice.

29

Q

You need to implement a solution for storing and processing large binary files in Azure. The files will be accessed randomly, not sequentially. Which storage option is most appropriate?

A. Azure Blob Storage - Block Blobs
B. Azure Blob Storage - Page Blobs
C. Azure Blob Storage - Append Blobs
D. Azure Files

A

Answer: B. Azure Blob Storage - Page Blobs

Explanation: Page Blobs are optimized for random read/write operations, making them suitable for scenarios where you need to access portions of a large file randomly.

30

Q

You are developing an application that will store customer data in Azure Cosmos DB. You expect high read and write throughput with data distributed globally. Which partition key strategy is best?

A. Use a unique identifier like customerID
B. Use a timestamp field
C. Use a commonly queried property with high cardinality
D. Use a constant value

A

Answer: A. Use a unique identifier like customerID

Explanation: Using a unique identifier like customerID as a partition key ensures even distribution of data and operations across physical partitions, which is optimal for high throughput and scale.

31

Q

Your application uses Azure Cosmos DB with the SQL API. You need to retrieve multiple items efficiently using their IDs. Which approach should you use?

A. Multiple individual point reads
B. A single query with a WHERE IN clause
C. A stored procedure that performs multiple reads
D. A query using the CONTAINS function

A

Answer: B. A single query with a WHERE IN clause

Explanation: A single query with a WHERE IN clause (e.g., `SELECT * FROM c WHERE c.id IN ("id1", "id2", "id3")`) is the most efficient way to retrieve multiple items by their IDs in a single operation.

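A parameterized version of that query with the Python `azure-cosmos` SDK, using placeholder database and container names:

```python
from azure.cosmos import CosmosClient

container = (CosmosClient("<account-endpoint>", credential="<account-key>")
             .get_database_client("sales-db")
             .get_container_client("orders"))

# One round trip retrieves all three items instead of three point reads.
items = container.query_items(
    query="SELECT * FROM c WHERE c.id IN (@id1, @id2, @id3)",
    parameters=[{"name": "@id1", "value": "id1"},
                {"name": "@id2", "value": "id2"},
                {"name": "@id3", "value": "id3"}],
    enable_cross_partition_query=True,
)
for item in items:
    print(item["id"])
```
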
32

Q

You are developing an application that needs to perform bulk inserts into Azure Cosmos DB. You want to minimize Request Unit (RU) consumption. Which approach should you use?

A. Use multiple single-document writes
B. Use the bulk executor library
C. Use a stored procedure for batch writes
D. Either B or C depending on specific requirements

A

Answer: D. Either B or C depending on specific requirements

Explanation: Both the bulk executor library and stored procedures can efficiently handle bulk inserts in Cosmos DB with lower RU consumption compared to individual writes. The bulk executor library is optimized for high-throughput operations from client applications, while stored procedures execute server-side and can perform multiple operations in a single transaction.

33

Q

You are developing an application that uses Azure Blob Storage. You need to implement a solution that prevents deletion of blobs for a specific period of time. Which feature should you use?

A. Shared Access Signatures with expiry time
B. Immutable storage with time-based retention policy
C. Soft delete
D. Lease blob operation

A

Answer: B. Immutable storage with time-based retention policy

Explanation: Immutable storage with time-based retention policy provides WORM (Write Once, Read Many) capabilities that prevent deletion or modification of data for a specified retention period.

34

Q

You are developing an application that needs to store large amounts of semi-structured data with automatic indexing. The data will be queried using various properties. Which Azure service should you use?

A. Azure Blob Storage
B. Azure Table Storage
C. Azure Cosmos DB
D. Azure SQL Database

A

Answer: C. Azure Cosmos DB

Explanation: Azure Cosmos DB is designed for semi-structured data with automatic indexing on all properties by default, enabling efficient querying across any property.

35

Q

You need to implement a solution for efficient caching of frequently accessed data in your Azure application. The solution must support complex data structures and have high throughput. Which service should you use?

A. Azure CDN
B. Azure Front Door
C. Azure Redis Cache
D. In-memory caching with IMemoryCache

A

Answer: C. Azure Redis Cache

Explanation: Azure Redis Cache provides high-throughput, low-latency caching with support for complex data structures, making it ideal for application caching scenarios.

36

Q

You are implementing a solution to copy large amounts of data from an on-premises SQL Server to Azure Blob Storage. Which service should you use?

A. Azure Data Factory
B. AzCopy
C. Azure Backup
D. Azure Databox

A

Answer: A. Azure Data Factory

Explanation: Azure Data Factory is a cloud-based ETL and data integration service that allows you to create data-driven workflows for orchestrating data movement between various data stores, including from on-premises SQL Server to Azure Blob Storage.

37

Q

You are developing an application that uses Azure Storage. You need to ensure that data is encrypted during transfer. What should you do?

A. Enable Azure Storage encryption
B. Use client-side encryption
C. Enable HTTPS for storage service
D. Use Azure Key Vault to store encryption keys

A

Answer: C. Enable HTTPS for storage service

Explanation: Azure Storage uses HTTPS to encrypt data in transit. By ensuring that your application only connects via HTTPS (which is the default and recommended approach), you ensure data is encrypted during transfer.

38

Q

You are developing an application that needs to efficiently upload large files to Azure Blob Storage in parallel. Which API should you use?

A. Put Blob
B. Put Block List
C. Put Page
D. Put Append Block

A

Answer: B. Put Block List

Explanation: For uploading large files efficiently, the recommended approach is to split the file into blocks, upload each block in parallel using Put Block, and then commit all blocks using Put Block List.

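A sketch of the block upload flow with the Python `azure-storage-blob` SDK; the file and container names are placeholders:

```python
import uuid
from azure.storage.blob import BlobBlock, BlobClient

blob = BlobClient.from_connection_string("<connection-string>",
                                         container_name="videos",
                                         blob_name="large-file.bin")

block_ids = []
with open("large-file.bin", "rb") as f:
    while chunk := f.read(4 * 1024 * 1024):                # 4 MiB per block
        block_id = uuid.uuid4().hex
        blob.stage_block(block_id=block_id, data=chunk)    # Put Block
        block_ids.append(BlobBlock(block_id=block_id))

# Put Block List: commits the staged blocks, in order, as the final blob.
# The stage_block calls above can also be issued from a thread pool so the
# blocks upload in parallel before the single commit.
blob.commit_block_list(block_ids)
```
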
39

Q

You need to implement a solution for storing configuration data that needs to be accessed by multiple services. The configuration should be versioned and support feature flags. Which Azure service should you use?

A. Azure Blob Storage
B. Azure Key Vault
C. Azure App Configuration
D. Azure Cosmos DB

A

Answer: C. Azure App Configuration

Explanation: Azure App Configuration is a service specifically designed for storing and managing application configuration data with features like versioning, feature flags, and hierarchical keys.

40

Q

You are developing an application that uses Azure Cosmos DB. You need to ensure that a query is executed against the secondary replicas to reduce the load on the primary replica. Which consistency level should you use?

A. Strong
B. Bounded Staleness
C. Session
D. Eventual

A

Answer: D. Eventual

Explanation: Eventual consistency allows reads from secondary replicas, which reduces the load on the primary replica and provides lower latency at the cost of potentially returning stale data.

41

Q

You need to securely store application secrets for an Azure web application. Which service should you use?

A. Azure Storage
B. Azure App Configuration
C. Azure Key Vault
D. Azure SQL Database

A

Answer: C. Azure Key Vault

Explanation: Azure Key Vault is specifically designed for securely storing and accessing secrets, keys, and certificates. It provides features like access control, auditing, and hardware security module (HSM) backing.

42

Q

You are implementing authentication for an API deployed to Azure API Management. Which of the following methods can you use to secure the API? (Select all that apply)

A. OAuth 2.0
B. Client certificates
C. Azure Active Directory
D. Basic authentication

A

Answer: A, B, C, D (All of the above)

Explanation: Azure API Management supports multiple authentication methods including OAuth 2.0, client certificates (mutual TLS), Azure Active Directory integration, and basic authentication.

43

Q

You need to grant an Azure Function access to an Azure Key Vault without storing any credentials in the application settings. What should you use?

A. Service Principal
B. Managed Identity
C. Shared Access Signature
D. Connection String

A

Answer: B. Managed Identity

Explanation: Managed Identity provides an automatically managed identity in Azure AD for your Azure service (like Azure Functions). With a Managed Identity, the function can authenticate to Key Vault without any credentials stored in the application itself or its configuration.

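A minimal Python sketch, assuming a hypothetical vault name and secret name:

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# DefaultAzureCredential picks up the Function App's managed identity when
# running in Azure (and falls back to developer credentials locally), so no
# secret or connection string appears in application settings.
client = SecretClient(vault_url="https://<vault-name>.vault.azure.net",
                      credential=DefaultAzureCredential())
db_password = client.get_secret("DatabasePassword").value
```
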
44

Q

You are developing a web application that will be deployed to Azure App Service. The application needs to access Azure Key Vault. You need to implement authentication between the application and Key Vault with minimal configuration. Which approach should you use?

A. Use a service principal with a client secret
B. Use the App Service Managed Identity
C. Use Azure AD B2C
D. Use a SAS token

A

Answer: B. Use the App Service Managed Identity

Explanation: App Service Managed Identity provides an automatically managed identity for Azure resources in Azure AD. It eliminates the need to manage credentials. By enabling Managed Identity for your App Service and granting it access to Key Vault, the application can authenticate without additional credential configuration.

45

Q

You are developing an application that needs to authenticate users from multiple identity providers, including social media accounts. Which Azure service should you use?

A. Azure Active Directory (Azure AD)
B. Azure Active Directory B2C (Azure AD B2C)
C. Azure Active Directory Domain Services
D. App Service Authentication

A

Answer: B. Azure Active Directory B2C (Azure AD B2C)

Explanation: Azure AD B2C is specifically designed for customer-facing applications needing support for multiple identity providers including social accounts (like Google, Facebook, etc.).

46

Q

You are implementing CORS for an Azure Function API. Where should you configure the CORS settings?

A. In the function code using middleware
B. In the host.json file
C. In the Azure portal under Function App CORS settings
D. In the Azure API Management policy

A

Answer: C. In the Azure portal under Function App CORS settings

Explanation: For Azure Functions, CORS can be configured at the Function App level in the Azure portal under the 'CORS' section of the Function App settings. This is the recommended approach for enabling cross-origin requests to your function API.

47

Q

You need to implement a secure solution for allowing users to upload files directly to Azure Blob Storage from a browser. Which approach should you use?

A. Provide the storage account key in the client-side code
B. Generate a Shared Access Signature (SAS) with limited permissions
C. Use a service principal with a client secret
D. Configure anonymous access for the blob container

A

Answer: B. Generate a Shared Access Signature (SAS) with limited permissions

Explanation: Shared Access Signatures provide time-limited access with specific permissions to Azure Storage resources. For browser-based uploads, you should generate a SAS token server-side with minimal required permissions (e.g., write-only to a specific path) and a short expiration time.

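A server-side sketch in Python that issues such a token; the account, container, and blob names are placeholders:

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

def create_upload_sas(account: str, key: str, container: str, blob_name: str) -> str:
    """Issue a write-only SAS for one blob, valid for 15 minutes."""
    sas = generate_blob_sas(
        account_name=account,
        container_name=container,
        blob_name=blob_name,
        account_key=key,
        permission=BlobSasPermissions(write=True, create=True),
        expiry=datetime.now(timezone.utc) + timedelta(minutes=15),
    )
    # The browser PUTs the file directly to this URL; the account key never
    # leaves the server.
    return f"https://{account}.blob.core.windows.net/{container}/{blob_name}?{sas}"
```
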
48

Q

You are developing a multi-tenant SaaS application using Azure AD for authentication. You need to ensure that users can only access data belonging to their organization. Which approach should you use?

A. Implement role-based access control (RBAC)
B. Use separate storage accounts for each tenant
C. Implement application-level multitenancy with security filtering
D. Use Azure AD B2C instead of Azure AD

A

Answer: C. Implement application-level multitenancy with security filtering

Explanation: In a multi-tenant SaaS application authenticated with Azure AD, implementing application-level multitenancy with security filtering (using the user's tenant ID or organization ID from their claims) ensures that users can only access data belonging to their organization.

49

Q

You are developing an application that generates sensitive reports. You need to ensure that the reports can only be accessed by authorized users and that access is revoked after 24 hours. Which approach should you use?

A. Store reports in Azure Blob Storage with a time-limited SAS token
B. Use Azure Key Vault to store report encryption keys with access policies
C. Implement custom authorization checks in your application code
D. Use Azure Information Protection to classify and protect reports

A

Answer: A. Store reports in Azure Blob Storage with a time-limited SAS token

Explanation: Storing reports in Azure Blob Storage and providing access using SAS tokens with a 24-hour expiration time is the most straightforward way to implement time-limited access to content.

50

Q

You are developing a web application that will be deployed to multiple Azure regions for high availability. You need to ensure that SSL/TLS certificates are managed centrally and automatically renewed. Which approach should you use?

A. Manually create and upload certificates to each App Service
B. Use App Service Managed Certificates
C. Use Azure Key Vault with certificate auto-rotation
D. Use Let's Encrypt certificates with a custom renewal script

A

Answer: C. Use Azure Key Vault with certificate auto-rotation

Explanation: Azure Key Vault provides centralized certificate management with features like auto-rotation, which automatically renews certificates before expiration. By integrating App Service with Key Vault, you can ensure that all regional deployments use the same certificates without manual intervention.

51

Q

You are developing an application that needs to securely store connection strings and API keys. The security team requires that these secrets are never exposed in plaintext in the application's configuration or code. Which approach should you use?

A. Store secrets in Azure Key Vault and use Key Vault references in App Settings
B. Encrypt secrets in the web.config file
C. Use environment variables to store secrets
D. Hard-code secrets in a separate, access-controlled code file

A

Answer: A. Store secrets in Azure Key Vault and use Key Vault references in App Settings

Explanation: Storing secrets in Azure Key Vault and using Key Vault references in application settings (e.g., @Microsoft.KeyVault(SecretUri=...)) ensures that secrets are never exposed in plaintext in the application configuration.

52

Q

You need to implement row-level security in an Azure SQL Database to ensure that users can only access data related to their department. Which approach should you use?

A. Create separate tables for each department
B. Implement views with WHERE clauses for each department
C. Use SQL Database row-level security with SESSION_CONTEXT
D. Use application-level filtering based on the user's department

A

Answer: C. Use SQL Database row-level security with SESSION_CONTEXT

Explanation: Azure SQL Database row-level security allows you to implement access restrictions at the database level using security predicates. By setting SESSION_CONTEXT with the user's department and creating a security policy that filters rows based on this context, you can enforce row-level security consistently across all application access.

53

Q

You are developing an application that needs to encrypt sensitive data before storing it in Azure Blob Storage. The encryption keys must be managed by your organization (not Microsoft). Which approach should you use?

A. Enable Azure Storage Service Encryption with Microsoft-managed keys
B. Enable Azure Storage Service Encryption with customer-managed keys in Azure Key Vault
C. Implement client-side encryption using the Azure Storage client library
D. Either B or C depending on specific requirements

A

Answer: D. Either B or C depending on specific requirements

Explanation: Both customer-managed keys for Storage Service Encryption and client-side encryption allow your organization to manage the encryption keys. With customer-managed keys, Azure encrypts/decrypts the data using your keys in Key Vault. With client-side encryption, your application encrypts data before sending it to Azure.

54

Q

You are developing a web application that uses Azure Key Vault to store secrets. You need to ensure that the application can still function during short network outages or Key Vault service interruptions. Which approach should you use?

A. Store a backup copy of all secrets in Azure Blob Storage
B. Implement caching of secrets in the application memory
C. Use the Azure App Configuration service instead of Key Vault
D. Implement a retry policy with exponential backoff

A

Answer: B. Implement caching of secrets in the application memory

Explanation: Caching secrets in application memory (securely, using appropriate data protection) allows the application to continue functioning during short network outages or Key Vault service interruptions.

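One possible in-memory caching sketch in Python, with a hypothetical one-hour refresh window and placeholder vault name:

```python
import time
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

_client = SecretClient(vault_url="https://<vault-name>.vault.azure.net",
                       credential=DefaultAzureCredential())
_cache: dict[str, tuple[str, float]] = {}   # name -> (value, fetched_at)
_TTL_SECONDS = 3600                         # refresh at most once an hour

def get_secret(name: str) -> str:
    cached = _cache.get(name)
    if cached and time.monotonic() - cached[1] < _TTL_SECONDS:
        return cached[0]
    try:
        value = _client.get_secret(name).value
        _cache[name] = (value, time.monotonic())
        return value
    except Exception:
        # Key Vault unreachable: fall back to the last cached value if we have one.
        if cached:
            return cached[0]
        raise
```
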
55

Q

You need to implement a solution for encrypting sensitive configuration values in your Azure App Service application. The solution must ensure that even administrators cannot view the decrypted values. Which approach should you use?

A. Use Azure Key Vault with Key Vault references
B. Use the Microsoft.AspNetCore.DataProtection API
C. Store encrypted values in application settings
D. Use Azure App Configuration with managed identities

A

Answer: A. Use Azure Key Vault with Key Vault references

Explanation: Azure Key Vault with Key Vault references in application settings ensures that sensitive values are never exposed in plaintext in the application configuration. With proper access policies, even administrators cannot view the actual secret values in Key Vault, only manage them.

56

Q

You are implementing Azure AD authentication for an API deployed to Azure API Management. You need to ensure that only specific client applications can call the API. Which feature should you use?

A. Subscription keys
B. OAuth 2.0 client credentials
C. Application ID (client ID) validation
D. Client certificate authentication

A

Answer: C. Application ID (client ID) validation

Explanation: To restrict API access to specific client applications registered in Azure AD, you should validate the Application ID (client ID) of the calling application. This can be done using JWT validation in API Management policies to ensure that only tokens issued to specific client applications are accepted.

57

Q

You are developing a microservices application where services need to securely communicate with each other. The services are deployed to Azure Kubernetes Service (AKS). Which approach should you use for service-to-service authentication?

A. Use shared access keys distributed to all services
B. Implement Kubernetes Service Accounts with Azure AD integration
C. Use client certificates for mutual TLS authentication
D. Implement a service mesh like Istio for automatic mTLS

A

Answer: D. Implement a service mesh like Istio for automatic mTLS

Explanation: A service mesh like Istio provides automatic mutual TLS (mTLS) between services, handling certificate management, identity verification, and encryption without code changes.

58

Q

You are implementing a solution to protect sensitive data in Azure Cosmos DB. You need to ensure that specific properties of documents are encrypted. Which approach should you use?

A. Enable Azure Cosmos DB server-side encryption
B. Implement client-side encryption for sensitive properties
C. Use Azure Information Protection to protect documents
D. Store sensitive data in Azure Key Vault instead of Cosmos DB

A

Answer: B. Implement client-side encryption for sensitive properties

Explanation: To encrypt specific properties of documents in Cosmos DB, you should implement client-side encryption. This allows you to selectively encrypt only sensitive fields before storing them, while keeping other properties in plaintext for querying.

59

Q

You are developing an application that needs to share data securely between multiple Azure regions. The data includes sensitive personal information that must be encrypted during transmission. Which approach should you use?

A. Use Azure Storage with geo-replication enabled
B. Implement Azure Private Link for cross-region access
C. Use Azure Event Hub with encrypted connections
D. Configure Azure Virtual Network peering with IPsec

A

Answer: B. Implement Azure Private Link for cross-region access

Explanation: Azure Private Link enables private and secure connectivity between Azure services across regions, keeping data on the Microsoft network and avoiding exposure to the public internet.

60

Q

You need to implement a solution for securing access to an Azure Cosmos DB account from an Azure Function. The solution must meet the following requirements: minimize management overhead, avoid storing secrets, and limit access to specific containers. Which approach should you use?

A. Use a connection string with an account key
B. Use a Managed Identity with RBAC and scope to specific containers
C. Generate a SAS token for each function execution
D. Create a custom role with least privilege access

A

Answer: B. Use a Managed Identity with RBAC and scope to specific containers

Explanation: Using a Managed Identity with Azure RBAC (Role-Based Access Control) for Cosmos DB allows you to grant the function access to specific containers without storing secrets. This minimizes management overhead as the identity is automatically managed, and you can limit access scope using RBAC roles at the container level.

61

Q

You need to monitor an Azure web application for performance issues and exceptions. Which service should you use?

A. Azure Monitor
B. Azure Application Insights
C. Azure Log Analytics
D. Azure Security Center

A

Answer: B. Azure Application Insights

Explanation: Azure Application Insights is specifically designed for monitoring and diagnosing web applications, providing features for tracking performance metrics, exceptions, dependencies, and user behavior.

62

Q

You are developing an Azure Function and need to view execution logs for debugging. Where can you access these logs in real-time?

A. Azure Application Insights
B. Azure Monitor Metrics
C. Function App logs (Log stream)
D. Azure Storage Account

A

Answer: C. Function App logs (Log stream)

Explanation: The Log stream feature in the Azure portal provides real-time streaming of function execution logs, making it ideal for immediate debugging.

63

Q

Your web application deployed in Azure App Service is experiencing high memory usage. Which metric should you monitor to identify the issue?

A. CPU Percentage
B. Data In
C. Http Queue Length
D. Memory Working Set

A

Answer: D. Memory Working Set

Explanation: Memory Working Set is the memory metric that indicates the current memory usage of your application in Azure App Service. Monitoring this metric will help identify memory issues and potential leaks.

64

Q

You need to ensure your Azure Function scales efficiently under high load. Which hosting plan should you use?

A. Consumption Plan
B. Premium Plan
C. App Service Plan
D. Dedicated Plan

A

Answer: A. Consumption Plan

Explanation: The Consumption Plan for Azure Functions provides automatic scaling based on the number of incoming events, with instances added and removed dynamically. This plan is ideal for variable or unpredictable workloads as it scales efficiently under high load and scales to zero when there's no traffic.

65

Q

You are optimizing an Azure Cosmos DB implementation. Which of the following will help reduce the request unit (RU) consumption?

A. Increasing the indexing paths
B. Using point reads instead of queries
C. Using cross-partition queries
D. Increasing the page size of results

A

Answer: B. Using point reads instead of queries

Explanation: Point reads (retrieving items directly by ID and partition key) are more efficient in terms of RU consumption compared to queries. They are more precise operations that don't require filtering or scanning.

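For comparison, a point read versus the equivalent query with the Python `azure-cosmos` SDK; the IDs, partition key values, and container names are made up:

```python
from azure.cosmos import CosmosClient

container = (CosmosClient("<account-endpoint>", credential="<account-key>")
             .get_database_client("sales-db")
             .get_container_client("orders"))

# Point read: needs both the id and the partition key; roughly 1 RU for a 1 KB item.
order = container.read_item(item="order-1001", partition_key="customer-42")

# Equivalent query: goes through the query engine and costs more RUs.
results = container.query_items(
    query="SELECT * FROM c WHERE c.id = @id",
    parameters=[{"name": "@id", "value": "order-1001"}],
    partition_key="customer-42",
)
```
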
66

Q

You have an Azure web application that experiences slow response times during peak hours. After investigation, you discover that database queries are the bottleneck. Which Azure service should you implement to improve performance?

A. Azure CDN
B. Azure Redis Cache
C. Azure Traffic Manager
D. Azure Front Door

A

Answer: B. Azure Redis Cache

Explanation: Azure Redis Cache provides an in-memory caching layer that can significantly reduce database load by caching frequently accessed data. This is ideal for improving performance when database queries are the bottleneck.

67

Q

You need to implement logging for an Azure Function that processes sensitive data. The logs must be retained for 90 days and should not contain any personal identifiable information (PII). Which approach should you use?

A. Use the default Azure Function logs with a retention policy
B. Implement custom logging with data scrubbing before writing logs
C. Use Application Insights with data filtering
D. Disable logging entirely to prevent PII exposure

A

Answer: B. Implement custom logging with data scrubbing before writing logs

Explanation: To handle sensitive data properly in logs, you should implement custom logging that scrubs or masks PII before writing to logs. This ensures that you maintain useful logging while complying with privacy requirements.

68

Q

Your Azure App Service web application is experiencing intermittent network-related exceptions when connecting to Azure SQL Database. Which approach should you use to improve resilience?

A. Implement connection pooling
B. Increase the database DTU or vCore allocation
C. Implement a retry policy with exponential backoff
D. Deploy the application to multiple regions

A

Answer: C. Implement a retry policy with exponential backoff

Explanation: For intermittent network issues, implementing a retry policy with exponential backoff allows the application to automatically retry failed operations with increasing delays between attempts. This improves resilience against transient failures.

69

Q

You need to monitor the performance of an Azure Cosmos DB database and be alerted when the consumed Request Units (RUs) exceed 80% of the provisioned capacity. Which approach should you use?

A. Create a Log Analytics query and alert rule
B. Set up an Azure Monitor metric alert
C. Implement custom Application Insights tracking
D. Use Cosmos DB built-in alerting

A

Answer: B. Set up an Azure Monitor metric alert

Explanation: Azure Monitor metric alerts allow you to monitor Cosmos DB metrics like normalized RU consumption percentage and trigger alerts when thresholds are exceeded.

70

Q

Your Azure Function application is experiencing cold starts that impact performance. Which approach should you use to reduce cold start times?

A. Increase the function timeout setting
B. Use the Premium plan with pre-warmed instances
C. Implement asynchronous processing patterns
D. Reduce the function's memory footprint

A

Answer: B. Use the Premium plan with pre-warmed instances

Explanation: The Azure Functions Premium plan provides pre-warmed instances that remain available, eliminating cold starts for better performance.

71

Q

You have a web application deployed to Azure App Service. You need to identify memory leaks in the application. Which tool should you use?

A. Azure Monitor metrics
B. Application Insights Profiler
C. Kusto Query Language (KQL) in Log Analytics
D. Azure Security Center

A

Answer: B. Application Insights Profiler

Explanation: Application Insights Profiler collects detailed performance traces from your live web application, helping identify memory leaks and other performance issues.

72

Q

Your Azure App Service web application is experiencing increased response times. You need to determine if the issue is related to database performance, external API calls, or application code. Which feature should you use?

A. Azure Front Door latency metrics
B. Application Insights Application Map
C. Azure Monitor Metrics
D. App Service Diagnostics

A

Answer: B. Application Insights Application Map

Explanation: Application Insights Application Map provides a visual representation of the dependencies and their performance in your application. It shows the relationships and response times between your application and its dependencies (databases, external APIs, etc.).

73

Q

You need to implement a solution to monitor the health of an Azure Kubernetes Service (AKS) cluster and the applications running on it. Which approach should you use?

A. Enable Azure Monitor for containers
B. Implement Prometheus and Grafana
C. Use the Kubernetes dashboard
D. Either A or B depending on specific requirements

A

Answer: D. Either A or B depending on specific requirements

Explanation: Both Azure Monitor for containers and Prometheus with Grafana are valid approaches for monitoring AKS clusters and applications. Azure Monitor for containers provides integrated monitoring with Azure's monitoring stack, while Prometheus with Grafana is an open-source solution with more customization options.

74

Q

You have an Azure Logic App that orchestrates a business process involving multiple services. You need to monitor the execution of this Logic App and be alerted when a specific action fails. Which approach should you use?

A. Use the Logic Apps run history
B. Implement Azure Monitor alerts with action-specific conditions
C. Use Application Insights custom tracking
D. Implement custom logging in each service

A

Answer: B. Implement Azure Monitor alerts with action-specific conditions

Explanation: Azure Monitor alerts for Logic Apps can be configured with specific conditions that trigger when an action fails. This provides automated monitoring and notification without requiring manual checks of the run history.

75

Q

You need to implement a solution for load testing an Azure App Service web application. The solution must simulate realistic user traffic patterns from multiple geographic regions. Which approach should you use?

A. Azure DevOps Load Testing
B. JMeter with Azure Container Instances
C. Azure Front Door with traffic splitting
D. Custom testing scripts running on VMs

A

Answer: B. JMeter with Azure Container Instances

Explanation: JMeter deployed to Azure Container Instances in multiple regions provides a flexible and scalable solution for load testing with realistic geographic distribution.

76

Q

Your Azure App Service web application is experiencing intermittent 500 errors that are difficult to reproduce in the development environment. You need to implement a solution to capture detailed information about these errors. Which approach should you use?

A. Enable Application Insights snapshot debugger
B. Implement custom error handling and logging
C. Enable verbose logging in App Service diagnostic logs
D. Use Azure Front Door custom error responses

A

Answer: A. Enable Application Insights snapshot debugger

Explanation: Application Insights snapshot debugger captures a snapshot of application state when exceptions occur in production, allowing you to see the exact state that caused the error without impacting production performance.

77

Q

You need to optimize the performance of an Azure Cosmos DB application that performs frequent queries on the same data. Which feature should you use?

A. Indexing policy optimization
B. Server-side programming (stored procedures)
C. Provisioned throughput increases
D. Integrated cache

A

Answer: D. Integrated cache

Explanation: Azure Cosmos DB integrated cache improves performance for frequently queried data by serving it from memory instead of disk. This significantly reduces latency and RU consumption for read-heavy workloads.

78

Q

You are troubleshooting an Azure App Service web application that is experiencing unexpected restarts. Which logs should you examine?

A. Application Insights logs
B. Web server logs
C. Application Event logs
D. Deployment logs

A

Answer: C. Application Event logs

Explanation: Application Event logs contain information about application crashes, worker process recycling, and other host-level events that can cause unexpected restarts.

79

Q

You need to implement a solution for automatic remediation of common issues in an Azure App Service web application. Which approach should you use?

A. Azure Automation runbooks
B. Azure App Service Auto-Heal
C. Azure Logic Apps with alert triggers
D. Custom health probes with recovery scripts

A

Answer: B. Azure App Service Auto-Heal

Explanation: Azure App Service Auto-Heal provides built-in capabilities to automatically detect and remediate common issues like memory leaks, slow requests, and application deadlocks by automatically recycling the worker process based on configured rules.

80

Q

You are developing an Azure Function that processes messages from a queue. Some messages cause the function to fail repeatedly. How should you handle these poison messages?

A. Implement a try-catch block and ignore failed messages
B. Configure the queue trigger to use a poison queue for failed messages
C. Implement a circuit breaker pattern
D. Use Azure Logic Apps for message processing instead

A

Answer: B. Configure the queue trigger to use a poison queue for failed messages

Explanation: Azure Functions with queue triggers support poison message handling, which automatically moves messages that cause repeated failures to a poison queue. This prevents the function from continuously failing on the same message and allows for separate handling of problematic messages.

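A minimal queue-triggered function in the Python v2 model; the `orders` queue name is a placeholder, and the poison-queue behavior itself needs no extra code:

```python
import json
import logging
import azure.functions as func

app = func.FunctionApp()

# After a message has been dequeued and failed maxDequeueCount times
# (5 by default, configurable in host.json), the runtime moves it to the
# "orders-poison" queue automatically.
@app.queue_trigger(arg_name="msg", queue_name="orders",
                   connection="AzureWebJobsStorage")
def process_order(msg: func.QueueMessage):
    order = json.loads(msg.get_body().decode("utf-8"))
    logging.info("Processing order %s", order["id"])
```
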
81

Q

You need to implement a service that processes messages in the exact order they were sent and ensures each message is delivered exactly once. Which Azure service should you use?

A. Azure Event Hub
B. Azure Service Bus Queue
C. Azure Storage Queue
D. Azure Event Grid

A

Answer: B. Azure Service Bus Queue

Explanation: Azure Service Bus supports both FIFO (First-In-First-Out) message delivery with Sessions and exactly-once delivery with duplicate detection.

82

Q

You are developing an application that needs to process a stream of events from IoT devices in real-time. Which Azure service is most appropriate?

A. Azure Service Bus
B. Azure Event Hub
C. Azure Logic Apps
D. Azure Storage Queue

A

Answer: B. Azure Event Hub

Explanation: Azure Event Hub is designed for high-throughput event streaming scenarios like IoT telemetry ingestion. It can handle millions of events per second and integrates with stream processing services.

83

Q

You are implementing retry logic for an application that calls an unreliable third-party API. Which retry pattern should you use?

A. Circuit Breaker pattern
B. Exponential backoff
C. Retry immediately
D. Throttling pattern

A

Answer: B. Exponential backoff

Explanation: Exponential backoff is a retry strategy where the wait time between retries increases exponentially. This prevents overwhelming the service with repeated immediate retries while still attempting to recover from transient failures.

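A hand-rolled sketch of exponential backoff with jitter in Python, using the `requests` library purely for illustration; the status codes treated as transient are an assumption:

```python
import random
import time
import requests

def call_with_backoff(url: str, max_attempts: int = 5) -> requests.Response:
    """Retry transient failures with exponentially growing, jittered delays."""
    for attempt in range(max_attempts):
        response = requests.get(url, timeout=10)
        if response.status_code not in (429, 500, 502, 503, 504):
            return response
        if attempt == max_attempts - 1:
            break
        # 1s, 2s, 4s, 8s... plus jitter so clients don't retry in lockstep.
        time.sleep(2 ** attempt + random.uniform(0, 1))
    response.raise_for_status()
    return response
```
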
84

Q

You need to implement a solution that allows multiple consumers to process messages from a queue concurrently, with each message processed by only one consumer. Which Azure messaging service feature should you use?

A. Azure Service Bus Topics
B. Azure Event Hub Consumer Groups
C. Azure Service Bus Competing Consumers
D. Azure Storage Queue with multiple readers

A

Answer: C. Azure Service Bus Competing Consumers

Explanation: The competing consumers pattern in Azure Service Bus allows multiple consumers to read from the same queue concurrently, with each message being delivered to only one consumer.

85

Q

You are developing an application that needs to respond to events from multiple Azure services. Which service should you use to centralize event handling?

A. Azure Logic Apps
B. Azure Service Bus
C. Azure Event Grid
D. Azure Functions

A

Answer: C. Azure Event Grid

Explanation: Azure Event Grid is specifically designed for event routing and distribution from various Azure services to multiple handlers. It provides a unified event management system for reacting to status changes and events across your Azure resources.

86

Q

You need to implement API management for your microservices architecture. Which feature of Azure API Management allows you to route requests to different backend services based on the URL path?

A. Policies
B. Products
C. Subscriptions
D. Gateways

A

Answer: A. Policies

Explanation: API Management policies, specifically the `set-backend-service` policy and related routing policies, allow you to route requests to different backend services based on criteria like URL path.

87

Q

You are implementing a messaging solution for an e-commerce application. Order processing must happen exactly once, even during service restarts. Which Azure messaging service feature should you use?

A. Azure Service Bus with duplicate detection
B. Azure Event Hub with consumer groups
C. Azure Storage Queue with visibility timeout
D. Azure Event Grid with retry policies

A

Answer: A. Azure Service Bus with duplicate detection

Explanation: Azure Service Bus with duplicate detection ensures that messages are processed exactly once by identifying and discarding duplicate messages based on a message ID. This is crucial for order processing where duplicate processing could lead to issues like double charges.

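A sending-side sketch with the Python `azure-servicebus` SDK; the queue name and order ID are placeholders, and duplicate detection must be enabled on the queue itself:

```python
from azure.servicebus import ServiceBusClient, ServiceBusMessage

# The queue is created with duplicate detection enabled (a time window over
# which Service Bus remembers MessageIds). Using a deterministic message_id,
# here the order number, lets Service Bus discard accidental resends.
with ServiceBusClient.from_connection_string("<connection-string>") as client:
    with client.get_queue_sender("orders") as sender:
        msg = ServiceBusMessage('{"orderId": "1001", "total": 42.50}',
                                message_id="order-1001")
        sender.send_messages(msg)
```
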
88

Q

You need to implement a solution for processing files uploaded to Azure Blob Storage. The solution must trigger processing automatically when new files are uploaded. Which approach should you use?

A. Use Azure Functions with a Blob storage trigger
B. Implement a timer-triggered function that polls for new blobs
C. Use Azure Logic Apps with a Blob storage connector
D. Either A or C depending on specific requirements

A

Answer: D. Either A or C depending on specific requirements

Explanation: Both Azure Functions with a Blob storage trigger and Logic Apps with a Blob storage connector can automatically process files when they're uploaded to Blob Storage. Functions are more code-centric and better for complex processing, while Logic Apps are more workflow-oriented and better for orchestration with minimal code.

89

Q

You are developing an application that needs to process messages from multiple sources with different formats. The processing logic varies by message type. Which Azure service is most appropriate for this scenario?

A. Azure Service Bus Topics with subscription filters
B. Azure Event Hub with consumer groups
C. Azure Logic Apps with conditions
D. Azure Functions with HTTP triggers

A

Answer: A. Azure Service Bus Topics with subscription filters

Explanation: Azure Service Bus Topics with subscription filters allow you to route different types of messages to different handlers based on message properties or content.

90

Q

You need to implement a solution that allows users to upload files directly to Azure Blob Storage from a web application. The solution must support files up to 10 GB in size. Which approach should you use?

A. Use the Azure Storage SDK to upload files in a single operation
B. Use the Azure Storage SDK with chunked uploads
C. Generate a SAS token and use the REST API for direct uploads
D. Use Azure Logic Apps for file uploads

A

Answer: B. Use the Azure Storage SDK with chunked uploads

Explanation: For large files (up to 10 GB), using the Azure Storage SDK with chunked uploads is the most reliable approach. This allows the file to be split into smaller blocks that can be uploaded in parallel and resumed if interrupted.

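A server-side sketch with the Python `azure-storage-blob` SDK, which chunks large uploads into blocks automatically; the file and container names are placeholders:

```python
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string("<connection-string>",
                                         container_name="uploads",
                                         blob_name="dataset-10gb.zip")

with open("dataset-10gb.zip", "rb") as data:
    # The SDK splits large uploads into blocks under the hood; max_concurrency
    # uploads several blocks in parallel, and a failed block can be retried
    # without restarting the whole transfer.
    blob.upload_blob(data, overwrite=True, max_concurrency=8)
```
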
91

Q

You are developing a SaaS application that needs to integrate with third-party APIs. The integration must handle differences in API formats and protocols. Which Azure service should you use?

A. Azure API Management
B. Azure Logic Apps
C. Azure Functions
D. Azure App Service

A

Answer: B. Azure Logic Apps

Explanation: Azure Logic Apps is specifically designed for integration scenarios with built-in connectors for many third-party services and the ability to transform data between different formats.

92

Q

You need to implement a solution for sending transactional emails from an Azure application. The solution must support high volume, delivery tracking, and bounce handling. Which approach should you use?

A. Use the SMTP protocol directly from the application
B. Use a third-party email service with an API connector
C. Implement an Azure Function that uses SendGrid
D. Use Azure Communication Services

A

Answer: C. Implement an Azure Function that uses SendGrid

Explanation: SendGrid is a transactional email service that integrates well with Azure and supports high volume, delivery tracking, and bounce handling. Implementing an Azure Function that uses SendGrid provides a scalable and reliable solution for sending transactional emails.

93

Q

You are implementing authentication for an Azure API Management instance. You need to secure the API so that it can only be called by a specific Azure Function. Which authentication method should you use?

A. Subscription keys
B. Client certificates
C. Azure Active Directory with Managed Identities
D. Basic authentication

A

Answer: C. Azure Active Directory with Managed Identities

Explanation: Azure Active Directory with Managed Identities provides a secure way for Azure services like Functions to authenticate to other Azure services without managing credentials.

94

Q

You need to implement a solution for connecting your Azure App Service to an on-premises SQL Server database. The solution must be secure and not require changes to your corporate firewall. Which approach should you use?

A. VNet Integration with ExpressRoute
B. Hybrid Connections
C. Site-to-Site VPN
D. SQL Server Virtual Network service endpoints

A

Answer: B. Hybrid Connections

Explanation: Azure App Service Hybrid Connections provide a secure way to access on-premises resources without requiring firewall changes. They work by establishing an outbound connection from your on-premises network to Azure, avoiding the need for inbound connections.

95

Q

You are implementing a solution to process sales data from multiple sources. The data arrives at different times and needs to be combined before processing. Which Azure service should you use?

A. Azure Data Factory
B. Azure Stream Analytics
C. Azure Logic Apps
D. Azure Functions

A

Answer: A. Azure Data Factory

Explanation: Azure Data Factory is specifically designed for data integration scenarios, with features for extracting, transforming, and loading data from multiple sources. It supports data arrival at different times and provides orchestration for combining and processing data.

96

Q

You need to implement a solution that allows an Azure web application to call an external API that has rate limiting. The solution must handle throttling gracefully. Which pattern should you implement?

A. Circuit Breaker pattern
B. Retry pattern with exponential backoff
C. Bulkhead pattern
D. Throttling pattern

A

Answer: B. Retry pattern with exponential backoff

Explanation: For handling rate limiting (throttling) from external APIs, implementing a retry pattern with exponential backoff allows your application to automatically retry requests after being throttled, with increasing delays between retries.

97

Q

You are developing an application that uses Azure Service Bus Topics. You need to ensure that messages are automatically removed after 7 days if they haven't been processed. Which feature should you use?

A. Message lock duration
B. Auto-forwarding
C. Time-to-live (TTL)
D. Dead-lettering

A

Answer: C. Time-to-live (TTL)

Explanation: Time-to-live (TTL) in Azure Service Bus determines how long a message remains in a queue or topic before it expires and is automatically removed.

98

Q

You need to implement a solution for an IoT application that must process device telemetry data in real-time and support time-based queries over historical data. Which combination of Azure services should you use?

A. Azure IoT Hub and Azure Cosmos DB
B. Azure IoT Hub, Azure Stream Analytics, and Azure Time Series Insights
C. Azure Event Hub and Azure SQL Database
D. Azure IoT Hub and Azure Data Lake Storage

A

Answer: B. Azure IoT Hub, Azure Stream Analytics, and Azure Time Series Insights

Explanation: This combination provides a complete solution for IoT telemetry processing: IoT Hub ingests device data, Stream Analytics processes it in real-time, and Time Series Insights stores and enables time-based analysis of historical data.

99

Q

You are developing an application that needs to make authenticated calls to an Azure Storage account. The application is deployed to Azure App Service. What is the most secure way to authenticate to the storage account?

A. Store the storage account key in application settings
B. Use a Shared Access Signature (SAS) token
C. Use a Managed Identity with Azure RBAC
D. Store the storage account connection string in Key Vault

A

Answer: C. Use a Managed Identity with Azure RBAC

Explanation: Using a Managed Identity with Azure RBAC (Role-Based Access Control) is the most secure authentication method as it eliminates the need to store any credentials. The App Service will automatically authenticate to Storage using its managed identity, and you can control access using RBAC.

100

Q

You are implementing a solution to synchronize data between an on-premises database and Azure SQL Database. The synchronization must be bidirectional and handle conflict resolution. Which Azure service should you use?

A. Azure Data Factory
B. Azure SQL Data Sync
C. Azure Migrate
D. Azure Database Migration Service

A

Answer: B. Azure SQL Data Sync

Explanation: Azure SQL Data Sync is specifically designed for bidirectional synchronization between SQL databases with features for conflict resolution. It allows you to define synchronization groups, set synchronization schedules, and configure conflict resolution policies.