AZ-204 Private Flashcards
You are planning on using the Azure container registry service. You want to ensure that your application or service can use it for headless authentication. You also want to allow role-based access to the registry.
You decide to use the Admin account associated with the container registry
Would this fulfil the requirement?
No.
Why not:
The admin account is intended only for single-user access to the registry and does not provide role-based access control.
Azure Container Registry - Admin account
Each container registry includes an admin user account, which is disabled by default. You can enable the admin user and manage its credentials in the Azure portal, or by using the Azure CLI or other Azure tools. The admin account has full permissions to the registry.
The admin account is currently required for some scenarios to deploy an image from a container registry to certain Azure services. For example, the admin account is needed when you deploy a container image in the portal from a registry directly to Azure Container Instances or Azure Web Apps for Containers.
Important
The admin account is designed for a single user to access the registry, mainly for testing purposes. We do not recommend sharing the admin account credentials among multiple users. All users authenticating with the admin account appear as a single user with push and pull access to the registry. Changing or disabling this account disables registry access for all users who use its credentials. Individual identity is recommended for users and service principals for headless scenarios.
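As a sketch (the registry name is a placeholder), the admin user can be enabled and its credentials retrieved with the Azure CLI:
Code:
# Enable the admin user on an existing registry (it is disabled by default)
az acr update --name myregistry --admin-enabled true
# Retrieve the admin username and passwords
az acr credential show --name myregistry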
You are planning on using the Azure container registry service. You want to ensure that your application or service can use it for headless authentication. You also want to allow role-based access to the registry.
You decide to perform an individual login to the registry
Would this fulfil the requirement?
Yes.
Why:
Signing in with an individual Azure AD identity allows role-based access to the registry and can also support headless authentication.
Azure Container Registry/Individual Login/Azure AD
When working with your registry directly, such as pulling images to and pushing images from a development workstation to a registry you created, authenticate by using your individual Azure identity.
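For example (a sketch; the registry name is a placeholder), an individual sign-in from a development workstation looks like this with the Azure CLI:
Code:
# Sign in with your individual Azure identity
az login
# Authenticate to the registry using that identity
az acr login --name myregistry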
You are planning on using the Azure container registry service. You want to ensure that your application or service can use it for headless authentication. You also want to allow role-based access to the registry.
You decide to assign a service principal to the registry
Would this fulfil the requirement?
Yes.
Why:
If you assign a service principal to your registry, your application or service can use it for headless authentication.
Azure Container Registry/Service Principal/AD
If you assign a service principal to your registry, your application or service can use it for headless authentication. Service principals allow Azure role-based access control (Azure RBAC) to a registry, and you can assign multiple service principals to a registry. Multiple service principals allow you to define different access for different applications.
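A minimal sketch, assuming placeholder registry and service principal names, of creating a service principal scoped to a registry with a pull-only role:
Code:
# Get the registry resource ID
ACR_ID=$(az acr show --name myregistry --query id --output tsv)
# Create a service principal and grant it the AcrPull role on the registry
az ad sp create-for-rbac --name myacr-sp --scopes $ACR_ID --role acrpull
# The returned appId and password can then be used by an application for headless authentication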
az webapp cors add
- Add allowed origins.
Code:
az webapp cors add --allowed-origins [--ids] [--name] [--resource-group] [--slot] [--subscription]
Ex:
az webapp cors add -g {myRG} -n {myAppName} --allowed-origins https://myapps.com
az webapp commands
az webapp cors remove -g {myRG} -n {myAppName} --allowed-origins https://myapps.com
az webapp cors show --name MyWebApp --resource-group MyResourceGroup
Azure Database Migration Service
You can use Azure Database Migration Service to perform an online (minimal downtime) migration of databases from an on-premises or cloud instance of MongoDB to Azure Cosmos DB’s API for MongoDB.
Using Azure Database Migration Service to perform an online migration requires creating an instance based on the Premium pricing tier.
For an optimal migration experience, Microsoft recommends creating an instance of Azure Database Migration Service in the same Azure region as the target database. Moving data across regions or geographies can slow down the migration process.
When you migrate databases to Azure by using Azure Database Migration Service, you can do an offline or an online migration. With an offline migration, application downtime starts when the migration starts. With an online migration, downtime is limited to the time to cut over at the end of migration. We suggest that you test an offline migration to determine whether the downtime is acceptable; if not, do an online migration.
The service uses the Data Migration Assistant to generate assessment reports that provide recommendations to guide you through the changes required prior to performing a migration.
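As a hedged sketch (names, region, and the subnet path are placeholders), a Premium-tier Database Migration Service instance can be created with the Azure CLI:
Code:
# Create a Premium-tier DMS instance (required for online migrations)
az dms create --resource-group myRG --name myDmsInstance --location westeurope --sku-name Premium_4vCores --subnet /subscriptions/<sub-id>/resourceGroups/myRG/providers/Microsoft.Network/virtualNetworks/myVNet/subnets/default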
Azure Migrate
Azure Migrate provides a centralized hub to assess and migrate to Azure on-premises servers, infrastructure, applications, and data. It provides the following:
Unified migration platform: A single portal to start, run, and track your migration to Azure.
Range of tools: A range of tools for assessment and migration.
Data Migration Assistant
Data Migration Assistant helps pinpoint potential problems blocking migration. It identifies unsupported features, new features that can benefit you after migration, and the right path for database migration.
Azure Cosmos DB Data Migration Tool
The Azure Cosmos DB Data Migration tool is an open source tool designed for small migrations.
The Azure Cosmos DB Data Migration tool can import data from various sources into Azure Cosmos containers and tables. You can import from JSON files, CSV files, SQL, MongoDB, Azure Table storage, Amazon DynamoDB, and even Azure Cosmos DB SQL API collections, and migrate that data to collections and tables for use with Azure Cosmos DB. The Data Migration tool can also be used when migrating from a single-partition collection to a multi-partition collection for the SQL API.
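A rough sketch of a command-line import with the tool's dt.exe (the file name, connection string, and collection name are placeholders, and exact source/target option names may differ by tool version):
Code:
# Import a local JSON file into a Cosmos DB SQL API collection
dt.exe /s:JsonFile /s.Files:data.json /t:DocumentDB /t.ConnectionString:"AccountEndpoint=<endpoint>;AccountKey=<key>;Database=mydb" /t.Collection:mycollection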
Integration Service Environment
Sometimes, your logic apps need access to secured resources, such as virtual machines (VMs) and other systems or services, that are inside or connected to an Azure virtual network. To set up this access, you can create an integration service environment (ISE).
If your logic apps need access to virtual networks that use private endpoints, you must create, deploy, and run those logic apps inside an ISE.
When you create an ISE, Azure injects or deploys that ISE into your Azure virtual network. You can then use this ISE as the location for the logic apps and integration accounts that need access.
Azure App Service Environment
The Azure App Service Environment is an Azure App Service feature that provides a fully isolated and dedicated environment for securely running App Service apps at high scale.
App Service environments (ASEs) are appropriate for application workloads that require:
Very high scale.
Isolation and secure network access.
High memory utilization.
Customers can create multiple ASEs within a single Azure region or across multiple Azure regions. This flexibility makes ASEs ideal for horizontally scaling stateless application tiers in support of high requests per second (RPS) workloads.
Azure AD B2B Integration
Azure Active Directory (Azure AD) business-to-business (B2B) collaboration is a feature within External Identities that lets you invite guest users to collaborate with your organization.
VNet Service Endpoint
Virtual Network (VNet) service endpoint provides secure and direct connectivity to Azure services over an optimized route over the Azure backbone network. Endpoints allow you to secure your critical Azure service resources to only your virtual networks. Service Endpoints enables private IP addresses in the VNet to reach the endpoint of an Azure service without needing a public IP address on the VNet.
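For example (a sketch with placeholder names), a service endpoint for Azure Storage can be enabled on a subnet with the Azure CLI:
Code:
# Enable the Microsoft.Storage service endpoint on an existing subnet
az network vnet subnet update --resource-group myRG --vnet-name myVNet --name mySubnet --service-endpoints Microsoft.Storage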
You are developing an ASP.NET Core application. This application needs to be deployed to the Azure Web App service from a GitHub repository. The web application contains static content that is generated by a script.
You are planning on using the Azure Web App continuous deployment feature. The script that generates the static content needs to run before the web site can start serving traffic.
Which of the following options can be used to fulfil this requirement?
Customize the deployment by creating a .deployment file at the root of the repository. Ensure the deployment file calls the script which generates the static content.
.deployment file
Deployment configuration files let you override the default heuristics of deployment by allowing you to specify a project or folder to be deployed. It has to be at the root of the repository and it’s in .ini format.
Code:
[config]
command = deploy.cmd
PowerShell:
command = powershell -NoProfile -NoLogo -ExecutionPolicy Unrestricted -Command "& "$pwd\deploy.ps1" 2>&1 | echo"
Deploying a specific ASP.NET or ASP.NET Core project file
You can specify the path to the project file, relative to the root of your repo. Note that this is not a path to the solution file (.sln), but to the project file (.csproj/.vbproj). The reason for this is that Kudu only builds the minimal dependency tree for this project, and avoids building unrelated projects in the solution that are not needed by the web project.
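For example (a sketch; the project path is a placeholder), the .deployment file can point Kudu at a specific project file:
Code:
[config]
project = src/MyWebApp/MyWebApp.csproj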
Azure Function authLevels
Determines what keys, if any, need to be present on the request in order to invoke the function. The authorization level can be one of the following values:
anonymous—No API key is required.
function—A function-specific API key is required. This is the default value if none is provided.
admin—The master key is required.
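As a sketch (the app, function, and resource group names are placeholders, and it assumes a recent Azure CLI version), function and host keys can be listed with the CLI:
Code:
# List keys for a specific function
az functionapp function keys list --resource-group myRG --name myFuncApp --function-name MyHttpFunction
# List host-level keys, including the master key used for admin-level requests
az functionapp keys list --resource-group myRG --name myFuncApp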
Azure Functions Blob storage binding
Integrating with Blob storage allows you to build functions that react to changes in blob data as well as read and write values.
Azure Functions HTTP triggers
Azure Functions may be invoked via HTTP requests to build serverless APIs and respond to webhooks.
Run a function from an HTTP request
Return an HTTP response from a function
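For example (a sketch; the app name, function name, and key are placeholders), an HTTP-triggered function at the function authorization level can be invoked by passing the key in the code query parameter:
Code:
curl "https://myfuncapp.azurewebsites.net/api/MyHttpFunction?name=Azure&code=<function-key>"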
Azure Functions Queue storage trigger
Azure Functions can run as new Azure Queue storage messages are created and can write queue messages within a function.
Run a function as queue storage data changes
Write queue storage messages
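A minimal sketch (storage account and queue names are placeholders; credentials would be supplied via --account-key or a connection string) of adding a message that a queue-triggered function would pick up:
Code:
az storage message put --account-name mystorageacct --queue-name myqueue --content "hello from the CLI"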
Azure Functions Timer Trigger
A timer trigger lets you run a function on a schedule.
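The schedule is given as an NCRONTAB expression with six fields ({second} {minute} {hour} {day} {month} {day-of-week}); for example, the expression 0 */5 * * * * runs the function every five minutes.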