Exam Prep Questions - Implement Storage Flashcards

1
Q

You administer Azure for the Verigon Corporation. You are responsible for your Azure SQL database named VerigonDB, which is an Azure SQL Database version 12 database. Your active database is hosted in the West US region, with one offline secondary database in West Europe.

The users from USA and from West Europe are using a web application named VerigonApp. The application data is stored in VerigonDB. Because of performance issues, you want the users from West Europe to read their application data from the secondary backup DB in West Europe. For the future, you need to support up to four readable secondary backups of VerigonDB in West Europe.

The database tier is S0 Standard. Which is the most cost-efficient Azure SQL Database pricing tier you can upgrade to improve the performance level?

A

Azure SQL Database P1 Premium pricing tier

2
Q

You administer Azure for the Verigon Corporation. You are responsible for the Azure SQL databases and for recovery planning. You have these three Azure SQL databases:
•VerigonDB1, Azure pricing tier B0 Basic
•VerigonDB2, Azure pricing tier S0 Standard
•VerigonDB3, Azure pricing tier P1 Premium

You need to report the following information to your management team:
•What are the retention periods for the Azure point-in-time backups of these databases, by default?
•How often does the Azure SQL Database service create full backups, differential backups, and log backups of your databases?

What should you present to management?

A
  • VerigonDB1: 7-day restore period
  • VerigonDB2: 14-day restore period
  • VerigonDB3: 35-day restore period

The intervals for full, differential, and log backups are independent of the Azure SQL Database pricing tier. A full backup is performed every week, a differential backup every day, and a log backup every 5 minutes. This is very important to know, because if your backup/restore requirements differ from the Azure SQL Database defaults, you will have to configure the appropriate settings.
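The cadence above translates into concrete figures you could put in the report. A small Python sketch (the numbers follow directly from the stated intervals, not from any Azure API):

```python
# Azure SQL Database default backup cadence, per the card above:
# full backup weekly, differential daily, log backup every 5 minutes.
FULL_PER_WEEK = 1
DIFF_PER_DAY = 1
LOG_INTERVAL_MIN = 5

def log_backups_per_day(interval_min: int = LOG_INTERVAL_MIN) -> int:
    """Number of transaction log backups taken in one day."""
    return 24 * 60 // interval_min

def backups_per_week() -> dict:
    """Summarize how many backups of each type occur in a week."""
    return {
        "full": FULL_PER_WEEK,
        "differential": DIFF_PER_DAY * 7,
        "log": log_backups_per_day() * 7,
    }

print(backups_per_week())  # log backups: 288 per day, 2016 per week
```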

3
Q

You are the administrator for the Verigon Corporation. You manage an Azure storage account named veristacco. You want to use Azure blob snapshots as a fast recovery solution to recover old blob data versions when a user mistakenly overwrites blob data. You have created a container named snapshots as the location for your blob snapshots, and have written an Azure Automation PowerShell script that automatically creates a blob snapshot every hour and saves the snapshots in the snapshots container.

You want to test that script and verify the snapshot creation. You examine the snapshots with the following PowerShell code:

$Container.CloudBlobContainer.ListBlobs($BlobName, $true, "Snapshots") | select Name, SnapshotTime

You get the following output:

Name             SnapshotTime
VerigonPic1.jpg  7/10/2015 1:00:00 PM +00:00
VerigonPic1.jpg  7/10/2015 2:00:00 PM +00:00
VerigonPic1.jpg  7/10/2015 3:00:00 PM +00:00
VerigonPic1.jpg  7/10/2015 4:00:00 PM +00:00

What should you use to promote the blob snapshot from 3:00:00 PM over the base blob?

A

Start-AzureStorageBlobCopy (or its alias, Start-CopyAzureStorageBlob)

4
Q

You are the administrator at the Verigon Corporation. You have registered a custom domain name, graphics.verigon.com, for your Azure storage account named VerigonStAcc. You plan to access all blob data through that URL.

You successfully verified access to http://graphics.verigon.com/Graphics/VerigonLogo.jpg. You want to make it possible to access all blob data through HTTPS in a more secure way, but with the least administrative effort.

What do you have to do to accomplish this?

A

You have to use https://VerigonStAcc.blob.core.windows.net/ to connect to blob data.

Currently, Azure does not allow you to use a URL with a custom domain name to access blob data over the HTTPS protocol. You can only access blob data over HTTPS through the default URL, as shown above.

5
Q

You administer Azure for the Verigon Corporation. You have deployed an Azure website named VerApp1. Now you want to enable the Azure diagnostics logs feature for VerApp1. Specifically, you want to do the following:
•Save and analyze the log information of VerApp1 through your Azure Storage account.
•Identify all failed HTTP requests in the logs.

What three PowerShell cmdlets do you have to run?

A

Set-AzureWebsite -Name VerApp1 -HttpLoggingEnabled $true

Enable-AzureWebsiteApplicationDiagnostic -Name VerApp1 -BlobStorage -LogLevel Error

Set-AzureWebsite -Name VerApp1 -RequestTracingEnabled $true

6
Q

You are the administrator at Verigon Corporation. You have an Azure storage container named verigonvms. You want to organize this container through a directory hierarchy. You want to build a hierarchy that resembles the following example:

verigonvms
  microsoftservers
    dcs
    webservers
    fileservers
    applicationservers
  linuxservers
    webservers
    fileservers
    applicationservers

Which string delimiter within a blob name do you have to use to build this virtual hierarchy?

A

You have to use the / string delimiter, because the Azure Blob service is based on a flat storage scheme, not a hierarchical one. You can easily simulate a hierarchy by using this string delimiter as part of the blob name.
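The flat-namespace idea can be illustrated without touching Azure at all. The following Python sketch simulates how a delimiter-based listing groups blob names into virtual directories (the names are taken from the example above; this is illustrative only, not the Blob service API):

```python
# The blob service stores a flat list of names; a "/" inside the name
# simulates directories. Example names from the verigonvms hierarchy:
blob_names = [
    "microsoftservers/dcs/dc1.vhd",
    "microsoftservers/webservers/web1.vhd",
    "linuxservers/webservers/web2.vhd",
    "linuxservers/fileservers/fs1.vhd",
]

def list_prefixes(names, prefix="", delimiter="/"):
    """Return the virtual 'directories' directly under a prefix,
    the way a delimiter-based blob listing would group them."""
    result = set()
    for name in names:
        if name.startswith(prefix):
            rest = name[len(prefix):]
            if delimiter in rest:
                result.add(prefix + rest.split(delimiter, 1)[0] + delimiter)
    return sorted(result)

print(list_prefixes(blob_names))
# top level: ['linuxservers/', 'microsoftservers/']
```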

7
Q

You administer Azure for the Verigon Corporation. You have to set up the Azure backup service for an on-premises Windows Server 2012 R2 file server that holds tax records. Tax records are added by the Tax department. After 45 to 90 days, tax records must be reconciled with the actual taxes paid. The reconciliation process is conducted by an auditor in another department who must review the original spreadsheet from the previous month and compare it to the tax receipt. The original spreadsheet may have to be retrieved from a backup because the original spreadsheet may have been deleted or placed in offsite storage.

You must design the backup and restore process for these tax records. What are the necessary steps?

A
  1. Configure the backup vault
  2. Download the vault credentials from your account
  3. Run MARSAgentInstaller.exe /m /q
  4. Create a passphrase to encrypt and decrypt backups
  5. Specify the backup schedule
  6. Restore the file to a different location with a different ACL
8
Q

You are the administrator of the Verigon Corporation. You are responsible for the Azure storage account named VerigonUSStAcc. You have an Azure container named VerigonFiles. You want to change the Author property of the blob verigonlogo.jpg in this container to MichaelS. You have written some PowerShell code to prepare for that change:

$author1 = "FrankZ"
$author2 = "jeffH"
$author3 = "MichaelS"
$StorageAccountName = "VerigonUSStAcc"
$StorageAccountKey = "239dzad38e9e8923288e2e92e238d2eu2839ud2839d2839ud=="
$VerContext = New-AzureStorageContext -StorageAccountName $StorageAccountName `
    -StorageAccountKey $StorageAccountKey
$ConName = "VerigonFiles"
$BName = "verigonlogo.jpg"
$Blob = Get-AzureStorageBlob -Context $VerContext -Container $ConName -Blob $BName
$CloudBlockBlob = [Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob] $Blob.ICloudBlob

What additional PowerShell code do you have to add?

A

$CloudBlockBlob.Metadata["author"] = "MichaelS"
$CloudBlockBlob.SetMetadata()

9
Q

You administer Azure for the Verigon Corporation. You open the Microsoft Edge browser on a Windows 10 client and try to access a picture saved in an Azure container through https://verigon.blob.core.windows.net/test/pic.jpg. You see the following error message:

HTTP 404 error. That’s odd… Microsoft Edge can’t find this page.

The file is saved as a blob in the Azure container named test. You want to allow only the storage account owner to access files in this Azure container.

What PowerShell command or commands can you use?

A

Set-AzureStorageContainerAcl -Container test -Permission off

This statement uses the Set-AzureStorageContainerAcl cmdlet to set access permissions on a storage container named test. The -Permission parameter is set to off, which will allow only the storage account owner to access files in an Azure container.

10
Q

You are the administrator at the Verigon Corporation. You maintain an Azure storage account named VerigonUSStAcc. You want to use Azure File storage to upload a log file named Log1.txt to the CustomLogs subdirectory of an Azure storage share named VerigonLogs. The source files are in the C:\Logs folder on the local drive.

Which PowerShell commands must you use to make that possible?

A

To begin, you will require the name of your Azure storage account and the primary key. With that information, you can create a context for your storage account and the key using the New-AzureStorageContext cmdlet:

$Verctx = New-AzureStorageContext VerigonUSStAcc J2rnMPqNAJYXQilXrENgZa3ARXVMGbTiYla7

In the second step you will create the new Azure File Share with the New-AzureStorageShare cmdlet. The following statement creates a new share named VerigonLogs using the context created in the previous step:

$s=New-AzureStorageShare VerigonLogs -Context $Verctx

The next step is to create the directory in the file share with the New-AzureStorageDirectory cmdlet. The following creates a directory named CustomLogs within the share that was created in the previous step:

New-AzureStorageDirectory -Share $s -Path CustomLogs

After the directory in the file share has been created, you can upload the log file to the Azure File storage share with the Set-AzureStorageFileContent cmdlet. The following uploads the C:\Logs\Log1.txt file to the CustomLogs directory in the share stored in the variable named $s:

Set-AzureStorageFileContent -Share $s -Source C:\Logs\Log1.txt -Path CustomLogs

11
Q

You are the administrator of the Verigon Corporation. You have three SQL databases named DB1, DB2, and DB3 running on different on-premises SQL servers. The databases have following sizes:

DB1 = 50 MB
DB2 = 500 GB
DB3 = 2 TB

DB1 is hosted on an SQL Server in Chicago (USA). DB2 and DB3 are hosted on an SQL Server in Berlin (Germany). Your Azure Storage account named VerStAcc is located in the West US region.

You want to migrate all three databases to Azure SQL Services.

Which database migration option is the best choice for DB1, DB2, and DB3?

A

You can migrate DB1 by using the Deploy Database to Microsoft Azure SQL Database wizard in SQL Server Management Studio. DB1 is a small database with a size of only 50 MB, and it is hosted in the USA, the same region as the Azure storage account, so there should be no connectivity or bandwidth problems.

You can migrate DB2 by exporting the data and schema to a BACPAC file in an Azure Blob and then importing the BACPAC file into your Azure SQL instance. DB2 is a medium sized database and is located in a region other than the Azure storage account region. Since it is located in another region, you should expect a lower bandwidth connection and possible network connectivity issues. With a lower bandwidth connection, it is a faster and more reliable solution to use the export/import function through BACPAC files.

You can migrate DB3 by migrating the schema and the data separately, because this will achieve the best performance. To do so, perform the following steps:
1. Script the schema of the database. You can use SQL Server Management Studio to script the database and the objects in the database.
2. Deploy the schema to Azure SQL Database.
3. Use BCP to extract the data into flat files. BCP is a utility that can export data from a database into flat files and import data from flat files into a database.
4. Import the flat files into Azure SQL Database.
DB3 is not in the same region as the Azure storage account and is a very large database. You will achieve the best performance for the migration by splitting the database into schema and data.

12
Q

You administer Azure for the Verigon Corporation. You have created an Azure backup vault named VerigonVaultWE with a storage replication configuration set to locally redundant. Next, you registered an Azure VM named VerigonVM1 in that vault.

You attempt to change the storage replication configuration setting to geo-redundant. In the VerigonVaultWE settings, you notice the following message:

The storage replication choice cannot be changed once items have been registered to the vault.

You unregister the Azure VM under Registered items, but you still cannot change the storage replication configuration setting. How can you change the storage replication configuration setting to geo-redundant on a backup vault?

A

Create a new Azure backup vault

13
Q

You administer Azure for the Verigon Corporation. You have an Azure subscription and an Azure storage account. You want to upload 96 TB of blob data to your storage account, but you have decided that uploading all that data over the network is too expensive for your company.

You want to use Azure Import/Export service as a solution. What are the requirements?

A

The Azure Import/Export service has the following requirements:
•3.5-inch SATA II or III internal hard drives are supported.
•Disk sizes of up to 10 TB are supported.
•Disk drive volumes must be formatted with the NTFS file system.
•The disks must be prepared with the Azure Import/Export tool, WAImportExport.exe.
•A maximum of 10 hard drives can be used per job.
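A quick sanity check for the 96 TB scenario, assuming each disk can hold close to its full 10 TB (real usable capacity after NTFS formatting will be somewhat lower):

```python
import math

# Rough drive-count check for shipping 96 TB of blob data, using the
# stated limits: up to 10 TB per disk, max 10 disks per job.
MAX_DISK_TB = 10
MAX_DISKS_PER_JOB = 10

def disks_needed(data_tb: float, disk_tb: float = MAX_DISK_TB) -> int:
    """Minimum number of disks to hold the data, rounding up."""
    return math.ceil(data_tb / disk_tb)

needed = disks_needed(96)
print(needed, needed <= MAX_DISKS_PER_JOB)  # 10 drives, fits in one job
```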

14
Q

You are the administrator of the Verigon Corporation. You have an Azure stored access policy named Policy1, which grants READ permission, and another policy named Policy2, which grants WRITE permission. These policies grant permissions for blobs in the Azure storage blob container named images for the Azure storage account named veristacco. You want to control access through these policies.

On October 4, 2015, a user named VerigonUser1 tries to access the Azure Blob (A) through the following URL:

https://veristacco.blob.core.windows.net/images/verigon.jpg?se=2015-10-05T09%3A07Z&sr=b&si=Policy2&sig=Y2CQLua868d6Thi6uMUw1tPtMQo%2FVmZp1Eorqqtzg%3D

He gets the error "HTTP 404: The webpage cannot be found" in his browser.

On October 6, 2015, the same user tries to access another Azure Blob (B) with the following URL:

https://veristacco.blob.core.windows.net/images/verigon2.jpg?se=2015-10-05T09%3A07Z&sr=b&si=Policy1&sig=Y2CQLua868d6Thi6uMUw1tPtMQo%2FVmZp1Eorqqtzg%3D

He gets the following error message:

“AuthenticationFailed Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.”

What do you have to do to allow VerigonUser1 access to both Azure blobs?

A

For Azure Blob A you have to edit the stored access policy named Policy2 to add the READ permission to that policy. The user had no READ permission, and therefore could not access the file. You can do that with the Azure Storage Explorer tool or with PowerShell.

For Azure Blob B, you have to change the expiration date in the stored access policy: se=2015-10-05 in the URL sets the expiration date, and the user tried to access the file one day after that date, so he could not access the blob. Because the expiration date comes from the policy named Policy1, you have to change the policy.
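The two failure modes reduce to a simple permission-and-expiry check. This Python sketch mirrors the scenario's policies (the policy contents are paraphrased from the question, not read from Azure):

```python
from datetime import date

# Toy model of stored access policies: Policy1 grants READ, Policy2
# grants WRITE, and both expire on 2015-10-05 (the se= value).
policies = {
    "Policy1": {"permissions": {"r"}, "expires": date(2015, 10, 5)},
    "Policy2": {"permissions": {"w"}, "expires": date(2015, 10, 5)},
}

def can_read(policy_name: str, on_date: date) -> bool:
    """A read succeeds only with READ permission and before expiry."""
    pol = policies[policy_name]
    return "r" in pol["permissions"] and on_date <= pol["expires"]

print(can_read("Policy2", date(2015, 10, 4)))  # False: no READ permission
print(can_read("Policy1", date(2015, 10, 6)))  # False: policy expired
```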

15
Q

You administer Azure for the Verigon Corporation. You want to prepare Azure Backup so that you can schedule regular backups from your on-premises file server to Azure.

What are the necessary steps?

A
  1. Configure the backup vault
  2. Download the vault credentials
  3. Download and install the backup agent
  4. Register the server
  5. Configure the backup schedule
16
Q

You administer Azure for the Verigon Corporation. You set up Azure Backup on a Windows Server 2012 R2 file server with Azure PowerShell. You successfully have done the following:
•Verified the installation of PowerShell 4.0 and Azure PowerShell version 1.0.0
•Logged in to Azure with the Login-AzureRMAccount cmdlet
•Run the Switch-AzureMode AzureResourceManager command
•Created an Azure Backup Vault
•Installed the Azure Backup agent
•Downloaded Vault Credentials
•Registered the server with the Azure Backup Service
•Defined the encryption passphrase

You have to perform the final administrative steps so that files from specific folders on your Windows Server 2012 R2 server will be backed up every Saturday at 4:00 PM via a backup policy. The recovery points created from the backup must be retained for 14 days through an Azure backup retention policy. You have to verify the settings of the schedule, the retention policy, and the included files before performing the backup.

A

You should first run Set-OBSchedule to associate the backup schedule with a policy. For example, if you have created a schedule with $VerSched = New-OBSchedule -DaysofWeek Saturday -TimesofDay 16:00, then you have to associate this schedule with the policy. The following associates the policy saved in the $BackupPolicy variable with the schedule saved in the $VerSched variable:

Set-OBSchedule -Policy $BackupPolicy -Schedule $VerSched

You must run the New-OBRetentionPolicy cmdlet to configure the retention policy. The retention policy defines how long the recovery points will be retained. You can specify in days the duration for which the recovery points need to be retained. The following example sets a retention policy of two weeks:

$retpol = New-OBRetentionPolicy -retentiondays 14

You will run the Set-OBRetentionPolicy cmdlet to associate the retention policy with the main policy. The following example associates the retention policy saved in the $retpol variable with the policy saved in the $backuppolicy variable:

Set-OBRetentionPolicy -Policy $backuppolicy -retentionpolicy $retpol

You will then need to specify which files are being backed up. You will use the New-OBFileSpec cmdlet to define which files and folders have to be included in or excluded from the backup, and the Add-OBFileSpec cmdlet to add that file specification to the policy. In the following example, all files from the Marketing folder on the C drive and the Accounting folder on the D drive will be backed up:

$includedfiles = New-OBFileSpec -FileSpec @("C:\Marketing", "D:\Accounting")
Add-OBFileSpec -Policy $backuppolicy -FileSpec $includedfiles

You also can exclude folders with the -exclude parameter.

You will need to commit the backup policy. You can use the Set-OBPolicy cmdlet to commit the backup policy, including the Retention Policy. The following example commits the backup policy saved to the $backuppolicy variable:

Set-OBPolicy -Policy $backuppolicy

To skip the confirmation, you can use the parameter -Confirm:$false.

You will be required to verify the settings of the schedule, retention policy, and the included files before performing a backup. To do so, you would use the Get-OBPolicy cmdlet. The following syntax retrieves the schedule, retention policy, and included files settings:

Get-OBPolicy | Get-OBSchedule
Get-OBPolicy | Get-OBRetentionPolicy
Get-OBPolicy | Get-OBFileSpec

At the end, if you want to start the backup ad hoc, you can do this with the following command:

Get-OBPolicy | Start-OBBackup

17
Q

You are the Azure administrator for the Verigon Corporation. Verigon is a worldwide company with headquarters in Washington, DC, and branch offices in Europe and Asia. All users worldwide have to use an intranet ASP.NET web application named VerigonApp. You have detected that VerigonApp has unacceptably slow request times from Europe and Asia when the application accesses static content from your Azure Storage account, VerigonUSStorage, through HTTPS.

You want to implement the Azure Content Delivery Network (CDN) so that VerigonApp users will have improved request times when they access static content from VerigonUSStorage. You also want to make file changes in blob containers available more quickly through the CDN URL. Finally, you want the developers to be able to push content updates immediately.

Which steps do you have to take to accomplish these requirements?

A
  1. Create a CDN endpoint linked to the storage account VerigonUSStorage
  2. Ping the CDN endpoint to make sure it is online
  3. Enable the query string
  4. Change the blobs' Cache-Control headers
18
Q

You administer Azure for the Verigon Corporation. You want to access blob data in a storage account named veristacco through a registered custom domain name. You want to map the custom subdomain name to your blob service endpoint, veristacco.blob.core.windows.net, through the Azure Management Portal.

What DNS records can you use to complete the registration process for the custom domain?

A

A CNAME record named asverify.www

A CNAME record named www

19
Q

You are responsible for managing access to blobs in an Azure container named vercontainer in your Azure storage account named veristacco. You want to deliver a blob URL to your employee so that she can access the verfile.txt file with READ/WRITE-permissions.

She should only be able to access verfile.txt through the Azure Blob Service from October 20 until October 27, 2015, and only if the requesting client comes from the IP range 168.1.1.20 - 168.1.1.50.

You pre-created the Azure shared access signature (SAS):

Z%2FrHIY5XFg0Mq2rqI3OlWTjFh1tYkboXr1L8ZUXDtkk%3D

Which URL you should deliver to the user?

A

https://veristacco.blob.core.windows.net/vercontainer/verfile.txt?sv=2015-10-5&st=2015-10-20T22%3A18%3A26Z&se=2015-10-27T02%3A23%3A26Z&sr=b&sp=rw&sip=168.1.1.20-168.1.1.50&spr=https&sig=Z%2FrHIY5XFg0Mq2rqI3OlWTjFh1tYkboXr1L8ZUXDtkk%3D

The URL begins with the name of the storage account, followed by the container name and the name of the file. You then have to place a question mark (?) after verfile.txt to start the query string.

You will then add the sv= parameter, which defines the storage service version.

The st= parameter defines the start time. The value specified is when the shared access signature (SAS) becomes valid, and it must be specified in ISO 8601 format. If you omit the start time value, the SAS becomes valid immediately. Here the start time is specified in ISO 8601 format as 2015-10-20T22%3A18%3A26Z.

The se= parameter defines the expiration time of the SAS. Here the end time is specified in ISO 8601 format as 2015-10-27T02%3A23%3A26Z.

The sr= parameter defines the resource type. In this case it is b, which means the signed resource is a blob accessed through the Azure Blob service; a value of f would indicate a file in the Azure File service.

The sip= parameter defines the IP address or range from which a client request will be accepted. Here the allowed range is 168.1.1.20 - 168.1.1.50.

The spr= parameter is used to specify the permitted protocol, and here it is https, so the SAS can only be used over HTTPS. HTTP alone is not allowed for an Azure SAS.

The sig= parameter specifies the signature file that is used to authenticate access to the blob.
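Putting those parameters together, the query string can be assembled with standard URL encoding. This Python sketch uses the values from the scenario; urlencode produces the %3A and %2F escapes seen in the delivered URL:

```python
from urllib.parse import urlencode

# SAS query parameters from the scenario, in the order they appear
# in the delivered URL. urlencode percent-escapes ':' , '/' and '='.
params = {
    "sv": "2015-10-5",
    "st": "2015-10-20T22:18:26Z",
    "se": "2015-10-27T02:23:26Z",
    "sr": "b",
    "sp": "rw",
    "sip": "168.1.1.20-168.1.1.50",
    "spr": "https",
    "sig": "Z/rHIY5XFg0Mq2rqI3OlWTjFh1tYkboXr1L8ZUXDtkk=",
}
base = "https://veristacco.blob.core.windows.net/vercontainer/verfile.txt"
url = base + "?" + urlencode(params)
print(url)
```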

20
Q

You administer Azure for the Verigon Corporation. You have created some blobs in the Azure storage blob container named marketingdocuments for the Azure storage account named veristacco. You have created some service shared access signatures (SAS) based on the Azure storage account primary key. You have not created any Azure stored access policies.

On July 6 you created the shared access signatures. On the same day you delivered the relevant SAS URLs to the Marketing department, including the ad hoc SAS for every blob. The signature expiration date for these SAS URLs is set to October 4.

On September 1, your IT contact person in the Marketing department reported a burglary in the offices of the marketing department in Chicago. Some standalone laptops from marketing employees were stolen, including login information. You are asked to prevent access to the blob URLs immediately, because some of these URLs can be compromised through the stolen laptops. The shared access signature will stay valid as long as it is not expired.

How can you invalidate the SAS for the blob SAS URLs for the marketing department?

A

Regenerate the Azure storage account primary key

22
Q

You are the Azure administrator for Verigon Corporation. You are responsible for the Azure SQL database scaling strategy. You have evaluated the following technical requirements for your scaling strategy:
•Add or remove databases at any time without additional costs
•Perform administrative tasks for large numbers of Azure SQL databases
•Execute Transact-SQL queries that span multiple databases
•Increase or decrease computing power for the databases as needed

Which elastic database feature will satisfy which requirement?

A

To add or remove databases at any time without additional costs, use the elastic database pool feature. An elastic database pool is a set of elastic database throughput units (eDTUs) and storage that is shared by multiple databases. You can add or remove databases to or from the pool at any time. The databases inside a pool use only the resources they require.

To perform administrative tasks for large numbers of Azure SQL databases, you can use the elastic database client library feature, which makes life easier for administrators and developers. To manage a sharded database collection, a special database called the shard map manager is created. Developers can use it to register databases as shards and to specify the mapping of sharding keys or ranges to those databases. Without the elastic database client library, they would need much more time to write management code for sharding.

To run Transact-SQL queries that span multiple databases, you can use the elastic database jobs feature. This feature allows you to execute a Transact-SQL script or apply a DACPAC across a group of databases, such as a custom database collection or all databases in an elastic pool. You can create custom Azure SQL database groups; create, persist, and maintain SQL scripts to be executed across those groups; deploy DACPACs; and define schedules.

To increase or decrease computing power for the databases as needed, you can use the vertical scaling feature, also known as "scaling up." For example, this feature allows each customer's database to grow or shrink its resources as needed based on the workload.
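To make the shard map manager idea concrete, here is a toy Python sketch of a range shard map; the database names and key ranges are invented for illustration, and the real elastic database client library does far more:

```python
import bisect

# A range shard map records which database owns which range of
# sharding-key values. Hypothetical shards and ranges:
range_lows = [0, 1000, 2000]           # inclusive lower bounds, sorted
shard_dbs = ["ShardDB1", "ShardDB2", "ShardDB3"]

def shard_for(customer_id: int) -> str:
    """Route a sharding key to the database owning its range."""
    idx = bisect.bisect_right(range_lows, customer_id) - 1
    return shard_dbs[idx]

print(shard_for(42), shard_for(1500), shard_for(2500))
# ShardDB1 ShardDB2 ShardDB3
```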