Linux Academy Flashcards
First practice exam
Your company has resources hosted in two different regions. You want to keep data in sync across region 1 and region 2. Which product should you use?
A. Google Dataprep
B. Google Compute Engine
C. Google Cloud SQL
D. Google Cloud Storage
D. Google Cloud Storage
What protocol handles the process of automatically discovering new routes and subnets between two Cloud Routers?
A. ASN
B. Network Discovery
C. TCP/IP
D. Border Gateway Protocol
Border Gateway Protocol
Border Gateway Protocol (BGP) handles the process for Cloud Routers to automatically discover new subnets on a peer network over VPN.
If external auditors need to be able to access your admin activity logs once a year for compliance, what is the best method of preserving and sharing that log data? (Choose two)
A. If they do not need a GCP account and need to view a single date’s object, export the logs to a Cloud Storage bucket for long-term retention and generate a signed URL for temporary object-level access.
B. Export logs to Cloud Storage bucket and email a list of the logs once per year.
C. If they need access to multiple logs in a single bucket, and they have a GCP account, export logs to a Cloud Storage bucket for long-term retention and grant auditor accounts the Storage Object Viewer role to the bucket.
D. Create GCP accounts for the auditors and grant the Project Viewer role to view logs in Stackdriver Logging.
A. If they do not need a GCP account and need to view a single date’s object, export the logs to a Cloud Storage bucket for long-term retention and generate a signed URL for temporary object-level access.
C. If they need access to multiple logs in a single bucket, and they have a GCP account, export logs to a Cloud Storage bucket for long-term retention and grant auditor accounts the Storage Object Viewer role to the bucket.
Why is this correct?
The choice between IAM and signed URLs depends on whether the auditors need a GCP account, and whether they need access to a single object or to all logs in a bucket.
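As a sketch, a time-limited signed URL for a single log object can be generated with gsutil signurl; the bucket name, object name, and service-account key path below are placeholders:

```shell
# Generate a URL valid for 7 days (the maximum) pointing at one log object.
# audit-logs-bucket, 2024-01-15.json, and sa-key.json are hypothetical names.
gsutil signurl -d 7d sa-key.json gs://audit-logs-bucket/2024-01-15.json
```

Anyone holding the URL can fetch that one object until it expires, with no GCP account required.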
A recent software update to an e-commerce website running on Google Cloud has caused the website to crash for several hours. The CTO decides that all critical changes must now have a back-out/roll-back plan. The website is deployed on hundreds of virtual machines (VMs), and critical changes are frequent. Which two actions should you take to implement the back-out/roll-back plan? (Choose two)
A. Enable object versioning on the website’s static data files stored in Google Cloud Storage.
B. Create a Nearline copy for the website’s static data files stored in Google Cloud Storage.
C. Use managed instance groups with the “update-instances” command when starting a rolling update.
D. Create a snapshot of each VM prior to an update, and recover the VM from the snapshot in case of a new version failure.
A. Enable object versioning on the website’s static data files stored in Google Cloud Storage.
C. Use managed instance groups with the “update-instances” command when starting a rolling update.
The managed instance group updater allows for easy management of the VMs and lets Compute Engine take care of updating each instance. Rolling back is simply another rolling update that targets the previous version.
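In today’s CLI the flashcard’s “update-instances” workflow corresponds to rolling-action start-update; a sketch, where the group, template, and zone names are assumptions:

```shell
# Roll the group forward to a new instance template.
gcloud compute instance-groups managed rolling-action start-update web-mig \
    --version=template=web-template-v2 --zone=us-central1-a

# Back out: start another rolling update targeting the previous template.
gcloud compute instance-groups managed rolling-action start-update web-mig \
    --version=template=web-template-v1 --zone=us-central1-a
```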
You want to archive the most recent version of an object ‘file1.txt’ in your bucket ‘log-files’ in Cloud Storage with versioning turned on. What is the correct command to do so?
A. gsutil del gs://log-files/file1.txt
B. gsutil rm gs://log-files/file1.txt
C. gsutil rm -r gs://log-files/file1.txt
D. gcloud rm -r gs://log-files/file1.txt
B. gsutil rm gs://log-files/file1.txt
Be careful with the -r option: it will remove all versions of the document. Note that -a has the same functionality as -r in this case.
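With versioning enabled, removing the live object archives it as a noncurrent version, which you can confirm by listing all generations (bucket and object names are the ones from the question):

```shell
# Archive the current version of file1.txt (it becomes a noncurrent version).
gsutil rm gs://log-files/file1.txt

# List all versions of the object, including archived generations.
gsutil ls -a gs://log-files/file1.txt
```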
You need to analyze log data from your credit card processing application while staying in compliance with PCI regulations. What is the best method to perform this task?
A. Forward data from Cloud Storage into Cloud Dataproc.
B. Export data from your on-premises application into BigQuery for analysis.
C. Using a Squid Proxy, have data collected by Stackdriver Logging exported to BigQuery via a sink based on needed log filters.
D. Export data from your Squid Proxy via Cloud Pub/Sub into BigQuery.
C. Using a Squid Proxy, have data collected by Stackdriver Logging exported to BigQuery via a sink based on needed log filters.
The proper model for exporting credit card processing data is to forward it from a Squid proxy to Stackdriver Logging, then export from Stackdriver Logging into BigQuery via a sink.
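A BigQuery export sink with a log filter might be created roughly like this; the sink name, project, dataset, and filter values are assumptions:

```shell
# Create a sink that exports only the proxy access logs into BigQuery.
gcloud logging sinks create cc-proxy-sink \
    bigquery.googleapis.com/projects/my-project/datasets/proxy_logs \
    --log-filter='resource.type="gce_instance" AND logName:"squid"'
```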
When creating firewall rules, what forms of segmentation can narrow which resources the rule is applied to? (Choose all that apply)
A. Region
B. Zone
C. Network tags
D. Network range in source filters
C. Network tags
D. Network range in source filters
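Both forms of narrowing appear directly as flags on the firewall-rule command; the rule name, tag, and CIDR range below are examples:

```shell
# Allow HTTPS only from one source range, and only to instances tagged web-server.
gcloud compute firewall-rules create allow-https-corp \
    --network=default --direction=INGRESS --action=ALLOW --rules=tcp:443 \
    --source-ranges=203.0.113.0/24 --target-tags=web-server
```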
Your customer is moving their storage product to Google Cloud Storage (GCS). The data contains personally identifiable information (PII) and sensitive customer information. Once migrated, what security strategy should you use for GCS to minimize exposure to internal users and the public?
A. Use signed URLs to generate time bound access to objects.
B. Grant IAM read-only access to internal users, and use default ACLs on the bucket.
C. Grant no Google Cloud Identity and Access Management (Cloud IAM) roles to internal users, and use granular ACLs on the bucket.
D. Create randomized bucket and object names. Enable public access, but only provide specific file URLs to people who do not have Google accounts and need access.
C. Grant no Google Cloud Identity and Access Management (Cloud IAM) roles to internal users, and use granular ACLs on the bucket.
Using the principle of least privilege and allowing for maximum automation, what steps can you take to store audit logs for long-term access and to allow access for external auditors to view? (Choose two)
A. Export audit logs to BigQuery via an export sink.
B. Export audit logs to Cloud Storage via an export sink.
C. Generate a signed URL to the Stackdriver export destination for auditors to access.
D. Create an account for auditors to have view access to Stackdriver Logging.
B. Export audit logs to Cloud Storage via an export sink.
C. Generate a signed URL to the Stackdriver export destination for auditors to access.
Exporting to Cloud Storage preserves the logs long term, and a signed URL grants auditors time-limited access without creating GCP accounts for them.
You are developing a new application that needs to store and analyze over a petabyte of data in NoSQL format. Which product would you choose?
A. Cloud Spanner
B. BigQuery
C. Cloud Bigtable
D. Cloud Datastore
C. Cloud Bigtable
Your company is planning on deploying a web application to Google Cloud hosted on a custom Linux distribution. Your website will be accessible globally and needs to scale to meet demand. Choose all of the components that will be necessary to achieve this goal. (Choose all that apply)
A. Managed Instance Group on Compute Engine
B. Network Load Balancer
C. App Engine Standard environment
D. HTTP Load Balancer
A. Managed Instance Group on Compute Engine
D. HTTP Load Balancer
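The two components wire together at the gcloud level roughly as follows; every name is a placeholder, and supporting steps such as creating the instance template and the health check (web-hc) are elided:

```shell
# Managed instance group built from a template using the custom Linux image.
gcloud compute instance-groups managed create web-mig \
    --template=web-template --size=3 --zone=us-central1-a

# Global HTTP load balancing: backend service -> URL map -> proxy -> forwarding rule.
gcloud compute backend-services create web-backend --global \
    --protocol=HTTP --health-checks=web-hc
gcloud compute backend-services add-backend web-backend --global \
    --instance-group=web-mig --instance-group-zone=us-central1-a
gcloud compute url-maps create web-map --default-service=web-backend
gcloud compute target-http-proxies create web-proxy --url-map=web-map
gcloud compute forwarding-rules create web-fr --global \
    --target-http-proxy=web-proxy --ports=80
```

The HTTP load balancer is the global piece; a network load balancer is regional, which is why it cannot satisfy the “accessible globally” requirement.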
You want to automatically and simultaneously deploy new code to Kubernetes Engine (GKE) clusters in two different regions. Which method should you use?
A. Change the clusters to activate federated mode.
B. Use Google Cloud Container Builder to publish the new images.
C. Use Parallel SSH with Google Cloud Shell and kubectl.
D. Use an automation tool, such as Jenkins.
D. Use an automation tool, such as Jenkins.
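Jenkins (or any automation tool) would essentially run the same kubectl rollout against both clusters’ contexts; the context, deployment, and image names here are hypothetical:

```shell
# Push the same image to both regional clusters at the same time.
for ctx in gke_my-proj_us-central1_web gke_my-proj_europe-west1_web; do
  kubectl --context "$ctx" set image deployment/web web=gcr.io/my-proj/web:v2 &
done
wait   # block until both rollouts have been triggered
```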
You have a mission-critical database running on a Linux instance on Google Compute Engine. You need to automate a database backup once per day to another disk. The database must remain fully operational with no downtime. How can you best perform an automated backup of the database with no downtime and minimal costs?
A. Use the automated snapshot service on Compute Engine to schedule a snapshot.
B. Write the database to two different disk locations simultaneously, then schedule a snapshot of the secondary disk, which will allow the primary disk to continue running.
C. Use a cron job to schedule a disk snapshot once per day.
D. Use a cron job to schedule your application to backup the database to another persistent disk.
D. Use a cron job to schedule your application to backup the database to another persistent disk.
This both minimizes costs (no extra disks) and avoids downtime (no need to freeze the database): backing up just the database to another disk using a cron job is the preferred answer.
It is also possible to back up the database to a Cloud Storage bucket instead of a disk, which would be cheaper for the same amount of storage. Be sure to note which specific parameters the exam question gives.
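A minimal sketch, assuming a MySQL database and a second persistent disk mounted at /mnt/backup (both assumptions not stated in the question):

```shell
# /etc/cron.d entry: dump the live database to the secondary disk at 02:00 daily.
# mysqldump --single-transaction takes a consistent online backup, so the
# database stays fully available. % must be escaped inside cron entries.
0 2 * * * root mysqldump --single-transaction appdb > /mnt/backup/appdb-$(date +\%F).sql
```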
What is the best practice for separating responsibilities and access for production and development environments?
A. Separate project for each environment, both teams have access to both projects.
B. Separate project for each environment, each team only has access to their project.
C. Both environments use the same project, just note which resources are in use by which group.
D. Both environments use the same project, but different VPCs.
B. Separate project for each environment, each team only has access to their project.
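At the IAM level this is just a per-project role binding; the project IDs, group addresses, and role choice below are assumptions:

```shell
# Each team is bound only to its own project; neither can touch the other.
gcloud projects add-iam-policy-binding dev-project \
    --member='group:dev-team@example.com' --role='roles/editor'
gcloud projects add-iam-policy-binding prod-project \
    --member='group:ops-team@example.com' --role='roles/editor'
```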
What information is required to connect to an on-premises network router over VPN using Cloud Router for dynamic routing? (Choose all that apply)
A. Remote router DNS name
B. Shared secret
C. Remote router (peer) IP address
D. Border Gateway Protocol address
B. Shared secret
C. Remote router (peer) IP address
D. Border Gateway Protocol address
Using Cloud Router for dynamic routing requires a BGP address along with the peer IP address and a shared secret for secure access.
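The pieces map onto gcloud roughly like this; all names, addresses, ASNs, and the secret are placeholders:

```shell
# Cloud Router that will speak BGP with the on-premises router.
gcloud compute routers create on-prem-router --network=default \
    --region=us-central1 --asn=65001

# VPN tunnel to the on-premises peer, authenticated by the shared secret.
gcloud compute vpn-tunnels create tunnel-1 --region=us-central1 \
    --peer-address=198.51.100.2 --shared-secret=SECRET \
    --target-vpn-gateway=gw-1 --router=on-prem-router
```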
You want to automate collecting billing data for analysis. What is the best way to do this?
A. Export billing reports to Cloud Storage.
B. Forward daily reports to your data analysis team.
C. Download a CSV file of billing info.
D. Export billing reports to BigQuery.
D. Export billing reports to BigQuery.
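Once the BigQuery export is enabled (a Billing console setting), the exported table can be queried like any other; the project, dataset, and table names here are illustrative:

```shell
# Total cost per service from the exported billing table.
bq query --use_legacy_sql=false '
  SELECT service.description, SUM(cost) AS total_cost
  FROM `my-project.billing_export.gcp_billing_export_v1_XXXXXX`
  GROUP BY 1 ORDER BY total_cost DESC'
```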
You have approximately 10 separate media files over 500GB each that you need to migrate to Google Cloud Storage. The files are in your on-premises data center. What migration method can you use to help speed up the transfer process?
A. Start a recursive upload.
B. Use multi-threaded uploads using the -m option.
C. Use parallel uploads to break the file into smaller chunks then transfer it simultaneously.
D. Use the Cloud Transfer Service to transfer.
C. Use parallel uploads to break the file into smaller chunks then transfer it simultaneously.
Parallel composite uploads break larger files into pieces for faster uploads.
D is not correct because the Storage Transfer Service is limited to AWS S3, Google Cloud Storage, and HTTP/HTTPS sources.
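gsutil turns on parallel composite uploads past a configurable size threshold; a sketch, where the local path, bucket name, and threshold value are assumptions:

```shell
# Upload each large file in parallel chunks that GCS composes server-side.
gsutil -o GSUtil:parallel_composite_upload_threshold=150M \
    cp /data/media/*.mp4 gs://media-archive/
```

Note the contrast with -m (answer B), which parallelizes across multiple files rather than splitting a single large file.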