GCP General Flashcards

1
Q

What is the purpose of Google Compute Engine?

A

Creating VMs; it is much like AWS EC2.

2
Q

What is the purpose of Google Cloud Storage?

A

It is object storage used for storing unstructured data like files, images, videos. It is much like AWS S3.

3
Q

What is the purpose of Google Cloud SQL?

A

It is a fully managed MySQL/PostgreSQL database service.

4
Q

What is the purpose of Google App Engine?

A

It is a PaaS for running cloud-native, highly available, and scalable applications, much like AWS Elastic Beanstalk.

5
Q

What is the purpose of Google Kubernetes Engine?

A

It is a managed Kubernetes service for running container-based applications.

6
Q

What is Google Cloud Functions?

A

It is functions as a service, much like AWS Lambda.

7
Q

What is the main unit in GCP used to group resources?

A

Project

8
Q

Is a project tied to a region?

A

No, it is independent of the region.

9
Q

Does GCP bill per hour? What if you use something for 5 minutes?

A

You pay for 5 minutes; GCP bills per minute.

10
Q

To receive discounts on pricing do you have to make an upfront commitment like you do with Azure and AWS using reserved instances?

A

No, you automatically receive a discount based on sustained usage by aggregating the instance usage over the month.

11
Q

When I store data on GCP do I need to enable encryption for data at rest and in transit?

A

No, all data in GCP is automatically encrypted both in transit and at rest.

12
Q

What are the big category areas in GCP?

A

Compute
- Compute Engine
- Container Engine (now GKE)
- App Engine
- Cloud Functions

Storage
- Cloud Storage (object storage)
- Bigtable (wide-column DB)
- Cloud SQL (MySQL and PostgreSQL)
- Cloud Datastore (document DB)
- Spanner (horizontally scalable relational DB)

Big Data
- Pub/Sub (high-performance messaging)
- Dataflow (ETL)
- BigQuery (data warehouse, SQL queries)
- Dataproc (managed Hadoop/Spark)
- Datalab (interactive notebooks)

Machine Learning
- Machine Learning Engine
- Vision API
- Speech API
- Translate API

13
Q

For GCP what is the structure hierarchy?

A

Organizations -> Folders -> Projects -> Resources

14
Q

For GCP what is the hierarchy parent of a resource?

A

Project

15
Q

For GCP what is the hierarchy parent of a project?

A

Folder (or the organization directly, if no folders are used)

16
Q

Can I attach a Project to two organizations?

A

No, a project can be attached to only a single organization.

17
Q

Is the project a core organizational component of GCP?

A

Yes, the project is the core hierarchy component, used for:
- Permissions
- Billing
- APIs

18
Q

What are the 3 project attributes?

A
  • Friendly name: e.g. ‘keith-project-01’
  • Project ID: also known as an app ID
  • Project number: used to identify resources belonging to a project
19
Q

Can you change the project friendly name?

A

Yes

21
Q

Can you have duplicate project names on GCP?

A

No; the project ID must be globally unique.

22
Q

Where can I quickly find my project ID?

A

When you open a project in the GCP console, the project panel is at the top right. It contains:
- Friendly name
- Project ID

23
Q

Can you rename a GCP project?

A

Yes

24
Q

When you shut down a project, can the shutdown be stopped within 30 days?

A

Yes, the billing owner will be notified and has the option to stop the shutdown.

25
Q

When you shut down a GCP project, when will the project be deleted forever?

A

30 days from shutdown.

26
Q

How can I restore a project from the console?

A

In IAM & Admin -> Manage Resources there is a link for resources pending deletion.

27
Q

In GCP what is IAM?

A

IAM is Identity and Access Management; it is like IAM in AWS.

28
Q

What is the GCP hierarchy?

A

Organizations
Folders
Projects
Resources

29
Q

What is a GCP organization?

A

The organization represents a company.

30
Q

Is the GCP organization a root node?

A

Yes, it is the start of the hierarchy.

31
Q

What are the 3 types of interaction with GCP?

A

GCP web portal, CLI and API.

32
Q

What are gcloud, gsutil, bq?

A

gcloud is the CLI for most common GCP tasks
gsutil is the CLI for Google Cloud Storage
bq is the CLI for Google BigQuery
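One example invocation of each tool; the bucket, dataset, and instance names here are placeholders:

```shell
# gcloud: general-purpose CLI, e.g. list VM instances in the current project
gcloud compute instances list

# gsutil: Cloud Storage CLI, e.g. list objects in a bucket
gsutil ls gs://my-bucket

# bq: BigQuery CLI, e.g. run a standard-SQL query
bq query --use_legacy_sql=false 'SELECT 1 AS x'
```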

33
Q

What format does the API use?

A

JSON

34
Q

What authorization and authentication does the GCP API use?

A

OAuth 2.0
To turn on and off APIs in Google Cloud Platform (GCP), you need the following permissions:

cloudresourcemanager.projects.update: This permission allows you to modify project metadata, including enabling or disabling APIs.
cloudresourcemanager.projects.get: This permission is required to retrieve project information before making changes.
Note: These permissions can be granted at the project level or at the organization level.

35
Q

If you were polling the GCP API often and started to receive errors, what could it be?

A

The API has a daily quota/rate limit, and you may have exceeded it.

36
Q

What is the GCP cloud shell?

A

It is a shell available in the GCP console; it is a Linux machine with 5 GB of persistent storage. The home directory is where data is preserved.

37
Q

What is the format of the gcloud command?

A

gcloud GROUP [SUBGROUP] COMMAND [ARGS]
For example: gcloud compute instances create my-instance

38
Q

How can you call an API to try it out?

A

Use Google APIs Explorer, a GCP tool that lets you call GCP APIs from a web console.

39
Q

What is the name of the service for running VM’s?

A

Google Compute Engine

40
Q

What is the name of the service for running applications?

A

Google App Engine

41
Q

What is the name of the service for running containers?

A

Google Container Engine (now Google Kubernetes Engine)

42
Q

What is the name of the Google service for running functions?

A

Google Cloud Functions

43
Q

What is Google Compute Engine used for?

A

Running VM instances

44
Q

What is Google Container Engine used for?

A

Running containers orchestrated by Kubernetes.

45
Q

What is Google App Engine used for?

A

It is a PaaS for running apps in runtimes such as Python, Go, PHP, Java, and .NET Core, or packaged as a Docker image.

46
Q

What is Google Cloud Functions used for?

A

It is functions as a service, restricted to a small set of supported runtimes (such as Node.js). It is triggered by an event.

47
Q

What is a label and where is it used?

A

A label is a tag and is used to group and filter resources.
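As a sketch, labels can be attached to resources and then used as filters with gcloud; the instance name and label values are placeholders:

```shell
# Attach labels to an existing VM instance
gcloud compute instances update my-vm --update-labels=env=dev,team=web

# List only the instances carrying a given label
gcloud compute instances list --filter="labels.env=dev"
```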

48
Q

What is a fast and easy way to grab an unknown CLI command?

A

In the web console, many create/edit screens can show you the equivalent gcloud CLI command for the operation you are configuring.

49
Q

What is the URL for the google login?

A

console.cloud.google.com

50
Q

Is a zone equal to a physical data center?

A

No; they are fault-tolerant logical units. Zones can exist in multiple data centers, and each data center can host multiple independent zones.

51
Q

Are zones separated by more than 160 KM?

A

No, zones exist in more than one physical data center in a region, with each data center having multiple zones.

52
Q

When resources store data in a multi region, what is the expected minimum distance between regions?

A

More than 160 KM
In Google Cloud (GCP), the minimum distance between regions typically exceeds 100 miles (160 km). This geographic separation helps to ensure data redundancy and high availability by mitigating the impact of local natural disasters or network failures. GCP regions are designed to be physically and logically separated, with multiple availability zones within each region that are also spread out geographically.

53
Q

For a project, can you turn the GCP APIs on and off per service?

A

Yes; for example, you can turn off the API for Compute Engine.
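For example, per-project APIs can be toggled with the gcloud services commands:

```shell
# Show which APIs are currently enabled for the project
gcloud services list --enabled

# Enable and disable the Compute Engine API for the current project
gcloud services enable compute.googleapis.com
gcloud services disable compute.googleapis.com
```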

54
Q

What are the different types of workloads in GKE?

A

Types of workloads
Kubernetes provides different kinds of controller objects that correspond to different kinds of workloads you can run. Certain controller objects are better suited to representing specific types of workloads. The following sections describe some common types of workloads and the Kubernetes controller objects you can create to run them on your cluster, including:

Stateless applications
Stateful applications
Batch jobs
Daemons

55
Q

What is a deployment in GKE?

A

Deployments represent a set of multiple, identical Pods with no unique identities. A Deployment runs multiple replicas of your application and automatically replaces any instances that fail or become unresponsive. In this way, Deployments help ensure that one or more instances of your application are available to serve user requests. Deployments are managed by the Kubernetes Deployment controller.

Deployments use a Pod template, which contains a specification for its Pods. The Pod specification determines what each Pod should look like: what applications should run inside its containers, which volumes the Pods should mount, its labels, and more.

When a Deployment’s Pod template is changed, new Pods are automatically created one at a time.
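A minimal way to create and update a Deployment with kubectl; the deployment name and nginx image are placeholder choices:

```shell
# Create a Deployment running 3 identical Pods
kubectl create deployment hello-web --image=nginx --replicas=3

# Changing the Pod template (here, the image) triggers a rolling replacement of Pods
kubectl set image deployment/hello-web nginx=nginx:1.25
```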

56
Q

What are VPC Service Controls?

A

They control data flow in and out of a VPC service perimeter.
VPC Service Controls provides an extra layer of security defense for Google Cloud services that is independent of Identity and Access Management (IAM). While IAM enables granular identity-based access control, VPC Service Controls enables broader context-based perimeter security, including controlling data egress across the perimeter. We recommend using both VPC Service Controls and IAM for defense in depth.
Enforced mode
Enforced mode is the default mode for service perimeters. When a service perimeter is enforced, requests that violate the perimeter policy, such as requests to restricted services from outside a perimeter, are denied.

A perimeter in enforced mode protects Google Cloud resources by enforcing the perimeter boundary for the services restricted in the perimeter configuration. API requests to restricted services do not cross the perimeter boundary unless the conditions of the necessary ingress and egress rules of the perimeter are satisfied. An enforced perimeter protects against data exfiltration risks, such as stolen credentials, misconfigured permissions, or malicious insiders that have access to the projects.

Establish virtual security perimeters for API-based services
Users can define a security perimeter around Google Cloud resources such as Cloud Storage buckets, Bigtable instances, and BigQuery datasets to constrain data within a VPC and control the flow of data. With VPC Service Controls, enterprises can keep their sensitive data private as they take advantage of the fully managed storage and data processing capabilities of Google Cloud.

57
Q

How many ways can you publish a message to a Cloud Pub/Sub topic?

A

Console
CLI
API call
You can also publish messages to a topic that is associated with a schema.
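A CLI sketch of the publish path; the topic and subscription names are placeholders:

```shell
# Create a topic and a subscription attached to it
gcloud pubsub topics create my-topic
gcloud pubsub subscriptions create my-sub --topic=my-topic

# Publish a message, then pull it from the subscription
gcloud pubsub topics publish my-topic --message="hello"
gcloud pubsub subscriptions pull my-sub --auto-ack
```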

58
Q

What file types can you use for deployment manager?

A

YAML for configurations, with optional Python or Jinja2 templates.

59
Q

Which load balancers can use a target pool of instances for the backend?

A

Using target pools

External TCP/UDP Network Load Balancing can use either a backend service or a target pool to define the group of backend instances that receive incoming traffic.

When a network load balancer’s forwarding rule directs traffic to a target pool, the load balancer chooses an instance from the target pool based on a hash of the source IP address, the source port, the destination IP address, and the destination port.
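A sketch of wiring a network load balancer to a target pool; the names, region, and zone are placeholders:

```shell
# Create a target pool and add existing instances to it
gcloud compute target-pools create www-pool --region=us-central1
gcloud compute target-pools add-instances www-pool \
    --instances=www1,www2 --instances-zone=us-central1-a

# Forward incoming TCP traffic on port 80 to the pool
gcloud compute forwarding-rules create www-rule \
    --region=us-central1 --ports=80 --target-pool=www-pool
```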

60
Q

How do you choose a load balancer type?

A

Whether traffic comes from the internet (external) or is internal
Whether the traffic is TCP/UDP or HTTP(S)

61
Q

What roles do you need for a Monitoring Workspace?

A

Monitoring Editor, Monitoring Owner, and Project Owner

62
Q

What are the two ways to have a workspace for two projects?

A

If you have created a multiproject Workspace in the past, you have a few options to consider.

You can either create a new Workspace
or

Add the project to an existing Workspace.

Google describes these as the

Add Workspace Approach

Merge Workspace Approach

63
Q

A Workspace requires that your project be provisioned with the following user roles in IAM:

A

•Monitoring Editor

•Monitoring Admin

•Project Owner

Before you create a new Workspace, you need to identify who in the organization has roles in a given project. To do so, go to Cloud Console and select IAM & Admin. Each user role is listed beside the member name (see Figure 11-2). You should filter your user list to only actual users in order to remove service accounts.

64
Q

How do you establish a service account for a new application on your GCE VM instance?

A

You create the service account in IAM.
You add the account to the GCE instance.
Don’t use the Google default service account; it doesn’t have the proper permissions.
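These steps might look like the following; the account name, project, and granted role are placeholders:

```shell
# 1. Create the service account in IAM
gcloud iam service-accounts create my-app-sa --display-name="My app"

# 2. Grant it only the permissions the application needs
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:my-app-sa@my-project.iam.gserviceaccount.com" \
    --role="roles/storage.objectViewer"

# 3. Attach it to the VM instead of the default service account
gcloud compute instances create my-vm \
    --service-account=my-app-sa@my-project.iam.gserviceaccount.com \
    --scopes=cloud-platform
```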

65
Q

Your company uses BigQuery to store customer information in its data warehouse. The company is made up of five different business units. Each business unit has created thousands of projects based on different marketing metrics. The chief marketing officer has asked the CIO if the cloud engineering team can create a single dataset that evaluates all tables where customers purchased specific products that were recently recalled across stores. The column name is PROD_ID. What is the most economical way to complete this task?

A

Establish a Cloud Dataflow job. This requires the creation of an empty BigQuery table and the use of a Dataflow SQL query looping through all the project data pooled together in the organization. You would query on the schema columns in the view you’ve created that match PROD_ID.

Cloud Dataflow is GCP’s data processing service for both batch and real-time data streaming applications. Dataflow allows developers to set up processing capabilities to integrate, prepare, and analyze big data analytics sets. Creating a single table in BigQuery that ingests all datapoints using a common information schema creates many opportunities for data efficiency. Once all data is isolated in the single table, querying for the specific identifier is easier than querying across several hundred datasets or having to migrate across many tools unnecessarily.

66
Q

You recently configured a Google Kubernetes instance. You want to make this application available to the public. You consider the need for Cloud Load Balancing given the application is publicly exposed, with a significant population of users. What type of configuration best fits these requirements?

A

Create a Kubernetes service using type NodePort. Then create an ingress resource to ensure the HTTPS web server application is publicly accessible.

NodePort is another option for exposing your GKE application to the public, but it has some limitations compared to HTTP(S) load balancing:

Less scalable: NodePort doesn’t offer the same level of scalability and performance as HTTP(S) load balancing, especially for large-scale applications.
Requires manual configuration: You need to manually configure firewall rules and DNS records to expose your application using NodePort.
Security concerns: NodePort can expose your application directly to the internet, making it more vulnerable to attacks.
When to use NodePort:

Small-scale applications: NodePort is suitable for small-scale applications with limited traffic.
Testing and development: It can be used for testing and development purposes before deploying a more scalable solution.
Specific use cases: There may be specific use cases where NodePort is preferred, such as when you need to access the application from within the cluster or for certain networking configurations.
However, for most production applications, HTTP(S) load balancing is the recommended approach due to its superior scalability, performance, and security benefits.

67
Q

How do you expose applications using GKE Services?

A

This page shows how to create Kubernetes Services in a Google Kubernetes Engine (GKE) cluster. For an explanation of the Service concept and a discussion of the various types of Services, see Service.

Introduction
The idea of a Service is to group a set of Pod endpoints into a single resource. You can configure various ways to access the grouping. By default, you get a stable cluster IP address that clients inside the cluster can use to contact Pods in the Service. A client sends a request to the stable IP address, and the request is routed to one of the Pods in the Service.

There are five types of Services:

ClusterIP (default)
NodePort
LoadBalancer
ExternalName
Headless

68
Q

You created a Google Kubernetes Engine cluster. The cluster requires a minimum of four Pods to be operational at any given time. You need to make sure that the minimum number of Pods in the cluster is always available. What approach should be utilized to ensure operational continuity?

A

The purpose of a ReplicaSet is to maintain a stable set of replica Pods running at any given time. ReplicaSets are used to guarantee the availability of a specified number of identical Pods.
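In practice you usually set the replica count on a Deployment, which manages the ReplicaSet for you; the deployment name here is a placeholder:

```shell
# Ensure four replicas are maintained at all times
kubectl scale deployment my-app --replicas=4

# Inspect the ReplicaSet the Deployment controller created
kubectl get replicaset
```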

69
Q

A new teammate needs to gain access to a recently deployed Linux VM using SSH. They are only familiar with how to connect to a VM using Windows RDP. What would you suggest the user consider?

A

You would instruct the user that Compute Engine self-manages the SSH keys when you connect to a Linux VM using a browser. The instance creates and applies the SSH key pairs automatically. Since you cannot manage the SSH keys, you connect using the browser where roles are controlled by Cloud IAM. You must be a project member with roles/compute.instanceAdmin.
The user is given the appropriate project membership role, roles/compute.instanceAdmin. The permission assigned allows for the creation, modification, and deletion of virtual machine instances, including Shielded VMs. If the user is intended to run VM instances as service accounts, the user also requires the roles/iam.serviceAccountUser role.

70
Q

You recently deployed a highly available application in your GCP environment. The helpdesk has received notifications daily that the system experiences performance issues around noon. Many users are getting timeout errors. Which Operations solution is best to determine the initial end-user problems?

A

Cloud Trace is a distributed tracing application. The purpose of Cloud Trace is to enable developers and their counterpart operations engineers to identify where performance issues exist, specifically code-based performance challenges.

71
Q

Your organization would like to create new projects for each of the DevOps interns. The interns will be partaking in a corporate-wide initiative to develop mobile applications on GCP. The cloud engineer is having trouble creating new accounts for each intern. What permissions are required to create a project?

A

The user must have roles/resourcemanager.projectCreator permission.
The role provides access to create new projects. If the user creates a project, they are then automatically given owner role access for that project.
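Granting that role at the organization level might look like this; the organization ID and user email are placeholders:

```shell
gcloud organizations add-iam-policy-binding 123456789012 \
    --member="user:intern@example.com" \
    --role="roles/resourcemanager.projectCreator"
```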

72
Q

You work for a design company. You recently created various graphics stored in a ZIP file called Graphic.ZIP. You housed them in the Cloud Storage bucket called Demo. Much of the team develops its graphics portfolio on a Compute Engine instance called DesignInstance. Today, you presented your work to the client and want to move the designs placed in Demo to Production. How would you go about completing this task?

A

gsutil mv gs://Demo/Graphic.ZIP gs://Production/Graphic.ZIP

When you are moving objects between Cloud Storage buckets, you utilize gsutil, and mv stands for move. The Demo bucket is the source bucket, whereas Production is the destination. The question is asking you to move the document, not copy it.

73
Q

What is a replica set?

A

ReplicaSet is a controller, as mentioned earlier in the chapter, that is capable of identifying when there are not enough Pods available for a running application or workload. If such a condition is recognized, a ReplicaSet will create one or more Pods. ReplicaSets can also update and delete Pods.

74
Q

What are GKE Deployments?

A

Deployments are sets of like-kind Pods, managed by the Kubernetes Deployment controller, that do not have unique identities. When an instance becomes unhealthy, a Deployment can run one or more replicas of an application to replace failed or unresponsive instances. Deployments ensure that application instances are available to serve user requests as necessary.
Deployments make use of the Pod template to describe a Pod specification.
The Pod specification acts as a blueprint detailing how the Pod should act and look throughout its entire life cycle. The specification makes clear how an application should operate, the prerequisites for volume mounting, and labeling conventions, among other details. New Pods are automatically created any time a template is modified.

75
Q

How do you run BigQuery jobs programmatically?

A

To run a BigQuery job programmatically using the REST API or client libraries, you:

Call the jobs.insert method.
Periodically request the job resource and examine the status property to learn when the job is complete.
Check to see whether the job finished successfully

Required permissions
To run a BigQuery job, you need the bigquery.jobs.create IAM permission.

Each of the following predefined IAM roles includes the permissions that you need in order to run a job:

roles/bigquery.user
roles/bigquery.jobUser
roles/bigquery.admin
Additionally, when you create a job, you are automatically granted the following permissions for that job:

bigquery.jobs.get
bigquery.jobs.update
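With the bq CLI, the same submit-then-poll pattern is handled for you; the project, dataset, table, and JOB_ID are placeholders:

```shell
# Submit a query job and wait for the result
bq query --use_legacy_sql=false \
    'SELECT name FROM `my-project.my_dataset.my_table` LIMIT 10'

# List recent jobs, then inspect one job's status
bq ls -j
bq show -j JOB_ID
```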

76
Q

What roles would you use to manage service accounts? to create?

A

roles/iam.serviceAccountUser - run operations as (act as) a service account
roles/iam.serviceAccountAdmin - create and manage service accounts
roles/iam.serviceAccountCreator - create service accounts, but nothing else