Microsoft Fabric Flashcards

Keywords - Concepts

1
Q

What are the different sets of roles?

A

Workspace roles
OneLake data access roles

2
Q

What are the two top-level items you find when you open a lakehouse?

A

Tables
Files

3
Q

Where can you create shortcuts within a lakehouse?

A

Under Tables: only at the top level.
Under Files: anywhere in the folder structure.

4
Q

Which locations can be the target of a shortcut?

A

Amazon S3
ADLS Gen2
Google Cloud Storage
On-premises sources

5
Q

Can you write to the shortcut target location?

A

In ADLS Gen2: yes, if you have write permission at the target.
In Amazon S3: no; S3 shortcuts are read-only, even if you have write permission at the target.

6
Q

How are permissions resolved for shortcuts?

A

An IAM/Azure role is associated with the shortcut at creation time, and that role is used to resolve permissions at the target.

7
Q

Which authentication methods are used when setting up the connection for an ADLS Gen2 shortcut?

A

Specify the full dfs URL, including the container name (https://<account>.dfs.core.windows.net/<container>).
An account key, SAS token, or service principal can be used to authenticate.
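
As an illustration, such a shortcut can also be created programmatically through the Fabric REST API. The sketch below reflects my reading of the OneLake Shortcuts endpoint; the payload field names, endpoint shape, and all GUIDs/names are assumptions to verify against the current API docs.

import requests

# Placeholders: supply a valid Microsoft Entra token for the Fabric API
# plus the GUIDs of your workspace and lakehouse.
token = "<bearer-token>"
workspace_id = "<workspace-guid>"
lakehouse_id = "<lakehouse-guid>"

resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
    f"/items/{lakehouse_id}/shortcuts",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "name": "SalesData",  # how the shortcut appears in the lakehouse
        "path": "Files",      # under Files, any folder depth works
        "target": {
            "adlsGen2": {
                # Full dfs endpoint of the target storage account:
                "location": "https://myaccount.dfs.core.windows.net",
                "subpath": "/mycontainer/sales",
                # Connection created beforehand with an account key,
                # SAS token, or service principal:
                "connectionId": "<connection-guid>",
            }
        },
    },
)
resp.raise_for_status()
print(resp.json())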

8
Q

How many OneLakes per organization?

A

Only one. OneLake is a logical entity that represents all the data elements within the organization, without duplicating the data.

9
Q

How do you get department-level segregation in OneLake?

A

OneLake is per tenant. Within OneLake, workspaces can be created, e.g. one per department.

10
Q

What are the data items in OneLake?

A

Lakehouses, warehouses, and files are the data items of OneLake.
Each data item belongs to a different experience that Fabric offers,
e.g. a lakehouse gives the developer the Spark experience.

11
Q

What is the data format of a SQL data warehouse in OneLake?
Is it a proprietary SQL database?

A

No. All data items, including lakehouses and SQL warehouses, are stored in Delta format. OneLake is built on top of ADLS Gen2.
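
Because everything is Delta under the hood, a Fabric Spark notebook can read a lakehouse table directly. A minimal sketch; the table name is hypothetical, and in a Fabric notebook with a default lakehouse attached, `spark` is already provided.

# In a Fabric notebook `spark` is predefined; elsewhere build a session:
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Tables in the attached lakehouse live as Delta folders under Tables/.
df = spark.read.format("delta").load("Tables/sales")  # hypothetical table
df.show(5)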

12
Q

What is Analysis Services in Fabric?

A

Analysis Services is the engine that powers Power BI semantic models.

13
Q

There is a new mode called Direct Lake in Analysis Services. What are the existing modes?

A

Fabric adds the new Direct Lake mode, in which Power BI reads Delta tables in OneLake directly. The existing modes are Import and DirectQuery; Direct Lake combines the query performance of Import with the data freshness of DirectQuery.

14
Q

What is Fast Copy?

A

A new feature in Dataflow Gen2 that uses the pipeline Copy activity's backend for ingestion, cutting processing time and cost.

15
Q

At what resource level do you call the OneLake data access APIs?

A

At the lakehouse level.
list/get/put calls take paths of the form workspace_id/lakehouse_item_id/
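
Since OneLake is compatible with the ADLS Gen2 DFS API, a lakehouse can be addressed with the standard Azure Storage SDK. A minimal sketch; the workspace and lakehouse names are hypothetical.

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# OneLake speaks the ADLS Gen2 DFS API: the workspace plays the role of
# the container (file system) and the lakehouse item is the top folder.
service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)

fs = service.get_file_system_client("MyWorkspace")  # hypothetical workspace

# List everything under the lakehouse's Files section
# (path pattern: <lakehouse_item>.Lakehouse/Files/...).
for p in fs.get_paths(path="MyLakehouse.Lakehouse/Files"):
    print(p.name)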

16
Q

Azure Private Endpoint

A

A network interface in your VNet that connects to Azure services (SQL, Cosmos DB) over the Microsoft backbone network.

17
Q

Do Private Endpoint and Private Link protect all traffic?

A

No. They protect only inbound access to Fabric services, such as uploading a file to a lakehouse. For outbound data access, such as API access, you should set up firewall rules.

18
Q

VNet Data Gateway

A

Lets Fabric connect to data sources secured inside an Azure virtual network, without requiring an on-premises data gateway.

19
Q

What is Trusted Workspace Access to a storage account?

A

A firewall-protected storage account can grant secure access to specific workspaces (Fabric, Synapse, Databricks) through resource instance rules: a form of security through restricted network access.

20
Q

What are Data Workflows?

A

Managed Apache Airflow for orchestrating pipelines.
Data Workflows are offered under the umbrella of Data Factory; what used to be Azure Data Factory is now Dataflow Gen2 and Data Pipelines.

21
Q

Fabric provider for Airflow

A

Fabric packages the operators, hooks, and sensors relevant to Fabric as an Airflow provider, making them available for coding Data Workflow jobs.
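
As a sketch of what such a job could look like, here is a minimal Airflow DAG that triggers a Fabric pipeline. The import path, the operator name FabricRunItemOperator, and its parameters are assumptions; check the actual provider package for the real names.

from datetime import datetime

from airflow import DAG
# Hypothetical import path; the real provider package may differ.
from apache_airflow_microsoft_fabric_plugin.operators.fabric import (
    FabricRunItemOperator,
)

with DAG(
    dag_id="run_fabric_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Trigger a Fabric data pipeline run; all IDs are placeholders.
    run_pipeline = FabricRunItemOperator(
        task_id="run_pipeline",
        fabric_conn_id="fabric_default",  # Airflow connection to Fabric (assumed)
        workspace_id="<workspace-guid>",
        item_id="<pipeline-item-guid>",
        job_type="Pipeline",
    )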
