Mock Exam PL300 Flashcards
Your company’s product data is held in an Azure SQL Server database. Your company has an endorsed dataset available in the Power BI service that contains the company’s product catalog and sales data in a common model that standardizes how product data is reported. You use Power BI Desktop to create reports.
You need to create a new report on product data that is consistent with the standardized data.
What should you do?
A. Create a new local dataset connecting to the Azure SQL database.
B. Create a new .pbix file and use Power BI dataflows to connect to the endorsed dataset.
C. Edit a .pbix file with an existing connection to the Azure SQL database and connect to the endorsed dataset.
D. Create a new .pbix file and connect to the endorsed dataset.
D. Create a new .pbix file and connect to the endorsed dataset.
Power BI Desktop can connect to datasets that have been published or shared in the Power BI service. This is called a Power BI service live connection. You can then build your own reports based on that dataset and publish those reports to the Power BI service.
You should not use Power BI dataflows to connect to the dataset. Datasets are created from dataflows: dataflows are the data transformation layer in Power BI, while datasets are where modeling, calculations, and measures are created.
You are importing data into a dataset from two data tables using Power Query and DirectQuery. TableA contains the Vendor details and TableB contains the products that are sold by each vendor. You need to combine the data from the two tables into a single table to make the dataset easier to navigate.
You must ensure that all of the vendors are included in the dataset.
What should you do?
A. Enable Assume Referential Integrity in Power BI Desktop.
B. Create a surrogate key.
C. Merge TableA with TableB using a left-outer join.
D. Append the queries for TableA and TableB.
C. Merge TableA with TableB using a left-outer join.
A left-outer join keeps every row from TableA (the vendors) and adds the matching product rows from TableB, so all of the vendors are included even if they have no products.
You are a data analyst using Visual Studio Data Tools (VSDT). You create a new Power BI organization and a premium workspace named workspace1 for a new customer.
You plan to connect and deploy a tabular model project in workspace1. What should you do first?
A. From Visual Studio Data Tools (VSDT), deploy the tabular model.
B. From the Power BI Admin portal, configure the XMLA endpoint as None.
C. From the Power BI Admin portal, configure the XMLA endpoint as Read Write.
D. From Visual Studio Data Tools (VSDT), update the tabular model compatibility level to 1500.
C. From the Power BI Admin portal, configure the XMLA endpoint as Read Write.
By default, a premium workspace has the XMLA endpoint enabled as Read only. Before trying to deploy the tabular model from Visual Studio Data Tools (VSDT), you need to change the XMLA endpoint to Read Write.
You are using Power Query Editor to import the data of a single business entity from different data sources with different schemas into Power BI. All of the data sources share a common ID that uniquely identifies each record of the entity.
You need to unify the data so that the entity can be used to filter your report with slicers. Which feature should you use?
A. Combine files
B. Append Queries
C. Merge columns
D. Merge Queries
E. Group by
D. Merge Queries
You should use Merge queries to unify the data of a single business entity. Since the data comes from different data sources, it is good practice to unify it into a single dimension table that represents the business entity, following the widely adopted star schema modeling approach. This dimension table filters the fact table through a relationship, so it is a good fit for a slicer. Because each data source has a different schema, merging on matching values from one or more columns is the appropriate choice. If the matching column values are not exact matches, you can still use fuzzy matching (text columns only).
You are a business intelligence developer. You need to use data contained in Azure Analysis Services to build your report. You need to connect to this data source.
What two solutions can you use to achieve this goal? Each correct answer presents a complete solution.
A. Import
B. DirectQuery
C. Dual
D. Connect Live
A. Import
D. Connect Live
You can use Connect live, which is used when connecting to multidimensional data sources such as Analysis Services. Connect live is one of the two options available when connecting to Azure Analysis Services in Power BI Desktop.
You can also use Import, although Connect live is generally preferred because Import mode has some limitations with Analysis Services sources.
Your data model contains the following tables:
- Calendar table: 25K rows that are updated rarely
- Product table: 2K rows that are updated monthly
- Sales table: 10M rows that are updated regularly, and changes need to be reflected immediately
You need to identify the best storage mode for the data model tables. Which storage mode should you use for the Product table and the Calendar table?
A. Product table - Import, Calendar table - DirectQuery
B. Product table - DirectQuery, Calendar table - Import
C. Product table - Import, Calendar table - Import
D. Product table - DirectQuery, Calendar table - DirectQuery
C. Product table - Import, Calendar table - Import
You should use Import as the storage mode for the Product and Calendar tables. The Import method is preferred when you have fixed or static rows with a low volume of data, meaning a few thousand rows that are updated infrequently. In the Import method, data is cached into Power BI.
You should use DirectQuery as the storage mode for the Sales table. The DirectQuery method is preferred when you have a huge volume of data consisting of millions of rows. It is also useful when changes to the table need to be reflected immediately in the data model or report. In the DirectQuery method, queries are directly sent to the underlying data source.
You create a report about product sales and customer churn rates. You need to restrict the visual to show the top five products with the highest churn rates.
Solution: You create a measure of the top five products using the TOPN DAX function. Does this solution meet the goal?
A. True
B. False
A. True
This solution meets the goal. The measure will return the top five records. However, the TOPN DAX function does not guarantee the order of the results.
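As a rough sketch of this pattern (the 'Product' table, its Product Name column, and the [Churn Rate] measure are assumed names, not given in the question), a measure like the following returns the churn rate only for the five products with the highest churn, so a visual that uses it shows just those products:
Top 5 Product Churn =
CALCULATE (
    [Churn Rate],  // assumed existing churn rate measure
    KEEPFILTERS (
        TOPN ( 5, ALLSELECTED ( 'Product'[Product Name] ), [Churn Rate] )  // five products with the highest churn
    )
)
KEEPFILTERS intersects the top-5 set with the product on the current visual row, so any other product returns BLANK and drops out of the visual.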
CompanyA is a consulting firm specialized in human resources (HR) processes for large retail companies in Europe. CompanyA manages an HR solution that uses multiple Azure SQL Server databases. Each retail company has its own database provisioned in an Azure subscription managed by CompanyA.
The data engineers are interested in building a report to measure employee turnover using Power BI Desktop. Each retail company may have different requirements for building the report. Each data engineer should work to build a report for a specific company, using a different database from their peers.
You need to support the data engineers to connect to the correct database.
Solution: Create a read-write XMLA endpoint for each customer. Does this solution meet the goal?
A. True
B. False
B. False
This solution does not meet the goal. You can use the XML for Analysis (XMLA) endpoint to let client applications access datasets published in Power BI service workspaces. This is useful for consuming Power BI datasets from Excel or for performing administrative tasks in SQL Server Management Studio (SSMS). XMLA is the same protocol used by Microsoft Analysis Services, the engine that Power BI uses under the hood. It does not help the data engineers connect Power BI Desktop to the correct Azure SQL database.
You are analyzing data quality for a dataset in Power BI. You use Power Query to preview the data. You need to examine the quality of the data.
Identify if the below statement is factually correct:
Using Column profile in Data Preview, you can gain an insight into only the first 1,000 rows in the dataset.
A. True
B. False
B. False
By default, you can use Power Query to examine the first 1,000 rows of your dataset, but you can change this configuration to analyze the whole dataset instead. This applies to the Column profile, Column distribution, and Column quality options in Data Preview.
You create a report about product sales and customer churn rates. You need to restrict the visual to show the top five products with the highest churn rates.
Solution: You create a measure of the top five products using the RANKX DAX function. Does this solution meet the goal?
A. True
B. False
A. True
This solution meets the goal. The measure will return the top five records. The RANKX DAX function returns the order (rank) for each product, and this rank can be used to display products in order from highest to lowest.
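A minimal sketch of the RANKX approach (again assuming a 'Product' table with a Product Name column and an existing [Churn Rate] measure):
Churn Rank =
RANKX ( ALL ( 'Product'[Product Name] ), [Churn Rate], , DESC )  // rank products by churn rate, highest first
You can then add this measure to the visual-level filters with the condition "is less than or equal to 5", which keeps only the five products with the highest churn rate.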
You have a large dataset containing four million rows. The dataset primarily stores manufacturing and shipping-related information of medicines manufactured at a facility on a given day. The table has a column of data type Date/Time/Timezone.
You need to create day-by-day views while optimizing performance and reducing the size of the data model. What should you do?
A. Convert the column to Date.
B. Convert the column to Date/Time.
C. Convert the column to Text.
D. Convert the column to Duration.
A. Convert the column to Date.
You should convert the column to the Date data type. Choosing the optimal data type for a column depends on the specific use case. In this scenario, although the dataset is large, all of the data is captured for the manufacturing facility on a given day, so the date is the required granularity. Storing Date/Time/Timezone is therefore excessive and loads a large amount of data that is not required in visualizations or reporting.
Your manager encounters slowness with a Power BI report. The network and server are operating at optimal speed. You need to identify issues affecting report performance.
Which two tools can you use to meet your goal? Each correct answer presents a complete solution.
A. SQL Server Profiler
B. Query Diagnostics
C. Performance Analyzer
D. Performance Monitor
B. Query Diagnostics
C. Performance Analyzer
You can use Performance Analyzer to identify Power BI report performance issues. Performance analyzer shows you how much time it takes to render a visual. It also measures the amount of time it takes to retrieve data.
You can also use Query Diagnostics to identify Power BI report performance issues. Query Diagnostics allows you to diagnose query performance and transformations included in it. A query could have multiple steps. A step could be a transformation or a simple query. A transformation involves adding or removing a column, replacing a value, or calculating age from dates.
Your company's requirement is to define a fiscal year that starts in May. You need to create a common date table for a data model using DAX. How should you complete the DAX expression?
A. DATE = CALENDARAUTO(3)
B. DATE = CALENDAR(3)
C. DATE = CALENDARAUTO(4)
D. DATE = CALENDAR(4)
C. DATE = CALENDARAUTO(4)
You should use the CALENDARAUTO() function to automatically create the common date table for the data model. This DAX function creates the date table based on the minimum and maximum dates available in the data model. You can pass 4 as the argument so that the fiscal year ends in April and the new fiscal year starts in May, meeting the requirement.
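A minimal sketch of the resulting table expression (the "Fiscal Year" column is an added illustration, assuming the fiscal year is labeled by the calendar year in which it ends):
Date =
ADDCOLUMNS (
    CALENDARAUTO ( 4 ),  // fiscal year ends in April, so the new fiscal year starts in May
    "Fiscal Year", YEAR ( [Date] ) + IF ( MONTH ( [Date] ) >= 5, 1, 0 )
)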
You have a table of Purchase Orders (POs), with the columns PO Number, PO Date, Vendor, and PO Amount.
[PO Number is formatted as "PO" followed by the number, PO Date is in YYYY-MM-DD format, Vendor is text, and PO Amount is a decimal.]
There are relationships with other tables on the PO Number, PO Date, and Vendor columns. You need to optimize model performance. Which two actions should you perform?
A. Remove the PO prefix from the PO Number column and change the data type to Whole Number.
B. Add a new column to the Purchase Order table and perform a lookup to set the Vendor Id from Vendor column.
C. Change the format of the PO Amount column to Currency.
D. Mark the Purchase Orders table as the date table.
A. Remove the PO prefix from the PO Number column and change the data type to Whole Number.
B. Add a new column to the Purchase Order table and perform a lookup to set the Vendor Id from Vendor column.
You should remove the PO prefix from the PO Number column and change the data type to Whole Number. The model achieves the best compression with numeric columns, which can use value encoding. Text columns use hash encoding, which adversely affects performance.
You should also add a new column to the Purchase Order table and perform a lookup to set the Vendor Id from the Vendor column. The column should use the following function:
Vendor Id = LOOKUPVALUE ( Vendor[Vendor Id], Vendor[Vendor], 'Purchase Order'[Vendor] )
You should then recreate the relationship using the Vendor Id rather than the Vendor name. Using numeric values in relationships instead of text improves model performance.
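For the PO Number change, a hedged illustration as a DAX calculated column (in practice you would more likely remove the prefix and change the type in Power Query before load; the column names come from the question):
PO Number (Numeric) =
VALUE ( SUBSTITUTE ( 'Purchase Order'[PO Number], "PO", "" ) )  // strips the "PO" prefix and converts the remainder to a number
After creating the numeric column (or transforming the original in Power Query), set its data type to Whole Number and use it for the relationship instead of the text column.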
You import a table into Power BI.
You need to configure the columns in the table. Which Summarize option should you recommend for aggregating the Amount column for use in visualizations and calculating totals?
A. SUM
B. AVERAGE
C. Count
D. None
A. SUM
You should use the Sum aggregation for the Amount field. Amount is a decimal number, and analysis of total sales will be required for visualizations. Adding aggregations will reduce the size of the model and improve performance.
You import a table into Power BI.
You need to configure the columns in the table. Which Summarize option should you recommend for aggregating the Order Number column for use in visualizations and calculating totals?
A. SUM
B. AVERAGE
C. Count
D. None
C. Count
You should use the Count aggregation for the Order Number field. Order Number is an integer and is the unique reference for the Sales table. The type of visualizations that will be required for Order Number relates to the number of orders placed, so a count is the best aggregation option.
You are creating a report of financial transactions showing monthly trends. The source transaction table has in excess of 10 million rows.
You need to optimize the report’s performance.
Which two actions should you perform? Each correct answer presents part of the solution.
A. Set the Storage mode on the transactions table to DirectQuery.
B. Create all visualizations on a single report page.
C. Set the Storage mode on the transactions table to Import.
D. Create a summary table in the data source with the transaction data grouped by month.
A. Set the Storage mode on the transactions table to DirectQuery.
D. Create a summary table in the data source with the transaction data grouped by month.
Choice D is the most effective technique to reduce the size of the model. The transaction table will contain records for each transaction, but a summary table will contain one record per month, aggregating all of the transactions over the month. This will significantly reduce the number of records in the dataset.
By using DirectQuery, the data will not be held in the model, and queries will be sent to the database when users access visualizations. If you use DirectQuery, the performance will rely on the capacity of the data sources.
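The recommended summary table lives in the data source, but purely to illustrate the grouping it represents, here is a hedged DAX sketch as a calculated table (assuming an imported Transactions table with an Amount column and a related Calendar table; the table and column names are illustrative only):
Monthly Sales =
SUMMARIZECOLUMNS (
    'Calendar'[Year],
    'Calendar'[Month],
    "Total Amount", SUM ( Transactions[Amount] )  // one row per month instead of one row per transaction
)
Reporting monthly trends from a table like this scans one row per month instead of more than 10 million transaction rows.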
You have a table named Sales with the columns: year, sales, prior year sales.
You need to create a measure that calculates sales from the previous year based on a dynamic period (_PY Sales).
Which two DAX expressions can you use to achieve the goal? Each correct answer presents a complete solution.
A. _PY Sales = CALCULATE(SUM(Sales[Sales]), SAMEPERIODLASTYEAR('Calendar'[Date]))
B. _PY Sales = CALCULATE(SUM(Sales[Sales]), DATEADD('Calendar'[Date].[Date], -1, MONTH))
C. _PY Sales = CALCULATE(SUM(Sales[Sales]), DATEADD('Calendar'[Date].[Date], -1, YEAR))
D. _PY Sales = CALCULATE(SUM(Sales[Sales]), PARALLELPERIOD('Calendar'[Date], -1, YEAR))
A. _PY Sales = CALCULATE(SUM(Sales[Sales]), SAMEPERIODLASTYEAR('Calendar'[Date]))
C. _PY Sales = CALCULATE(SUM(Sales[Sales]), DATEADD('Calendar'[Date].[Date], -1, YEAR))
SAMEPERIODLASTYEAR and DATEADD with an interval of -1 YEAR both shift the dates in the current filter context back one year, so the measure follows whatever period is selected in the report. DATEADD with -1 MONTH returns the previous month rather than the previous year, and PARALLELPERIOD returns the entire previous year instead of the equivalent period.
Learn more - https://radacad.com/dateadd-vs-parallelperiod-vs-sameperiodlastyear-dax-time-intelligence-question
https://learn.microsoft.com/en-us/dax/calculate-function-dax
Which of the following statements about creating a portrait version of your reports/dashboards for mobile devices is TRUE?
A. You can create a portrait version of your reports for mobile devices only on Power BI Desktop
B. You can create a portrait version of only your dashboards for mobile devices
C. You can create a portrait version of your reports for mobile devices on both Power BI Desktop and service
D. You can create a portrait version of your reports for mobile devices only on the Power BI service
C. You can create a portrait version of your reports for mobile devices on both Power BI Desktop and service.
When you publish a report to the Power BI service and view the report on a mobile device, you see the report in landscape orientation. However, you can create an additional view for mobile devices (portrait mode).
You can either create a mobile layout in Power BI Desktop and publish the report to the service, or create the mobile layout directly in the service. Either way, if a mobile layout exists for a report, the mobile-optimized (portrait) version is shown to users on a mobile device.
Your company has employees in 10 states. The company recently decided to associate each state to one of the following three regions: East, West, and North.
You have a data model that contains employee information by state. The model does NOT include region information. You have a report that shows the employees by state. You need to view the employees by region as quickly as possible. What should you do?
A. Create a new group on the state column and set the Group type to List.
B. Create a new aggregation that summarizes by employee.
C. Create a new aggregation that summarizes by state.
D. Create a new group on the state column and set the Group type to Bin.
A. Create a new group on the state column and set the Group type to List.
I am using the “on-object interaction” preview.
My bar chart visual, which is on the “Summary” page of my report, shows the Year of the “Hire Date” field in the X-axis and Employee Count in the Y-axis.
I want to add a drill-through that goes from this “Summary” page to a “Detail page”, which shows an analysis by quarter for a particular year. What do I need to do?
A. In the Format pane for the “Detail” page, go to “Canvas settings”.
B. In the Format pane for the “Summary” page, go to “Canvas settings”.
C. In the Format pane for the “Summary” page, go to “Page information”.
D. In the Format pane for the “Detail” page, go to “Page information”.
D. In the Format pane for the “Detail” page, go to “Page information”.
You then change the “Page type” to “Drillthrough” and add “Hire Date” to “Drill through from”.
I have an Employee table in Power BI. I want to analyze Salary data by drilling down into various fields. I also want to be able to change the order of exploration when I want to.
Which AI visual is designed for this purpose?
A. Decomposition tree
B. Q&A
C. Key influencers
D. Smart narrative
A. Decomposition tree
You put the measure you want to break down in the “Analyze” field, and the fields you want to break it down by in “Explain by”.
I have a field called “Comments”.
I want to create a filter on this field to retrieve the words “Start”, “State”, “Streets” and any other words which start with “St” and include another “t”.
As shown below,
* I create a Filter on the field “Comments”,
* I change the filter type to “Advanced filtering”, and
* I set “Show items when the value” to “contains”.
What do I put into the “Show items when the value contains” to capture all of these values? (Which wildcard do I use to represent zero-to-many characters?)
A. st_t_
B. st%t%
C. st?t?
D. st*t*
D. st*t*
The * represents zero, one or many characters.
You have a report page named Sales which contains visual A, showing total sales for each city for the selected year, as shown in the exhibit.
You need to configure an interaction for this visual: when a user clicks Drill through, it should redirect to a Detail page that shows sales for the selected city by product line and preserves all the filters from the Sales page.
Which four actions should you perform? Each correct answer presents part of the solution.
A. Create a table visual named Visual B to show total sales by city and product line.
B. On the Sales page, under the Drillthrough option, add the City column as the drill through field.
C. On the Detail page, under the Drillthrough option, add City column as the drill through field.
D. Create a new page named Detail.
E. On the Sales page, under the Drillthrough option, set Keep all filters to On.
F. On the Detail page, under the Drillthrough option, set Keep all filters to On.
D. Create a new page named Detail.
A. Create a table visual named Visual B to show total sales by city and product line.
C. On the Detail page, under the Drillthrough option, add City column as the drill through field.
F. On the Detail page, under the Drillthrough option, set Keep all filters to On.
To create a drillthrough interaction on Visual A, you first need to create a Detail page.
Then, place a table visual on this page to show total sales by city and product line.
Next, on the Detail page, under the Drillthrough option, add the City column as the drill through field.
Finally, set Keep all filters to On on the Detail page, so that it can preserve all the filters from the Sales page.