
INTERNAL ONLY
(See below for the DA-100 requirements checklist.)
Approach:
(For the future, also track the requirements for the DA-100 exam.)
- A 3-day Power BI course that will not make use of the Knight/Pearson manual.
- The course should reference the Microsoft web pages and follow the structure of their presentation. This will enable us to keep up to date with the latest developments automatically.
- Use the Northwind data (Mark had no objections in his email re Zando)
- I want to use this data because the schema is easy to understand. I want to focus the student's attention on Power BI, not on the intricacies of the data model.
- Using Northwind also ties in very well with SQL Beginners. In the SQL course we recommend that they take Power BI, and vice versa. Using the same data will provide a narrative across the courses.
BESPOKE SECTION - Zando
Clone the completed course and add Zando material. In parallel with using Northwind, make use of the Zando data:
- There is a lot to show them in terms of normalization
- We can check which visualizations to use for the Zando data
- Have a meeting with MichaelC to get access to the Zando data (BernardN – 29-Jan-21)
- Have a short workshop with MichaelC. Show all the visualizations and get his requirements on where to use specific visualizations on the Zando data (BernardN: 5-Feb-21)
DA-100
https://query.prod.cms.rt.microsoft.com/cms/api/am/binary/RE4qlRu
Skills Measured
NOTE: The bullets that appear below each of the skills measured are intended to illustrate how
we are assessing that skill. This list is not definitive or exhaustive.
NOTE: Most questions cover features that are General Availability (GA). The exam may contain
questions on Preview features if those features are commonly used.
Prepare the Data (20-25%)
Get data from different data sources
identify and connect to a data source
change data source settings
select a shared dataset or create a local dataset
select a storage mode
choose an appropriate query type
identify query performance issues
use Microsoft Dataverse (formerly the Common Data Service (CDS))
use parameters
use or create a PBIDS file
use or create a data flow
Profile the data
identify data anomalies
examine data structures
interrogate column properties
interrogate data statistics
Clean, transform, and load the data
resolve inconsistencies, unexpected or null values, and data quality issues
apply user-friendly value replacements
identify and create appropriate keys for joins
evaluate and transform column data types
apply data shape transformations to table structures
combine queries
apply user-friendly naming conventions to columns and queries
leverage Advanced Editor to modify Power Query M code
configure data loading
resolve data import errors
Model the Data (25-30%)
Design a data model
define the tables
configure table and column properties
define quick measures
flatten out a parent-child hierarchy
define role-playing dimensions
define a relationship's cardinality and cross-filter direction
design the data model to meet performance requirements
resolve many-to-many relationships
create a common date table (see the sketch after this list)
define the appropriate level of data granularity
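For "create a common date table", a minimal DAX calculated-table sketch is below. It assumes a Northwind-style Orders table with an OrderDate column; the table and column names are illustrative only, and the result still has to be marked as a date table and related to the fact table.

    Date =
    ADDCOLUMNS (
        // one row per day, spanning the range of order dates
        CALENDAR ( MIN ( Orders[OrderDate] ), MAX ( Orders[OrderDate] ) ),
        "Year", YEAR ( [Date] ),
        "Month Number", MONTH ( [Date] ),
        "Month Name", FORMAT ( [Date], "MMMM" )
    )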
Develop a data model
apply cross-filter direction and security filtering
create calculated tables
create hierarchies
create calculated columns
implement row-level security roles (see the sketch after this list)
set up the Q&A feature
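For "implement row-level security roles", a role is defined in Power BI Desktop (Modeling > Manage roles) as a DAX filter expression per table. A minimal sketch, assuming a Customers table with Country and SalesRepEmail columns (assumed names):

    // Static role, e.g. "UK Sales": filter expression on the Customers table
    [Country] = "UK"

    // Dynamic role: only rows belonging to the signed-in user
    [SalesRepEmail] = USERPRINCIPALNAME()

Whether the role filter also flows across a relationship is controlled per relationship by the "Apply security filter in both directions" setting, which ties in with the "apply cross-filter direction and security filtering" item.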
Create measures by using DAX
use DAX to build complex measures (see the sketches after this list)
use CALCULATE to manipulate filters
implement Time Intelligence using DAX
replace numeric columns with measures
use basic statistical functions to enhance data
create semi-additive measures
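A few hedged DAX sketches for this group, assuming a Sales fact table with an Amount column, a Balances table with a Balance column, and the Date table above marked as a date table (all assumed names):

    // base measure
    Total Sales = SUM ( Sales[Amount] )

    // CALCULATE + time intelligence: same period last year
    Sales LY = CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )

    // year-to-date
    Sales YTD = TOTALYTD ( [Total Sales], 'Date'[Date] )

    // semi-additive: balance at the last date visible in the filter context
    Closing Balance = CALCULATE ( SUM ( Balances[Balance] ), LASTDATE ( 'Date'[Date] ) )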
Optimize model performance
remove unnecessary rows and columns
identify poorly performing measures, relationships, and visuals
improve cardinality levels by changing data types
improve cardinality levels through summarization (see the sketch after this list)
create and manage aggregations
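To illustrate "improve cardinality levels through summarization", the sketch below builds a month-grain summary as a DAX calculated table, reusing the assumed Sales and Date tables above. Note that aggregation tables used with the "manage aggregations" feature are normally created upstream (Power Query or the source) rather than as calculated tables; this only shows the intended shape.

    Sales by Month =
    ADDCOLUMNS (
        // group the fact table at year/month grain
        SUMMARIZE ( Sales, 'Date'[Year], 'Date'[Month Number] ),
        "Total Amount", CALCULATE ( SUM ( Sales[Amount] ) )
    )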
Visualize the Data (20-25%)
Create reports
add visualization items to reports
choose an appropriate visualization type
format and configure visualizations
import a custom visual
configure conditional formatting
apply slicing and filtering
add an R or Python visual
configure the report page
design and configure for accessibility
configure automatic page refresh
Create dashboards
set mobile view
manage tiles on a dashboard
configure data alerts
use the Q&A feature
add a dashboard theme
pin a live report page to a dashboard
configure data classification
Enrich reports for usability
configure bookmarks
create custom tooltips
edit and configure interactions between visuals
configure navigation for a report
apply sorting
configure Sync Slicers
use the selection pane
use drillthrough and cross filter
drilldown into data using interactive visuals
export report data
design reports for mobile devices
Analyze the Data (10-15%)
Enhance reports to expose insights
apply conditional formatting
apply slicers and filters
perform top N analysis (see the sketch after this list)
explore statistical summary
use the Q&A visual
add a Quick Insights result to a report
create reference lines by using Analytics pane
use the Play Axis feature of a visualization
personalize visuals
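For "perform top N analysis", beyond the built-in Top N visual-level filter, a DAX sketch assuming the Total Sales measure above and a Northwind-style Products table:

    // total sales restricted to the five best-selling products
    Sales of Top 5 Products =
    CALCULATE (
        [Total Sales],
        TOPN ( 5, ALL ( Products[ProductName] ), [Total Sales] )
    )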
Perform advanced analysis
identify outliers
conduct Time Series analysis
use groupings and binnings
use the Key Influencers to explore dimensional variances
use the decomposition tree visual to break down a measure
apply AI Insights
Deploy and Maintain Deliverables (10-15%)
Manage datasets
configure a dataset scheduled refresh
configure row-level security group membership
provide access to datasets
configure incremental refresh settings
promote or certify Power BI datasets
identify downstream dataset dependencies
Create and manage workspaces
create and configure a workspace
recommend a development lifecycle strategy
assign workspace roles
configure and update a workspace app
publish, import, or update assets in a workspace
apply sensitivity labels to workspace content
use deployment pipelines