
Dataflows for Power BI with Dynamics GP Data: Part 2


In Part 1 of Dataflows for Power BI with Dynamics GP data, we talked about what dataflows are and started on our first dataflow entity, a Date dimension. Now let's create the remaining entities:

  • Dimensions from Dynamics GP card windows, via SQL queries (a sketch of one such view follows this list):
    • Customer data from the Sales module's Customer card
    • Salesperson data from the Sales module's Salesperson card
    • Sales Territory data from the Sales module's Sales Territory card
    • Product data from the Inventory module's Item card
  • Fact data from Dynamics GP transaction entry windows, via SQL queries:
    • Sales data from the Sales Transaction Entry window
    • Inventory data from the Inventory Transaction Entry window
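
For example, the Customer entity later in this post reads from a dbo.view_Customers view in the company database. Here is a minimal sketch of such a view over the Dynamics GP customer master table (RM00101); it is trimmed to a few columns (the full view referenced later also exposes key and audit columns), and RTRIM strips GP's fixed-width padding:

CREATE VIEW dbo.view_Customers AS
SELECT
    RTRIM(CUSTNMBR) AS CustomerNumber,  -- customer ID from the Customer card
    RTRIM(CUSTNAME) AS CustomerName,
    RTRIM(ADDRESS1) AS Address1,
    RTRIM(ADDRESS2) AS Address2,
    RTRIM(ADDRESS3) AS Address3,
    RTRIM(CITY)     AS City,
    RTRIM(STATE)    AS State,
    RTRIM(ZIP)      AS Zip,
    RTRIM(PHONE1)   AS Phone
FROM dbo.RM00101;  -- RM Customer Master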

Creating Custom Functions in Dataflows

The easiest way to create a custom function in the Power BI service's dataflows is to build it in Power BI Desktop's Power Query, open the Advanced Editor, and copy and paste the M code into a blank query.

let
    Source = (input as text) as text =>
        let
            // Lookup list pairing full state/province names with their two-character
            // abbreviations. It includes Canadian provinces and a common misspelling
            // ("WAHINGTON") so that dirty Dynamics GP data still resolves correctly.
            values = {
                {"ALABAMA", "AL"},
                {"ALASKA", "AK"},
                {"ARIZONA", "AZ"},
                {"ARKANSAS", "AR"},
                {"CALIFORNIA", "CA"},
                {"COLORADO", "CO"},
                {"CONNECTICUT", "CT"},
                {"DELAWARE", "DE"},
                {"FLORIDA", "FL"},
                {"GEORGIA", "GA"},
                {"HAWAII", "HI"},
                {"IDAHO", "ID"},
                {"ILLINOIS", "IL"},
                {"INDIANA", "IN"},
                {"IOWA", "IA"},
                {"KANSAS", "KS"},
                {"KENTUCKY", "KY"},
                {"LOUISIANA", "LA"},
                {"MAINE", "ME"},
                {"MARYLAND", "MD"},
                {"MASSACHUSETTS", "MA"},
                {"MICHIGAN", "MI"},
                {"MINNESOTA", "MN"},
                {"MISSISSIPPI", "MS"},
                {"MISSOURI", "MO"},
                {"MONTANA", "MT"},
                {"NEBRASKA", "NE"},
                {"NEVADA", "NV"},
                {"NEW HAMPSHIRE", "NH"},
                {"NEW JERSEY", "NJ"},
                {"NEW MEXICO", "NM"},
                {"NEW YORK", "NY"},
                {"NORTH CAROLINA", "NC"},
                {"NORTH DAKOTA", "ND"},
                {"OHIO", "OH"},
                {"OKLAHOMA", "OK"},
                {"OREGON", "OR"},
                {"PENNSYLVANIA", "PA"},
                {"RHODE ISLAND", "RI"},
                {"SOUTH CAROLINA", "SC"},
                {"SOUTH DAKOTA", "SD"},
                {"TENNESSEE", "TN"},
                {"TEXAS", "TX"},
                {"UTAH", "UT"},
                {"VERMONT", "VT"},
                {"VIRGINIA", "VA"},
                {"WASHINGTON", "WA"},
                {"WEST VIRGINIA", "WV"},
                {"WISCONSIN", "WI"},
                {"WYOMING", "WY"},
                {"WASHINGTON, D.C.", "DC"},
                {"WAHINGTON", "WA"},
                {"ONTARIO", "ON"},
                {"ONTARIO CANADA", "ON"},
                {"QUEBEC", "QC"},
                {"QUEBEC CANADA", "QC"},
                {"NEWFOUNDLAND", "NL"},
                {"ALBERTA", "AB"},
                {"ALBERTA CANADA", "AB"},
                {"BRITISH COLUMBIA", "BC"},
                {"BRITISH COLUMBIA CANADA", "BC"},
                {"MANITOBA", "MB"}
            },
            // Return the abbreviation from the first entry whose name matches the input
            Result = List.First(List.Select(values, each _{0} = input)){1}
        in
            Result
in
    Source

 

Rename the custom function to “fnLookup”.
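
A quick way to test the function: invoke it with a full state name, for example fnLookup("MARYLAND"), which should return "MD". Note that the function errors on any value not in the list (the lookup returns an empty selection and indexing it fails), which is why the Customer query below wraps the call in try ... otherwise.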

 

Now we need to modify our Customer entity to clean up the State column. Right-click the Customer entity and select “Advanced Editor”.

 

Copy and paste the Power Query code below into the Advanced Editor window. It uses the fnLookup custom function to replace the values in the State column with the accepted two-character state abbreviations; the try ... otherwise wrapper keeps the original value whenever a name isn't found in the lookup list.

let
    // Connect to the Dynamics GP company database (replace the placeholders)
    Source = Sql.Database("YOUR SQL SERVER", "YOUR DATABASE"),
    #"Navigation 1" = Source{[Schema = "dbo", Item = "view_Customers"]}[Data],
    // Keep the original column as State_old so the cleaned value can take its place
    #"Renamed columns" = Table.RenameColumns(#"Navigation 1", {{"State", "State_old"}}),
    #"Trimmed text" = Table.TransformColumns(#"Renamed columns", {{"State_old", each Text.Trim(_), type text}}),
    // Look up the two-character abbreviation; fall back to the original value when no match is found
    #"Inserted conditional column" = Table.AddColumn(#"Trimmed text", "State", each try fnLookup([State_old]) otherwise [State_old]),
    #"Reordered columns" = Table.ReorderColumns(#"Inserted conditional column", {"Customer_Key", "Company_Key", "CustomerNumber", "CustomerName", "Group Name", "SubGroup Name", "Address1", "Address2", "Address3", "City", "State", "State_old", "Zip", "Phone", "Region", "GeographyKey", "Active", "CreatedBy", "CreatedOn", "UpdatedBy", "UpdatedOn", "ActivatedOn", "DeactivatedOn", "Source", "CheckSum", "CompanyID"}),
    #"Transform columns" = Table.TransformColumnTypes(#"Reordered columns", {{"State", type text}}),
    // Safeguard: null out any errors that slipped through, then drop the old column
    #"Replace errors" = Table.ReplaceErrorValues(#"Transform columns", {{"State", null}}),
    #"Removed columns" = Table.RemoveColumns(#"Replace errors", {"State_old"})
in
    #"Removed columns"

After all your dataflow entities have been created, save your changes and create a schedule to refresh your data.
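
To build reports from these entities in Power BI Desktop, connect with the Power BI dataflows connector (Get Data > Power BI dataflows). A minimal sketch of the navigation code it generates follows; the workspace, dataflow, and entity names are placeholders, and the generated code may navigate by ID rather than by name:

let
    Source = PowerBI.Dataflows(null),
    // Navigate workspace -> dataflow -> entity
    Workspace = Source{[workspaceName = "YOUR WORKSPACE"]}[Data],
    Dataflow = Workspace{[dataflowName = "YOUR DATAFLOW"]}[Data],
    Customer = Dataflow{[entity = "Customer"]}[Data]
in
    Customer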

I'm going to build a Product Performance report from the dataflow entities that I created, adding ABC segmentation, moving averages, and time comparisons using DAX measures. ABC segmentation sorts a list of values into three groups that have different impacts on the final result. It works on the famous Pareto principle, which states that 20% of effort gives 80% of the result. The meaning of the segments: A – the most important (20% of items give 80% of the result); B – average importance (30% of items give 15%); C – the least important (50% of items give 5%).

Are three segments enough? Should we use more? What percentages? To answer these questions, you need to know your data and what the ABC segmentation is being done for. I have seen companies use four classes (ABCD) or even more. For this example, three classes have the advantage of separating the assortment into categories of high, medium, and low importance, which is easy to communicate.

In the Product Performance report below, I used a scatter chart to analyze each product class's profit margin and total profits. The "Sales ABC by ABC Class" stacked bar chart segments the products using ABC segmentation and lets you drill down into each segment to see the underlying detail. Additionally, there are a total cost by warehouse location chart and a weekly moving average cost comparison chart at the bottom of the report. Here are the DAX measures and DAX columns that I used to create the Product Performance report:

DAX Columns:

ProductSales =
CALCULATE ( SUM ( 'Sales'[ExtendedPrice] ) )

CumulatedSales =
CALCULATE (
    SUM ( 'Products'[ProductSales] ),
    FILTER (
        ALL ( 'Products' ),
        'Products'[ProductSales] >= EARLIER ( 'Products'[ProductSales] )
    )
)

CumulatedPercentage = 'Products'[CumulatedSales] / SUM ( 'Products'[ProductSales] )
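
The report's stacked bar chart slices by an ABC Class field that is not defined above. Here is a minimal sketch of such a calculated column on the Products table, assuming the standard 80%/95% boundaries described earlier:

ABC Class =
SWITCH (
    TRUE (),
    'Products'[CumulatedPercentage] <= 0.80, "A",  // top sellers covering the first 80% of sales
    'Products'[CumulatedPercentage] <= 0.95, "B",  // next 15%
    "C"                                            // remaining 5%
)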

 

DAX Measures:

Profit Margin = DIVIDE ( [Total Profits], [Total Sales], 0 )

Total Sales = SUM ( Sales[ExtendedPrice] )

Units Sold = SUM ( 'Sales'[Quantity] )

Total Profits = [Total Sales] - [Total Cost]

Costs Weekly MA =
AVERAGEX (
    DATESINPERIOD (
        Dates[Date],
        LASTDATE ( Dates[Date] ),
        -7,
        DAY
    ),
    [Total Cost]
)

Costs Weekly MA LY =
AVERAGEX (
    DATESINPERIOD (
        Dates[Date],
        LASTDATE ( Dates[Date] ),
        -7,
        DAY
    ),
    [Costs LY]
)
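
The [Costs LY] measure referenced above is not defined in this post. A common definition shifts [Total Cost] back one year; a sketch, assuming a contiguous Dates table marked as a date table:

Costs LY =
CALCULATE ( [Total Cost], SAMEPERIODLASTYEAR ( Dates[Date] ) )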

YTD COGS =
CALCULATE (
    SUM ( Sales[ExtendedCost] ),
    FILTER (
        ALL ( 'Dates' ),
        'Dates'[Year] = MAX ( 'Dates'[Year] )
            && 'Dates'[Date] <= MAX ( 'Dates'[Date] )
    )
)

Qty on Hand = SUM ( Inventory[Quantity on Hand] )

Turn-Earn Index = [Inventory Turnover Ratio] * 'Sales Measures'[Profit Margin]
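
[Inventory Turnover Ratio] is also not defined in this post. One common formulation divides cost of goods sold by average inventory value; a sketch with a hypothetical Inventory Value helper measure, assuming the Inventory entity carries a Unit Cost column:

Inventory Value =
SUMX ( Inventory, Inventory[Quantity on Hand] * Inventory[Unit Cost] )  // Unit Cost is an assumed column

Inventory Turnover Ratio =
DIVIDE ( [Total Cost], AVERAGEX ( VALUES ( 'Dates'[Date] ), [Inventory Value] ) )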

Total Cost = SUM(Sales[ExtendedCost])

Sales ABC =
CALCULATE (
    [Total Sales],
    VALUES ( Products[ItemDescription] ),
    FILTER (
        CALCULATETABLE (
            ADDCOLUMNS (
                ADDCOLUMNS (
                    VALUES ( Products[ItemDescription] ),
                    "OuterValue", [Total Sales]
                ),
                "CumulatedSalesPercentage", DIVIDE (
                    SUMX (
                        FILTER (
                            ADDCOLUMNS (
                                VALUES ( Products[ItemDescription] ),
                                "InnerValue", [Total Sales]
                            ),
                            [InnerValue] >= [OuterValue]
                        ),
                        [InnerValue]
                    ),
                    CALCULATE (
                        [Total Sales],
                        VALUES ( Products[ItemDescription] )
                    )
                )
            ),
            ALL ( Products )
        ),
        [CumulatedSalesPercentage] > [Min Boundary]
            && [CumulatedSalesPercentage] <= [Max Boundary]
    )
)
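
The Sales ABC measure compares each product's cumulated sales percentage against [Min Boundary] and [Max Boundary], which are not defined in this post. A minimal sketch, assuming a disconnected table of class boundaries that the report can slice by:

ABC Segments =
DATATABLE (
    "Class", STRING,
    "Lower Boundary", DOUBLE,
    "Upper Boundary", DOUBLE,
    { { "A", 0.00, 0.80 }, { "B", 0.80, 0.95 }, { "C", 0.95, 1.00 } }
)

Min Boundary = MIN ( 'ABC Segments'[Lower Boundary] )

Max Boundary = MAX ( 'ABC Segments'[Upper Boundary] )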

Ready to Do Even More with Your Data?

Start organizing, understanding, and executing on your data today with dataflows and Power BI, and lay the groundwork for a self-service data lake. KTL Solutions works with business leaders every day, helping them lead their organizations to become data-driven. Need help executing on your data? Contact KTL today.
