Wednesday, 5 June 2024

Documentation for Dashboards and Data Modeling for Bottling Company with sales in Africa

The end client desired to upgrade the sales reporting for its resources (trucks and routes) from the current MS Excel spreadsheets to an online system.


The requirements of the end client were as follows:
1.)    Consolidate the data from multiple spreadsheets for analysis over a period of time.
2.)    Provide visibility into sales and expenses by route and truck.
Step 1: The ETL team mapped the spreadsheets onto a database (initially MS Access) for easy data entry at the client's main location for before-trip and after-trip details. Attached are the details for the same (Appendix A).
Design of the audit report:
As a first step, the ETL team replicated the audit report for the end client in Power BI. The purpose of replicating the audit report was as follows:
1.)    Check whether the numbers are correct as per the data entry done for the end client.
2.)    Consolidate all the audit numbers for the end client in one interactive framework.
The details of the replication of the audit report are in Appendix B.

Development of analytics and dashboards: the ETL team identified four (4) key KPIs for analytic purposes.
·         Sales: date, truck and route
·         Returned sales (returned crates, returned full bottles, returned empty bottles): date, truck and route
·         Samples: date, truck and route
·         Expenses: date, truck, route and expense type.
The behavior of expense is as follows:
·         Aggregate by date dimension and expense type
·         Semi-aggregate by route and truck master
·         Non-aggregate by item dimension

The behavior of sales is as follows:
·         Aggregate by date dimension
·         Aggregate by truck and route master
·         Non-aggregate by expense type
·         Aggregate by item entities.

In the case of expense analysis, the analytics to be developed were to be grouped by expense type (which does not fall in a direct hierarchy with the route and truck hierarchies). Hence the calculated measures needed to be fixed at each level of analysis by expense type (for route, truck and overall analysis).
Accordingly, the ETL team developed the following KPIs for the expense analytics:
Expense (fixed overall):
All expense =
CALCULATE(
    SUM(after_trip_allocation[expense]),
    ALL(after_trip_allocation),
    after_trip_allocation[type_of_expense_id] > 0
)
NOTE: The type_of_expense_id was stipulated to be greater than 0, since without an expense id the expense aggregate would not make sense.
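A fixed-overall measure of this kind typically serves as the denominator of a share-of-total calculation. Below is a minimal sketch of such a usage; PERCENT_OVERALL is a hypothetical measure added here for illustration only and is not part of the delivered model:

PERCENT_OVERALL =
DIVIDE(
    SUM(after_trip_allocation[expense]),    -- expense in the current filter context
    [All expense]                           -- the fixed overall expense defined above
) * 100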

Expense across route id and expense nature: the given measure fixes the total expense by route and expense nature.
EXPENSE_ACROSS_ROUTE_EXPENSETYPE =
CALCULATE(
    SUM(after_trip_allocation[expense]),
    ALLEXCEPT(after_trip_allocation, after_trip_allocation[route_id], 'type of expense'[expense_nature]),
    after_trip_allocation[type_of_expense_id] > 0
)

Percentage calculation for route:
PERCENT_ROUTE = (after_trip_allocation[EXPENSE_ACROSS_ROUTE_EXPENSETYPE]/after_trip_allocation[expense_excl_nature])*100
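The denominator expense_excl_nature is referenced above but its definition is not listed in this document. One plausible reading, given the name, is an expense total with only the expense-nature filter removed while the other filters and the type_of_expense_id rule are kept; a minimal sketch under that assumption:

expense_excl_nature =
CALCULATE(
    SUM(after_trip_allocation[expense]),
    ALL('type of expense'[expense_nature]),    -- assumed: remove only the expense-nature filter
    after_trip_allocation[type_of_expense_id] > 0
)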

Expense across truck id and expense nature:
EXPENSE_ACROSS_TRUCK_EXPENSE_NATURE =
CALCULATE(
    SUM(after_trip_allocation[expense]),
    ALLEXCEPT(after_trip_allocation, Truck_master[Truck_ID], 'type of expense'[expense_nature]),
    after_trip_allocation[type_of_expense_id] > 0
)


The percent expense across the truck is defined as follows:
PERCENT_TRUCK = (after_trip_allocation[EXPENSE_ACROSS_TRUCK_EXPENSE_NATURE]/after_trip_allocation[expense_excl_nature])*100

Attached is the dashboard for the expense analysis:



Calculation of sold bottles:
Sold bottles are the quantity loaded less the returned bottles and the sample bottles.
Attached is the snapshot for the sold bottles.
sold_bottles =
SUM('Before_trip_allocation'[quantity_loaded])
    - SUM('after_trip_allocation_item'[Returned_bottles])
    - CALCULATE(
        SUM(after_trip_allocation_item[returned_full_bottles]),
        after_trip_allocation_item[sample_flag] = "'sample'"
      )

Sold bottles are calculated on the basis of:
·         Vehicle no.
·         Route description
·         Date dimension
·         Item dimension (brand, volume and holding type)
Attached is the dashboard of the sold bottles:



Vehicle no. and route description are the slicers; we can analyze the sold bottles for a particular vehicle and route.


Calculation of sample bottles:
Sample bottles are extracted using the sample flag.
Attached is the snapshot for the sample bottles.
Sample =
CALCULATE(
    SUM(after_trip_allocation_item[returned_full_bottles]),
    after_trip_allocation_item[sample_flag] = "'sample'"
)
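The Sample measure can also be reused inside derived calculations. Below is a minimal sketch of a hypothetical sample-share measure (Sample_pct is not part of the delivered model), assuming the quantity_loaded column shown earlier:

Sample_pct =
DIVIDE(
    [Sample],                                           -- measure defined above
    SUM('Before_trip_allocation'[quantity_loaded])      -- assumed: bottles loaded in the same filter context
) * 100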

Sample bottles are calculated on the basis of:
·         Vehicle no.
·         Route description
·         Date dimension
·         Item dimension (brand, volume and holding type)
Attached is the dashboard of the sample bottles:



Calculation of returned bottles:
Returned bottles are extracted using the sample flag.
Attached is the snapshot for the returned bottles.
Returned_bottles =
CALCULATE(
    SUM(after_trip_allocation_item[returned_full_bottles]),
    after_trip_allocation_item[sample_flag] = "'returned'"
)
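With the sold, sample and returned measures in place, they can be checked side by side with a simple DAX query (for example in DAX Studio). This is a minimal sketch only; the column names Route_master[route_description] and Truck_master[vehicle_no] are assumed from the data model in Appendix A:

EVALUATE
SUMMARIZECOLUMNS(
    'Route_master'[route_description],    -- assumed column name
    'Truck_master'[vehicle_no],           -- assumed column name
    "Sold bottles", [sold_bottles],
    "Sample bottles", [Sample],
    "Returned bottles", [Returned_bottles]
)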

Returned bottles are calculated on the basis of:
·         Vehicle no.
·         Route description
·         Date dimension
·         Item dimension (brand, volume and holding type)
Attached is the dashboard of the returned bottles:



Appendix A
LIQUOR COMPANY MOBILITY APP

The given implementation has been done to automate the field sales data collection for the liquor company.

As a first step, the ETL team deployed the MS Access database at the client end for data capture of the before and after truck load. That is, the ETL team devised a system that captures the data when the truck is loaded with the various SKUs, and the end client inputs the data after the truck has completed its journey.
The benefit to the end client is that the MS Excel spreadsheets shall be replaced by a database and consolidated reporting shall be available to the end client.
Access Implementation
The end client desired a set of reports using Power BI, and the database was created in MS Access using the following features:
·         Calculate the crates loaded on the basis of driver name, area name and sales person.
·         Calculate the daily routine sales and commission report on the basis of brand, volume and holding types.
·         Calculate the total bottles sold after the trip.
The database model for the Access interface is as follows (the data model was developed by reverse engineering the client spreadsheets):




Attached is the explanation of the database model:

Truck Master: The truck master contains all the master data for the truck. It includes the vehicle no., driver no., helper and sales person of the truck.

Route Master: The route master contains all the master data for the route. The route master has a one-to-many relation with the truck master; that is, a truck can travel on various routes or can have the same route on multiple dates. It includes all the information on the route, i.e. when the journey started, what the meter reading was and how many crates were loaded.

Before trip allocation: The before trip allocation contains all the item information before the journey was started, i.e. how much quantity was loaded of which brand, volume and holding type, and which type of load it was: "sales", "promotion" or "sample".

After trip allocation: It contains all the information after the trip of the truck on the given route, i.e. how many full bottles, empty bottles, empty crates and breakage bottles were returned, how much fuel was consumed and the expense details.

After trip allocation item: It contains the item-level information after the trip, i.e. how many full bottles, empty bottles, empty crates and breakage bottles were returned, and which type of material was sold: "sample", "promotion" or "sold".
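Once this model is mapped into Power BI, the relationships can be verified with simple measures. Below is a minimal sketch, assuming the table names above and a one-truck-to-many-route-records relationship as described for the route master; trips_per_truck is a hypothetical measure added for illustration:

trips_per_truck =
COUNTROWS('Route_master')    -- with a truck selected in a slicer, counts the trip records that flow through the relationship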

The database is created in MS Access. There are two buttons: before trip and after trip.
Attached is the snapshot:

The before trip allocation contains all the details of the truck: when the journey was started, how much quantity was loaded of which brand, and what the volume and holding types are.





After trip allocation: the after trip form contains the after trip allocation details and the expense details on the basis of vehicle no. and area name.
Attached is the snapshot:

The after trip allocation form contains the duration of the journey and the description of the sample flag, i.e. whether the item returned was "return" or "sample", on the basis of brand, volume and holding types.




Expense detail: This form contains the expense details of the truck.



Appendix B
Attached is the explanation for the dashboard developed for replicating the audit report.





The above portion has been mapped in Power BI as follows:

Area name and vehicle no.: these two have been assumed to be the filter parameters, as follows:







Vehicle no. and area name have been used as slicers for the given dashboards.



DAX functions: The sample and returned bottles are calculated using DAX.
Attached is the snapshot:





This is the aggregate function; we use DAX for the calculation of the returned bottles.


For the same, the ETL team generated an aggregate for 'after_trip_allocation'.
Attached is the snapshot:




Cross filter sample:



If you are interested in converting spreadsheets to a data model, please contact the following:

Name: Apoorv Chaturvedi
Phone: +91-8802466356
email: apoorv@mnnbi.com

Blog prepared by:

Weighted average calculation using Tableau

 The team has developed a module integrated with Turbodata for perpetual weighted average valuation.

The aggregations to be used for inventory valuations have been developed over Turbodata.

Assumptions: the team has used inward and outward supplies. Inward supplies include all purchase and journal debit entries.
Outward supplies include all sales and journal credit entries.
If the client has item details within payment and receipt vouchers, these are not covered as part of this module.

Benefits of using the Tableau weighted average module:
  •      Complete flexibility in calculating closing stock by the perpetual method across any level of location and time granularity.
  •      Flexibility in calculating inventory ageing across any level of granularity.
  •      The Turbodata integration module ensures that any number of locations can be consolidated in the system across any combination of ERPs.
  •      Stable and mature system in terms of integration across multiple ERPs.
  •      Data validation system built into the system.


Attached is the brief methodology for the calculation of the weighted average valuation:
For the cumulative purchase value and the cumulative purchase quantity, the ETL team used the following method (summarised in the formula after the steps):


Step 1: Sorting the data by item and date.
Step 2: Calculation of the running sum of purchase value.
Step 3: Adding the opening purchase value to the value derived in Step 2 for the given item across each fiscal date.
Step 4: Calculation of the running sum of purchase quantity by item and date.
Step 5: Adding the opening stock quantity for each item by fiscal date to the value derived in Step 4.
Step 6: Calculation of the weighted average rate by dividing the value in Step 3 by the value in Step 5.
Step 7: Calculation of the gross profit based on the weighted average rate in item sales.
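In other words, for a given item the weighted average rate up to a fiscal date d works out as follows (a summary of Steps 1 to 6, using the quantities defined in the calculations further below):

weighted_average_rate (item, d)
    = (opening stock value + running sum of purchase value up to d)
      / (opening stock quantity + running sum of purchase quantity up to d)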
Pane down & quick table calculation (running total):
·        Cumulative_purchase_value
Table Calculation (Running_total)


Pane Down:

Pane Down for the purchase quantity.

·        Cumulative_purchase_qty2




·        Cumulative_purchase_value2: addition of the opening stock value and cumulative purchase value.



For the weighted average calculation the following calculation was used:

·        Weighted_average: cumulative purchase value divided by cumulative purchase quantity.

·        Closing_stock_value: net closing quantity multiplied by the weighted average rate.
Calculations:

·        Cumulative_purchase_no
{EXCLUDE  [Item],DATEPART('day', [ordeR date]):SUM([quantity_debit])}
·        Closing_stock_value
[net_closing_quantity]*[weighted_average]
·        Cumulative_purchase_qty
RUNNING_SUM(ATTR([cumulative_purchase_no]))+SUM([openinG_stock (updated WA & FIFO)].[Dspclqty])
·        Cumulative_purchase_value
{EXCLUDE  [Item],DATEPART('day', [ordeR date]):SUM([purchase_value2])}
·        Cumulative_purchase_value2
[opening_stock_value]+RUNNING_SUM(ATTR([cumulative_purchase_value]))

·        Net_closing_qty

SUM([quantity_debit_credit])+SUM([openinG_stock (updated WA & FIFO)].[Dspclqty])


·        Opening_stock_value
SUM([openinG_stock (updated WA & FIFO)].[Dspclqty])*SUM([openinG_stock (updated WA & FIFO)].[Dspclrate])



·        Purchase_value2
if [Is Deemed Positive]='yes' then [Quantity Appended]*[Rate Appended]
else 0 end

·        Weighted_average
[cumulative_purchase_value2]/[cumulative_purchase_qty2]


Blending concept:
The opening stock number and the transaction data come from two (2) different data sources. For each item the opening stock could be zero or a non-zero value. Accordingly, the ETL team has used blending to get the cumulative sum of the purchase quantity and the opening stock quantity. The process of blending shall ensure the following:
a.)  Duplicates in the source or target tables for item names shall be removed.
The blend relationship is: Cumulative inventory total.[item] = opening_stock_value.[dspdispname]




Final Dashboard:
·        Weighted_average:




·        Opening_closing_stock_valuation:







Prepared by
Deepanshu Garg: deepanshu@mnnbi.com

For further retail, sales, inventory, ledger, taxation and loyalty analytics, please contact the following:
Apoorv Chaturvedi: apoorv@mnnbi.com
Phone: +91-8802466356
