End client problem:
The end client is a Rs. 150 (one hundred and fifty) crore firm trading in steel tubes for casting applications in the automotive sector.
· The end client wanted visibility into its inventory in terms of the following parameters:
o Ageing analysis
o ABC categorization
o Upper and lower control limits for inventory ageing
· The end client data sat in disparate Excel spreadsheets, which had the following issues:
o The data in the spreadsheets was not clean.
o The end client business logic had to be coded.
o There were multiple, disparate spreadsheets for stock, stock valuation and sales.
· Code maintainability: the end client had limited SQL skills, and its managers needed a system that could be maintained easily by internal resources. The end client requirements were as follows:
o Maintenance of the business logic as per its requirements, specifically for the inventory ageing analysis.
To achieve this, the ETL team devised the following course of action:
· Developed the solution using Microsoft Power BI, using it both as an ETL tool and as a reporting tool.
· Converted the entire SQL business logic of the end client into Microsoft Power BI ETL logic.
· Developed the reports and dashboards from the cleansed data in Microsoft Power BI.
· Carried out knowledge transfer with the end client to ensure smooth operations.
Attached is the detailed description of the data cleansing, data auditing and data profiling for the end client.
The data providers given for end client usage are as follows:
· 'A category ageing': a comprehensive analysis of 'A' category items in terms of ABC analysis and ageing.
· ABC category: ABC categorization by weight.
· Ageing_new: a module developed later to include ageing analysis by value.
· Final(2): sales analysis.
· Month_month_analysis: sales analysis with built-in month data.
Stock data:
This is the raw data given by the end client, and it requires data cleansing. The following errors were flagged for the end client:
1.) Grade can be numeric as well as character.
2.) Missing values of dim1, make, grade and dates. These had to be flagged to the end client; for dates, for example, the end client could misrepresent the year of the date input.
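To make such missing-value issues visible to the end client inside Power BI, a flag of the following kind can be added. This is only a minimal sketch, not the end client's actual audit logic; the table name 'Stock' and the column names DIM1, MAKE, GRADE and DATE are assumptions based on the stock sheet described in this document.
DATA_QUALITY_FLAG =
IF (
    ISBLANK ( 'Stock'[DIM1] )
        || ISBLANK ( 'Stock'[MAKE] )
        || ISBLANK ( 'Stock'[GRADE] )
        || ISBLANK ( 'Stock'[DATE] ),
    "REVIEW WITH END CLIENT",  -- rows with missing details are marked for confirmation
    "OK"
)
Filtering a report page on this calculated column lists the rows that need to be confirmed with the end client.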
Cleansing part:
The steps for cleansing were as follows:
· A suffix of '-x' was appended to the grade. This was later removed in Power BI.
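For reference, the suffix removal can be sketched as a DAX calculated column equivalent of that cleanup; the table and column names are assumptions, and the actual removal was performed inside Power BI as described above.
GRADE_CLEAN =
IF (
    RIGHT ( 'Stock'[GRADE], 2 ) = "-x",                    -- grade values carrying the cleansing suffix
    LEFT ( 'Stock'[GRADE], LEN ( 'Stock'[GRADE] ) - 2 ),   -- drop the last two characters
    'Stock'[GRADE]
)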
Sales data:
· The ETL team then loaded the data into Power BI.
Stock sheet:
Next steps for the ETL team:
· Remove the extra variables from the data using Power BI.
· This was done with the split-by-delimiter function.
A conditional column was added within Power BI for the flats, rounds and sections business logic (a sketch of such a rule is given below). Attached is the snapshot for the same:
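The end client's exact rule for flats, rounds and sections is not reproduced in this document; the calculated column below is only an illustrative DAX equivalent of such a conditional column, with placeholder conditions on the assumed D1 and D2 dimension columns.
SECTION_TYPE =
SWITCH (
    TRUE (),
    NOT ISBLANK ( 'Stock'[D1] ) && NOT ISBLANK ( 'Stock'[D2] ), "FLAT",   -- placeholder condition
    NOT ISBLANK ( 'Stock'[D1] ) && ISBLANK ( 'Stock'[D2] ), "ROUND",      -- placeholder condition
    "SECTION"                                                             -- default bucket
)
In the solution described above, this logic was implemented with Power BI's conditional column feature rather than in DAX.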
The excess price column was removed.
Next step:
· Added a custom column to compute the stock value.
· Value = wt * price
Attached is the snapshot for the same:
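For reference, the same custom column can be expressed as a DAX calculated column; BAL_WT and PRICE are the weight and unit price columns used in the SQL procedure later in this document, and the table name 'Stock' is an assumption.
VALUE = 'Stock'[BAL_WT] * 'Stock'[PRICE]  -- stock value = balance weight * unit price
In the solution described above, the column was added with Power BI's own custom column step.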
After that, a group-by was applied:
· Rows were grouped on the basis of grade, make, condition and section stock.
· Operation = sum
This is the required data for reporting. Attached is the snapshot for the same:
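For reference, the same grouping can be sketched in DAX as a summary table. This assumes the table name 'Stock' and a SECTION column for the section-stock attribute, and mirrors the SUMMARIZE pattern used later in this document; the actual step was performed with Power BI's Group By transformation.
STOCK_SUMMARY =
SUMMARIZE (
    'Stock',
    'Stock'[GRADE],
    'Stock'[MAKE],
    'Stock'[CONDITION],
    'Stock'[SECTION],
    "STOCK_WT", SUM ( 'Stock'[BAL_WT] ),     -- sum of balance weight per group
    "STOCK_VALUE", SUM ( 'Stock'[VALUE] )    -- sum of the computed value column per group
)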
The next step for the ETL resource was ABC categorization. The steps for the ABC categorization are as follows (the attached SQL module below was converted into a Power BI module):
ALTER PROCEDURE [dbo].[SP_ABC_CATEGORIZATION_AGEING]
AS
BEGIN
TRUNCATE TABLE ABC_CATEGORIZATION_BBY;
WITH
CTE1
AS(
SELECT A.*
--,B.CATEGORY
FROM
BBY_STOCK_DEC A
--JOIN ABC_CATEGORIZATION1 B
-- ON A.GRADE=B.GRADE
-- AND A.CONDITION=B.CONDITION
-- AND A.MAKE=B.MAKE
-- AND A.DIM1=B.DIM1
WHERE D2 IS NULL
),
CTE2
AS(SELECT DISTINCT A.* FROM CTE1 A
JOIN
ABC_CATEGORIZATION B
ON A.GRADE=B.GRADE
AND A.CONDITION=B.CONDITION
AND A.MAKE=B.MAKE
AND A.D1=B.D1),
CTE2A
AS(SELECT DISTINCT A.*,B.CATEGORY FROM CTE1 A
JOIN
ABC_CATEGORIZATION B
ON A.GRADE=B.GRADE
AND A.CONDITION=B.CONDITION
AND A.MAKE=B.MAKE
AND A.D1=B.D1),
CTE3
AS(SELECT * FROM CTE1
EXCEPT
SELECT * FROM CTE2),
/* PICK UP THAT STOCK WHERE CONDITION IS NULL AND MAKE IS
NULL*/
CTE4
AS(SELECT DISTINCT A.* FROM CTE3 A
JOIN ABC_CATEGORIZATION B
ON A.GRADE=B.GRADE
AND A.D1=B.D1
WHERE A.CONDITION IS NULL
AND A.MAKE IS NULL),
CTE4A
AS(SELECT DISTINCT A.*,B.CATEGORY FROM CTE3 A
JOIN ABC_CATEGORIZATION B
ON A.GRADE=B.GRADE
AND A.D1=B.D1
WHERE A.CONDITION IS NULL
AND A.MAKE IS NULL),
CTE5
AS(SELECT * FROM CTE3
EXCEPT
SELECT * FROM CTE4),
CTE6
AS(SELECT DISTINCT A.* FROM CTE5 A
JOIN ABC_CATEGORIZATION B
ON A.GRADE=B.GRADE
AND A.CONDITION=B.CONDITION
AND A.D1=B.D1
WHERE
A.MAKE IS NULL),
CTE6A
AS(SELECT DISTINCT A.*,B.CATEGORY FROM CTE5 A
JOIN ABC_CATEGORIZATION B
ON A.GRADE=B.GRADE
AND A.CONDITION=B.CONDITION
AND A.D1=B.D1
WHERE
A.MAKE IS NULL),
CTE7
AS(SELECT
* FROM CTE5
EXCEPT
SELECT * FROM
CTE6),
CTE8 AS(SELECT DISTINCT A.* FROM CTE7 A
JOIN
ABC_CATEGORIZATION B
ON A.GRADE=B.GRADE
AND A.MAKE=B.MAKE
AND A.D1=B.D1
WHERE A.CONDITION IS NULL),
CTE8A
AS(SELECT DISTINCT A.*,B.CATEGORY FROM CTE7 A
JOIN
ABC_CATEGORIZATION B
ON A.GRADE=B.GRADE
AND A.MAKE=B.MAKE
AND A.D1=B.D1
WHERE A.CONDITION IS NULL),
CTE9
AS(SELECT * FROM CTE7
EXCEPT
SELECT * FROM CTE8),
CTE9X
AS(SELECT * FROM CTE9
WHERE DATE IS NOT NULL),
CTE9XA
AS(SELECT DISTINCT A.*,B.CATEGORY FROM CTE9X
A
JOIN
ABC_CATEGORIZATION B
ON A.CONDITION=B.CONDITION
AND A.MAKE=B.MAKE
AND A.D1=B.D1
WHERE A.GRADE IS NULL),
CTE9XAB
AS(SELECT DISTINCT A.* FROM CTE9X A
JOIN
ABC_CATEGORIZATION B
ON A.CONDITION=B.CONDITION
AND A.MAKE=B.MAKE
AND A.D1=B.D1
WHERE A.GRADE IS NULL),
CTE9XC
AS(SELECT * FROM CTE9X
EXCEPT
SELECT * FROM CTE9XAB),
CTE9A
AS(SELECT * FROM CTE2A
UNION ALL
SELECT * FROM CTE4A
UNION ALL
SELECT * FROM CTE6A
UNION ALL
SELECT * FROM CTE8A
UNION ALL
SELECT * FROM CTE9XA)
SELECT * INTO #CTE9A FROM
CTE9A;
/*AGEING LOGIC AND TOTAL VALUE LOGIC*/
WITH CTE10
AS(SELECT A.*,
DATEDIFF(DAY,ISNULL(CONVERT(DATETIME, DATE, 104),'2017-01-01 00:00:00.000'),GETDATE()) AS AGEING,
CAST((A.BAL_WT)AS DECIMAL (18,2))* CAST((A.PRICE) AS DECIMAL(18,2)) AS TOTAL_VALUE
FROM #CTE9A A)
INSERT INTO ABC_CATEGORIZATION_BBY
(
[DATE ]
,[LOT NO]
,[GRADE]
,[MAKE]
,[CONDITION]
,[D1]
,[D2]
,[BAL_WT]
,[PRICE]
,[CATEGORY]
,[AGEING]
,[TOTAL_VALUE]
)
SELECT
[DATE ]
,[LOT NO]
,[GRADE]
,[MAKE]
,[CONDITION]
,[D1]
,[D2]
,[BAL_WT]
,[PRICE]
,[CATEGORY]
,[AGEING]
,[TOTAL_VALUE]
FROM
CTE10
END;
· The first step is to find the cumulative sum:
o CUMULATIVE_SUM_2 = CALCULATE(SUM(STOCK_VALUATION_TABLE[VALUE_SUM]),FILTER(ALL(STOCK_VALUATION_TABLE),STOCK_VALUATION_TABLE[Index]<=MAX(STOCK_VALUATION_TABLE[Index])))
· ABC categorization:
ABC = IF(((STOCK_VALUATION_TABLE[CUMULATIVE_SUM_2]/SUM(STOCK_VALUATION_TABLE[VALUE_SUM]))*100)<=50,"A",IF(((STOCK_VALUATION_TABLE[CUMULATIVE_SUM_2]/SUM(STOCK_VALUATION_TABLE[VALUE_SUM]))*100)>50 && ((STOCK_VALUATION_TABLE[CUMULATIVE_SUM_2]/SUM(STOCK_VALUATION_TABLE[VALUE_SUM]))*100)<80,"B","C"))
The next step is the ageing work for the stock. Ageing requires:
· Date difference
o date_diff = DATEDIFF(AGEING_NEW[Stock.Date],today(),DAY)
Attached is the ageing of the stock. Formula for ageing:
· ageing = IF(AGEING_NEW[date_diff]>0 && AGEING_NEW[date_diff]<365,"LESS THAN 1 YEAR",IF(AGEING_NEW[date_diff]>365 && AGEING_NEW[date_diff]<730,"BETWEEN 1 AND 2 YEAR",IF(AGEING_NEW[date_diff]>730 && AGEING_NEW[date_diff]<1095,"BETWEEN 2 AND 3 YEAR","MORE THAN 3 YEAR")))
The final output for the end client was as follows:
· ABC categorization in terms of stock value (sample attached below)
· Inventory ageing analysis in terms of weight (sample attached below)
· Month-on-month analysis (sample attached below)
Dashboard:
Inventory valuations from Turbodata using DAX functionality.
The ETL team had the following target points for converting the SQL code to DAX for the weighted average calculations:
· Reduce deployment times: remove the SQL Server extraction step.
· Reduce execution times: remove the SQL code execution steps.
· Handle large data loads: remove the bottlenecks associated with MS Excel.
· Provide a high-end data auditing and data profiling solution using Power BI.
· Reduce solution maintenance costs: reduce dependence on SQL developers for solution execution.
What does this solution not handle?
· Incremental data loads from large, disparate databases.
The attached document explains the conversion of the complex SQL code to Power BI code.
· Summarization of the debit and credit entries: the ETL team assumed that all debit and credit entries carrying the stock item details would be considered, irrespective of the voucher type details (a sketch of this netting is given below).
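A minimal DAX sketch of how such a signed (debit/credit) quantity can be derived using the is_deemed_positive flag that appears in the measures later in this document; the column name [quantity] follows the quantity_debit definition given below, and the sign convention is an assumption.
quantity_debit_credit =
IF (
    'Cumulative inventory total'[is_deemed_positive] = "yes",
    'Cumulative inventory total'[quantity],       -- debit entries add to stock
    - 'Cumulative inventory total'[quantity]      -- credit entries reduce stock
)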
The attached SQL code was converted into an aggregate using DAX functionality:
SELECT DISTINCT STOCKITEM_NM AS STOCK_ITEM_NAME INTO #DAYBOOK_STOCK_ITEM_NM FROM STG_ITEM;
CREATE INDEX #DAYBOOK_STOCK_ITEM_NM_IDX ON #DAYBOOK_STOCK_ITEM_NM
(STOCK_ITEM_NAME);
--DELETE FROM MOVING_AVERAGE_RATE WHERE ITEM_NAME IN (SELECT STOCK_ITEM_NAME FROM #DAYBOOK_STOCK_ITEM_NM);
WITH CTE1 AS (
SELECT E.DIM_COMPANY_ID,A.RATE,A.SALE_QUANTITY,(A.RATE*A.SALE_QUANTITY) AS SALE_VALUE,A.ORDER_DATE AS DATE_DESC,A.ITEM_SUPPLIER_ORDER_ID,A.ORDER_NO, W.LOCATION ,B.ITEM ,B.DIM_ITEM_ID,W.DIM_WAREHOUSE_ID,D.DIM_DATE_ID
FROM ITEM_SALES_ORDER
A
JOIN DIM_ITEM
B
ON A.DIM_ITEM_ID=B.DIM_ITEM_ID
JOIN DIM_WAREHOUSE
W
ON A.DIM_WAREHOUSE_ID=W.DIM_WAREHOUSE_ID
JOIN DIM_DATE
D
ON A.ORDER_DATE=D.DATE_DESC
JOIN #DAYBOOK_STOCK_ITEM_NM
C
ON B.ITEM=C.STOCK_ITEM_NAME
JOIN DIM_COMPANY
E ON E.COMPANY_NAME=A.COMPANY_NAME
UNION ALL
SELECT E.DIM_COMPANY_ID,A.RATE,A.JOURNAL_CREDIT_QUANTITY,(A.RATE*A.JOURNAL_CREDIT_QUANTITY) AS JOURNAL_CREDIT_VALUE,A.ORDER_DATE AS DATE_DESC, A.ITEM_SUPPLIER_ORDER_ID,A.ORDER_NO,W.LOCATION ,B.ITEM ,B.DIM_ITEM_ID,W.DIM_WAREHOUSE_ID,D.DIM_DATE_ID
FROM ITEM_JOURNAL_CREDIT
A
JOIN DIM_ITEM
B
ON A.DIM_ITEM_ID=B.DIM_ITEM_ID
JOIN DIM_WAREHOUSE
W
ON A.DIM_WAREHOUSE_ID=W.DIM_WAREHOUSE_ID
JOIN DIM_DATE
D
ON A.ORDER_DATE=D.DATE_DESC
JOIN #DAYBOOK_STOCK_ITEM_NM
C
ON B.ITEM=C.STOCK_ITEM_NAME
JOIN DIM_COMPANY
E ON E.COMPANY_NAME=A.COMPANY_NAME
--where B.DIM_ITEM_ID=696
)
SELECT * INTO #CTE1 FROM CTE1;
--SELECT * FROM #CTE1
WITH CTE2 AS(
SELECT E.DIM_COMPANY_ID,A.RATE,A.PURCHASE_QUANTITY,(A.RATE*A.PURCHASE_QUANTITY) AS PURHASE_VALUE,A.ORDER_DATE AS DATE_DESC,A.ITEM_SUPPLIER_ORDER_ID,A.ORDER_NO, W.LOCATION ,B.ITEM ,B.DIM_ITEM_ID,W.DIM_WAREHOUSE_ID,D.DIM_DATE_ID
FROM ITEM_PURCHASE_ORDER
A
JOIN DIM_ITEM
B
ON A.DIM_ITEM_ID=B.DIM_ITEM_ID
JOIN DIM_WAREHOUSE
W
ON A.DIM_WAREHOUSE_ID=W.DIM_WAREHOUSE_ID
JOIN DIM_DATE
D
ON A.ORDER_DATE=D.DATE_DESC
JOIN #DAYBOOK_STOCK_ITEM_NM
C
ON B.ITEM=C.STOCK_ITEM_NAME
JOIN DIM_COMPANY
E ON E.COMPANY_NAME=A.COMPANY_NAME
--where B.DIM_ITEM_ID=696
UNION ALL
SELECT E.DIM_COMPANY_ID,A.RATE,A.JOURNAL_DEBIT_QUANTITY,(A.RATE*A.JOURNAL_DEBIT_QUANTITY) AS JOURNAL_DEBIT_VALUE,A.ORDER_DATE AS DATE_DESC,A.ITEM_SUPPLIER_ORDER_ID,A.ORDER_NO, W.LOCATION ,B.ITEM ,B.DIM_ITEM_ID,W.DIM_WAREHOUSE_ID,D.DIM_DATE_ID
FROM ITEM_JOURNAL_DEBIT
A
JOIN DIM_ITEM
B
ON A.DIM_ITEM_ID=B.DIM_ITEM_ID
JOIN DIM_WAREHOUSE
W
ON A.DIM_WAREHOUSE_ID=W.DIM_WAREHOUSE_ID
JOIN DIM_DATE
D
ON A.ORDER_DATE=D.DATE_DESC
JOIN #DAYBOOK_STOCK_ITEM_NM
C
ON B.ITEM=C.STOCK_ITEM_NAME
JOIN DIM_COMPANY
E ON E.COMPANY_NAME=A.COMPANY_NAME
--where B.DIM_ITEM_ID=696
)
SELECT * INTO #CTE2 FROM CTE2;
WITH
CTE3 AS(
SELECT DATE_DESC, LOCATION ,ITEM ,DIM_ITEM_ID,
DIM_WAREHOUSE_ID,
DIM_DATE_ID,DIM_COMPANY_ID
FROM
#CTE1
UNION ALL
SELECT DATE_DESC, LOCATION ,ITEM ,DIM_ITEM_ID,
DIM_WAREHOUSE_ID,
DIM_DATE_ID,DIM_COMPANY_ID
FROM #CTE2),
CTE4 AS(
SELECT ROW_NUMBER() OVER(ORDER BY
DIM_WAREHOUSE_ID ,
DIM_ITEM_ID,DIM_DATE_ID, DATE_DESC,DIM_COMPANY_ID)AS ROWID,DIM_DATE_ID,
DIM_WAREHOUSE_ID ,
DIM_ITEM_ID ,DATE_DESC,DIM_COMPANY_ID
FROM CTE3)
The DAX code used to convert the SQL code is as follows:
SUMMARIZE('Cumulative inventory total','Cumulative inventory total'[ITEM],'Cumulative inventory total'[ordeR_date],"total_purchase",sum('Cumulative inventory total'[quantity_debit]),"total_purchase_value",sum('Cumulative inventory total'[purchase_value_2]),"net_debit_credit_qty",sum('Cumulative inventory total'[quantity_debit_credit]))
Explanation of quantity_debit: if('Cumulative inventory total'[is_deemed_positive]="yes",'Cumulative inventory total'[quantity],0)
Purchase_value_2: IF('Cumulative inventory total'[is_deemed_positive]="yes",'Cumulative inventory total'[QUANTITY_APPENDED]*'Cumulative inventory total'[RATE_APPENDED],0)
Net_debit_credit_qty: aggregated as the third extension column of the SUMMARIZE expression above, i.e. sum('Cumulative inventory total'[quantity_debit_credit]).
The perpetual weighted average inventory calculation, which used a complex WHILE loop across all the stock items as given below, was removed. For audit purposes the end client expected inventory valuations on a daily basis in order to catch data entry errors. The DAX code instead uses a running sum across all date ranges for the stock items to capture the historical stock valuations.
CTE5 AS(
SELECT MIN(ROWID)AS MINROW,DIM_DATE_ID,
DIM_WAREHOUSE_ID ,
DIM_ITEM_ID ,DATE_DESC,DIM_COMPANY_ID
FROM CTE4
GROUP BY DIM_DATE_ID,
DIM_WAREHOUSE_ID ,
DIM_ITEM_ID,DATE_DESC,DIM_COMPANY_ID)
INSERT INTO [MOVING_AVERAGE_RATE]( DIM_DATE_ID,DIM_ITEM_ID,DATE_DESC,DIM_COMPANY_ID)
SELECT
DIM_DATE_ID,DIM_ITEM_ID,DATE_DESC,DIM_COMPANY_ID
FROM CTE5
ORDER BY
DIM_ITEM_ID,DIM_DATE_ID, DATE_DESC,DIM_COMPANY_ID;
WITH CTE_PURCHASE AS (
SELECT SUM(PURCHASE_QUANTITY) AS PURCHASE_QUANTITY,SUM(PURHASE_VALUE) AS PURCHASE_VALUE,
DIM_ITEM_ID,DIM_DATE_ID,DIM_COMPANY_ID,DIM_WAREHOUSE_ID
FROM #CTE2
GROUP BY DIM_ITEM_ID,DIM_DATE_ID,DIM_COMPANY_ID,DIM_WAREHOUSE_ID)
UPDATE MOVING_AVERAGE_RATE
SET PURCHASE_QUANTITY= ISNULL(B.PURCHASE_QUANTITY,0),
PURCHASE_VALUE=ISNULL(B.PURCHASE_VALUE,0)
FROM MOVING_AVERAGE_RATE
A
LEFT OUTER JOIN CTE_PURCHASE
B
ON
--A.DIM_WAREHOUSE_ID=B.DIM_WAREHOUSE_ID
--AND
A.DIM_ITEM_ID=B.DIM_ITEM_ID
AND A.DIM_DATE_ID=B.DIM_DATE_ID
AND A.DIM_COMPANY_ID=B.DIM_COMPANY_ID;
WITH CTE_SALES AS (
SELECT SUM(SALE_QUANTITY) AS SALE_QUANTITY,SUM(SALE_VALUE) AS SALE_VALUE,
DIM_ITEM_ID,DIM_DATE_ID,DIM_COMPANY_ID,DIM_WAREHOUSE_ID
FROM #CTE1
GROUP BY DIM_ITEM_ID,DIM_DATE_ID,DIM_COMPANY_ID,DIM_WAREHOUSE_ID)
UPDATE MOVING_AVERAGE_RATE
SET SALE_QUANTITY= ISNULL(B.SALE_QUANTITY,0),
SALE_VALUE=ISNULL(B.SALE_VALUE,0)
FROM MOVING_AVERAGE_RATE
A
LEFT OUTER JOIN CTE_SALES
B
ON A.DIM_ITEM_ID=B.DIM_ITEM_ID
AND A.DIM_DATE_ID=B.DIM_DATE_ID
AND A.DIM_COMPANY_ID=B.DIM_COMPANY_ID;
WITH CTE1
AS(SELECT ROW_NUMBER() OVER(OrdeR BY ltrim(rtrim(replace(REPLACE(item_name,char(13),''),char(10),'')))) AS ROW_NUMBER2,
ltrim(rtrim(replace(REPLACE(item_name,char(13),''),char(10),''))) AS item_name,
CAST(QTY AS FLOAT)AS QTY,CAST(VALUE AS FLOAT) AS VALUE,CAST(RATE AS FLOAT) AS RATE FROM dbo.OPENING_STOCK
WHERE VALUE IS NOT NULL
GROUP BY ltrim(rtrim(replace(REPLACE(item_name,char(13),''),char(10),'')))
,CAST(VALUE AS FLOAT),CAST(QTY AS FLOAT),CAST(RATE AS FLOAT)
),
CTE2
AS(SELECT MIN(ROW_NUMBER2) AS MIN_ROW_ID,
item_name
FROM CTE1
GROUP BY item_name),
CTE3
AS(SELECT A.* FROM CTE1
A
JOIN CTE2 B ON
A.ROW_NUMBER2=B.MIN_ROW_ID
--WHERE A.QTY IS NOT NULL
)
UPDATE MOVING_AVERAGE_RATE
SET OPENING_STOCK=ISNULL(CAST(CTE3.QTY AS FLOAT),0),
OPENING_STOCK_VALUE=ISNULL(CAST(CTE3.VALUE AS FLOAT),0)
,
MOVING_AVERAGE_RATE=CASE WHEN (ISNULL(CAST(CTE3.QTY AS FLOAT),0)+CAST(PURCHASE_QUANTITY AS FLOAT)-CAST(SALE_QUANTITY AS FLOAT))=0 THEN 0
WHEN ISNULL(CAST(CTE3.QTY AS FLOAT),0)=0 AND CAST(PURCHASE_QUANTITY AS FLOAT)=0 THEN 0
ELSE ((ISNULL(CAST(CTE3.VALUE AS FLOAT),0)+(PURCHASE_VALUE))/(ISNULL(CAST(CTE3.QTY AS FLOAT),0)+CAST(PURCHASE_QUANTITY AS FLOAT)))
END
/*CASE WHEN (ISNULL(OPENING_STOCK,0)+CAST(PURCHASE_QUANTITY AS
FLOAT)-CAST(SALE_QUANTITY AS FLOAT))=0 THEN 0
WHEN ISNULL(OPENING_STOCK,0)=0 AND CAST(PURCHASE_QUANTITY AS FLOAT)>0 THEN
CAST(PURCHASE_VALUE AS FLOAT)/CAST(PURCHASE_QUANTITY AS FLOAT)
WHEN ISNULL(OPENING_STOCK,0) >0 AND PURCHASE_QUANTITY >0
THEN ((ISNULL(OPENING_STOCK_VALUE,0)+(PURCHASE_VALUE))/(CAST(ISNULL(OPENING_STOCK,0)
AS FLOAT)+CAST(PURCHASE_QUANTITY AS FLOAT)))
ELSE (ISNULL(OPENING_STOCK_VALUE,0))/CAST(ISNULL(OPENING_STOCK,0)
AS FLOAT)
END*/
,
--COST_OF_GOODS_SOLD = CASE WHEN PURCHASE_QUANTITY=0 THEN 0 ELSE SALE_QUANTITY*(PURCHASE_VALUE/PURCHASE_QUANTITY) END,
/*
COST_OF_GOODS_SOLD=(CASE
--WHEN (OPENING_STOCK+CAST(PURCHASE_QUANTITY AS
FLOAT)-CAST(SALE_QUANTITY AS FLOAT))=0 THEN 0
WHEN CAST(ISNULL(OPENING_STOCK,0) AS FLOAT)=0 AND CAST(PURCHASE_QUANTITY AS
FLOAT)>0 THEN CAST(PURCHASE_VALUE AS FLOAT)/CAST(PURCHASE_QUANTITY AS FLOAT)
WHEN CAST(ISNULL(OPENING_STOCK,0) AS FLOAT) >0 AND
CAST(PURCHASE_QUANTITY AS FLOAT) >0 THEN
((ISNULL(OPENING_STOCK_VALUE,0)+(PURCHASE_VALUE))/(CAST(ISNULL(OPENING_STOCK,0)
AS FLOAT)+CAST(PURCHASE_QUANTITY AS FLOAT)))
ELSE (ISNULL(OPENING_STOCK_VALUE,0))/CAST(ISNULL(OPENING_STOCK,0)
AS FLOAT)
END)*SALE_QUANTITY,*/
CLOSING_STOCK_VALUE= ((ISNULL(CAST(CTE3.QTY AS FLOAT),0)+CAST(PURCHASE_QUANTITY AS FLOAT)-CAST(SALE_QUANTITY AS FLOAT)))*
(CASE WHEN (ISNULL(CAST(CTE3.QTY AS FLOAT),0)+CAST(PURCHASE_QUANTITY AS FLOAT)-CAST(SALE_QUANTITY AS FLOAT))=0 THEN 0
WHEN ISNULL(CAST(CTE3.QTY AS FLOAT),0)=0 AND CAST(PURCHASE_QUANTITY AS FLOAT)=0 THEN 0
ELSE ((ISNULL(CAST(CTE3.VALUE AS FLOAT),0)+(PURCHASE_VALUE))/(ISNULL(CAST(CTE3.QTY AS FLOAT),0)+CAST(PURCHASE_QUANTITY AS FLOAT)))
END),
/*(CASE WHEN (ISNULL(OPENING_STOCK,0)+CAST(PURCHASE_QUANTITY AS
FLOAT)-CAST(SALE_QUANTITY AS FLOAT))=0 THEN 0
WHEN ISNULL(OPENING_STOCK,0)=0 AND CAST(PURCHASE_QUANTITY AS FLOAT)>0 THEN
CAST(PURCHASE_VALUE AS FLOAT)/CAST(PURCHASE_QUANTITY AS FLOAT)
WHEN ISNULL(OPENING_STOCK,0) >0 AND PURCHASE_QUANTITY >0
THEN
((ISNULL(OPENING_STOCK_VALUE,0)+(PURCHASE_VALUE))/(CAST(ISNULL(OPENING_STOCK,0)
AS FLOAT)+CAST(PURCHASE_QUANTITY AS FLOAT)))
ELSE (ISNULL(OPENING_STOCK_VALUE,0))/CAST(ISNULL(OPENING_STOCK,0)
AS FLOAT)
END)*(CAST(ISNULL(OPENING_STOCK,0) AS FLOAT)+CAST(PURCHASE_QUANTITY AS
FLOAT)-CAST(SALE_QUANTITY AS FLOAT)),*/
CLOSING_STOCK=(ISNULL(CAST(CTE3.QTY AS FLOAT),0)+CAST(PURCHASE_QUANTITY AS FLOAT)-CAST(SALE_QUANTITY AS FLOAT)),
COST_OF_GOODS_SOLD = ISNULL(CAST(CTE3.VALUE AS FLOAT),0)+CAST(PURCHASE_VALUE AS FLOAT)-((ISNULL(CAST(CTE3.QTY AS FLOAT),0)+CAST(PURCHASE_QUANTITY AS FLOAT)-CAST(SALE_QUANTITY AS FLOAT)))*
(CASE WHEN (ISNULL(CAST(CTE3.QTY AS FLOAT),0)+CAST(PURCHASE_QUANTITY AS FLOAT)-CAST(SALE_QUANTITY AS FLOAT))=0 THEN 0
WHEN ISNULL(CAST(CTE3.QTY AS FLOAT),0)=0 AND CAST(PURCHASE_QUANTITY AS FLOAT)=0 THEN 0
ELSE ((ISNULL(CAST(CTE3.VALUE AS FLOAT),0))+(PURCHASE_VALUE))/(ISNULL(CAST(CTE3.QTY AS FLOAT),0)+CAST(PURCHASE_QUANTITY AS FLOAT))
END)
FROM MOVING_AVERAGE_RATE
A
JOIN DIM_ITEM
B
ON A.DIM_ITEM_ID=B.DIM_ITEM_ID
LEFT OUTER JOIN CTE3 ON CTE3.ITEM_NAME=B.ITEM
WHERE MOVING_AVERAGE_RATE_ID IN (SELECT MIN(MOVING_AVERAGE_RATE_ID) FROM MOVING_AVERAGE_RATE
A
GROUP BY DIM_ITEM_ID);
DELETE FROM MOVING_AVERAGE_RATE WHERE CLOSING_STOCK<0;
DECLARE @COUNT INT;
DECLARE @COUNT_ITEM_START INT;
DECLARE @COUNT_ITEM INT;
DECLARE @CHECKNO INT;
DECLARE @MOVING_AVERAGE_RATE DECIMAL(15,3);
DECLARE @CLOSING_STOCK DECIMAL(15,3);
DECLARE @CLOSING_STOCK_VALUE DECIMAL(15,3);
--DECLARE @MIN_DIM_WAREHOUSE_ID INT;
--SET @MIN_DIM_WAREHOUSE_ID=(SELECT MIN(DIM_WAREHOUSE_ID) FROM OPENING_STOCK);
--DECLARE @MAX_DIM_WAREHOUSE_ID INT;
--SET @MAX_DIM_WAREHOUSE_ID=(SELECT MAX(DIM_WAREHOUSE_ID) FROM OPENING_STOCK);
SET @COUNT_ITEM_START=1;
--WHILE(@MIN_DIM_WAREHOUSE_ID<=@MAX_DIM_WAREHOUSE_ID)
--BEGIN
SET @COUNT_ITEM_START=1;
SET @COUNT_ITEM=(SELECT COUNT(DIM_ITEM_ID) FROM DIM_ITEM);
WHILE @COUNT_ITEM_START<=@COUNT_ITEM
BEGIN
SET @COUNT=(SELECT MAX(MOVING_AVERAGE_RATE_ID) FROM MOVING_AVERAGE_RATE WHERE DIM_ITEM_ID=@COUNT_ITEM_START)
SET @CHECKNO=(SELECT MIN(MOVING_AVERAGE_RATE_ID)+1 FROM MOVING_AVERAGE_RATE WHERE DIM_ITEM_ID=@COUNT_ITEM_START) ;
WHILE(@CHECKNO<=@COUNT)
BEGIN
UPDATE MOVING_AVERAGE_RATE
SET OPENING_STOCK=S.CLOSING_STOCK,
OPENING_STOCK_VALUE=S.CLOSING_STOCK_VALUE,
@MOVING_AVERAGE_RATE=S.MOVING_AVERAGE_RATE,
--,COST_OF_GOODS_SOLD=S.MOVING_AVERAGE_RATE*A.SALE_QUANTITY,
@CLOSING_STOCK=ISNULL(S.CLOSING_STOCK,0),
@CLOSING_STOCK_VALUE=ISNULL(S.CLOSING_STOCK_VALUE,0)
FROM MOVING_AVERAGE_RATE
A
JOIN (SELECT A.MOVING_AVERAGE_RATE_ID,B.CLOSING_STOCK,B.CLOSING_STOCK_VALUE,B.MOVING_AVERAGE_RATE
FROM MOVING_AVERAGE_RATE
A
JOIN MOVING_AVERAGE_RATE
B
ON A.MOVING_AVERAGE_RATE_ID=B.MOVING_AVERAGE_RATE_ID+1
WHERE A.MOVING_AVERAGE_RATE_ID IN (SELECT MIN(MOVING_AVERAGE_RATE_ID) FROM MOVING_AVERAGE_RATE
WHERE OPENING_STOCK IS NULL
AND DIM_ITEM_ID=@COUNT_ITEM_START
GROUP BY DIM_ITEM_ID,DIM_DATE_ID)
)S
ON S.MOVING_AVERAGE_RATE_ID=A.MOVING_AVERAGE_RATE_ID
JOIN DIM_ITEM
D
ON A.DIM_ITEM_ID=D.DIM_ITEM_ID
AND A.DIM_ITEM_ID=@COUNT_ITEM_START
AND D.ITEM IN (SELECT STOCK_ITEM_NAME FROM #DAYBOOK_STOCK_ITEM_NM)
--AND A.DIM_WAREHOUSE_ID=@MIN_DIM_WAREHOUSE_ID
--AND (ISNULL(A.OPENING_STOCK,0)+CAST(A.PURCHASE_QUANTITY AS FLOAT)-CAST(A.SALE_QUANTITY AS FLOAT))>=0
UPDATE MOVING_AVERAGE_RATE
SET
CLOSING_STOCK=OPENING_STOCK+PURCHASE_QUANTITY-SALE_QUANTITY,
MOVING_AVERAGE_RATE=CASE WHEN (ISNULL(OPENING_STOCK,0)+CAST(PURCHASE_QUANTITY AS FLOAT)-CAST(SALE_QUANTITY AS FLOAT))=0 THEN 0
WHEN ISNULL(OPENING_STOCK,0)=0 AND CAST(PURCHASE_QUANTITY AS FLOAT)=0 THEN @MOVING_AVERAGE_RATE
WHEN (ISNULL(OPENING_STOCK,0)+CAST(PURCHASE_QUANTITY AS FLOAT)-CAST(SALE_QUANTITY AS FLOAT))<0 THEN @MOVING_AVERAGE_RATE*(-1)
ELSE ((ISNULL(OPENING_STOCK_VALUE,0)+(PURCHASE_VALUE))/(CAST(OPENING_STOCK AS FLOAT)+CAST(PURCHASE_QUANTITY AS FLOAT)))
END,
/*COST_OF_GOODS_SOLD=
/*(CASE WHEN
(ISNULL(@CLOSING_STOCK,0)+CAST(ISNULL(PURCHASE_QUANTITY,0) AS
FLOAT)-CAST(ISNULL(SALE_QUANTITY,0) AS FLOAT))=0 THEN 0
WHEN ISNULL(@CLOSING_STOCK,0)=0 AND CAST(ISNULL(PURCHASE_QUANTITY,0) AS
FLOAT)>0 THEN CAST(ISNULL(PURCHASE_VALUE,0) AS FLOAT)/CAST(PURCHASE_QUANTITY
AS FLOAT)
WHEN ISNULL(@CLOSING_STOCK,0)>0 AND ISNULL(PURCHASE_QUANTITY,0)
>0 THEN
((ISNULL(@CLOSING_STOCK_VALUE,0)+(PURCHASE_VALUE))/(CAST(ISNULL(@CLOSING_STOCK,0)
AS FLOAT)+CAST(PURCHASE_QUANTITY AS FLOAT)))
WHEN ISNULL(@CLOSING_STOCK,0) >0 AND
ISNULL(PURCHASE_QUANTITY,0) =0 THEN (ISNULL(@CLOSING_STOCK_VALUE,0))/CAST(ISNULL(@CLOSING_STOCK,0)
AS FLOAT)
ELSE @MOVING_AVERAGE_RATE END)*/
(CASE WHEN (ISNULL(OPENING_STOCK,0)+CAST(PURCHASE_QUANTITY AS
FLOAT)-CAST(SALE_QUANTITY AS FLOAT))=0 THEN 0
WHEN ISNULL(OPENING_STOCK,0)=0 AND CAST(PURCHASE_QUANTITY AS FLOAT)=0 THEN
@MOVING_AVERAGE_RATE
ELSE
((ISNULL(OPENING_STOCK_VALUE,0)+(PURCHASE_VALUE))/(CAST(OPENING_STOCK AS
FLOAT)+CAST(PURCHASE_QUANTITY AS FLOAT)))
END)*A.SALE_QUANTITY,*/
--CLOSING_STOCK_VALUE=OPENING_STOCK_VALUE+PURCHASE_VALUE-COST_OF_GOODS_SOLD
CLOSING_STOCK_VALUE=CASE WHEN (OPENING_STOCK+PURCHASE_QUANTITY-SALE_QUANTITY=0) THEN 0
ELSE
(OPENING_STOCK_VALUE+PURCHASE_VALUE-
((CASE WHEN (ISNULL(OPENING_STOCK,0)+CAST(PURCHASE_QUANTITY AS FLOAT)-CAST(SALE_QUANTITY AS FLOAT))=0 THEN 0
WHEN ISNULL(OPENING_STOCK,0)=0 AND CAST(PURCHASE_QUANTITY AS FLOAT)=0 THEN @MOVING_AVERAGE_RATE
WHEN (ISNULL(OPENING_STOCK,0)+CAST(PURCHASE_QUANTITY AS FLOAT)-CAST(SALE_QUANTITY AS FLOAT))<0 THEN @MOVING_AVERAGE_RATE*(-1)
ELSE ((ISNULL(OPENING_STOCK_VALUE,0)+(PURCHASE_VALUE))/(CAST(OPENING_STOCK AS FLOAT)+CAST(PURCHASE_QUANTITY AS FLOAT)))
END)*A.SALE_QUANTITY)) END ,
COST_OF_GOODS_SOLD=
(CASE WHEN (ISNULL(OPENING_STOCK,0)+CAST(PURCHASE_QUANTITY AS FLOAT)-CAST(SALE_QUANTITY AS FLOAT))=0 THEN 0
WHEN ISNULL(OPENING_STOCK,0)=0 AND CAST(PURCHASE_QUANTITY AS FLOAT)=0 THEN @MOVING_AVERAGE_RATE
WHEN (ISNULL(OPENING_STOCK,0)+CAST(PURCHASE_QUANTITY AS FLOAT)-CAST(SALE_QUANTITY AS FLOAT))<0 THEN @MOVING_AVERAGE_RATE*(-1)
ELSE
((ISNULL(OPENING_STOCK_VALUE,0)+PURCHASE_VALUE))/(CAST(OPENING_STOCK AS FLOAT)+CAST(PURCHASE_QUANTITY AS FLOAT))
END)*A.SALE_QUANTITY
/*ISNULL(@CLOSING_STOCK_VALUE,0)+PURCHASE_VALUE-(CASE WHEN
(ISNULL(@CLOSING_STOCK,0)+CAST(ISNULL(PURCHASE_QUANTITY,0) AS
FLOAT)-CAST(ISNULL(SALE_QUANTITY,0) AS FLOAT))=0 THEN 0
WHEN ISNULL(@CLOSING_STOCK,0)=0 AND CAST(PURCHASE_QUANTITY AS FLOAT)>0 THEN
CAST(PURCHASE_VALUE AS FLOAT)/CAST(PURCHASE_QUANTITY AS FLOAT)
WHEN @CLOSING_STOCK >0 AND PURCHASE_QUANTITY >0 THEN
((@CLOSING_STOCK_VALUE+(PURCHASE_VALUE))/(CAST(@CLOSING_STOCK AS
FLOAT)+CAST(PURCHASE_QUANTITY AS FLOAT)))
WHEN @CLOSING_STOCK >0 AND PURCHASE_QUANTITY =0
THEN (@CLOSING_STOCK_VALUE)/CAST(@CLOSING_STOCK AS FLOAT)
ELSE @MOVING_AVERAGE_RATE END)*A.SALE_QUANTITY*/
FROM MOVING_AVERAGE_RATE
A
JOIN DIM_ITEM
D
ON A.DIM_ITEM_ID=D.DIM_ITEM_ID
AND A.DIM_ITEM_ID=@COUNT_ITEM_START
--AND A.DIM_WAREHOUSE_ID=@MIN_DIM_WAREHOUSE_ID
WHERE A.MOVING_AVERAGE_RATE_ID IN (SELECT MAX(MOVING_AVERAGE_RATE_ID) FROM MOVING_AVERAGE_RATE
WHERE OPENING_STOCK IS NOT NULL
AND DIM_ITEM_ID=@COUNT_ITEM_START
--AND (ISNULL(A.OPENING_STOCK,0)+CAST(A.PURCHASE_QUANTITY AS FLOAT)-CAST(A.SALE_QUANTITY AS FLOAT))>=0
GROUP BY DIM_ITEM_ID)
AND D.ITEM IN (SELECT STOCK_ITEM_NAME FROM #DAYBOOK_STOCK_ITEM_NM);
SET @CLOSING_STOCK=(SELECT (OPENING_STOCK+PURCHASE_QUANTITY-SALE_QUANTITY) FROM MOVING_AVERAGE_RATE
WHERE DIM_ITEM_ID=@COUNT_ITEM_START
AND MOVING_AVERAGE_RATE_ID=@CHECKNO);
IF @CLOSING_STOCK<0 BREAK;
SET @CHECKNO=@CHECKNO+1;
END;
SET @COUNT_ITEM_START=@COUNT_ITEM_START+1;
PRINT 'The following item id has been loaded '+CAST(@COUNT_ITEM_START AS VARCHAR);
END;
--SET @MIN_DIM_WAREHOUSE_ID=@MIN_DIM_WAREHOUSE_ID+1;
--PRINT 'The following warehouse id has been loaded '+CAST(@MIN_DIM_WAREHOUSE_ID AS VARCHAR);
--END;
END;
The following is the DAX code for the same:
Cumulative of purchase quantity:
cumulative_purchase_no = CALCULATE(SUM('table2'[total_purchase]),filter(allexcept('table2','table2'[ITEM]),'table2'[ordeR_date]<=max('table2'[ordeR_date])))
Cumulative of purchase value:
cumulative_purchase value = CALCULATE(SUM('table2'[total_purchase_value]),filter(allexcept('table2','table2'[ITEM]),'table2'[ordeR_date]<=max('table2'[ordeR_date])))
cumulative_purchase_qty2 = 'table2'[cumulative_purchase_no]+'table2'[opening_stock qty]
net_closing_qty = SUM('table2'[net_debit_credit_qty]) + SUM('table2'[opening_stock qty])
opening_stock qty = LOOKUPVALUE(openinG_stock[DSPCLQTY],openinG_stock[DSPDISPNAME],'table2'[ITEM])
opening_stock rate = LOOKUPVALUE(openinG_stock[DSPCLRATE],openinG_stock[DSPDISPNAME],'table2'[ITEM])
weighted_average = DIVIDE([cumulative_purchase value2], SUM('table2'[cumulative_purchase_qty2]))
Final results of the code:
· Attached are the final results for the process:
o Reduced execution times
o Reduced times for the weighted average calculations
o Better error handling and logging capabilities
o Reduced deployment requirements
Prepared by: Ankur
LinkedIn profile: https://www.linkedin.com/in/ankur-gupta-3b9a71179
Email: apoorv@mnnbi.com
Website: www.mnnbi.com
Phone: +91-8802466356
(For additional details on how Power BI could be used for better reporting, contact the above.)