
Tuesday, 13 February 2018

Optimizing Inventory for a large trading company



The end client is a large trading company based in Delhi that trades in steel sheets and steel pipes. The steel sheets and pipes are used for casting designs in the automotive sector.
Problem statement: The end client is looking to improve its inventory turnover ratios (as per the blog attached herewith), and its hypothesis is that it holds a large amount of slow-moving stock.
Approach of the ETL team, along with the management consulting firm (Govisory Services Private Limited):
·         Find the A, B, C categories of stock for the items: the top 70% of the closing stock value as of the fiscal date was classified as category ‘A’, the next 20% as category ‘B’ and the last 10% as category ‘C’.
·         Inventory ageing analysis was done to calculate the age of the closing stock.
·         Calculation of the over stock and under stock items: for the same, the ETL team followed the steps below (see the sketch after this list):

o   Calculation of the average sale value: the year-to-date sale value of the item was divided by the number of active fiscal months in which the item was sold. Active fiscal months span the period from the first sale month for the item to the fiscal year end (31st December 2017).
o   The lower and upper control limits were set at 6 times and 9 times the average sale value respectively.
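For illustration, a minimal pandas sketch of the ABC split and the over/under-stock check described above; the frames and column names (stock with item and closing_value, sales with item, sale_value and month) are assumptions, not the team's actual code.

    import pandas as pd

    # ABC split: rank items by closing stock value and cut the cumulative
    # share at 70% (A), 90% (B) and 100% (C).
    def abc_classify(stock):
        s = stock.sort_values("closing_value", ascending=False).copy()
        share = s["closing_value"].cumsum() / s["closing_value"].sum()
        s["category"] = pd.cut(share, bins=[0.0, 0.70, 0.90, 1.0],
                               labels=["A", "B", "C"], include_lowest=True)
        return s

    # Over/under-stock flags from the average sale value and the 6x/9x limits.
    def stock_flags(stock, sales, fiscal_end=pd.Timestamp("2017-12-31")):
        ytd = sales.groupby("item")["sale_value"].sum()
        first_sale = sales.groupby("item")["month"].min()
        # Active fiscal months: first sale month through the fiscal year end.
        active_months = ((fiscal_end.year - first_sale.dt.year) * 12
                         + (fiscal_end.month - first_sale.dt.month) + 1)
        avg_sale = ytd / active_months
        out = stock.set_index("item")
        out["lcl"] = 6 * avg_sale                  # lower control limit
        out["ucl"] = 9 * avg_sale                  # upper control limit
        out["flag"] = "ok"
        out.loc[out["closing_value"] < out["lcl"], "flag"] = "under stock"
        out.loc[out["closing_value"] > out["ucl"], "flag"] = "over stock"
        return out.reset_index()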
Key findings: most of the category ‘A’ stock items were found to be over stock using the average method. The ETL team, along with the consulting team, found that it was the variance, not the averages, that was the problem (as in The Goal methodology). The end client had significant variations in sales between fiscal months for these items.
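A quick way to surface that variance problem is the coefficient of variation of monthly sales per item; a short sketch reusing the assumed sales frame from above:

    # Coefficient of variation of monthly sales per item: high values point
    # to variance, not the mean, as the driver of the stocking problem.
    monthly = sales.groupby(["item", "month"])["sale_value"].sum()
    cv = monthly.groupby("item").std() / monthly.groupby("item").mean()
    high_variance_items = cv[cv > 1.0].index   # the 1.0 cut-off is an assumption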

Next step: to measure the effect of these variances on sales, the ETL team carried out a time series analysis over the last 2(two) fiscal years for the category ‘A’ items. This was done with the ETL team’s proprietary ARIMA (Auto Regressive Integrated Moving Average) code.
Based on these findings, for most of the items the level was the most important indicator.
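The team's ARIMA code is proprietary; as an open stand-in, a statsmodels sketch over 24 months (two fiscal years) of the monthly series built above might look as follows. The item code and the (p, d, q) order are assumptions.

    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.tsa.seasonal import seasonal_decompose

    series = monthly.loc["ITEM-001"]            # hypothetical item code
    fit = ARIMA(series, order=(1, 1, 1)).fit()  # order chosen for illustration
    forecast = fit.forecast(steps=3)            # next three fiscal months

    # Decomposing the series shows how much of it is carried by the level
    # (trend) component, matching the finding quoted above.
    parts = seasonal_decompose(series, model="additive", period=12)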

Other activities done by the ETL team took inputs from MS Excel spreadsheets.


Presented by:

Ritu Lakhani
Phone: 0124-4365845
Website: www.mnnbi.com



Wednesday, 31 January 2018

Inventory optimization for a large hospital

Problem: for the given hospital, the end client desired to optimize its inventory. The end client had a very high level of closing stock and wanted to optimize it using the Turbodata services.

Approach: the Turbodata team worked under a top-4 consulting firm. The end client had been given a target Share of Business (SOB) at the generic, strength and UOM level. The actual share of business was calculated and compared against the target share of business.
The ETL team first looked at the category ‘A’ items in terms of closing stock numbers at the generic, strength and UOM level. Thereafter the percentage contribution to sales and profit was calculated for these items.
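A minimal pandas sketch of the actual versus target SOB comparison at this grain; the purchases and targets frames and their columns are assumptions.

    # purchases: assumed to hold one row per vendor per generic/strength/UOM
    # combination, with a purchase value column; targets holds target_sob.
    keys = ["generic", "strength", "uom"]
    purchases["actual_sob"] = (purchases["value"]
                               / purchases.groupby(keys)["value"].transform("sum"))
    sob = purchases.merge(targets, on=keys + ["vendor"], how="left")
    sob["gap"] = sob["actual_sob"] - sob["target_sob"]
    overshoot = sob.sort_values("gap", ascending=False)   # largest overshoots first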


The ETL team also carried out yearly and monthly trend analyses.


The ETL team then compared the actual and target SOB on a yearly basis and looked at the average procurement price by generic, strength and UOM.
Based on the same, the ETL team looked at the vendor-wise share and the payables owed to the vendors in terms of an ageing analysis (snapshot attached below):
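The snapshot itself is an image; the vendor-wise ageing behind it can be bucketed along these lines (the invoices frame and the reporting date are assumptions):

    import pandas as pd

    as_of = pd.Timestamp("2018-01-31")             # reporting date, assumed
    days = (as_of - invoices["due_date"]).dt.days  # days past due per invoice
    invoices["bucket"] = pd.cut(days, bins=[-10**6, 0, 30, 60, 90, 10**6],
                                labels=["not due", "0-30", "31-60", "61-90", "90+"])
    ageing = invoices.pivot_table(index="vendor", columns="bucket",
                                  values="amount", aggfunc="sum", fill_value=0)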




Decision sheet: the final decision sheet had the following set of dashboards and graphs:
·         Consumption (year to date) by patient category
·         Consumption (year to date) by doctor
·         Consumption (year to date) by package
·         Inventory ageing and the number of days of inventory for the targeted stock items
Based on the above decision criteria, the ETL team looked at the actual SOB versus the target SOB.
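The year-to-date consumption cuts behind these dashboards reduce to simple group-bys; a sketch with an assumed consumption frame (date, patient_category, doctor, package, value):

    import pandas as pd

    ytd = consumption[consumption["date"] >= pd.Timestamp("2017-04-01")]  # assumed FY start
    by_patient_category = ytd.groupby("patient_category")["value"].sum()
    by_doctor = ytd.groupby("doctor")["value"].sum().nlargest(3)  # top 3 doctors emailed
    by_package = ytd.groupby("package")["value"].sum()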

Action step:
The top 3(three) doctors were sent an email about the excess consumption.


Result for the end client: immediate intimation of excess stock, broken down by doctor and by package.

Process pre requisites:
Data consolidation: across the HIS and SAP systems
Data cleansing: specifically from the HIS system
Data auditing and data profiling
Business Intelligence: the Microsoft platform was used for the same
Data extraction/data capture: from multiple source systems
Further details on inventory optimization can be obtained from the given link.

Contact:
Apoorv Chaturvedi
Phone: +91-8802466356
Website: www.mnnbi.com



Wednesday, 24 January 2018

Tally Data Consolidation, Tally Data Cleansing project from West India

Deployment of Turbodata for a retail company based in Western India


Source system: multiple installations of Tally ERP 9.1.
Problem: the end client desired a custom installation of Turbodata based on the unique requirements of its business. The product shall be used for designing a custom web interface for customer interaction. The key tenets of the solution that differed from the standard deployment of Turbodata were as follows:
·         A standard price list instead of a weighted average or FIFO price list.
·         The closing stock value was to be calculated at the godown and item level.
·         The solution was to work across multiple locations seamlessly, with maximum RAM usage capped at 1 GB for both initial and incremental data loads.
·         Custom master extraction for item, stock group and category.
·         The GST classification was to be extracted for the end client.
Time duration of the project: 2 weeks.
Approach of the ETL team:
·         Choosing the appropriate set of attributes to be loaded, following the modular approach: only the required fields for ledger and inventory were loaded.
·         Custom extraction for the tables: the normalization process helped here, since each attribute is loaded only once.
·         Testing of the initial and incremental data loads in terms of time and load on the system. The incremental data load process helped reduce the data load time.
·         Data cleansing: special characters were removed from the item names, and numeric values were separated from the character fields (see the sketch after this list).
·         Data consolidation: multiple voucher types were loaded onto the data warehouse.
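A short sketch of the two cleansing steps on the item names; the regular expressions are illustrative, not the production rules.

    import re

    def clean_item_name(name):
        # Strip special characters, keeping letters, digits and spaces.
        return re.sub(r"[^A-Za-z0-9 ]+", " ", name).strip()

    def split_numeric(field):
        # Separate the numeric value (e.g. a gauge or size) from the text part.
        match = re.search(r"\d+(?:\.\d+)?", field)
        text = re.sub(r"\d+(?:\.\d+)?", "", field).strip()
        return text, (float(match.group()) if match else None)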

The project has been completed successfully. Hereafter the end client shall build an MVC interface over the data warehouse for reporting and customer interaction purposes.


