Monday, 3 June 2024

Developing an optimization module for inventory

This blog describes an optimization module that helps end clients clear slow-moving stock at healthy margins. The Turbodata solution sells the slow-moving stock items through secondary sales channels such as IndiaMART and TradeIndia.



Contact details of blog writer:

Name: Apoorv Chaturvedi
Email: support@mndatasolutions.com;support@turbodatatool.com
Phone: +91-8802466356



Problems that we seek to resolve

·        Gauge the value of the closing stock using ABC analysis (a query sketch follows this list).
·        Find the slow-moving stock within category A.
·        Sell the items that are slow moving and have a high stock value.
·        Find buyers for the given stock with the following parameters:
o   Buyers with a good payment history get additional discounts.
o   Buyers with a poor payment history are blocked or do not get high discounts.
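As a minimal sketch of the ABC classification step, assuming a hypothetical stock_summary table with item_name and closing_stock_value columns (illustrative names, not the actual Turbodata schema), the categorization can be expressed in plain SQL:

    -- Classify items into A/B/C by their share of total closing stock value:
    -- A = top ~70% of value, B = next ~20%, C = the rest (illustrative cutoffs).
    WITH ranked AS (
        SELECT item_name,
               closing_stock_value,
               1.0 * SUM(closing_stock_value) OVER (ORDER BY closing_stock_value DESC)
                   / SUM(closing_stock_value) OVER () AS cumulative_share
        FROM stock_summary
    )
    SELECT item_name,
           closing_stock_value,
           CASE WHEN cumulative_share <= 0.70 THEN 'A'
                WHEN cumulative_share <= 0.90 THEN 'B'
                ELSE 'C'
           END AS abc_class
    FROM ranked
    ORDER BY closing_stock_value DESC;

The slow movers within category A can then be found by joining this classification to the last sale date per item.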


Solution

Attached is the information flow for solving the problem:



Data auditing:


A number of times the end client types wrong data into the source ERP system, resulting in wrong outputs. Junk inputs imply junk outputs. The ETL team recommends using an auditable output from Turbodata for reporting purposes. Wrong data inputs can impact the end client in one or more of the following ways:
  •        Wrong tax filings, specifically in online scenarios.
  •        A wrong picture of the business.
  •        Wrong predictive analytics.
As per the Toyota Production System, bad inputs should not be processed further, as doing so adds to the final costs.
The ETL team (my firm) has found the following errors with regards to data entry inputs, specifically with Tally ERP 9.0:

·         Stock inward entries were made in one godown, but the stock outward movement was recorded from other godowns:





·         Missing purchase or sales order entries, resulting in negative stocks at given points in time. One cannot have negative stock balances at any point in time.



Other data input errors that we have commonly seen are as follows:

  •        Duplicate payment entries
  •        Duplicate sales entries
  •        Receipt note entries with no corresponding purchase invoice entries
  •        Payments missing the required bill reference numbers.
How to resolve the errors:
·         In an object-oriented program it is difficult to catch these errors on a real-time basis. The ETL team recommends using relational databases to catch them; the real-time extraction module of Turbodata should be used for the same.
·         Transferring the data into a third-normal-form database is recommended. This helps catch duplicate data based on composite keys.
For example, if an end client has made a payment of the same amount against a given voucher on a given fiscal date twice, the entry should appear in the discrepancy report. It is possible that the end client is correct; it is also possible that the two entries were made by two different users. Further handling of the situation is as follows:
·         If the end client desires to catch this error, the username under which the data entries were made should not be added to the composite key. In such a scenario there will be a discrepancy between the Turbodata ledger balance output and the Tally report, and the end client should approve the discrepant entry before the data is taken into the system for auditing purposes.
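A minimal sketch of the composite-key duplicate check described above, assuming a hypothetical payments table (all names illustrative):

    -- Candidate duplicate payments: same party, voucher, amount and fiscal date.
    -- The username column is deliberately left out of the grouping key, so the
    -- same payment keyed in by two different users still surfaces here.
    SELECT party_ledger_name,
           voucher_number,
           payment_amount,
           fiscal_date,
           COUNT(*) AS entry_count
    FROM payments
    GROUP BY party_ledger_name, voucher_number, payment_amount, fiscal_date
    HAVING COUNT(*) > 1;  -- rows to be approved or rejected by the end client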
Use perpetual valuations for ledger and inventory instead of periodic valuations. If an end client relies on periodic valuations for ledger balances, a duplicate payment entry is difficult to catch in the period-end totals: a duplicate entry of Rs. 100k (one hundred thousand) is easily lost inside a month-end balance of, say, Rs. 15,000k (fifteen million). Under the perpetual system, however, each transaction updates the running balance, and such data entry errors are easy to catch.
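The perpetual approach amounts to keeping a running balance per ledger, which a window function expresses directly; a sketch assuming a hypothetical ledger_entries table:

    -- Perpetual ledger balance: every transaction updates the running balance,
    -- so a duplicate Rs. 100k entry is visible on the day it is keyed in,
    -- instead of hiding inside a Rs. 15,000k month-end total.
    SELECT ledger_name,
           fiscal_date,
           amount,
           SUM(amount) OVER (PARTITION BY ledger_name
                             ORDER BY fiscal_date, voucher_number) AS running_balance
    FROM ledger_entries
    ORDER BY ledger_name, fiscal_date;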

Match the consolidated trial balances and closing stock balances at the database level with the on-the-fly calculations at the software level.

A small story for the end user: as Yuval Noah Harari says in Sapiens, mankind is primarily driven by myths. Hence many managers are driven by the myth that the software or the consulting companies have the right audit numbers (while the managers themselves input junk numbers).
A small story from one of my favourite books (Raag Darbari by Srilal Shukla) best illustrates the point.
The protagonist Ranganath had gone from the city to visit a relative, an aunt's husband, in the village. During the village fair, it was suggested that the group go and see the village temple of the local goddess. At the temple, Ranganath found that the statue was not of a goddess but of a soldier (for a goddess he was looking for two lumps in front and two lumps at the back). The priest asked for donations for the goddess. Ranganath refused, saying that the statue was not of a goddess but of a man. A scuffle ensued between the villagers and Ranganath, who was eventually rescued by his cousin. On going out and meeting other people, the cousin remarked:
"My cousin has come from the city and is very well read. That is why he talks like a fool."
The author has always associated himself with Ranganath.


Data consolidation example:


Deployment of Turbodata for a retail company based out of Western India


Source system: multiple installations of Tally ERP 9.1.
Problem: the end client desired a custom installation of Turbodata based on the unique requirements of its business. The product is to be used for designing a custom web interface for customer interaction. The key tenets of the solution that differed from the standard deployment of Turbodata were as follows:
·         Standard price list instead of a weighted average or FIFO price list.
·         Closing stock value to be calculated at godown and item level.
·         The solution was to work seamlessly across multiple locations, with maximum RAM usage being 1 GB for both initial and incremental data loads.
·         Custom masters extraction for item, stock group and category.
·         GST classification to be extracted for the end client.
Time duration of the project: 2 weeks.
Approach of the ETL team:
·         Choosing the appropriate set of attributes to be loaded, based on the modular approach; that is, only the required fields for ledger and inventory were loaded.
·         Custom extraction for the tables: the process of normalization helped here, since each attribute is loaded only once.
·         Testing of initial and incremental data loads in terms of time and load on the system. The incremental data load process helped reduce the data load time.
·         Data cleansing: special characters were removed from the item names, and numeric values were separated from the character fields.
·         Data consolidation: multiple voucher types were loaded onto the data warehouse.

The project was completed successfully. The end client will now build an MVC interface over the data warehouse for reporting and customer interaction purposes.

Data consolidation for Trial Balance example:
Problem: the end client required consolidated ledger balances and balance sheet details across 36 companies. With the software the end client had, the process was taking a long time, and the system would hang during consolidation and generation of the required reports.

Methodology of the ETL team: the ETL team consolidated the ledger data from all 36 companies. To let the end client generate balance sheet/trial balance details for any fiscal date, the ETL team did the following:
·         Perpetual ledger balance details were stored by partyledgername and ledgername.
·         The associated cost centre details for each ledger were also stored, so that Profit and Loss statements could be generated by cost centre.
·         The ETL team was able to generate balance sheet and trial balance details across all the companies.
·         The end client got access to the balance sheet details across multiple companies.

The following system was used to match the trial balance details:
·         Data audit: the ETL team used the perpetual ledger balance details to arrive at the closing ledger balance on the given fiscal date. This closing balance was matched with the trial balance details from the source software. The system also handled cases where the opening ledger balance was non-zero.
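A sketch of the closing-balance query used in the match, assuming a hypothetical ledger_balance_history table holding the perpetual balances (names illustrative):

    -- Closing balance per ledger on a given fiscal date: pick the latest
    -- perpetual balance row at or before that date; the totals are then
    -- compared against the trial balance reported by the source software.
    SELECT ledger_name, party_ledger_name, running_balance AS closing_balance
    FROM (
        SELECT ledger_name,
               party_ledger_name,
               running_balance,
               ROW_NUMBER() OVER (PARTITION BY ledger_name, party_ledger_name
                                  ORDER BY fiscal_date DESC) AS rn
        FROM ledger_balance_history
        WHERE fiscal_date <= '2019-03-31'  -- fiscal date being audited (example)
    ) latest
    WHERE rn = 1;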

Final result:
·         The audit numbers of the resulting output matched the software output.
·         Report refresh times were cut by more than 90 (ninety) percent.
·         The system did not hang during the initial and incremental data loads or during report generation.
Other benefits to the end client:
·         Better visibility of cash flow: since the end client has the cash flow balances for each fiscal date, it can capture the variances in payments across all ledgers. This helps the end client plan its cash flows better.
For the process of data consolidation, the following activities were performed:
·         Data cleansing
·         Data consolidation
·         Report generation using a C#/.NET interface.

       Further ledger analysis was done as given in the following link:
       Ledger analysis link



      Dashboards for hypothesis testing:

Are you a customer facing the following issues:

·         A large value of slow-moving inventory
·         Issues with cash flow cycles
·         No clarity regarding product profitability


Our product Turbodata can help your firm resolve the above issues. The product is inspired by the philosophy of The Goal by Eliyahu Goldratt and Profit Beyond Measure by Thomas Johnson and Anders Bröms (please see Appendix 1 for a summary of the philosophies).

Both philosophies imply that the end client should use orderline profitability instead of periodic calculations. Only then does the end client get complete visibility into its operations and into profitability by customer, region, etc.

What is required for determining orderline profitability?
The end client needs inventory valuations using the perpetual method instead of the periodic method.
As a case in point, consider the following:




In the attached scenario for an item, the weighted average/FIFO valuation has been done on a periodic basis. Hence the end client loses the orderline profitability details.

However, in the snapshot below using Turbodata, the weighted average calculations are done on a daily basis (as in the attached snapshot).

 

This enables the end client to calculate orderline profitability.
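A simplified sketch of the daily weighted-average calculation, assuming a hypothetical stock_movements table with signed quantities (positive inward, negative outward); a full moving-average cost also nets out issues at the prevailing rate, which needs a procedural pass and is omitted here:

    -- Daily weighted average cost: cumulative inward value divided by
    -- cumulative inward quantity up to each fiscal date.
    SELECT item_name,
           fiscal_date,
           SUM(CASE WHEN quantity > 0 THEN quantity * rate ELSE 0 END)
               OVER (PARTITION BY item_name ORDER BY fiscal_date)
           / NULLIF(SUM(CASE WHEN quantity > 0 THEN quantity ELSE 0 END)
               OVER (PARTITION BY item_name ORDER BY fiscal_date), 0)
               AS daily_weighted_avg_cost
    FROM stock_movements;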

Issues with calculating orderline profitability:
v  In some software, negative stock is allowed, which can distort the orderline profitability calculations. The sample below gives the first instance of negative stock for an item (a query sketch follows this list).
Sample attached below:






v  Physical stock entries valued at 0 (zero) can create discrepancies in the stock valuations.
v  Data consolidation from multiple systems could be required.
v  Data transformation implementing the end client's business logic needs to be done so that the required calculations come into force.
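As referenced in the first point above, a sketch of the query that finds the first instance of negative stock, assuming the same hypothetical stock_movements table:

    -- First date on which an item's running stock balance turned negative,
    -- at godown level (drop godown_name from the PARTITION BY and GROUP BY
    -- for the consolidated level).
    WITH running AS (
        SELECT item_name,
               godown_name,
               fiscal_date,
               SUM(quantity) OVER (PARTITION BY item_name, godown_name
                                   ORDER BY fiscal_date, voucher_number) AS stock_balance
        FROM stock_movements
    )
    SELECT item_name,
           godown_name,
           MIN(fiscal_date) AS first_negative_date
    FROM running
    WHERE stock_balance < 0
    GROUP BY item_name, godown_name;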

By using Turbodata, end clients shall be able to achieve the following:
v  Move towards orderline profitability by getting an estimate of cost of goods sold based on perpetual FIFO and weighted average calculations.
v  Achieve the following activities:
o   Data cleansing: clean the master data before reporting is done.
o   Data profiling: find the first instance when the closing stock of an item turned negative, at godown or consolidated level.
o   Data analytics: consolidated dashboards along with predictive analytics facilities at economical costs.
v  Better management of inventories: by finding the profitability of item sales at the orderline level for a given set of customers.
v  Prepare the data for predictive analytics and forecasting through data compression and SQL reduction. Predictive analytics and forecasting are required to capture variations from the standard sales values; a significant variation should be caught early so that the end client can take corrective action quickly.

Interested in moving towards orderline profitability?
v  Deployment of the Turbodata solution (for testing sample data): USD 3,000 (USD three thousand only) plus taxes as applicable. Contact us for a sample demo.
v  Buy our standard book based on Turbodata project experiences: USD 5 (five).
Please contact the following for a demo:
Name: Apoorv Chaturvedi
Website: www.mnnbi.com


Appendix 1

What do the above management philosophies say?
The Goal:
The Goal is built around the theory of constraints, which identifies 3 parameters critical for any firm:
v  Throughput: the rate at which the system generates money through sales (our definition: cash sales).
v  Inventory: the material the system purchases in order to convert into final product and generate throughput.
v  Operating expense (largely labour): the money and manpower the system spends converting inventory into throughput.
Jonah, the mentor in 'The Goal', also insisted that standard deviations and variations be part of the process: variations should be detected on a near real-time basis so that errors are caught early.

Profit Beyond Measure:
Profit Beyond Measure is inspired by the Toyota Production System. It emphasizes that a manufacturing company should function like a human body. Functional managers should account for self-sustainability (standard cycle times), diversity and interdependence (manufacturing managers need to look at the whole system, like a human body, and not just a single component).
The book argues for reducing inventory by reducing changeover times at each workstation, so that manufacturing starts only once a customer order has entered the system. It further looks at 'design to order': designing multiple configurable modules so as to offer end clients many types of products.
The system emphasizes catching production errors quickly so that material wastage is reduced.

Sample example of inventory optimization: Inventory optimization of large trading company


Predictive analytics for sales forecasting

How Turbodata helped lower the cost of developing a data warehouse and helped end clients do predictive analytics quickly and easily, applicable to retail sales and inventory (Website: www.mnnbi.com)

Purpose of the product: the Turbodata team intends to reduce the cost of analytics solutions by creating a single platform for ETL, reporting, report development and predictive analytics. The team also intends to provide best-in-class analytics on the same machine on which the ERP is running, or with minimal additional hardware for the end client. This has been done to develop scalable systems that can be deployed over a large number of customers (with limited budgets) with ease (deployment, delivery and usage) and convenience (maintenance).
The end goal is to increase derisking and predictability for the end clients at lower costs.




Methodology for achieving the required ends for the end client:
·         Turbodata adopted the Inmon methodology for data warehouse development, so that multiple data sources could be added onto the same data warehouse; that is, the change from one data source to another was done with ease. More details on the attached web page link: http://mnnbi.com/mnnbi_tallydataconsolidation.html


o   The benefits of the normalization of data were as follows:
§  The incremental data load took minimal time and had minimal impact on the source system. The ETL team was able to commit the incremental data load to a maximum of 2 GB RAM across multiple source systems, and the source systems did not hang while the incremental data load was running (a sketch of this pattern follows below).
§  Error handling was done with ease at the staging layer.
§  Massive data compression took place due to reduced data redundancies.
§  The business logic was coded between the staging and ODS layers, thereby reducing the length of the final SQL code.
The attached video gives a more detailed description of the benefits listed above.
The joins were reduced in the data mart layer (over which the reporting layer was built).
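A minimal sketch of the incremental load pattern, written in T-SQL; the watermark column is modeled on Tally's AlterID counter, but all table and column names here are illustrative:

    -- Incremental load: pull only rows changed since the last successful load,
    -- then upsert them into staging. Keeping the changed set small is what
    -- bounds RAM usage and keeps the source system responsive.
    DECLARE @last_loaded BIGINT =
        (SELECT ISNULL(MAX(source_alter_id), 0) FROM staging_vouchers);

    MERGE staging_vouchers AS target
    USING (SELECT voucher_id, voucher_date, amount, alter_id
           FROM source_vouchers
           WHERE alter_id > @last_loaded) AS src
        ON target.voucher_id = src.voucher_id
    WHEN MATCHED THEN
        UPDATE SET voucher_date    = src.voucher_date,
                   amount          = src.amount,
                   source_alter_id = src.alter_id
    WHEN NOT MATCHED THEN
        INSERT (voucher_id, voucher_date, amount, source_alter_id)
        VALUES (src.voucher_id, src.voucher_date, src.amount, src.alter_id);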

The ETL team was able to develop extremely complex reports using the data warehouse, as in the attached sample:

Due to the data compression, for most projects the ETL team is able to bring the data within 1 GB. Hence the desktop version of Microsoft Power BI can be used free of cost by the end client.

Reducing the cost of predictive analytics solutions
Most end clients use high-end predictive tools over the data warehouse or over direct data extracts from various source databases. With large datasets, predictive analytics using in-memory solutions entails heavy RAM usage. The ETL team has worked around this issue in the following manner:
o   A seamless environment was created for ETL, reporting and thereafter predictive analytics on SQL/C# and .NET, for the following reasons:
§  Maintenance becomes easier, since there is a single platform for all aspects.
§  The cost comes down, since the resources used for ETL can also be used for predictive analytics.
§  Error handling becomes easy, since errors can be captured earlier in the pipeline, before the data reaches the predictive analytics stage.


Hypothesis testing
Based on the hypothesis testing, the ETL team developed ARIMA analysis and Market Basket analysis in SQL using a seamlessly integrated set of stored procedures; that is, the ARIMA analysis flowed from the data warehouse's A/B/C categorization. The ETL team thus reduced the need for high-end R and Python developers to code over the data warehouse, presenting a seamless solution to the end client on an 8 GB RAM machine.
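The market-basket step, in its simplest form, is a pair co-occurrence count that plain SQL handles well; a sketch assuming a hypothetical sales_lines table (voucher_number, item_name):

    -- Count how often two items appear on the same sales voucher (pair support).
    SELECT a.item_name AS item_a,
           b.item_name AS item_b,
           COUNT(*)    AS vouchers_together
    FROM sales_lines a
    JOIN sales_lines b
      ON a.voucher_number = b.voucher_number
     AND a.item_name < b.item_name       -- count each unordered pair once
    GROUP BY a.item_name, b.item_name
    HAVING COUNT(*) >= 10                -- minimum support threshold (illustrative)
    ORDER BY vouchers_together DESC;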

Benefits to the end client:
·         Immediate deployment of the predictive and forecasting analytics modules.
·         No additional hardware/software requirements.
·         The costs are far lower for the end client.
·         Large-scale deployment is possible with the given set of solutions.
Please check the attached video for the same:
A more detailed video is attached herewith:


Example of predictive analytics with Turbodata: Example of predictive analytics-Turbodata

Understanding the buyer profile in detail

Problem statement: a number of firms use periodic statements for ledger analysis (monthly, quarterly and yearly). In doing so, these firms lose the day-by-day, transaction-by-transaction history of ledger balances. This information is required for ageing analysis and in-depth accounts receivable analysis per ledger. As an example, consider the following:


The above snapshot indicates the ledger balance on any fiscal date by partyledgername and ledgername (the group name is a roll-up). From the daily ledger balances, the end client should be able to derive the trial balance, the balance sheet and even profit and loss statements.
As an example, consider the following snapshot:


By keeping the ledger balance history, the end client is able to find the cash balance as of any given fiscal date, and thereafter to report the cash balances on a monthly, quarterly or yearly basis, as given below:
Monthly report:

Daily Report:


Prerequisites for achieving the same:
The historical ledger balances need to be calculated, and the closing ledger balance on the current fiscal date needs to match the balance on the last day of the ledger balance history table, as given below:



Alternatively, the debit and credit balances need to be matched, as given below:


The process resembles the append-only ledger reconciliation used with Bitcoin.
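A sketch of the debit/credit matching check, assuming the same hypothetical ledger_balance_history table with separate debit and credit columns:

    -- Audit check: total debits must equal total credits on every fiscal date.
    -- Any row returned here is a discrepancy to be investigated.
    SELECT fiscal_date,
           SUM(debit_amount)  AS total_debits,
           SUM(credit_amount) AS total_credits
    FROM ledger_balance_history
    GROUP BY fiscal_date
    HAVING SUM(debit_amount) <> SUM(credit_amount);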

For achieving the same, the end client needs to do the following:
The ETL team would also be able to offer Business Intelligence and predictive analytics services along with ledger analytics.

Ledger analytics is also related to GST filing.

Further case studies on data consolidation can be seen at the following link.

Construction Analytics using Turbodata

 Contact details of blog writer:

Name: Apoorv Chaturvedi
Email: support@mndatasolutions.com;support@turbodatatool.com
Phone: +91-8802466356
Website: https://mn-business-intelligence-india.business.site/


Problems to be solved for the end client:

  • Automated tax filings (Indian and Middle Eastern end clients)
  • Purchase optimization: for builders, how to optimize the purchase budget while meeting delivery times
  • Which projects to invest in, in case of cash constraints/cash shortage
  • Estimating the impact of delays on project execution times and total project costs
  • Capturing data entry errors (extremely critical) on a real-time basis
Requirements of the end client: immediate and quick results, with deliveries on a week-by-week basis, reducing the end client's risks in terms of project costs and implementation.

Solution:


For construction analytics, the ETL team has devised a solution for optimizing construction lending costs. The solution is meant for customers in the construction financing and construction project management businesses. The product incorporates step-by-step data audit, inventory optimization and cash flow optimization services, followed by an intensive construction analytics module.




  1. Data Audit


The data audit step follows the 'Data auditing' approach described earlier in this post: catching godown mismatches, negative stocks, duplicate payment and sales entries, and missing bill references through composite keys on a normalized database and perpetual valuations.




2. Purchase Optimization

The purchase optimization step applies the orderline profitability approach described earlier in this post (see 'Dashboards for hypothesis testing'): perpetual inventory valuations, detection of negative stock, data consolidation and data transformation, leading to orderline-level profitability and better management of inventories.




(Appendix 1, summarizing 'The Goal' and 'Profit Beyond Measure', appears earlier in this post.)



3.) Ledger optimization


The ledger optimization step follows the approach described earlier in this post (see 'Understanding the buyer profile in detail'): perpetual ledger balance history by partyledgername and ledgername, daily and monthly cash balance reports, and matching of debit and credit balances for audit purposes.



4.) Construction analytics (analyzing risks and capital rationing)
How to use 'The Goal' philosophy to optimize resources for construction lending
This note is based on the philosophy of 'The Goal' by Eliyahu Goldratt, which concentrates on the constraints firms face in terms of resources.
Effect of delay between milestone achievements: the variance between the actual and the target dates for achieving milestones results in additional costs, which in turn increase cash outflows for the end client.
Hence it becomes important to predict the impact of variances between the actual and the target delivery dates, and to give management multiple options when delays occur in achieving milestones.
Predicting the costs of delays:
The following is a step-by-step guide for resolving the above issues:
·         Track the target versus actual milestones for the construction projects. The target dates need to be fixed, and the variances calculated from the actual versus target achievement dates. If software is not available for this data capture, it can be developed.




·         The costs associated with the delays need to be tracked using dashboards. Data connectors are available with the ETL team for data consolidation, data transformation, data cleansing and data auditing.


·         Based on the cost of delays, hypotheses shall be developed for optimization purposes.


·         Best case and worst case scenario: based on the variances in project implementation, the best case and the worst case scenario for each project stage shall be developed, and the most probable cost arrived at from the same (see the sketch below).
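One common way to turn the best and worst case figures into a most probable cost is the three-point (PERT) estimate; a sketch assuming a hypothetical project_stage_costs table:

    -- Three-point estimate per project stage:
    -- expected cost = (best + 4 * most likely + worst) / 6.
    SELECT project_name,
           stage_name,
           (best_case_cost + 4 * most_likely_cost + worst_case_cost) / 6.0
               AS expected_cost
    FROM project_stage_costs;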






·         Allocation of resources based on maximizing the NPV of the project: based on the expected variance in the cost of the project, the expected NPV is calculated, and the projects having the maximum NPV are funded first (see the sketch below).
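A sketch of the NPV ranking, assuming a hypothetical project_cash_flows table of period-wise cash flows and an illustrative discount rate:

    -- Expected NPV per project, discounting monthly cash flows at rate @r;
    -- projects with the highest NPV are funded first.
    DECLARE @r FLOAT = 0.01;  -- illustrative monthly discount rate

    SELECT project_name,
           SUM(cash_flow / POWER(1 + @r, period_number)) AS npv
    FROM project_cash_flows
    GROUP BY project_name
    ORDER BY npv DESC;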





·         Crashing of activities to meet project deadlines: in case of delay against a set delivery date, route optimization techniques are used to meet the required deadlines at minimal cost.




·         Optimization of purchase costs based on the delivery dates and delivery requirements of the end client.

    Other activities: purchase optimization through Turbodata
     Ledger optimization



5.) Data preparation (reducing query execution times)

Query reduction implementation example: large hospital chain in Gurgaon

Problem: the hospital chain's nightly batch process was taking so long that the SLA (performance parameter) of the firm's CTO was not being met. The end client's manager pointed the ETL team to the bottleneck SQL.
  • Methodology: the end client had developed the code using cursor logic. The ETL team redeveloped it using set-based logic in order to optimize RAM usage. This process has the following benefits for the end client:
    ·         Optimum usage of RAM
    ·         Lower implementation times
    ·         Error logging and error handling
    ·         Incremental data loads after the initial data loads
    ·         Audit of the transformation process for the end client

    Methodology:
    The ETL team adopted the Inmon methodology for resolving the same. The cursor logic was reverse-engineered into a set-based system. The methodologies adopted were:
    ·         Data normalization: error logging, data audit, incremental data load and optimum usage of RAM.
    ·         Data transformation: converting the cursor logic to set-based logic.
    ·         Data cleansing: derived from the cursor logic.

    Final result:
    ·         The output of the set-based system matched the cursor logic output.
    ·         The execution time was reduced by more than 80 (eighty) percent.
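A schematic illustration of the rewrite (not the client's actual code): a row-by-row cursor update replaced by one set-based statement over a hypothetical charges table:

    -- Before (schematic): a cursor fetches each row and runs one UPDATE per id,
    -- which is slow and RAM-heavy for large tables.
    -- After: the same transformation as a single set-based statement, which the
    -- engine optimizes as one operation instead of one round trip per row.
    UPDATE charges
    SET tax = amount * 0.18   -- illustrative business rule
    WHERE tax IS NULL;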


6.) Automated GST Filing Services from Turbodata



Are you facing the following issues with regard to GST filing?
  •         Delays in filing
  •         Concern regarding the changing regulations from the government
  •         Concern regarding reconciliation, especially for customers using MS Excel upload
  •         A manual process for GSTR filing, which is prone to error
  •         High manpower costs related to GST filing

Turbodata shall help your firm with faster, easier and more convenient GST filing.
How is Turbodata different?
  •         All the reports for the end client are developed on the cloud installation. Only a minimal extract of the vouchers and masters is taken from the end client location, and the ETL team commits to a maximum amount of RAM for it (say 1 GB for the incremental data extract).
  •         The end client can change prior-period data; the system automatically takes care of it through the incremental data load process using data normalization.
  •         No reports are developed at the client location; all the reporting work is done at the server location.
  •         Initial and incremental transaction data extracts are taken from the end client location.
  •         The end client need not worry about re-filing the GST reports, since the GSP partner does it automatically.
  •         The package is very easy to deploy, deliver and maintain. No high-end software is required, and the system can extract data from SAP, Tally and other source systems with ease.
  •         Dependence on MS Excel for tax filing is removed, since Excel uploads can result in data errors and discrepancies.
  Current system:




Why is the Turbodata system better?

Turbodata system:

·         The Turbodata system is inspired by 'The Deming Way', 'The Goal', the Toyota Production System and the Inmon methodology. In a nutshell, the following features are adopted from those systems:
o   No error-prone data should be passed on for reporting purposes; the data is cleansed, audited and consolidated before report development.
o   A transaction should be processed as soon as it has been fed into the source system; that is, processing happens on a near real-time basis rather than only at the end of the month. Turbodata enables this in the following manner:
§  Each transaction fed into the end client's source system is treated as an order from the end client.
§  The system offers a facility for real-time extract and upload (the current system is manual, but the data can be loaded to the server on a daily basis by the end client).
o   Once the data has been loaded onto the server, it is transferred to a normalized database (inserts, updates and deletes). At the data warehouse level, the data cleansing, data transformation and data consolidation activities are done.
o   Once the data has been cleansed at the data warehouse level, the GST reports are developed. GSTR1, GSTR2 and GSTR3 reports can be developed in one single lot.
o   Turbodata is integrated with at least one GSP partner; the end client can look at other GSP partner solutions if desired.
o   The deployment of the solution is very easy and convenient: for any end client, deployment should take no more than 20 (twenty) minutes, with minimal installation prerequisites.
o   The end client's data is stored in a data warehouse, so the end client does not need to worry about changes in statutory requirements. Other high-end services, like inventory optimization and predictive analytics, are possible on the cloud.

To see why an end client should consider Turbodata GST, please check the following link:
http://mndatasolutionsindia.blogspot.in/2018/02/why-turbodata-gst.html




To get started, indicate the following:
·         ERP system(s)
·         Turnover and frequency of load
·         Number of locations

  Sample video link: https://www.youtube.com/watch?v=sYbeBfc3ozo&feature=youtu.be

The product uses optimum RAM so that the source system does not hang during extraction, as shown in the following video:
       https://youtu.be/7CULkzc5h2g



FOR FREE MICROSOFT POWER BI DASHBOARDS, please do one of the following: email apoorv@mnnbi.com, or go to the landing page: http://user1333631.sites.myregisteredsite.com/id62.html
