Wednesday, 31 January 2018

Inventory optimization for large hospital

Problem: the end client, a large hospital, wanted to optimize its inventory. It was carrying very high closing stock levels and wanted to bring them down using the Turbodata services.

Approach: the Turbodata team worked under a top-four consulting firm. The end client had been given target Share of Business (SOB) figures at the generic, strength and UOM level, and the actual share of business was compared against the target share of business.
The ETL team first identified category ‘A’ items by closing stock value at the generic, strength and UOM level. The percentage contribution of these items to sales and profit was then calculated.


The ETL team also performed yearly and monthly trend analyses.


The ETL team then compared actual and target SOB on a yearly basis and looked at the average procurement price by generic, strength and UOM.
Based on this, the ETL team examined the vendor-wise share and the payables owed to vendors through an ageing analysis (snapshot attached below):
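The actual-versus-target SOB comparison described above can be sketched in a few lines. This is a minimal illustration with hypothetical items, vendors and quantities, not the client's actual data or calculation code:

```python
from collections import defaultdict

# Hypothetical procurement records: (generic, strength, uom, vendor, quantity)
purchases = [
    ("paracetamol", "500mg", "tablet", "VendorA", 600),
    ("paracetamol", "500mg", "tablet", "VendorB", 400),
]

# Hypothetical target share of business per vendor at the generic/strength/UOM level
targets = {("paracetamol", "500mg", "tablet"): {"VendorA": 0.70, "VendorB": 0.30}}

def sob_variance(purchases, targets):
    """Return (actual SOB, target SOB, variance) per item/vendor combination."""
    totals = defaultdict(float)      # total quantity per item
    by_vendor = defaultdict(float)   # quantity per item/vendor
    for g, s, u, vendor, qty in purchases:
        totals[(g, s, u)] += qty
        by_vendor[(g, s, u, vendor)] += qty
    report = {}
    for (g, s, u, vendor), qty in by_vendor.items():
        actual = qty / totals[(g, s, u)]
        target = targets.get((g, s, u), {}).get(vendor, 0.0)
        report[(g, s, u, vendor)] = (actual, target, actual - target)
    return report
```

A negative variance flags vendors receiving less business than targeted; the same grouping keys (generic, strength, UOM) carry through to the ageing and payables views.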




Decision sheet: the final decision sheet had the following set of dashboards and graphs:
·         Consumption (year to date) by patient category
·         Consumption (year to date) by doctor
·         Consumption (year to date) by package
·         Inventory ageing and number of days of inventory for the targeted stock items
Based on the above decision criteria, the ETL team compared actual SOB against target SOB.

Action step:
The top three doctors were sent an email about the excess consumption.


Result for the end client: immediate intimation of the excess stock by doctor and package.

Process prerequisites:
Data consolidation: HIS and SAP systems
Data cleansing: specifically from the HIS system
Data auditing and data profiling
Business Intelligence: the Microsoft platform was used
Data extraction/data capture: from multiple source systems
Further details on inventory optimization can be obtained from the given link.

Contact:
Apoorv Chaturvedi
Phone: +91-8802466356
Website: www.mnnbi.com



Wednesday, 24 January 2018

Query reduction implementation example: large hospital chain in Gurgaon

Problem: the hospital chain's nightly process was taking so long that the SLA (performance parameter) of the firm's CTO was not being met. The end client's manager pointed the ETL team to the bottleneck SQL.
Methodology: the end client had developed the code using cursor logic. The ETL team rewrote it using set-based logic in order to optimize the usage of RAM. The new process has the following benefits for the end client:
·         Optimum usage of RAM
·         Lower implementation times
·         Error logging and error handling
·         Incremental data loads after the initial data loads.
·         Audit of the transformation process for the end client.

Approach:
The ETL team adopted the Inmon methodology to resolve the issue. The cursor logic was reverse-engineered into a set-based system. The following methods were adopted by the ETL team:
·         Data normalization: error logging, data audit, incremental data load and optimum usage of RAM.
·         Data transformation: converting the cursor logic to set based logic
·         Data cleansing: obtained from the cursor logic
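The cursor-to-set-based conversion can be illustrated with a small in-memory sketch. The table, column names and tax rule below are hypothetical, not the client's actual SQL; the point is the contrast between row-at-a-time processing and a single set-based statement:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL, tax REAL)")
cur.executemany("INSERT INTO sales (amount, tax) VALUES (?, NULL)",
                [(100.0,), (250.0,), (40.0,)])

# Cursor-style logic: fetch and update one row at a time
# (many round trips; intermediate state held row by row)
for row_id, amount in cur.execute("SELECT id, amount FROM sales").fetchall():
    conn.execute("UPDATE sales SET tax = ? WHERE id = ?", (amount * 0.18, row_id))

# Set-based equivalent: one statement updates every row in a single pass,
# letting the engine optimize memory use and execution
conn.execute("UPDATE sales SET tax = amount * 0.18")
conn.commit()
```

Both paths produce identical output, which mirrors the matching-output check in the final result below; the set-based form is what delivers the execution-time reduction.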

Final result:
·         The output of the set-based system matched the cursor-logic output.
·         The execution time was reduced by more than eighty percent.

Suggested next steps for the prospect:
·         Send the bottleneck sql to the ETL team.

Posted by:
Ritu Lakhani
website: www.mnnbi.com


Phone: 0124-4365845.

Tally Data Consolidation, Tally Data Cleansing project from West India

Deployment of Turbodata for Retail company based out of Western India


Source system: multiple installations of Tally ERP 9.1.
Problem: the end client desired a custom installation of Turbodata based on the unique requirements of its business. The product was to be used for designing a custom web interface for customer interaction. The key tenets of the solution that differed from the standard deployment of Turbodata were as follows:
·         A standard price list instead of a weighted-average or FIFO price list.
·         Closing stock value was to be calculated at the godown and item level.
·         The solution had to work seamlessly across multiple locations, with maximum RAM usage being 1 GB for both initial and incremental data loads.
·         Custom master extraction for item, stock group and category.
·         GST classification to be extracted for the end client.
Time duration of the project: 2 weeks.
Approach of the ETL team:
·         Choosing the appropriate set of attributes to load based on the modular approach; that is, only the required fields for ledger and inventory were chosen.
·         Custom extraction for the tables: the normalization process helped here, since each attribute is loaded only once.
·         Testing of initial and incremental data loads in terms of time and load on the system. The incremental data load process helped reduce the data load time.
·         Data cleansing: special characters were removed from the item names, and numeric values were separated from character fields.
·         Data consolidation: multiple voucher types were loaded into the data warehouse.
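The cleansing step above (stripping special characters and separating numeric values from character fields) can be sketched as follows. The item names and the exact character whitelist are illustrative assumptions, not the client's actual rules:

```python
import re

def clean_item_name(name):
    """Strip special characters, keeping letters, digits, dots and spaces."""
    cleaned = re.sub(r"[^A-Za-z0-9. ]+", " ", name)
    return re.sub(r"\s+", " ", cleaned).strip()

def split_numeric(field):
    """Separate a leading numeric part (e.g. a pack size) from the text part."""
    match = re.match(r"\s*([\d.]+)\s*(.*)", field)
    if match:
        return float(match.group(1)), match.group(2).strip()
    return None, field.strip()

# Hypothetical examples of the two cleansing rules
name = clean_item_name("Paracetamol#500@Tab")   # special characters removed
size, unit = split_numeric("500 mg")            # numeric split from character field
```

Running such rules once during the load, rather than in every report, is what keeps the downstream web interface simple.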

The project was completed successfully. The end client will now build an MVC interface over the data warehouse for reporting and customer interaction purposes.



Friday, 19 January 2018

Ledger analytics/Ledger analysis

Ledger Analytics

Problem statement: many firms use periodic statements (monthly, quarterly and yearly) for ledger analysis. In doing so, these firms lose the day-by-day, transaction-by-transaction history of ledger balances. This information is required for ageing analysis and for in-depth accounts receivable analysis per ledger. As an example, consider the following:


The above snapshot shows the ledger balance on any fiscal date by partyledgername and ledger name (the group name is a roll-up). From the daily ledger balances, the end client can derive the trial balance, the balance sheet and even profit and loss statements.
As an example, consider the following snapshot:


By keeping the ledger balance history, the end client is able to find the cash balance as of any given fiscal date, and can then capture cash balances on a monthly, quarterly or yearly basis as shown below:
Monthly report:

Daily Report:


Prerequisites for achieving this:
The historical ledger balances need to be calculated, and the closing ledger balance on the current fiscal date must match the balance on the last day of the ledger balance history table, as shown below:



Alternatively, the debit and credit balances need to be matched, as shown below:


The process replicates the ledger balancing associated with bitcoins.
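A day-by-day ledger balance history of this kind can be sketched as a running total over vouchers. The ledger names, dates and amounts below are hypothetical, and debits minus credits is used as a simplified balance convention:

```python
from collections import defaultdict
from datetime import date

# Hypothetical vouchers: (fiscal_date, ledger, debit, credit)
vouchers = [
    (date(2018, 1, 1), "Cash", 1000.0, 0.0),
    (date(2018, 1, 2), "Cash", 0.0, 300.0),
    (date(2018, 1, 3), "Cash", 200.0, 0.0),
]

def ledger_balance_history(vouchers):
    """Build a day-by-day running balance per ledger (debits minus credits)."""
    history = defaultdict(list)
    running = defaultdict(float)
    for d, ledger, debit, credit in sorted(vouchers):
        running[ledger] += debit - credit
        history[ledger].append((d, running[ledger]))
    return history

history = ledger_balance_history(vouchers)
# The current closing balance should equal the balance on the
# last day of the history table -- the matching check described above
closing = history["Cash"][-1][1]
```

Any row of `history` gives the cash balance as of that fiscal date, which is exactly what the daily and monthly reports above roll up from.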

The ETL team can also offer Business Intelligence and predictive analytics services along with ledger analytics.

Ledger analytics is also related to GST filing.

Further case studies on data consolidation can be seen at the following link.




Apoorv Chaturvedi

Blogspot: http://mndatasolutionsindia.blogspot.in/

Tuesday, 16 January 2018

Excel Consolidation-spreadsheet consolidation

Excel consolidation Problem
We have often seen the following:
·         MS Excel reports being developed and used, since developing complex reports at the database level is very cumbersome.
o   Large amounts of data must be processed to develop these complex reports, so the end client does not want to load the transaction system with them.
o   The reports are very complex.
o   The reports require data consolidation from multiple data sources.
o   The data sanctity of the reports is in question, since the data can be tampered with in MS Excel.
o   Developing the MS Excel reports takes a lot of time.
If the end client is facing these issues, Turbodata should be able to help with the following:
·         Data consolidation services: consolidate data from multiple data sources into a single location for developing consolidated reports.
·         Data cleansing services: cleanse the data of any bad entries.
·         Data transformation services: provide any business logic the product requires.
·         Business Intelligence services at the lowest cost: development of complex reports using a best-in-class Business Intelligence tool at the lowest cost (free if possible).
·         User security: each user sees only its own data.
·         Design of online forms is possible.
·         Data auditing and data profiling services are available.
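The consolidation idea above can be sketched with standard-library code, assuming the spreadsheets have been exported to CSV with a common column layout. The file contents, column names and the `source` tag are illustrative assumptions:

```python
import csv
import io

def consolidate(csv_texts):
    """Merge rows from several CSV exports (same header) into one list,
    tagging each row with its source for auditability."""
    rows = []
    for source, text in csv_texts.items():
        for row in csv.DictReader(io.StringIO(text)):
            row["source"] = source
            rows.append(row)
    return rows

# Hypothetical exports from two branch workbooks
branch_a = "item,qty\nParacetamol,10\n"
branch_b = "item,qty\nIbuprofen,5\n"
combined = consolidate({"branch_a": branch_a, "branch_b": branch_b})
```

Keeping the source tag on every row is one way to preserve data sanctity: a consolidated figure can always be traced back to the workbook it came from.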
Contact:
Apoorv Chaturvedi
Website: www.mnnbi.com
Phone: +91-8802466356


Monday, 15 January 2018

Reducing long query times through data compression and sql reduction

Nightly process completion at Afro Incorporation
Irfan, Production Manager, met Sohail (IT Head) in the meeting room.
Irfan: What happened, champ? You look worried.
Sohail: It is the nightly report process that begins at 12 am. It is not completing by 9 am. Suneet (IT Manager) is asking for a new server costing Rs. 25 lakhs. Currently my SLA (service level agreement) is not being met.
Irfan: What will happen with the new server?
Sohail: It will speed things up. But the management says there is no IT budget left for this year.
Irfan: Now what?
Sohail: I am stuck. How is it going for you?
Irfan: Very well, since the new Japanese consultant (Takashida) came in. He is removing the constraints in production, reducing inventory and increasing throughput. Why do you start the daily process at 12 am? Why not before?
Sohail: Because the complete data comes only by midnight.
Irfan: Say a transaction happened at 11 am in the morning. Then, as per the production team, it sits in stock for 13 hours because the other data has not come. Am I correct?
Sohail: Correct.
Irfan: Takashida would pre-process the data immediately and remove the constraint. He would then store it in a work-in-progress area so that the final fitment is quick.
Sohail: Basically, work for 24 hours instead of 9 hours.
Irfan: Correct. Remove the time constraint. But how would you store the work-in-progress output?
Sohail: A consulting team from M&N BI approached me, saying it could store the work-in-progress output in a separate database by pre-processing the business logic.
They gave me this link for data consolidation, data transformation, data cleansing and even Business Intelligence from their website. The company said that even advanced analytics and resolving complex inventory issues should be possible using its products.
What do you suggest?
Irfan: Let us see what Wasim (CEO) says.

An example of what their team has done for another firm is attached herewith:
http://mndatasolutionsindia.blogspot.in/2018/01/query-reductionimplementation-example.html


If you have the same problem as Sohail, contact the following:
By:
Name: Apoorv Chaturvedi
Website: www.mnnbi.com
Phone: +91-8802466356


Sunday, 14 January 2018

GST Filing Services-Turbodata

Automated GST Filing using Turbodata and GSP Partners

Are you facing the following issues with regards to GST filing?
·         Delays in filing
·         Concern regarding the changing regulations from the government
·         Concern regarding reconciliation, especially for customers using MS Excel uploads
·         A manual GSTR filing process, which is prone to error
·         High manpower costs related to GST filing

Turbodata can make your firm's GST filing faster, easier and more convenient.
How is Turbodata different?
·         All the reports for the end client are developed on the cloud installation. Only a minimal extract of all the vouchers and masters is taken from the end client location. The ETL team commits to a cap on RAM usage for this (say 1 GB for an incremental data extract).
·         The end client can change prior data, and the system automatically takes care of it. This is enabled through the incremental data load process using data normalization.
·         No reports are developed at the client location. All the reporting work is done at the server location.
·         Initial and incremental transaction data extracts are done from the end client location.
·         The end client need not worry about re-filing the GST reports, since the GSP partner does this automatically.
·         The package is very easy to deploy, deliver and maintain. No high-end software is required. The system can extract data from SAP, Tally and other source systems with ease.
·         Dependence on MS Excel for tax filing is removed, since Excel could result in data errors and discrepancies.
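The incremental-load behavior described above (a prior-period correction at the client flowing through automatically) can be sketched as an upsert keyed on the voucher id. The table, column names and voucher values are hypothetical, not the actual Turbodata schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vouchers (voucher_id TEXT PRIMARY KEY, amount REAL)")

def incremental_load(conn, batch):
    """Insert new vouchers and replace changed ones, keyed on voucher_id,
    so a corrected prior-period voucher overwrites its earlier version."""
    conn.executemany(
        "INSERT OR REPLACE INTO vouchers (voucher_id, amount) VALUES (?, ?)",
        batch,
    )
    conn.commit()

incremental_load(conn, [("V1", 100.0), ("V2", 200.0)])   # initial load
incremental_load(conn, [("V1", 120.0), ("V3", 50.0)])    # V1 corrected, V3 new
```

Because only changed or new vouchers travel in each batch, the extract from the client location stays small, which is what keeps the RAM cap realistic.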
  Current system:




Why is the Turbodata system better?

Turbodata system:


·         The Turbodata system is inspired by ‘The Deming Way’, ‘The Goal’, the Toyota Production System and the Inmon methodology. In a nutshell, Turbodata takes the following features from these systems:
o   No error-prone data should be passed on for reporting purposes. The data needs to be cleansed, audited and consolidated before report development.
o   A transaction should be processed as soon as it has been fed into the source system; that is, processing should take place on a real-time basis and not only at the end of the month. Turbodata enables this feature in the following manner:
§  Each transaction fed into the end client's source system is treated as an order from the end client.
§  The system offers the facility for real-time extract and upload (the current process is manual, but the end client can load the data to the server on a daily basis).
o   Once the data has been loaded onto the server, it is transferred to a normalized database (inserts, updates and deletes). Data cleansing, data transformation and data consolidation are done at the data warehouse level.
o   Once the data has been cleansed at the data warehouse level, the GST reports are developed. GSTR1, GSTR2 and GSTR3 reports can be developed in one single lot.
o   Turbodata is integrated with at least one GSP partner. The end client can look at other GSP partner solutions if it so desires.
o   The deployment of the solution is very easy and convenient. For any end client, deployment should take no more than twenty minutes. Minimal installation prerequisites are required.
o   The data for the end client is stored in a data warehouse. The end client does not need to worry about changes in the statutory requirements. Other high-end services like inventory optimization and predictive analytics are possible on the cloud.

To see why the end client should consider Turbodata GST, please check the following link:
http://mndatasolutionsindia.blogspot.in/2018/02/why-turbodata-gst.html


Contact
Apoorv Chaturvedi
Phone: 8802466356
Email: support@mndatasolutions.com;support@turbodatatool.com
Website: www.mnnbi.com

Please indicate the following:
·         ERP system(s)
·         Turnover and frequency of load
·         Number of locations

  Sample video link: https://www.youtube.com/watch?v=sYbeBfc3ozo&feature=youtu.be

The product uses optimum RAM so that the source system does not hang during extraction, as shown in the following video:
       https://youtu.be/7CULkzc5h2g






FOR FREE MICROSOFT POWER BI DASHBOARDS, email: support@mndatasolutions.com or support@turbodatatool.com



