Wednesday, 26 December 2018

Testing methodology for Turbodata using SSIS


Testing experience-summary

This blog describes the comprehensive testing methodology adopted by the Turbodata team for GST and audit reporting purposes.






1.       Business Requirements and Understanding: The resource understood the business logic requirements for the GST reports. Each measure and its value was specified with the end client (GSP partner).
2.       Test planning and estimation: for this, the ETL team took the following steps:
a.       Arranged for sample Tally data from the end client
b.      Identified the final GST reports to be developed at the ETL end; the numbers from the source system GST reports were matched with the data warehouse GST reports.
3.       Designing test cases and preparing the test plan: for number auditing purposes, the ETL team adopted the following methodology:
a.       The testing was done for the entire fiscal year and fiscal month (the granularity of GST reporting is the fiscal year and fiscal month)
b.      The testing was done across multiple companies
c.       Scenarios were developed for the GST reports based on ledger entries and voucher type entries; for example, how should credit notes be handled, and how should sales invoices be handled?
d.      The specifications were signed off by the end client (GSP partner)
4.       Test execution with bug closure and reporting:
·         The ETL process was run across various fiscal years and fiscal months. For each fiscal year and fiscal month, the variances with respect to the source ERP reporting module were reported to the developers.
·         Finding the bug: the business logic responsible for the error was deduced and the details were given to the developer.
5.       Summary report and result analysis: the final details were given to the ETL team based on the data audit numbers.


The types of testing done were as follows:
·         Incremental testing: when the jobs were migrated from development to production, the output of the incremental data was checked. For this, the ETL team:
o   Used merge joins and lookup transforms; the non-matching entries from the lookup were stored in separate tables (a minimal SQL equivalent is sketched below).
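As a rough illustration (the table and column names here are assumptions, not the actual Turbodata schema), the lookup's "no match" output corresponds to a query of this shape, which copies staging rows missing from the target into a separate audit table:

```sql
-- Hypothetical names: stg_daybook (staging), dbo.fact_daybook (target),
-- audit.daybook_unmatched (table holding the non-matching entries).
INSERT INTO audit.daybook_unmatched (voucher_key, voucher_date, amount, checked_on)
SELECT s.voucher_key, s.voucher_date, s.amount, GETDATE()
FROM stg_daybook AS s
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.fact_daybook AS f
                  WHERE f.voucher_key = s.voucher_key);
```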

·         Data quality testing: dirty data was checked during the data load. For example, between stg_daybook_temp and stg_daybook the dirty data was cleansed, and a log of the rejected rows was kept for audit purposes (a rough sketch follows). For matching, the ETL team used the following:
o   Fuzzy Lookup
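The fuzzy matching itself is an SSIS transform, but the deterministic part of the cleansing and the audit log can be sketched in T-SQL roughly as follows (the column names are illustrative, not the actual stg_daybook layout):

```sql
-- Log rows failing basic quality checks, then load the cleansed rows.
INSERT INTO audit.daybook_dirty_rows (row_id, party_name, reject_reason, logged_on)
SELECT row_id, party_name, 'Missing party name or voucher date', GETDATE()
FROM stg_daybook_temp
WHERE party_name IS NULL OR voucher_date IS NULL;

INSERT INTO stg_daybook (row_id, party_name, voucher_date, amount)
SELECT row_id,
       LTRIM(RTRIM(REPLACE(party_name, '  ', ' '))),  -- trim and collapse double spaces
       voucher_date,
       amount
FROM stg_daybook_temp
WHERE party_name IS NOT NULL AND voucher_date IS NOT NULL;
```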
·         Data transformation testing: the business rules were validated with the end client; thereafter the business logic was tested using SQL and SSIS (an example rule check is sketched below). The SSIS features used were as follows:
o   Redirect rows on failure
o   Lookups for finding the unmatched rows
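As an example of the kind of rule that was redirected on failure, the following sketch flags GST lines whose tax amount does not match taxable value times rate; the stg_gst_lines table and its columns are assumed for illustration only:

```sql
-- Rows violating the business rule go to an error table, mirroring the
-- SSIS redirect-on-failure output.
INSERT INTO audit.gst_rule_failures (voucher_key, taxable_value, tax_rate, tax_amount, logged_on)
SELECT voucher_key, taxable_value, tax_rate, tax_amount, GETDATE()
FROM stg_gst_lines
WHERE ABS(tax_amount - (taxable_value * tax_rate / 100.0)) > 0.01;
```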
·         Data accuracy testing: the output numbers in the reports were matched with the Tally GST reports to check the accuracy of the output data; a sample reconciliation query is sketched below.
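A reconciliation query of the following shape captures the idea (the fact and report table names are assumptions): totals per fiscal year and fiscal month from the warehouse are compared with the figures taken from the Tally GST report, and any variance is reported back to the developers:

```sql
SELECT w.fiscal_year,
       w.fiscal_month,
       w.taxable_value                    AS warehouse_value,
       t.taxable_value                    AS tally_report_value,
       w.taxable_value - t.taxable_value  AS variance
FROM (SELECT fiscal_year, fiscal_month, SUM(taxable_value) AS taxable_value
      FROM dbo.fact_gst
      GROUP BY fiscal_year, fiscal_month) AS w
JOIN stg_tally_gst_report AS t
  ON t.fiscal_year = w.fiscal_year
 AND t.fiscal_month = w.fiscal_month
WHERE w.taxable_value <> t.taxable_value;
```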
·         Data completeness testing: for this, the ETL team did the following (a row-count audit sketch follows this list):
o   Used the @execute variable to check the row counts
o   Used the following at the dataflow level:
§  Redirect rows for transformations and target table loading: the unmatched rows were stored in staging tables for audit purposes.
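A minimal sketch of the row-count audit (names assumed): the counts captured after each load are written to a log table so that source and target counts for a run can be compared.

```sql
-- Capture counts for one package run; the log table and package name are illustrative.
INSERT INTO audit.load_row_counts (package_name, table_name, row_cnt, load_ts)
SELECT 'pkg_daybook_load', 'stg_daybook', COUNT(*), GETDATE() FROM stg_daybook
UNION ALL
SELECT 'pkg_daybook_load', 'fact_daybook', COUNT(*), GETDATE() FROM dbo.fact_daybook;
```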
·         At the sequence container level, the ETL team used the following settings:




o   Control flow:
§  Transaction level: Required
§  Isolation level: Serializable (where possible), to enable rollback of transactions. For a nightly load run after the day's processing is completed, the ETL team can use ReadUncommitted.
§  Enable logging
§  Enable the usage of the @execute variable
§  Usage of checkpoints and breakpoints for error handling


o   Dataflow:
§  Transaction level: Supported
§  Isolation level: ReadCommitted (where possible)
§  Enable logging (use parent settings)
§  Enable the usage of the @execute variable
§  Error logging: use parent settings
§  Fail parent on failure: true
·         Metadata testing: for this, the ETL team used the following:
o   Usage of log tables
o   Used the data viewer during the testing process

For metadata testing, the Data Profiling task can also be used.

·         Application upgrades: this entailed migration across multiple end clients. For this, the ETL team tested with the following:
o   Multiple package configurations
o   Multiple project deployment configurations, using manifest files
·         GUI/navigation testing: the final reports were tested for report reload and report refresh times.
·         Source to target testing (validation testing): the filter parameters were checked during extraction to the staging area (a sample source-to-target check is sketched below). The following methods were adopted:
o   The extractor C# code was checked for the required data columns
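A source-to-target (minus) check along these lines conveys the intent; the extract table name and the date filter values are placeholders:

```sql
DECLARE @from_date date = '2018-04-01',   -- extractor 'from' parameter (placeholder)
        @to_date   date = '2019-03-31';   -- extractor 'to' parameter (placeholder)

-- Rows extracted for the filter window that never reached staging.
SELECT voucher_key, voucher_date, amount
FROM src_extract_daybook
WHERE voucher_date BETWEEN @from_date AND @to_date
EXCEPT
SELECT voucher_key, voucher_date, amount
FROM stg_daybook;
```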


Methodology adopted for testing: the source to target mappings given by the end clients were taken as the base; thereafter the testing process was started.

Test case scenarios:

Each serial number below gives a mapping doc validation check and the corresponding M&N BI test case.

1.       Verify whether the business logic was mentioned: every measure and dimension was tested for source mapping.
2.       Validation:
         1.)      Null check: Derived Column (FINDSTRING)
         2.)      Source and target data types to be the same: Data Conversion
         3.)      Validate the column names in the table against the mapping doc: checked manually
3.       Constraints:
         1.)      All the target tables had primary key constraints.
         2.)      In the intermediary tables, the composite key constraints were checked using ROW_NUMBER() (see the sketch after this table).
4.       Data consistency issues:
         1.)      stg_daybook_temp to stg_daybook load: used Fuzzy Lookup, REPLACE and FINDSTRING.
5.       Completeness issues:
         1.)      Confirm that all the data is loaded: @value variable, log tables
         2.)      Check for any rejected records: dataflow level (audit tables)
         3.)      Check that data is not truncated: warnings in SSIS, output varchar(255)
         4.)      Boundary value analysis: WIP
         5.)      Compare unique values of key fields between source and target: audit table (SQL NOT IN statement used)
6.       Correctness issues:
         1.)      Data that is misspelled: manual checking, lookup
         2.)      Null, unique, non-null: referential key constraints were checked
7.       Data quality:
         1.)      Null check: handled above (Conditional Split)
         2.)      Date check: usage of parameters and table variables; use the Data Profiling task
8.       Duplicate check:
         1.)      Check for unique key and primary key constraints in all tables
         2.)      Sort transform: remove duplicates (the ROW_NUMBER() sketch after this table performs the same check in SQL)
9.       Date validation:
         1.)      Know the row creation date: insert_ts, update_ts and load_ts to be added; table variables and parameters were used for date entry
10.      Data validation:
         1.)      To validate the complete data set between the source and target tables, a minus query is the best solution: staging tables, audit tables
         2.)      Difference between source and target: redirect the rows
11.      Data cleanliness:
         1.)      Remove unnecessary columns: optimize the dataflow
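The composite key check mentioned in points 3 and 8 can be sketched as follows (the business key columns are assumed for illustration): any row numbered above 1 within its key is a duplicate.

```sql
WITH keyed AS (
    SELECT voucher_number, voucher_date, ledger_name,
           ROW_NUMBER() OVER (PARTITION BY voucher_number, voucher_date, ledger_name
                              ORDER BY load_ts DESC) AS rn
    FROM stg_daybook
)
SELECT voucher_number, voucher_date, ledger_name
FROM keyed
WHERE rn > 1;   -- duplicates violating the composite key
```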




Types of ETL bugs taken into account:

Each serial number below gives a type of ETL bug and the corresponding details.

1.)      User interface bugs: extractor for Tally and SAP; interface bug testing for the 'from' and 'to' dates.
2.)      Boundary value analysis: the maximum and minimum dates in stg_daybook_temp and stg_daybook should lie between the input dates given to the ETL extractor. Use the Data Profiling task (a SQL sketch of this check follows this list).
3.)      Equivalence class partitioning (the data type should be specific): the HSN code should be 16 characters; if it is less or more than 16 characters, an error is to be generated. Use the Data Profiling task.
4.)      Input/output bugs: related to the above.
5.)      Calculation bugs: checked manually.
6.)      Load condition bugs: do not allow multiple users; the configuration file is to be checked.
7.)      Race condition bugs: test the system with a full load; check the incremental data load.
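The boundary value check from point 2 can be expressed roughly as follows; the parameter values and the date column name are placeholders:

```sql
DECLARE @from_date date = '2018-04-01',
        @to_date   date = '2019-03-31';

-- The loaded date range must sit inside the extractor's input dates.
SELECT MIN(voucher_date) AS min_loaded_date,
       MAX(voucher_date) AS max_loaded_date,
       CASE WHEN MIN(voucher_date) >= @from_date AND MAX(voucher_date) <= @to_date
            THEN 'PASS' ELSE 'FAIL' END AS boundary_check
FROM stg_daybook_temp;
```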




Prepared by:

Ishwar Singh
Lead ETL Developer
Linkedin profile: https://www.linkedin.com/in/ishwar-singh-034324139/

Website: www.mnnbi.com
Email address: apoorv@mnnbi.com
Phone: +91-8802466356

Thursday, 20 December 2018

Usage of PowerBI over Turbodata for best in class reporting solutions




For free PowerBI dashboards elaborating the features below, contact the following:

  • Apoorv Chaturvedi
  • email: apoorv@mnnbi.com


The ETL team intended to develop a reporting solution over Turbodata with the following tenets:
·         Scalable: the reporting solution should be usable across multiple end clients. Thus it should have the following features:
o   Role management
o   Migration properties
o   Connectivity across multiple databases such as SAP, Navision etc.
o   Optimum cost: PowerBI is free below 1GB of data.
o   Handling of reporting errors such as fan traps and chasm traps

Implementation of Turbodata reporting in the form of a dashboard and semantic layer, with the relevant query implementation.
Design the semantic layer






Step 1: Import the database and its tables.

Step 2: the Import option loads the data onto the local machine; DirectQuery runs the query at the database level.


We loaded the dimension tables, the fact table and the aggregate tables.

Manage the relationships between the tables.

Developing Reports

New Report

Export data to excel
Drill down and Drill through
  • Set up a hierarchy
  • Drill down on the graph by hierarchy
  • Drill down for data points


Open chart: inventory turnover

Drill up and Drill down
Custom measures: sample
Power BI Joins Implementations
Steps:
 i) Go to Edit Queries   ii) Click Merge Queries
 iii) Select the source and destination databases
 iv) Select the join type (left, right, full)   v) Select the common column
We applied the condition and clicked OK.
Query Applied Data:
Right Outer Join in Power BI Desktop
      Steps: i) Go to Edit Queries   ii) Click Merge Queries   iii) Select the source and destination databases
      iv) Select the right outer join   v) Select the common column


Full Outer Join Table
Inner Join in Power BI
Output of inner-join
Why parameters?
Query to be executed in SQL by using external inputs.
      Report migration

      On-the-fly ABC classification

Apply the parameter inside Power BI

  1. Create the parameter
  2. Create the queries
  3. Go to Power BI, right-click on the table and click on the Advanced Editor (an illustrative parameterised query that such an input can drive is sketched below).
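The kind of parameterised SQL that a Power BI parameter can feed with external inputs might look like this; the procedure, table and parameter names are purely illustrative:

```sql
-- A Power BI parameter value can be substituted into the source query
-- (for example via the Advanced Editor) and passed to SQL as an input.
CREATE OR ALTER PROCEDURE dbo.usp_sales_by_fiscal_year
    @fiscal_year int
AS
BEGIN
    SET NOCOUNT ON;
    SELECT fiscal_month, SUM(amount) AS sales_amount
    FROM dbo.fact_sales
    WHERE fiscal_year = @fiscal_year
    GROUP BY fiscal_month;
END;
```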

We solved the problems of the chasm trap and the fan trap by implementing the following (a SQL sketch of the union all approach follows this list):
i) Union all
ii) Table alias names
iii) Decisions on contexts
iv) Aggregate awareness
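As a sketch of the union all approach (the fact table names are assumptions): each fact is aggregated to the shared grain on its own and the results are then combined, instead of joining both facts to the same dimension in one query, which would inflate the measures and produce the fan/chasm trap.

```sql
-- Aggregate each fact separately, then stack the results at the item grain.
SELECT item_key,
       SUM(sales_amount)         AS sales_amount,
       CAST(0 AS decimal(18, 2)) AS purchase_amount
FROM dbo.fact_sales
GROUP BY item_key
UNION ALL
SELECT item_key,
       CAST(0 AS decimal(18, 2)),
       SUM(purchase_amount)
FROM dbo.fact_purchase
GROUP BY item_key;
```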


Role Management in Power BI
By using Manage Roles, we can apply security concepts: we can give a specific user access permission to specific data and allow conditional data access.
Data representation on the dashboard on the basis of role-based authentication




Name: Aprajita Kumari
Email: aprajita@mnnbi.com
Phone: +91-9717077793

Alternate contact details: apoorv@mnnbi.com
Phone: +91-8802466356


Website: www.mnnbi.com



  INCREMENTAL LOAD/CREATION OF DIMENSION TABLE LOGIC AT SAP DATA SERVICES END CLIENT In this particular document we shall be looking at the ...