Problem statement: Tally is a hierarchical database with a desktop application. To refresh the mobile apps, the data needs to be synced to a NoSQL or relational database. The issue is how to perform the incremental refresh.
Using the standard Tally XML export: Tally can export any report in XML format. For example, the Daybook can be exported using the following XML request.
The request below assumes that Tally is being used as a web server.
<ENVELOPE>
  <HEADER>
    <VERSION>1</VERSION>
    <TALLYREQUEST>Export</TALLYREQUEST>
    <TYPE>Data</TYPE>
    <ID>Daybook</ID>
  </HEADER>
  <BODY>
    <DESC>
      <STATICVARIABLES>
        <EXPLODEFLAG>Yes</EXPLODEFLAG>
        <SVEXPORTFORMAT>$$SysName:XML</SVEXPORTFORMAT>
      </STATICVARIABLES>
    </DESC>
  </BODY>
</ENVELOPE>
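As a sketch, such a request can be posted to Tally's HTTP listener from Python. The host and port are assumptions (port 9000 is the common default when Tally runs as a server); adjust to your setup.

```python
import urllib.request

def build_daybook_request() -> bytes:
    """Return the Daybook export envelope shown above as a request body."""
    return b"""<ENVELOPE>
  <HEADER>
    <VERSION>1</VERSION>
    <TALLYREQUEST>Export</TALLYREQUEST>
    <TYPE>Data</TYPE>
    <ID>Daybook</ID>
  </HEADER>
  <BODY>
    <DESC>
      <STATICVARIABLES>
        <EXPLODEFLAG>Yes</EXPLODEFLAG>
        <SVEXPORTFORMAT>$$SysName:XML</SVEXPORTFORMAT>
      </STATICVARIABLES>
    </DESC>
  </BODY>
</ENVELOPE>"""

def post_to_tally(payload: bytes, url: str = "http://localhost:9000") -> str:
    """POST an XML request to the Tally listener and return the response text.

    The URL is an assumption; Tally must be configured to act as a server."""
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "text/xml"})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

The returned XML can then be parsed and loaded into the target database.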
The output gives a complete list of masters and transactions.
However, there may be a need to fetch only the transactions in the load. In that scenario, the following sample collection export can be used.
<ENVELOPE>
  <HEADER>
    <VERSION>1</VERSION>
    <TALLYREQUEST>Export</TALLYREQUEST>
    <TYPE>Collection</TYPE>
    <ID>salesVouchers</ID>
  </HEADER>
  <BODY>
    <DESC>
      <TDL>
        <TDLMESSAGE>
          <COLLECTION NAME="salesVouchers" ISMODIFY="No">
            <TYPE>Voucher</TYPE>
            <FETCH>*.*</FETCH>
            <COMPUTE>FOREXTRANSACTION_rohit:$$FOREXVALUE:$AMOUNT</COMPUTE>
            <COMPUTE>Ledgeramount:$$COLLECTIONFIELD:$AMOUNT:1:ALLLEDGERENTRIES</COMPUTE>
            <COMPUTE>HsnIgstRate_ledger:If $$IsEmpty:$.GSTOVRDNRate Then $$GetInfoFromHierarchy:$ledgername:"ledger":"GSTRate":$date:"GST":0:@@IGSTDutyHead Else $.GSTOVRDNRate</COMPUTE>
            <SYSTEM Type="Formulae" NAME="opfilter">$$ISSALES:$VOUCHERTYPENAME</SYSTEM>
          </COLLECTION>
        </TDLMESSAGE>
      </TDL>
    </DESC>
  </BODY>
</ENVELOPE>
In this scenario only the sales vouchers are extracted, based on the filter parameters for the vouchers. No master details are extracted.
However, a complete extraction of the sales vouchers could place a heavy load on the transaction system. In that scenario it is best to extract the vouchers on a day-by-day basis, beginning with the DSPFromDate and ending with the DSPToDate. The data from each voucher entry is then stored in the database.
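The day-by-day windows can be generated by the sync job itself. A minimal sketch follows; the YYYYMMDD date format is an assumption, so adjust it to whatever format your Tally release expects for its date variables.

```python
from datetime import date, timedelta

def daily_windows(start: date, end: date):
    """Yield one (from_date, to_date) string pair per day, inclusive.

    Each pair covers a single day, so each export request stays small
    and the load on the Tally instance is spread out over many calls."""
    day = start
    while day <= end:
        stamp = day.strftime("%Y%m%d")  # assumed date format
        yield stamp, stamp
        day += timedelta(days=1)
```

Each pair would then be substituted into the from/to date parameters of one export request.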
Finding whether a voucher is an insert, update, or delete: this can be done through the REMOTEID field in the XML export, as given below:
<VOUCHER REMOTEID="13excel" ...>
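One way to classify vouchers is to compare the REMOTEIDs in the latest export against those already stored in the target database. The sketch below is an illustration, not Tally's own mechanism; using ALTERID as the change-detection column is an assumption, and any per-voucher version column would serve.

```python
def classify_vouchers(exported: dict, stored: dict):
    """Classify vouchers as inserts/updates/deletes by REMOTEID.

    exported: {remoteid: alterid} parsed from the latest XML export
    stored:   {remoteid: alterid} already present in the target database
    """
    inserts = [rid for rid in exported if rid not in stored]
    deletes = [rid for rid in stored if rid not in exported]
    updates = [rid for rid in exported
               if rid in stored and exported[rid] != stored[rid]]
    return inserts, updates, deletes
```

Deleted vouchers simply stop appearing in the export, which is why the comparison against the stored set is needed.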
Note: all XML exports work only for the current company that is open in the Tally application. If there are multiple companies, the XML export needs to be done for each company individually.
Heavy load on the system: if the load on the system is heavy, one can extract the vouchers on a date-by-date basis using the SVFROMDATE and SVTODATE parameters.
<ENVELOPE>
  <HEADER>
    <VERSION>1</VERSION>
    <TALLYREQUEST>Export</TALLYREQUEST>
    <TYPE>Collection</TYPE>
    <ID>VOUCHERINCREMENTAL</ID>
  </HEADER>
  <BODY>
    <DESC>
      <TDL>
        <TDLMESSAGE>
          <COLLECTION NAME="VOUCHERINCREMENTAL" ISMODIFY="No" ISFIXED="No" ISINITIALIZE="No" ISOPTION="No" ISINTERNAL="No">
            <TYPE>VOUCHER</TYPE>
            <NATIVEMETHOD>*.*</NATIVEMETHOD>
            <PARMVAR>svfromdate: Date: $$Date:@@machinedate-300</PARMVAR>
            <PARMVAR>svtodate: Date: $$Date:@@machinedate-200</PARMVAR>
          </COLLECTION>
        </TDLMESSAGE>
      </TDL>
    </DESC>
  </BODY>
</ENVELOPE>
In the above TDL code, the PARMVAR values for the from and to dates can be set to capture the voucher data incrementally.
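The window offsets can be parameterised from the sync job rather than hard-coded. A minimal sketch, assuming the same @@machinedate-relative form as the sample collection above (the -300/-200 offsets mirror that sample):

```python
def parmvar_window(days_back_from: int, days_back_to: int) -> str:
    """Render the PARMVAR lines for an incremental voucher window.

    The window is expressed relative to @@machinedate, as in the
    VOUCHERINCREMENTAL sample; the offsets are sync-job parameters."""
    return (
        "<PARMVAR>svfromdate: Date: $$Date:@@machinedate-{f}</PARMVAR>\n"
        "<PARMVAR>svtodate: Date: $$Date:@@machinedate-{t}</PARMVAR>"
    ).format(f=days_back_from, t=days_back_to)
```

The rendered lines are then spliced into the COLLECTION element of the export request before posting it.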
Masters incremental extraction via XML: in the case of masters there is little option for incremental extraction using XML. The entire set of masters needs to be synced in one go, which can take time.
New approach: for the new approach, the M&N BI developers looked at the following parameters for the initial and incremental extraction.
a.) Unlimited companies: we desired to use LoopCollObj to extract the data from an unlimited number of companies.
b.) Lighter data extract: we decided to use JSON (used in a standard fashion for API extracts).
c.) A more complete extract than that provided by other SaaS providers.
d.) Lower load on the system.
e.) Faster extract for the transactions and the masters: we decided to extract the data on a real-time basis (as the voucher is entered into the system), using the event-based activities within Tally, specifically On Form Accept and On Form Delete.
f.) Updating the masters as a voucher is entered or as the masters are updated/inserted.
g.) Reducing the data load transferred from the system to the cloud.
To this end, we developed custom collections to extract data from an unlimited number of companies using the Tally function LoopCollObj.
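A per-company loop can also be driven from the client side by setting the SVCURRENTCOMPANY static variable per request. The sketch below is an assumption-laden illustration (company names and the report ID are placeholders), not the LoopCollObj-based TDL itself:

```python
from xml.sax.saxutils import escape

def company_requests(companies, report_id="Daybook"):
    """Build one export request per company via SVCURRENTCOMPANY.

    companies: list of company names as they appear in Tally (examples
    here are placeholders). Returns one XML request body per company."""
    template = (
        "<ENVELOPE><HEADER><VERSION>1</VERSION>"
        "<TALLYREQUEST>Export</TALLYREQUEST><TYPE>Data</TYPE>"
        "<ID>{rid}</ID></HEADER><BODY><DESC><STATICVARIABLES>"
        "<SVCURRENTCOMPANY>{co}</SVCURRENTCOMPANY>"
        "<SVEXPORTFORMAT>$$SysName:XML</SVEXPORTFORMAT>"
        "</STATICVARIABLES></DESC></BODY></ENVELOPE>")
    return [template.format(rid=escape(report_id), co=escape(c))
            for c in companies]
```

Each request is posted separately, which matches the note above that XML exports work per company.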
Reducing the load on the Tally system while extracting the voucher data: in order to reduce the load on Tally during the voucher extract, the team took the following actions:
a.) Broke the voucher into child components (for example, the voucher header and the voucher ledger details), each extracted as a separate collection.
After the extract was done into JSON files, the details were joined using SQL joins on composite keys such as companyname and masterid.
b.) Converted the hierarchical functions into SQL joins: the SQL joins are at least 70% faster than the hierarchical functions and consume at least 50% less memory. The same concept was used at the database level and in the Power BI module to speed up the querying process.
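A minimal sketch of the composite-key join, using SQLite in memory; the table and column names here are illustrative, not the production schema:

```python
import sqlite3

# Voucher headers and ledger entries, as they might look after the
# per-component JSON extracts are parsed (sample rows, assumed columns).
headers = [{"companyname": "ACME", "masterid": 101, "vouchertype": "Sales"}]
ledgers = [{"companyname": "ACME", "masterid": 101,
            "ledgername": "Cash", "amount": 500.0}]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE vch_header (companyname TEXT, masterid INT, vouchertype TEXT)")
con.execute("CREATE TABLE vch_ledger (companyname TEXT, masterid INT, ledgername TEXT, amount REAL)")
con.executemany("INSERT INTO vch_header VALUES (:companyname, :masterid, :vouchertype)", headers)
con.executemany("INSERT INTO vch_ledger VALUES (:companyname, :masterid, :ledgername, :amount)", ledgers)

# Re-join the child components on the composite key (companyname, masterid).
rows = con.execute("""
    SELECT h.companyname, h.masterid, h.vouchertype, l.ledgername, l.amount
    FROM vch_header h
    JOIN vch_ledger l
      ON l.companyname = h.companyname AND l.masterid = h.masterid
""").fetchall()
```

A set-based join like this replaces one hierarchical lookup per ledger row with a single indexed operation, which is the source of the speed-up described above.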
Reducing the time to extract: our team developed a data warehouse and captured the incremental data load using the Form Accept events for masters and vouchers in Tally. This further speeds up the extraction process.
Sample code:
For example for the masters the incremental data load logic could be as follows:
[Collection: Group_Crash]
    Type    : Group
    Filter  : MASTER_FILTER_Groupfilter
    Fetch   : Parent, ParentHierarchy, Masterid, Alterid, Isrevenue, Affectsstock, AffectsGrossprofit, Reservedname, +
              Issubledger, Istdsapplicable, Istcsapplicable, Isbillwiseon, Iscostcentreson, Istdsexpense, Isdeemedpositive
    Compute : Area         : $_Area
    Compute : PrimaryGroup : $_PrimaryGroup
    Compute : CompanyName  : ##SVCurrentCompany
    Compute : Groupname    : $Name
    ;select $Name, $Parent, $Area, $ParentHierarchy, $PrimaryGroup, $Masterid, $Alterid from Group_Crash

[#System: Formulae]
    MASTER_FILTER_Groupfilter : $$String:$Name = $$String:#MSTNAME
Here #MSTNAME is the field holding the group name in the Group form.
Similarly, for the voucher:
[#Form: Voucher]
    On : Form Accept : Yes : Form Accept
    On : Form Accept : Yes : Call : MasterFunction_Transaction
The voucher collection would include a statement like:
[Collection: VoucherDetails]
    Type   : Voucher
    ;Parm Var : svfromdate : Date : $$Date:@@machinedate-30
    ;Parm Var : svtodate   : Date : $$Date:@@machinedate
    Filter : MASTER_FILTER_VOUCHER
where the MASTER_FILTER_VOUCHER is as follows:
[#System: Formulae]
    MASTER_FILTER_VOUCHER : $Date = #plainvchdate AND $$String:$Vouchernumber = $$String:#VCHNUMBER
Reducing the data load transferred from the system to the cloud: this was done through the incremental data load and through the data compression achieved by the Power BI module. If a SQL Server instance is installed, then the speed is further improved by having each API request target a single table.
The above approaches can be considered for the initial and incremental data load from Tally to the cloud.
Name: Apoorv Chaturvedi
Phone: +91-8802466356
Email: support@turbodatatool.com; support@mndatasolutions.com
YouTube channel: @mndatasolutions2042
IndiaMART page: https://www.indiamart.com/mnbusinessintelligenceindia/