Tuesday, November 1, 2011

Delta methods in LO

Queued Delta

In this mode, extraction data is collected from document postings in an extraction queue, from which it is transferred into the BW delta queue by a periodic collective run. The transfer sequence is the same as the sequence in which the data was created.

When a document is posted, it is saved to the application table and also to the extraction queue (this is the difference from direct delta). You then have to schedule a periodic collective run (V3 job) to move the data to the delta queue, from where it is moved to BW.

Direct Delta

When a document is posted, it is saved to the application table and also written directly to the delta queue (RSA7); from there it is moved to BW.

So you can see that, for the delta flow, the delta queue is the exit point from R/3.
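
Just to visualize the difference between the two flows, here is a small Python sketch (purely illustrative: the queues, functions and records are invented for this example and are not SAP objects or APIs):

# Toy model of the two delta flows described above.
# All names (application_table, extraction_queue, delta_queue, ...) are
# invented for illustration; they are not SAP tables or APIs.

application_table = []   # where the posted document itself is stored
extraction_queue = []    # intermediate queue used only by queued delta
delta_queue = []         # the BW delta queue (what RSA7 shows)

def post_document_direct(doc):
    # Direct delta: the posting writes straight into the delta queue.
    application_table.append(doc)
    delta_queue.append(doc)

def post_document_queued(doc):
    # Queued delta: the posting only fills the extraction queue.
    application_table.append(doc)
    extraction_queue.append(doc)

def collective_run():
    # The periodically scheduled job: extraction queue -> delta queue,
    # preserving the original posting sequence.
    while extraction_queue:
        delta_queue.append(extraction_queue.pop(0))

post_document_queued({"doc": 1, "status": "open"})
post_document_queued({"doc": 1, "status": "closed"})
collective_run()
print(delta_queue)   # records reach the delta queue in posting order

In both cases the records leave R/3 from the delta queue; the only difference is whether a scheduled collective run sits between the posting and that queue.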

Unserialized V3 Update

This method is largely identical to the serialized V3 update. The difference is that the sequence of document data in the BW delta queue does not have to match the posting sequence. It is recommended only when the sequence in which data is transferred into BW does not matter (due to the design of the data targets in BW).

You can use it for Inventory Management, because once a material document is created, it is not edited; the sequence of records only matters when a document can be edited multiple times (see the sketch below). But again, if you are using an ODS in your inventory design, you should switch to the serialized V3 update.
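
A tiny illustrative sketch of why the sequence is irrelevant for a purely additive target such as an inventory cube (again, invented names and data, not SAP code):

# Illustrative only (no SAP objects): additive inventory key figures give
# the same cube result regardless of the order in which the deltas arrive.
movements = [("MATDOC1", +100), ("MATDOC2", -30), ("MATDOC3", -20)]

in_posting_order = sum(qty for _, qty in movements)
out_of_order     = sum(qty for _, qty in reversed(movements))

assert in_posting_order == out_of_order == 50
# Material documents are never changed after creation, so every delta record
# is final and addition is commutative: the transfer sequence is irrelevant.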

LO Cockpit (LO Extraction) Delta Mechanism

For extracting logistics transactional data from R/3, a new generation of datasources and extractors, no longer based on LIS (Logistics Information System) information structures, was developed starting from BW Release 2.0B and PI 2000.1 or PI-A 2000.1 (valid from R/3 Release 4.0B).
The tools for the logistics extract structures can be found in the IMG for BW (transaction SBIW): in your OLTP system, choose Customer-Defined DataSources -> Logistics -> Managing Extract Structures.
The Logistics Extract Structures Customizing Cockpit (which you can reach directly via transaction LBWE) is the central tool for the administration of extract structures.

[image]

Ok...but, in other words, what is this Logistic Cockpit (LC)?
We can say that it’s a new technique for extracting logistics information, consisting of a series of standard extract structures (that is, from a BW perspective, standard datasources) delivered in the business content thanks to a given plug-in.
But what is the logic behind these datasources that allows us to manage the logistics flows towards BW in delta mode after an initial load of the historical data (done with update mode ‘Delta Initialization’, by retrieving data from the setup tables, which are assigned to each extract structure and are filled using special setup transactions in the OLTP system)?
Following the many questions posted so far in the SDN BW Forums, in this weblog we will focus only on the delta mechanism of the LC and not on the other tasks we can manage inside it, like the steps needed for activating and carrying out a successful data extraction or the maintenance of extract structures and datasources (but don’t worry, a summary weblog dedicated to these important procedures will also arrive in the next few days!).

The V3 Update

Unlike the LIS update (well, I know you are asking ‘but how does this old LIS update work???’...my dear, another weblog will arrive very soon for this topic too...sorry for the wait, but I have a lot of things to do!), data is transferred from the LIS communication structure, using extract structures (e.g. MC02M_0HDR for purchase document headers), into a central delta management area.
This transfer takes place via the V3 update with a specific (scheduled) job and is therefore temporally detached from the daily application operations; the main point is that the delta management acts as a buffer (independent of the application business) containing data that can be requested by BW via an InfoPackage with update mode ‘delta’.
The following picture shows (at a high level) the interaction between the LIS communication structure and the V3 extraction technology.

[image]

We said that for the extraction of transactional data from the different logistics applications (MM, PP, SD and so on), the collective update technology (‘V3 update’) is used (up to PI 2003.1).
This means that the data is collected in the R/3 update tables before the transfer to the interface: the data is retrieved there by a periodic update process that needs to be started in order to transfer the delta records to the BW system’s delta queue.
During this V3 collective run (which you can start and schedule from LBWE for each application component), the data is transferred to the BW delta queue (which you can see via transaction RSA7 (see the picture below) or LBWQ), from which it is retrieved by means of (delta) requests from the BW system.
[image]
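
To summarize the idea of the delta queue as a buffer decoupled from the postings, here is a minimal illustrative sketch (all names are invented; this is not how the SAP kernel is implemented):

# Minimal sketch of the buffering idea (invented names, not SAP APIs):
# the collective run fills the delta queue on the R/3 side, and BW later
# empties it with a delta request coming from an InfoPackage.

update_tables = []   # filled together with the postings (V3 update data)
delta_queue = []     # what transaction RSA7 would show

def v3_collective_run():
    # Scheduled job on R/3: moves records from the update tables
    # into the BW delta queue, keeping their order.
    while update_tables:
        delta_queue.append(update_tables.pop(0))

def bw_delta_request():
    # BW InfoPackage with update mode 'delta': drains the delta queue.
    request = list(delta_queue)
    delta_queue.clear()
    return request

update_tables.extend(["rec1", "rec2", "rec3"])   # postings during the day
v3_collective_run()                              # runs on its own schedule...
print(bw_delta_request())                        # ...and BW pulls the buffer later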

V1, V2, V3...
When to schedule what

Normally in R/3 there are three types of update available:

Synchronous update (V1 update)

The statistics update is carried out at the same time (synchronously) as the document update (in the application tables).

Asynchronous update (V2 update)

The document update and the statistics update take place in separate tasks.

So, V1 and V2 updates don’t require any scheduling activity.

Collective update (V3 update)

As with the V2 update, the document update is decoupled from the statistics update, but, unlike the V2 update, the V3 collective update must be scheduled as a job (via LBWE).
Remember that the V3 update only processes the update data that has already been successfully processed with the V2 update.

[image]
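
The dependency between V2 and V3 can be sketched like this (an illustrative toy model with an invented record structure, not SAP code):

# Illustrative toy model of the V1/V2/V3 dependency described above
# (the record structure and function names are invented, not SAP's).
from dataclasses import dataclass

@dataclass
class UpdateRecord:
    document: str
    v1_done: bool = False   # synchronous document update
    v2_done: bool = False   # asynchronous statistics update
    v3_done: bool = False   # collective update, only via the scheduled job

records = [UpdateRecord("DOC1", v1_done=True, v2_done=True),
           UpdateRecord("DOC2", v1_done=True, v2_done=False)]  # V2 still pending

def v3_collective_run(recs):
    # The job scheduled via LBWE: it only picks up records whose V2 part
    # has already completed successfully.
    for rec in recs:
        if rec.v2_done and not rec.v3_done:
            rec.v3_done = True   # passed on towards the delta queue

v3_collective_run(records)
print([(r.document, r.v3_done) for r in records])
# [('DOC1', True), ('DOC2', False)]  -> DOC2 has to wait for its V2 update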

This is a key task for the correct working of the BW logistics flows.
In fact, the scheduling of this process is very important, and it should be based on
1) the amount of activity on a particular OLTP system and
2) the particular requirements for how up to date the data displayed in BW reports needs to be.

For example (relating to the first point), a development system with a relatively low to medium volume of new/modified/deleted documents may only need the V3 update to run on a weekly or daily basis.
Instead, a full production environment, with many thousands of transactions every day, may have to be updated hourly; otherwise postings will queue up and can heavily affect performance.
About the second point: if, for example, reporting follows a monthly periodic view, successfully scheduling the V3 update on a monthly basis will ensure that all the necessary information structures are properly updated whenever new or existing documents are processed in the meantime.
Finally, the right choice will be the result of all these considerations; done properly, the information structures in BW will be current and overall performance will be improved.
It’s possible to verify that all V3 updates have completed successfully via transaction SM13.
SM13 will take you to the ‘Update Records: Main Menu’ screen:


[image]
On this screen, enter an asterisk as the user (for all users), select the ‘V2 executed’ radio button, choose a date range and hit Enter.
Any outstanding V3 updates will be listed.

At this point, it’s clear that, considering the V3 update mechanism, the main requirement is that the delta information has to be transferred to the BW system in the same sequence in which it occurred in the OLTP system.
Just a consideration: if we had to load our delta records only into a cube, there would be no problem: everything is appended and, in the end, the final situation is displayed correctly thanks to the OLAP processor!
But since updating into ODS objects is permitted in the logistics extraction for almost all DataSources, we have to consider the effects that can derive from the ‘overwrite’ update mode (specific to the ODS object).
For example, the consistent storage of a status field (e.g. delivery status) in ODS objects can only be ensured with a correct (serialized) delta sequence: if the record with ‘open delivery’ status (created first in R/3) arrives later than the record with ‘closed delivery’ status (created second in R/3), we would have a false representation of reality.
[image]
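
The delivery-status example can be reproduced with a few lines of illustrative Python (the dictionary simply stands in for the overwrite behaviour of the ODS object; nothing here is SAP code):

# Toy reproduction of the delivery-status example above (not SAP code):
# an overwrite target such as an ODS object stays consistent only if the
# delta records arrive in the same sequence in which they were created.

created_in_r3 = [("DELIVERY_1", "open"), ("DELIVERY_1", "closed")]

def load_into_ods(records):
    ods = {}                      # overwrite semantics: the last record wins
    for key, status in records:
        ods[key] = status
    return ods

print(load_into_ods(created_in_r3))              # {'DELIVERY_1': 'closed'} -> correct
print(load_into_ods(reversed(created_in_r3)))    # {'DELIVERY_1': 'open'}   -> false picture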
Considering that, the sequence of the existing data records must be recognized and taken into account both when reading and processing the update data (step A of the picture) and when transferring the data to the BW system (step B).
Since the normal existing update methods do not support serialized processing of update data, the Serialized V3 Update function was created (also thanks to several subsequent corrections in SAP Basis) in order to be able to serialize step A.