Tuesday, October 16, 2012

FI Data Extraction (Including New GL)



FI Extraction: 
FI Module deals with accounting and financial needs of an organization.
Financial Accounting is broken down into the following sub-modules: 
  • Accounts Receivables
  • Accounts Payable
  • Asset Accounting
  • Bank Accounting
  • Consolidation
  • Funds Management
  • General Ledger
  • Special Purpose Ledger
  • Travel Management
Note: Only the key areas (AP/AR/GL/SL) are discussed briefly because of the complexity of the area.
We can extract the financial data at totals level or line item level. Please refer to the link: http://help.sap.com/saphelp_nw70/helpdata/en/af/16533bbb15b762e10000000a114084/frameset.htm
In general, we will use R/3 line item tables as the data source for extracting the data to allow drill down capability from summarized data to line-item details.
Financial Accounting data can be extracted directly from the tables.
Depending on the business requirement we can use either FI-SL or standard BW content extractors (FI-AR, FI-AP, and FI-GL) to fetch FI data.
Note: FI-SL will be discussed under "One stage stop to know all about BW Extractors-Part2", which explains application-specific customer-generated extractors.
FI-AR, FI-AP, and FI-GL:
General Ledger: All accounting postings are recorded in the General Ledger. These postings are made in real time to provide up-to-date visibility of the financial accounts.
Accounts Receivable: Accounts Receivable records all account postings generated as a result of customer sales activity. These postings are automatically updated in the General Ledger.
Accounts Payable: Accounts Payable records all account postings generated as a result of vendor purchasing activity. Automatic postings are generated in the General Ledger as well.
Standard FI data sources:
0FI_GL_4 (G/L Accounts- line items)
Takes the data from the FI document tables (BKPF/BSEG) that are relevant to general ledger accounting (compare table BSIS).
0FI_AP_4 (AP - line items) and 0FI_AR_4 (AR - line items)
Selections are made from tables BSID/BSAD (Accounts Receivable) and BSIK/BSAK (Accounts Payable).
With the old G/L in R/3, the tables GLT0 (totals) and BKPF/BSEG (line items) get filled; on the SAP BI side you would use the following data model:

0FI_GL_1 --> 0FI_GL_1 --> 0FIGL_C01 (for Totals)
0FI_GL_4 --> 0FI_GL_4 --> 0FIGL_O02 (for Line item)

With the new G/L, from SAP ECC 6.0 onwards, the tables FAGLFLEXT (totals) and FAGLFLEXA/BSEG/BKPF (line items) get filled; on the BI side you would use the following data model:

0FI_GL_10 --> 0FI_GL_10 --> 0FIGL_C10 (for Totals)
0FI_GL_14 --> 0FIGL_O14 (for Line item)

Functionally, this affects other financial modules like AP, AR, and PCA (Profit Center Accounting).

For example, implementing the new G/L on the BI side fulfills most Profit Center Accounting requirements, so you do not have to implement the PCA module separately.
When I was on an FI implementation, there was no 0FI_GL_14 DataSource...

We had existing 0FI_GL_1 & 0FI_GL_4 flows, and we implemented the new GL totals flow, i.e. 0FI_GL_10...

FAGLFLEXT --> 0FI_GL_10 --> 0FIGL_O10 --> 0FIGL_C10.... new GL Totals implementation was quite smooth, since this flow is completely different from old GL totals (GLT0 --> 0FI_GL_1 --> 0FIGL_C01).

We recreated the existing queries (on 0FIGL_C01) on 0FIGL_C10 (& V10) and used jump targets (RRI) to the old line item target (0FIGL_O02) wherever required...

You can go ahead with the new GL line items flow (FAGLFLEXA & BSEG & BKPF) --> 0FI_GL_14 --> 0FIGL_O14 in parallel with the existing old one (BSEG & BKPF) --> 0FI_GL_4 --> 0FIGL_O02.
How does the data extraction happen?
In FI extraction, 0FI_AR_4 and 0FI_AP_4 are linked with 0FI_GL_4 in order to maintain consistent data transfer from the OLTP system (this is called coupled data extraction; see OSS note 428571).
Note: "Uncoupled" extraction is possible with Plug-In PI 2002.2; see OSS note 551044.
0FI_GL_4 writes the entries into the time stamp table BWOM2_TIMEST in the SAP R/3 System with a new upper limit for the time stamp selection.
0FI_AP_4 and 0FI_AR_4 then copy this new upper limit for the time stamp selection during their next data extraction in the SAP R/3 System. This ensures proper synchronization of accounts payable and accounts receivable accounting with G/L accounting.
Full load: Not a valid choice because of large volumes of detailed R/3 transaction data.
Delta load:
Note: Here the delta identification process works differently for new financial records and for changed financial records.
New Financial Accounting line items posted in the SAP R/3 system are identified by the extractor using the time stamp in the document header (table BKPF, field CPUDT).
By scheduling an initialization InfoPackage, all the historical data can be loaded into BW from the application tables. This also sets the "X" indicator in the field LAST_TS (flag: 'X' = last time stamp interval of the delta extraction), which marks the most recent time stamp interval read.
After this, daily delta loads can be carried out based on the time stamp by scheduling delta InfoPackages.
During the delta load, the SAP R/3 system logs two time stamps that delimit a selection interval for a DataSource in table BWOM2_TIMEST (fields TS_LOW and TS_HIGH).
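The time stamp bookkeeping described above can be sketched roughly as follows. This is a hypothetical Python illustration (the class and method names are made up), not the actual R/3 logic; only the table and field names (BWOM2_TIMEST, TS_LOW, TS_HIGH) come from the text:

```python
from datetime import datetime

class TimestampTable:
    """Simplified stand-in for the BWOM2_TIMEST time stamp table."""

    def __init__(self):
        self.rows = []          # one (datasource, TS_LOW, TS_HIGH) row per run
        self.upper_limit = {}   # last TS_HIGH recorded per datasource

    def run_extraction(self, datasource, ts_high):
        # The selection interval starts where the last run's upper limit ended.
        ts_low = self.upper_limit.get(datasource, datetime(1990, 1, 1))
        self.rows.append((datasource, ts_low, ts_high))
        self.upper_limit[datasource] = ts_high
        return ts_low, ts_high

table = TimestampTable()

# 0FI_GL_4 runs first and fixes the new upper limit for the time stamp selection.
_, gl_high = table.run_extraction("0FI_GL_4", datetime(2007, 5, 23))

# The coupled datasources copy that upper limit for their own selection,
# which keeps AP/AR synchronized with G/L accounting.
ap_low, ap_high = table.run_extraction("0FI_AP_4", ts_high=gl_high)
```

The key point of the coupling is visible in the last call: the AP extraction does not pick its own upper limit but reuses the one 0FI_GL_4 recorded.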
In case of changed FI documents, selections will be based on tables:
BWFI_AEDAT and (timestamp table) BWOM2_TIMEST (See OSS note 401646 for more details).
Delta extraction using the delta queue method is also possible in case we want:
  • Serialization of the records
  • To distribute delta records to multiple BW systems.
FI - Delta Mode:
A time stamp on the line items serves to identify the status of the delta. Time stamp intervals that have already been read are then stored in a time stamp table (BWOM2_TIMEST).
(The InfoObject 0RECORDMODE plays a vital role in handling deltas. Check the field DELTA in the ROOSOURCE/RODELTAM tables to identify the image type.)
The Financial Accounting line items are extracted from the SAP R/3 system in their most recent status (after-image delta method).
AIE: This delta method is not suitable for filling InfoCubes directly in the BW system. The line items must therefore first be loaded into an ODS object, which identifies the changes made to individual characteristics and key figures within a delta data record. Other data targets (InfoCubes) can then be supplied with data from this ODS object.
It uses delta type E (pull), which means the delta data records are determined during the delta update by the DataSource extractor, written to the delta queue, and passed on to BI directly from there.
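A toy sketch of why after images (AIE) need the ODS in between: an InfoCube adds key figures up, while an after image carries only the record's latest state, so the ODS overwrites by key and emits only the difference. The names and structures are illustrative, not the actual BW implementation:

```python
def ods_activate(ods, doc_key, amount):
    """Overwrite the document in the ODS and return the additive delta."""
    delta = amount - ods.get(doc_key, 0)
    ods[doc_key] = amount
    return delta

ods, cube_total = {}, 0
for doc_key, amount in [("4711", 100),   # initial posting
                        ("4711", 120)]:  # changed posting: after image
    cube_total += ods_activate(ods, doc_key, amount)

# cube_total ends at 120 (the latest state), not 220: the ODS turned the
# second after image into a +20 delta that the cube can safely add.
```

Loading the two after images straight into an additive cube would have produced 220, which is exactly the double-counting the ODS layer prevents.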

One stage stop to know all about BW Extractors-Part2


This blog focuses on the behavior of customer-generated extractors (e.g. CO-PA and FI-SL).
Along with that, this blog also explains data recovery methods in case of a lost delta.
Please refer "One stage stop to know all about BW Extractors-Part1" to get an idea on BW content extractors.
Note: With regard to generic extraction, only helpful links are provided.
Application-specific customer-generated extractors:
Controlling:
Controlling is broken down into the following sub-modules:
  • Cost Element Accounting
  • Cost Center Accounting
  • Internal Orders
  • Activity-Based Costing (ABC)
  • Product Cost Controlling
  • Profitability Analysis
  • Profit Center Accounting
Note: Only CO-PA is discussed briefly because of the complexity of the area.
CO-PA:
Profitability analysis allows Management to review information with respect to the company's profit or contribution margin by business segment. 
It can be obtained using the following methods:
  • Account-Based Analysis
  • Costing-Based Analysis
Note: The details will be discussed after understanding the CO-PA data flow.

How does the data extraction happen?
When data is requested from SAP BW, the extractor determines which data source the data is to be read from. This depends on:
  • the update mode (full, initialization of the delta method, or delta update)
  • the definition of the DataSource (line item characteristics (apart from field REC_WAERS) or calculated key figures)
  • the available summarization levels.
The extractor always tries to select the most appropriate data source, that is, the one with the smallest data volume.

Once an InfoPackage is executed, the SAP BW Staging Engine calls the CO-PA transaction data interface. The CO-PA extraction program for SAP BW uses the same replication method as the update program for CO-PA when updating summarization levels. On the BW side, only data that is "at least 30 minutes old" is received. This secures data integrity, because the time stamps from different application servers can differ slightly.
This retention period of 30 minutes is often described as a "security delta" or "safety delta": the system only extracts data that is at least 30 minutes old.
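The safety delta can be pictured with a small sketch; the function names here are illustrative assumptions, not CO-PA code:

```python
from datetime import datetime, timedelta

SAFETY_DELTA = timedelta(minutes=30)

def selection_upper_limit(upload_start):
    """Upper time stamp limit used for the selection: half an hour back."""
    return upload_start - SAFETY_DELTA

def is_extractable(record_ts, upload_start):
    # Only records at least 30 minutes old at the start of the upload are
    # selected, absorbing small clock differences between app servers.
    return record_ts <= selection_upper_limit(upload_start)

start = datetime(2012, 10, 16, 12, 0)
old_enough = is_extractable(datetime(2012, 10, 16, 11, 25), start)  # 35 min old
too_fresh = is_extractable(datetime(2012, 10, 16, 11, 45), start)   # 15 min old
```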

Account-Based Analysis
For account-based CO-PA extraction, only Full Update from summarization levels is supported for releases up to and including Release PI2001.1.
In this case we can carry out a delta using a pseudo-delta technique: perform selective full loads based on selection conditions (e.g. fiscal period), then selectively drop the requests for the last period and reload the data that has changed.
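A rough sketch of this pseudo-delta cycle, with made-up dict-based structures standing in for the target and source (illustrative only):

```python
def pseudo_delta(target, source, period):
    """Drop the requests for the period that may still change, then
    reload that period in full from the source."""
    target.pop(period, None)                       # drop the old requests
    target[period] = list(source.get(period, []))  # selective full reload
    return target

# Period 2012.009 was already loaded; a new posting arrived in the source since.
target = {"2012.009": [("CUST1", 100)]}
source = {"2012.009": [("CUST1", 100), ("CUST2", 50)]}
pseudo_delta(target, source, "2012.009")
```

The drop-before-reload step is what keeps the target free of duplicates even though every run is technically a full load of the selected period.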
From Release PI2001.2, the delta method can also be used.
Initialization: The initialization must be performed from a summarization level.
Delta update: Delta will be read from line items.
During the delta load, the controlling area and fiscal period fields are mandatory selections.
Note: If the data needs to be read from a summarization level, then the level must also contain all the characteristics that are to be extracted using the DataSource (entry * in maintenance transaction KEDV). Furthermore, the summarization level must have the status ACTIVE.
Account-based CO-PA is part of the CO module. This means the data posted in account-based CO-PA is always in sync with the CO module (CCA, OPA, PA, PS, etc.).
The CO tables are COEP and COBK (line items), and COSS and COSP (totals).

Costing-Based Analysis:
In the case of costing-based CO-PA, data can only be read from a summarization level if no characteristics of the line item are selected apart from the Record Currency (REC_WAERS) field, which is always selected.
An extraction from the segment level, that is, from the combination of the tables CE3XXXX / CE4XXXX (where XXXX stands for the operating concern), is only performed for Full Updates if no line item characteristics are selected (as with summarization levels).
Initialization: There are two possible sources for the initialization of the delta method: summarization levels (if no characteristics of the line item are selected) or the line item level.
In the case of a summarization level, the system also records the time when the data was last updated/built.
 If it is not possible to read data from a summarization level, data is read from line items instead.
Delta update: Data is always read from line items.
Costing-based CO-PA data is statistical data. This means that the update of CO-PA is not always equal to what is stored in the CO modules or in Finance. The cost element is also not always updated, and there are additional key figures used to store information about the type of costs or revenues.
To understand the various tables (CE1/CE2/CE3/CE4) involved in CO-PA extraction, please read "BW data extraction".
CO-PA Delta Mode:
Extraction is based on Timestamp.
When data is extracted from CO-PA, a "safety delta" of half an hour is used with the initialization and the delta upload. This always ensures that only records that are already half an hour old since the start of the upload are loaded into SAP BW. Half an hour was chosen as the safety delta to overcome any time differences between the clocks on the different application servers.
Please check the below links for more information:
  
FI-SL:
There are two types of ledgers in the FI-SL System:
Standard Ledger: Delivered by SAP, e.g. General Ledger Accounting (FI-GL)
Special Purpose Ledgers: These are designed as per business needs (user-defined, e.g. FI-SL)
The FI-SL DataSource can supply data both at totals record level and at line item level.

How does the data extraction happen?
Prerequisite:
Since FI-SL is a generating application, the Data Source, transfer structure and assignment of the DataSource to the InfoSource must be created manually.
FI-SL line items:
Line item Data Source provides actual data at line item level.
Full and delta mode: FI-SL line items can be extracted in both full and delta upload mode. The time stamp (TIMSTAMP field in the extract structure) is used to identify the delta, and is supplied from the CPUDT and CPUTM fields in the line items table. A safety delta of one hour is used, which means posted line items can be loaded into BW only after an hour.
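As a rough illustration of the TIMSTAMP derivation and the one-hour safety delta (the function names are assumptions, only the field names come from the text):

```python
from datetime import datetime, date, time, timedelta

SAFETY_DELTA = timedelta(hours=1)

def line_item_timestamp(cpudt, cputm):
    """Combine CPUDT (posting date) and CPUTM (posting time) into one
    time stamp, as the TIMSTAMP field in the extract structure is."""
    return datetime.combine(cpudt, cputm)

def is_extractable(cpudt, cputm, now):
    # A posted line item only becomes extractable one hour after posting.
    return line_item_timestamp(cpudt, cputm) <= now - SAFETY_DELTA

posted = (date(2012, 10, 16), time(9, 0))
early = is_extractable(*posted, datetime(2012, 10, 16, 9, 30))   # 30 min later
later = is_extractable(*posted, datetime(2012, 10, 16, 10, 30))  # 90 min later
```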
Constraint:
The extract structure does not contain the BALANCE field. Refer to note 577644 for alternative ways to populate this field.
FI-SL Totals Records:
This DataSource can provide both actual and plan data at totals record level.
Full update: The full update DataSource can be used to determine the balance carry forward, since the line items DataSource does not supply this.
Usually, plan data is transferred using the totals DataSource in full update mode.
Delta update: The delta method can only be used for actual data, with the selection 0VTYPE = 010. The delta method is based on delta queue technology; that means that after the initialization, the relevant data is posted to the delta queue during updating.
Before running the delta, please check the restrictions in the below link

  

Part 3: Cross-application - Generic extractors
When none of the SAP-predefined extractors meets the business demand, the choice is to go for generic extraction.
We will go for generic extraction:
  1. When Business Content does not include a DataSource for your application.
  2. When Business Content requires additional enhancements needing data that the standard DataSources do not supply.
  3. When the application does not feature its own generic data extraction method.
  4. When the requirement demands using your own programs to fill your tables in the SAP system.
Check the below link for more information:

  

Data recovery:
Scenario 1: The last delta run failed (not applicable to ALE-based DataSources)
Solution:
Make the QM status red and delete the request from all targets.
Re-schedule the load; this time it will prompt a window as shown below.
[image: prompt offering to repeat the failed request]

Click on "Request again"; it will recover the failed request.
Scenario 2: The daily delta was running fine, but you suddenly find that the delta is missing for a certain period (the reason may be anything).
Solution:
1. Reload data from the PSA
2. Reload data from an ODS object or an InfoCube (in a layered architecture, EDW approach)
Applicable to Logistics:
Please refer "One stage stop to know all about BW Extractors-Part1" to get an idea on Logistics extraction .
If options 1 and 2 are not applicable, the only choice is to extract the data from the source system.
Check this OSS note - 691721: Restoring lost data from a delta request.
Here again we have one more constraint.
As explained in the above OSS note, because of the huge data volume we cannot bear the downtime of a re-initialization, so we have a workaround here:
1. In BW, transfer the existing target contents to an external source using Open Hub services.
2. Then selectively fill the setup tables for the missing data for the respective duration.
3. Run the initialization and schedule V3 jobs to enable delta postings.
(There is a drawback here: since we are deleting the setup tables and refilling them using selections, the setup tables won't contain the entire historical data.)
Further, check this interesting document - 602260: Procedure for reconstructing data for BW.
In the case of an ODS, you can go for a repair full load (739863: Repairing data in BW).
Scenario 3:
Good data was accidentally deleted, and consequently all the data that was loaded later was deleted as well. (Assuming no further data marts and no aggregates, to avoid complexity; if they are considered, the solution is more dynamic and situational.)
Check these links for more intricate details on handling the above situation:

One stage stop to know all about BW Extractors-Part1

[image: overview of the various extractors involved in data acquisition]
What do you feel after having a glance at the above snapshot?
Wait a minute, I know what you guys are thinking...
A few of us will say immediately, without any second thought, that it is a graphical representation of the various extractors involved in data acquisition.
Great! We got the answer.
But if the image still dwells in your mind and motivates you to know more about how the various extractors behave, or how the extraction happens...
This blog was created to answer exactly that.

Note: Please refer to "One stage stop to know all about BW Extractors-Part2" to know how customer-generated extractors behave.

An extractor, in simple terminology, is used for extracting data from various sources to BW.
For this purpose we have SAP pre-defined extractors (LO extraction, etc.) and customized extractors (generic extractors).

  

Application specific BW content extractors:
LO Extraction:
Logistics refers to the process of getting a product or service to its desired location upon request which involves transportation, purchasing, warehousing etc.
Main areas in logistics are: 
Sales and Distribution (SD)          :  application 11, 13, 08 (in LBWE T-code)
Materials Management (MM)        :  application 03, 02
Logistics Execution (LE)               :  application 12
Quality Management (QM)            :  application 05
Plant Maintenance (PM)              :  application 04, 17
Customer Service (CS)               :  application 18
Project System (PS)                   :  application 20
SAP Retail                                    :  application 40,43,44,45


How does the data extraction happen?
Extraction can be done using either full update or delta update.

Full load: In the case of the logistics applications, a full load/initialization extracts the data from the setup tables (which contain only historical data).
So if you have decided to go for a full load, wait a minute: there is a roadblock.
For a full update the data is taken from the setup tables, so in order to capture changes you would need to refill the setup tables every time, which is a laborious task.
So it is always advisable to go for delta loads, which make the loading life easier.
Read the below note to get details on delta loads:

Initialization: Data is fetched from the application tables into the setup tables (in LO extraction, the extractor does not allow direct communication with the application tables), and from there the data finally reaches the target (InfoCube/ODS). Remember, this process is a one-time activity.
Pre-requisites: Prior to initialization make sure the following steps are completed:
  1. Maintain Extract Structure
  2. Maintain data sources
  3. Activate Extract Structure
  4. Delete Setup tables
  5. Fill  setup tables

Delta load: After a successful initialization, we can use the delta update to capture changed/new records.
Once a new transaction happens or an existing record is modified, upon saving it goes to the respective application table. From there, depending on the update mode (see "LOGISTIC COCKPIT DELTA MECHANISM - Episode three: the new update methods"), the data is populated into the delta queue (RSA7) and finally reaches BW.


[image: LO delta update flow to the delta queue]

Pre-requisites: Prior to delta loads make sure the following steps are completed:
1. Define periodic V3 update jobs. 2. Set up the update mode (direct / queued / unserialized V3 update).

LO - Delta Mode:
The InfoObject 0RECORDMODE helps in identifying the delta.
Check the field DELTA in the ROOSOURCE/RODELTAM tables; in the case of LO extraction it is "ABR".
ABR: An after image shows the status after the change; a before image shows the status before the change with a negative sign; and a reverse image likewise carries a negative sign, marking the record for deletion. This method serializes the delta packets and supports updates into an ODS object as well as into an InfoCube.
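A toy sketch of how ABR images update an additively managed target. The 'X' record mode for the before image is standard 0RECORDMODE usage; the field names and structures are illustrative:

```python
def change_images(key, old_amount, new_amount):
    """A change produces a before image (old value negated) plus an after
    image (new value)."""
    return [
        {"key": key, "recordmode": "X", "amount": -old_amount},  # before image
        {"key": key, "recordmode": "", "amount": new_amount},    # after image
    ]

def apply_to_cube(cube, images):
    # An InfoCube simply adds key figures, so the before image cancels the
    # previously loaded value and the after image adds the new one.
    for img in images:
        cube[img["key"]] = cube.get(img["key"], 0) + img["amount"]
    return cube

cube = {"ORDER42": 100}  # value loaded with an earlier delta
apply_to_cube(cube, change_images("ORDER42", 100, 130))
```

Because the image pair is purely additive, ABR deltas can be posted straight into an InfoCube, unlike the after-image-only (AIE) method used in FI.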
  
FI extraction:
FI Module deals with accounting and financial needs of an organization.
Financial Accounting is broken down into the following sub-modules:

  • Accounts Receivables
  • Accounts Payable
  • Asset Accounting
  • Bank Accounting
  • Consolidation
  • Funds Management
  • General Ledger
  • Special Purpose Ledger
  • Travel Management
Note: Only the key areas (AP/AR/GL/SL) are discussed briefly because of the complexity of the area.

We can extract the financial data at totals level or line item level.
In general, we will use R/3 line item tables as the data source for extracting the data to allow drill down capability from summarized data to line-item details.
Financial Accounting data can be extracted directly from the tables.
Depending on the business requirement we can use either FI-SL or standard BW content extractors (FI-AR, FI-AP, and FI-GL) to fetch FI data.

Note: FI-SL will be discussed under "One stage stop to know all about BW Extractors-Part2", which explains application-specific customer-generated extractors.

FI-AR, FI-AP, and FI-GL:
General Ledger: All accounting postings are recorded in the General Ledger. These postings are made in real time to provide up-to-date visibility of the financial accounts.
Accounts Receivable: Accounts Receivable records all account postings generated as a result of customer sales activity. These postings are automatically updated in the General Ledger.
Accounts Payable: Accounts Payable records all account postings generated as a result of vendor purchasing activity. Automatic postings are generated in the General Ledger as well.

Standard FI data sources:
0FI_GL_4 (G/L Accounts- line items)
Takes the data from the FI document tables (BKPF/BSEG) that are relevant to general ledger accounting (compare table BSIS).
0FI_AP_4 (AP - line items) and 0FI_AR_4 (AR - line items)
Selections are made from tables BSID/BSAD (Accounts Receivable) and BSIK/BSAK (Accounts Payable).

How does the data extraction happen?
In FI extraction, 0FI_AR_4 and 0FI_AP_4 are linked with 0FI_GL_4 in order to maintain consistent data transfer from the OLTP system (this is called coupled data extraction; see OSS note 428571).
Note: "Uncoupled" extraction is possible with Plug-In PI 2002.2; see OSS note 551044.

0FI_GL_4 writes the entries into the time stamp table BWOM2_TIMEST in the SAP R/3 System with a new upper limit for the time stamp selection.
0FI_AP_4 and 0FI_AR_4 then copy this new upper limit for the time stamp selection during their next data extraction in the SAP R/3 System. This ensures proper synchronization of accounts payable and accounts receivable accounting with G/L accounting.
Full load: Not a valid choice because of large volumes of detailed R/3 transaction data.

Delta load:
Note: Here the delta identification process works differently for new financial records and for changed financial records.
New Financial Accounting line items posted in the SAP R/3 system are identified by the extractor using the time stamp in the document header (table BKPF, field CPUDT).
By scheduling an initialization InfoPackage, all the historical data can be loaded into BW from the application tables. This also sets the "X" indicator in the field LAST_TS (flag: 'X' = last time stamp interval of the delta extraction), which marks the most recent time stamp interval read.
 
OLTPSOURCE  AEDAT/AETIM         UPD    DATE_LOW     DATE_HIGH     LAST_TS
0FI_GL_4    16 May 2007/20:15   Init   01 Jan 1990  15 May 2007
0FI_GL_4    24 May 2007/16:59   delta  16 May 2007  23 May 2007
0FI_GL_4    21 June 2007/18:12  delta  15 June 2007 20 June 2007  X
0FI_AP_4    18 May 2007/21:23   Init   01 Jan 1990  15 May 2007
After this, daily delta loads can be carried out based on the time stamp by scheduling delta InfoPackages.
During the delta load, the SAP R/3 system logs two time stamps that delimit a selection interval for a DataSource in table BWOM2_TIMEST (fields TS_LOW and TS_HIGH).
In case of changed FI documents, selections will be based on tables:
BWFI_AEDAT and (timestamp table) BWOM2_TIMEST (See OSS note 401646 for more details).
Delta extraction using the delta queue method is also possible in case we want:
  •  Serialization of the records
  •  To distribute delta records to multiple BW systems.

FI - Delta Mode:
A time stamp on the line items serves to identify the status of the delta. Time stamp intervals that have already been read are then stored in a time stamp table (BWOM2_TIMEST).
(The InfoObject 0RECORDMODE plays a vital role in handling deltas. Check the field DELTA in the ROOSOURCE/RODELTAM tables to identify the image type.)
The Financial Accounting line items are extracted from the SAP R/3 system in their most recent status (after-image delta method).
AIE: This delta method is not suitable for filling InfoCubes directly in the BW system. The line items must therefore first be loaded into an ODS object, which identifies the changes made to individual characteristics and key figures within a delta data record. Other data targets (InfoCubes) can then be supplied with data from this ODS object.
It uses delta type E (pull), which means the delta data records are determined during the delta update by the DataSource extractor, written to the delta queue, and passed on to BI directly from there.

Check the below helpful links:
  
CRM extraction:
Customer relationship management (CRM) is broadly about managing relationships with customers, and is useful for analyzing customer, vendor, partner, and internal process information.

How does the data extraction happen?
We can do both full load and delta load depending on the CRM extractor behavior.
Initialization:
During the initialization, all data that can be extracted using a data source is transferred from SAP CRM into SAP BW.
[image: delta initialization data flow]
  • Execute the initialization of the delta process in SAP BW by creating and scheduling an Info Package.
  • SAP BW calls up the BW Adapter using the Service API.
  • The BW Adapter reads the data from the respective database.
  • The selected BDoc data is converted into the extract structure by a mapping module that is also entered in the BW Adapter metadata.
  • The type of Business Add-In (BAdI) that is called up by the BW Adapter depends on the BDoc type.
  • The requested data package is transferred to SAP BW using the Service API.
Delta load:

[image: delta load data flow]

  • Any new postings or updates of old postings from the source system (CRM) side are communicated via the Middleware in the form of a BDoc.
  • The flow controller for Middleware calls up the BW Adapter. 
  • The BW Adapter first checks whether the change communicated via the BDoc is relevant for SAP BW. A change is relevant if a Data Source for the BDoc is active.
  • If the change is not relevant, it is not transferred to SAP BW and the process is complete.
  • If it is relevant, the BW Adapter calls up the corresponding mapping module and BAdI (the type of BAdI that needs to be called up depends, in turn, on the type of BDoc).
  • Finally, these help convert the BDoc data into the extract structure.
Note: The mapping module and the BAdIs that are called up during the delta upload are the same as those called up during the initialization of the delta process.
The change is transferred to SAP BW using the Service API.
CRM-Delta Mode:
The delta is identified and communicated via the Middleware, in the form of a BDoc, to the BW Adapter.
CRM standard DataSources support AIMD (after images with deletion indicator, via the delta queue).
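A toy sketch of how AIMD records could be applied to an ODS object: after images overwrite on the record key, and a deletion indicator removes the record. The 'D' record mode and the field names are assumptions for illustration:

```python
def apply_aimd(ods, record):
    """Apply one AIMD delta record to a dict standing in for an ODS."""
    key = record["key"]
    if record.get("recordmode") == "D":  # deletion indicator: drop the record
        ods.pop(key, None)
    else:                                # after image: overwrite latest state
        ods[key] = record["status"]
    return ods

ods = {}
apply_aimd(ods, {"key": "LEAD1", "status": "open"})
apply_aimd(ods, {"key": "LEAD1", "status": "won"})    # after image overwrites
apply_aimd(ods, {"key": "LEAD1", "recordmode": "D"})  # deletion removes it
```

As with the FI after-image method, this only works against a target that overwrites by key, which is why the CRM deltas are staged in an ODS object first.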
  
HR extraction:
The HR module enables customers to effectively manage information about the people in their organization, and to integrate that information with other SAP modules and external systems.
HR broadly has the following modules:
PA (Personnel Administration) and Organization Management
Personnel Development
Payroll Accounting
Time Management
Compensation
Benefits
Training and Events
The Personnel Administration (PA) sub-module helps employers track employee master data, work schedules, salary and benefits information. Personnel Development (PD) functionality focuses on employees' skills, qualifications and career plans. Finally, the Time Evaluation and Payroll sub-modules process attendance and absences, gross salary and tax calculations, and payments to employees and third-party vendors.
HR delivers a rich set of business content objects that covers all HR sub-functional areas.

How does the data extraction happen?
Before getting into how the data gets populated into the HR InfoCubes, let's understand the term "infotype":
"An infotype is a collection of logical and/or business-related characteristics of an object or person."
Here the data is extracted from infotypes (PA, PD, Time Management, etc.), and for a few other applications from cluster tables (Payroll, Compensation, etc.).
HR is basically master-data-centric, because it always relates to people-related InfoObjects such as Employee and Person. In most cases, HR master data is defined as time-dependent to enable historical evaluation. The HR R/3 system records a specific validity period for each infotype.

Procedure to extract the HR data:
  • Activate Data Sources in the Source system (R/3)
  • Replicate DataSources in the BW system.
  • Activate business contents in BW.
  • Populate HR cubes with data by scheduling info packages.
Note: Master Data should be loaded first
Except for payroll and time management, all other sub-functional areas support only full loads.
In the case of full loads, the old data needs to be deleted first to avoid duplicate records in the target.
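That delete-and-reload pattern can be sketched as follows (the list-based target and names are illustrative, not the actual load mechanism):

```python
def full_load(target, new_request):
    """Replace the previous full request with the new full extract."""
    target.clear()              # delete the old full request first
    target.extend(new_request)  # then load the fresh full extract
    return target

# Two employees loaded yesterday; today's full extract has three.
target = [("EMP1", "0001"), ("EMP2", "0001")]
full_load(target, [("EMP1", "0001"), ("EMP2", "0001"), ("EMP3", "0001")])
```

Skipping the delete step would leave EMP1 and EMP2 in the target twice after the second full load, which is exactly the duplication the note above warns about.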
Check the below links for more information
To know about customer generated extractors, Please refer "One stage stop to know all about BW Extractors-Part2"