Wednesday, June 26, 2013

How to and what to check in your BW system before go-live

Before your BW system goes to production, please check the following points.

Transfer global settings. Go to Administrator Workbench -> Modeling -> Source Systems. Select an R/3 source system, right-click, and choose Transfer global settings.
Select Currencies and choose a transfer mode.
Simulation - The transfer of Customizing tables/exchange rates is simulated. No data is updated.
Update tables - The Customizing tables/exchange rates are collected from the source system. The new entries are updated on the database.
Rebuild tables - The Customizing tables/exchange rates are collected from the source system. The tables are rebuilt, old data is deleted.
Then Execute.
Repeat these steps for Units of measurement, Fiscal year variants, and Factory calendar for each R/3 source system in BW.

Transfer exchange rates. Go to Administrator Workbench -> Modeling -> Source Systems. Select an R/3 source system, right-click, and choose Transfer exchange rates.
Maintain Exchange rates and the transfer mode.
Simulation - The transfer of Customizing tables/exchange rates is simulated. No data is updated.
Update exchange rates - The Customizing tables/exchange rates are collected from the source system. The new entries are updated on the database.
Transfer exchange rates again - The Customizing tables/exchange rates are collected from the source system. The tables are rebuilt, old data is deleted.
Then Execute.
Repeat these steps for each R/3 source system in BW.

Check the read mode for queries.
For a query, the OLAP processor can read the data from the fact table in one of three ways:
Reading all of the data
When the query is executed in the Business Explorer, all fact table data needed for every possible navigational step of the query is read into the main memory area of the OLAP processor. All new navigational states are then aggregated and calculated from the data in main memory.
Reading the data on demand
The OLAP processor only requests the fact table data that is needed for the current navigational state of the query in the Business Explorer. New data is therefore read for each navigational step. The most suitable aggregate table is used and, if possible, the data is already aggregated on the database. The data for identical navigational states is buffered in the OLAP processor.
Reading on demand when expanding the hierarchy
When reading data on demand (2), the data for the entire - meaning completely expanded - hierarchy is requested for a hierarchy drilldown. For the read on demand when expanding the hierarchy (3), the data is aggregated by the database along the hierarchy and is sent to the start level of the hierarchy (highest node) in the OLAP processor. When expanding a hierarchy node, the children of the node are then respectively read on demand.
In general, reading data on demand (2) provides much better performance than reading all the data (1). This read mode should especially be considered for queries with many free characteristics. A query that contains two or more free characteristics from different dimensions (e.g. 'Customer' and 'Product') will probably only execute efficiently in this mode, because aggregates can only be used optimally when the data is read on demand.

For large hierarchies, aggregates should be created on the middle level of the hierarchy and the start level of the query should be smaller or the same as this aggregate level. For queries about such large hierarchies, the read on demand when expanding the hierarchy method (3) should be set.
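The difference between read modes (1) and (2) can be illustrated with a small sketch. This is plain Python, not SAP code; the toy fact table and field names are invented purely for illustration:

```python
# Hypothetical sketch contrasting read mode (1), "read all data",
# with read mode (2), "read on demand", on a toy fact table.
FACT_TABLE = [
    {"customer": c, "product": p, "sales": c * 10 + p}
    for c in range(1, 4) for p in range(1, 4)
]

def read_all():
    """Mode (1): fetch the whole fact table once; every later
    navigation step aggregates from this in-memory copy."""
    return list(FACT_TABLE)

class OnDemandReader:
    """Mode (2): fetch only the slice a navigation state needs and
    buffer it, so repeating the same state hits the buffer."""
    def __init__(self):
        self.cache = {}

    def read(self, customer):
        if customer not in self.cache:  # database hit only on first request
            self.cache[customer] = [
                r for r in FACT_TABLE if r["customer"] == customer
            ]
        return self.cache[customer]

reader = OnDemandReader()
assert len(reader.read(1)) == 3   # only 3 records fetched for this state
assert len(read_all()) == 9      # mode (1) pulls all 9 records up front
```

The sketch shows why mode (2) scales better: each navigation state touches only its own slice, which is also the access pattern that lets the database use aggregates.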
Set the recommended read mode. Execute transaction RSRT and enter =RMRP in the OK field. After pressing Enter, the read mode of your queries is set to the read mode recommended by SAP.

Switch off all system traces. Execute transaction ST01. The trace status should be set to "Trace switched off".

Check the trace tool. Go to transaction RSRTRACE and verify that no users are activated for logging. Users can be removed by selecting the user and choosing Deactivate user.

Check the BW reporting authorization check log. Go to transaction RSSM -> Authorization Check Log. Select a user to remove -> Remove User from List.
Maintain the extraction settings for each source system. In the R/3 source system, go to transaction SBIW -> General Settings -> Maintain Control Parameters for Data Transfer.
1. Source System
Enter the logical system of your source client and assign the control parameters you selected to it.
You can find further information on the source client in the source system by choosing the transaction SCC4. 

2. Maximum Size of the Data Packet
When you transfer data into BW, the individual data records are sent in packets of variable size. You can use these parameters to control how large such a data packet typically is. If no entry is maintained, the data is transferred with the default setting of 10,000 kBytes per data packet. The memory requirement depends not only on the data packet settings, but also on the size of the transfer structure and the memory requirement of the relevant extractor.
3. Maximum Number of Lines in a Data Packet
Upper limit for the number of records per data packet.
The default setting is 'Max. lines' = 100,000.
The maximum main memory requirement per data packet is approximately:
memory requirement = 2 * 'Max. lines' * 1000 bytes
i.e. 200 MBytes with the default setting.
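As a quick sanity check, the formula above can be evaluated directly (assuming 1 MByte = 1,000,000 bytes, which is what the 200 MByte figure implies):

```python
def packet_memory_bytes(max_lines, bytes_per_record=1000):
    """Maximum main memory per data packet:
    memory requirement = 2 * 'Max. lines' * 1000 bytes."""
    return 2 * max_lines * bytes_per_record

# Default setting: 'Max. lines' = 100,000
mem = packet_memory_bytes(100_000)
assert mem == 200_000_000           # 200 MBytes, as stated above
print(mem // 1_000_000, "MBytes")
```

Halving 'Max. lines' halves the per-packet memory footprint, which is the usual lever when extraction jobs run short of memory.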
4. Frequency
The specified frequency determines after how many data IDocs an Info IDoc is sent; in other words, how many data IDocs one Info IDoc describes.
Frequency 1 is set by default, meaning that an Info IDoc follows every data IDoc. In general, you should select a frequency between 5 and 10, but no higher than 20. The bigger the data IDoc packets, the lower the frequency setting should be. This way, during an upload you receive information on the current data load at reasonably short intervals.
With the help of every Info IDoc, you can check the BW monitor to see if there are any errors in the loading process. If there are none, then the traffic light in the monitor will be green. The Info IDocs contain information such as whether the respective data IDocs were uploaded correctly.
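The relationship between data IDocs and Info IDocs is simple arithmetic; a small sketch (not SAP code) makes the trade-off visible:

```python
import math

def info_idoc_count(data_idocs, frequency):
    """One Info IDoc is sent after every `frequency` data IDocs,
    so a load of N data IDocs produces ceil(N / frequency) Info IDocs."""
    return math.ceil(data_idocs / frequency)

# Frequency 1 (the default): an Info IDoc follows every data IDoc.
assert info_idoc_count(45, 1) == 45
# Recommended range 5-10: far fewer monitoring IDocs, while the
# BW monitor still gets status updates at regular intervals.
assert info_idoc_count(45, 10) == 5
```

A higher frequency means less monitoring overhead but longer gaps between status updates in the monitor, which is why values above 20 are discouraged.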

There are many more settings to check; I will try to upload a document covering them. Please write in the comments if you find new points. Thanks. :)

Tuesday, June 25, 2013

Snapshot Scenario on Stock Data with APD

I was recently asked to share my knowledge of how to implement a snapshot scenario on stock data out of 0IC_C03 with the Analysis Process Designer (APD). Here are the steps to take, with screenshots of how the objects have been implemented in our BW system (7.0) since 2009. All general information about the different approaches for analyzing stock data with cumulative and non-cumulative key figures is perfectly explained in the document "How to Handle Inventory Management Scenarios in BW (NW2004)" and will therefore not be repeated in this blog.

Snapshot Scenario with APD on Stock Data of 0IC_C03
  1. Create a query for selecting the data out of 0IC_C03 that you want to have in the snapshot cube


2. Implement a DSO (type: direct update) for the data of your snapshot query
3. Implement an analysis process with APD (transaction RSANWB) and include an ABAP routine and transformation for mapping source and target fields:

4.  Execute the analysis process to insert data to your DSO; Result log:
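The ABAP routine in the analysis process essentially stamps each selected stock record with the snapshot period before it is written to the direct-update DSO. A rough sketch of that idea in Python (field names such as CALWEEK and ZSTOCK are illustrative, not taken from the actual system):

```python
from datetime import date

def stamp_snapshot(records, snapshot_day=None):
    """Add the snapshot calendar week (YYYYWW) to every stock record,
    so each APD run appends a new, dated copy to the DSO."""
    snapshot_day = snapshot_day or date.today()
    year, week, _ = snapshot_day.isocalendar()
    calweek = f"{year}{week:02d}"
    return [{**r, "CALWEEK": calweek} for r in records]

stock = [{"MATERIAL": "M-01", "PLANT": "1000", "ZSTOCK": 120}]
snapshot = stamp_snapshot(stock, date(2013, 6, 25))
# 2013-06-25 falls in ISO week 26 of 2013
assert snapshot[0]["CALWEEK"] == "201326"
```

Because the DSO key includes the snapshot period, repeated runs build up the weekly history that the InfoCube later reports on.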


5. Implement a standard InfoCube and a dataflow from DSO to InfoCube

6. You can update your DSO and InfoCube with a process chain

Process chain: program to start the APD:
Process chain: InfoPackage to load data from the DSO to the InfoCube, with an OLAP variable for selecting the current week:

Tuesday, June 4, 2013

Infoset

Definition:
  1. InfoSets are a specific kind of InfoProvider.
  2. An InfoSet describes data sources that are defined, as a rule, as joins of DataStore objects, standard InfoCubes and/or InfoObjects (characteristics with master data).
  3. If one of the InfoObjects contained in the join is a time-dependent characteristic, the join is a time-dependent or temporal join.
  4. An InfoSet is a semantic layer over the data sources.
  5. Unlike the classic InfoSet, a BI InfoSet is a BI-specific view of data.
Usage:
  1. With activated InfoSets you can define Queries in the BI suite. 
  2. InfoSets allow you to analyze the data in several InfoProviders by using combinations of master data-bearing characteristics, InfoCubes and DataStore objects.
  3. The system collects information from the tables of the relevant InfoProviders. 
  5. When an InfoSet is made up of several characteristics, you can map transitive attributes and analyze this master data.
  6. For example, you can create an InfoSet using the characteristics Business Partner (0BPARTNER), Vendor (0VENDOR) and Business Name (0DBBUSNAME) and thereby analyze the master data.
  6. You can use an InfoSet with a temporal join to map periods of time. 
  7. With all other types of BI objects, the data is determined for the key date of the query; with InfoSets with a temporal join, however, you can specify a particular point in time at which you want the data to be evaluated.
  8. The key date of the query is not taken into consideration in the InfoSet.
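The temporal-join behavior described in points 6 to 8 amounts to a validity-interval lookup: the time-dependent master data is evaluated at a chosen point in time rather than at the query key date. A sketch (the vendor data and field names below are invented for illustration):

```python
from datetime import date

# Time-dependent master data: vendor -> purchasing group, with validity intervals
vendor_attrs = [
    {"vendor": "V1", "purch_grp": "001",
     "datefrom": date(2010, 1, 1), "dateto": date(2012, 12, 31)},
    {"vendor": "V1", "purch_grp": "002",
     "datefrom": date(2013, 1, 1), "dateto": date(9999, 12, 31)},
]

def attribute_at(vendor, eval_date):
    """Temporal join: pick the master-data record whose validity
    interval contains the chosen evaluation date."""
    for row in vendor_attrs:
        if row["vendor"] == vendor and row["datefrom"] <= eval_date <= row["dateto"]:
            return row["purch_grp"]
    return None  # no record valid at that point in time

assert attribute_at("V1", date(2011, 6, 1)) == "001"
assert attribute_at("V1", date(2013, 6, 1)) == "002"
```

The same vendor yields different attribute values depending on the evaluation date, which is exactly what a temporal join makes available to the query.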
Creation of InfoSets
The step-by-step procedure for creating InfoSets is described below.
Step 1:
Go to InfoProvider tree of the Modeling function area in the Data Warehousing Workbench.
Or RSA1 -> InfoProvider.
Under your InfoArea -> right-click -> Create InfoSet


Step 2
Enter the following descriptions for the new InfoSet:
  • Technical name
  • Long name
  • Short name (optional)
In the Start with InfoProvider section, you determine which InfoProvider you want to use to start defining the 
InfoSet.
Select one of the object types that the system offers you:
  • DataStore object
  • InfoObject
  • Standard InfoCube
If you want to choose an InfoObject, it must be a characteristic with master data. The system provides you 
with the corresponding input help. 
In our example, we have selected a DSO ZAD_DSO1


Choose Continue.
Step 3:
The first time you call the InfoSet Builder, you can choose between two display modes:
  • Network (DataFlow Control)
  • Tree (TreeControl)
While the network display is clearer, the tree display can be read by a screen reader and is therefore suitable for visually impaired users.
You can change this setting at any time using the menu path Settings -> Display.
The Change InfoSet screen appears.
Note: If you want to create a new InfoSet you can also use transaction RSISET to call the InfoSet Builder. 
Step 4:
Now save & activate the InfoSet.


Do’s and Don’ts for InfoSets
  • Do not use more than 10 InfoProviders in one InfoSet. It is better to create multiple InfoSets depending on reporting needs.
  • Do not use more than 10 joins in one InfoSet (especially if you expect a high data volume).
  • InfoSet queries can be used for DataStore objects without the activated BEx Reporting indicator.
  • Do not use calculations before aggregation on InfoSet because this may lead to wrong query results.
  • If there are InfoSets with time-dependent master data, do not restrict the data by the fields Valid from (0DATEFROM) and Valid to (0DATETO). 
Data Models Using Infosets:
  • InfoSets are InfoProviders that logically join data and provide this data for BI queries. 
  • InfoSets only reference basic InfoProviders (InfoCubes, DataStore objects, master data InfoObjects), but they contain no data. 
  • All the BEx and OLAP services are available (authorizations, texts, variables, hierarchies, calculated key figures) except navigational attributes of InfoSet characteristics.
  • In the InfoSet maintenance, you can make field descriptions unique for the BEx user and hide fields of the basic InfoProviders that are not important for reporting.
When to use InfoSets?
  1. To join required data from basic InfoProviders. This allows you to build a relational BI data model with unified views for reporting (several InfoProviders, but only one view). Therefore, we recommend keeping data in smaller, basic InfoProviders that can be flexibly joined for reporting purposes.
  2. To allow BEx Reporting on a DataStore object without turning the BEx Reporting indicator on
  3. To evaluate time dependencies (for example, join time dependent master data InfoObjects)
  4. To be able to create self joins and left outer joins
Join Concepts:
  • Inner join: A record can only be in the selected result set if there are entries in both joined tables
  • Left outer join: If there is no corresponding record in the right table, the record is part of the result set (fields belonging to the right table have initial values)
  • Temporal join: A join is called temporal if at least one member is time-dependent.
  • Self join: The same object is joined with itself
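The inner and left outer join semantics above can be demonstrated with a small sketch; the order/vendor data and field names are invented for illustration:

```python
orders = [
    {"doc": "4711", "vendor": "V1"},
    {"doc": "4712", "vendor": "V3"},   # V3 has no master-data record
]
vendors = [
    {"vendor": "V1", "name": "ACME"},
    {"vendor": "V2", "name": "Globex"},
]

def inner_join(left, right, key):
    """A record appears in the result only if both tables match."""
    lookup = {r[key]: r for r in right}
    return [{**l, **lookup[l[key]]} for l in left if l[key] in lookup]

def left_outer_join(left, right, key, initial=""):
    """Unmatched left records survive; right-hand fields stay initial."""
    lookup = {r[key]: r for r in right}
    fields = {k for r in right for k in r if k != key}
    return [
        {**{f: initial for f in fields}, **l, **lookup.get(l[key], {})}
        for l in left
    ]

assert len(inner_join(orders, vendors, "vendor")) == 1    # only V1 matches
outer = left_outer_join(orders, vendors, "vendor")
assert len(outer) == 2                                    # V3 row survives
assert outer[1]["name"] == ""                             # initial value
```

The initial value in the unmatched row is also why restricting on fields of the right-hand table can silently turn a left outer join back into an inner join.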
Performance Aspects of InfoSets:
  • InfoSets do not have the same set of performance tools as InfoCubes (such as aggregates, partitioning, and compression).
  • Use left outer joins in InfoSets only when necessary. A left outer join has a significantly worse performance than a corresponding inner join.
  • If your reporting requirements on a DataStore Object are very restricted (that is, you want to display only very few, selective records), use an InfoSet on top of the DataStore object and disable the BEx Reporting indicator. This results in better data loading performance, but also in worse performance at BI query runtime if more than 10 records are selected from the DataStore Object.