BI Content How-to and FAQs for GRC RM-PC-FN 3.0

Table of Contents:
1. Introduction
2. Technical Requirements
3. Instructions and Information:
   I. Installation Steps:
      A) PC/RM Installation (perform installation and listed steps)
      B) BW Installation (perform installation and listed steps)
      C) Customizing Extraction in PC/RM (perform steps 1-5)
      D) Replicate DataSources in BW (perform listed steps)
      E) Activate Business Content in BW (perform listed steps)
   II. Post-Installation Steps:
      F) Confirm Business Content activation in BW (perform listed steps)
      G) Follow-up to activation in BW (perform steps 1-5)
      H) Load Supporting Data in BW (prior to loading PC/RM data into BW)
   III. BW Maintenance & Operation:
      I) Loading PC/RM Data into BW (perform steps 1-3)
      J) Executing BW Queries and Web Templates (mostly informational)
   IV. Helpful Information:
      K) Known Problems and Limitations (apply Notes if issues arise)
      L) Frequently Asked Questions

Introduction:
This document pertains to BI Content for PC 3.0 and RM 3.0. The content for PC 2.5 and RM 2.0 is completely different from a model perspective than the content for PC/RM 3.0; therefore, an upgrade from the older content versions to 3.0 is not possible. The BI Content model as delivered can be installed and connected to either PC or RM, or to both applications. You do not need both installed and configured in the backend: the content will work if it is connected only to PC or only to RM, even if some or all of the BI objects pertaining to the non-connected application are activated. However, the delivered content is designed for an integrated scenario, which means that if both PC and RM are to be connected to BI, then both should be installed on the same client. If they reside on different systems or clients, workarounds are possible but not supported. BI extracts the PC/RM data from the datamart in PC/RM, so the datamart needs to be loaded prior to any BI extracts. The BI Content contains extractors for the PC and RM data based on timeframes.
The desired timeframes and selected frequency need to be specified in the PC/RM system prior to BI extractions. Delivered process chains in the BI system are used to execute the proper flow of the extracts. A timeframe can be updated using subsequent extracts. There is no delta mechanism, so all records for the timeframe will be extracted each time, but the receiving objects in BI are set to "overwrite" to avoid duplicating the data. So, in effect, an update to the existing data (not an append) will occur. However, a full timeframe refresh option is also available from the process chains, since subsequent timeframe extracts will not remove any data from BI that has been deleted in PC/RM. Numerous queries and web templates have been delivered with the content. Analysis authorization has also been delivered, based on objects 'Organizational Unit' and 'Organization Unit in Regulation', but some setup is involved. Several "jumps" from BI queries to reports in the PC/RM system have also been delivered, but some configuration is required. These steps are all described in this document.

Technical Requirements:
Required PC/RM system configuration: minimum GRC PC/RM 3.0 SP02 (however, SP03 is strongly recommended)
Recommended BI system configuration: SAP_BW 7.0 SP22, BI_CONT 7.04 SP04 or higher

Instructions and Information:

Installation Steps:

A) PC/RM System
Install minimum GRC PC/RM 3.0 SP02 in the GRC system (however, SP03 is strongly recommended).

B) BW System
Install SAP_BW 7.0 SP22 and BI_CONT 7.04 SP04 or higher in the BW system. Create a BW source system, including the necessary RFC connection to the PC/RM backend system.
Steps and prerequisites involved on the BW and PC/RM systems can be found in the online help under 'Creating SAP Source Systems' in:
http://help.sap.com/saphelp_nw70/helpdata/en/ac/4a4e38493e4774e10000009b38f889/content.htm

C) Customizing Extraction
The following configuration needs to occur in the PC/RM source system to enable integration with BW:

1) Maintain Timeframe Frequencies: In this Customizing activity, you specify the frequencies to be used in the Process Control and Risk Management applications. Beyond the standard frequencies defined in section Standard Settings below, other reasonable user-defined frequencies might be "quarterly", "semi-annually" or other non-calendar based cycles.
Path: SPRO > SAP Reference IMG > GRC > Process Control or Risk Management > General Settings > Key Attributes > Maintain Timeframe Frequencies

2) Maintain Timeframes: In this Customizing activity, you define the timeframes to be used in the Process Control and Risk Management applications. Timeframes can be considered specific named time buckets within one year (such as "January" or "Week 24") and need to be defined to cover the entire year. Each timeframe is assigned to one previously defined frequency and is then combined with defined offset settings to create a user-defined timeframe.
Path: SPRO > SAP Reference IMG > GRC > Process Control or Risk Management > General Settings > Key Attributes > Maintain Timeframes

3) Transfer (activate) Business Content DataSources in the PC/RM source system (transaction SBIW):
Note that these steps need to be performed in both systems if PC and RM are run as separate applications. This will be necessary during the business content activation in BW as described below in step E.
Activities:
Execute the function 'Transfer Application Component Hierarchy' under 'Business Content DataSources' (alternatively, transaction RSA9 can be used). Execute the function 'Transfer Business Content DataSources' under 'Business Content DataSources' (alternatively, transaction RSA5 can be used). Transfer the delivered PC/RM DataSources into the active version. The DataSources are located under application components GRC-PCRM30 and optionally also GRC-DP (DataSources for Process Control Events).

4) Maintain BI Extraction Settings (transaction GRFN_BI_TF_CUST)
In this activity you customize the timeframes and timeframe frequencies that will be used to extract data into the BI system. These settings serve as input to the data selection ABAP routines in the BI InfoPackages during extraction. However, the data to be extracted must also be filled in the datamart (also referred to as the report buffer).
Activities:
1. Go to transaction GRFN_BI_TF_CUST ('GRC BI Extraction Customizing') on the PC or RM system and make one and only one entry.
2. Specify appropriate values for the fields:
a. Frequency: Enter the frequency of the extracted data. The frequencies should have already been defined using transaction SPRO (SAP Customizing Implementation Guide > GRC > Process Control > General Settings > Key Attributes > Maintain Timeframe Frequencies).
b. Timeframe: Enter the starting extraction timeframe. The timeframes should have already been defined using transaction SPRO (SAP Customizing Implementation Guide > GRC > Process Control > General Settings > Key Attributes > Maintain Timeframes).
c. From Year: Enter the beginning of the time interval to be used to extract data. This field, in conjunction with the Timeframe field, determines the start of the extraction time interval. For example, if the Timeframe is set to M05 and the From Year is set to 2008, then during extraction into BI only data that is valid from 01 May 2008 onwards will be extracted.
d. To Year: Enter the end year of the extraction time interval. The end date of the extraction is determined by the frequency. If the end date based on the frequency runs into the year following the To Year value, then that date will also be considered (even though it does not belong to the To Year year).
e. Mixed frequencies: This feature is reserved for future use and is not currently available. This setting will enable "prefixing" the timeframe ID with the frequency.
f. Inconsistencies: It is recommended to keep this as "Inconsistencies logged as Error" (the default setting). Any extraction-time inconsistencies will be logged as either warnings or errors depending on this flag.
g. Do not extract special timeframes: Aggregation of Deficiencies (AoD) and Signoff data may belong to a different frequency and timeframe than specified in the above settings. Hence it is recommended to keep this unchecked so that AoD- and Signoff-relevant data is extracted as well.
h. Click the Save button to save your changes.

5) Maintain Datamart calculation (transaction GRFN_DM_MAINTAIN, Maintain Datamart)
Use this activity to create and maintain the data that will be used to load data into the BI system. The datamart is also referred to as the report buffer. If you need to extract data to BW, the corresponding snapshot has to be created in the datamart first; otherwise the extraction will issue a warning about missing data for the requested timeframe. If you need to reload a given timeframe into BW with the most recent data, you must update the corresponding snapshot in the datamart first.
Activities:
1. Creating and Filling Datamart Timeframes: Use transaction GRFN_DM_MAINTAIN for maintaining datamart timeframes.
2. Steps to create Datamart entries:
1. On your PC source system:
a. Ensure that the required timeframes and timeframe frequencies have been created using transaction SPRO.
b. Ensure the timeframe customizing settings have been appropriately configured using transaction GRFN_BI_TF_CUST.
2. Go to transaction GRFN_DM_MAINTAIN, select "Maintain Datamart", then click 'Execute' to enter the Datamart Log.
3. Select "Goto > Languages" from the menu options of the Datamart Log. Here you should maintain the languages that will be considered when extracting text elements into the datamart and, eventually, into the BI system. Once maintained, go back to the main screen of the Datamart Log.
4. Click on the "Create" button and enter the required information.
5. Depending on your reporting needs, specify the appropriate timeframe and year in the appropriate text boxes. Only one timeframe can be specified for each 'create datamart entry' created. *** The 'App. Component' field must always be "FN". Entering "PC" or "RM" will not work, but entering "FN" (for Foundation) will process PC, RM and FN data. Enter "FN" whether you have only PC, only RM, or both of them installed.
6. Click the "Save" button to create the new datamart entry.
7. In order to fill the datamart, select the datamart entry (or multiple entries) with status "Created" (yellow) and click on the "Upload Data mart" icon. Data will not be filled in entries with status "Completed" (green) or "Error" (red). Depending on the frequency setting defined in transaction GRPC_TF_CUST, the data will be organized and loaded into this datamart entry. The status of the scheduled upload will be set to "Completed" upon successful load of the data.
8. If you rerun the upload for an already loaded datamart entry (the 'Created' entry), the existing data in the datamart entry (the 'Completed' entry) will remain available for extraction to BI while the new upload is collecting the data. Once the new upload has finished collecting the data to be loaded into the datamart entry, the old data will be REPLACED with the new data. The status of the 'Created' entry will be temporarily changed to 'In Process' while the datamart is recalculated.
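The upload behavior described in steps 7 and 8 can be summarized in a simplified sketch (illustrative Python, not actual PC/RM code; the class, method names, and single-entry model are assumptions for clarity):

```python
# Simplified model of a datamart timeframe entry: the previous snapshot stays
# available for BI extraction while a new upload collects data, and is
# replaced only once collection finishes.

class DatamartTimeframe:
    def __init__(self, timeframe, year):
        self.timeframe = timeframe
        self.year = year
        self.snapshot = None       # data currently visible to BI extraction
        self.status = "Created"    # no data collected yet

    def run_upload(self, collect):
        # While the new run collects data, the old snapshot stays extractable.
        self.status = "In Process"
        new_data = collect()       # collection from PC/RM (stubbed here)
        self.snapshot = new_data   # old data REPLACED only at this point
        self.status = "Completed"

tf = DatamartTimeframe("M05", 2008)
tf.run_upload(lambda: ["2008 M05 records, v1"])
tf.run_upload(lambda: ["2008 M05 records, v2"])  # refresh replaces v1
assert tf.snapshot == ["2008 M05 records, v2"]
assert tf.status == "Completed"
```

This mirrors why a refresh of an already loaded timeframe is safe for concurrent BI extraction: the replacement is deferred until the new collection is complete.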
Another way to fill or refresh the datamart once the entries have been created is to go to transaction GRFN_DM_MAINTAIN and select either 'Fill Data Mart (Online)' or 'Fill Data Mart (Batch)'. This will fill or refresh all of the datamart entries in one step.

D) Replicate DataSources
After activating the delivered DataSources in the PC/RM source system (step 3 under Customizing Extraction) you must go to BW and replicate these DataSources. In BW, call transaction RSA1 and under the 'Modeling' tab select the 'Source Systems' window. Find the PC/RM source system (do this for both source systems if they are run as separate applications) by expanding the 'SAP' node under 'Source Systems', right-click on it, and choose 'Replicate DataSources'. Optionally, you can verify the replication of the DataSources by double-clicking on the PC/RM source system, which will take you to the 'DataSources' window. Navigate the nodes to find the 'GRC PC/RM 3.0 (GRC-PCRM30)' application component and check for the existence of the DataSources in the subnodes.

E) Activate Business Content in BW
SAP BI Content is configured to work with the delivered, integrated PC/RM scenario. SAP does not support any enhancements to, or special configuration of, the delivered PC/RM or BI product. Any suggestions given in this document are intended for customers using the products in ways not intended by the delivered design, and SAP will not support such customizations. PC and RM are separate applications with distinct business content that can be activated and run independently of one another. However, they share the same foundational objects, so you must activate all of the 0GRC_FN objects, along with the process chains 0GRC_PCRM30*, regardless of whether you are using the PC and/or RM content.
The business content was designed for the situation where PC and RM are truly integrated: both installed on the same client in one backend system and extracted using one BW source system. If you have a landscape where they are standalone applications residing in different clients or even different systems, then the values of the shared FN objects, such as Risk ID, could potentially collide. In this situation customization (similar to MDM) must occur; for example, it may be necessary to append the system to the key fields (entities) in order to keep the values from different systems distinct. However, such modifications are not supported by SAP. Additionally, during business content activation in these situations it is also necessary to select both source systems during the InfoArea activation as described below (the DataSources must be transferred in step C-3 and replicated in step D for BOTH source systems). Activating the process chains takes special care; one option, to be integrated in the steps below, is presented here:
1. Load ALL chains with the PC source system selected.
2. Delete chain GRC PC/RM 3.0: Load RM Targets.
3. Reactivate the following chains with the RM source system selected:
GRC PC/RM 3.0: Load Attribute Text
GRC PC/RM 3.0: Load Master Data Text
GRC PC/RM 3.0: Load Master Data Attributes
GRC PC/RM 3.0: Load RM Targets
GRC PC/RM 3.0: Reload
GRC PC/RM 3.0: Start Load Authorizations
GRC PC/RM 3.0: Start Load Hierarchies

Activating content: In order to activate the delivered business content, execute transaction RSA1, select the 'Business Content' menu, and proceed as follows:
- Select the 'Self-def'd' source system as well as the PC/RM source system(s)
- Choose Collection Mode 'Collect Automatically'
- Choose Grouping 'In Data Flow Before'
- Collect the following InfoObject Catalogs: 0GRC_FN30*, 0GRC_PC30*, 0GRC_RM30* and select 'Install'
- Collect the following InfoAreas: 0GRC_FN, 0GRC_PC30, 0GRC_RM30 and select 'Install'
- Choose Grouping 'Only Necessary'
- If separate source systems are used for PC and RM, follow the instructions at the top of this section (E). Otherwise, simply collect all 0GRC_PCRM30* Process Chains and select 'Install'
- Collect the following Roles: SAP_BW_GRC_FN30_ROLE, SAP_BW_GRC_PC30_ROLE, SAP_BW_GRC_RM30_ROLE and select 'Install' (this should include all associated queries and web templates)

Post-Installation Steps:

F) Confirm Business Content object activation
Optionally, you can call transaction RSA1 in BW and under 'Modeling' ensure that the objects indicated in help.sap.com and their corresponding sub-objects are all present and active, as follows:
- Under 'InfoObjects' check for the existence and active status of InfoObjects under the following InfoObject Catalogs: 0GRC_FN30_CHA01, 0GRC_FN30_KFY01, 0GRC_PC30_CHA01, 0GRC_PC30_KFY01, 0GRC_RM30_CHA01, 0GRC_RM30_KFY01.
- Under 'InfoProvider' check for the existence and active status of InfoProviders, Transformations, DTPs and InfoPackages under the following InfoAreas: 0GRC_FN, 0GRC_PC30, 0GRC_RM30.
- Call transaction RSPC and ensure that all Process Chains are present and active under 'GRC PC/RM 3.0 (Process Control and Risk Management 3.0)'.
- In the Business Explorer (BEx) use the Query Designer to ensure that all queries are active. You can drill down by roles to see queries by InfoProvider. The delivered roles are: SAP_BW_GRC_FN30_ROLE, SAP_BW_GRC_PC30_ROLE, SAP_BW_GRC_RM30_ROLE. These roles should be assigned to users so they can readily access the delivered queries from either the User Menu in BI or from BEx.
If any objects exist but are inactive, repeat the activation steps from 'Activate Business Content in BW' above. If some objects are still inactive after repeated attempts, reactivate them manually in the workbench. If any objects are missing, go to the BI Content tab and reactivate them from content according to the directions in the section above.

G) Follow-up to activation (to be done in BW)
1) According to Note 871132, ensure that InfoObject 0RTYPE (Exchange Rate Type) does NOT have the conversion routine 'ALPHA' specified in the 'General' tab. This could cause errors when executing queries containing currency conversion routines. Please follow the directions in the note carefully.
2) In order to initiate BW 7.0 Authorizations, the following InfoObjects need to be active in RSA1: 0TCAACTVT, 0TCAIPROV, 0TCAKYFNM, 0TCAVALID. Additionally, ensure that the delivered objects 0GFN_OU and 0GPC_OURE are active, as these objects play a significant role in PC/RM authorizations in BI Content. All of these InfoObjects also need to be made authorization relevant (if they are not already) by selecting this option in the 'Business Explorer' tab. Additional instructions pertaining to enabling authorizations for PC/RM in BW are covered under the subsequent section on authorizations in this document.
3) Query sender/receiver assignments (for 'jumps' to the backend PC/RM system) have been delivered with BI Content, but the Receiver Object URLs associated with the 'jumps' to the backend system need to be maintained to reflect the customer's landscape (for technical details see the FAQ section in this document under 'How do I retrieve comments / long text from the PC/RM system?'). The delivered RSBBS entries contain template URLs and specific parameters used to call the backend report, so be careful to make only the necessary server and client adjustments to the URLs. Call transaction RSBBS, select the following queries as the 'Sender', then select 'Continue (Enter)':

Long Description Group (Sender Query -> Receiver Reporting Object):
0GRM3MP01_Q0007 -> Show Risk Long Text Description
0GRM3MP02_Q0001 -> Show Response Long Text Description
0GPC3MP01_Q0003 -> Detailed Control Description
0GPC3MP13_Q0002 -> CAPA Root Cause

Case Details Group (Sender Query -> Receiver Reporting Object):
0GPC3MP01_Q0003 -> Evaluation Case Details
0GPC3MP02_Q0002 -> Evaluation Case Details
0GPC3MP02_Q0003 -> Evaluation Case Details
0GPC3MP02_Q0004 -> Evaluation Case Details
0GPC3MP02_Q0006 -> Evaluation Case Details
0GPC3MP02_Q0007 -> Evaluation Case Details
0GPC3MP02_Q0008 -> Evaluation Case Details
0GPC3MP02_Q0010 -> Evaluation Case Details
0GPC3MP02_Q0011 -> Evaluation Case Details
0GPC3MP02_Q0012 -> Evaluation Case Details
0GPC3MP02_Q0013 -> Evaluation Case Details
0GPC3MP02_Q0014 -> Evaluation Case Details
0GPC3MP02_Q0015 -> Evaluation Case Details
0GPC3MP03_Q0002 -> Issue Details
0GPC3MP10_Q0002 -> Evaluation Case Details
0GPC3MP11_Q0002 -> Evaluation Case Details
0GPC3MP13_Q0002 -> CAPA Details
0GPC3MP14_Q0002 -> Remediation Details

For each of the above queries, select the listed 'Receiver' and click on 'Change', then open the dropdown under 'Report' where the URL exists at the bottom of the pop-up 'Change' screen.
Here you can adjust the URLs as indicated in steps 1-3 below for the two groups (Long Description Group and Case Details Group). For each of the above cases leave the majority of the URL intact and make only the changes indicated below ('rm' below is substituted with 'pc' for the PC-related 'jumps' in the delivered URLs):
1. Replace '<rm_server>:<port>' with the address associated with your PC/RM server (such as 'rmserver.mycompany.corp:50050', for example)
2. Replace the 'rm_client' in 'sap-client=<rm_client>' with the PC/RM source system client number (such as '200', for example)
3. Then select 'Transfer', 'Transfer', then 'Save' to preserve your changes

4) The Permitted Extra Characters setting (transaction RSKC) needs to be set to solely the value 'ALL_CAPITAL' to allow the expected character values from the PC/RM source system; it will also handle most unexpected customer values. Otherwise, some data loads may fail, particularly with errors during the execution of the DTP related to 'invalid' or 'non-permitted' values or characters.

5) Apply Note 1459101 to correct an issue with special timeframe extracts for the Signoff and AoD objects.

H) Load Supporting Data
Some supporting data, such as exchange rates, currencies and units of measure, needs to be loaded into BW in order to enhance the PC/RM reporting capabilities. This data first needs to be established in the PC/RM system and then loaded into BW by right-clicking on the PC/RM source system in RSA1 and selecting 'Transfer Exchange Rates' and 'Transfer Global Settings'. BEx maps have also been included in a number of delivered PC web templates based on countries and regions. First, the master data text and attributes need to be loaded via the delivered DataSources for the following InfoObjects: 0LANGU, 0REGION, 0COUNTRY.
Then, in order to display the graphics, the 'BEx Map' settings need to be maintained in InfoObjects 0COUNTRY and 0REGION, and the shape files and abbreviations must be loaded by following the instructions in:
http://help.sap.com/saphelp_bw/helpdata/en/1a/f405387bcc513be10000009b38f8cf/content.htm
Be sure that the BW master data, the shape files and the PC/RM application values for country and region are all in sync. Be sure to maintain this data at the organizational unit level in the PC/RM application. To do this, execute transaction GRFN_STR_CHANGE in the PC/RM system, navigate by Organizational Unit on the left, and double-click to make your selection. Maintain the values in the 'Country and Region' tab on the right.

I) Loading PC/RM Data:
Loading data from the PC/RM source system is driven by BW process chains. It is guided by timeframe customizing on the source system, InfoPackage selection criteria, and sometimes the timeframe override in BW. The timeframe(s) to be extracted into BI (for timeframe-dependent loads) are specified in the source system using transaction GRFN_BI_TF_CUST ('GRC BI Extraction Customizing'). The settings here indicate which timeframe(s) will be relevant for extraction from the datamart; therefore, the datamart first needs to be filled with the desired timeframes for extraction. Only 'full' extracts (limited by timeframe, where applicable) are delivered, as no delta extraction of data from the source system is enabled in the DataSources. Also, only one timeframe selection can be specified in customizing at a time, and PC/RM is designed to handle only one timeframe frequency/granularity, so choose the lowest level on which you want to report. If selection criteria are entered in the InfoPackage, then the timeframe customizing entries are overridden. However, this is contrary to the delivered design and is not recommended unless necessary.
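The selection precedence just described can be sketched in a few lines (illustrative Python with assumed names; the actual logic lives in the delivered ABAP data-selection routines of the InfoPackages):

```python
# Illustrative precedence model: an explicit InfoPackage selection overrides
# the GRFN_BI_TF_CUST timeframe customizing; with no selection, the
# customizing entry applies. Function and variable names are invented.

def effective_timeframes(customizing, infopackage_selection=None):
    """Return the timeframes an extract will actually use."""
    if infopackage_selection:
        # Contrary to the delivered design -- use only if necessary.
        return infopackage_selection
    return customizing

customized = ["M05/2008", "M06/2008"]
assert effective_timeframes(customized) == ["M05/2008", "M06/2008"]
assert effective_timeframes(customized, ["M07/2008"]) == ["M07/2008"]
```

The timeframe reload transaction described later in this section likewise overrides the customizing entry, which is why its value must be removed after a reload.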
But if you decide to stray from this design, you must ensure that each InfoPackage is in sync with the frequency/granularity that you have been loading (GRFN_BI_TF_CUST). If transaction RS_BCT_GRC_FN_TFLOAD is used for a timeframe reload (covered below), the timeframe customizing entries are likewise overridden. Hierarchies are the exception to the InfoPackage selection rule (see step 3 below). Also, the information associated with the DataSources 0GPC_SIGNOFF (Org. Units Sign-Off Info) and 0GPC_AOD (Aggregation of Deficiencies) behaves differently, as these objects are scheduled for particular timeframe frequencies in PC. In order to extract these special timeframes you need to leave the checkbox 'Do not extract special timeframes' in timeframe customizing (GRFN_BI_TF_CUST) unchecked. Otherwise, if you mark this selection, the customizing settings will be respected and any special timeframes not included in GRFN_BI_TF_CUST will be ignored. Basically, the PC/RM data is loaded by executing the following three process chains in order, each one starting after the previous one completes (details below):
1. 'GRC PC/RM 3.0: Start Load Main Data'
2. 'GRC PC/RM 3.0: Start Load Authorizations'
3. 'GRC PC/RM 3.0: Start Load Hierarchies'

1. Standard Load of Master Data, Text and InfoProviders (Main Data): Typically, the standard load of data will be performed by scheduling one of the delivered process chains. This needs to happen in conjunction with the manual maintenance of the custom timeframe in the source system to ensure that the desired timeframe(s) are being extracted. Initiate the extraction/load by executing process chain 'GRC PC/RM 3.0: Start Load Main Data' (0GRC_PCRM30_LOAD_ALL) using transaction RSPC in BW. This chain deletes the PSAs prior to loading to avoid duplication of data, since the DTP Extraction Modes are defined as 'Full'. The key figures of the DSOs are set to 'overwrite', so the existing contents are updated, not appended.
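The practical effect of the 'overwrite' setting can be shown with a toy model (plain Python, not BW internals; the key fields and values below are invented for illustration):

```python
# Toy model of a DSO whose key figures are set to "overwrite": a repeated
# full extract of the same timeframe updates each record in place (keyed by
# its characteristics) rather than appending a duplicate row.

def apply_full_extract(dso, records):
    for rec in records:
        key = (rec["org_unit"], rec["timeframe"])   # assumed key fields
        dso[key] = rec["key_figure"]                # overwrite, not append
    return dso

dso = {}
apply_full_extract(dso, [{"org_unit": "ORG1", "timeframe": "M05/2008", "key_figure": 10}])
# A second full extract of the same timeframe updates, rather than duplicates:
apply_full_extract(dso, [{"org_unit": "ORG1", "timeframe": "M05/2008", "key_figure": 12}])
assert dso == {("ORG1", "M05/2008"): 12}
```

Note that a record deleted in PC/RM is never touched by such an extract, which is exactly why the separate timeframe refresh option exists.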
However, the contents of the target InfoCubes are deleted prior to each load in order to avoid duplication of data. Subsequent timeframe extracts will not remove any data from BI that has been deleted in PC/RM, so a timeframe refresh option is also available from the process chains (covered below).

2. Load Authorizations: In order to update the authorizations data you need to execute process chain 'GRC PC/RM 3.0: Start Load Authorizations' (0GRC_PCRM30_AUTHS). This chain deletes the PSAs prior to loading to avoid duplication of data, since the DTP Extraction Mode is defined as 'Full'. This is also a 'full' extract of data without any timeframe limitation, as only current data is stored in the source system. The contents of the DSO are also deleted prior to loading, since this is a complete refresh of the data; the result is simply the current authorizations contained in the DSO. The processing of the authorization data is covered in section 'J' under 'Authorizations Setup and Generation'.

3. Load Hierarchies: In order to update the hierarchy data you need to execute process chain 'GRC PC/RM 3.0: Start Load Hierarchies' (0GRC_PCRM30_HIER). These extracts are timeframe-dependent, so the timeframe is determined by a combination of the value in GRFN_BI_TF_CUST in the source system and the selection in the InfoPackage. First you must select a hierarchy version from the available list in each InfoPackage prior to performing the extract. From the 'Hierarchy Selection' tab click on 'Available Hierarchies from OLTP' to refresh the list. The list represents the hierarchies that can be built from the timeframes specified in timeframe customizing. Select one of the listed hierarchies and save the InfoPackage. Note that prior to each hierarchy extract it may be necessary to update the hierarchy selection in the InfoPackage in order to refresh the list and select the latest hierarchy available in the PC/RM system.
Repeat this process for all delivered InfoPackages for hierarchies belonging to InfoObjects 0GPC_CEC, 0GFN_TF, 0GRM_RG, 0GPC_CSP, 0GFN_OU, 0GRM_OG, 0GPC_AG, 0GRM_CA and 0GPC_OURE. Finally, you can execute the process chain 'GRC PC/RM 3.0: Start Load Hierarchies' (0GRC_PCRM30_HIER). Changing the delivered Organizational Unit hierarchy setting from 'Entire hierarchy is time-dependent' to 'Time-Dependent Hierarchy Structure' is not supported and the results cannot be guaranteed. Since hierarchy extraction uses 3.x InfoSources, there are no DTPs involved and no deletion of the PSAs is necessary to avoid duplicate data loading.

Timeframe Reload (situational): Since subsequent timeframe extracts will not remove any data from BI that has been deleted in PC/RM, a full timeframe refresh option is also available from the process chains. If a timeframe needs to be reloaded, the following procedure must be followed. Run transaction RS_BCT_GRC_FN_TFLOAD in BW and enter the timeframe to reload. Make sure the entry is in sync with the frequency/granularity you have been loading, as specified in timeframe customizing of the PC/RM system (transaction GRFN_BI_TF_CUST). The InfoPackage selection routines will allow this value to override the value entered in GRC BI Extraction Customizing in the source system. Initiate the extraction/load by executing process chain 'GRC PC/RM 3.0: Reload - Start Reload Timeframe' (0GRC_PCRM30_RELOAD_TF). This chain deletes the data and PSAs for the timeframe from the master data, text and InfoProviders and then reloads the data for this timeframe. Remember to remove the value from RS_BCT_GRC_FN_TFLOAD, or subsequent timeframe-dependent loads (the Standard load and the Hierarchies load) will fail.

Complete Data Deletion (situational): If you ever need to completely remove data from the system, execute process chain 'GRC PC/RM 3.0: Delete All Data (Master/Trans/Hierarchy)' (0GRC_PCRM30_DELETE_ALL).
This would only be required in special cases and is not part of routine, scheduled loads.

Description of Process Chains: Two "streams" of process chains exist in the automation of data loads: 'GRC PC/RM 3.0: Start Load Main Data' and 'GRC PC/RM 3.0: Reload - Start Reload Timeframe'. The process chains 'GRC PC/RM 3.0: Start Load Hierarchies', 'GRC PC/RM 3.0: Start Load Authorizations', and 'GRC PC/RM 3.0: Delete All Data (Master/Trans/Hierarchy)' are standalone, but the two aforementioned chains call subsequent chains, so their flow is defined in detail below.

Flow of 'GRC PC/RM 3.0: Start Load Main Data' (0GRC_PCRM30_LOAD_ALL) process chain:
GRC PC/RM 3.0: Start Load Main Data
- GRC PC/RM 3.0: Load Preliminary Data
- GRC PC/RM 3.0: Load Attribute Text
- GRC PC/RM 3.0: Load Master Data Text
- GRC PC/RM 3.0: Load Master Data Attributes
- GRC PC/RM 3.0: Load PC Targets
- GRC PC/RM 3.0: Load RM Targets

Flow of 'GRC PC/RM 3.0: Reload - Start Reload Timeframe' (0GRC_PCRM30_RELOAD_TF) process chain:
GRC PC/RM 3.0: Reload - Start Reload Timeframe
- GRC PC/RM 3.0: Reload Delete Timeframe MD & Targets
- GRC PC/RM 3.0: Load Preliminary Data
- GRC PC/RM 3.0: Reload - Load Timeframe Dep. Attribute Text
- GRC PC/RM 3.0: Load Master Data Text
- GRC PC/RM 3.0: Load Master Data Attributes
- GRC PC/RM 3.0: Load PC Targets
- GRC PC/RM 3.0: Load RM Targets

J) Executing BW Queries and Web Templates:
All delivered queries and web templates are found under the following roles in BW (transaction PFCG): SAP_BW_GRC_FN30_ROLE, SAP_BW_GRC_PC30_ROLE, SAP_BW_GRC_RM30_ROLE. These roles do not provide secured access to these reporting objects but simply provide convenient access to the objects once a role is assigned to a user. Optionally, access to the queries is secured either by InfoProvider or at the individual query level. To secure by InfoProvider, users must have sufficient authorization both in analysis authorizations (via the special characteristics) and within PFCG roles/profiles.
To secure at the query level, authorization is granted via PFCG roles/profiles. Analysis authorization and PFCG maintenance are outside the scope of this document. If queries fail to execute, first try to regenerate them using program RSR_GEN_DIRECT_ALL_QUERIES (transaction SE38) in BW, selecting InfoCubes 0GFN3*, 0GPC3* and 0GRM3*. Query analysis authorizations (field level authorizations) are based on BW InfoObjects Organizational Unit (0GFN_OU) and Org. Unit in Regulation (0GPC_OURE). Query authorization variables are used to restrict access during query execution. A full explanation of this procedure is under the subsequent section on authorizations in this document. The following sender/receiver assignments ('jumps') from one query to another reporting object are provided with the PC/RM Business Content. They are defined in transaction RSBBS and are executed by performing a right-click 'Go To' from the query output screen. The column/row from which the jump is executed (as well as the assignment details in RSBBS) determines the selection criteria for the receiving reporting object and thus the output. There are three types of receiving reporting objects below: Query, Web Template (WT) and URL. The URLs represent jumps to the backend PC/RM system in order to report on data that is not loaded into BW. Below are all of the delivered queries representing these three types of jumps.

RM Sender Queries: Risk by Objective, Risk Overview, Response Overview, Opportunity by Objective, KRI Evaluations by Org Unit

RM Receiver Objects: Org. Unit - Objective assignment, Show Risk Long Text Description, Show Response Long Text Desc, Org. Unit - Objective assignment, KRI Instance History Chart

PC Sender Queries: Heat Map for Control Evaluations, Control Evaluations Summary

PC Receiver Objects: Control Evaluations Summary Control Evaluations Details Control Evaluations Details Control Evaluations Details Control Testing Evaluation History Overall Failed Controls for Account Group - Risks Failed Controls for Account Group - Risks Failed Controls for Account Group - Risks Failed Controls for Account Group - Risks Failed Testings for Account Group - Risks Failed Design Assessments for Acct Group - Risks Failed Self Assessments for Acct Group - Risks Failed Controls for CO-Risks Failed Controls for CO-Risks Failed Controls for CO-Risks Failed Controls for CO-Risks Failed Testings for CO-Risks Failed Design Assessments for CO-Risks Failed Self Assessments for CO-Risks Failed Controls for Subprocess - Risks Failed Controls for Subprocess - Risks Failed Controls for Subprocess - Risks Failed Controls for Subprocess - Risks Failed Testings for Subprocess - Risks Failed Design Assessments for Subprocess - Risks Failed Self Assessments for Subprocess - Risks Failed Monitorings for Account Group - Risks Failed Monitorings for CO-Risks Failed Monitorings for Subprocess - Risks Issue Status by Organization Issues Status Details by Organization Indirect ELC Evaluation Summary Indirect ELC Evaluation Details Subprocess Evaluations Summary Subprocess Evaluations Details Issues Trend Analysis Issues Trend Analysis Details Issue Trends By Org. Unit Issue Trends By Subprocess Issue Trends By Risk Level Issue Trends By Issue Priority CAPA Trend Analysis CAPA Trend Analysis Details CAPA Trend Analysis Details CAPA Trends by Org. Unit CAPA Trends by Subprocess CAPA Trends by Risk Level Trends by CAPA Statuses Evaluation Case Details Detailed Control Description Ctrl Testing Evals Hist for Org. Unit Failed Testings Failed Design Assessments Failed Self Assessments Failed Monitorings Evaluation Case Details Evaluation Case Details Evaluation Case Details Failed Testings Failed Design Assessments Failed Self Assessments Failed Monitorings Evaluation Case Details Evaluation Case Details Evaluation Case Details Failed Testings Failed Design Assessments Failed Self Assessments Failed Monitorings Evaluation Case Details Evaluation Case Details Evaluation Case Details Evaluation Case Details Evaluation Case Details Evaluation Case Details Issue Status Details by Organization Issue Details Indirect ELC Evaluation Details Evaluation Case Details Subprocess Evaluations Details Evaluation Case Details Issues Trend Analysis Details Issue Details Issues Trend Analysis Details Issues Trend Analysis Details Issues Trend Analysis Details Issues Trend Analysis Details CAPA Trend Analysis Details CAPA Details CAPA Root Cause CAPA Trend Analysis Details CAPA Trend Analysis Details CAPA Trend Analysis Details CAPA Trend Analysis Details Remediation Trend Analysis Remediation Trend Analysis Dtls - Global Scope Trend by Org. Unit Trend by Subprocess Trend by Risk Levels Trend by Remediation Statuses Remed Plans Trend Analysis Details Remediation Details Remed Plans Trend Analysis Details Remed Plans Trend Analysis Details Remed Plans Trend Analysis Details Remed Plans Trend Analysis Details

Currency conversion can be performed on all delivered queries that contain currency value key figures. The currency translation type 0GRC_TFEND (transaction RSCUR) is tied to all PC/RM currency type key figures in the BW queries. It invokes the user entry variables 0PGRCXRT (Exchange Rate Type) and 0PGRCCUR (Target Currency), and all conversion values used are derived from table TCURR and evaluated in reference to the key date of the query.
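As an illustration of the mechanics described above, the following sketch models a key-date-based exchange rate lookup in plain Python. It is a conceptual model only: the rate table, field layout and rate values below are hypothetical stand-ins for TCURR, not the actual 0GRC_TFEND implementation.

```python
from datetime import date

# Simplified stand-in for TCURR (hypothetical data). Rates are stored per
# (rate type, from-currency, to-currency) with a valid-from date; the entry
# with the latest valid-from date on or before the query key date applies.
RATES = [
    # (rate_type, from_cur, to_cur, valid_from, rate)
    ("M", "EUR", "USD", date(2010, 1, 1), 1.40),
    ("M", "EUR", "USD", date(2010, 7, 1), 1.30),
]

def convert(amount, from_cur, to_cur, rate_type, key_date):
    """Convert using the rate valid at the query key date (timeframe end date)."""
    candidates = [r for r in RATES
                  if r[0] == rate_type and r[1] == from_cur and r[2] == to_cur
                  and r[3] <= key_date]
    if not candidates:
        raise ValueError("no exchange rate found for key date")
    rate = max(candidates, key=lambda r: r[3])[4]  # latest valid-from wins
    return amount * rate

# A value reported for a timeframe ending 2010-06-30 uses the January rate:
print(convert(100.0, "EUR", "USD", "M", date(2010, 6, 30)))  # 140.0
```

The point of evaluating against the key date: a query whose timeframe ends 2010-06-30 picks up the rate valid on that date, not the rate valid on the day the query happens to run.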
The following Timeframe related variables are used in the PC/RM queries: 'End Date of Timeframe' (0P_GFN_TF_PE) is an SAP Exit variable referencing the InfoObject 0DATE. It determines the last day of the timeframe entered in the user entry, single value variable 0P_GFN_TF in order to supply the key date of the query. Function Module RSVAREXIT_0P_GFN_TF_PE 'Timeframe End Date from 0P_GFN_TF' is used to determine the proper date. 'End Date of Timeframe Interval' (0P_GFN_TF_IE) is an SAP Exit variable referencing the InfoObject 0DATE. It determines the last day of the timeframe range entered in the user entry, range value variable 0I_GFN_TF in order to supply the key date of the query. Function Module RSVAREXIT_0P_GFN_TF_IE 'Timeframe Interval End Date from 0I_GFN_TF' is used to determine the proper date. The following nine hierarchies are provided with BI Content for PC/RM: 0GPC_CEC, 0GFN_TF, 0GRM_RG, 0GPC_CSP, 0GFN_OU, 0GRM_OG, 0GPC_AG, 0GRM_CA, 0GPC_OURE. These hierarchies are loaded as well as activated using the process chain 0GRC_PCRM30_HIER. The details of this process are covered in section I-3 'Loading PC/RM Data'. Some of the delivered queries display their output by these hierarchies. Authorizations Setup and Generation: Field level query security (analysis authorizations) is delivered in PC/RM 3.0 content using BI 7.x Analysis Authorizations technology based on Organizational Unit (0GFN_OU) and Org. Unit in Regulation (0GPC_OURE). Authorization based on, for example, Org Unit simply means that you can only report on records from an InfoProvider which contain the Org Unit values you are authorized to see. However, you can see the values for all other fields associated with these records (such as subprocess level data) with no restrictions on those particular fields. You are free to 'turn off' the authorization on these objects and/or to add new authorization relevant objects, but such changes are customizations and are not supported by SAP.
The values determining the BI analysis authorizations for a user are derived from the Org Unit and Org Unit in Regulation authorizations in the PC/RM backend system and are based strictly on current (not time dependent) authorizations. These values are loaded into the BW system, and then authorizations are generated using a standard BW tool (transaction RSECADMIN). In order to leverage these generated authorizations, the InfoObjects on which the authorization is based were made 'authorization relevant'. Additionally, query authorization variables are created on these InfoObjects and used within the queries to facilitate the security. Previously, field level query output security in PC/RM 2.5 was supported by exit variables in the queries and provided a form of time dependent reporting authorization. The DataSource 0GFN_AUTHORIZATIONS has been delivered to extract the authorizations data from the PC/RM backend system. This DataSource feeds the DSO 0GFN_DS01, which is a copy of the template DSO 0TCA_DS01 used for processing the supporting authorization data. Once loaded, the data is processed in BW using transaction RSECADMIN, which generates authorizations based on the authorization data stored in DSO 0GFN_DS01. The authorization data consists of individual values or ranges of values for each authorization relevant InfoObject by user. The generated authorizations contain these current authorization values for each user, assigned to their user IDs (see the 'User' tab of RSECADMIN). Authorizations can be assigned to a user via RSECADMIN or in PFCG, but the recommended method is to use RSECADMIN for all analysis authorization objects while limiting PFCG to role based authorization. Beware that some authorization assignments (such as '*' in the S_RS_COMP object) can lead to the authorization 0BI_ALL being assigned to a user, which effectively negates all analysis authorization functionality.
This can be seen in RSECADMIN -> User -> Assignment -> Display -> 'Role-Based' tab. The InfoObjects 0GFN_OU and 0GPC_OURE have been delivered as authorization relevant by checking this option in the InfoObject definition tab 'BEx Explorer'. 0GPC_OURE is used solely in the PC content while 0GFN_OU is used in both the PC and RM content. Users running queries defined against an InfoCube containing either of these InfoObjects must have analysis authorization based on these objects, even if it is simply the value '*'. Use of authorization variables is suggested in all queries belonging to an InfoProvider containing any of the authorization relevant InfoObjects. The variables will use the values in the generated authorizations to define the valid scope of the user's access. If the user's selection criteria are a subset of the user's authorization scope, then the result of the query will be the intersection of the user's authorizations and the query filter selection. If the selection criteria are outside of the user's authorization, then an authorization error will occur when the query is executed. Authorization variable 0S_GFN_OU_AUTH has been delivered to provide the user's authorizations while allowing the user to enter query selection criteria (ready for input). In other words, the output of the query will be based on the intersection of the entered criteria and the defined authorizations. Authorization variable 0S_GFN_OU_AUTH_H has been delivered to provide the user's authorizations without allowing the user to enter query selection criteria (not ready for input). In other words, the output of the query will be based solely on the authorized Org Unit values for this user. Special high level authorization characteristics exist in the system to facilitate reporting security. The InfoObjects 0TCAACTVT, 0TCAIPROV, 0TCAKYFNM and 0TCAVALID must be activated from content and also made authorization relevant (in the Business Explorer tab in the InfoObject definition).
An authorization must also be created in RSECADMIN containing these InfoObjects with the value '*' assigned to them. This authorization needs to be assigned to all BEx end users in order to run queries. This is in addition to the PFCG authorization objects required to run queries (S_RS_COMP, S_RS_COMP1, S_RFC, S_TCODE, etc.). For more information on the BI 7.0 Analysis Authorizations concept please see 'An Expert Guide to new SAP BI Security Features': http://www.sdn.sap.com/irj/scn/elearn?rid=/library/uuid/659fa0a2-0a01-0010-b39c-8f92b19fbfea

K) Known Problems and Limitations:
1) There is a limitation when using the F4 help function (from the variable screen dropdowns during query execution) when a user exit variable is used for the key date of the query. This is a known BEx limitation in the variable screen only and does not affect the query results. In this situation the key date is not derived by the exit variable until the query is run, so the time dependent attributes and text associated with a characteristic reflect the current system date (the default) in a dropdown in the variable screen. However, once the query result is rendered, any displayed attributes or text will reflect the derived key date of the query, so this issue does not affect the output. Since time dependent reporting delivered with the content is based on the timeframe end date, all objects displayed in the query output will be in sync with this derived date. In the PC/RM 3.0 content this situation may arise when user entry timeframe variables 0P_GFN_TF or 0I_GFN_TF are used to determine the timeframe end date in SAP exit variables 0P_GFN_TF_PE or 0P_GFN_TF_IE, respectively. If users tend to filter query selections in the variable screen based on time dependent characteristics and find this behavior bothersome, a workaround can be created.
Here is a suggested solution: Have the user enter the Timeframe End Date using a variable associated with the key date of the query. Since more than one timeframe may be associated with the end date (in situations where multiple frequencies or special timeframes have been used), the desired timeframe needs to be either entered manually or determined by a user exit. If an exit is used, the primary timeframe frequency can be stored in a table in order to determine the proper timeframe associated with the end date. The selection of the query will then filter on the timeframe, and the key date of the query will be equal to the timeframe end date. Steps to perform in the BI system:
- Create a table ZGFN_RPT_TF_FREQ with field FREQUENCY CHAR(10) (from data element /BI0/OIGFN_TFFREQ) and store the primary reporting frequency to be loaded into BI. Set the Data Class to 'APPL2' (Organization and customizing) under 'Technical Settings' and the Enhancement Category for Structure to 'Can Be Enhanced (DEEP)' under 'Extras' (unless different customer standards apply).
- Create a single value, mandatory, user entry input variable ZP_TF_END_DATE (Timeframe End Date) based on InfoObject 0DATE (this will allow you to use the variable in the key date section as well as when filtering on the TF end date).
- Ensure that the setting in InfoObject 0GFN_TFEND for filter value selection (in the BEx tab in RSD1) is set to 'Only Values in InfoProvider'.
- Set the query key date (and all references in the query) to use the new date variable ZP_TF_END_DATE.
- Create a single value, mandatory (initial value not allowed), not ready for input, customer exit variable ZP_TF_FROM_END_DATE (Timeframe from End Date) based on InfoObject 0GFN_TF. Replace the delivered timeframe variable (0P_GFN_TF) with this new variable in all query occurrences.
*Note that under this solution, if you wish to use a range of timeframes in the query, an interval variable defined as 'user entry input' would need to be created instead.
- Function Module EXIT_SAPLRRS0_001 is used to include code for customer exit BEx variables. The code is written in Include ZXRSRU01, but the exit first needs to be activated under a project in CMOD. The code for looking up the proper timeframe (from the user input timeframe end date and the primary reporting frequency), following all of the steps above, should look something like the following (customer coding standards should also be adhered to):

DATA: L_S_RANGE     TYPE RRRANGESID.
DATA: L_S_VAR_RANGE TYPE RRRANGEEXIT.

CASE I_VNAM.
  WHEN 'ZP_TF_FROM_END_DATE'.
    DATA: TF_FREQ LIKE ZGFN_RPT_TF_FREQ,
          FREQ    LIKE ZGFN_RPT_TF_FREQ-FREQUENCY,
          TF_MD   LIKE /BI0/PGFN_TF,
          TF      LIKE /BI0/PGFN_TF-GFN_TF.
    IF I_STEP = 2.
*     Read the primary reporting frequency from the customer table
      SELECT * FROM ZGFN_RPT_TF_FREQ INTO TF_FREQ.
        IF SY-SUBRC = 0.
          FREQ = TF_FREQ-FREQUENCY.
*         Get the timeframe end date entered by the user
          READ TABLE I_T_VAR_RANGE INTO L_S_VAR_RANGE
               WITH KEY VNAM = 'ZP_TF_END_DATE'.
          IF SY-SUBRC = 0.
*           Look up the timeframe matching the frequency and end date
            SELECT * FROM /BI0/PGFN_TF INTO TF_MD
                     WHERE GFN_TFFREQ = FREQ
                       AND GFN_TFEND  = L_S_VAR_RANGE-LOW.
              IF SY-SUBRC = 0.
                TF = TF_MD-GFN_TF.
                CLEAR L_S_RANGE.
                L_S_RANGE-LOW  = TF.
                L_S_RANGE-SIGN = 'I'.
                L_S_RANGE-OPT  = 'EQ'.
                APPEND L_S_RANGE TO E_T_RANGE.
              ENDIF.
            ENDSELECT.
          ENDIF.
        ENDIF.
      ENDSELECT.
    ENDIF.
ENDCASE.

2) Errors can occur when attempting to perform a currency conversion during query execution, either in the variable screen or when using F4 Help in the currency translation type field. Please apply Note 1348575 to resolve this issue. 3) If errors occur when running or editing process chains with regard to the PSA deletion steps, please refer to Note 1344779 'Correction: Multiple PSA references in variant from content'. Possible error messages which could be related to this problem are: TSV_TNEW_BLOCKS_NO_ROLL_MEMORY and 'No roll storage space of length ####### available for internal storage'.
Please read the note before analyzing memory and space related issues. 4) According to Note 871132, ensure that InfoObject 0RTYPE (Exchange Rate Type) does NOT have the conversion routine 'ALPHA' specified in tab 'General'. This could cause errors when executing queries containing currency conversion routines. Please follow the directions in the note carefully. 5) If errors occur when executing queries containing currency conversion routines, please apply Note 871132 'InfoObject: 0RTYPE has ALPHA conversion exit'. 6) Apply Note 1459101 to correct an issue with special timeframe extracts for objects Signoff and AOD. Without this correction the flag 'Do not extract special timeframes' in timeframe customizing (GRFN_BI_TF_CUST) will be ignored and the special timeframes will be extracted along with all timeframes included in GRFN_BI_TF_CUST (instead of the special timeframes being skipped).

L) Frequently Asked Questions

How do I prepare for loading data into BI?
Getting ready to integrate BI with PC/RM data involves preparation in both the PC/RM and BI systems. The steps are covered in detail under 'Installation Steps' (sections A-H) but here they are summarized:
1. Install GRC PC/RM 3.0 SP03 in the GRC system
2. Install SAP_BW 7.0 SP22, BI_CONT 7.04 SP04 or higher in the BW system
3. Create a BW source system and RFC connection to the PC/RM backend system
4. Maintain Timeframe Frequencies in the PC/RM system
5. Maintain Timeframes in the PC/RM system
6. Transfer the Application Component Hierarchy for BI in the PC/RM system
7. Transfer (activate) Business Content DataSources in the PC/RM system
8. Maintain BI Extraction Customizing Settings (transaction GRFN_BI_TF_CUST in PC/RM)
   a. Set the frequency, starting timeframe and from/to year to extract into BI
9. Maintain DataMart calculation (transaction GRFN_DM_MAINTAIN in PC/RM)
   a. Use this activity to create and maintain the data that will be loaded into the BI system. The datamart is also referred to as the report buffer.
10. Replicate DataSources in the BI system
11. Activate Business Content in the BI system
12. Follow-up steps to activation (to be done in BW):
   a. According to Note 871132, ensure that InfoObject 0RTYPE (Exchange Rate Type) does NOT have the conversion routine 'ALPHA' specified in tab 'General'.
   b. In order to initiate BW 7.0 Authorizations, the following InfoObjects need to be active in RSA1: 0TCAACTVT, 0TCAIPROV, 0TCAKYFNM, 0TCAVALID.
   c. Query sender/receiver assignments (for 'jumps') have been delivered with BI Content, but the URLs associated with the 'jumps' to the backend system need to be maintained to reflect the customer's landscape.
   d. The Permitted Extra Characters setting (RSKC) needs to be set to solely the value 'ALL_CAPITAL'.
13. Load Supporting Data:
   a. Supporting data such as Exchange Rates, Currencies and Units of Measure need to be loaded into BW in order to enhance the PC/RM reporting capabilities.
   b. Master data text and attributes to support the BEx maps need to be loaded for the following InfoObjects: 0LANGU, 0REGION, 0COUNTRY.
   c. BEx Map settings need to be maintained in InfoObjects 0COUNTRY and 0REGION, and the shape files and abbreviations must be loaded.

What is the flow of data in BI and how do I update, refresh and delete it?
Loading data from the PC/RM source system is driven by BW process chains. It is guided by timeframe customizing on the source system, InfoPackage selection criteria, and sometimes the timeframe override in BW. The details are covered completely in section I 'Loading PC/RM Data' but here they are summarized: The timeframe(s) to be extracted from the PC/RM datamart into BI are specified in the source system using transaction GRFN_BI_TF_CUST 'GRC BI Extraction Customizing'. Only 'full' extracts (limited by timeframe, where applicable) are delivered, as no delta extraction from the source system is enabled in the DataSources.
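The overwrite behavior that makes these repeated full loads safe can be illustrated with a small sketch (plain Python, purely conceptual: the record keys and values below are hypothetical, not actual DSO structures):

```python
# Illustrative sketch (not actual BW code): DSO key figures set to "overwrite"
# make repeated full extracts update existing records instead of duplicating
# them - and source-side deletions are NOT picked up by a subsequent extract.
dso = {}  # (timeframe, object key) -> key figure values

def full_load(records):
    """Simulate a full (non-delta) extract for a timeframe."""
    for key, values in records:
        dso[key] = values  # overwrite: the latest load wins, nothing is appended

# First full load of timeframe 2010-Q1 (timeframe id and org units hypothetical):
full_load([(("2010-Q1", "ORG1"), {"issues": 3}),
           (("2010-Q1", "ORG2"), {"issues": 1})])

# Second full load: ORG1 changed; ORG2 was deleted in the source and no longer
# arrives - but its old record stays in BI, which is why the timeframe reload
# (delete + reload) process chain exists.
full_load([(("2010-Q1", "ORG1"), {"issues": 5})])

print(dso[("2010-Q1", "ORG1")])    # {'issues': 5} - updated, not duplicated
print(("2010-Q1", "ORG2") in dso)  # True - stale record remains until a reload
```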
Only one timeframe frequency/granularity is allowed, so choose the lowest level that is desired for reporting. The PC/RM data is loaded by executing the following three process chains:

'GRC PC/RM 3.0: Start Load Main Data':
- Chain is used to load the master data, text and InfoProviders from the PC/RM system
- Deletes the PSAs prior to loading to avoid duplication of data since the DTP Extraction Modes are defined as 'Full'
- Key figures of the DSOs are set to 'overwrite' so the existing contents are updated, not appended
- Contents of the target InfoCubes are deleted prior to each load in order to avoid duplication of data
- Subsequent timeframe extracts will not remove any data from BI which had been deleted in PC/RM, so a timeframe refresh option is also available from the process chains (covered below)
- Information associated with the DataSources 0GPC_SIGNOFF (Org. Units Sign-Off Info) and 0GPC_AOD (Aggregation of Deficiencies) behaves differently and is scheduled for particular timeframe frequencies in PC (see detailed instructions on this)

'GRC PC/RM 3.0: Start Load Authorizations':
- Chain is used to load the authorizations from the PC/RM system
- Deletes the PSAs prior to loading to avoid duplication of data since the DTP Extraction Mode is defined as 'Full'
- This is a 'full' extract of data without any timeframe limitation
- Contents of the DSO are deleted prior to loading since this is a complete refresh of the data
- Processing of the authorization data is covered in section 'J' under 'Authorizations Setup and Generation'

'GRC PC/RM 3.0: Start Load Hierarchies':
- Chain is used to load the hierarchies from the PC/RM system
- The timeframe of the extract is determined by a combination of the value in GRFN_BI_TF_CUST in the source system and the selection process in the InfoPackage
- Prior to performing the extract, select a hierarchy version from the available list in each InfoPackage
- Prior to each hierarchy extract it may be necessary to update the hierarchy selection in the InfoPackage
- InfoPackages are delivered for hierarchies belonging to InfoObjects 0GPC_CEC, 0GFN_TF, 0GRM_RG, 0GPC_CSP, 0GFN_OU, 0GRM_OG, 0GPC_AG, 0GRM_CA, 0GPC_OURE

Timeframe reloads can be performed if necessary: Since subsequent timeframe extracts will not remove any data from BI which had been deleted in PC/RM, a full timeframe refresh option is also available from the process chains. If a timeframe needs to be reloaded then the following procedure must be followed:
- Run transaction RS_BCT_GRC_FN_TFLOAD in BW and enter the timeframe to reload
- Execute process chain 0GRC_PCRM30_RELOAD_TF
- The chain will delete the data and PSAs for the timeframe from the master data, text and InfoProviders and then reload the data for this timeframe
- Remove the value from RS_BCT_GRC_FN_TFLOAD afterwards, or subsequent timeframe dependent loads will fail

Complete deletion of data can be performed if necessary: If you need to completely remove data from the system, execute process chain 0GRC_PCRM30_DELETE_ALL.

How do I extend the delivered DataSources in PC/RM 3.0 SP3?
Please see Note 1314368 regarding customer defined fields, which includes two white paper attachments. Use the white papers in conjunction with the table below while following these steps:
1. Prepare the Customer Include (CI) in the PC/RM application. The naming convention is 'CI_GRxx_xxxxxx' and the lists of tables/objects that can be enhanced are in the attachments of Note 1314368.
2. Create the CI in reporting with naming convention 'CI_GRxx_xxxxxx_BI'. It will be created using structure 'GRFN_S_BI_xx_ATTR_I' as indicated in the table below. The names of the fields need to correspond to the names of the fields created in the first step (data is copied using 'move corresponding') but must not be of type 'string'.
3. Edit the corresponding DataSource in RSA6 and unhide the new fields, then resave the DataSource.
4. Replicate the modified DataSources from PC/RM to BI.
Datamart | Datamart Include | Datamart CI | Reporting/Extractor Structure | Reporting/Extractor CI | DataSource
Orgunit | GRFN_S_OU_ATTR_I | CI_GRFN_ORGUNIT | GRFN_S_BI_OU_ATTR_I | CI_GRFN_ORGUNIT_BI | 0GFN_OU_ATTR
Account Group | GRFN_S_AG_ATTR_I | CI_GRPC_ACC_GROUP | GRFN_S_BI_AG_ATTR_I | CI_GRPC_ACC_GROUP_BI | 0GPC_AG_ATTR
Assessment | GRFN_S_AS_ATTR_I | CI_GRPC_CASEAS | GRFN_S_BI_AS_ATTR_I | CI_GRPC_CASEAS_BI | 0GPC_AS_ATTR
CAPA plan | GRFN_S_CP_ATTR_I | CI_GRPC_CASECP | GRFN_S_BI_CP_ATTR_I | CI_GRPC_CASECP_BI | 0GPC_CP_ATTR
Issue | GRFN_S_IS_ATTR_I | CI_GRPC_CASEIS | GRFN_S_BI_IS_ATTR_I | CI_GRPC_CASEIS_BI | 0GPC_IS_ATTR
Remediation Plan | GRFN_S_PL_ATTR_I | CI_GRPC_CASEPL | GRFN_S_BI_PL_ATTR_I | CI_GRPC_CASEPL_BI | 0GPC_PL_ATTR
Test Log | GRFN_S_TL_ATTR_I | CI_GRPC_CASETL | GRFN_S_BI_TL_ATTR_I | CI_GRPC_CASETL_BI | 0GPC_TL_ATTR
Control Objective | GRFN_S_CO_ATTR_I | CI_GRPC_COBJECTIVE | GRFN_S_BI_CO_ATTR_I | CI_GRPC_COBJECTIVE_BI | 0GPC_COBJ_ATTR
Control | GRFN_S_CN_ATTR_I | CI_GRPC_CONTROL | GRFN_S_BI_CN_ATTR_I | CI_GRPC_CONTROL_BI | 0GPC_CN_ATTR
Entity Level Control | GRFN_S_EC_ATTR_I | CI_GRPC_ECONTROL | GRFN_S_BI_EC_ATTR_I | CI_GRPC_ECONTROL_BI | 0GPC_EC_ATTR
Process | GRFN_S_PR_ATTR_I | CI_GRPC_PROCESS | GRFN_S_BI_PR_ATTR_I | CI_GRPC_PROCESS_BI | 0GPC_PR_ATTR
Signoff | GRFN_S_SO_ATTR_I | CI_GRPC_SIGNOFF | GRFN_S_BI_SO_ATTR_I | CI_GRPC_SIGNOFF_BI | 0GPC_SIGNOFF
Sub Process | GRFN_S_SP_ATTR_I | CI_GRPC_SUBPROCESS | GRFN_S_BI_SP_ATTR_I | CI_GRPC_SUBPROCESS_BI | 0GPC_SP_ATTR
Test step | GRFN_S_V0_ATTR_I | CI_GRPC_V0TS | GRFN_S_BI_V0_ATTR_I | CI_GRPC_V0TS_BI | 0GPC_V0_ATTR
Activity | GRFN_S_AC_ATTR_I | CI_GRRM_ACTIVITY | GRFN_S_BI_AC_ATTR_I | CI_GRRM_ACTIVITY_BI | 0GRM_AC_ATTR
Opportunity Risk | GRFN_S_OR_ATTR_I | CI_GRRM_OPPORTUNITY | GRFN_S_BI_OR_ATTR_I | CI_GRRM_OPPORTUNITY_BI | 0GRM_OR_ATTR
Risk | GRFN_S_RS_ATTR_I | CI_GRRM_RISK | GRFN_S_BI_RS_ATTR_I | CI_GRRM_RISK_BI | 0GFN_RS_ATTR

*Note that if you are upgrading from PC/RM 2.x to 3.0, the Business Content is not upgradeable, so you need to first upgrade the PC/RM system, then install the content and add any previously existing customizations/extensions to the 3.0 content.
Extensions to equivalent DataSources (since the 3.0 content contains all new DataSources) would be done via the CIs mentioned above (if available), and all other customizations (such as transformations and InfoProvider modifications) would need to be mapped to their 3.0 equivalents and manually implemented.

How do I retrieve comments / long text from the PC/RM system?
There is a limitation in BI of a maximum length of 60 characters per field. Therefore, descriptions or comment fields from the source system longer than 60 characters cannot be seamlessly stored in BI. In the PC/RM 3.0 business content some basic support has been provided to enable access to some of the fields exceeding 60 characters in the PC/RM system. At the technical level this data from the PC/RM source system is provided using the following objects:
- DataSource 0GFN_LONG_TEXT_ID - Informs BI whether language dependent long text for a particular entity exists in the source system
- DataSource 0GFN_LONG_TEXT_NO_LANGU - Informs BI whether language independent long text for a particular entity exists in the source system (typically for comments)
- RFC FM GRFN_BI_GET_LONG_TEXT - RFC to retrieve the actual source system long texts (for the given entity, id, timeframe, language)
- Web Dynpro application GRFN_BI_LONG_TEXT - The application which displays the actual source system long texts to the query user. The parameters to this application are ENTITY_ID, FIELDNAME, KEY_DATE, LANGU, OBJECT_KEY, TF_FREQ.
- Web Dynpro application GRPC_ASSESSMENT - The application which displays the case details (including long texts) of the entity. The parameters to this application are ACTIVE_TAB, GUID, OBJECT_ID, READ_ONLY, REGULATION_ID, SHARED_CONTROL, WDACCESSIBILITY, WDTABLENAVIGATION, WORKITEM, WORKITEM_NAVIGATE.
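For illustration, the kind of parameterized URL such a jump produces could be assembled as follows. This is a hedged sketch: the host, port, Web Dynpro path and all parameter values below are hypothetical placeholders; in practice the URL comes from the delivered RSBBS receiver entry, adjusted to the customer's server and client.

```python
from urllib.parse import urlencode

# Illustrative sketch only: assembling a jump URL for the GRFN_BI_LONG_TEXT
# Web Dynpro application. Host, port and the Web Dynpro path are assumptions,
# not values delivered with the content.
def long_text_url(host, params):
    path = "/sap/bc/webdynpro/sap/grfn_bi_long_text"  # assumed WD ABAP path
    return "https://%s%s?%s" % (host, path, urlencode(params))

url = long_text_url("grcserver.example.com:8443", {
    "ENTITY_ID": "RISK",        # entity whose long text is requested
    "FIELDNAME": "RS_D",        # long text field, per the jump definition
    "OBJECT_KEY": "RISK_4711",  # hypothetical object key
    "KEY_DATE": "20100630",     # timeframe end date
    "TF_FREQ": "Q",             # hypothetical timeframe frequency value
})
print(url)
```

The parameters mirror the GRFN_BI_LONG_TEXT parameter list given above (ENTITY_ID, FIELDNAME, KEY_DATE, OBJECT_KEY, TF_FREQ); in the delivered content some of these travel in the URL and others via the RSBBS assignment details.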
At the query level this has been provided by delivering 'jumps' from the following BI queries to the report in the PC/RM system via the specified Web Dynpro applications and parameters:

Jumps to Web Dynpro application GRFN_BI_LONG_TEXT:
- 0GRM3MP01_Q0007 -> Risk Long Text Desc (ENTITY_ID=RISK, FIELDNAME=RS_D, OBJECT_KEY=Risk, KEY_DATE=Timeframe End Date, TF_FREQ=Timeframe Frequency)
- 0GRM3MP02_Q0001 -> Response Long Text Desc (ENTITY_ID=RESPONSE, FIELDNAME=RP_DESC, OBJECT_KEY=Response, KEY_DATE=Timeframe End Date, TF_FREQ=Timeframe Frequency)
- 0GPC3MP01_Q0003 -> Detailed Control Desc (ENTITY_ID=CONTROL, FIELDNAME=CN_D, OBJECT_KEY=Control, KEY_DATE=Timeframe End Date, TF_FREQ=Timeframe Frequency)
- 0GPC3MP13_Q0002 -> CAPA Root Cause (ENTITY_ID=G_CP, FIELDNAME=CP_ROOT_CAUSE, OBJECT_KEY=CAPA Plan, KEY_DATE=Timeframe End Date, TF_FREQ=Timeframe Frequency)

Jumps to Web Dynpro application GRPC_ASSESSMENT:
- 0GPC3MP01_Q0003 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP02_Q0002 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP02_Q0003 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP02_Q0004 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP02_Q0006 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP02_Q0007 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP02_Q0008 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP02_Q0010 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP02_Q0011 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP02_Q0012 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP02_Q0013 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP02_Q0014 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP02_Q0015 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP03_Q0002 -> Issue Details (GUID=Issue)
- 0GPC3MP10_Q0002 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP11_Q0002 -> Evaluation Case Details (GUID=GRC CASE Id, READONLY=X)
- 0GPC3MP13_Q0002 -> CAPA Details (GUID=CAPA Plan)
- 0GPC3MP14_Q0002 -> Remediation Details (GUID=Remediation Plan)

Section G, step 3 covers the details of maintaining these delivered 'jumps' in RSBBS. The only required modifications are to the server and client specified in the delivered Receiver Object URLs. Some of the parameters mentioned above are sent to the Web Dynpro application using the URL and others are sent using the Assignment Details of the RSBBS entry, but these settings are delivered and should not be modified.

How do I synchronize the InfoProvider timeframe selection with the key date for time dependent master data attributes in a query?
Several of the business content queries have been delivered with variables created to handle this situation (0GRM3MP03_Q0001, for example). The challenge arises from the Timeframe being defined at a higher level of granularity than 0CALDAY (which is used to determine the key date of the query), so the Timeframe end date needs to be derived and used as the query key date for the purpose of time dependent attribute determination. The delivered variables, when properly used, will automatically calculate the proper key date based on the timeframe entered by the user for the query. The solution requires using 0GFN_TF variable 0P_GFN_TF (single value, required entry) or 0I_GFN_TF (interval, required entry) in conjunction with SAP Exit variable 0P_GFN_TF_PE (or 0P_GFN_TF_IE for the interval variable, respectively). The user enters the desired timeframe (or timeframe interval) in order to select the proper records from the InfoProvider. The exit variable then uses this value to determine the proper timeframe end date. The variable 0P_GFN_TF_PE is entered in the Key Date field of the query properties screen and is thus used to determine time dependent attribute values in the query output. Custom queries need to use these variables if time dependent attributes are included; otherwise the default value (the current system date) will be used as the query's key date, which could produce inconsistent results in the output.
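The end-date derivation performed by such an exit variable can be illustrated conceptually. This is a sketch only: the 'YYYYQn' quarterly timeframe format is an assumption made for the example, not the actual PC/RM timeframe ID format.

```python
import calendar
from datetime import date

# Conceptual sketch of what the SAP Exit variable 0P_GFN_TF_PE does: derive
# the end date of the user-entered timeframe so it can serve as the query key
# date. The "YYYYQn" quarterly format used here is purely hypothetical - real
# PC/RM timeframe IDs and frequencies are defined in the source system.
def timeframe_end_date(timeframe):
    year = int(timeframe[:4])
    quarter = int(timeframe[5])
    last_month = quarter * 3  # final month of the quarter
    last_day = calendar.monthrange(year, last_month)[1]
    return date(year, last_month, last_day)

print(timeframe_end_date("2010Q2"))  # 2010-06-30
```

With the end date derived this way and used as the key date, time dependent attributes in the output stay in sync with the timeframe the user selected rather than with the day the query happens to run.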
© Copyright 2024