How to Use BPC Data Manager to Import Master Data
How To… Use BPC Data Manager to Import Master Data & Transaction Data from 3rd Party Systems via SAP BusinessObjects Data Integrator

SAP Product Name:
SAP BusinessObjects Planning and Consolidation, version for NetWeaver
Applicable Product Versions:
7.0
Last Updated:
10/2009
Enterprise Performance Management www.sdn.sap.com/irj/sdn/bpx-epm
© Copyright 2009 SAP AG. All rights reserved. No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP AG. The information contained herein may be changed without prior notice. Some software products marketed by SAP AG and its distributors contain proprietary software components of other software vendors. Microsoft, Windows, Outlook, and PowerPoint are registered trademarks of Microsoft Corporation. IBM, DB2, DB2 Universal Database, OS/2, Parallel Sysplex, MVS/ESA, AIX, S/390, AS/400, OS/390, OS/400, iSeries, pSeries, xSeries, zSeries, z/OS, AFP, Intelligent Miner, WebSphere, Netfinity, Tivoli, and Informix are trademarks or registered trademarks of IBM Corporation in the United States and/or other countries. Oracle is a registered trademark of Oracle Corporation. UNIX, X/Open, OSF/1, and Motif are registered trademarks of the Open Group. Citrix, ICA, Program Neighborhood, MetaFrame, WinFrame, VideoFrame, and MultiWin are trademarks or registered trademarks of Citrix Systems, Inc. HTML, XML, XHTML and W3C are trademarks or registered trademarks of W3C®, World Wide Web Consortium, Massachusetts Institute of Technology. Java is a registered trademark of Sun Microsystems, Inc. JavaScript is a registered trademark of Sun Microsystems, Inc., used under license for technology invented and implemented by Netscape. MaxDB is a trademark of MySQL AB, Sweden. SAP, R/3, mySAP, mySAP.com, xApps, xApp, and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP AG in Germany and in several other countries all over the world. All other product and service names mentioned are the trademarks of their respective companies. Data
contained in this document serves informational purposes only. National product specifications may vary. These materials are subject to change without notice. These materials are provided by SAP AG and its affiliated companies ("SAP Group") for informational purposes only, without representation or warranty of any kind, and SAP Group shall not be liable for errors or omissions with respect to the materials. The only warranties for SAP Group products and services are those that are set forth in the express warranty statements accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty. These materials are provided “as is” without a warranty of any kind, either express or implied, including but not limited to, the implied warranties of merchantability, fitness for a particular purpose, or non-infringement. SAP shall not be liable for damages of any kind including without limitation direct, special, indirect, or consequential damages that may result from the use of these materials. SAP does not warrant the accuracy or completeness of the information, text, graphics, links or other items contained within these materials. SAP has no control over the information that you may access through the use of hot links contained in these materials and does not endorse your use of third party web pages nor provide any warranty whatsoever relating to third party web pages. SAP “How-to” Guides are intended to simplify the product implementation. While specific product features and procedures typically are explained in a practical business context, it is not implied that those features and procedures are the only approach in solving a specific business problem using SAP products. Should you wish to receive additional information, clarification or support, please refer to SAP Consulting. 
Any software coding and/or code lines / strings ("Code") included in this documentation are only examples and are not intended to be used in a productive system environment. The Code is only intended to better explain and visualize the syntax and phrasing rules of certain coding. SAP does not warrant the correctness and completeness of the Code given herein, and SAP shall not be liable for errors or damages caused by the usage of the Code, except if such damages were caused by SAP intentionally or through gross negligence.
Table of Contents
1 Scenario
2 Background Information
3 Prerequisites
4 Known Limitations
5 The Step By Step Solution
5.1 Import ABAP Configuration into your landscape
5.2 Configure Data Services Designer
5.3 Create the Data Manager Package from the BPC Excel Client
5.4 Create the Data Integrator Batch Job
5.5 Setup Data Integrator Logon Credentials as Web Admin Parameters
5.6 Running the Data Manager Package
5.7 Check the Master Data
6 Appendix
6.1 Data Manager Dynamic Script
6.2 Mapping Coding for EntityTransformation.ROWINDEX
6.3 Mapping Coding for EntityTransformation.ROWDATA
1 Scenario
This guide shows how to integrate SAP BusinessObjects Data Integrator with BPC NW 7.0. The intended purpose is to allow business users who are familiar with BPC Data Manager to execute SAP BusinessObjects Data Services jobs through the BPC Data Manager user interface. This gives BPC users the same experience and familiarity they are used to, while opening BPC NW to direct integration with 3rd party solutions without going through flat file interfaces. These scenarios also allow BPC users to continue to use BPC transformation and conversion files to do mapping and transformations on top of the 3rd party sources. For example, SAP BusinessObjects Data Services can be used to load data from any number of databases into BPC NW.

There are two scenarios outlined in this guide:
1. Load Master Data from a flat file
2. Load Transaction Data from a flat file

While these packages natively exist in BPC today, this guide uses these two examples for the sake of simplicity. Using the techniques outlined within this guide, customers could load Master Data or Transaction Data from any 3rd party system supported by SAP BusinessObjects Data Services directly into BPC NW, while continuing to use the BPC Data Manager user interface and transformation and conversion files.

Note: It is also possible to let users who are familiar with Data Integrator do transformation and conversion of data within Data Integrator packages. The decision of whether to do transformation and conversion configuration within Data Services or BPC depends on what the customer prefers.
2 Background Information
For customers unfamiliar with how Data Services works, here is some background information. Data Services is configured to move and transform data from a source database table to a target database table. When working with NetWeaver, Data Services does not connect to the NetWeaver Application Server. Instead, it connects directly to the database underneath the NetWeaver Application Server in order to move and copy data. This provides very fast handling of large data volumes.

Within BPC, the Data Manager package executes a Data Services job, which extracts data from the source specified in the Data Services job and pushes the data into the BPC File Service. The BPC File Service has two specific types of files:
1. A binary file
2. Table storage

Data Manager within BPC uses the table storage mechanism. Therefore, the way this solution works is that Data Services specifies a database table as a target. During design time, a template table name is used. At runtime, a dynamically created database table name is passed to the Data Services job via a substitution parameter, and this new database table name is used as the target. Once the data is in this table, the Data Manager package can read from it and continue importing the data into a BPC Dimension or BPC Application using BPC transformation and conversion files.
3 Prerequisites
This guide requires the following prerequisites.

Client desktop prerequisites:
• BPC Excel Client
• BPC Admin Client
• SAP BusinessObjects Data Services Designer 12.2.1 or above

Server-side prerequisites:
• SAP BusinessObjects Data Services Server 12.2.1 or above
• SAP BusinessObjects Planning and Consolidation 7.0, version for NetWeaver
4 Known Limitations
The dynamic table name parameterization introduced in Designer 12.2.1 is only supported for the following databases:
• SQL Server
• DB2
• Oracle
5 The Step By Step Solution

5.1 Import ABAP Configuration into your landscape
This How-To guide contains the transport request files K900893.PMR and R900893.PMR. This transport request contains all the NetWeaver objects that are required to complete this How-To Guide, including custom classes, custom process types, custom process chains, and default package instructions.

Classes:
• ZCL_UJD_BPC_RUN_DI_JOB – Process type class for running the DI job
• ZCL_UJD_RUN_DI_JOB – Functional class for running the DI job
• ZCL_SBOBJ_DI_API – Data Integrator API class

Process types:
• ZBPCRDIJOB – Process type for Run DI Job

Process chains:
• ZBPC_IMP_MDATA_DI – Import Master Data from DI
• ZBPC_IMP_TDATA_DI – Import Transaction Data from DI

Supporting programs:
• ZSBOBJDI_API_TEST – Data Integrator API test program

Database tables:
• ZUJD_DM_TMPL – Data Manager table template

As the process of importing a transport request is not covered here, it is suggested that you seek assistance from your NetWeaver Basis administrator in order to have this transport request imported into your system.

Note: These objects must be imported into your system before continuing any further.
5.2 Configure Data Services Designer
Data Services Designer 12.2.1 introduced the ability to use a substitution parameter when defining the target database table. You must first configure the Designer in order to expose this new feature.
1. Go to the DSConfig.txt file found in the C:\Program Files\Business Objects\BusinessObjects Data Services\bin directory and open it.
2. Find the parameter called “Enable_Table_Parameterization” and change the value to “YES”. Save the file.
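For reference, the edited entry in DSConfig.txt should look like the following sketch (the surrounding section names vary by Data Services installation; only the parameter value matters):

```ini
; DSConfig.txt — found in the Data Services bin directory
; Enables the dynamic table name / substitution parameter feature
Enable_Table_Parameterization=YES
```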
5.3 Create the Data Manager Package from the BPC Excel Client
Once the transport is imported into your system, we will create the Data Manager Packages from the BPC Excel Client. We will cover two different scenarios here. The first is creating a Data Manager package to import Master Data from a flat file into BPC. The second scenario is creating a Data Manager package to import Transactional Data from a flat file into BPC.
1. Launch the BPC Excel Client and log on to the “ApShell” Appset or a copy of the “ApShell” appset. In this example, we will be using an Appset called APSHELL_HTG. Use the PLANNING application. From the action pane, click “Manage Data”.
2. Next, click “Maintain data management”
3. Next, click “Manage packages (organize list)”.
4. In this dialog, we can create the packages. Select the "Data Management" package group, and click the "new package" icon to create a new package.
5. Click the icon next to the input field for Process Chain.
6. Select the “BPC: Planning and Consolidation: System” group on the left, and then select the ZBPC_IMP_MDATA_DI process chain on the right. Finally click the “Select” button. Note: If loading transaction data, choose the ZBPC_IMP_TDATA_DI process chain.
7. Continue creating the package, and enter the package name as “ImportMasterDataFromDI”, and the description as “Import Master Data from DI”. Also check both checkboxes under Task Type. Finally click the “Add” button. Note: If loading transaction data, enter the package name as “ImportTransactionDataFromDI”, as well as an appropriate description.
8. Now click the “Save” button. A message will appear stating that the package has been saved successfully, and the dialog will close.
9. From the action pane, once again click the “Manage packages (organize list)” link to go back into the package maintenance screen.
10. Select the ImportMasterDataFromDI package and right‐click on it. Choose “Modify Package”.
11. In the following dialog, click the icon to view the package.
12. The “Data Manager Package and Dynamic Script Editor” will then be launched. Here you can configure how the dialog screen for this package will look and behave. Click the “Advanced” button at the top to launch the editor.
13. The default instructions were installed with the transport request. You may modify to prompt for additional DI job parameters if required. Refer to the Appendix 6.1 for more information. Click “Ok” if the script has been modified, otherwise click “Cancel”.
14. Control is then passed back to the previous dialog; click the "Save" button.
15. Control is then passed back to the previous dialog; click the "Save" button again.
16. Finally, at the last dialog, click “Save” one last time. You will receive a message saying that the package has been saved successfully.
5.4 Create the Data Integrator Batch Job
In this step, we will create a job using the Data Services Designer. For this example, the DI job will import a flat .txt file, and reformat it into the required structure. Then finally the job will write the formatted data to the table in the underlying database of the NetWeaver system directly. Following this, the data will be imported either into the BPC Dimension as master data or into a BPC Application as transaction data.
1. Log on to the Data Services server using the Data Services Designer. Check with the Administrator for log on information.
2. Click on the “Data Stores” tab.
3. Right‐click in the white space of this window, and choose “New”.
4. In the following dialog, enter the connection information for the underlying database of the BPC NetWeaver system. In this example, we are dealing with a SQL Database. Then click “Apply” and “OK”. Note: Remember we are writing data directly to the underlying database, and not going through the NW application layer.
5. The Datastore is now created.
6. In this example, we will attempt to load master data for the Entity dimension. The data will come from the supplied Entity.csv file. Note: If you are loading transaction data, use the Planning.csv file supplied with this how‐to guide.
7. You must copy the data file to the c:\temp directory on the server in which the Data Services server was installed. Note: Again, use the Planning.csv file if loading transaction data.
8. Next, click on the “Formats” tab.
9. Right‐click on “Flat Files”, and choose “New”.
10. In the following dialog box, enter the name of the format as “Entity”. Change the location to “Job Server”. Change the root directory to C:\temp and enter the file name as Entity.csv and press Enter. Note: If you are loading transaction data, then use the Planning.csv file, and name the file format as “Planning”.
11. Simply click "Yes".
12. The right side of the dialog box will then be populated with data.
13. Finally, click “Save & Close”.
14. The format is now created.
15. Next, click on the “Data Flows” tab.
16. Right‐click on “Data Flows” and choose “New”.
17. Next, change the name of the new “Data Flow” to “EntityMasterData”. Note: If loading transaction data, name it as “PlanningTransactionData”.
18. Double‐click on the newly created Data Flow. The canvas on the right should be empty.
19. Go back to the “Formats” tab.
20. Drag/drop the "Entity" format into the white area on the right.
Note: If loading transaction data, drag/drop the "Planning" format.
21. When dropping into the white area on the right, choose "Make Source" in the context menu.
22. The "Data Flow" layout should look similar to this.
23. Next, click on the "Query Transform" icon on the right side of the designer window. Now simply click in the white space area of the "Data Flow" layout.
24. A new "Query" icon will show in the layout; change the name of this to something meaningful, like "EntityTransformation".
Note: If you are loading transaction data, then name it "PlanningTransformation".
25. The "Data Flow" should now look similar to this.
26. Connect the source file with the query transformation. Simply draw a line between the two using the mouse.
27. Go back to the "Datastore" tab.
28. Expand the BPCNW datastore, right-click on the "Tables" node, and choose "Import By Name…".
29. Enter the name of the transparent table for the Data Manager file. Use the template table ZUJD_DM_TMPL. The designer needs a valid table in order to get the structure of the target. We will then use a substitution parameter to pass the table name at runtime. The "Owner" value will probably always be the SYSID of the NetWeaver system. This value is case sensitive.
30. The table has now been defined in the datastore.
31. Drag and drop the imported table into the "Data Flow" layout on the right.
32. When dropping into the "Data Flow" layout, choose "Make Target" from the context menu.
33. The Data Flow should now look similar to this.
34. Connect the "EntityTransformation" with the target by simply drawing a line between the two.
Note: If loading transaction data, connect the "PlanningTransformation" with the target.
35. In order to pass parameters to the Data Flow, the parameters first need to be defined. In this example, we will demonstrate the procedure by passing the client field. The transparent table in NetWeaver contains a client field called MANDT. We need to fill this field with the correct client for the BPC NW system. We could of course hardcode the value here in the definition, but for this example, we will pass it using global job parameters. Choose Tools, Variables.
36. Right-click on "Parameters" and choose "Insert".
37. Now, double-click on the newly created parameter.
38. Enter the name of the parameter as $MANDT, select the data type and parameter type, and enter a length of 3. Then click "Ok".
39. The new parameter is now created. Later we will use this parameter to fill the MANDT field in the target structure of the Data Flow.
40. Return to the "Data Flow" layout.
41. Double-click on the "EntityTransformation" object.
Note: If loading transaction data, double-click on the "PlanningTransformation" object.
42. You should now see a screen similar to this.
43. Double-click on MANDT in the right side of this screen.
44. In the mapping box, enter $MANDT. Then click "Ok". Here we are simply mapping the MANDT field in the target structure to the Data Flow parameter $MANDT. Later we will map $MANDT to the global variable $GV_MANDT, which will have its value passed from the BPC package.
45. Now, single-click on the ROWINDEX field on the right side of the screen.
46. Enter the script as you see here. Since the ROWINDEX in the table is a character field with a length of 10, we have to do some formatting when generating the new row index. You may copy and paste this from Appendix 6.2.
47. Next, click on the ROWDATA field in the right side of the screen.
48. At the bottom of the screen, enter the following coding. This coding concatenates the fields from the source into the ROWDATA field in the target, separated by a comma. The || refers to the concatenate function. Copy and paste the coding from Appendix 6.3.
49. Return to the Data Flow layout, and double-click on the Target Table.
50. Click the "Options" tab at the bottom, and check the "Delete data from table before loading" checkbox.
51. In order to pass the actual table name to the batch job dynamically at runtime, we need to use a substitution parameter. Create a substitution parameter by choosing Tools, Substitution Parameter Configurations.
52. In this dialog, scroll to the bottom and enter a new parameter called $$DMTABNAME, then click "Ok".
53. Next, return to the "Options" tab. In the dynamic table name variable field, enter the name of the substitution parameter as [$$DMTABNAME].
54. From the Local Object Library, click on the "Jobs" tab.
55. Right-click on "Batch Jobs" and choose "New" from the context menu.
56. Rename the job as "EntityMDataLoad" and press Enter.
Note: If loading transaction data, then name it as "PlanningTDataLoad".
57. Next, double-click on the newly created job.
58. Click on the "Data Flow" tab and drag and drop the "EntityMasterData" data flow into the right side of the "Job" screen.
Note: If loading transaction data, drag and drop the "PlanningTransactionData" data flow into the right side.
59. You should now see the data flow in the layout of the job.
60. From the menu, choose Tools, Variables.
61. Now we are in the context of the batch job. Here we will create a Global Variable for the client. Right-click on Global Variables and choose "Insert".
62. Now double-click on the newly created global variable.
63. Rename the global variable as $GV_MANDT, choose the data type, and set the length. Again, it may make sense to hard code the value here, but for this example, we want to demonstrate how to pass the value from the package, so we will leave it blank for now. Click "Ok".
64. Next, click on the "Calls" tab, and double-click on the $MANDT parameter.
65. Now enter the value for this parameter as the global variable $GV_MANDT. Click "Ok".
66. Go back to the "Jobs" tab.
67. Next, expose the job as a web service. Right-click on the "EntityMDataLoad" job. Choose Properties.
Note: If loading transaction data, right-click on the "PlanningTDataLoad" job.
68. Click on the "Attributes" tab, then click on "Web_Service_Enabled". In the "Value" box, enter YES. Finally, click "Apply", and "Ok".
69. Next, click the "Save All" button in the main toolbar.
70. Next, choose Validation, All Objects in View.
71. You should have no errors. You may get a warning around the conversion of the generated row number. It can be ignored.
72. Finally, set up security for the new job as a web service. Go to the Data Services Management Console: launch a browser and enter http://<servername>:28080/DataServices/launch/logon.do as the URL. Enter the administrator credentials, and click "Login".
Note: Check with the administrator for log on information.
73. Click on the "Administrator" link.
74. Next, click on the "Web Services" link on the left.
75. Click on the "Web Services Configuration" tab.
76. Click the checkbox next to the "EntityMDataLoad" job.
Note: If loading transaction data, click the checkbox next to the "PlanningTDataLoad" job.
77. At the bottom of the screen, choose "Enable Session Security" from the listbox, and click the "Apply" button.
78. The job will then be set for Session Security, which means that when the web service is called, a user id and password will be required.
5.5 Setup Data Integrator Logon Credentials as Web Admin Parameters
Now we have a few options. We can either prompt for the username and password in the BPC Data Manager package, or we can set these as web admin parameters. If your company allows business power users to create Data Integrator jobs, then it is recommended to prompt for the username and password for connecting to Data Integrator in the BPC Data Manager prompts. However, if your Data Services jobs are typically set up by IT, then the BPC Administrator can simply specify a service user id and password as web admin parameters to be used for the connection to Data Services. In this example, we will set the web admin parameters with the connection information.
1. Log on to the BPC Web Administration. From the BPC Excel Client, expand the "Available Interfaces" section of the action pane. Click on BPC Administration.
2. From the action pane inside the BPC Administration window, click the line for "Set AppSet Parameters".
3. Scroll to the bottom of the page, so that you can see the empty row in which you can add new parameters.
4. In the "KeyID" field, enter DI_USER_ID as shown. Also enter the user id for the DI server in the "Value" field. Click the "Update" button at the top of the page.
5. Repeat step 4 for two other parameters, one called DI_PASSWORD and the other called DI_SERVER_HOST. Supply the values for these parameters and click the "Update" button at the top of the page.
Note: The “Run DI Job” process type will attempt to get these values from the package prompts first, but if they are not specified there, it will look for the values as web admin parameters here.
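The lookup order described in this note can be sketched as follows (illustrative Python; the function and variable names are ours, not part of the BPC process type):

```python
def resolve_di_setting(key, package_prompts, web_admin_params):
    """Mimic the lookup order described above: the "Run DI Job" process
    type checks the package prompts first, and falls back to the web
    admin parameters (DI_USER_ID, DI_PASSWORD, DI_SERVER_HOST)."""
    value = package_prompts.get(key)
    if value:  # a non-empty prompt value wins
        return value
    return web_admin_params.get(key)

# Example: the user id was not prompted, so the web admin parameter is used.
prompts = {"DI_PASSWORD": "secret"}
admin = {"DI_USER_ID": "di_service", "DI_PASSWORD": "ignored"}
print(resolve_di_setting("DI_USER_ID", prompts, admin))   # → di_service
print(resolve_di_setting("DI_PASSWORD", prompts, admin))  # → secret
```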
5.6 Running the Data Manager Package
Running the data manager packages is typically the task of a BPC Power User or End User.
1. Log on to the BPC Excel Client. From the action pane, click "Manage Data".
2. Next, click "Run a data management package".
3. Choose the package "ImportMasterDataFromDI", and click the "Run" button.
Note: If loading transaction data, choose the package "ImportTransactionDataFromDI".
4. Enter the name of the Data Integrator job; in this case it is "EntityMDataLoad". Enter any parameters; for example, here we need to supply the client in which BPC is installed.
Note: These parameter values can be hardcoded and not prompted to the user if needed. This is just to showcase building the Data Services jobs generically.
If loading transaction data, enter "PlanningTDataLoad" as the DI job name.
5. Supply the import file which the DI job will write to. The file will also be used by the downstream processes in the chain, such as the convert and load processes. Click the folder icon next to the field.
6. In the following dialog, click on the "DataFiles" folder, and supply the file name as "EntityDataFile.csv" or some meaningful name. Then click the "Open" button. This file will be created in the BPC file service at runtime and used as a staging area for the data from Data Services. This file service file can then be read by the downstream process types in the chain.
7. Next, specify the transformation file and click "Next".
8. Choose the correct dimension from the listbox, and click "Finish".
Note: If loading transaction data, there will be no prompt for the dimension, but there will be other prompts.
9. You can then check the status by selecting the package and choosing the "View Status" button.
10. Select the package run, and click on the "Detail" button.
11. The log is then displayed. Notice that any and all warning or error messages from Data Integrator will be shown in the log.
Note: The warning message shown here is not an error, and is expected in all scenarios.
5.7 Check the Master Data
1. Log on to the BPC Admin Client. From the left side of the screen, expand the AppSet node. Then expand the Dimension Library node. Finally, click on the "Entity" node.
2. From the action pane, click "Maintain dimension members". The members will then be shown in the center panel of the screen.
3. You should see the newly created records from the Entity.csv file.
6 Appendix
6.1 Data Manager Dynamic Script
The following is the default instructions code used for the ZBPC_IMP_MDATA_DI package. A DI job parameter, called GV_MANDT, has been defined as a prompt. The end user may fill in this value. The names of the DI parameters must match exactly what has been defined in the DI batch job (without the $). Notice the TASK which contains the DI_JOB_PARAM. If you add any DI job parameters as PROMPTs, then you must also add that field to that task as well.
PROMPT(TEXT,%DI_JOB%,"Please Enter DI Batch Job Name",)
PROMPT(TEXT,%GV_MANDT%,"DI Job Parameter- GV_MANDT",)
PROMPT(INFILES,,"Import file:",)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(DIMENSIONNAME,%DIMNAME%,"Dimension name:",,,%DIMS%)
INFO(%TEMPNO1%,%INCREASENO%)
INFO(%TEMPNO2%,%INCREASENO%)
INFO(%EQU%,=)
INFO(%TAB%,;)
TASK(ZBPC_IMP_MDATA_DI,SAPPSET,%APPSET%)
TASK(ZBPC_IMP_MDATA_DI,DI_JOB,%DI_JOB%)
TASK(ZBPC_IMP_MDATA_DI,DI_JOB_PARAM,GV_MANDT%EQU%%GV_MANDT%)
TASK(ZBPC_IMP_MDATA_DI,FILE,%FILE%)
TASK(/CPMB/MASTER_CONVERT,OUTPUTNO,%TEMPNO1%)
TASK(/CPMB/MASTER_CONVERT,FORMULA_FILE_NO,%TEMPNO2%)
TASK(/CPMB/MASTER_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
TASK(/CPMB/MASTER_CONVERT,SUSER,%USER%)
TASK(/CPMB/MASTER_CONVERT,SAPPSET,%APPSET%)
TASK(/CPMB/MASTER_CONVERT,SAPP,%APP%)
TASK(/CPMB/MASTER_CONVERT,FILE,%FILE%)
TASK(/CPMB/MASTER_CONVERT,DIMNAME,%DIMNAME%)
TASK(/CPMB/MASTER_LOAD,INPUTNO,%TEMPNO1%)
TASK(/CPMB/MASTER_LOAD,FORMULA_FILE_NO,%TEMPNO2%)
TASK(/CPMB/MASTER_LOAD,DIMNAME,%DIMNAME%)
Note: If there is more than one parameter for a job, then you can simply prompt for it using this syntax.
PROMPT(TEXT,%GV_PARAM1%,"DI Job Parameter- GV_PARAM1",)
PROMPT(TEXT,%GV_PARAM2%,"DI Job Parameter- GV_PARAM2",)
PROMPT(TEXT,%GV_PARAM3%,"DI Job Parameter- GV_PARAM3",)
Then for the TASK, you can add those parameters to the existing DI_JOB_PARAM parameter.
TASK(ZBPC_IMP_MDATA_DI,DI_JOB_PARAM,GV_PARAM1%EQU%%GV_PARAM1%%TAB%GV_PARAM2%EQU%%GV_PARAM2%%TAB%GV_PARAM3%EQU%%GV_PARAM3%)
Notice the INFO lines: %EQU% refers to (=) and %TAB% refers to (;). The resulting DI_JOB_PARAM value has the format A=B;C=D. This format must be used in order for the process type to parse it correctly and create the required XML to pass to the job. Also, the parameter names, such as GV_PARAM1, must match the global variable name in the DI job, without the $ at the beginning.
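To make the format concrete, here is an illustrative parser for a DI_JOB_PARAM string (our own sketch; the actual parsing is performed inside the custom process type):

```python
def parse_di_job_param(raw):
    """Split a DI_JOB_PARAM string such as "GV_MANDT=100;GV_PARAM1=X"
    into {global_variable_name: value} pairs. A semicolon (%TAB%)
    separates pairs; '=' (%EQU%) separates a name from its value."""
    params = {}
    for pair in raw.split(";"):
        if not pair.strip():
            continue  # tolerate trailing separators
        name, _, value = pair.partition("=")
        params[name.strip()] = value.strip()
    return params

print(parse_di_job_param("GV_MANDT=100;GV_PARAM1=X"))
# → {'GV_MANDT': '100', 'GV_PARAM1': 'X'}
```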
The following is the default instructions code for the ZBPC_IMP_TDATA_DI package. You can add prompts to this script in the same way as above.
PROMPT(TEXT,%DI_JOB%,"Please Enter DI Batch Job Name",)
PROMPT(TEXT,%GV_MANDT%,"DI Job Parameter- GV_MANDT",)
PROMPT(INFILES,,"Import file:",)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(RADIOBUTTON,%CLEARDATA%,"Select the method for importing the data from the source file to the destination database",0,{"Merge data values (Imports all records, leaving all remaining records in the destination intact)","Replace && clear datavalues (Clears the data values for any existing records that mirror each entity/category/time combination defined in the source, then imports the source records)"},{"0","1"})
PROMPT(RADIOBUTTON,%RUNLOGIC%,"Select whether to run default logic for stored values after importing",1,{"Yes","No"},{"1","0"})
PROMPT(RADIOBUTTON,%CHECKLCK%,"Select whether to check work status settings when importing data.",1,{"Yes, check for work status settings before importing","No, do not check work status settings"},{"1","0"})
INFO(%EQU%,=)
INFO(%TAB%,;)
INFO(%TEMPNO1%,%INCREASENO%)
INFO(%ACTNO%,%INCREASENO%)
TASK(ZBPC_IMP_TDATA_DI,SAPPSET,%APPSET%)
TASK(ZBPC_IMP_TDATA_DI,DI_JOB,%DI_JOB%)
TASK(ZBPC_IMP_TDATA_DI,DI_JOB_PARAM,GV_MANDT%EQU%%GV_MANDT%)
TASK(ZBPC_IMP_TDATA_DI,FILE,%FILE%)
TASK(/CPMB/CONVERT,OUTPUTNO,%TEMPNO1%)
TASK(/CPMB/CONVERT,ACT_FILE_NO,%ACTNO%)
TASK(/CPMB/CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
TASK(/CPMB/CONVERT,SUSER,%USER%)
TASK(/CPMB/CONVERT,SAPPSET,%APPSET%)
TASK(/CPMB/CONVERT,SAPP,%APP%)
TASK(/CPMB/CONVERT,FILE,%FILE%)
TASK(/CPMB/CONVERT,CLEARDATA,%CLEARDATA%)
TASK(/CPMB/LOAD,INPUTNO,%TEMPNO1%)
TASK(/CPMB/LOAD,ACT_FILE_NO,%ACTNO%)
TASK(/CPMB/LOAD,RUNLOGIC,%RUNLOGIC%)
TASK(/CPMB/LOAD,CHECKLCK,%CHECKLCK%)
TASK(/CPMB/LOAD,CLEARDATA,%CLEARDATA%)
6.2 Mapping Coding for EntityTransformation.ROWINDEX
Ltrim_blanks(to_char(gen_row_num(), '000000000'))
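As a cross-check of what this mapping produces, here is the same formatting expressed in Python (illustrative only; the Data Services expression above is what the job actually uses): gen_row_num() yields 1, 2, 3, …, to_char(…, '000000000') renders the number with nine digits, and Ltrim_blanks strips any leading blanks from the conversion.

```python
def row_index(row_number):
    # Zero-pad the generated row number to nine digits, matching the
    # result of Ltrim_blanks(to_char(gen_row_num(), '000000000')).
    return format(row_number, "09d")

print(row_index(42))  # → 000000042
```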
6.3 Mapping Coding for EntityTransformation.ROWDATA
varchar_to_long(Entity.Field1||','||Entity.Field2||','||Entity.Field3||','||Entity.Field4||','||Entity.Field5||','||Entity.Field6||','||Entity.Field7||','||Entity.Field8||','||Entity.Field9)
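The equivalent operation in Python terms (illustrative only; the sample values are made up, and the real mapping above operates on the Entity flat-file fields):

```python
def row_data(fields):
    # Concatenate the source fields into one comma-separated string,
    # as the chained || operators do in the ROWDATA mapping above.
    return ",".join(str(f) for f in fields)

print(row_data(["E1000", "Entity 1000", "USD"]))  # → E1000,Entity 1000,USD
```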