SAP Demand Signal Management, version for SAP BW/4HANA
CONFIDENTIAL
Document Version: 1.0 – 2018-11-30
© 2018 SAP SE or an SAP affiliate company. All rights reserved.
Content
1 SAP Demand Signal Management, version for SAP BW/4HANA . . . . 9

2 Data Model . . . . 13
2.1 Master Data . . . . 13
    Context . . . . 14
    Handling Descriptions . . . . 15
    Handling Products . . . . 16
    Handling Locations . . . . 22
    Handling Time . . . . 24
    Handling Hierarchies . . . . 25
2.2 Transaction Data . . . . 29
    Handling Transaction Data from Retailers . . . . 29
    Handling Retail Panel Data from Market Research Companies . . . . 32
2.3 Enhancing the Data Model . . . . 32
    Example: Adding New Attributes . . . . 33
2.4 Data Flow . . . . 36
    Data Flow for Master Data . . . . 37
    Data Flow for Transaction Data . . . . 39
    Process Chains in the Data Flow . . . . 41
2.5 Configuring and Administrating the Data Model . . . . 42
    Defining Location Calendars . . . . 44
    Delete Account Assignments . . . . 45
    Setting Up Semantic Partitioning . . . . 45
    Direct Update of Master Data . . . . 51
    Check the Customizing for Data Model . . . . 72
2.6 Configuring and Administrating Global Market Share Analysis . . . . 72
    Define Market Groups . . . . 72
    Include Images in Global Market Share Analysis User Interfaces . . . . 73

3 Data Upload . . . . 76
3.1 General Settings for Data Upload . . . . 76
    Basic Settings for Uploading Data in Files . . . . 77
    Basic Settings for Data Upload in Tables . . . . 85
    Scheduling and Monitoring Jobs for Automatic Data Uploads . . . . 86
    Manage Agreements . . . . 95
3.2 Configuring and Administrating the Data Upload . . . . 123
    Enabling the Upload of Name/Value Pairs With Special Characters . . . . 126
    Configuring the Data Upload for Retail Panel Data . . . . 127
    Configuring the Data Upload for Retailer Data . . . . 154
    One-Time Uploads Before Recurring Regular Data Uploads Are Started . . . . 164
    Setting Up Automatic Upload with Folder Scanner . . . . 166
    Setting Up Automatic Upload with PSA Scanner . . . . 169
    Manual Data Upload . . . . 169
    Checking Completeness of Data Deliveries . . . . 171
    Archiving . . . . 174
3.3 Supervising the Data Upload . . . . 174
    Process Flow Control . . . . 174
    Status Management . . . . 178
    Monitor Jobs . . . . 180
    Monitor Deliveries . . . . 180
    Data Upload Monitor . . . . 190
    Error Handling . . . . 190
    Planning Data Deliveries . . . . 195
    Monitoring Data Deliveries . . . . 200

4 Quality Validation . . . . 202
4.1 Configuring Quality Validation . . . . 203
4.2 Overview of Key Figures in Quality Validation . . . . 204
4.3 Reports for Administrating Quality Validation . . . . 207

5 Data Harmonization . . . . 208
5.1 Harmonized Objects and Source Objects . . . . 209
    Example: Harmonized Product with Assigned Source Products . . . . 210
5.2 Object Types Used in Data Harmonization . . . . 211
5.3 Work Items . . . . 212
5.4 Units of Measure in Data Harmonization . . . . 214
5.5 Name/Value Pairs . . . . 215
5.6 Search Strategy . . . . 215
5.7 User-Specific Parameters . . . . 217
5.8 Support of Global Reporting in Data Harmonization . . . . 218
5.9 Configuring and Administrating Data Harmonization . . . . 219
    Set Up Data Harmonization . . . . 221
    Defining the Appearance of Object Attributes . . . . 221
    Setting Up Automatic Harmonization . . . . 223
    Defining Priorities for Copying Attribute Values into Harmonized Objects . . . . 224
    Defining Restrictions for Harmonized Attribute Values . . . . 226
    Defining Harmonization Groups . . . . 230
    Maintain Name Value Pair Filters . . . . 231
    Reports for Administrating Data Harmonization . . . . 232
    Scheduling Background Jobs for Data Harmonization . . . . 235
    Activate Conversion Routines for MATERIAL, EANUPC, GLN and DATA_LOAD_ID . . . . 236
    Activate the Action Log . . . . 237
    Activate Data Origin per Attribute for Source Objects . . . . 238
    Activate Distinct ABBIDs in Data Harmonization UIs . . . . 239
    Activate the Action Log and DOASO for User-Created Object Types . . . . 240
    Maintaining General Harmonization Parameter . . . . 241
5.10 Automatic Data Harmonization . . . . 242
5.11 Manual Tasks in Data Harmonization . . . . 244
    Defining Derivation Instruction Sets for Harmonized Attribute Values . . . . 245
    Defining Derivation Instruction Sets for Source Attribute Values . . . . 251
    Monitoring Work Items . . . . 254
    Manual Mapping . . . . 255
    Editing Harmonized Objects . . . . 260
    Editing Source Objects . . . . 261

6 Integration with Machine Learning . . . . 262
6.1 Configuration . . . . 262
6.2 Set up Model Definition . . . . 262
6.3 Build Machine Learning Environment . . . . 263

7 Data Enrichment . . . . 264
7.1 Sales Data Enrichment . . . . 266
    Periods with Sales Deviations . . . . 267
    Location Calendar . . . . 267
7.2 Stock Data Enrichment . . . . 268
7.3 Overview of Key Figures in Data Enrichment . . . . 269
7.4 Executing Data Enrichment . . . . 270
7.5 Enhancing the Data Enrichment Process . . . . 271
7.6 Configuring and Administrating Data Enrichment . . . . 273
    Defining Periods with Sales Deviations . . . . 274

8 Global Reporting . . . . 276
8.1 Provide Data for Global Reporting . . . . 278
    Data Consolidation . . . . 279
    Extrapolation . . . . 281
    Example: Extrapolation . . . . 283
    Time Split . . . . 285
    Time Split: Example for Weekday Split Profile . . . . 287
    Time Split: Example for Weekly Split Profile . . . . 289
    Time Split: Example for Monthly Split Profile . . . . 291
    Rules for Defining Profiles . . . . 292
    Delete Unused Profiles . . . . 293
8.2 Supervise Data for Global Reporting . . . . 294
    Release Data for Global Reporting . . . . 295
    Manage Publishing Groups . . . . 297
    Publish Data . . . . 299
8.3 Global Market Share Analysis . . . . 300
    Global Market Share Overview . . . . 301
    Gainer/Loser Overview . . . . 303
    Root Cause Analysis . . . . 304
    Key Figures in Global Market Share Analysis . . . . 307
    Calculated Key Figures for Global Market Share Analysis . . . . 308
    Restricted Key Figures for Global Market Share Analysis . . . . 310
    Configuring and Administrating Global Market Share Analysis . . . . 311

9 Analytics and Reporting . . . . 315
9.1 Global Market Share Analysis . . . . 316
    Global Market Share Overview . . . . 317
    Gainer/Loser Overview . . . . 318
    Root Cause Analysis . . . . 319
    Key Figures in Global Market Share Analysis . . . . 322
    Calculated Key Figures for Global Market Share Analysis . . . . 323
    Restricted Key Figures for Global Market Share Analysis . . . . 325
    Configuring and Administrating Global Market Share Analysis . . . . 326
9.2 Key Figures for SAP Demand Signal Management, version for SAP BW/4HANA . . . . 330

10 Integration with SAP Integrated Business Planning . . . . 335
10.1 Configuring the Integration with IBP . . . . 336
10.2 Assigning Retailer Locations to Manufacturer DC and IBP Customer . . . . 338
10.3 Releasing Data for IBP . . . . 340
    Release Data for IBP . . . . 341
    Mass Release to IBP . . . . 343
10.4 Transferring Data to IBP . . . . 343

11 ATMA Interface . . . . 345
11.1 Configuring the ATMA-Interface . . . . 345
    Installing SAP BW Content . . . . 346
    Installing the ATMA-Interface Content . . . . 347
    Authorization . . . . 354
    Customizing . . . . 354
    Loading with DSiM . . . . 356
    Load Flat File . . . . 368
    Upload to DMF . . . . 385
12 Configuration and Administration . . . . 391
12.1 Configuring and Administrating the Data Model . . . . 394
    Defining Location Calendars . . . . 396
    Delete Account Assignments . . . . 397
    Setting Up Semantic Partitioning . . . . 397
    Direct Update of Master Data . . . . 403
    Check the Customizing for Data Model . . . . 425
12.2 Configuring and Administrating the Data Upload . . . . 425
    Enabling the Upload of Name/Value Pairs With Special Characters . . . . 428
    Configuring the Data Upload for Retail Panel Data . . . . 429
    Configuring the Data Upload for Retailer Data . . . . 456
    One-Time Uploads Before Recurring Regular Data Uploads Are Started . . . . 466
    Setting Up Automatic Upload with Folder Scanner . . . . 468
    Setting Up Automatic Upload with PSA Scanner . . . . 471
    Manual Data Upload . . . . 471
    Checking Completeness of Data Deliveries . . . . 473
    Archiving . . . . 476
12.3 Configuring Quality Validation . . . . 476
12.4 Configuring and Administrating Data Harmonization . . . . 478
    Set Up Data Harmonization . . . . 480
    Defining the Appearance of Object Attributes . . . . 481
    Setting Up Automatic Harmonization . . . . 482
    Defining Priorities for Copying Attribute Values into Harmonized Objects . . . . 484
    Defining Restrictions for Harmonized Attribute Values . . . . 485
    Defining Harmonization Groups . . . . 489
    Maintain Name Value Pair Filters . . . . 490
    Reports for Administrating Data Harmonization . . . . 491
    Scheduling Background Jobs for Data Harmonization . . . . 495
    Activate Conversion Routines for MATERIAL, EANUPC, GLN and DATA_LOAD_ID . . . . 496
    Activate the Action Log . . . . 497
    Activate Data Origin per Attribute for Source Objects . . . . 498
    Activate Distinct ABBIDs in Data Harmonization UIs . . . . 499
    Activate the Action Log and DOASO for User-Created Object Types . . . . 499
    Maintaining General Harmonization Parameter . . . . 501
12.5 Configuring and Administrating Data Enrichment . . . . 501
    Defining Periods with Sales Deviations . . . . 502
12.6 Configuring and Administrating Global Market Share Analysis . . . . 504
    Define Market Groups . . . . 504
    Include Images in Global Market Share Analysis User Interfaces . . . . 505
12.7 Configuring Analytics and Reporting . . . . 507
    Configuring the Web Dynpro Reports . . . . 507
    Configuring the Analytics Dashboards Created with SAPUI5 . . . . 509
12.8 Extensibility of Analytics and Reporting . . . . 519
    Extensibility of the Web Dynpro Reports . . . . 519
    Extensibility of the Analytics Dashboards Created with SAPUI5 . . . . 523

13 Roles for SAP Demand Signal Management, version for SAP BW/4HANA . . . . 530
13.1 Business Roles for SAP Demand Signal Management, version for SAP BW/4HANA . . . . 531
13.2 Administrator for Demand Data Foundation . . . . 533
13.3 Data Upload Supervisor . . . . 534

14 Apps for SAP Demand Signal Management, version for SAP BW/4HANA . . . . 536
14.1 Manage Agreements . . . . 537
    App Implementation: Manage Agreements . . . . 538
    App Extensibility: Manage Agreements . . . . 539
14.2 Manage Contexts . . . . 539
    App Implementation: Manage Contexts . . . . 540
    App Extensibility: Manage Contexts . . . . 541
14.3 Manage Data Origin . . . . 542
    App Implementation: Manage Data Origin . . . . 543
    App Extensibility: Manage Data Origin . . . . 544
14.4 Manage Data Provider . . . . 544
    App Implementation: Manage Data Provider . . . . 545
    App Extensibility: Manage Data Provider . . . . 546
14.5 Manage Derivation Instruction Sets (Products/Locations) . . . . 546
    App Implementation: Manage Derivation Instruction Sets . . . . 547
    App Extensibility: Manage Derivation Instruction Sets . . . . 549
14.6 Manage Objects (Products/Locations) . . . . 549
    App Implementation: Manage Objects . . . . 550
    App Extensibility: Manage Objects . . . . 551
14.7 Manage Region . . . . 551
    App Implementation: Manage Region . . . . 552
    App Extensibility: Manage Region . . . . 553
14.8 Manage Time Derivation . . . . 554
    App Implementation: Manage Time Derivation . . . . 555
    App Extensibility: Manage Time Derivation . . . . 556
14.9 Match External Hierarchy . . . . 556
    App Implementation: Match External Hierarchy . . . . 557
    App Extensibility: Match External Hierarchy . . . . 558
14.10 Monitor Deliveries . . . . 558
    App Implementation: Monitor Deliveries . . . . 559
    App Extensibility: Monitor Deliveries . . . . 560
14.11 Plan Data Deliveries . . . . 561
    App Implementation: Plan Data Deliveries . . . . 562
    App Extensibility: Plan Data Deliveries . . . . 563
14.12 Restrict Attribute Value . . . . 563
    App Implementation: Restrict Attribute Value . . . . 564
    App Extensibility: Restrict Attribute Value . . . . 565
14.13 Harmonization Status . . . . 565
    App Implementation: Harmonization Status . . . . 566
    App Extensibility: Harmonization Status . . . . 567
14.14 Manage Bill of Materials . . . . 568
    App Implementation: Manage Bill of Materials . . . . 569
    App Extensibility: Manage Bill of Materials . . . . 570
14.15 Monitor Jobs . . . . 570
    App Implementation: Monitor Jobs . . . . 571
    App Extensibility: Monitor Jobs . . . . 572

15 DSIM Fiori Authorization Roles OData . . . . 574
1 SAP Demand Signal Management, version for SAP BW/4HANA
Product Information
Product SAP Demand Signal Management, version for SAP BW/4HANA
Release 1.0
Based on
● SAP HANA Appliance 1.0
● SAP BW/4HANA Core Component 1.0 SP07
Note: Before installing SAP Demand Signal Management, version for SAP BW/4HANA, refer to the release information note (RIN) 2707571.
BI Content Release SAP BW/4HANA Content Add-on & Content Basis Add-on 1.0 SP05
Documentation published November 2018
SAP Demand Signal Management, version for SAP BW/4HANA provides a robust, state-of-the-art foundation for managing the entire data upload and harmonization process. The software, which includes comprehensive business process monitoring and control, offers an efficient approach to managing huge volumes of highly disparate data.
The application leverages powerful and modern analytics and user interface (UI) technology, together with the performance gains due to SAP HANA, to provide manufacturers with up-to-date views of demand signals like end consumer sales and store inventory, as well as a comprehensive view of the overall market.
Powered by SAP HANA, SAP Demand Signal Management, version for SAP BW/4HANA contains a consistent, centralized in-memory database that stores, integrates, cleanses, and harmonizes large volumes of data from disparate data sources. The data can include external downstream data, such as POS data from retailers, as well as market research data from various providers and company-internal data like ERP master data.
Note: References to market research data are intended only for retail panel data.
Implementation Considerations
Key Capabilities
Scalable and centralized in-memory database for large volumes of data
● Capture large volumes of downstream demand data, such as POS data, and market research data in a single instance
● Automate the process to cleanse, validate, harmonize, and enrich data with additional information during the data upload
Central source of truth for advanced analytics
● Prepackaged analytics to instantly analyze large volumes of demand and market data along multiple dimensions
● Extend and create purpose-built analytics and mobility applications to meet the specific needs of various roles and departments in the organization
● Integration of social media data with analytics is prepared
Benefits of SAP Demand Signal Management, version for SAP BW/4HANA
● Greater insight into the market through monitoring the competitive landscape and shifts in consumer demand and consumer segments, helping to increase market share and brand value
● Better visibility into sales performance and retailer and promotions compliance based on yesterday's POS data down to the store/SKU level, providing the opportunity to react in a timely fashion
● Monitoring trade promotions performance across different stores
● Globally consistent reporting of retail panel data across countries and product categories independently of the delivery schedule of the data
● Efficient classification in order to meet different reporting needs and provide insightful segmentations
● High quality, timely business information as needed by business users
● More reliable supply plans that better reflect actual demand
● Reduced inventory costs due to less uncertainty because of better downstream visibility
● Faster response to demand fluctuations
● Increased trustworthiness of external, uploaded, and harmonized data throughout the company
● Efficient, consistent, and enterprise-wide distribution and availability of relevant demand data
Features
Components of SAP Demand Signal Management, version for SAP BW/4HANA
SAP Demand Signal Management, version for SAP BW/4HANA contains the following components:
● Data Upload
SAP Demand Signal Management, version for SAP BW/4HANA contains predefined interfaces for the data to be imported from external sources, such as retailers and market research companies. The data can be uploaded automatically by making the corresponding settings in Customizing. You can upload data either from files or by using extraction, transformation, and loading tools such as SAP Data Services or SAP Information Interchange OnDemand.
● Data Quality Validation
The data quality is checked using specified key figures to calculate, for example, the percentage of unknown stores and products, or the number of stores for which data is missing compared to the last data delivery.
● Data Harmonization
Data harmonization maps and harmonizes master data objects, such as products and locations, from different data origins. For example, you receive product master data from different retailers. Each retailer uses different identifiers for each product and different terminology for product attributes. To ensure that you have a consistent set of master data for consuming applications and for analytics and reporting, the data must be harmonized. Therefore, all external data is mapped into one harmonized object with consistent attributes.
● Data Enrichment
In this stage of the upload process, data is enriched with additional records and calculated key figures to provide insights and to gain visibility into the on-shelf availability and the in-stock availability in retail stores. For example, you need to generate zero sales records if such data is not provided by the data origin, taking into account whether a store was open for sale or not. This information is a prerequisite for the precise calculation of the additional key figures for out-of-shelf and out-of-stock situations, making corresponding incidents transparent in reporting.
● Process Flow Control
You can control and monitor the entire upload process for each data delivery.
● Provisioning of Global Market Research Data
You can prepare market research data for globally consolidated reporting across different countries and product categories.
● Analytics and Reporting
You can use various preconfigured reports and incorporate your own reports using tools such as SAP BusinessObjects Dashboards.
● Integration with SAP Integrated Business Planning
You can transfer POS data from SAP Demand Signal Management, version for SAP BW/4HANA, to SAP Integrated Business Planning. This data can be used in SAP Integrated Business Planning for demand to improve the quality of short-term forecasting.
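The zero-sales generation described under Data Enrichment can be pictured with a short Python sketch. All names and data structures here are hypothetical illustrations, not the product's actual implementation: the idea is simply that missing records are filled with zero quantities, but only for days on which a store was open.

```python
from datetime import date

def enrich_with_zero_sales(sales, products, store_open_days):
    """Add a zero-sales record for every (store, product, open day)
    combination that the data origin did not deliver.
    sales: list of dicts with keys store, product, day, qty.
    store_open_days: store -> list of days the store was open."""
    existing = {(r["store"], r["product"], r["day"]) for r in sales}
    enriched = list(sales)
    for store, open_days in store_open_days.items():
        for day in open_days:  # days the store was closed are skipped entirely
            for product in products:
                if (store, product, day) not in existing:
                    enriched.append(
                        {"store": store, "product": product, "day": day, "qty": 0})
    return enriched
```

With the zero records in place, out-of-shelf and out-of-stock key figures can distinguish "no sales while in stock" from "no data delivered".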
2 Data Model
Use
The data model for SAP Demand Signal Management, version for SAP BW/4HANA consolidates heterogeneous data from different data origins in one model.
The following data is combined and processed in BI Content objects:
● Master data: product data, location data, time data
● Transactional data: stock data, sales data
● Market research data
These data types can have the following data origins:
● Internal company data
● Retailers
More Information
Context [page 14]
2.1 Master Data
Reporting in SAP Demand Signal Management, version for SAP BW/4HANA is based on the following dimensions:
● Product
● Location
● Time
You can use internal and external hierarchies for products and locations.
2.1.1 Context
Use
In a system landscape with a multitude of different data origins and data providers, it is important to ensure that the products and locations you report on are unique.
The external IDs of locations and products sent to the SAP Demand Signal Management, version for SAP BW/4HANA system are combined with a context to make the products and locations unique. The context is assigned to the corresponding data delivery agreement and added to the external IDs of locations and products.
The system creates a unique internal number for each combination of context and external ID of the products and locations. Therefore, you have to assign a number range for products and a number range for locations to each context. We recommend that you use different number ranges for different contexts. For more information, see the Customizing for Data Model under Cross-Application Components > Demand Data Foundation > Data Model > General Settings.

Using the context that is assigned to your data delivery agreements enables the system to create different internal IDs for different products even if they share the same external ID in the uploaded data. Therefore, we strongly recommend that you use different contexts for different data origins.
Transactional data is stored only with the internal numbers generated by the system. By using contexts, you ensure that you can correctly report on different products sharing the same external ID.
Note: You assign data origin and context to a data delivery agreement independently. Therefore, it is technically possible to use the same context in several data delivery agreements for different data origins. If your specific business requires this setup, you must ensure that the same external product ID always refers to the same physical product, regardless of where the data comes from. In this exceptional case, one context is sufficient. However, if you cannot rule out that the same external ID is used for different physical products, you must assign different contexts to the corresponding data delivery agreements to avoid data inconsistencies and data loss.
Note: Another exceptional case is using different contexts for different data delivery agreements of the same data origin. This might be necessary if, for example, two retailers have just merged and have not yet harmonized their master data systems.
Example: In the data deliveries of Retailer X, product ID 12345 is used for shortbread fingers. In the data deliveries of Retailer Y, the same product ID is used for chocolate cookies. In this case, you need different contexts to be able to differentiate between the two products in the system.
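The context mechanism in this example can be sketched in a few lines of Python. The class, the context names, and the number-range start values are invented for illustration; the point is that each (context, external ID) pair draws a stable internal number from its own number range.

```python
from itertools import count

class InternalIdGenerator:
    """Assign a stable internal number to each (context, external ID)
    pair, drawing from a separate number range per context."""
    def __init__(self, number_ranges):
        # number_ranges: context -> first number of its assigned range
        self.counters = {ctx: count(start) for ctx, start in number_ranges.items()}
        self.assigned = {}

    def internal_id(self, context, external_id):
        key = (context, external_id)
        if key not in self.assigned:
            self.assigned[key] = next(self.counters[context])
        return self.assigned[key]
```

With one context per retailer, external ID 12345 from Retailer X and from Retailer Y map to two different internal numbers, so the shortbread fingers and the chocolate cookies stay distinct in transactional data.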
Context, data origin, and data provider are defined in Customizing and extracted to the corresponding InfoObjects.
More Information
BI content documentation:
● (/DDF/CDPA)
● (/DDF/CDORIGIN)
2.1.2 Handling Descriptions
Use
The following rule of thumb applies to InfoObjects that are modeled with text:
● InfoObjects extracted from Customizing have language-dependent texts. Examples:
○ Data origin, data provider, data delivery agreement
○ Type of sales data, stock type
● InfoObjects that receive descriptions from outside SAP Demand Signal Management, version for SAP BW/4HANA have language-independent texts. These description texts are uploaded in the language in which they are delivered. Examples:
○ Product, location
○ Local attributes of product, such as product category or brand
Using Contexts to Identify Attributes
Retailers often provide only descriptions and no identifiers (IDs) for their attributes. To ensure that the attributes are unique, they are compounded with a context. If a data origin provides only texts for a local attribute, the system generates a unique ID using the internal number range objects. In the given context, the attributes can then be clearly identified.
The following attributes are compounded with the context:
● Product group (/DDF/MATL_GRPL)
● Manufacturer (/DDF/MANUFACTL)
● Brand (/DDF/BRANDL)
● Product category (/DDF/PRODCATL)
● Product subcategory (/DDF/PRODSCATL)
Dealing with Descriptions Over 60 Characters
The descriptions of InfoObjects are restricted to a length of 60 characters. However, master data delivered by market research companies quite often contains longer descriptions for products and locations. To enable the handling of long descriptions, products and locations have two additional text attributes. Using /DDF/TXTLG1 and /DDF/TXTLG2, you can split up long descriptions into two fields of 60 characters each. This is implemented in the transformation of the master data flow for market research data.
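The split into /DDF/TXTLG1 and /DDF/TXTLG2 can be sketched as follows. This is a minimal illustration, not the product's transformation logic; the helper name and the word-boundary preference are assumptions.

```python
def split_description(text, width=60):
    """Split a long description into two fields of at most `width`
    characters each (/DDF/TXTLG1 and /DDF/TXTLG2), preferring a break
    at a word boundary. Anything beyond 2 * width is truncated."""
    if len(text) <= width:
        return text, ""
    cut = text.rfind(" ", 0, width + 1)   # last space within the first field
    if cut <= 0:
        cut = width                       # no space found: hard cut
    return text[:cut].rstrip(), text[cut:].strip()[:width]
```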
Note: In market research data, locations are often referred to as markets.
More Information
Context [page 14]
2.1.3 Handling Products
Use
The same product is sold by different retailers using different product IDs. To be able to identify products of different data origins, the data model differentiates between the following InfoObjects:
● InfoObject Source Product (/DDF/PRODREF) for the external IDs compounded with the context.
● InfoObject Internal Source Product (/DDF/PRODUCT) using internal number range objects that are assigned to a context to ensure the uniqueness of the product.
The InfoObject Internal Source Product acts as the central product object and collects all the data that is relevant for reporting. It contains, for example, the following information:
○ Attributes provided by the data origin, for example brand, product category, manufacturer, or material group
○ Different units of measure, if provided
○ Internal material number from SAP ECC
○ Harmonized product from data harmonization
○ Product
● InfoObject Product (/DDF/GPID) groups products of different data origins with harmonized attributes.
Structure
Data Model for Products
Process
1. You define number ranges for internal source products in Customizing.
2. You assign a number range to a context as an attribute of the data delivery agreement in Customizing.
3. During data upload, the system generates a new unique internal number for each new combination of product and context. The system stores the assignment of source product and internal source product in the InfoObject Source Product.
Note: The same process can be used for company-internal data. You can extract material numbers from SAP ECC using the InfoObject Material (0MATERIAL) into InfoObject Source Product and generate an internal product number.
4. If required, you can also forward your products to data harmonization. During data harmonization, objects are grouped and combined in a harmonized object. The harmonized product and the corresponding harmonized attributes are extracted to the InfoObject Product (/DDF/GPID).
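The grouping in step 4 can be pictured with a small sketch. The data structures and attribute names below are hypothetical, not the actual harmonization API; the idea is that several internal source products are collected under one harmonized product (analogous to /DDF/GPID) that carries the harmonized attributes.

```python
def harmonize_products(assignments, harmonized_attrs):
    """Group internal source products under their harmonized product.
    assignments: internal product number -> harmonized product ID.
    harmonized_attrs: harmonized product ID -> harmonized attributes."""
    result = {}
    for internal_id, gpid in assignments.items():
        entry = result.setdefault(
            gpid, {"members": [], **harmonized_attrs.get(gpid, {})})
        entry["members"].append(internal_id)
    return result
```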
More Information
Context [page 14]
BI content documentation:
● (/DDF/PRODREF)
● (/DDF/PRODUCT)
● (/DDF/CDPA)
2.1.3.1 Example: Product Data
The following example shows a data delivery for products:
The same physical product is part of three uploads from different data origins:
● Two external files from different retail companies
● Internal manufacturer data from SAP ECC
Data Origin                    | Product ID | Product Group | Product Group Text   | ...
Retailer_1 (source: file)      | R1_33333   | 3300          | Chocolate 100 g      | ...
Manufacturer (source: SAP ECC) | M_1111     | 1110          | Chocolate            | ...
Retailer_2 (source: file)      | R2_777777  | not provided  | Chocolate bars 100 g | ...
The following Customizing settings are applied:
Data Delivery Agreement | Data Origin | Context | Number Range for Products | ...
R1                      | RETAILER_1  | R1      | R1                        | ...
COMP_INT                | COMP_INT    | ECC     | IC                        | ...
R2                      | RETAILER_2  | R2      | R2                        | ...
The upload is structured as follows:
Product Data Upload into InfoObjects
2.1.3.2 Units of Measure
Use
Retailers and market research companies typically do not provide any units of measure in the product master data and sales data they send. This information is usually embedded in the textual attributes of the corresponding files.
When preparing your data for reporting, you often have to deal with a variety of units of measure. The same product can be sold in different packaging units (for example, six pack or single bottles) and different weight or volume units (for example, due to a promotion).
Manufacturers usually sell their products to retailers in logistical units different from the units sold to the consumer at the POS (for example, in boxes or packs). Therefore, the logistical units of a manufacturer maintained in SAP ECC are usually multiples of the units bought by the consumer. This is why some manufacturers maintain conversion factors for the different units of measure in SAP ECC.
Reporting Requirements
Manufacturers are interested in different key figures that require an evaluation of the mixture of incoming units of measure as well as a conversion of units into the unit that is relevant for the particular report. The following table shows some examples of key figures and conversion requirements that could be relevant for reporting:
Key Figure | Conversion Requirement
How often was a certain brand or product sold at the POS, independent of the packaging unit or the volume unit? | Display number of scans at the POS
How many pieces of a product were sold? | Convert different packaging units into one typical dimensionless unit, for example, piece, each, or bottle
How many kilograms or liters were sold? | Convert the dimensionless units into the corresponding weight unit or volume unit
How many products (for example, pieces and kilograms) did the manufacturer sell to different retailers compared to the POS data provided by the retailers? (sell-in vs. sell-out) | Convert logistical units from manufacturer and retailer into one typical unit
More Information
Handling Units of Measure [page 20]
2.1.3.2.1 Handling Units of Measure
Use
Because the incoming data from retailers and market research companies usually does not provide any units of measure, you can maintain a default unit of measure in Customizing. This unit is added during the transaction data upload to define the consumer unit of measure. The consumer unit of measure is equivalent to the acoustic signal ("beep") when the products are scanned at the POS. It does not give any information about the number of pieces or the volume/weight that was scanned.
During the upload of the retailer product master data, the consumer unit is transferred to data harmonization as the base unit of the corresponding object. Within data harmonization, you have to convert the consumer unit of measure to the corresponding "known" alternative units (dimensionless, weight, or volume). You manually maintain the corresponding conversion factors. The manufacturer products and the alternative units of measure with conversion factors maintained in SAP ECC can be uploaded to data harmonization.
When assigning objects to a harmonized object in data harmonization, you can define a conversion factor. For the harmonized object you can also define alternative units (dimensionless, weight or volume) with the corresponding conversion factors from the base unit of measure.
With the help of BW extractors, all this information is pulled into the InfoObjects Internal Source Product and Product of the reporting layer. The conversion factors are extracted to the DataStore object (DSO) /DDF/DSUOM and assigned to the InfoObject Internal Source Product (/DDF/PRODUCT) with the consumer unit of measure (/DDF/CONSUNIT) as the base unit of measure. At this point in time, the conversion factors that have been maintained for the harmonized objects are inherited by the objects of /DDF/PRODUCT.
For reporting purposes, you can use the quantity conversion DSO and its conversion factors and define corresponding quantity conversion types (using transaction RSUOM) that allow the conversion of sales or stock quantities into consumer units or other units of measure you want to report on. That way, you can easily create reports based on a certain unit (dimensionless, volume, or weight) without having to calculate unit conversions. The system automatically converts the units during the execution of the query. Quantity conversion type /DSR/FUOMT serves as an example here.
Unit conversion into various dynamic target units, depending on each individual product, is supported at reporting runtime. You can use the quantity conversion types /DDF/PREF, /DDF/NVOL, and /DDF/NWT in transaction RSUOM.
The following table shows the relationship between objects in reporting and in data harmonization:
InfoObject     | Description of InfoObject | Data Harmonization
/DDF/CONSUNIT  | Consumer Unit             | Base unit of object
0BASE_UOM      | Base Unit                 | Base unit of object
/DDF/PREFUNIT  | Preferred Unit            | Base unit of harmonized object
/DDF/NORMWUNIT | Normalized Weight Unit    | Alternative units (with dimension weight)
/DDF/NORMVUNIT | Normalized Volume Unit    | Alternative units (with dimension volume)
Process
The overall process for making units of measure available for reporting runs as follows:
1. You load retailer products and company-internal products and their units of measure.
2. You define factors for the conversion from the base unit of measure of the object to the base unit of measure of the harmonized object.
3. You define conversion factors between the base unit of measure of the harmonized object and the alternative units of measure that you may need.
4. The system pulls the conversion factors from data harmonization into the generated quantity conversion DataStore object /DDF/DSUOM using InfoSource /DDF/HI_PRODUOM_ATTR.
5. You use quantity conversion types in transaction RSUOM for a dynamic conversion of units in reporting.
More Information
Units of Measure [page 19]
Units of Measure in Data Harmonization [page 214]
BI content documentation:
● (/DDF/DSUOM)
2.1.4 Handling Locations
Use
To be able to identify locations of different data origins, the data model differentiates between the following InfoObjects:
● Source Location (/DDF/LOCREF): This InfoObject contains the external location numbers and location hierarchy nodes compounded with the context (to ensure that internal locations are unique), the location node qualifier (used to distinguish between nodes and leaves of a hierarchy), and the location type (used to distinguish between stores and delivery locations).
● Internal Source Location (/DDF/LOCATION): This InfoObject is based on internal number range objects that are assigned to a context to ensure that internal location numbers are unique. It acts as the central InfoObject and collects, for example, the following information for reporting:
○ Country, region, city, and so on
○ Location type and delivery location (to differentiate between stores and distribution centers and to provide the information about a delivery location for a store)
○ Location
○ Harmonized location
○ Data origin, data provider, and context
○ Customer number as company-internal number from SAP ECC
● InfoObject Location (/DDF/GLID) groups locations with harmonized attributes.
Data Model for Locations
Process
1. You define number ranges for internal source locations in Customizing.
2. You assign a number range to a context as an attribute of the data delivery agreement in Customizing.
3. During data upload, the system generates a new unique internal number for each new combination of location and context. It stores the assignment of source location and internal source location in the InfoObject Source Location.
Note: The same process can be used for company-internal data. You can extract customer numbers from SAP ECC (InfoObject Customer Number (0CUSTOMER)) to InfoObject Source Location and generate an internal location number.
4. If required, you can forward your locations to data harmonization. During data harmonization, the objects are grouped and combined in a harmonized object. The harmonized location and the corresponding harmonized attributes are extracted to the InfoObject Location (/DDF/GLID).
More Information
Context [page 14]
BI content documentation:
● (/DDF/LOCREF)
● (/DDF/LOCATION)
● (/DDF/CDPA)
2.1.5 Handling Time
Retailers and market research companies usually provide time characteristics in various formats. To be able to use data from various retailers in the same report, for example, POS data, you have to convert the time characteristics to standard BW time characteristics.
Time in the Sales and Stock DataStore Objects
The delivered sales and stock data is represented as a time interval.
Example: If data is provided for January 1, 2012, the data record is transferred to the sales or stock data DataStore object using the time interval from 20120101000000 to 20120101235959.
/DDF/TIMEFROM is a timestamp that defines the starting point of the time interval, and /DDF/TIME represents the ending point of the time interval. /DDF/TIME is compounded with /DDF/TIMEFROM. It contains various standard BW time characteristics as attributes for use in BW reporting, for example, 0CALDAY, 0CALWEEK, and 0CALMONTH. Because time intervals can represent days or weeks, the dedicated InfoObject /DDF/TIMEAGGRLV is stored with the sales and stock data records, indicating whether the record contains weekly data (W) or daily data (D).
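The interval representation can be sketched as follows. This is a simplified illustration (the helper name is invented); it builds the /DDF/TIMEFROM and /DDF/TIME values for a daily or weekly record in the YYYYMMDDhhmmss format shown in the example above.

```python
from datetime import date, timedelta

def time_interval(start, aggregation_level):
    """Return the /DDF/TIMEFROM and /DDF/TIME timestamps
    (YYYYMMDDhhmmss strings) for a daily ('D') or weekly ('W') record
    beginning on `start`."""
    days = 1 if aggregation_level == "D" else 7
    end = start + timedelta(days=days - 1)
    return (start.strftime("%Y%m%d") + "000000",
            end.strftime("%Y%m%d") + "235959")
```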
Note: The master data InfoObject attributes of /DDF/TIME are updated based on the Date From and Date To fields in the transaction data flow of:
● Aggregated sales acquisition DSO /DDF/DS01
● Stock snapshot acquisition DSO /DDF/DS02
● Distributor stock acquisition DSO /DDF/DS08
● Distributor sales acquisition DSO /DDF/DS07
Time Data Provided by Market Research Companies
Market research companies deliver time data in various granularities, for example, weekly, four-weekly, monthly, or bi-monthly starting with even or odd months.
The InfoObject Market Research Time String (external) (/DDF/TIMEREF) stores the time strings that have been delivered by the market research companies along with BW time characteristics as attributes. A data flow from DataSource /DDF/TIME_ATTR_COL_DX to /DDF/TIMEREF encapsulates the conversion from the external time string to the BW time characteristic values, such as 0CALDAY or 0CALWEEK. In Customizing you can maintain rules for determining the correct BW time characteristic values. The granularity, the starting day, and the first week of the year can be defined in the data delivery agreement.
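As a sketch of such a conversion rule, the following assumes a purely hypothetical external time string format 'WE dd.mm.yyyy' (week ending on the given date) and derives ISO-week-based values; the actual formats, starting day, and first-week rules come from Customizing and the data delivery agreement.

```python
from datetime import datetime

def parse_time_string(time_string):
    """Convert a hypothetical 'WE dd.mm.yyyy' market research time
    string into BW-style values: 0CALDAY (last day of the interval)
    and 0CALWEEK (YYYYWW; ISO week used here as a simplification)."""
    day = datetime.strptime(time_string.removeprefix("WE ").strip(),
                            "%d.%m.%Y").date()
    iso_year, iso_week, _ = day.isocalendar()
    return {"0CALDAY": day.strftime("%Y%m%d"),
            "0CALWEEK": f"{iso_year}{iso_week:02d}"}
```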
2.1.6 Handling Hierarchies
Use
SAP Demand Signal Management, version for SAP BW/4HANA uses BW hierarchies for products and locations. To make hierarchy data available for reporting, you can use one of the following approaches:
● Extracting company-internal hierarchies
○ You extract company-internal hierarchies from SAP ECC to BW hierarchies using the InfoObjects Material (0MATERIAL) and Customer (0CUSTOMER). With the help of joins and double joins, you can use these hierarchies when reporting on specific InfoObjects in SAP Demand Signal Management, version for SAP BW/4HANA.
Note: To work with double joins, you have to create a CompositeProvider.
○ You extract company-internal hierarchies to BW hierarchies of the InfoObjects in SAP Demand Signal Management, version for SAP BW/4HANA, for example, InfoObject Product (/DDF/PRODREF) and Location (/DDF/LOCREF). Joining these InfoObjects allows you to use the corresponding hierarchies in reporting. SAP Demand Signal Management, version for SAP BW/4HANA provides specific transformations for this purpose.
● Extracting company-external hierarchies
Market research companies send multiple hierarchies in a parent-child relationship along with the hierarchy metadata that explains the format and descriptions of the hierarchies and their nodes. This information can be uploaded to the InfoObjects Product (/DDF/PRODUCT) and Location (/DDF/LOCREF).
More Information
Company-Internal Hierarchies [page 26]
Market Research Hierarchies [page 27]
2.1.6.1 Company-Internal Hierarchies
Product Hierarchies
You can extract a company-internal product hierarchy from SAP ECC to the hierarchy of InfoObject Material (0MATERIAL) with standard extractors and DataSources. As InfoObject 0MATERIAL is an attribute of the InfoObject /DDF/GPID, and /DDF/GPID is an attribute of the InfoObject /DDF/PRODUCT, you can use the hierarchy of InfoObject 0MATERIAL for reporting on global products and internal products. This is done, for example, in sales analysis using CompositeProvider /DSR/CP10. As a prerequisite, 0MATERIAL must be assigned to the global object in data harmonization. Additionally, you can extract the company-internal product hierarchy to the hierarchy of Global Product (/DDF/GPID) and Product (/DDF/PRODREF) with the transformations provided by SAP Demand Signal Management, version for SAP BW/4HANA. These transformations evaluate the relationship between Material, Global Product, Internal Product, and Product. The resulting hierarchies of Global Product and Product look exactly like the hierarchy of the InfoObject Material, the only difference being that the material nodes are replaced by global products or products.
Example: You can extract a material hierarchy (table T179) with three levels to the BW hierarchy PRDHA of InfoObject Material (0MATERIAL) using DataSource 0MATERIAL_LPRH_HIER. The external characteristics Product Hierarchy Level 1 (0PRODH1), Product Hierarchy Level 2 (0PRODH2), and Product Hierarchy Level 3 (0PRODH3) provide the three levels. If you add these characteristics to the InfoObjects Global Product (/DDF/GPID) and Product (/DDF/PRODREF) as external characteristics, you can build a hierarchy equivalent to the BW hierarchy PRDHA with products as leaves (hierarchy of /DDF/PRODREF) or global products as leaves (hierarchy of /DDF/GPID).
Account Hierarchies
You can extract account hierarchies from SAP ECC to the hierarchy of InfoObject Customer (0CUSTOMER) with standard extractors and DataSources. If you assign the InfoObject Customer to the Global Location (/DDF/GLID) in data harmonization, you can use the account hierarchy for reporting. The double join in the CompositeProvider is based on the following relation: InfoObject 0CUSTOMER is an attribute of /DDF/GLID, and /DDF/GLID is an attribute of /DDF/LOCATION.
Example: You can extract the customer hierarchy of the class system (class type 011) from SAP ECC using DataSource 0CUSTOMER_LKLS_HIER. Additionally, you can extract this account hierarchy to the hierarchy of Global Location (/DDF/GLID) and Location (/DDF/LOCREF) with the transformations provided by SAP Demand Signal Management, version for SAP BW/4HANA. These transformations evaluate the relationship between Customer, Global Location, Internal Location, and Location. If the InfoObject Internal Location (/DDF/LOCATION) provides a delivery location (distribution center), the relationship is taken into account in the transformation and the delivery locations are assigned to the hierarchy. The locations (stores) are then assigned to the delivery locations as child nodes.
2.1.6.2 Market Research Hierarchies
Market research companies usually deliver the following dimensions in a hierarchical parent-child structure:
● Product
● Location (in market research data often referred to as Market)
● Time
The metadata information of the hierarchies is also included.
Example
● A product hierarchy could have the following levels: category, subcategory, segment, brand, and product.
● A market hierarchy could have the following levels: market, country, region, and ZIP code.
● A time hierarchy could have the following levels: year, half year, quarter, and month.
Within one data delivery there can be different hierarchies for one dimension.
Example
Product hierarchy 1: Product category, segment, brand, and products
Product hierarchy 2: Product category, subcategory, segment, brand, and products
Market research data is delivered for any level of the provided hierarchies, and the figures are calculated by the system applying complex calculation rules to the detailed data. Consequently, the delivered key figures cannot be aggregated using the BW Online Analytical Processing technology. Instead, the data is shipped as pre-calculated values for different aggregation levels; these values are known as pre-aggregated totals.
Pre-aggregation can affect all three dimensions (product, location, and time). The following example shows only one dimension:
Example of Hierarchy with One Dimension
To ensure correct reporting on different levels in the various hierarchies, an exception aggregation is defined for each key figure and dimension. To report on the pre-aggregated totals, priorities have to be available as reference characteristics for exception aggregation. The calculation of the priorities must be carried out correctly; this happens during data upload in two steps:
1. During master data upload the system calculates and stores a numeric hierarchy level in the corresponding master data objects Product, Location, and Time.
2. During transaction data upload the system calculates the priority for each data record.
Example
Priority = numeric hierarchy level of product * numeric hierarchy level of market (location) * numeric hierarchy level of time
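The two-step calculation above can be sketched as follows. The numeric hierarchy levels would normally be stored on the master data objects during master data upload; the node names and level values here are invented for illustration.

```python
# Sketch: step 1 (master data upload) stores a numeric hierarchy level
# per node; step 2 (transaction data upload) multiplies the three levels
# to obtain the priority of each data record.
PRODUCT_LEVEL = {"CATEGORY": 1, "BRAND": 4, "PRODUCT": 5}
MARKET_LEVEL = {"TOTAL_MARKET": 1, "REGION": 3}
TIME_LEVEL = {"YEAR": 1, "MONTH": 4}

def priority(product_node, market_node, time_node):
    # Priority = product level * market level * time level
    return (PRODUCT_LEVEL[product_node]
            * MARKET_LEVEL[market_node]
            * TIME_LEVEL[time_node])

p = priority("BRAND", "REGION", "MONTH")   # 4 * 3 * 4 = 48
```

A record on the most detailed level of every dimension thus gets the highest priority, which is what the exception aggregation needs to pick the correct pre-aggregated total for a given reporting level.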
Hierarchy Metadata
The metadata of the hierarchies of the dimensions product and market is uploaded and persisted in two DataStore objects (DSO):
● DSO Hierarchy Metadata (/DDF/DS71): Contains identifiers and long texts of external hierarchies.
● DSO Hierarchy Level Metadata: Contains the number of levels for each hierarchy and a long text for the levels.
To distinguish between the dimensions, the InfoObject Hierarchy Dimension /DDF/HDIM is a key field of both DSOs. You can influence the setup of the hierarchy, for example, you can change the descriptions of the hierarchies or hierarchy levels in Customizing or change the sort sequence using a customer-specific BAdI implementation.
Product, Market, and Time Hierarchies
The external hierarchy of market research data is created using the InfoObjects Product (/DDF/PRODREF), and Location (/DDF/LOCREF). To distinguish between hierarchy nodes and leaves, the master data records have a node qualifier for products (/DDF/PQUAL) and one for locations (/DDF/LQUAL). For market research data uploads the value of the qualifiers is set to M. The external product and location hierarchies are converted to a BW parent-child hierarchy using acquisition and propagation DSOs.
2.2 Transaction Data
The handling of transaction data differs from the handling of master data: transaction data provided by retailers is kept separate from transaction data provided by market research companies, because the delivered key figures differ considerably.
2.2.1 Handling Transaction Data from Retailers
Use
The following DataStore objects (DSO) of the propagation layer are used as the central InfoProviders for reporting on aggregated sales data or stock snapshot data:
● Aggregated Sales Propagation (/DDF/DS11)
● Stock Snapshot Propagation (/DDF/DS12)
Both DSOs contain the following key fields:
● Time Dimension
Time Stamp From (/DDF/TIME_FROM) and Generic Time (Time Stamp To) (/DDF/TIME) define the interval of the timestamps with reporting-relevant time characteristics as attributes (for example, calendar day, week, or the posting periods of a fiscal year variant defined in Customizing).
● Product Dimension
Internal Source Product (/DDF/PRODUCT) with all local attributes, units of measure, and the attributes (Product (/DDF/GPID), Source Product (/DDF/PRODREF), and Manufacturer Product (0MATERIAL)) to link to the attributes or hierarchies of those objects.
● Location Dimension
Internal Source Location (/DDF/LOCATION) with geographical attributes like country (0COUNTRY), region (0REGION), city (/DDF/CITY), attributes of the data delivery agreement (data origin, data provider, context), and the attributes Location (/DDF/GLID), Source Location (/DDF/LOCREF), and Account (0CUSTOMER) to link to the attributes or hierarchies of those objects.
There are additional transaction-specific key fields to separate different types of sales data (/DDF/SALGRP) for aggregated sales data or different types of stock (/DDF/STOCKTYPE) for stock snapshot data.
Note
These fields are defined in Customizing. They are important for the data enrichment process, because the value 0001 for the field Type of Sales is reserved for data records that have been created during sales data enrichment to distinguish them from regular sales (1000) or promotional sales (1001). Similarly, the Stock Type 00 is reserved for data records created during stock data enrichment.
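A small sketch of how the reserved code values separate enrichment-created records from delivered ones; the record layout is illustrative, only the code values follow the note above.

```python
# Sketch: the reserved Type of Sales value 0001 marks records created by
# sales data enrichment; 1000 and 1001 are regular and promotional sales.
ENRICHED = "0001"

records = [
    {"product": "P1", "sales_type": "1000", "qty": 10},   # regular sales
    {"product": "P1", "sales_type": "1001", "qty": 4},    # promotional sales
    {"product": "P1", "sales_type": ENRICHED, "qty": 0},  # created by enrichment
]

delivered = [r for r in records if r["sales_type"] != ENRICHED]
enriched = [r for r in records if r["sales_type"] == ENRICHED]
```

Queries can thus include or exclude the enrichment-created records (for example, out-of-shelf indicators with zero sales) simply by filtering on the reserved code.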
Data Model for Transaction Data (Simplified Overview)
More Information
Example: Use of CompositeProviders for Sales Data [page 31]
Data Enrichment [page 264]
BI content documentation:
● (/DDF/SALGRP)
● (/DDF/STOCKTYPE)
2.2.1.1 Example: Use of CompositeProviders for Sales Data
You can use CompositeProviders to link harmonized attributes or hierarchies of the InfoObjects Material (0MATERIAL), Source Product (/DDF/PRODREF), Account (0CUSTOMER), and Source Location (/DDF/LOCREF) to the corresponding DataStore objects using the internal representations Internal Source Product (/DDF/PRODUCT) and Internal Source Location (/DDF/LOCATION). The following graphic shows an example of how to make the navigational attributes of InfoProviders available for analytical applications.
Example of a CompositeProvider for Sales Data
2.2.2 Handling Retail Panel Data from Market Research Companies
The DataStore object (DSO) /DDF/DS14 of the Data Propagation Layer is used as the central InfoProvider for reporting on retail panel data from market research companies. You can use it as an example: in market research, huge numbers of attributes and key figures have to be processed, and which key figures are used for reporting depends strongly on the individual business needs.
The DSO contains the following key fields:
● Time Dimension
The External Time String (/DDF/TIMEREF) defines the time period of the market research data record. This can be a week or another time interval, for example, a month or a quarter. Customizing is available to derive the exact meaning of the external time string, mainly the date from and the date to. The external time string is compounded with the Data Delivery Agreement (/DDF/DDAGR) because it is not necessarily unique across different market research data deliveries.
● Product Dimension
Internal Product (/DDF/PRODUCT) with all local attributes, units of measure, and the attributes (Global Product (/DDF/GPID), Product (/DDF/PRODREF), and Manufacturer Product (0MATERIAL)) to link the global products and external or internal product numbers.
● Location Dimension
Internal Location (/DDF/LOCATION) with geographical attributes like country (0COUNTRY) or region (0REGION), attributes of the data delivery agreement (data origin, data provider, context), and the attributes Global Location (/DDF/GLID), Location (/DDF/LOCREF), and Account (0CUSTOMER) to link to global locations and external or internal location numbers. Locations in market research data usually do not refer to a physical location, but rather to a specific region (market). They often refer to a hierarchy node rather than to the most detailed level (leaf) of the hierarchy.
While the key fields are relevant in any business context, the attribute fields (characteristics and key figures) are customer-specific. The example DSO (/DDF/DS14) contains the following key figures:
● Sales key figures (in units, value, and volume)
● Baseline sales key figures (in units, value, and volume)
● Distribution key figures (numeric and weighted distribution)
● Priority key figure for exception aggregation
● Number of stores in the corresponding market (location)
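The derivation of the time interval from the external time string described above could be sketched as follows. The "YYYYWww" format for ISO weeks is an assumption for illustration only; real market research deliveries define their own formats, which are configured in Customizing per data delivery agreement.

```python
# Sketch: derive "date from" / "date to" from an external time string.
# Only an assumed ISO-week format ("2018W45") is handled here.
import datetime

def derive_interval(time_string):
    if "W" in time_string:                      # e.g. "2018W45" = ISO week 45
        year, week = time_string.split("W")
        start = datetime.date.fromisocalendar(int(year), int(week), 1)
        return start, start + datetime.timedelta(days=6)
    raise ValueError("unknown external time string format: " + time_string)

date_from, date_to = derive_interval("2018W45")
```

Because the same week string could mean different intervals for different providers, the compounding with the data delivery agreement mentioned above is what keeps the key unique.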
2.3 Enhancing the Data Model
Most of the common attributes of products and locations are delivered as attributes of the InfoObjects Internal Product (/DDF/PRODUCT) and Internal Location (/DDF/LOCATION). However, you might need to report on attributes that are not available in the standard content because they are only used in certain industries, categories, or countries.
There are different options for adding and integrating new attributes:
● Enhancing InfoObjects using standard SAP NetWeaver Business Warehouse capabilities
Depending on the nature of the attribute, you can add it to the internal product (/DDF/PRODUCT), the internal location (/DDF/LOCATION), or the corresponding global objects /DDF/GPID and /DDF/GLID. In this case, you have to enhance all affected objects (such as DataSources, transformations, and InfoSources) accordingly, using standard BW capabilities. This approach is suitable for common master data attributes that are required for global reports, for example, attributes that are relevant for multiple categories or countries.
● Using name-value pairs
Using the concept of name-value pairs, you can upload new product attributes and the corresponding values to the DataStore object (DSO) /DDF/DS53. Accordingly, new location attributes can be uploaded to the DSO /DDF/DS32. This approach is suitable for specific master data attributes that are required for local reports, for example, country-specific reports.
To make new attributes available for reporting, you have to filter them out of the name-value pairs and join them to the reporting objects. This can be done in the following ways:
● You can use the standard BI Content capabilities to create new InfoObjects for each attribute you want to add and filter the corresponding attribute values in transformations. Afterwards, you have to add the new InfoObjects to the Data Propagation Layer for transaction data and to combine the DSOs using MultiProviders or CompositeProviders.
● You can build a logical model without additional persistence using SAP NetWeaver Business Warehouse on SAP HANA database. For this purpose you have to create an analytic view for each additional attribute and publish this view using transaction RSDD_HM_PUBLISH. Afterwards, you can use a CompositeProvider to join the views as TransientProviders with the DSOs containing the transaction data. Queries built on top of the CompositeProvider filter the attribute values out of the name-value pair DSO during run time.
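The filter-and-join idea behind both options can be illustrated with a small sketch. The attribute names and products are invented; in the system, this filtering happens in a transformation (first option) or in a HANA view at query runtime (second option).

```python
# Sketch: new product attributes stored as (product, attribute name,
# value) name-value pairs; reporting filters one attribute out of the
# pairs and joins it to the product.
name_value_pairs = [
    ("P100", "PACK_MATERIAL", "GLASS"),
    ("P100", "LID_COLOR", "RED"),
    ("P200", "PACK_MATERIAL", "PLASTIC"),
]

def pivot(pairs, attribute):
    """Filter one attribute out of the name-value pairs: product -> value."""
    return {prod: value for prod, name, value in pairs if name == attribute}

pack_material = pivot(name_value_pairs, "PACK_MATERIAL")
```

The advantage of the name-value layout is that new attributes can be loaded without changing the data model; the price is this extra filter/join step before the attribute can be used like a regular navigational attribute.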
2.3.1 Example: Adding New Attributes
Use
Defining New Attributes
Depending on your reporting requirements, the standard product and location attributes might not be sufficient, and you may want to add new attributes:
Example for Additional Product and Location Attributes
Defining Name-Value Pairs
You could create the following name-value pairs for additional product attributes:
Example of Name-Value-Pairs for New Product Attributes
Integrating New Attributes
The new attributes could be integrated in the standard data model as follows:
Example for Integration of New Attributes in the Data Model
More Information
Enhancing the Data Model [page 32]
2.4 Data Flow
The data flow in Demand Signal Management is as follows:
1. The system extracts data from different DataSources to the Data Acquisition Layer.
2. The system transforms the data to the Data Propagation Layer to receive the internal representation of the objects (internal product and internal location) for reporting.
All steps of the data flow are embedded in the data management of SAP NetWeaver Business Warehouse using process chains. The process chains are assigned to process steps within the process flow control.
You can define that the quality validation is called (one step within a process definition) after the data has been posted to the Data Acquisition Layer. Depending on the key figures defined in quality validation, the data is forwarded to the Data Propagation Layer or it is rejected for further processing. Between the Data Acquisition
Layer and the Data Propagation Layer, master data can be forwarded to data harmonization to map and harmonize the objects, for example, products or locations. The result is that the global objects with consistent attributes can be extracted to the corresponding InfoObjects.
2.4.1 Data Flow for Master Data
The overall data flow for products and locations is very similar. It runs as follows:
1. Extract data to the Data Acquisition Layer using DataSources:
○ BOBJ DataSources to push retailer data to the Persistent Staging Area (PSA)
○ File DataSources to pull retailer data to the PSA
○ SAP DataSources to extract company-internal data from SAP ECC. SAP DataSources require an additional step in which texts and attributes of material or customer master data are combined before transforming the data to the corresponding acquisition DataStore objects. This step is necessary to avoid conflicts in the sequence of the attribute and text upload.
○ BI DataSources to pull market research data
2. Call quality validation (using the process flow control) to check if the delivered data can be forwarded according to the criteria defined within quality validation.
3. Create internal numbers and forward the data to the Data Propagation Layer.
4. Create or update the internal and harmonized master data of the corresponding InfoObjects.
5. Extract objects into internal source InfoObjects (for example, Internal Source Product) and harmonized objects into harmonized InfoObjects (for example, Product) using BI DataSources.
Note
You can define a process definition for a complete data upload that covers the transfer of data to data harmonization as well as the extraction of the harmonization results from this upload process into the relevant InfoObjects. Subsequent changes performed in data harmonization — for example, changing global records using the data harmonization UI or resolving work items — can be uploaded to the InfoObjects (for example, Internal Source Product and Product) using a separate process definition. This process definition must be assigned to a data delivery agreement of the type Harmonization Data. You assign the data delivery agreement in Customizing for Data Harmonization under Cross-Application Components Demand Data Foundation Data Harmonization General Settings.
Both process definitions can use the same process chains to extract data from data harmonization.
Simplified Overview of Data Flow for Master Data
2.4.1.1 Steps in the Data Flow for Master Data
The data flow for products and locations starts with the extraction from the DataSource to the Data Acquisition Layer. It consists of the following steps:
1. Extract Data from DataSource to Data Acquisition Layer
The master data from different DataSources is collected and enhanced.
○ File DataSource (contains only data sent by retailers)
  ○ A delivery instance is created by the process flow control (PFC) when the file scanner reads the files in the folder.
  ○ Delivery ID, process ID, and context are determined.
○ BOBJ DataSource
  ○ Delivery ID and data delivery agreement are already part of the DataSource.
  ○ Delivery instance for given delivery ID is created in a data transfer process (DTP) during manual data upload.
  ○ The context of the product or location is determined using the delivery ID.
○ SAP ECC (Internal Master Data)
  ○ The DataSource contains only master data from SAP ECC.
  ○ Master data is replicated to the consolidation DataStore object (DSO) without any additional logic.
  ○ The delivery instance is generated in the DTP between the consolidation DSO and the acquisition DSO using the logical system that is defined for the data delivery agreement.
  ○ The context of the product or location is determined.
2. Extract Data from Data Acquisition Layer to Data Propagation Layer and InfoObject
  1. Data Acquisition Layer to InfoObject
    ○ The system checks if the object already exists in the InfoObject.
    ○ If the object does not exist, a number is generated for the internal source object and the mapping between the source object and the internal source object is created.
  2. Data Acquisition Layer to Data Propagation Layer
    ○ An internal number for the source object is determined.
    ○ Attributes (data origin, data delivery agreement, and data provider) are determined.
    ○ Object is harmonized.
  3. Data Propagation Layer to InfoObject
    ○ The internal source object is saved.
    ○ The text of the internal source object is saved.
3. Extract Object
○ Harmonized object
  ○ The changes of the harmonized object in data harmonization are determined and extracted to BW. The last extraction is used as the selection criterion.
  ○ The changes are applied to the InfoObject of the harmonized object.
○ Object
  ○ Changes of the object in data harmonization are determined and extracted to BW. Objects have been assigned to a harmonized object. The last extraction is used as the selection criterion.
  ○ For products, any changes to harmonized products have to be determined as well, because some changes (normalized units) are maintained for the harmonized object in data harmonization and inherited by the assigned objects in BW.
  ○ In the case of product units of measure, only changes to harmonized products have to be determined, because all changes (addition of alternative units of measure) are maintained in data harmonization and inherited by the assigned objects in BW.
  ○ The changes are applied to the InfoObject of the object.
2.4.2 Data Flow for Transaction Data
The data flow for aggregated sales data and stock snapshot data from retailers is very similar to the data flow for retail panel data provided by market research companies. It runs as follows:
1. Extract data to the Data Acquisition Layer using DataSources.
○ BOBJ DataSources to push retailer data into the Persistent Staging Area (PSA)
○ File DataSources to pull retailer data into the PSA
2. Call quality validation (using the process flow control) to check if the delivered data can be forwarded according to the criteria defined in the quality validation.
3. Transform the data to the Data Propagation Layer.
4. For the transaction data provided by retailers (aggregated sales and stock snapshot data), you can execute stored procedures on the uploaded external mass data using DataSources for sales or stock data enrichment and create new data records in the PSA. These new records contain additional calculated key figures or indicators about out-of-shelf or out-of-stock situations and are posted back into the propagation DataStore object.
Data Flow for Retailer Transaction Data
2.4.2.1 Steps in the Data Flow for Aggregated Sales Data or Stock Snapshot Data
The data flow for aggregated sales data and stock snapshot data starts with the extraction from the DataSource to the Data Acquisition Layer and ends with the final extraction of the enriched data to the Data Propagation Layer. It consists of the following steps:
1. Extract Data from DataSource to Data Acquisition Layer
○ Enrich data delivery agreement.
○ Clean up type of sales data/stock type.
○ Determine product if only EAN/UPC is available.
○ Enrich time aggregation level (day or week).
○ Enrich consumer unit of measure if no unit is available. For stock data, enrich stock keeping unit.
○ Determine Recently Sold/Listed indicator (for sales data only).
○ Determine delivery ID and process ID for process flow control.
2. Extract Data from Data Acquisition Layer to Data Propagation Layer
○ Call quality validation.
○ Upload additional products and locations to the Data Acquisition Layer, which occur in transaction data, but for which no master data exists yet.
○ Extract additional master data. For more information, see Steps in the Data Flow for Master Data [page 38].
○ Convert source product and source location to internal source product and internal source location.
○ Set counter for Recently Sold/Listed entries (for sales data only).
○ Count stock records (constant 1) (for stock data only).
○ Determine generic time stamp (from and to time).
○ Perform time split on the data records.
3. Extract Data from Propagation DSO
○ Evaluate Customizing settings and execute stored procedures for the data enrichment process.
○ Create new records in the Persistent Staging Area.
4. Extract Data from Data Enrichment DataSource to Data Propagation Layer
○ Post new records back into Propagation DSO using the BW data flow.
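The time split mentioned in step 2 can be illustrated with a small sketch: a record whose from/to interval spans several days is split into one record per calendar day. The even distribution of the quantity across the days is an assumption for illustration, not necessarily the rule the system applies.

```python
# Sketch: split a multi-day record into one record per calendar day,
# distributing the quantity evenly (illustrative assumption).
import datetime

def time_split(record):
    days = (record["to"] - record["from"]).days + 1
    return [
        {"day": record["from"] + datetime.timedelta(days=i),
         "qty": record["qty"] / days}
        for i in range(days)
    ]

record = {"from": datetime.date(2018, 11, 5),
          "to": datetime.date(2018, 11, 7),
          "qty": 9}
split = time_split(record)
```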
2.4.2.2 Steps in the Data Flow for Retail Panel Data
The data flow for retail panel data starts with the extraction from the DataSource to the Data Acquisition Layer and it ends with the final extraction of the data to the Data Propagation Layer. It consists of the following steps:
1. Extract data from DataSource to Data Acquisition Layer
○ Determine data delivery agreement and context.
○ Fill currency and unit of measure with default values if they are not contained in the raw data.
2. Extract data from Data Acquisition Layer to Data Propagation Layer
○ Convert source product and source location to internal source product and internal source location.
○ Call the priority calculation to fill the priority attribute (/DDF/MRPRIO) correctly. This is necessary for a correct exception aggregation in the queries.
○ Perform time split on the data records.
2.4.3 Process Chains in the Data Flow
Use
All steps of the data flow are embedded in the data management of SAP NetWeaver Business Warehouse using individual process chains. These process chains are combined and grouped in superordinate process chains that can be used in process definitions for monitoring the status of the process flow.
The process chains in SAP Demand Signal Management, version for SAP BW/4HANA that are used in the process definition of the process flow control use the following naming conventions:
● /DDF/<…>_FILE: Transferring files to the DataStore object (DSO) in the Data Acquisition Layer
● /DDF/<…>_BOBJ: Transferring data to the Persistent Staging Area (PSA) using SAP Data Services and further on to the DSO in the Data Acquisition Layer
● /DDF/<…>_ADD: Extracting additional master data from the Data Acquisition Layer for transaction data and transferring it to the DSO for master data in the Data Acquisition Layer
● /DDF/<…>_STAGE: Transferring data from the acquisition DSO to the propagation DSO and to the corresponding InfoObjects
● /DDF/<…>_MR_EXTRACT: Extracting market research data to the PSA
● /DDF/G<…>_EXTRACT: Extracting harmonized data from data harmonization
● /DDF/<…>_EXTRACT: Extracting data from data harmonization
● /DDF/<…>_ENRICHMT_STAGE: Executing data enrichment and posting data to the propagation DSO
More Information
Examples of Process Definitions for Uploading Master Data from Harmonization [page 69]
BI content documentation:
2.5 Configuring and Administrating the Data Model
As an administrator for the data model you have the following tasks:
● Set up the data model in Customizing. For more information, see the Customizing documentation for Data Model under Cross-Application Components Demand Data Foundation Data Model.
● Configure the location calendar in the SAP Easy Access menu
● Set up semantic partitioning
● Delete account assignments from InfoObjects
● Activate the direct update of master data
● Check Customizing
Task | When? | Where?
Set up data model in Customizing | Implementation phase | Customizing under Cross-Application Components Demand Data Foundation Data Model
Define set of operating times | Before you upload locations of a new retailer | On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Location Calendar Define Set of Operating Times
Define location calendar | Before you upload locations of a new retailer | On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Location Calendar Define Location Calendar
Define periods of location calendar | Before you upload locations of a new retailer | On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Location Calendar Define Periods of Location Calendar
Define default assignment of calendar | Before you upload locations of a new retailer | On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Location Calendar Define Default Assignment of Calendar
Assign locations to calendars | On demand, to check the assignments, and to create or change assignments if necessary | On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Location Calendar Assign Locations to Calendar
Set up semantic partitioning | When you create a new data delivery agreement or a new context | Customizing under Cross-Application Components Demand Data Foundation Data Model. For more information, see Setting Up Semantic Partitioning [page 45].
Delete account assignments | On demand, in the rare case that an account represents an individual person and the records for this person must be deleted | On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Account Reference Delete Account Assignments
Activate direct update of master data | Implementation phase | On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Account Reference Update InfoObjects from Change Log. For more information, see Activating the Direct Update of Master Data [page 54].
Check Customizing | On demand | On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Account Reference Check Customizing for Data Model
2.5.1 Defining Location Calendars
Use
Within the Demand Data Foundation, you can define location calendars and assign them to locations to indicate the operating times of these locations (for example, stores or distribution centers of a retail company).
For SAP Demand Signal Management, version for SAP BW/4HANA it is important to know the operating days and sometimes even operating hours of a retailer's store. This information is needed to obtain precise results during the data enrichment of aggregated sales data. Location calendars can vary for different companies, countries, and regions. Therefore it is possible to define various location calendars and assign them to the respective locations. Determining the correct calendar and evaluating the calendar information (for example, to find out if a certain day is a working day or a public holiday, or to check operating hours of a location for a given day or week), can impact performance. Therefore, the system generates periods for each location calendar and persists the relevant information as a time stream. This time stream can be evaluated quickly by the consuming applications.
Depending on the application and data using the location calendar, you can define the required periods, configuring a calculation rule. Optionally, you can also define a set of operating times per weekday and assign it to a location calendar.
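The generation of such a persisted time stream could be sketched as follows. The holiday dates, the closed-on-Sunday rule, and the opening hours are invented for illustration; the real system derives them from the factory calendar, the period calculation rule, and the assigned set of operating times.

```python
# Sketch: generate the "time stream" of a location calendar from a
# holiday list and operating times per weekday, so that consumers can
# evaluate working days and opening hours quickly.
import datetime

HOLIDAYS = {datetime.date(2018, 12, 25)}
OPENING_HOURS = {0: ("08:00", "20:00"), 5: ("09:00", "18:00")}  # Mon, Sat

def time_stream(start, end):
    stream = []
    day = start
    while day <= end:
        # Illustrative rule: closed on Sundays and on holidays.
        is_open = day.weekday() != 6 and day not in HOLIDAYS
        stream.append({"day": day, "open": is_open,
                       "hours": OPENING_HOURS.get(day.weekday()) if is_open else None})
        day += datetime.timedelta(days=1)
    return stream

week = time_stream(datetime.date(2018, 12, 24), datetime.date(2018, 12, 30))
```

Once persisted, a lookup like "was this store open on a given day" becomes a simple read instead of a repeated calendar evaluation, which is the performance point made above.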
Note
The location calendar, the set of operating times, and the assignment of location calendars to locations are treated like master data. This data needs to be adjusted from time to time. It is not possible to transport it; you have to set it up from scratch in every system. Setting up location calendars is an administrative job that uses some Customizing data, for example, the calculation rule.
If you change the settings for location calendars, operating times, or assignments to locations, the changes are only effective for newly created data. If a reevaluation is required, you need to reprocess the corresponding data to make the changes to a location calendar effective.
Prerequisites
For all activities around the location calendar you need the authorization DDF_ADMIN (Administration Demand Data Foundation).
Activities
Setting Up Location Calendars
1. In the Customizing activity Maintain Calendar, define a holiday and factory calendar. For more information, see the Customizing for Calendars under SAP NetWeaver General Settings Maintain Calendar .
2. In the Customizing activity Define Calculation Rules for Periods of Location Calendar, define a period calculation rule. For more information, see the Customizing for Data Model under Cross-Application
Components Demand Data Foundation Data Model Define Calculation Rules for Periods of Location Calendar .
3. Define a set of operating times (optional).
4. Define a location calendar and assign:
○ A factory calendar
○ A period calculation rule
○ A set of operating times (optional)
Assigning Locations to Location Calendars
1. Assign your calendar in the view Define Default Assignment of Location Calendar.
2. Load locations into the system, for example into InfoObject /DDF/LOCATION.
3. In the program Assignments of Locations to Calendars, verify the generated assignments.
For more detailed information, see the individual program documentation.
2.5.2 Delete Account Assignments
Should data privacy require the deletion of accounts that represent private persons, you can use this report to delete account assignments from InfoObjects.
2.5.3 Setting Up Semantic Partitioning
Use
If you receive very large amounts of POS data or retail panel data over time, you may not be able to upload all this data to a single DataStore object (DSO). You can use semantically partitioned objects (SPOs) to distribute the data over several smaller physical DSOs. SAP Demand Signal Management, version for SAP BW/4HANA supports the semantic partitioning of your data model by data delivery agreement for transactional data and by context for master data. If the upload of one semantic partition fails, the uploads of other partitions are not affected.
If you receive several master data files from different retailers at the same time, you may want to make this data available for reporting as fast as possible. In this case, you can also use SPOs to upload the data from the different contexts in parallel to the different physical DSOs of the SPO. You do this, for example, if the system cannot parallelize the data upload of a single data delivery because the master data files are too small.
Implementation Considerations
● Consider semantic partitioning of transactional data whenever you expect that the amount of data uploaded to a single standard DSO will exceed 500 million records over time. Do not create more semantic partitions than necessary because this could lead to a decrease in query performance.
● Consider semantic partitioning of master data when you frequently receive small portions of master data which need to be uploaded in parallel.
● We highly recommend using SPOs for semantic partitioning. If you create multiple copies of each standard DSO instead, you need to carry out additional activities to configure and operate your system. You must implement your own logic to direct the data of each data delivery to the right copy of the standard DSO. Depending on this logic, you adjust your implementation or data model when onboarding a new data provider. For master data, you must create your own implementation of the Business Add-In (BAdI) /DDF/MD_ACCESS. If you work with SPOs instead, you can use the default implementation of this BAdI.
Activities
Perform the following steps to be able to use semantic partitioning in SAP Demand Signal Management, version for SAP BW/4HANA:
1. Set up the data model by partition values for transactional data
2. Set up the data model by partition values for bill of material data
3. Set up the data model by partition values for master data
4. Make sure that the correct partitions are accessed by the system
More Information
Maintaining Partition Objects [page 47]
Setting Up the Data Model by Partition Values for Transactional Data [page 47]
Setting Up the Data Model by Partition Values for Master Data [page 48]
Accessing the Correct Partitions [page 49]
For more information about using semantic partitioning, see SAP Help Portal at http://help.sap.com under SAP NetWeaver SAP NetWeaver Platform Application Help Function-Oriented View English Business Warehouse Data Warehousing Modeling Enterprise Data Warehouse Layer Using Semantic Partitioning.
For more information about SAP HANA, see SAP Help Portal at http://help.sap.com under SAP In-Memory Computing SAP HANA SAP HANA Platform Development Information SAP HANA Developer Guide .
2.5.3.1 Maintaining Partition Objects
Within the Demand Data Foundation, you can maintain the settings in the Maintain Partition Objects view to automatically include the new partition as part of the calculation view.
This functionality uses a flexible framework to automatically handle the new partitions without creating a new set of procedures for Quality Validation, Enrichment and Reporting. To maintain this partitioning, proceed as follows:
1. Create a partition and include it in your DSO.
2. Specify the DSO as the Main Object in this Customizing.
3. Maintain the partitions as Partition Object under the main object.
4. Choose the attributes which should be included in the calculation view.
5. Select the checkbox for the Main Object and Partition Object that you want to include and choose the Execute button.
Check on the SAP HANA side: the calculation view is generated with the partitions and DSO attributes included in the Customizing.
For more information, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Data Model Maintain Partition Objects.
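Conceptually, the generated calculation view resembles a UNION ALL over the tables of all physical partitions of the main DSO. The following sketch illustrates that idea only; the table and attribute names are invented, and the real generation happens inside SAP HANA.

```python
# Illustrative sketch only: regenerating a calculation view so that it covers
# all partitions of the main DSO can be thought of as building a UNION ALL
# statement over the physical partition tables. Names are invented examples.

def generate_union_select(attributes, partition_tables):
    """Build a SQL-like UNION ALL statement over all partition tables."""
    column_list = ", ".join(attributes)
    selects = [f"SELECT {column_list} FROM {table}" for table in partition_tables]
    return "\nUNION ALL\n".join(selects)

# Adding a new partition table to the list is enough to include it in the
# regenerated view; no new set of procedures is needed.
sql = generate_union_select(
    ["PRODUCT_ID", "LOCATION_ID", "SALES_QTY"],
    ["/DDF/DS01_P01", "/DDF/DS01_P02"],
)
```

This is why maintaining the partition objects in Customizing is sufficient: the framework regenerates the view from the maintained list instead of requiring hand-written procedures per partition.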
2.5.3.2 Maintaining Partition Filter for BOM
Within the Demand Data Foundation, you can define the source partition names in the propagation layer for the target bill of material (BOM) split DataStore object (DSO). The standard DSOs for transactional data are used as templates.
In this Customizing activity, the following applies:
● The source partition refers to the DSO that contains the BOM-relevant sales or stock data.
● The target partition refers to the BOM split DSO, which stores data at a lower level of granularity.
For more information, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Data Model Maintain Partition Filter for Bill of Materials.
2.5.3.3 Setting Up the Data Model by Partition Values for Transactional Data
Use
The criterion for defining the partitions of a semantically partitioned object (SPO) must be both flexible and scalable because new data delivery agreements are created at different points in time and the data deliveries may have different sizes. Every time you create a new data delivery agreement, you decide which partition value you want to assign it to. To set up the data model, proceed as follows:
1. Create partition values in Customizing.
2. Assign the partition values to data delivery agreements during mass creation and synchronization.
3. Make the partition values available as characteristic values of InfoObject /DDF/DDA_PART in SAP NetWeaver Business Warehouse.
4. Use the InfoObject /DDF/DDA_PART as a criterion for defining partitions of SPOs based on the structure of the following standard DSOs:
○ /DDF/DS04 (Market Research Retail Panel Acquisition)
○ /DDF/DS14 (Market Research Retail Panel Propagation)
○ /DDF/DS01 (Aggregated Sales Acquisition)
○ /DDF/DS11 (Aggregated Sales Propagation)
○ /DDF/DS02 (Stock Snapshot Acquisition)
○ /DDF/DS12 (Stock Snapshot Propagation)
5. You can define the settings in Customizing to automatically include the new partition as part of the calculation view. For more information, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Data Model Maintain Partition Objects .
For more information, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Data Model Define Partition Values for Transactional Data .
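The effect of using /DDF/DDA_PART as the partition criterion can be illustrated with a short sketch: once each record carries its partition value, an incoming batch can be split so that each physical DSO of the SPO receives only its own records. The field names here are simplified assumptions, not the real DSO structures.

```python
# Hedged sketch: split an incoming batch of transactional records by the
# partition characteristic (standing in for InfoObject /DDF/DDA_PART), so
# each physical DSO of the SPO only receives its own slice of the data.
from collections import defaultdict

def split_by_partition(records):
    """Group records by their partition characteristic value."""
    batches = defaultdict(list)
    for record in records:
        batches[record["DDA_PART"]].append(record)
    return dict(batches)

# Example batch: two records for partition P01, one for P02.
batches = split_by_partition([
    {"DDA_PART": "P01", "SALES_QTY": 10},
    {"DDA_PART": "P02", "SALES_QTY": 7},
    {"DDA_PART": "P01", "SALES_QTY": 3},
])
```

Each resulting batch can then be loaded independently, which is what makes a failed upload for one partition harmless for the others.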
More Information
Setting Up Semantic Partitioning [page 45]
Setting Up the Data Model by Partition Values for Master Data [page 48]
Accessing the Correct Partitions [page 49]
2.5.3.4 Setting Up the Data Model by Partition Values for Master Data
Use
You assign the partition value for master data to a context to make sure that each master data record is available in only one partition of the semantically partitioned object (SPO).
Every time you create a new context, proceed as follows, before you upload master data for that context:
1. Create partition values in Customizing.
2. Assign the partition values to the context. If you synchronize contexts from different systems, make sure that the partition values are synchronized as well.
3. Make the partition value available as characteristic value of InfoObject /DDF/CTX_PART in SAP NetWeaver Business Warehouse by executing the corresponding data transfer process (DTP).
4. Use the InfoObject /DDF/CTX_PART as a criterion for defining partitions of SPOs based on the structure of the following standard DSOs:
○ /DDF/DS41 (Product Acquisition)
○ /DDF/DS51 (Product Propagation)
○ /DDF/DS43 (Product Name/Value Pairs Acquisition)
○ /DDF/DS53 (Product Name/Value Pairs Propagation)
○ /DDF/DS21 (Location Acquisition)
○ /DDF/DS31 (Location Propagation)
○ /DDF/DS22 (Location Name/Value Pairs Acquisition)
○ /DDF/DS32 (Location Name/Value Pairs Propagation)
○ /DDF/DS61 (Attribute Reference)
○ /DDF/DS05 (Time String Acquisition Market Research)
○ /DDF/DS15 (Time String Propagation Market Research)
○ /DDF/DS71 (Hierarchy Metadata)
○ /DDF/DS72 (Hierarchy Level Metadata)
○ /DDF/DS44 (Product Hierarchy Acquisition)
○ /DDF/DS54 (Product Hierarchy Propagation)
○ /DDF/DS24 (Location Hierarchy Market Research Acquisition)
○ /DDF/DS34 (Location Hierarchy Market Research Propagation)
Recommendation
Create as many partitions as you need to cover as many contexts as possible in the early implementation phase, in order to keep the changes to the SPOs and the necessary uploads to InfoObject /DDF/CTX_PART at a minimum.
For more information, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Data Model Define Partition Values for Master Data .
More Information
Setting Up Semantic Partitioning [page 45]
Setting Up the Data Model by Partition Values for Transactional Data [page 47]
Accessing the Correct Partitions [page 49]
2.5.3.5 Accessing the Correct Partitions
Use
When you use semantically partitioned objects (SPOs), you have to adjust the standard data flows so that the data upload feeds the correct SPOs. To do so, you copy the standard data transfer processes (DTPs). This affects all transformations from the Persistent Staging Area (PSA) to the acquisition DataStore objects (DSOs) and from there into the propagation DSOs.
When you define the partitions of an SPO, you can create DTPs for the upload to the different DSOs of the SPO, using a standard DTP as a template. By doing so, you reuse the standard transformations for the upload from the PSA to the data acquisition layer, and from the data acquisition layer to the data propagation layer. These transformations fill the partition value (/DDF/DDA_PART for transactional data or /DDF/CTX_PART for master data) as key field of the target DSO, according to the data delivery agreement or context that is known during the upload process. This allows the system to direct the data to the correct partition of the target SPO.
Example
If you do not assign partition values, the system uploads product master data to the product acquisition DSO /DDF/DS41.
If the current data flow for master data contains a DSO with SPOs as data target, you can assign a partition value to a specific context. The system then uploads the master data to the corresponding SPO for this partition value, and you avoid having duplicate entries with different partition values for the same master data in the standard acquisition DSO.
If the master data from different retailers does not have the same structure, you can use multiple SPOs for one acquisition DSO.
To make sure that the system reads the data from the correct SPO in the acquisition layer, you assign the SPOs that store data in the acquisition layer to the standard acquisition DSO in Customizing for Cross-Application Components under Demand Data Foundation Data Model Assign Semantically Partitioned Objects to DataStore Objects .
Note
If you want to implement your own logic for how the system determines the correct SPO, you can implement BAdI: Semantic Partitioning of Master Data (/DDF/MD_ACCESS) accordingly. For more information, see Customizing for Cross-Application Components under Demand Data Foundation Business Add-Ins (BAdIs) BAdI: Semantic Partitioning of Master Data.
Additionally, in the following steps of the upload processes, the standard DSOs are accessed by SAP HANA procedures:
● Quality validation
● Data enrichment
● Extraction to global reporting
● Data consolidation
For performance reasons, the system delegates this time-consuming part of the upload process to SAP HANA procedures. For each data delivery, the system has to tell SAP HANA which semantic partitions contain the relevant POS data or retail panel data.
For each of the upload steps, there are SAP HANA procedure templates that contain the physical DSO from which the data is read as a parameter. If you use semantic partitioning, proceed as follows to access the correct SPO:
● For each procedure template that is contained in the SAP HANA content package sap.ddf.dsimddf, create a procedure template instance for every physical DSO of your SPO. This SPO represents the layer from which the business logic of the procedure template has to read the data.
● Implement BAdI: Stored Procedure (/DDF/CALL_PROCEDURE) to determine the physical DSO based on the data delivery agreement and the assigned partition value or a superset of partition values. Based on this DSO, the name of the procedure template instance is determined.
You can assign the standard DSO to the SPO that stores the data in Customizing for Cross-Application Components under Demand Data Foundation Data Model Assign Semantically Partitioned Objects to DataStore Objects. After that, you can use the example implementation of the BAdI to determine the correct physical DSO.
For more details about the procedure templates used in SAP Demand Signal Management, version for SAP BW/4HANA and the implementation of the BAdI, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Business Add-Ins (BAdIs) BAdI: Stored Procedures .
If semantic partitioning is not needed, the system calls standard procedure template instances, where the parameter of the procedure template is replaced by the standard DSO.
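The lookup chain that such a BAdI implementation has to perform can be sketched as follows. The function and mapping names are assumptions for illustration, not the real BAdI signature: data delivery agreement to partition value, partition value to physical DSO, physical DSO to procedure template instance, with a fallback to the standard DSO when no partitioning applies.

```python
# Sketch of the determination logic: derive the physical DSO from the data
# delivery agreement and build the procedure template instance name from it.
# All names and the naming scheme are invented for the example.

PARTITION_OF_DDA = {"DDA_RETAILER_A": "P01", "DDA_PANEL_X": "P02"}
DSO_OF_PARTITION = {"P01": "/DDF/DS01_P01", "P02": "/DDF/DS01_P02"}

def procedure_instance(dda: str, step: str, default_dso: str = "/DDF/DS01") -> str:
    """Return the procedure template instance name for one upload step."""
    partition = PARTITION_OF_DDA.get(dda)
    # Fall back to the standard DSO when the delivery is not partitioned.
    dso = DSO_OF_PARTITION.get(partition, default_dso)
    return f"{step}::{dso}"
```

The fallback branch mirrors the statement above: without semantic partitioning, the standard procedure template instance with the standard DSO is called.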
More Information
Setting Up Semantic Partitioning [page 45]
Setting Up the Data Model by Partition Values for Transactional Data [page 47]
Setting Up the Data Model by Partition Values for Master Data [page 48]
2.5.4 Direct Update of Master Data
Concept
You can use this feature to make the changes that were made to your master data objects in data harmonization directly visible and available for reporting.
To make the changes to your products and locations available for reporting, you must transfer them to the corresponding InfoObjects.
Update of Master Data If Direct Update is Not Activated
If the direct update feature is not activated, this process runs as follows:
1. You save your changes in data harmonization.
2. A scheduled daemon from the process flow control (PFC) picks up the changes. The system uses a process definition from a data delivery agreement of the type Harmonization Data. The assigned process steps use process chains that extract the relevant changes and finally update the InfoObjects using DTPs.
Update of Master Data InfoObjects Without the Direct Update Feature
To see your changes in the InfoObjects, you must wait until the daemon has started and the process has completed successfully.
Update of Master Data If Direct Update Is Activated
If you activate the direct update of master data, the changes made in data harmonization are transferred directly to the InfoObjects without the help of the daemons from the process flow control (PFC). The update is executed by the corresponding APIs.
Direct Update Using APIs
To avoid conflicts with the data upload (for example, file -> acquisition DSO -> propagation DSO -> InfoObject), the system must not use DTPs but APIs for both processes. The APIs are encapsulated as an SAP BW process type, /DDF/ATTR (Update InfoObject), that replaces the DTPs used in the process without direct update in the corresponding process chains.
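As a rough illustration of the direct update idea, the following sketch applies change pointers straight to an in-memory stand-in for an InfoObject attribute table. Every structure here is invented and greatly simplified compared to the real APIs and change log.

```python
# Simplified model of direct update: changes saved in data harmonization
# write change pointers, and an update routine applies them directly to the
# InfoObject's attributes instead of routing them through acquisition and
# propagation DSOs via DTPs. All structures are illustrative.

infoobject_product = {}  # stand-in for an InfoObject attribute table

change_pointers = [
    {"id": "P100", "attrs": {"BRAND": "ACME"}, "status": "New"},
    {"id": "P200", "attrs": {"BRAND": "GLOBEX"}, "status": "New"},
]

def update_infoobject_from_change_log(pointers, target):
    """Apply unprocessed change pointers directly, then mark them Processed."""
    for pointer in pointers:
        if pointer["status"] != "New":
            continue
        target.setdefault(pointer["id"], {}).update(pointer["attrs"])
        pointer["status"] = "Processed"

update_infoobject_from_change_log(change_pointers, infoobject_product)
```

Because already-processed pointers are skipped, re-running the routine is harmless, which matches the report-based retry behavior described later in this section.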
Update of Master Data InfoObjects Using the Direct Update Feature
The following InfoObjects and attributes with texts are updated:
● Internal Source Product (/DDF/PRODUCT)
● Product (/DDF/GPID)
○ Brand (/DDF/BRAG_TXT)
○ Manufacturer (/DDF/MANG_TXT)
○ Category (/DDF/PCAG_TXT)
○ Subcategory (/DDF/PSCG_TXT)
● Global Product (/DDF/GMPID)
○ Global Brand (/DDF/BRAGM_TXT)
○ Global Manufacturer (/DDF/MANGM_TXT)
○ Global Category (/DDF/PCAGM_TXT)
○ Global Subcategory (/DDF/PSCGM_TXT)
● Internal Source Location (/DDF/LOCATION)
● Location (/DDF/GLID)
● Global Location (/DDF/GMLID)
Prerequisites
You have activated the direct update of master data. For more information, see Activating the Direct Update of Master Data [page 54].
More Information
Update InfoObjects from Change Log [page 70]
Migrate Master Data Attributes [page 71]
Examples of Process Definitions for Uploading Point of Sale Data [page 64]
Examples of Process Definitions for Uploading Market Research Data [page 66]
2.5.4.1 Activating the Direct Update of Master Data
Use
The activation process comprises the following mandatory steps:
1. Ensure that you have defined a data delivery agreement of the type Harmonization Data and that this data delivery agreement has been activated. For more information, see Define Data Delivery Agreement for Harmonization Data [page 56].
2. Ensure that you have run report /DDF/FDH_TRANSFER_CONV_EXIT, as described in SAP note 2044743. It is important that you enter the new conversion sequences DALOI_INPUT, DALOI, and GTINF_INPUT (DALOI_INPUT, DAL10, and GLNF1_INPUT) for the field mapping of the source product/source location in the mapping instruction sets for the object type FDH_PROD/FDH_LOC. For a detailed description, see SAP note 2036128. The required adaptation of the field mappings is described in step 9 of this list.
3. Ensure that the following BAdI implementations are active:
○ Check in Customizing under Demand Data Foundation Data Harmonization Business Add-Ins (BAdIs) BAdI: Logic During Save in Data Harmonization that the implementation /DDF/BW_MD_UPD has the Active in IMG Activity checkbox selected.
○ Check in Customizing under Demand Data Foundation Data Harmonization Business Add-Ins (BAdIs) BAdI: Additional Attribute Checks and Input Help that the BAdI implementations /DDF/BADI_FDH_BO_LOC_BW and /DDF/BADI_FDH_BO_PROD_BW have the Active in IMG Activity checkbox selected.
If you have other active implementations of these BAdIs, deactivate them in Customizing. Alternatively, you can copy the code of the above default implementations into your own implementations. Do not deactivate the code of the default implementations.
4. Assign mapping instructions to data origins. For more information, see Assign Mapping Instructions to Data Origins [page 57].
5. Create number range intervals. For more information, see Create Number Range Intervals [page 58].
6. Define InfoObject-specific settings. For more information, see Define InfoObject-Specific Settings [page 59].
7. Define the general settings for data harmonization in Customizing under Cross-Application Components Demand Data Foundation Data Harmonization General Settings. Enter the interval for the direct update ID that you have created before. If you do not have any specific requirements, you can use the values you defined for your data delivery agreement of the agreement type Harmonization Data.
8. Define settings for the update of master data. For more information, see Define General Settings for the Update of Master Data [page 61].
Note
In the case of a new installation of SAP Demand Signal Management, version for SAP BW/4HANA, you can skip steps 11 to 18 and start directly with the execution of the report Update InfoObjects from Change Log.
9. Adapt field mappings for specific SAP BW attributes. For more information, see Adapt Field Mappings for Specific Attributes [page 61].
10. Execute transaction Check Customizing for Data Model (/DDF/BW_CUST_CHECK) to ensure that your Customizing settings are consistent.
11. Execute the transaction /DDF/MD_MIGRATE_ATTR to upload those attributes to data harmonization that are only available in SAP BW. For more information, see Migrate Master Data Attributes [page 71].
12. Replace steps in process definitions. For more information, see Replace Steps in Process Definitions [page 63].
13. Deactivate steps of process definitions. For more information, see Deactivate Steps of Process Definition [page 64].
14. Stop the following standard jobs:
○ Create Harm. Data Delivery ID
○ Release Harm. Data Delivery ID
15. If you have data delivery agreements for master data upload with the Disable Harmonization checkbox selected, deselect the checkbox for those data delivery agreements. The direct update of master data is based on data harmonization and does not work if harmonization is disabled.
16. Wait until all processes in the process flow control (PFC) that were started before the process definitions were changed are finished. This way you can ensure that all DTPs for updating SAP BW master data are finished.
17. Execute the report /DDF/FDH_SET_CHG_LOG2PROCESSED. This report sets the status of the change pointers of all completed data deliveries to Processed.
18. Release those data deliveries of type Harmonization Data that still have the status New by running the report /DDF/FDH_RELEASE_DELIVERIES.
19. Schedule the report /DDF/BW_MD_UPD (Update InfoObjects from Change Log [page 70]) to switch on the direct update. Select the Save Message Log checkbox. As the update of the SAP BW InfoObjects for products and locations was stopped when the corresponding process chains were removed from the process definitions, this report processes all change pointers that have been created since then and carries out the pending updates. Only those changes that have not been transferred to SAP BW yet are processed because the report /DDF/FDH_SET_CHG_LOG2PROCESSED has set the change log entries to Processed. Therefore, you should expect a longer run time for the initial run of this report.
20. After the report Update InfoObjects from Change Log has finished, check the application log (object /DDF/FDH, subobject /DDF/FDH_BW_MD_UPD).
Activities After the Direct Update has Been Activated
1. Remove the step that contains the process chain /DDF/MD_EXTRACT from the process definition you use in the data delivery agreement of the type Harmonization Data. Once the direct update is active, the upload processes for that process definition are no longer triggered automatically. They have to be scheduled instead.
After removing this step, only one step with the process chain /DDF/MD_STAGE remains in the process definition. This process chain also updates the quantity conversion DSO, which cannot be updated directly.
2. Schedule the process chain /DDF/MD_EXTRACT for periodic execution. We recommend scheduling this process chain no more than once every hour for the harmonization data deliveries. The process chain /DDF/MD_EXTRACT creates process flow control instances of your data delivery agreement of the type Harmonization Data and triggers the process chain /DDF/MD_STAGE as an automatic follow-up step.
3. Schedule the report /DDF/BW_MD_UPD for periodic execution. This report processes all the change pointers that the direct update could not process successfully, for example because of locking. Note that this report only sets the status of a change pointer to Processed if the extraction of the master data record to the corresponding propagation DSO has been carried out successfully by the process chain /DDF/MD_STAGE.
4. Schedule the report /DDF/MD_BW_CHGLOG_DELETE for periodic execution. This report cleans up SAP BW change pointers. The corresponding change pointers from data harmonization have been cleaned up before.
5. Consider switching on the direct deletion of master data. This is done in Customizing for the data model under Cross-Application Components Demand Data Foundation Data Model General Settings. Only switch on the direct deletion if you do not want to archive harmonized master data that has been marked for deletion and if you have not used one of the following InfoObjects in your own InfoProviders:
○ /DDF/GPID
○ /DDF/GLID
○ /DDF/GMPID
○ /DDF/GMLID
Note
The runtime of report /DDF/BW_MD_UPD increases if you switch on direct deletion of master data.
More Information
Direct Update of Master Data [page 51]
Update InfoObjects from Change Log [page 70]
Examples of Process Definitions for Uploading Market Research Data [page 66]
Examples of Process Definitions for Uploading Point of Sale Data [page 64]
2.5.4.1.1 Define Data Delivery Agreement for Harmonization Data
If you have carried out a new installation of SAP Demand Signal Management, version for SAP BW/4HANA, set up a data delivery agreement of the type Harmonization Data as follows:
1. Create a new process definition for data harmonization in Customizing for Demand Data Foundation under Data Upload Define Processes and Steps. The new process definition must have two steps underneath, one referring to the process chain ID /DDF/MD_EXTRACT and the other one referring to process chain ID /DDF/MD_STAGE.
2. Create a new data origin for data harmonization in the area menu under Administrator Data Upload Data Delivery Agreements Define Data Origins.
3. Create a new data provider for data harmonization in the area menu under Administrator Data Upload Data Delivery Agreements Define Data Providers.
4. Create a new context for data harmonization in the area menu under Administrator Data Upload Data Delivery Agreements Define Contexts.
5. Create a new data delivery agreement for data harmonization in the area menu under Administrator Data Upload Data Delivery Agreements Define Data Delivery Agreements with the following specifics:
○ Agreement type: Harmonization Data
○ Data format: Tables
○ Data upload method: Manual Upload – Harmonization
○ Data origin, data provider, context, and process definition, as defined in the previous steps
6. Activate the data delivery agreement.
2.5.4.1.2 Assign Mapping Instructions to Data Origins
Use transaction /DDF/FDH_SETUP to assign mapping instructions for the default origin (initial value) of the following mapping instruction sets:
Object Type Mapping Instruction Set
FDH_LOC SAP_EXP_HARM_OBJECT_GLOBAL
FDH_LOC SAP_EXP_HARM_OBJECT_LOCAL
FDH_LOC SAP_EXP_SOURCE_OBJECT
FDH_LOC SAP_BW_LOCAL_NVP_AFTER_FDH
FDH_PROD SAP_EXP_HARM_OBJECT_LOCAL
FDH_PROD SAP_EXP_BRAND_LOCAL
FDH_PROD SAP_EXP_HARM_OBJECT_GLOBAL
FDH_PROD SAP_BW_LOCAL_NVP_AFTER_FDH
FDH_PROD SAP_EXP_MANUFCT_GLOBAL
FDH_PROD SAP_EXP_MANUFCT_LOCAL
FDH_PROD SAP_EXP_MATLGRP_LOCAL
FDH_PROD SAP_EXP_PRODCAT_GLOBAL
FDH_PROD SAP_EXP_PRODCAT_LOCAL
FDH_PROD SAP_EXP_PRODSCAT_GLOBAL
FDH_PROD SAP_EXP_PRODSCAT_LOCAL
FDH_PROD SAP_EXP_SOURCE_BRAND
FDH_PROD SAP_EXP_SOURCE_MANUFCT
FDH_PROD SAP_EXP_SOURCE_MATLGRP
FDH_PROD SAP_EXP_SOURCE_OBJECT
FDH_PROD SAP_EXP_SOURCE_PRODCAT
FDH_PROD SAP_EXP_SOURCE_PRODSCAT
If you have carried out a new installation of SAP Demand Signal Management, version for SAP BW/4HANA, assign mapping instructions for the default origin to the following mapping instruction sets:
Object Type Mapping Instruction Set
FDH_LOC SAP_BW_HARMONIZE_LOCAL_RECORD
FDH_LOC SAP_BW_IMP_SOURCE_OBJECT_GM
FDH_LOC SAP_BW_LOCAL_RECORD_AFTER_FDH
FDH_PROD SAP_BW_HARMONIZE_LOCAL_RECORD
FDH_PROD SAP_BW_IMP_SOURCE_OBJECT_GM
FDH_PROD SAP_BW_LOCAL_RECORD_AFTER_FDH
2.5.4.1.3 Create Number Range Intervals
Create number range intervals with the ID 01 for the following number range objects:
Number Range Object   Where?
/DDF/ATTR             Customizing under Cross-Application Components Demand Data Foundation Data Model Define Number Range for Attribute IDs of Length 10
/DDF/ATT18            Customizing under Cross-Application Components Demand Data Foundation Data Model Define Number Range for Attribute IDs of Length 18
/DDF/ATT9             Customizing under Cross-Application Components Demand Data Foundation Data Model Define Number Range for Attribute IDs of Length 9
/DDF/FDHDI            Customizing under Cross-Application Components Demand Data Foundation Data Harmonization Technical Settings Define Number Ranges for Direct Update ID
2.5.4.1.4 Define InfoObject-Specific Settings
Ensure that the following entries are available in the Customizing activity InfoObject-Specific Settings for Direct Update. For more information, see the Customizing for Data Model under Cross-Application Components Demand Data Foundation Data Model InfoObject-Specific Settings for Direct Update.
InfoProvider    Object Type  Mapping Instruction Set     Number Range Object  Number Range  Text InfoObject
/DDF/BRANDG     FDH_PROD     SAP_EXP_BRAND_LOCAL         /DDF/ATT18           01            /DDF/BRAG_TXT
/DDF/BRANDGM    FDH_PROD     SAP_EXP_BRAND_GLOBAL        /DDF/ATT18           01            /DDF/BRAGM_TXT
/DDF/BRANDL     FDH_PROD     SAP_EXP_SOURCE_BRAND        /DDF/ATT18           01            /DDF/BRAND_TXT
/DDF/GLID       FDH_LOC      SAP_EXP_HARM_OBJECT_LOCAL   -                    -             -
/DDF/GMLID      FDH_LOC      SAP_EXP_HARM_OBJECT_GLOBAL  -                    -             -
/DDF/GMPID      FDH_PROD     SAP_EXP_HARM_OBJECT_GLOBAL  -                    -             -
/DDF/GPID       FDH_PROD     SAP_EXP_HARM_OBJECT_LOCAL   -                    -             -
/DDF/LOCATION   FDH_LOC      SAP_EXP_SOURCE_OBJECT       -                    -             -
/DDF/MANUFACTG  FDH_PROD     SAP_EXP_MANUFCT_LOCAL       /DDF/ATTR            01            /DDF/MANG_TXT
/DDF/MANUFACTL  FDH_PROD     SAP_EXP_SOURCE_MANUFCT      /DDF/ATTR            01            /DDF/MANUF_TXT
/DDF/MANUFCTGM  FDH_PROD     SAP_EXP_MANUFCT_GLOBAL      /DDF/ATTR            01            /DDF/MANGM_TXT
/DDF/MATL_GRPG  FDH_PROD     SAP_EXP_MATLGRP_LOCAL       /DDF/ATT9            01            /DDF/MGRG_TXT
/DDF/MATL_GRPL  FDH_PROD     SAP_EXP_SOURCE_MATLGRP      /DDF/ATT9            01            /DDF/MATGR_TXT
/DDF/PRODCATG   FDH_PROD     SAP_EXP_PRODCAT_LOCAL       /DDF/ATT18           01            /DDF/PCAG_TXT
/DDF/PRODCATGM  FDH_PROD     SAP_EXP_PRODCAT_GLOBAL      /DDF/ATT18           01            /DDF/PCAGM_TXT
/DDF/PRODCATL   FDH_PROD     SAP_EXP_SOURCE_PRODCAT      /DDF/ATT18           01            /DDF/PCAT_TXT
/DDF/PRODSCATG  FDH_PROD     SAP_EXP_PRODSCAT_LOCAL      /DDF/ATT18           01            /DDF/PSCG_TXT
/DDF/PRODSCATL  FDH_PROD     SAP_EXP_SOURCE_PRODSCAT     /DDF/ATT18           01            /DDF/PSCAT_TXT
/DDF/PRODSCTGM  FDH_PROD     SAP_EXP_PRODSCAT_GLOBAL     /DDF/ATT18           01            /DDF/PSCGM_TXT
/DDF/PRODUCT    FDH_PROD     SAP_EXP_SOURCE_OBJECT       -                    -             -
Note
These entries for attribute InfoObjects are necessary if you use the standard content version of these InfoObjects that contains texts. Consequently, a direct update of these text tables is required.
If the texts are stored as keys of these InfoObjects, without updating text tables in addition, those entries are not required. As a minimum, the following settings are required:

InfoProvider   Object Type  Mapping Instruction Set     Number Range Object  Number Range  Text InfoObject
/DDF/GMLID     FDH_LOC      SAP_EXP_HARM_OBJECT_GLOBAL  -                    -             -
/DDF/GLID      FDH_LOC      SAP_EXP_HARM_OBJECT_LOCAL   -                    -             -
/DDF/LOCATION  FDH_LOC      SAP_EXP_SOURCE_OBJECT       -                    -             -
/DDF/GMPID     FDH_PROD     SAP_EXP_HARM_OBJECT_GLOBAL  -                    -             -
/DDF/GPID      FDH_PROD     SAP_EXP_HARM_OBJECT_LOCAL   -                    -             -
/DDF/PRODUCT   FDH_PROD     SAP_EXP_SOURCE_OBJECT       -                    -             -
2.5.4.1.5 Define General Settings for the Update of Master Data
Define the settings for the update of master data in Customizing under Cross-Application Components Demand Data Foundation Data Model General Settings. The settings you make here control the mass processing of master data change pointers.
Field | Recommendation
Maximum Package Size in Dialog | Set this to a number similar to the maximum number of records that result from a search in one of the data harmonization UIs and that are changed together. A typical number is 200.
Maximum Package Size in Background | Set this to a number similar to the package size of the DTPs that upload master data from the acquisition layer to the propagation layer when the direct update is not activated. This package size should be smaller than the default package size of other DTPs in SAP BW. A typical number is 5,000.
Maximum Block Size | Set a block size that is smaller than the Maximum Package Size in Background multiplied by the number of parallel processes. Different blocks of change pointers are processed sequentially, while the change pointers within a block are divided into packages that are processed in parallel.
Number of Parallel Processes | Set this to a number similar to the number of parallel processes configured for the execution of DTPs in the SAP BW batch manager. This number depends on the overall number of background processes available in your system.
Waiting Time in Seconds | Allow for a sufficient waiting time. This helps to minimize locking issues that can occur because of concurrent direct updates of the same master data record in SAP BW. Note that master data activation locks an InfoObject globally. A typical waiting time is 20 seconds.
Direct Deletion | Choose whether the system directly deletes SAP BW master data that is no longer used. Do not switch on the direct deletion of master data before activating the direct update feature.
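The interplay of block size, package size, and parallel processes can be sketched as follows. This is an illustrative Python sketch with hypothetical values and function names; the actual mass processing of change pointers runs inside SAP BW.

```python
# Illustrative values mirroring the recommendations above.
MAX_PACKAGE_SIZE_BACKGROUND = 5000
NUM_PARALLEL_PROCESSES = 4
# The block size must be smaller than package size times parallel processes.
MAX_BLOCK_SIZE = MAX_PACKAGE_SIZE_BACKGROUND * NUM_PARALLEL_PROCESSES - 1

def split(items, size):
    """Divide a list into chunks of at most `size` entries."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def process_change_pointers(change_pointers):
    # Blocks of change pointers are processed sequentially ...
    for block in split(change_pointers, MAX_BLOCK_SIZE):
        # ... while the packages within a block are processed in parallel,
        # using at most NUM_PARALLEL_PROCESSES work processes.
        yield split(block, MAX_PACKAGE_SIZE_BACKGROUND)

pointers = list(range(25000))
blocks = list(process_change_pointers(pointers))
```

Every change pointer lands in exactly one package, and no block exceeds the capacity of one parallel processing round.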
2.5.4.1.6 Adapt Field Mappings for Specific Attributes

If you use the direct update, all attributes of /DDF/PRODUCT and /DDF/LOCATION must be passed to data harmonization. Otherwise, the system initializes the attributes that are not available in data harmonization when updating the InfoObjects.
Therefore, you have to adapt the existing field mappings for all attributes that are available in SAP Business Warehouse but not in data harmonization. For more information, see the Customizing documentation under
Cross-Application Components Demand Data Foundation Data Harmonization Technical Settings Define Object Types .
The following tables list the attributes that must be adapted.
SAP Demand Signal Management, version for SAP BW/4HANAData Model C O N F I D E N T I A L 61
Field Mapping for Object Type FDH_PROD

Mapping Instruction Set SAP_BW_HARMONIZE_LOCAL_RECORD
Define Mapping Instruction SAP_BW_HARMONIZE_LOCAL_RECORD

Counter | Mapping Type     | Import Field     | Field Name         | Conversion Sequence | Export Field
104     | Standard Mapping | /1DD/S_HNUMLVL   | HNUMLVL            | -                   | /1DD/S_HNUMLVL
109     | Standard Mapping | /1DD/S_PHLVL     | PHLVL              | -                   | /1DD/S_PHLVL
111     | Standard Mapping | /1DD/S_TXT_LONG  | DESCRIPTION_LONG   | -                   | /1DD/S_TXT_LONG
112     | No Mapping       | /1DD/S_TXTLG1    | -                  | -                   | -
113     | No Mapping       | /1DD/S_TXTLG2    | -                  | -                   | -
114     | Standard Mapping | -                | DESCRIPTION_LONG   | GETCHAR_1_TO_60     | /1DD/S_TXTLG1
115     | Standard Mapping | -                | DESCRIPTION_LONG   | GETCHAR_61_TO_120   | /1DD/S_TXTLG2
116     | Standard Mapping | GROSS_WT         | GROSS_WT           | -                   | GROSS_WT
118     | Standard Mapping | TXTMD            | DESCRIPTION_MEDIUM | -                   | TXTMD
119     | Standard Mapping | TXTSH            | DESCRIPTION_SHORT  | -                   | TXTSH
125     | Standard Mapping | /1DD/S_PRODLEVEL | PRODLEVEL          | -                   | /1DD/S_PRODLEVEL
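The conversions GETCHAR_1_TO_60 and GETCHAR_61_TO_120 in the mappings above split the long description into two 60-character fields. Their effect can be sketched as follows; this is an illustrative Python sketch, while the actual conversions run inside data harmonization.

```python
def getchar(text, start, end):
    """Return characters `start` to `end` of `text` (1-based, inclusive),
    mimicking a GETCHAR_<start>_TO_<end> conversion."""
    return text[start - 1:end]

description_long = "A" * 70 + "B" * 30       # a 100-character long description
txtlg1 = getchar(description_long, 1, 60)    # GETCHAR_1_TO_60  -> /1DD/S_TXTLG1
txtlg2 = getchar(description_long, 61, 120)  # GETCHAR_61_TO_120 -> /1DD/S_TXTLG2
```

Together, the two target fields hold the full long description without overlap.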
Field Mapping for Object Type FDH_LOC

Mapping Instruction Set SAP_BW_HARMONIZE_LOCAL_RECORD
Define Mapping Instruction SAP_BW_HARMONIZE_LOCAL_RECORD

Counter | Mapping Type     | Import Field      | Field Name         | Conversion Sequence | Export Field
20      | Standard Mapping | /1DD/S_LHLVL      | LHLVL              | -                   | /1DD/S_LHLVL
21      | Standard Mapping | /1DD/S_HNUMLVL    | HNUMLVL            | -                   | /1DD/S_HNUMLVL
105     | Standard Mapping | /1DD/S_DLVLOC     | DLVLOC_SOURCE      | -                   | /1DD/S_DLVLOC
111     | Standard Mapping | /1DD/S_LOC_GUID   | LOC_GUID           | -                   | /1DD/S_LOC_GUID
112     | Standard Mapping | /1DD/S_MRCOUNTRY  | MRCOUNTRY          | -                   | /1DD/S_MRCOUNTRY
113     | Standard Mapping | /1DD/S_SALESAREA  | SALESAREA          | -                   | /1DD/S_SALESAREA
114     | Standard Mapping | /1DD/S_UN_SAREA   | UN_SAREA           | -                   | /1DD/S_UN_SAREA
115     | Standard Mapping | /1DD/S_TXT_LONG   | DESCRIPTION_LONG   | -                   | /1DD/S_TXT_LONG
116     | No Mapping       | /1DD/S_TXTLG1     | -                  | -                   | -
117     | No Mapping       | /1DD/S_TXTLG2     | -                  | -                   | -
118     | Standard Mapping | -                 | DESCRIPTION_LONG   | GETCHAR_1_TO_60     | /1DD/S_TXTLG1
119     | Standard Mapping | -                 | DESCRIPTION_LONG   | GETCHAR_61_TO_120   | /1DD/S_TXTLG2
120     | Standard Mapping | TXTMD             | DESCRIPTION_MEDIUM | -                   | TXTMD
121     | Standard Mapping | TXTSH             | DESCRIPTION_SHORT  | -                   | TXTSH
Note: The counter can differ depending on your Customizing settings.
2.5.4.1.7 Replace Steps in Process Definitions
Use
If you use the standard process chains in SAP Demand Signal Management, version for SAP BW/4HANA, replace all steps in your process definitions that point to the following process chains with new process chains:
Old Process Chain New Process Chain
/DDF/PROD_STAGE /DDF/PROD_STAGE_2
/DDF/LOC_STAGE /DDF/LOC_STAGE_2
/DDF/GMPID_STAGE /DDF/GMPID_STAGE_2
/DDF/GMLID_STAGE /DDF/GMLID_STAGE_2
Alternatively, you can create new process definitions. However, if you decide to do so, you have to change the process definition in all your data delivery agreements.
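The replacement in the table above amounts to a simple lookup from old to new process chain names. The following is an illustrative Python sketch; in the system, you change the steps in the process definition maintenance, not in code.

```python
# Old-to-new process chain mapping, taken from the table above.
CHAIN_REPLACEMENTS = {
    "/DDF/PROD_STAGE": "/DDF/PROD_STAGE_2",
    "/DDF/LOC_STAGE": "/DDF/LOC_STAGE_2",
    "/DDF/GMPID_STAGE": "/DDF/GMPID_STAGE_2",
    "/DDF/GMLID_STAGE": "/DDF/GMLID_STAGE_2",
}

def migrate_steps(step_chains):
    """Replace outdated process chains in a process definition;
    chains without a replacement are kept unchanged."""
    return [CHAIN_REPLACEMENTS.get(chain, chain) for chain in step_chains]

steps = ["/DDF/LOC_STAGE", "/DDF/SALES_STAGE", "/DDF/PROD_STAGE"]
migrated = migrate_steps(steps)
```

Only the four staging chains are swapped; all other steps of the process definition stay as they are.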
More Information
Examples of Process Definitions for Uploading Point of Sale Data [page 64]
Examples of Process Definitions for Uploading Market Research Data for Global Reporting [page 66]
2.5.4.1.8 Deactivate Steps of Process Definition
Deactivate all steps of the process definition you use in the data delivery agreements that point to the following process chains:
● /DDF/GLID_EXTRACT
● /DDF/GPID_EXTRACT
● /DDF/LOC_EXTRACT
● /DDF/PROD_EXTRACT
● /DDF/GMPID_EXTRACT
● /DDF/GMLID_EXTRACT
● /DDF/GMPROD_EXTRACT
● /DDF/GMLOC_EXTRACT
These steps are no longer needed once the direct update is active.
Master data propagation DSOs cannot be updated directly, but they are still required by certain applications. Therefore, you have to add the following steps to this process definition in the data delivery agreements of the type Harmonization Data:
● /DDF/MD_EXTRACT
● /DDF/MD_STAGE
Note: The exact way of changing the process definition of your data delivery agreement of type Harmonization Data depends on whether or not you have used semantic partitioning for your master data. If you use semantically partitioned objects (SPOs) instead of the DSOs /DDF/DS31 and /DDF/DS51, you should create multiple copies of the process chain /DDF/MD_STAGE, depending on the number of semantic partitions, and multiple steps in the corresponding process definition.
To enable the system to choose the correct step at runtime, you must implement the BAdI /DDF/BADI_ADU_STEP.
Do not deactivate the data delivery agreement of type Harmonization Data.
2.5.4.1.9 Examples of Process Definitions for Uploading Point of Sale Data
The following examples show process definitions that define the required steps for uploading point of sale data. These process definitions are defined in Customizing. For more information, see Customizing for Data Upload
under Cross-Application Components Demand Data Foundation Data Upload Define Processes and Steps .
The first table shows an example of a process definition when the direct update of master data is not active. The second table shows an example after the direct update has been activated.
Process Definition When Direct Update Is Not Active

Step | Description                 | Parallelize Group | Process Chain ID          | Upload Stage
101  | Location Acquisition        | MD1A              | /DDF/LOC_FILE             | Data Acquisition
102  | Product Acquisition         | MD1A              | /DDF/PROD_FILE            |
103  | Sales Acquisition           | MD1A              | /DDF/SALES_FILE           |
104  | Stock Acquisition           | MD1A              | /DDF/STOCK_FILE           |
150  | Quality Validation          | n/a               | /DDF/QV_PERFORM           | Quality Validation
161  | Upload Additional Locations | MD1X              | /DDF/LOC_ADD              | Master data creation during data acquisition. These steps are only required if new products or new locations, for which no master data has been created yet, were included in the delivery.
162  | Upload Additional Products  | MD1X              | /DDF/PROD_ADD             |
201  | Location Staging            | MD2A              | /DDF/LOC_STAGE            | Data Propagation
202  | Product Staging             | MD2A              | /DDF/PROD_STAGE           |
203  | Sales Staging               | TD2               | /DDF/SALES_STAGE          |
204  | Stock Staging               | TD2               | /DDF/STOCK_STAGE          |
205  | Sales Enrichment            | TD3               | /DDF/SALES_ENRICHMT_STAGE | Data Enrichment
206  | Stock Enrichment            | TD3               | /DDF/STOCK_ENRICHMT_STAGE |
Process Definition When Direct Update Is Active

Step | Description                 | Parallelize Group | Process Chain ID          | Upload Stage
101  | Location Acquisition        | MD1A              | /DDF/LOC_FILE             | Data Acquisition
102  | Product Acquisition         | MD1A              | /DDF/PROD_FILE            |
103  | Sales Acquisition           | MD1A              | /DDF/SALES_FILE           |
104  | Stock Acquisition           | MD1A              | /DDF/STOCK_FILE           |
150  | Quality Validation          | n/a               | /DDF/QV_PERFORM           | Quality Validation
161  | Upload Additional Locations | MD1X              | /DDF/LOC_ADD              | Master data creation during data acquisition. These steps are only required if new products or new locations, for which no master data has been created yet, were included in the delivery.
162  | Upload Additional Products  | MD1X              | /DDF/PROD_ADD             |
201  | Location Staging            | MD2A              | /DDF/LOC_STAGE_2          | Data Propagation
202  | Product Staging             | MD2A              | /DDF/PROD_STAGE_2         |
203  | Sales Staging               | TD2               | /DDF/SALES_STAGE          |
204  | Stock Staging               | TD2               | /DDF/STOCK_STAGE          |
205  | Sales Enrichment            | TD3               | /DDF/SALES_ENRICHMT_STAGE | Data Enrichment
206  | Stock Enrichment            | TD3               | /DDF/STOCK_ENRICHMT_STAGE |
2.5.4.1.10 Examples of Process Definitions for Uploading Market Research Data for Global Reporting
The following examples show process definitions that define the required steps for uploading market research data for global reporting. These process definitions are defined in Customizing. For more information, see Customizing for Data Upload under Cross-Application Components Demand Data Foundation Data Upload Define Processes and Steps .
The first table shows an example of a process definition when the direct update of master data is not active. The second table shows an example after the direct update has been activated.
Process Definition When Direct Update Is Not Active

Step | Description                           | Parallelize Group | Process Chain ID           | Upload Stage
101  | Hierarchy Metadata Extraction         |                   | /DDF/HIER_MR_EXTRACT       | Data Acquisition
102  | Hierarchy Level Metadata Extraction   |                   | /DDF/HIER_LVL_MR_EXTRACT   |
103  | Location Hierarchy Extraction         | HD1               | /DDF/LOC_HIER_MR_EXTRACT   |
104  | Product Hierarchy Extraction          | HD1               | /DDF/PROD_HIER_MR_EXTRACT  |
111  | Location Extraction                   | MD12              | /DDF/LOC_MR_EXTRACT        |
112  | Product Extraction                    | MD1               | /DDF/PROD_MR_EXTRACT       |
113  | Location Name Value Extraction        | MD2               | /DDF/LOC_NV_MR_EXTRACT     |
114  | Product Name Value Extraction         | MD2               | /DDF/PROD_NV_MR_EXTRACT    |
120  | Time String Extraction                |                   | /DDF/MRTIME_FILE           |
130  | Retail Panel Extraction               |                   | /DDF/MRTPN_FILE            |
198  | Location Staging                      | MD3               | /DDF/LOC_STAGE             | Data Propagation
199  | Product Staging                       | MD3               | /DDF/PROD_STAGE            |
201  | Location Hierarchy Staging            | HD2               | /DDF/LOC_HIER_STAGE        |
202  | Product Hierarchy Staging             | HD2               | /DDF/PROD_HIER_STAGE       |
213  | Location Name Value Staging           | MD4               | /DDF/LOC_NV_STAGE          |
214  | Product Name Value Staging            | MD4               | /DDF/PROD_NV_STAGE         |
220  | Time String Staging                   |                   | /DDF/MRTIME_STAGE          |
230  | Retail Panel Staging                  |                   | /DDF/MRTPN_STAGE           |
350  | Quality Validation Before Global Layer |                  | /DDF/GMQV_PERFORM          | Quality Validation
401  | Upload Locations for Global Reporting | MD7               | /DDF/GMLID_STAGE           | Data Propagation
402  | Upload Products for Global Reporting  | MD7               | /DDF/GMPID_STAGE           |
430  | Global Retail Panel Staging           |                   | /DDF/GMRTPN_STAGE          |
Process Definition When Direct Update Is Active

Step | Description                           | Parallelize Group | Process Chain ID           | Upload Stage
101  | Hierarchy Metadata Extraction         |                   | /DDF/HIER_MR_EXTRACT       | Data Acquisition
102  | Hierarchy Level Metadata Extraction   |                   | /DDF/HIER_LVL_MR_EXTRACT   |
103  | Location Hierarchy Extraction         | HD1               | /DDF/LOC_HIER_MR_EXTRACT   |
104  | Product Hierarchy Extraction          | HD1               | /DDF/PROD_HIER_MR_EXTRACT  |
111  | Location Extraction                   | MD12              | /DDF/LOC_MR_EXTRACT        |
112  | Product Extraction                    | MD1               | /DDF/PROD_MR_EXTRACT       |
113  | Location Name Value Extraction        | MD2               | /DDF/LOC_NV_MR_EXTRACT     |
114  | Product Name Value Extraction         | MD2               | /DDF/PROD_NV_MR_EXTRACT    |
120  | Time String Extraction                |                   | /DDF/MRTIME_FILE           |
130  | Retail Panel Extraction               |                   | /DDF/MRTPN_FILE            |
198  | Location Staging                      | MD3               | /DDF/LOC_STAGE_2           | Data Propagation
199  | Product Staging                       | MD3               | /DDF/PROD_STAGE_2          |
201  | Location Hierarchy Staging            | HD2               | /DDF/LOC_HIER_STAGE        |
202  | Product Hierarchy Staging             | HD2               | /DDF/PROD_HIER_STAGE       |
220  | Time String Staging                   |                   | /DDF/MRTIME_STAGE          |
230  | Retail Panel Staging                  |                   | /DDF/MRTPN_STAGE           |
350  | Quality Validation Before Global Layer |                  | /DDF/GMQV_PERFORM          | Quality Validation
401  | Upload Locations for Global Reporting | MD7               | /DDF/GMLID_STAGE_2         | Data Propagation
402  | Upload Products for Global Reporting  | MD7               | /DDF/GMPID_STAGE_2         |
430  | Global Retail Panel Staging           |                   | /DDF/GMRTPN_STAGE          |
2.5.4.1.11 Examples of Process Definitions for Uploading Master Data from Harmonization
The following examples show process definitions that define the required steps for uploading master data from data harmonization. These process definitions are defined in Customizing. For more information, see Customizing for Data Upload under Cross-Application Components Demand Data Foundation Data Upload Define Processes and Steps .
The first table shows an example of a process definition when the direct update of master data is not active. The second table shows an example after the direct update has been activated.
Process Definition When Direct Update Is Not Active

Step | Description                                     | Parallelize Group | Process Chain ID    | Upload Stage
101  | Extract Harmonized Location                     | MD1               | /DDF/GLID_EXTRACT   |
102  | Extract Harmonized Product                      | MD1               | /DDF/GPID_EXTRACT   |
103  | Extract Harmonized Location to Global Reporting | MD1               | /DDF/GMLID_EXTRACT  |
104  | Extract Harmonized Product to Global Reporting  | MD1               | /DDF/GMPID_EXTRACT  |
201  | Extract Source Location after Harmonization     | MD2               | /DDF/LOC_EXTRACT    |
202  | Extract Source Product after Harmonization      | MD2               | /DDF/PROD_EXTRACT   |
203  | Extract Global Location Assignment              | MD2               | /DDF/GMLOC_EXTRACT  |
204  | Extract Global Product Assignment               | MD2               | /DDF/GMPROD_EXTRACT |
Process Definition When Direct Update Is Active

Step | Description                    | Parallelize Group | Process Chain ID | Upload Stage
101  | Stage Data After Harmonization |                   | /DDF/MD_STAGE    |
2.5.4.2 Update InfoObjects from Change Log
Use
This program is used to switch on the direct update. It determines all change pointers for objects that were changed in data harmonization and that could not be transferred to corresponding InfoObjects, for example due to a locking problem.
Scheduling this program regularly ensures that all master data updates reach the corresponding InfoObjects and are available for reporting.
Prerequisites
You have made your settings for the update of master data in Customizing under Demand Data Foundation Data Model General Settings . The parallelization settings from Customizing are used as defaults for this transaction. However, you can overwrite them for a specific program run, if required.
Features
Object Selection
The standard selection type is Selection by Unprocessed Change Pointers. You can also select change pointers for products or for locations only, or you can narrow the selection by date or time.
Message Log
You can choose whether you want to generate a message log.
Display of Check Results
You can choose how the check results are displayed and whether the system displays them at all.
Result
Once you have executed this program and the system has activated this new method for updating the affected InfoObjects, the previous update method is not available anymore.
2.5.4.3 Migrate Master Data Attributes
Use
You use transaction /DDF/MD_MIGRATE_ATTR to migrate product and location attributes from SAP Business Warehouse (SAP BW) to data harmonization before activating the direct update of master data. The program transfers attribute values for source products or source locations from the SAP BW InfoObjects /DDF/PRODUCT and /DDF/LOCATION to the corresponding objects in data harmonization. This is necessary because, for the direct update of master data, all attributes of /DDF/PRODUCT and /DDF/LOCATION must be passed to data harmonization. Otherwise, the system initializes the attributes that are not available in data harmonization when updating the InfoObjects.
Prerequisites
You must ensure that, for the object type FDH_PROD, the product attributes listed below are mapped as import and export fields with mapping type Standard Mapping to the corresponding field name of data harmonization in the mapping instruction set SAP_BW_HARMONIZE_LOCAL_RECORD. The same applies to the location attributes listed below for the object type FDH_LOC in mapping instruction set SAP_BW_HARMONIZE_LOCAL_RECORD.
Features
For testing purposes, you can restrict the migration to a specific data origin or to specific object IDs of source products or source locations from data harmonization.
The following product attributes are migrated:
● /DDF/HNUMLVL Numerical Hierarchy Level
● /DDF/PHLVL Source Hierarchy Level
● /DDF/PRODLEVEL Source Product Level
● /DDF/GROSS_WT Gross Weight
● /DDF/DESCRIPTION_MEDIUM Medium Description of Object
● /DDF/DESCRIPTION_SHORT Short Description of Object
The following location attributes are migrated:
● /DDF/HNUMLVL Numerical Hierarchy Level
● /DDF/LHLVL Source Hierarchy Level
● /DDF/LOC_GUID Source Location Reference GUID
● /DDF/SALESAREA Selling Area (Floor Space)
● /DDF/UN_SAREA Selling Area (Floor Space) Unit
● /DDF/DLVLOC Internal Source Delivery Location
● /DDF/MRCOUNTRY Delivery Region
● /DDF/DESCRIPTION_MEDIUM Medium Description of Object
● /DDF/DESCRIPTION_SHORT Short Description of Object
All master data migration processes can be run in parallel mode. You can overwrite the default settings, if required.
2.5.5 Check the Customizing for Data Model
You use transaction /DDF/BW_CUST_CHECK to check whether the Customizing settings for the data model are correct, for example, the parameters for direct update or the field mappings for products or locations in data harmonization. When you start the transaction, several selection options are available:
● All Checks: All checks are performed.
● General Settings: The system checks the Customizing of the general settings for the data model.
● Product: The system performs only product-relevant checks.
● Location: The system performs only location-relevant checks.
2.6 Configuring and Administrating Global Market Share Analysis
As a data provider for Global Market Share Analysis, you have the following tasks:
● Define Market Groups [page 72]
● Include Images in Global Market Share Analysis User Interfaces [page 73]
● Define Path for Image Files [page 74] or Mass Definition of Image Paths [page 75]
2.6.1 Define Market Groups
Use
You use this transaction to define for each global brand the combinations of countries and global categories that you would like to report on.
These combinations are then stored as market groups in the InfoObject /DDF/MRMGRP. The assigned countries are stored in the InfoObject /DDF/MRMGRPM and the assigned global categories are stored in /DDF/MRMGRPC.
All defined market groups are available to be selected when you start the application Global Market Share Analysis. The transaction Define Market Groups is available in the SAP Easy Access Menu under Cross-Application Components Demand Signal Management Analytics and Reporting Global Market Share Analysis .
You can define several market groups for one global brand to reflect different responsibilities for global categories and countries. To determine the corresponding countries and global categories, you can either use a publishing group or explicitly assign individual countries and global categories to a market group. You can also combine both options.
You can change, display, or delete market groups. To add or remove a country or a global category, you just have to list all countries and global categories that you would like to include in the selection options.
Note: The number of combinations of countries and global categories is limited due to restrictions in the length of a URL. You can include up to 8 global categories and 25 countries. If you reduce the number of global categories, you can extend the list of countries.
Note: If you change a publishing group, for example, by replacing one country with another one, you must adjust the market group that corresponds to the publishing group.
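The documented upper bounds on a market group selection can be sketched as a simple validation. This is an illustrative Python sketch; the exact trade-off between fewer categories and more countries depends on the generated URL length, so only the stated limits are checked here.

```python
# Documented upper bounds for a market group selection.
MAX_GLOBAL_CATEGORIES = 8
MAX_COUNTRIES = 25

def selection_fits(num_categories, num_countries):
    """Check whether a market group selection stays within the
    documented limits. Reducing categories may allow more countries
    in practice; this sketch enforces only the stated maximums."""
    return (num_categories <= MAX_GLOBAL_CATEGORIES
            and num_countries <= MAX_COUNTRIES)
```

For example, 8 global categories combined with 25 countries is within the limits, while 9 categories is not.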
More Information
Manage Publishing Groups [page 297]
2.6.2 Include Images in Global Market Share Analysis User Interfaces
Use
For the following object types you can include images in the user interfaces of Global Market Share Analysis:
● Country● Global Brand● Global Category
Note: Only the MIME types image/png and image/jpeg are supported; you can therefore use only PNG and JPG files.
Process
1. You upload the corresponding image files into the MIME repository of your system.
2. You define the path for each image file to the corresponding object key of the InfoObject using the transactions Define Path for Image Files (/DDF/MIME) or Mass Definition of Image Paths (/DDF/MIMEDEF).
Note: Be aware of copyright restrictions. You are responsible for the images you use.
More Information
Define Path for Image Files [page 74]
Mass Definition of Image Paths [page 75]
2.6.2.1 Define Path for Image Files
Use
You use the transaction Define Path for Image Files (/DDF/MIME) to assign image files stored in the MIME repository to the corresponding object keys to make the images visible in the application Global Market Share Analysis. The following object types are supported:
● Country● Global brand● Global category
For each object type you must specify the InfoObject that provides the key values that you would like to link to an image path. In the SAP standard the following InfoObjects are used:
Object Type InfoObject
Country 0COUNTRY
Global Brand /DDF/BRAGM_TXT
Global Category /DDF/PCAGM_TXT
Select one object type and define, for the relevant object keys, the path to the image file.
Note: Define the paths before you start importing image files into the MIME repository. If the file does not exist yet, the system provides a warning, but you can save the path definition anyway.
Alternatively, you can use transaction /DDF/MIMEDEF for a convenient mass definition of image paths.
More Information
Mass Definition of Image Paths [page 75]
2.6.2.2 Mass Definition of Image Paths
Use
You can use transaction /DDF/MIMEDEF to define the image paths for multiple object keys of a given object type at once.
You have the option to scan a specified folder of the MIME repository for existing image files. The system searches for images with file names that are equal to the selected object keys. If a file name (excluding the file extension) matches, the corresponding assignment of the object key to the path is saved.
Alternatively, you first create the assignments of object keys to image paths. The system then creates the path as a concatenation of the specified path, the object key, and the file extension depending on the chosen image type. Afterwards, you can upload the image files into the MIME repository with the corresponding file name.
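Both behaviors, matching existing repository files by name and generating paths by concatenation, can be sketched as follows. This is an illustrative Python sketch; the folder and file names are hypothetical, and the actual assignment is stored by the transaction itself.

```python
from pathlib import PurePosixPath

def match_images(object_keys, repository_files):
    """Assign each object key the repository file whose name
    (without the file extension) equals the key."""
    by_stem = {PurePosixPath(f).stem: f for f in repository_files}
    return {key: by_stem[key] for key in object_keys if key in by_stem}

def build_paths(object_keys, folder, extension):
    """Generate the expected path for each key as a concatenation of
    the specified folder, the object key, and the file extension."""
    return {key: f"{folder}/{key}.{extension}" for key in object_keys}

# Hypothetical repository content and object keys (country codes).
matched = match_images(["DE", "FR"],
                       ["/SAP/PUBLIC/img/DE.png", "/SAP/PUBLIC/img/US.png"])
generated = build_paths(["DE", "FR"], "/SAP/PUBLIC/img", "png")
```

In the first variant only keys with an existing file are assigned; in the second variant the paths are created first and the matching files are uploaded afterwards.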
More Information
Define Path for Image Files [page 74]
3 Data Upload
Use
You can upload company-internal data from systems such as an ERP system or SAP Business Warehouse (SAP BW) to SAP Demand Signal Management, version for SAP BW/4HANA.
External data may come in files or may be sent directly to the Persistent Staging Area (PSA) tables. If the data is provided using files, you define the relevant file sets and data sets in Customizing; for the direct upload to PSA tables, you define only data sets.
In the standard system the data upload process runs automatically using background jobs that are scheduled on a regular basis using the load manager. The process flow control triggers subsequent steps based on the status of the jobs.
You can also upload external data manually.
Prerequisites
● You have identified the relevant data for upload. For example, you collect point-of-sale (POS) data from retailers and market research data from market research companies.
● The data is either already in the defined format for SAP Demand Signal Management, version for SAP BW/4HANA, or it has been transformed to fit the structure of the inbound interface. For example, you use an extraction, transformation, and loading (ETL) tool, such as SAP Data Services, that can transfer the data directly into the system. The data can be sent by the data provider directly to the PSA or to the inbound folder in the required format (tables, self-descriptive exchange format, flexible or static comma-separated values (CSV) formatted files).
● You have set up the data upload in Customizing for Cross-Application Components under Demand Data Foundation Data Upload .
● You have configured the data upload.
3.1 General Settings for Data Upload
3.1.1 Basic Settings for Uploading Data in Files
Procedure
Note: For more information about steps 1 to 3, including optional customizable parameters, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings General Settings .
1. Create a logical file path.
2. Create an RFC destination.
3. Create a folder structure for file deliveries, processing, and archiving.
4. Define number ranges in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings for the following objects:
○ Processes
○ Steps
○ Files
○ Data sets
○ Data delivery agreements
○ Planned data deliveries
5. Define reasons for dropping planned data deliveries.
For more information, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings Define Allowed Delivery Deviation .
6. Define the threshold values for the permitted deviation from planned delivery dates.
7. Define process definitions and process steps.
For more information, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings Define Processes and Steps .
8. Run the report Check Consistency of Configuration for Data Upload (transaction /DDF/CHECK_CUST) to check your settings for the following:
○ Processes and steps
○ Data delivery agreements
○ File sets and files
○ Folder access
More Information
Automatic Data Upload with Folder Scanner [page 116]
3.1.1.1 New Data Upload Function

New extraction framework for SAP Demand Signal Management, version for SAP BW/4HANA.
You can use the SAP BW/4HANA system itself as a source system, that is, for transferring and exchanging data within the same system. This solution offers a new data extraction framework that is based on CDS view DataSources. The previously used extraction framework, that is, the classical DataSources with PSA, is not recommended for use with this solution.

Why CDS view based DataSources: CDS views simplify the creation of DataSources. With simple annotations, a DataSource is readily available for extraction. Generic extractors for data upload that were previously based on function modules are now created using CDS views.
Process Type and Intermediate Table:

Process Type: The process type determines which tasks a process has and which properties it has in maintenance. Process types are defined in the table RSPROCESSTYPES.

Using process types in process chains:
The existing DataSources are based on function modules, and it is mandatory to run their code to decode the files that are placed on the application server.

Because a CDS view DataSource is used, the data must already be available in the system. The process type therefore runs in a process chain in which the file extraction and decoding logic is executed. The process type reads the data and stores it in an intermediate table that acts like a PSA in the classical extraction scenario. The CDS view reads the data from the intermediate table and transfers it to the ADSO. After the transfer, no data remains in the intermediate table.
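The described flow, decoding the file, buffering the records in the intermediate table, transferring them to the ADSO, and then clearing the buffer, can be sketched as follows. This is an illustrative Python sketch; all names and the CSV format are hypothetical stand-ins for the SAP BW objects involved.

```python
# Sketch of the staging flow through the intermediate table.
intermediate_table = []   # plays the role of e.g. /DDF/D_DS21
adso = []                 # plays the role of e.g. /DDF/ADS21

def decode_file(raw_lines):
    """File extraction/decoding logic executed by the process type
    (here: a trivial CSV split as a stand-in)."""
    return [line.strip().split(",") for line in raw_lines]

def run_process_type(raw_lines):
    # 1. The process type decodes the file and fills the intermediate table.
    intermediate_table.extend(decode_file(raw_lines))
    # 2. The CDS view DataSource reads the buffered records
    #    and transfers them to the ADSO.
    adso.extend(intermediate_table)
    # 3. No data remains in the intermediate table after the transfer.
    intermediate_table.clear()

run_process_type(["LOC1,DE\n", "LOC2,FR\n"])
```

After the run, the ADSO holds the decoded records and the intermediate table is empty, mirroring the PSA-like but non-persistent role of the intermediate table.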
To create a process type, proceed as follows:
1. Create an intermediate table, for example, /DDF/D_DS21. The fields and technical properties of this intermediate table are based on the ADSO /DDF/ADS21.
2. Create a CDS view, for example, for POS location attributes. Use the following annotations to mark the view for data extraction.
Sample Code

@Analytics.dataCategory: #DIMENSION  " master data DataSource
@Analytics.dataCategory: #FACT       " transaction data DataSource
@Analytics.dataExtraction.enabled: true

Sample CDS view code for /DDF/D_DS21:

define view /DDF/D_DS21_CDS as select from /ddf/d_ds21
{
  key processid,
  key location_ext,
  key location_type,
  key location_qual,
  ddagr,
  delivery,
  txtlg,
  gln,
  country,
  region,
  county_code,
  street,
  city,
  postal_code,
  dlvlloc_ext,
  dlvlloc_type,
  @DefaultAggregation: #SUM
  sales_area,
  sales_area_unit,
  discha,
  division,
  salorg,
  shopformat
}
3. Create a CDS view based DataSource for the view. To do so, in transaction RSDS, enter the DataSource name and the source system ABAP_CDS and choose Create.
4. On the Extraction tab, select the CDS view that you created for the DataSource and activate the DataSource.
Prerequisites for Data Upload Configuration:

For the new extraction framework to work, you must maintain the following configuration in the maintenance views /DDF/V_UPL_CNF (Data Upload Configuration) and /DDF/V_DTSRC_SEL (Generic Data Transfer DataSource Selections); these entries are the prerequisites for any data flow to work in the new framework. This configuration relates to data flows in the SAP BW/4HANA system. In these maintenance views, you must model a proper data flow, which includes the following:
● File
● Format
● Target
● DataSource
● Provider Structure
● Table
Configuration of /DDF/V_UPL_CNF

Upload Type   | Format    | Target                                   | DataSource            | Provider Structure         | Database Table | Max. No. of Entries
RP Data       | SDX       | Market Research Retail Panel Acquisition | /DDF/MRTPN_EXAMPLE    | /DDF/S_MRTPN_EXAMPLE       | /DDF/D_DS04    | 10000
RP Data       | SDX       | Time String Acquisition Market Research  | /DDF/TIME_ATTR_COL_DX | /DDF/S_MR_TIME_ATTR_COL_DX | /DDF/D_DS05    | 10000
RP Data       | SDX       | Location Acquisition                     | /DDF/LOC_ATTR_COL_DX  | /DDF/S_MR_LOC_ATTR_COL_DX  | /DDF/D_DS21_DX | 10000
RP Data       | SDX       | Location Hierarchy                       | /DDF/LOC_HIER_DX      | /DDF/S_MR_LOC_HIER_DX      | /DDF/D_DS24    | 10000
RP Data       | SFF       | Market Research Retail Panel Acquisition | /DDF/MRTPN_EXAMPLE    | /DDF/S_MRTPN_EXAMPLE       | /DDF/D_DS04    | 10000
RP Data       | SFF       | Product Name Value Pair Acquisition      | /DDF/PROD_ATTR_NV_DX  | /DDF/S_MR_PROD_ATTR_NV_DX  | /DDF/D_DS43_DX | 10000
Retailer Data | Confi CSV | Aggregated Sale Acquisition              | /DDF/SALES_POS        | /DDF/S_POS_SALES           | /DDF/D_DS01    | 10000
Retailer Data | Confi CSV | Location Acquisition                     | /DDF/LOC_ATTR_COL_POS | /DDF/S_POS_LOC_ATTR_COL    | /DDF/D_DS21    | 10000
Configuration of /DDF/V_DTSRC_SEL

Upload Type | Format | Target                                       | DataSource            | Provider Structure         | Field
RP Data     | SDX    | Market Research Retail Panel Acquisition     | /DDF/MRTPN_EXAMPLE    | /DDF/S_MRTPN_EXAMPLE       | DDAGR
RP Data     | SDX    | Market Research Retail Panel Acquisition     | /DDF/MRTPN_EXAMPLE    | /DDF/S_MRTPN_EXAMPLE       | PARAL_PROC_IDX
RP Data     | SDX    | Market Research Retail Panel Acquisition     | /DDF/MRTPN_EXAMPLE    | /DDF/S_MRTPN_EXAMPLE       | PARAL_PROC_NR
RP Data     | SDX    | Time String Acquisition Market Research      | /DDF/TIME_ATTR_COL_DX | /DDF/S_MR_TIME_ATTR_COL_DX | DDAGR
RP Data     | SDX    | Time String Acquisition Market Research      | /DDF/TIME_ATTR_COL_DX | /DDF/S_MR_TIME_ATTR_COL_DX | DELIVERY
RP Data     | SDX    | Time String Acquisition Market Research      | /DDF/TIME_ATTR_COL_DX | /DDF/S_MR_TIME_ATTR_COL_DX | PROCESSID
RP Data     | SDX    | Location Hierarchy Market Research Acquisition | /DDF/LOC_HIER_DX    | /DDF/S_MR_LOC_HIER_DX      | DDAGR
RP Data     | SDX    | Location Hierarchy Market Research Acquisition | /DDF/LOC_HIER_DX    | /DDF/S_MR_LOC_HIER_DX      | DELIVERY
RP Data     | SDX    | Location Market Research Acquisition         | /DDF/LOC_HIER_DX      | /DDF/S_MR_LOC_HIER_DX      | PROCESSID
Introduction to Process Types

In the classical framework, all the DataSources were designed based on function modules, and it was mandatory to run the code that reads the files and places them on the application server. This process ran at the DataSource level.

In the current solution, the DataSources are no longer based on function modules. Instead, the solution introduces the concept of a process type in a process chain: the function module logic is implemented as a process type that includes the file-reading logic.
When you add a process type to a process chain, the system offers a list of variants for selection. Select the variant based on the target of the object to be loaded.
Example:
When you drag and drop the process type for retail panel data, the system displays a popup in which you select the retail panel data targets to be loaded.
More Details on Process Types:
To determine the ABAP OO class implemented for a specific process type, check the settings of that process type. In transaction RSA1, choose Settings > Maintain Process Types and look at the possible process types. Double-click the process type and check the Object Type Name, which gives you the name of the ABAP OO class that implements the process type.
3.1 Implementation of the ABAP OO Class for a Process Type
1. Run transaction SE24, specify the name of the class you want to create on the Class Builder initial screen, and choose Create.
2. Select object type Class.
3. Implement your own required class to be used in the process type.
3.2 Specify settings for new process type
After you have implemented the class that performs the processing and maintenance of your custom-defined process type, you must declare the new process type in the process chain maintenance.
Call transaction RSA1 and open an existing process chain or create a new one. Choose Settings and navigate to Maintain Process Types.
Create a new entry in the table, and specify the following settings:
Setting | Description
Process Type | Name of the process type, with short and long descriptions
Object Type Name | Name of your implemented ABAP OO class
Object Type | CL (ABAP OO class)
Possible Events | Process ends "successful" or "incorrect"
Repeatable | X
ID | Choose an icon via the F4 value help
Internal name |
Process Category | Choose a category where you want to find your process type later in the process chain maintenance.
The process type you created is available for selection under your specified process category.
The following figure illustrates a sample process type created with the above-mentioned settings.
To navigate to the settings details, you can double-click on the entry.
An application of the process type is visualized in the following figure.
3.1.2 Basic Settings for Data Upload in Tables
Use
For detailed information about the basic settings and configuration required to upload data in tables using the persistent staging area (PSA), go to SAP Solution Manager.
More Information
Automatic Data Upload with PSA Scanner [page 117]
3.1.3 Scheduling and Monitoring Jobs for Automatic Data Uploads
Use
Automatic data uploads are performed using preconfigured jobs that are triggered in the background and run in a controlled loop. These jobs execute repetitive functions and can run in multiple processes.
The jobs are scheduled by the control job and are executed at intervals specified in Customizing.
The Monitor Jobs user interface gives you an overview of the status of all scheduled jobs (for example, Running, Scheduled or Stopped) and shows whether there are any issues in the data upload.
Activities
1. Configure the standard background jobs in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings Configure Jobs .
Define the following settings:
○ Background job variants
○ Start date and time
For example, you can schedule the job to run hourly during the period of least system usage, such as each weekday night from 22:00:00 to 06:00:00.
○ Job duration
○ Maximum runtime and the intervals in which the jobs are executed
○ The sleep time between job runs
2. Monitor the upload jobs using the Monitor Jobs user interface. You can start or stop jobs manually.
○ If you stop a job manually, you place the function implemented by the job on hold. For example, if you stop the folder scanner for files in SDX and CSV format, new files are no longer detected and processed. If you stop the PSA scanner, the scanner no longer detects new entries in the PSA tables.
○ If a job has been stopped manually or has been aborted by the system, check the job messages in Monitor Jobs. After you have solved the issues, restart the job manually in Monitor Jobs.

Note: First try to reset jobs in the Monitor Jobs UI by simply stopping the relevant job and then restarting it. If this does not work, execute the Reset Jobs report (transaction /DDF/JOB_RESET). You use the Reset Jobs report only in the very rare case in which jobs are in a status from which they cannot be started or stopped manually in Monitor Jobs. This situation can occur if the system was restarted and the jobs were not stopped before the restart. In this case, the report resets the job status to New.
More Information
Standard Jobs [page 88]
Housekeeping [page 91]
3.1.3.1 Example: Job
The table and figure that follow describe an example of a job defined in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings Configure Jobs .
Attribute | Value | Explanation
Start Date | 03.09.2012 | Defines the date on which the background job is scheduled to start for the first time.
Start Time | 21:00:00 | Defines the time of day at which the background job is scheduled to start for the first time.
Maximum Runtime | 1 | Defines the length of time in hours that the job takes to run until completion. If you enter zero (0), the job runs forever or until it is stopped, either manually or due to a problem.
Period | 1 | Controls when the job scheduler creates a new instance of the job. If you enter zero (0), the job runs once only and is not rescheduled.
Sleep Time | 5 | An optional setting that defines how long the job is inactive between the completion of one run and the start of the next. If you enter zero (0), the job does not rest between runs and runs continuously.
Example Job Based on Daemon Framework
1. The job is started on the defined start date and time.
2. It runs for one hour (the runtime) and stops itself after one hour.
3. During its runtime, the job executes a function in a continuous loop. This function could be, for example, searching for new deliveries in all inbound folders for all active delivery agreements.
4. During the job's runtime, it goes to sleep for 5 minutes after each loop.
5. Every hour (the period), a new job instance is created.

Note: Only one job instance can exist at a time. If an instance exists but a new instance has to be created in the next period, the new instance is not created until the current instance is completed.
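The interplay of maximum runtime, sleep time, and period described above can be sketched as follows. This is an illustrative, non-SAP sketch with simulated time (all function names are invented for the example); it is not the actual daemon framework implementation.

```python
# Illustrative sketch (not SAP code): how the daemon-style job parameters
# (maximum runtime, period, sleep time) interact. Time is simulated in
# minutes so the behavior can be checked without waiting.

def run_job_instance(max_runtime_min, sleep_min, work, now=0):
    """Run one job instance: loop `work` until the runtime budget is used
    up, sleeping between loops. Returns the simulated end time."""
    end = now + max_runtime_min
    while now < end:
        work()                    # e.g. scan inbound folders for new deliveries
        now += sleep_min or 1     # sleep_min == 0 approximates a continuous loop
    return now

def schedule(period_min, runs, max_runtime_min, sleep_min, work):
    """Create a new job instance every `period_min` minutes. Only one
    instance exists at a time: if the previous instance is still running
    when the next period starts, the new instance waits for it to finish."""
    clock, log = 0, []
    for _ in range(runs):
        start = max(clock, len(log) * period_min)  # next period, or after current run
        clock = run_job_instance(max_runtime_min, sleep_min, work, now=start)
        log.append(start)
    return log

loops = []
starts = schedule(period_min=60, runs=2, max_runtime_min=60,
                  sleep_min=5, work=lambda: loops.append(1))
# Each 60-minute instance sleeps 5 minutes after each loop, giving 12
# loops per instance; the second instance starts at the next period.
```

With the example values from the table (runtime 1 hour, period 1 hour, sleep 5 minutes), each instance ends exactly when the next period begins, so the job effectively runs back to back.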
3.1.3.2 Control Job
This job controls all the other jobs and does the following:
● Monitors and controls the statuses of other jobs
● Schedules other jobs (simultaneously)
● Always exists as a single instance
● Starts jobs
● Stops jobs
● Aborts processing with error logging so that errors can be displayed using Monitor Jobs
3.1.3.3 Standard Jobs
Use
You can automate any function that is repetitive. You make the relevant settings in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings Configure Jobs .
The following table displays the standard jobs that are used in SAP Demand Signal Management, version for SAP BW/4HANA.
Application Area | Description of Integration
Demand Data Foundation | The control job controls all the other jobs, and starts and stops the jobs.
Data Upload | The folder scanner and Persistent Staging Area (PSA) scanner jobs automatically detect data deliveries in either the inbound folder or the pre-configured Service Delivery PSA.
Process Flow Control (PFC) | The process scheduler job triggers the data processing for the processes and steps that fulfill certain conditions.
Administration | The administrative tasks job ensures that an error in a running process chain is propagated to its calling step. This job also moves files from the file processing folder to the archive folder for data deliveries that have the status Completed or Canceled.
Archiving | Instance table entries are archived to SAP Business Warehouse (SAP BW) to maintain system performance. The clean archived instances job ensures that archived entries of a certain age are deleted.
Harmonization | The Create Harm. Data Delivery ID (Create Harmonization Data Delivery ID) job creates the data delivery IDs that are used in data harmonization to pass on data that is manually changed there. The Release Harm. Data Delivery ID (Release Harmonization Data Delivery ID) job releases those data delivery IDs. The Resolve Harm. Work Items (Resolve Harmonization Work Items) job resolves work items that were created for pending activities in data harmonization. The Harmonization Postprocessing job executes harmonization at a later point in time after data upload.
More Information
Scheduling and Monitoring Jobs for Automatic Data Uploads [page 86]
Scheduling Background Jobs for Data Harmonization [page 235]
Control Job [page 88]
Folder Scanner [page 90]
PSA Scanner [page 90]
Administrative Tasks Job [page 92]
Clean Archived Instances Job [page 93]
Housekeeping [page 91]
3.1.3.3.1 Folder Scanner Job
Use
This job detects incoming deliveries by continuously scanning through configured folders.
Features
The folder scanner performs the following functions:
● It checks all the pre-configured directories for the presence of any files.
● It checks delivery completeness by checking that all the mandatory files are present in the current delivery.
● It renames or moves the files, depending on your settings, from the inbound folder to the process folder, and finally to the archive folder.
● It updates the runtime instance tables with the initial status for the following objects:
○ Data deliveries
○ Processes
○ Steps
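The completeness check and folder handling above can be sketched as follows. This is an illustrative, non-SAP sketch; the function and file names are invented for the example and do not correspond to the actual folder scanner implementation.

```python
# Illustrative sketch (not SAP code): the core folder-scanner idea --
# check a pre-configured inbound folder, verify that all mandatory files
# of a delivery are present, and only then move the delivery on.

def scan_inbound(inbound_files, mandatory_files):
    """Return (complete, missing): the delivery is complete only if every
    mandatory file is present in the inbound folder."""
    present = set(inbound_files)
    missing = [f for f in mandatory_files if f not in present]
    return (not missing, missing)

def move_delivery(inbound, process, files):
    """Move a complete delivery from the inbound folder to the process
    folder (in the real job, files later move on to the archive folder)."""
    for f in files:
        inbound.remove(f)
        process.append(f)

inbound = ["sales.csv", "location.csv"]
ok, missing = scan_inbound(inbound, ["sales.csv", "location.csv", "product.csv"])
# The delivery is incomplete while product.csv is missing, so nothing is
# moved and the runtime instance tables are not yet updated.
```

Only once the delivery is complete does the scanner move the files and write the initial status for the data delivery, processes, and steps.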
More Information
Automatic Data Upload with Folder Scanner [page 116]
Checking Completeness of Data Deliveries [page 171]
BI Content documentation: /DDF/COUNT
3.1.3.3.2 PSA Scanner Job
Use
This job detects incoming deliveries by continuously scanning the Service Delivery PSA.
The PSA scanner detects data sequentially, which means that although there can be multiple data deliveries for a single data delivery agreement at the same time, the data delivery with the oldest time stamp is processed before any data deliveries with a more recent time stamp.
For more information about this job, see the documentation in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings Configure Jobs .
Features
The PSA scanner does the following:
● It checks the Service Delivery PSA for new data deliveries.
● It prepares the data for upload:
○ It validates the uniqueness.
○ It ensures that the data deliveries are complete. This means checking that all the mandatory data sets are included in the data delivery.
○ It loads the complete data delivery to the instance tables and sets the status of all the instances to Ready. The instances include the following: data deliveries, processes, and steps.
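The sequential, oldest-first processing described above can be sketched as follows. This is an illustrative, non-SAP sketch; the function name and the delivery records are invented for the example.

```python
# Illustrative sketch (not SAP code): the PSA scanner processes deliveries
# per data delivery agreement strictly in time-stamp order -- the delivery
# with the oldest time stamp is always picked first.

def next_delivery(deliveries, agreement):
    """Among all pending deliveries for one agreement, pick the one with
    the oldest time stamp (ISO time stamps sort lexicographically)."""
    pending = [d for d in deliveries if d["agreement"] == agreement]
    return min(pending, key=lambda d: d["timestamp"]) if pending else None

deliveries = [
    {"agreement": "AGR1", "timestamp": "2018-11-02T08:00", "id": "D2"},
    {"agreement": "AGR1", "timestamp": "2018-11-01T08:00", "id": "D1"},
    {"agreement": "AGR2", "timestamp": "2018-10-30T08:00", "id": "D3"},
]
first = next_delivery(deliveries, "AGR1")   # the oldest delivery for AGR1
```

So even if several deliveries for the same agreement arrive at once, they are processed one after the other, oldest first.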
More Information
Automatic Data Upload with PSA Scanner [page 117]
3.1.3.3.3 Housekeeping Jobs
Use
Since the amount of data in SAP Demand Signal Management, version for SAP BW/4HANA, grows continuously as incoming data deliveries are processed, the system may reach its limits, for example, in hard drive capacity.
If you want to keep the amount of data in your SAP BW system constant, you use data archiving for the following data:
● Processed files in the archive folder
After files are processed successfully, they are moved to the archive folder by the administrative tasks job. Archive all processed files in all archive folders for all data deliveries on a regular basis. The recommended frequency depends on the number of deliveries received and the server capacity.
● Instance tables
Instances are extracted to SAP BW and deleted by the clean archived instances job when specified prerequisites are met. Since the data is extracted to SAP BW, you use the standard data archiving processes for SAP BW.
More Information
Administrative Tasks Job [page 92]
Clean Archived Instances Job [page 93]
For more information about the data archiving process in SAP BW, see SAP Library for SAP NetWeaver on SAP Help Portal at http://help.sap.com .
3.1.3.3.3.1 Administrative Tasks Job
The administrative tasks job ensures that an error in a running process chain is propagated to its calling step. This is essential if a process chain fails without updating the step status. For example, if the process chain instance has the status Error, or the process chain could not be triggered, while the status of the step instance remains In Process, the administrative tasks job sets the step status to Error as well, to reflect the issue with the process chain in SAP BW.
The job also specifically searches for steps that are in error and for which the Step Retry checkbox has been set. This setting is made in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Define Processes and Steps . The job restarts any steps that are in error and have the checkbox selected in Customizing. Restarting means that the step status is changed from Error to Ready. The process flow control can then pick the step up again.
Note: The Process Chain Check parameter (PROCESS_CHAIN_CHECK) is available for you to change in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings General Settings . This parameter defines whether the process flow control (PFC) checks for conflicts between process chains before the PFC triggers a process chain. By default, the value of the parameter is OFF; in this case, any process chain conflicts are resolved by SAP BW. If the value of the parameter is ON, the PFC checks for conflicts.
This job also moves files from the Process folder to the Archive folder for data deliveries that have the status Completed or Canceled.
The data delivery instance is marked for archiving by this job, if the following conditions are met:
● The delivery instance has a final status (Canceled or Completed)
● The associated files have been moved to the Archive folder
You can change the settings for this job in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings Configure Jobs .
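The two duties described above, error propagation and step retry, can be sketched as follows. This is an illustrative, non-SAP sketch; the dictionary fields and function names are invented for the example.

```python
# Illustrative sketch (not SAP code): two duties of the administrative
# tasks job -- propagate a failed process chain to its calling step, and
# restart steps in Error for which the Step Retry flag is set.

def propagate_chain_errors(steps):
    """If a step's process chain failed but the step itself is still
    In Process, set the step status to Error as well."""
    for step in steps:
        if step["chain_status"] == "Error" and step["status"] == "In Process":
            step["status"] = "Error"

def retry_steps(steps):
    """Restart steps in Error that have the Step Retry flag set:
    Error -> Ready, so the process flow control can pick them up again."""
    for step in steps:
        if step["status"] == "Error" and step["retry"]:
            step["status"] = "Ready"

steps = [{"chain_status": "Error", "status": "In Process", "retry": True}]
propagate_chain_errors(steps)   # the step status becomes Error ...
retry_steps(steps)              # ... and is then reset to Ready for retry
```

A step without the retry flag stays in Error until you intervene manually, which matches the Customizing behavior described above.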
3.1.3.3.3.2 Clean Archived Instances Job
Use
For analytical purposes, SAP Demand Signal Management, version for SAP BW/4HANA is set up to extract information from instance tables, planned data deliveries and the assignment between planned data deliveries and received data deliveries to DataStore objects (DSOs) in SAP BW.
To set up this extraction, you schedule the process chain /DDF/PFC_ALL.
This extraction has the following tasks:
● Back up historical uploads and their instances, planned data deliveries and their assignments● Enable analysis of the upload process and enable penalty reporting
To improve the performance of access to the instance tables and planned data deliveries table, the Clean Archived Instances Job monitors the instance and planned delivery tables and checks whether a data delivery with all its processes and steps can be deleted, and deletes all detected data deliveries.
In the standard system, the number of days until instances are deleted is defined in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings General Settings (parameter INSTANCE_CLEAN_DAYS_NUMBER). The default value is 730 days.
NoteYou can change the number of days by changing the parameter INSTANCE_CLEAN_DAYS_NUMBER in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings General Settings or by implementing the BAdI: Enhancement for Archiving.
Prerequisites
In the standard system, the following objects are identified before deletion:
Objects and Prerequisites for Deletion
1. Delivery instances:
● The last change took place longer ago than the value set in the parameter INSTANCE_CLEAN_DAYS_NUMBER
● The data delivery is marked for archiving
● The data delivery and its dependent objects (processes and steps) have been extracted to instance DSOs

2. Planned data deliveries:
● The last change took place longer ago than the value set in the parameter INSTANCE_CLEAN_DAYS_NUMBER, and the planned delivery date is also older than that number of days
● The planned data delivery has been extracted to DSOs

3. Assignments between planned data deliveries and received data deliveries:
● The assignments are between delivery instances and planned data deliveries that fulfill the above criteria
● The last change to the assignments took place longer ago than the value set in the parameter INSTANCE_CLEAN_DAYS_NUMBER
● The assignments have been extracted to DSOs
Result
The system deletes the following objects:

● The delivery instances identified in 1 that do not have assignments outside the assignments identified in 3
● The planned deliveries identified in 2 that do not have assignments outside the assignments identified in 3
● The assignments identified in 3
● The quality validation KPIs of the deleted deliveries
This job does not need to run daily; the appropriate frequency depends on the volume of incoming deliveries. The default period is 180 days: the job runs every 180 days, but it deletes, for example, deliveries and planned deliveries that are older than 730 days (the value set in INSTANCE_CLEAN_DAYS_NUMBER). If you change the default value of INSTANCE_CLEAN_DAYS_NUMBER, adjust the period setting of the job accordingly.
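The deletion criteria for delivery instances listed above can be sketched as follows. This is an illustrative, non-SAP sketch; the record fields are invented for the example, and only the INSTANCE_CLEAN_DAYS_NUMBER parameter name comes from the documentation.

```python
# Illustrative sketch (not SAP code): the criteria the Clean Archived
# Instances job applies to a delivery instance before deleting it,
# using the INSTANCE_CLEAN_DAYS_NUMBER parameter (default 730 days).

from datetime import date, timedelta

INSTANCE_CLEAN_DAYS_NUMBER = 730

def can_delete_delivery(instance, today):
    """A delivery instance may be deleted only if it is old enough,
    marked for archiving, and already extracted to the instance DSOs."""
    age = (today - instance["last_changed"]).days
    return (age > INSTANCE_CLEAN_DAYS_NUMBER
            and instance["marked_for_archiving"]
            and instance["extracted_to_dso"])

today = date(2018, 11, 30)
old = {"last_changed": today - timedelta(days=800),
       "marked_for_archiving": True, "extracted_to_dso": True}
recent = dict(old, last_changed=today - timedelta(days=100))
# The 800-day-old instance qualifies for deletion; the 100-day-old
# instance is kept.
```

All three conditions must hold; an old instance that has not yet been extracted to the DSOs is never deleted.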
More Information
● For more information about the broader topic of archiving in SAP Demand Signal Management, version for SAP BW/4HANA, see Archiving [page 174].
● For more information about configuring the job, see Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings Configure Jobs .
● BI Content documentation:
○ /DDF/PFC_ALL
○ /DDF/MPPFCD_PFC_STATUS_001
○ /DDF/MPPFC_PFC_DELIVERY_STAT01
3.1.4 Manage Agreements
In the Manage Agreements UI, you can complete the following tasks:

● Create a new agreement
● Create an agreement as a template for other agreements
● Edit an agreement, for example, add new master data to new agreements that you created based on templates
● Activate agreements for data upload and deactivate agreements, for example, agreements that you plan to delete
● Copy agreements
● Delete agreements
● Create the field mapping for retail panel agreements
Prerequisites
You have created the process definition and the steps belonging to it in SAP GUI. For more information, see Process Definitions and Steps [page 97].
Activities
Create an Agreement
In the Manage Agreements UI, you can create new data delivery agreements by completing the following steps:

Note: Alternatively, on the SAP Easy Access screen, choose Administrator Data Upload Data Delivery Agreements or Data Delivery Agreements for Market Research and Map External Fields for Retail Panel Data.

1. Choose New.
2. In the pop-up, enter the following mandatory information:
○ Agreement name
The agreement name must be unique and no longer than 10 characters.
○ Agreement type
○ Data format
○ Data upload method
For more information about the allowed combinations of mandatory information, see Agreement Types, Upload Methods, and Data Formats [page 115].
3. In the Details tab page, you maintain the master data that is required for proper data upload by selecting the agreement you just created from the Data Delivery Agreements list and choosing Edit. In the Details tab page, you can then enter the new master data. Checks are run when you save, and if any required information is missing, the system displays error messages. Enter the following:
○ Process definition
○ Data origin
○ Data provider
○ Region
○ Contexts
For more information about these master data types, see the field help by right-clicking on the field and choosing More Field Help.

Note: Alternatively, you can create the master data by choosing the New Master Data button at the top of the UI and selecting each master data type one by one from the dropdown. Pop-ups prompt you to fill the mandatory fields for each type (data origin, data provider, region, and context).

4. In the Retail Panel Data tab page, define the data that is relevant for retail panel agreement types.
5. In the Data Set tab page, define data sets and data set types.
6. In the File Set tab page, define files and file sets.
7. In the Mapping tab page, create the mapping based on your selection of data set type or DataSource.
8. In the Filters tab page, define the extraction filters based on the dimensions product, time, and location.
9. In the Harmonization Parameters tab page, define the general parameters for setting up harmonization for the data origin.
10. In the Assign Harmonization Group tab page, assign the harmonization group to the data delivery agreement.
11. In the Prioritize Data Origin tab page, prioritize the data origin.
12. In the Prioritize Data Origin Per Attribute Value tab page, prioritize the data origin per attribute value.
13. In the Harmonization Score tab page, the automatic harmonization score is set.
14. In the Mapping Instructions tab page, define the mapping instructions.
15. In the Quality Validation tab page, after choosing the execution time, you can add the function, sequence, and other details.

Note: In the Harmonization Parameters, Assign Harmonization Group, Harmonization Score, and Mapping Instructions tab pages, if you select the Default checkbox, default data is displayed. If the Default checkbox is not selected, maintain data delivery agreement-specific settings.

For more information about what to enter in the fields on these tab pages, see the field help by right-clicking on the field and choosing More Field Help.
Copy an Agreement
You can create new agreements by selecting an existing agreement from the list and choosing the Copy button. Note the following important points:

● To save the new target agreement, you have to assign it a new, unique name.
● You can only copy from one source agreement to one target agreement at a time. You cannot do this for multiple agreements at once.
● You can copy from source agreements that have been used for data upload (Is Used indicator), but the new target agreement does not acquire the attribute of being used for upload.
● You can copy from source agreements that are already activated, but the new target agreement does not acquire the attribute of being activated for upload. You also need to activate the new agreement before using it.
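The copy semantics above can be sketched as follows. This is an illustrative, non-SAP sketch; the field names (is_used, active) and the agreement records are invented for the example.

```python
# Illustrative sketch (not SAP code): what copying an agreement carries
# over and what it resets -- the copy needs a new unique name, and the
# "is used" and "activated" attributes are not inherited.

def copy_agreement(source, new_name, existing_names):
    """Copy one source agreement to one new target agreement."""
    if new_name in existing_names:
        raise ValueError("agreement name must be unique")
    target = dict(source)
    # The target keeps the settings but starts out unused and inactive.
    target.update(name=new_name, is_used=False, active=False)
    return target

source = {"name": "AGR_SRC", "format": "SDX", "is_used": True, "active": True}
target = copy_agreement(source, "AGR_NEW", {"AGR_SRC"})
# The copy keeps the settings (e.g. the format) but must be activated
# before it can be used for upload.
```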
More Information
Deleting Data Delivery Agreements [page 122]
3.1.4.1 Process Definitions and Steps
Use
Process definitions and process steps need to be created as a prerequisite to defining data delivery agreements. You must create the process definition and process steps in SAP GUI. You do this in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Define Processes and Steps .
For more information, see the Customizing documentation and field help.
More Information
Examples of Process Definitions for Uploading Master Data from Harmonization [page 69]
Examples of Process Definitions for Uploading Market Research Data for Global Reporting [page 66]
Examples of Process Definitions for Uploading Point of Sale Data [page 64]
3.1.4.1.1 Examples of Process Definitions for Uploading Master Data from Harmonization
The following examples show process definitions that define the required steps for uploading master data from data harmonization. These process definitions are defined in Customizing. For more information, see Customizing for Data Upload under Cross-Application Components Demand Data Foundation Data Upload Define Processes and Steps .
The first table shows an example of a process definition when the direct update of master data is not active. The second table shows an example after the direct update has been activated.
Process Definition When Direct Update Is Not Active
Step | Description | Parallelize Group | Process Chain ID | Upload Stage
101 | Extract Harmonized Location | MD1 | /DDF/GLID_EXTRACT |
102 | Extract Harmonized Product | MD1 | /DDF/GPID_EXTRACT |
103 | Extract Harmonized Location to Global Reporting | MD1 | /DDF/GMLID_EXTRACT |
104 | Extract Harmonized Product to Global Reporting | MD1 | /DDF/GMPID_EXTRACT |
201 | Extract Source Location after Harmonization | MD2 | /DDF/LOC_EXTRACT |
202 | Extract Source Product after Harmonization | MD2 | /DDF/PROD_EXTRACT |
203 | Extract Global Location Assignment | MD2 | /DDF/GMLOC_EXTRACT |
204 | Extract Global Product Assignment | MD2 | /DDF/GMPROD_EXTRACT |
Process Definition When Direct Update Is Active
Step | Description | Parallelize Group | Process Chain ID | Upload Stage
101 | Stage Data After Harmonization | | /DDF/MD_STAGE |
3.1.4.1.2 Examples of Process Definitions for Uploading Market Research Data for Global Reporting
The following examples show process definitions that define the required steps for uploading market research data for global reporting. These process definitions are defined in Customizing. For more information, see Customizing for Data Upload under Cross-Application Components Demand Data Foundation Data Upload Define Processes and Steps .
The first table shows an example of a process definition when the direct update of master data is not active. The second table shows an example after the direct update has been activated.
Process Definition When Direct Update Is Not Active
Step | Description | Parallelize Group | Process Chain ID | Upload Stage
101 | Hierarchy Metadata Extraction | | /DDF/HIER_MR_EXTRACT | Data Acquisition
102 | Hierarchy Level Metadata Extraction | | /DDF/HIER_LVL_MR_EXTRACT |
103 | Location Hierarchy Extraction | HD1 | /DDF/LOC_HIER_MR_EXTRACT |
104 | Product Hierarchy Extraction | HD1 | /DDF/PROD_HIER_MR_EXTRACT |
111 | Location Extraction | MD12 | /DDF/LOC_MR_EXTRACT |
112 | Product Extraction | MD1 | /DDF/PROD_MR_EXTRACT |
113 | Location Name Value Extraction | MD2 | /DDF/LOC_NV_MR_EXTRACT |
114 | Product Name Value Extraction | MD2 | /DDF/PROD_NV_MR_EXTRACT |
120 | Time String Extraction | | /DDF/MRTIME_FILE |
130 | Retail Panel Extraction | | /DDF/MRTPN_FILE |
198 | Location Staging | MD3 | /DDF/LOC_STAGE | Data Propagation
199 | Product Staging | MD3 | /DDF/PROD_STAGE |
201 | Location Hierarchy Staging | HD2 | /DDF/LOC_HIER_STAGE |
202 | Product Hierarchy Staging | HD2 | /DDF/PROD_HIER_STAGE |
213 | Location Name Value Staging | MD4 | /DDF/LOC_NV_STAGE |
214 | Product Name Value Staging | MD4 | /DDF/PROD_NV_STAGE |
220 | Time String Staging | | /DDF/MRTIME_STAGE |
230 | Retail Panel Staging | | /DDF/MRTPN_STAGE |
350 | Quality Validation Before Global Layer | | /DDF/GMQV_PERFORM | Quality Validation
401 | Upload Locations for Global Reporting | MD7 | /DDF/GMLID_STAGE | Data Propagation
402 | Upload Products for Global Reporting | MD7 | /DDF/GMPID_STAGE |
430 | Global Retail Panel Staging | | /DDF/GMRTPN_STAGE |
Process Definition When Direct Update Is Active
Step | Description | Parallelize Group | Process Chain ID | Upload Stage
101 | Hierarchy Metadata Extraction | | /DDF/HIER_MR_EXTRACT | Data Acquisition
102 | Hierarchy Level Metadata Extraction | | /DDF/HIER_LVL_MR_EXTRACT |
103 | Location Hierarchy Extraction | HD1 | /DDF/LOC_HIER_MR_EXTRACT |
104 | Product Hierarchy Extraction | HD1 | /DDF/PROD_HIER_MR_EXTRACT |
111 | Location Extraction | MD12 | /DDF/LOC_MR_EXTRACT |
112 | Product Extraction | MD1 | /DDF/PROD_MR_EXTRACT |
113 | Location Name Value Extraction | MD2 | /DDF/LOC_NV_MR_EXTRACT |
114 | Product Name Value Extraction | MD2 | /DDF/PROD_NV_MR_EXTRACT |
120 | Time String Extraction | | /DDF/MRTIME_FILE |
130 | Retail Panel Extraction | | /DDF/MRTPN_FILE |
198 | Location Staging | MD3 | /DDF/LOC_STAGE_2 | Data Propagation
199 | Product Staging | MD3 | /DDF/PROD_STAGE_2 |
201 | Location Hierarchy Staging | HD2 | /DDF/LOC_HIER_STAGE |
202 | Product Hierarchy Staging | HD2 | /DDF/PROD_HIER_STAGE |
220 | Time String Staging | | /DDF/MRTIME_STAGE |
230 | Retail Panel Staging | | /DDF/MRTPN_STAGE |
350 | Quality Validation Before Global Layer | | /DDF/GMQV_PERFORM | Quality Validation
401 | Upload Locations for Global Reporting | MD7 | /DDF/GMLID_STAGE_2 | Data Propagation
402 | Upload Products for Global Reporting | MD7 | /DDF/GMPID_STAGE_2 |
430 | Global Retail Panel Staging | | /DDF/GMRTPN_STAGE |
3.1.4.1.3 Examples of Process Definitions for Uploading Point of Sale Data
The following examples show process definitions that define the required steps for uploading point of sale data. These process definitions are defined in Customizing. For more information, see Customizing for Data Upload under Cross-Application Components Demand Data Foundation Data Upload Define Processes and Steps .
The first table shows an example of a process definition when the direct update of master data is not active. The second table shows an example after the direct update has been activated.
Process Definition When Direct Update Is Not Active
Step | Description | Parallelize Group | Process Chain ID | Upload Stage
101 | Location Acquisition | MD1A | /DDF/LOC_FILE | Data Acquisition
102 | Product Acquisition | MD1A | /DDF/PROD_FILE |
103 | Sales Acquisition | MD1A | /DDF/SALES_FILE |
104 | Stock Acquisition | MD1A | /DDF/STOCK_FILE |
150 | Quality Validation | n/a | /DDF/QV_PERFORM | Quality Validation
161 | Upload Additional Locations | MD1X | /DDF/LOC_ADD | Master data creation during Data Acquisition. These steps are only required if the delivery includes new products or new locations for which no master data has been created yet.
162 | Upload Additional Products | MD1X | /DDF/PROD_ADD |
201 | Location Staging | MD2A | /DDF/LOC_STAGE | Data Propagation
202 | Product Staging | MD2A | /DDF/PROD_STAGE |
203 | Sales Staging | TD2 | /DDF/SALES_STAGE |
204 | Stock Staging | TD2 | /DDF/STOCK_STAGE |
205 | Sales Enrichment | TD3 | /DDF/SALES_ENRICHMT_STAGE | Data Enrichment
206 | Stock Enrichment | TD3 | /DDF/STOCK_ENRICHMT_STAGE |
Process Definition When Direct Update Is Active

Step | Description | Parallelize Group | Process Chain ID | Upload Stage
101 | Location Acquisition | MD1A | /DDF/LOC_FILE | Data Acquisition
102 | Product Acquisition | MD1A | /DDF/PROD_FILE |
103 | Sales Acquisition | MD1A | /DDF/SALES_FILE |
104 | Stock Acquisition | MD1A | /DDF/STOCK_FILE |
150 | Quality Validation | n/a | /DDF/QV_PERFORM | Quality Validation
161 | Upload Additional Locations | MD1X | /DDF/LOC_ADD | Master data creation during Data Acquisition*
162 | Upload Additional Products | MD1X | /DDF/PROD_ADD |
201 | Location Staging | MD2A | /DDF/LOC_STAGE_2 | Data Propagation
202 | Product Staging | MD2A | /DDF/PROD_STAGE_2 |
203 | Sales Staging | TD2 | /DDF/SALES_STAGE |
204 | Stock Staging | TD2 | /DDF/STOCK_STAGE |
205 | Sales Enrichment | TD3 | /DDF/SALES_ENRICHMT_STAGE | Data Enrichment
206 | Stock Enrichment | TD3 | /DDF/STOCK_ENRICHMT_STAGE |

*Steps 161 and 162 are only required if the delivery includes new products or new locations for which no master data has been created yet.
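The Parallelize Group column in the tables above indicates which consecutive steps can run at the same time (for example, the four MD1A acquisition steps), while the groups themselves execute one after another in step-number order. The following Python sketch illustrates this grouping idea only; the step data is taken from the table, but the grouping function is a simplified illustration, not the actual SAP process scheduler.

```python
from itertools import groupby

# Each step: (step_number, description, parallelize_group, process_chain_id).
# Sample rows taken from the process definition table above.
steps = [
    (101, "Location Acquisition", "MD1A", "/DDF/LOC_FILE"),
    (102, "Product Acquisition", "MD1A", "/DDF/PROD_FILE"),
    (150, "Quality Validation", None, "/DDF/QV_PERFORM"),
    (203, "Sales Staging", "TD2", "/DDF/SALES_STAGE"),
    (204, "Stock Staging", "TD2", "/DDF/STOCK_STAGE"),
]

def execution_waves(steps):
    """Group consecutive steps that share a parallelize group.

    Steps in the same wave could be triggered in parallel; the waves
    themselves are executed one after another in step-number order.
    """
    waves = []
    for group, members in groupby(sorted(steps), key=lambda s: s[2]):
        members = list(members)
        if group is None:          # no parallelize group: step runs on its own
            waves.extend([m] for m in members)
        else:
            waves.append(members)
    return waves

for wave in execution_waves(steps):
    print([s[0] for s in wave])    # step numbers per wave
```

Running this prints three waves: the two MD1A steps together, the quality validation step alone, and the two TD2 steps together.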
3.1.4.2 Data Origins, Data Providers, and Data Delivery Agreements
In SAP Demand Signal Management, version for SAP BW/4HANA, you can model the data delivery agreements that you have with, for example, your retailers or providers of syndicated data.
You set up data delivery agreements between the following parties:
1. Data origin
2. Data provider
The data delivery agreements you define in SAP Demand Signal Management, version for SAP BW/4HANA reflect your actual agreements and enable you to define all the relevant information in one place.
The following figure shows the relationship between data origins, data providers, and data delivery agreements for file upload:
Relationship between origin, provider, and agreement
The information required for data delivery agreements includes not only the data origin and provider, but also the following:
1. Agreement type (POS data from a retailer, retail panel data, company-internal data, harmonization data)
2. Data format (static and configurable comma-separated values format, self-descriptive exchange format, tables)
3. Data upload method (automatic upload with folder scanner, automatic upload with PSA scanner, manual upload, manual upload for harmonization data)
4. Data set
5. Process
6. Context
3.1.4.3 Defining Data Origins
You use data origins to identify parties that create or own the data that is delivered to SAP Demand Signal Management, version for SAP BW/4HANA, for example, retailers, distributors, market research companies, or data syndicators.
On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Data Origins or use transaction Define Data Origins /DDF/DORIGIN.
Related Settings
The following settings can be defined dependent on data origins:
● Setting Up Automatic Harmonization [page 223]
● Defining Priorities for Copying Attribute Values into Harmonized Objects [page 224]
● Defining the Appearance of Object Attributes [page 221]
● Defining Location Calendars [page 44]
3.1.4.4 Defining Data Providers
You use data providers to identify the companies that deliver data to SAP Demand Signal Management, version for SAP BW/4HANA.
If the data provider is the same as the data origin, you can use the same value for both attributes.
On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Data Providers or use transaction Define Data Providers /DDF/DPROVIDER.
3.1.4.5 Defining Regions
You use regions to define the geographic area for which a data delivery is valid. The region is used, for example, during analysis in global reporting.
On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Regions or use transaction Define Regions /DDF/DORIGIN.
Before you can use a new region for the upload of retail panel data, you have to upload the region to the /DDF/MRCOUNTRY InfoObject. For more information, see One-Time Uploads Before Recurring Regular Data Uploads Are Started [page 164].
3.1.4.6 Defining Contexts
You use contexts to define the domain in which the external identifiers of a source object, for example, a source product or source location, are unique.
For each data origin, multiple contexts can exist.
You can, for example, use different contexts for the following purposes:
● The external identifiers of the source objects are not unique in data deliveries from one data origin.
● You want to distinguish reporting areas. For example, you can use different contexts to differentiate reporting areas like "Food" or "Animal Food".
1. On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Contexts or use transaction Define Contexts /DDF/CDPA.
2. Define contexts and assign number ranges for internal source products and internal source locations.
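Because external identifiers are only unique within a context, a source object is effectively identified by the combination of data origin, context, and external identifier. The following sketch illustrates that keying scheme; the function, structure, and sample values are hypothetical illustrations, not SAP tables or APIs.

```python
# Illustrative sketch: source products are keyed by (data_origin, context,
# external_id), so the same external ID can coexist in different contexts.
# All names and sample values here are hypothetical.
source_products = {}

def register_source_product(data_origin, context, external_id, description):
    key = (data_origin, context, external_id)
    if key in source_products:
        raise ValueError(f"duplicate external ID {external_id!r} in context {context!r}")
    source_products[key] = description

register_source_product("RETAILER_A", "FOOD", "4711", "Apple juice 1l")
# Same external ID in a different context is allowed:
register_source_product("RETAILER_A", "ANIMAL_FOOD", "4711", "Dog biscuits")
print(len(source_products))
```

Registering "4711" a second time within the same context would raise an error, while the same identifier under another context is accepted.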
Related Settings
The following settings can be defined dependent on contexts:
● Defining Time Derivation [page 147]
● Revising Hierarchy Levels [page 152]
● Defining Product Level Descriptions [page 153]
● Setting Up Semantic Partitioning [page 45]
3.1.4.7 Defining Data Delivery Agreements
Use
You use data delivery agreements to specify the data that is expected to be sent to SAP Demand Signal Management and how the data is handled during the upload process.
You can define specific delivery agreements for retail panel data (market research data), for retailer data (point of sale data), and for company-internal data. For data delivery agreements that are only used in data harmonization, you specify the agreement type Harmonization Data.
Prerequisites
You have defined the following objects:
● Data set types in Customizing for Cross-Application Components under Demand Data Foundation > Data Upload > Basic Settings > Define Data Set Types.
● Process definitions and process steps in Customizing for Cross-Application Components under Demand Data Foundation > Data Upload > Define Processes and Steps.
● (Optional) Partition values in Customizing for Cross-Application Components under Demand Data Foundation > Data Model > Define Partition Values.
● Data origins
● Data providers
● Regions
● Contexts
Activities
1. On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Data Delivery Agreements or use transaction Define Data Delivery Agreements /DDF/DDAGR.
2. Define the agreement type, the data format, and the data upload method. You can combine the values for these attributes as displayed in the following table:
Agreement Type | Data Format | Upload Method
Retailer Data | Tables | Automatic upload with PSA scanner, or manual upload
Retailer Data | Comma-separated values (CSV) | Automatic upload with folder scanner, or manual upload
Retail Panel Data | Self-descriptive exchange format (SDX) | Automatic upload with folder scanner
Company-Internal Data | Tables | Automatic upload with PSA scanner, or manual upload
Company-Internal Data | Comma-separated values (CSV) | Manual upload
Harmonization Data | Tables | Manual upload for harmonization

Retailer Data is used for POS data that is provided by retailers. Retail Panel Data is used for data collected from retailers by market research companies, who then consolidate and deliver the information. Company-Internal Data is used for company-internal data like data loaded from SAP ERP. Harmonization Data is used for data that has been changed in the system by data harmonization; you can define only one data delivery agreement with agreement type Harmonization Data.
3. Assign the data origin, data provider, region, and context that you created before.
4. (Optional) Enter a partition value for semantically partitioned objects.
Note: Semantic partitioning is a property of the InfoProvider in SAP Business Warehouse. It divides the InfoProvider into several small, identical units (partitions).
5. (Optional) Enter the time zone and the frequency of the data deliveries that are planned for this data delivery agreement.
6. Make the following settings, depending on the settings you made before:
Agreement Type or Data Upload Method | Settings
Agreement type Company-Internal Data | Enter the source system that is connected to SAP Business Warehouse (SAP BW) and is used as a source of data.
Data upload method Automatic Upload With PSA Scanner | Enter the source system and the delivery DataSource that is used for the PSA scanner.
Agreement type Retailer Data | Choose which attribute is used to identify products in the source data. You can either choose the Product Number or a Universal Product Code like the European Article Number (EAN).
7. Enter a process definition that controls the data upload and specify the default currency that is used if no currency is provided with the data.
8. If you want to prevent data harmonization from being called when master data is uploaded for this data delivery agreement, you can disable data harmonization.
For example, you disable data harmonization if you want to report on data that relates to a single data delivery agreement and you, therefore, do not want this data to be harmonized with data from other data delivery agreements.
If you switch on the direct update of master data, all master data needs to pass harmonization before entering the propagation layer. Therefore, data harmonization cannot be disabled in this case.
9. Assign data sets to data delivery agreements.
For data delivery agreements that have the data upload method Automatic Upload with PSA Scanner or use the SDX data format, the data set is mandatory. For all other data delivery agreements, you can use data sets to influence the instantiation of specific process steps that must be executed after the data is stored in the acquisition layer.
1. Select the data delivery agreement and choose subview Data Set Definition.
2. Choose a data set type and define a description.
3. (Optional) Specify a step number of a step in the process flow. Each step can be associated with a process chain in SAP BW that is triggered during instantiation of this particular data set.
4. Specify the DataSource in SAP BW that is the Persistent Staging Area (PSA) for the received data.
10. Use report Check Consistency of Configuration for Data Upload /DDF/CUST_CHECK to check whether all settings for data delivery agreements are consistent.
11. Activate the data delivery agreement.
Result
The system considers the newly created data delivery agreements when detecting new data in the inbound folder on the application server.
Note: A data delivery agreement is considered "used" if a data delivery has been uploaded for it. A used data delivery agreement cannot be deleted. If you nevertheless need to delete a used data delivery agreement, you can use report Reset 'Used' Indicator for Data Delivery Agreements /DDF/ADU_RESET_DDAGR_USED_FLAG.
More Information
Enhancing Data Delivery Agreements for Retail Panel Data [page 141]
Self-Descriptive Exchange Format [page 134]
Automatic Data Upload with Folder Scanner [page 116]
Automatic Data Upload with PSA Scanner [page 117]
Manual Data Upload [page 118]
Related Settings
The following settings can be defined dependent on data delivery agreements:
Defining Files and File Sets [page 111]
Configuring Quality Validation [page 203]
Defining Harmonization Groups [page 230]
Setting Up Semantic Partitioning [page 45]
3.1.4.8 Defining Harmonization Groups
Use
You use the harmonization group to collect objects whose attributes are handled in data harmonization in the same way. For each harmonization group, you can create derivation instruction sets for harmonized objects that define how attribute values of harmonized objects are derived from source attribute values.
Note: You can use the Business Add-In BAdI: Default Values for Set Up of Harmonization Groups (/DDF/BADI_FDH_ADU_DEFAULT) to define default values that take effect when nothing has been defined under Define Data Delivery Agreements /DDF/DDAGR.
Prerequisites
You have defined data delivery agreements. To define data delivery agreements, choose on the SAP Easy Access screen SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Define Data Delivery Agreements or use transaction Define Data Delivery Agreements /DDF/DDAGR.
Activities
1. On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization or use transaction Set Up Data Harmonization /DDF/FDH_SETUP.
2. On view Define Harmonization Groups, define all harmonization groups that are available in the system.
3. On view Select Object Type, select the object type for which you want to define the settings for harmonization groups.
4. On view Assign Harmonization Groups to Delivery Agreements, assign harmonization groups to data delivery agreements for the selected object type.
You can use the same harmonization group for multiple data delivery agreements, but you can assign only one harmonization group to each data delivery agreement.
When data is uploaded, the system automatically assigns the harmonization group to all source objects that are provided with the specified data delivery agreement.
If you add a harmonization group without specifying the data delivery agreement, this harmonization group is used as default for all source objects for which no specific settings are defined.
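The assignment and fallback behavior described above can be pictured as a lookup with a default entry. The following sketch is an illustration under that reading; the dictionary structure, names, and sample values are hypothetical, not SAP customizing tables.

```python
# Sketch of the assignment logic described above: per object type, each
# data delivery agreement maps to one harmonization group, and a group
# registered without an agreement (None) acts as the default.
assignments = {  # (object_type, data_delivery_agreement) -> harmonization group
    ("PRODUCT", "DDA_RETAILER_A"): "GRP_RETAIL",
    ("PRODUCT", None): "GRP_DEFAULT",   # no agreement given: default group
}

def harmonization_group(object_type, agreement):
    """Return the specific group for the agreement, or the default."""
    specific = assignments.get((object_type, agreement))
    return specific if specific else assignments.get((object_type, None))

print(harmonization_group("PRODUCT", "DDA_RETAILER_A"))  # specific assignment
print(harmonization_group("PRODUCT", "DDA_RETAILER_B"))  # falls back to default
```

Source objects delivered under DDA_RETAILER_A would get GRP_RETAIL, while objects from any agreement without a specific entry would fall back to GRP_DEFAULT.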
3.1.4.9 Data Sets
A data set groups the data that belongs to a data delivery. Data sets are assigned to data set types in Customizing.
Data sets must be defined for agreements in one of the following cases:
● The data upload method is automatic upload with PSA scanner
● The data format is self-descriptive exchange format (SDX)
For all other agreements, data sets can be used to influence the instantiation of specific process steps that must be executed after the data is stored in the acquisition layer.
For data delivery agreements whose upload method is automatic upload with PSA scanner or whose data format is self-descriptive exchange format (SDX), you specify the DataSource to which the data set corresponds.
You can configure which data sets are mandatory and which are optional in a data delivery.
You also assign a specific process step to the data set to define how the system should process the data during the upload.
Example
Data Set Type | Description | Step Number | DataSource Name | Optional
L_ATTR_COL | Location Attribute Column | 101 | /DDF/LOCATION_ATTR | Yes
POS_SALES | POS Sales Data | 201 | /DDF/SALES | No
In this example, the location attribute master data is optional in data deliveries. The POS transaction data is mandatory.
This means that for the data to be uploaded by the load manager, only the POS transaction data must be present in the Sales PSA.
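The mandatory/optional distinction in the example above boils down to a simple completeness rule: the load manager can start processing as soon as every non-optional data set has data. The following sketch illustrates that rule; the data structures and function are hypothetical illustrations using the example values from the table, not the actual SAP implementation.

```python
# Data set definitions taken from the example table above.
data_sets = [
    {"type": "L_ATTR_COL", "datasource": "/DDF/LOCATION_ATTR", "optional": True},
    {"type": "POS_SALES", "datasource": "/DDF/SALES", "optional": False},
]

def delivery_complete(received_datasources, data_sets):
    """A delivery is complete when every mandatory DataSource has data."""
    return all(ds["datasource"] in received_datasources
               for ds in data_sets
               if not ds["optional"])

# Only the mandatory POS sales data is present: complete.
print(delivery_complete({"/DDF/SALES"}, data_sets))
# Only the optional location attributes are present: not complete.
print(delivery_complete({"/DDF/LOCATION_ATTR"}, data_sets))
```

The first call returns True (the optional location attribute data may be missing), the second returns False because the mandatory sales data has not arrived.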
3.1.4.10 Defining Files and File Sets
Use
You use file sets to group the files that belong to a data delivery. You specify the files that belong to the file set and to which data delivery agreement the file set belongs.
Prerequisites
You have defined data delivery agreements.
Activities
1. In the UI for Manage Agreements, select the data delivery agreement for which you want to define files and file sets.
Note
Alternatively, on the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Files and File Sets or use transaction Define Files and File Sets /DDF/FILESET.
Specify the key and a description for a file set definition. The file set definition could, for example, include the name of the retailer.
2. Choose a data delivery agreement. Each data delivery agreement can have only one file set definition associated with it.
3. (Optional) Specify the logical file paths in which the folder scanner searches for any new files. The logical file name is associated with the logical file path that is translated at runtime to the physical file path.
If you do not specify a logical file path, the standard folder that is specified in the general settings in Customizing is used. If you want to define any of the logical file paths for a specific file set, you have to define all of them.
4. Define the files that are contained in the file set.
A file can be marked as optional, which means the data upload can be triggered even if the file is not available.
○ Choose a process step that defines how the system processes the file during data upload. You can choose any step of the process that is assigned to the data delivery agreement.
○ Create a file name pattern through which the folder scanner can associate new files with specific data delivery agreements. The file name pattern can be used to associate files whose names follow this pattern with a specific data delivery.
Note
File Name Pattern Requirements

Agreement Type | Data Format | File Name Pattern Required
Company-Internal | Tables (PSA) | Optional
Retail Panel (MRD) | Configurable CSV* | Mandatory
Retail Panel (MRD) | Self-Descriptive Exchange (SDX) | Mandatory
Retailer Data (POS) | Static CSV* | Optional

*You must use one of the following approaches:
○ A file name pattern with any number of consecutive plus signs (+) for each file
○ No file name pattern at all
Each file must exactly match the file name defined in the configuration. If you do not define any file name patterns, issues can occur during the data upload when multiple data deliveries are received, in particular when optional files are included in the delivery.
○ (Optional) Specify the code page and the replacement character.
5. Assign data sets to files.
You can choose any data set that is defined for the related data delivery agreement.
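Given the note above, a plus sign in a file name pattern appears to act as a single-character placeholder, for example for one digit of a date stamp. The following sketch translates such a pattern into a regular expression under that assumption; it is an illustration of the matching idea, not the folder scanner's actual implementation.

```python
import re

def pattern_to_regex(pattern):
    """Translate a file name pattern into a compiled regular expression.

    Assumption (based on the note above): each '+' stands for exactly one
    arbitrary character, for example one digit of a date stamp; all other
    characters must match literally.
    """
    parts = ["." if ch == "+" else re.escape(ch) for ch in pattern]
    return re.compile("^" + "".join(parts) + "$")

# Pattern with eight plus signs, as in the example file sets below.
rx = pattern_to_regex("Sales++++++++.csv")
print(bool(rx.match("Sales20181130.csv")))   # eight date digits fill the pattern
print(bool(rx.match("Sales2018.csv")))       # too few characters: no match
```

A file named Sales20181130.csv matches the pattern Sales++++++++.csv, whereas Sales2018.csv does not, because it has too few characters in the placeholder position.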
Example
In this example, the files that contain product and location master data are optional in data deliveries. The files that contain POS transaction data and stock data are mandatory.
File Set ID: TOMA001
File Set Description: TopMarket ePOS Daily
File Set Location: as defined in Customizing

Files Belonging to File Set | Step Number | File Name Pattern | Optional
File 01 | 101 | 01 ProdMaster++++++++.csv | Yes
File 02 | 201 | 02 StoreMaster++++++++.csv | Yes
File 03 | 301 | 03 Sales++++++++.csv | No
File 04 | 401 | 04 Stock++++++++.csv | No
More Information
Defining Data Delivery Agreements [page 106]
Enhancing Data Delivery Agreements for Retail Panel Data [page 141]
For more information about configuring files and file sets in the UI for Manage Agreements, see Defining File Sets [page 159].
3.1.4.10.1 File Sets
To enable data in files to be uploaded, you must configure file sets and files.
A file set groups together all the files that belong to a data delivery.
You specify the files that belong to the file set and to which data delivery agreement the file set belongs.
For each data delivery agreement, you can specify the logical folders in which the folder scanner searches for any new files. If you do not specify a logical folder, the standard folder that is specified in Customizing is used.
The files can either be provided in comma-separated values (CSV) or self-descriptive exchange (SDX) format.
You can specify which files are mandatory and which are optional in a data delivery.
You assign a specific process step to each file in the file set to define how the system should process the file during the upload.
You can use the file name pattern to associate a file in the inbound directory to the one that is defined in Customizing.
Example
File Set ID: TOMA001
File Set Description: TopMarket ePOS Daily
File Set Location: as defined in Customizing

Files Belonging to File Set | Step Number | File Name Pattern | Optional
File 01 | 101 | 01 ProdMaster++++++++.csv | Yes
File 02 | 201 | 02 StoreMaster++++++++.csv | Yes
File 03 | 301 | 03 Sales++++++++.csv | No
File 04 | 401 | 04 Stock++++++++.csv | No
In this example, the files that contain product and location master data are optional in data deliveries. The files that contain POS transaction data and stock data are mandatory.
This means that the delivery must contain at least all the mandatory files that are to be uploaded to the PSA. If optional files are missing, the delivery is still uploaded.
3.1.4.11 Agreement Types, Upload Methods, and Data Formats
Use
The following table shows which combinations of agreement type, data format, and data upload method can be used for defining a data delivery agreement:
Agreement Type | Data Format | Data Upload Method
Distributor Data | Static Comma-Separated Values (CSV) | Automatic Upload with Folder Scanner, or Manual Upload
Distributor Data | Configurable CSV | Automatic Upload with Folder Scanner, or Manual Upload
Retailer Data (POS Data) | Tables | Automatic Upload with Persistent Staging Area (PSA) Scanner, or Manual Upload
Retailer Data (POS Data) | Static Comma-Separated Values (CSV) | Automatic Upload with Folder Scanner, or Manual Upload
Retailer Data (POS Data) | Configurable CSV | Automatic Upload with Folder Scanner, or Manual Upload
Retail Panel Data (Market Research Data) | Self-Descriptive Exchange (SDX) | Automatic Upload with Folder Scanner
Retail Panel Data (Market Research Data) | Configurable CSV | Automatic Upload with Folder Scanner, or Manual Upload
Retail Panel Data (Market Research Data) | Standard File Format (SFF) | Automatic Upload with Folder Scanner
Company-Internal Data | Tables | Automatic Upload with PSA Scanner, or Manual Upload
Company-Internal Data | Static CSV | Manual Upload
Company-Internal Data | Configurable CSV | Manual Upload
Harmonization Data | Tables | Manual Upload
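A configuration check could treat the table above as a lookup of allowed combinations. The following sketch shows such a lookup for illustration; the table itself remains the authoritative source, and the shortened key strings are this example's own convention.

```python
# Allowed (agreement type, data format) -> upload methods, transcribed
# from the table above with shortened method names for readability.
ALLOWED = {
    ("Distributor Data", "Static CSV"): {"Folder Scanner", "Manual"},
    ("Distributor Data", "Configurable CSV"): {"Folder Scanner", "Manual"},
    ("Retailer Data", "Tables"): {"PSA Scanner", "Manual"},
    ("Retailer Data", "Static CSV"): {"Folder Scanner", "Manual"},
    ("Retailer Data", "Configurable CSV"): {"Folder Scanner", "Manual"},
    ("Retail Panel Data", "SDX"): {"Folder Scanner"},
    ("Retail Panel Data", "Configurable CSV"): {"Folder Scanner", "Manual"},
    ("Retail Panel Data", "SFF"): {"Folder Scanner"},
    ("Company-Internal Data", "Tables"): {"PSA Scanner", "Manual"},
    ("Company-Internal Data", "Static CSV"): {"Manual"},
    ("Company-Internal Data", "Configurable CSV"): {"Manual"},
    ("Harmonization Data", "Tables"): {"Manual for Harmonization"},
}

def is_valid(agreement_type, data_format, upload_method):
    """Check whether the combination is listed in the table above."""
    return upload_method in ALLOWED.get((agreement_type, data_format), set())

print(is_valid("Retail Panel Data", "SDX", "Folder Scanner"))  # allowed
print(is_valid("Retail Panel Data", "SDX", "Manual"))          # not allowed
```

For example, SDX retail panel data can only be uploaded with the folder scanner, so a manual upload for that combination would be rejected.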
More Information
Data Formats
Self-Descriptive Exchange Format [page 134]
Upload Methods
Automatic Data Upload with PSA Scanner [page 117]
Automatic Data Upload with Folder Scanner [page 116]
Manual Data Upload [page 118]
3.1.4.11.1 Data Upload Methods
Select the appropriate upload method:
Automatic Data Upload with Folder Scanner [page 116]
Automatic Data Upload with PSA Scanner [page 117]
Manual Data Upload [page 118]
3.1.4.11.1.1 Automatic Data Upload with Folder Scanner
With this method of uploading data, the files are placed in a folder on a file server. The folder scanner repeatedly scans the folder and triggers the upload process.
1. The files are placed in the file system, for example, by file transfer protocol.
2. The load manager reads the active data delivery agreements to determine which folders in the file system to scan. It determines this based on the definition of the logical file path you made in transaction Logical File Path Definition (FILE).
3. The folder scanner scans the inbound folder. In the standard system, each data delivery is placed in a folder that is reserved for one specific data delivery agreement.
File names can contain a pattern, such as a date or an ID, by which files from the same delivery can be recognized and placed in the correct folder.
4. As soon as the load manager detects that all the mandatory files in a data delivery agreement are in the inbound folder, it creates runtime instances of the following:
○ Steps
○ Processes
○ Data sets
After creating the instance data, the load manager does the following:
○ Associates the process ID with the files and the data sets
○ Tries to set the status of all the process steps to Ready
The statuses of the steps are propagated to all the instances. The files that have the status Ready are moved to the file processing folder. If any of these conditions are not met, the data delivery stays in status New. Steps 1-4 are repeated continuously.
5. The process scheduler scans the steps instance table to determine what is ready for execution.
6. The process scheduler updates the status of the step.
7. The process scheduler executes a step.
8. The step triggers a process chain.
9. The process chain reads instance tables for information that is needed for processing.
10. The process chain finishes executing the step and updates its status.
11. When all the steps are executed and the data delivery is complete, the files are moved to the archive folder.
3.1.4.11.1.2 Automatic Data Upload with PSA Scanner
Use
If you have data that does not come in files but is sent directly to SAP Business Warehouse, you can use the Persistent Staging Area (PSA) scanner. This method is useful for uploading data that has been collected and transformed by a tool to fit the interfaces of SAP Demand Signal Management, version for SAP BW/4HANA. The preferred tool is SAP Data Services.
A new delivery is processed as follows:
1. The system identifies the data delivery agreements that you defined in Customizing for this method of uploading data.
2. The system scans the Delivery PSA of the data delivery agreement, where information about a new data delivery can be found. The Delivery PSA is defined by the delivery DataSource attribute and the source system attribute in the Customizing of the data delivery agreement.
If no delivery DataSource is defined on the data delivery agreement, the delivery DataSource is taken from the general settings parameter (DELIVERY_INFO_DATASOURCE_NAME, standard value /DDF/DELIVERY) of the data upload Customizing.
The Delivery PSA assigns a unique delivery ID to each new data delivery.
3. If the data delivery is complete, the PSA scanner does the following:
○ Creates process and step instances configured for the corresponding data delivery agreement for the detected data deliveries and sets the status of all the instances to New.
○ Calls status management to set the statuses of all the detected data deliveries, the corresponding data sets, processes, and steps to Ready.
4. The process scheduler picks up the processes and steps that have the status Ready and triggers the associated process chains to propagate the detected data.
Result
In the Monitor Deliveries, you can monitor the scheduled process, which handles the data upload from the PSA.
3.1.4.11.1.3 Manual Data Upload
Use
You upload data manually in the following cases:
● You want to upload company-internal data from an ERP system like SAP ERP
● The folder or Persistent Staging Area (PSA) scanner fails
● You want to do a test upload for data from a new DataSource, such as a new retailer
SAP NetWeaver Business Warehouse (SAP NetWeaver BW) allows you to upload data by executing the appropriate InfoPackage manually.
Prerequisites
You have configured the following:
● An active data delivery agreement
You may want to define a separate data delivery agreement for each individual data set: one agreement for product data, one for location data, and one for sales data. You can also reuse an existing data delivery agreement.
● The process scheduler job
Ensure that it has the status In Process so that subsequent steps can be triggered automatically by the process scheduler.
● The process definition
Ensure that it has a step with a process chain assigned for transferring data from the Acquisition DataStore object (DSO) to the Propagation DSO.
● The step definition
You must assign the step to an upload stage for manual uploads. Ensure that you select the correct upload stage setting, as this setting determines which steps should and should not be instantiated in the manual upload.
● If you upload data by manually triggering a process chain and you want the subsequent steps to be triggered automatically by the process flow control (PFC), you must assign the process type /DDF/STATU at the end of the process chain.
● If you want to use the same data delivery agreement for a manual upload as you have used for an automatic upload with the folder scanner, you have to define upload stages in Customizing.
Process
1. You log on to SAP NetWeaver BW Data Warehousing Workbench (transaction RSPC) and navigate to the appropriate InfoPackage.
2. You trigger the data upload by doing one of the following:
○ You execute the InfoPackage to trigger the data upload to the relevant PSA.
If you upload a file, you must include the file name in the InfoPackage file name routine. If you upload company-internal data from SAP ERP, you use an SAP ERP DataSource. The PSA gets the relevant data delivery agreement from the file or from SAP ERP.
○ You execute the Data Transfer Process (DTP) to do one of the following:
○ Bring the data from a PSA request to the Acquisition DSO.
○ Bring the data from any DSO to the Acquisition DSO.
○ You execute the process chain that brings the data to the Acquisition DSO.
The process chain must have the process type /DDF/STATU at the end of it.
3. The system creates a delivery ID and a process ID.
4. The system also creates an additional step with step number 000000 and initial status Manual.
This status is for the step only and is not propagated. While the step has the status Manual, you can find it in the Monitor Deliveries UI under the status category Running for the related process.
If you did the manual upload without a process chain, this step has neither a process chain nor a process chain log ID. You must set the status manually to Completed, as there is no PFC interaction.
If you used a process chain in the manual upload, this step is instantiated with a process chain and a process chain log ID. The process type /DDF/STATU sets the step status to Completed.
5. The system creates the subsequent steps.
Using the delivery and process created in the previous step, the subsequent steps, which are assigned to upload stages other than Data Acquisition, are created with the status Ready. The PFC executes the subsequent steps once the additional step (with step number 000000 and initial status Manual) has the status Completed.
6. You check in the Monitor Deliveries UI whether the upload was successful.
Example
If you want to use the same data delivery agreement for a manual upload as you have used for an automatic upload with the folder scanner, you have to define upload stages in Customizing as displayed in the following table.
Step Attribute | Value
Agreement Type | Retailer Data
Upload Method | Automatic Upload With Folder Scanner
Data Format | Comma-Separated Values
The process definition has the following steps assigned:
● PRODUCT_FILE with upload stage Data Acquisition
● LOCATION_FILE with upload stage Data Acquisition
● PRODUCT_STAGE with upload stage Quality Validation
● LOCATION_STAGE with upload stage Data Propagation
You decide to upload the product file manually and can use the same data delivery agreement for this.
The system creates an additional step with step number 000000 and status Manual for the manual upload, and instances for the steps that are not relevant for data acquisition (PRODUCT_STAGE and LOCATION_STAGE).
The system does not create an instance for the step LOCATION_FILE because you want to upload only the PRODUCT_FILE in the manual upload.
Although no location data is uploaded, the step LOCATION_STAGE is nevertheless executed. This does not cause any conflicts because the Acquisition DataStore object does not contain any location data with the delivery ID that was created from the manual upload.
More Information
Data Reload [page 184]
3.1.4.11.2 Data Formats
Select the appropriate data format for your agreement type depending on the format in which you receive data from your provider:
Uploading Data in Configurable CSV Format [page 121]
Uploading Data in Self-Descriptive Exchange Format [page 134]
Uploading Data in Static CSV Format (Setting Up Automatic Upload with Folder Scanner [page 166])
Uploading Data in Tables (Setting Up Automatic Upload with PSA Scanner [page 169])
3.1.4.11.2.1 Example: Metafile Fields and Column Positions
The following table illustrates the metafile fields and their column positions. The records for each file must be grouped together, but the files can be described in any order.
Position:  1     2            20               40
Column:    Hide  File Name    File Property    Value

                 CONTROL      NAME_OF_FILE     FILES\CNTLFILE.TXT
                              TYPE_OF_FILE     CONTROL
                 DIMFILE      NAME_OF_FILE     FILES\DIMFILE.TXT
                              TYPE_OF_FILE     DIMENSION
This example describes two files: the control file and the dimension file.
The file names are CONTROL and DIMFILE.
The file properties are NAME_OF_FILE and TYPE_OF_FILE.
The values for NAME_OF_FILE are FILES\CNTLFILE.TXT and FILES\DIMFILE.TXT.
The values for TYPE_OF_FILE are CONTROL and DIMENSION.
This information in the metafile tells the load manager that the control file is CNTLFILE.TXT in the inbound folder, and the dimension file is DIMFILE.TXT in the inbound folder.
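A load manager reading this fixed-column metafile could be sketched as follows. This is illustrative only (the actual load manager is part of the product and its implementation is not shown here); the slicing offsets follow the column positions 1, 2, 20, and 40 above:

```python
# Illustrative sketch: parse a fixed-column metafile where column 1 is
# the hide flag, columns 2-19 the file name, columns 20-39 the file
# property, and column 40 onward the value. Records for a file are
# grouped; the file name appears only on the first record of a group.

def parse_metafile(lines):
    files = {}
    current = None
    for line in lines:
        name = line[1:19].strip()    # columns 2-19: file name
        prop = line[19:39].strip()   # columns 20-39: file property
        value = line[39:].strip()    # column 40 onward: value
        if name:                     # a new file's records start here
            current = name
            files[current] = {}
        files[current][prop] = value
    return files

# Build the sample with exact column widths (1 + 18 + 20 chars prefix):
sample = [
    " " + "CONTROL".ljust(18) + "NAME_OF_FILE".ljust(20) + "FILES\\CNTLFILE.TXT",
    " " + "".ljust(18)        + "TYPE_OF_FILE".ljust(20) + "CONTROL",
    " " + "DIMFILE".ljust(18) + "NAME_OF_FILE".ljust(20) + "FILES\\DIMFILE.TXT",
    " " + "".ljust(18)        + "TYPE_OF_FILE".ljust(20) + "DIMENSION",
]
print(parse_metafile(sample))
```

The grouping rule from the text is what drives the parser: a non-empty file name opens a new group, and property/value lines without a name belong to the group opened last.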
3.1.4.11.2.2 Uploading Data in Configurable CSV Format
Use
As a data upload supervisor you may need to upload retailer data (POS) and retail panel data. The data comes in Comma-Separated Values (CSV) files whose structure may differ from one file or delivery to the next. To enable the system to interpret, map, and upload the data in this “configurable CSV format”, you must make the settings described below.
Prerequisites
You have defined the following:
● A process definition specifically for configurable CSV formatted files in Customizing for Cross-Application Components under Demand Data Foundation > Data Upload > Define Processes and Steps
● A data delivery agreement specifically for configurable CSV formatted files in the SAP Easy Access Menu under Administrator > Data Upload, in either transaction Define Data Delivery Agreements (/DDF/DDAGR) or transaction Data Delivery Agreements for Market Research Data (/DDF/DDAGR_MRD)
Process
1. Create a folder structure on your application server and place the CSV files in your inbound folder.
For more information on creating folder structures, see the documentation in Customizing for Cross-Application Components under Demand Data Foundation > Data Upload > Basic Settings > General Settings.
2. Go to the Manage Agreements UI and, under the tab page File Sets, define the files.
3. Add, delete, and edit the attributes of individual files as required.
4. Under the tab page File Format, modify the format of each file as required.
5. Under the tab page Mapping, make manual adjustments to the mapping as required.
6. Save your entries.
7. Activate the data delivery agreement.
Result
The system uploads the data automatically.
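Conceptually, what the settings above configure can be pictured as a small reader that applies per-agreement settings (separator, header rows, column-to-field mapping) to an incoming file. This is purely an illustration; the dictionary keys, field names, and sample data are invented and do not correspond to actual system structures:

```python
# Illustrative sketch of the "configurable CSV" idea: one reader handles
# files whose separator, header rows, and column order differ per data
# delivery agreement. All names here are invented for illustration.
import csv
import io

def read_configurable_csv(text, settings):
    reader = csv.reader(io.StringIO(text), delimiter=settings["separator"])
    rows = list(reader)[settings["header_rows"]:]   # skip configured header rows
    mapping = settings["mapping"]                   # target field -> column index
    return [{field: row[col] for field, col in mapping.items()} for row in rows]

pos_file = "store;article;qty\nS1;P100;5\nS2;P200;3\n"
settings = {
    "separator": ";",
    "header_rows": 1,
    "mapping": {"LOCATION": 0, "PRODUCT": 1, "SALES_QTY": 2},
}
print(read_configurable_csv(pos_file, settings))
```

A second agreement with a different separator or column order would only need a different settings dictionary, which is the design idea behind keeping format and mapping configurable per agreement.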
More Information
Example: Setting Up the Configurable CSV Format for Uploading POS Data [page 155]
3.1.4.12 Deleting Data Delivery Agreements
Use
Deleting data delivery agreements is an exception rather than a standard procedure. If you delete a data delivery agreement that has already been used to upload data, you create inconsistencies in BW reporting. To avoid inconsistencies, follow the steps below.
Prerequisites
Ensure the following prerequisites are fulfilled before deleting an agreement:
● The agreement is not active
● The agreement has not already been used to upload data
● No data exists in dependent tables
Procedure
1. On the UI for Manage Agreements, select the data delivery agreements you want to delete and click the Delete button.
2. If you cannot delete a data delivery agreement that you selected, do the following:
○ Remove all usages of the data delivery agreement from the following transactions or applications:
  ○ Configure Quality Validation (/DDF/QV_DEF)
  ○ Set Up Data Harmonization (/DDF/FDH_SETUP)
  ○ Global Market Data Provider app: Define Consolidation
  ○ Global Market Data Provider app: Define Extrapolation
  ○ Global Market Data Provider app: Define Time Split
  ○ Data Provider app: Data Delivery Monitor (remove the assignments)
  ○ Data Provider app: Plan Data Deliveries
○ Manually delete any data that was already uploaded to SAP BW using that data delivery agreement.
○ Execute the standard job Clean Archived Instances in the Monitor Jobs UI or in Customizing for Cross-Application Components under Demand Data Foundation > Data Upload > Basic Settings > Configure Jobs (/DDF/V_DAEMON).
○ Go to transaction Reset 'Used' Indicator for Data Delivery Agreements (/DDF/RESET_USED_AGR) to run the report (/DDF/ADU_RESET_DDAGR_USED_FLAG) that resets the data delivery agreement to an initial state (unused for data upload).
○ Deactivate the data delivery agreement by deselecting the Activate checkbox on the UI.
○ Delete the data delivery agreement.
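The three prerequisites can be pictured as a guard check that runs before deletion. This is an illustrative sketch only; the field names are invented, and the real checks run inside the application behind the Delete button:

```python
# Illustrative guard for the deletion prerequisites above (invented
# field names; not product code).

def can_delete(agreement):
    reasons = []
    if agreement.get("active"):
        reasons.append("agreement is still active")
    if agreement.get("used_for_upload"):
        reasons.append("agreement has already been used to upload data")
    if agreement.get("dependent_rows", 0) > 0:
        reasons.append("data exists in dependent tables")
    return (len(reasons) == 0, reasons)

ok, why = can_delete({"active": False, "used_for_upload": True, "dependent_rows": 0})
print(ok, why)
```

Returning the list of failed prerequisites mirrors step 2 above: each reason maps to one of the cleanup actions you must perform before the deletion can succeed.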
3.2 Configuring and Administrating the Data Upload
Use
The following table shows the tasks you have as an administrator for the data upload and explains in which cases and where you make the individual settings. Depending on your system landscape, you make Customizing settings in the development system and all other settings in the production system.
Note
For more information, see Roles for SAP Demand Signal Management, version for SAP BW/4HANA [page 530].
Task: Set up data upload in Customizing
When? Implementation phase.
Where? In Customizing under Cross-Application Components > Demand Data Foundation > Data Upload.

Task: Define data origins, data providers, regions, and contexts
When? You receive new files from an external data provider for which the data is not defined yet.
Where? On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements.

Task: Define new data delivery agreements
When? You receive new files from an external data provider for which no data delivery agreement is defined yet.
Where? On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Data Delivery Agreements, or use transaction Define Data Delivery Agreements (/DDF/DDAGR).

Task: Enhance data delivery agreements for retail panel data
When? You receive new files for retail panel data (market research data) from an external data provider for which no data delivery agreement is defined yet.
Where? On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements for Market Research Data > Enhance Data Delivery Agreements, or use transaction Enhance Data Delivery Agreements (/DDF/DDAGR_MRD).

Task: Define files and file sets
When? A new data delivery agreement has been created.
Where? On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Files and File Sets, or use transaction Define Files and File Sets (/DDF/FILESET).

Task: Map external fields
When? A new data delivery agreement for retail panel data has been created.
Where? UI Manage Agreements.

Task: Change mapping of external fields
When? A new data delivery agreement for retail panel data has been created, and the mapping has been defined.
Where? On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements for Market Research Data > Change Mappings for External Fields, or use transaction Change Mappings (/DDF/SDX_MAP_MAINT); alternatively, UI Manage Agreements.

Task: Define extraction filters
When? A new data delivery agreement for retail panel data has been created.
Where? UI Manage Agreements.

Task: Define time derivation
When? A new context for retail panel data has been created in the system.
Where? On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements for Market Research Data > Define Time Derivation, or use transaction Define Time Derivation (/DDF/DEF_TD).

Task: Revise hierarchy descriptions
When? A new data delivery agreement for retail panel data has been created in the system.
Where? On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements for Market Research Data > Revise Hierarchy Descriptions, or use transaction Revise Hierarchy Descriptions (/DDF/MR_HIER).

Task: Revise hierarchy levels
When? A new context for retail panel data has been created in the system.
Where? On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements for Market Research Data > Revise Hierarchy Levels, or use transaction Revise Hierarchy Levels (/DDF/MR_PRIO).

Task: Define product level descriptions
When? A new context for retail panel data has been created in the system that is used in global reporting.
Where? On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements for Market Research Data > Define Product Level Descriptions, or use transaction Define Product Level Descriptions (/DDF/MR_LVL).

Task: Perform one-time uploads
When? New stock types or types of sales data have been defined in Customizing, or new regions have been created.
Where? See One-Time Uploads Before Recurring Regular Data Uploads Are Started [page 164].

Task: Schedule background jobs for automatic uploads
When? Implementation phase.
Where? See Scheduling and Monitoring Jobs for Automatic Data Uploads [page 86].

Task: Monitor the upload processes and jobs
Where? UIs Monitor Deliveries and Monitor Jobs.
3.2.1 Enabling the Upload of Name/Value Pairs With Special Characters
Use
If you receive data that contains name/value pairs with special characters, you must execute the mapping report and implement the Business Add-In (BAdI) described below.
The mapping report Copy External Field Names (/DDF/ADU_ORIG_EXT_FNAME_COPY) uses the BAdI Mapping External Field Names (/DDF/BADI_MRD_NVP) which, if required, removes unsupported characters, truncates the character string to the maximum 30 characters, and ensures the string remains unique. The original external field name must be persisted beforehand.
If you have data delivery agreements for which external field names have already been uploaded and for which you want to implement the BAdI Mapping External Field Names and persist the original external field name, this report allows you to copy the values in the external field name column to the original external field name.
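The behavior described for the BAdI (remove unsupported characters, truncate to 30 characters, keep the result unique) can be sketched as follows. This is not the BAdI implementation; the supported character set and the uniqueness suffixing are assumptions made for illustration:

```python
# Illustrative sketch of the kind of transformation the BAdI performs.
# Assumption: only A-Z, 0-9, and underscore are treated as supported.
import re

def map_external_field_name(original, already_mapped):
    # Replace unsupported characters, then truncate to 30 characters.
    name = re.sub(r"[^A-Za-z0-9_]", "_", original).upper()[:30]
    base, counter = name, 1
    while name in already_mapped:          # keep the result unique
        suffix = str(counter)
        name = base[:30 - len(suffix)] + suffix
        counter += 1
    already_mapped.add(name)
    return name

seen = set()
print(map_external_field_name("net sales (value)", seen))
print(map_external_field_name("net sales [value]", seen))
```

Note how the second input collides with the first after cleaning and gets a numeric suffix; this is why the original external field name must be persisted beforehand, so the cleaned name can still be traced back to its source.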
Procedure
1. Execute the mapping report
You use this report to automatically copy the values in the Map All Fields table (/DDF/C_FIELDMAP) from the External Field Name column (EXTFIELDNAME) to the Original External Field Name column (ORIG_EXTFNAME).
2. Create mappings using one of the following:
○ Execute the report Map Fields for Market Research Data (/DDF/ADU_MAP_CUST_SDX).
For more information, see the report documentation.
○ Navigate to the user interface Manage Agreements and choose Create Mappings.
For more information, see SAP Library in SAP Help Portal at https://help.sap.com/dsimbw4h .
3. Go to transaction Data Delivery Agreements for Retail Panel Data (/DDF/SDX_MAP_MAINT) and check that the original external field name is filled.
4. Implement the BAdI
Before you upload external field names, particularly in the case of runtime mapping for data delivery agreements from countries whose languages contain special characters, such as Russia and China, note the following:
To enable the upload of external field names beyond the initial load to the persistent staging area, unsupported characters and external field names that are longer than 30 characters must be modified; otherwise dumps occur. To prevent this, you must implement the BAdI Mapping External Field Names, which allows you to change the external field names before they are saved to the mapping table. The modification involves replacing or removing unsupported characters in the attribute name/value pair data type. The BAdI does the following:
○ Modifies the external field names of data set types of subtype Attribute Name/Value Pair
○ Disables the external field name checks by data delivery agreement before the name/value pair upload
5. Disable checks on external field names for specific data delivery agreements and extend this to exclude name/value pairs that you have already uploaded.
You can implement this BAdI without affecting existing name/value pairs. The modified external field names are stored in the Map All Fields table in transaction Data Delivery Agreements for Retail Panel Data (/DDF/SDX_MAP_MAINT). The external field name of data set types of subtype Attribute Name/Value Pair is mapped to the ATTR_NAME extractor field during the name/value pair upload. The following checks are performed by default for the attribute name/value pair (external field name):
○ It cannot be longer than 30 characters
○ It cannot contain special characters (such as spaces)
○ It cannot be duplicated for the same data delivery agreement or the same data set type
○ When the check for the external field name fails, the upload process for the name/value pairs is stopped.
For more information, see the BAdI documentation in Customizing for Cross-Application Components under Demand Data Foundation > Business Add-Ins > BAdI: Mapping External Field Names.
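The default checks listed above can be pictured as a small validation routine. Function and variable names are invented for this sketch; in the system, a failed check stops the name/value pair upload:

```python
# Illustrative sketch of the default checks on an external field name
# (invented names; not product code). Returns the failed rule, or None
# if all checks pass.

def check_external_field_name(name, existing_names):
    if len(name) > 30:
        return "longer than 30 characters"
    if any(not (c.isalnum() or c == "_") for c in name):
        return "contains special characters"
    if name in existing_names:
        return "duplicate for this agreement/data set type"
    return None   # all checks passed

print(check_external_field_name("PACK SIZE", {"BRAND"}))   # space fails
print(check_external_field_name("PACK_SIZE", {"BRAND"}))   # passes
```

This also makes clear why the BAdI above is needed: names like "PACK SIZE" would fail the default checks unless they are cleaned first or the checks are disabled for the agreement.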
3.2.2 Configuring the Data Upload for Retail Panel Data
Use
Whenever you receive retail panel data from providers of market research data, you create new data delivery agreements. You can either create single data delivery agreements or you can create multiple data delivery agreements at the same time.
Activities
1. Create a folder structure with an inbound folder for the new data delivery agreements in accordance with the file path required.
Note
Define folder names containing 10 or fewer characters, as the name of a data delivery agreement is limited to 10 characters.
2. Place data deliveries or empty metafiles in the inbound folders so that the new folder can be detected.
3. Create a single data delivery agreement for retail panel data (see Enhancing Data Delivery Agreements for Retail Panel Data [page 141]).
4. Define a file set that only contains the metafile and has no process step assigned (see Defining Files and File Sets [page 111]).
5. Create the mapping for external fields and adjust the automatically generated mapping manually, if necessary (see Mapping External Fields for Retail Panel Data [page 143]).
6. Define extraction filters (see Defining Extraction Filters [page 145]).
7. Define the time derivation (see Defining Time Derivation [page 147]).
8. Revise the hierarchy data (see Revising Hierarchy Descriptions [page 150] and Revising Hierarchy Levels [page 152]).
9. Define product level descriptions (see Defining Product Level Descriptions [page 153]).
10. Activate the data delivery agreement.
11. Make sure that standard background jobs are running (see Scheduling and Monitoring Jobs for Automatic Data Uploads [page 86]).
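Step 1 and the note on folder names can be sketched as follows; the helper function, paths, and sample agreement name are illustrative assumptions, not product code:

```python
# Illustrative sketch: create the inbound/process/archive folder
# structure for a new agreement, enforcing the 10-character limit on
# the agreement (parent folder) name. Paths are examples only.
from pathlib import Path
import tempfile

def create_agreement_folders(root, agreement_name):
    if len(agreement_name) > 10:
        raise ValueError("agreement name is limited to 10 characters")
    base = Path(root) / agreement_name
    for sub in ("inbound", "process", "archive"):
        (base / sub).mkdir(parents=True, exist_ok=True)
    return base

root = tempfile.mkdtemp()
base = create_agreement_folders(root, "RETAILER1")   # 9 chars: OK
print(sorted(p.name for p in base.iterdir()))
```

The length check mirrors the note above: if the parent folder name exceeded 10 characters, it could not serve as the name of a data delivery agreement.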
3.2.2.1 Name/Value Pairs in Retail Panel Data
SAP Demand Signal Management, version for SAP BW/4HANA provides name/value pairs for any unknown attributes that are included in retail panel data deliveries. Any unknown attributes that have no target for storage must be mapped for future analytics and reporting.
To ensure that product and location attributes are mapped correctly, the Persistent Staging Area (PSA) contains the following key field names and values:
Field Name Field Value
ATTR_NAME /DDF/ATTRIBUTE_NAME
ATTR_VALUE /DDF/ATTRIBUTE_VALUE
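Conceptually, unknown attributes are carried as name/value pairs like this. The record layout and the set of known fields below are invented for illustration; only the ATTR_NAME/ATTR_VALUE idea comes from the text above:

```python
# Illustrative sketch: known columns map to dedicated fields, every
# unknown attribute becomes an ATTR_NAME/ATTR_VALUE row keyed to the
# record. Structures are simplified; not product code.

KNOWN_FIELDS = {"PRODUCT_ID", "DESCRIPTION"}

def to_name_value_pairs(record):
    pairs = []
    for field, value in record.items():
        if field not in KNOWN_FIELDS:
            pairs.append({"PRODUCT_ID": record["PRODUCT_ID"],
                          "ATTR_NAME": field,
                          "ATTR_VALUE": value})
    return pairs

record = {"PRODUCT_ID": "P100", "DESCRIPTION": "Cola 1L",
          "CAFFEINE_FREE": "NO", "PACK_SIZE": "6"}
print(to_name_value_pairs(record))
```

The benefit is that attributes the data model does not know in advance still arrive in a fixed two-column shape and remain available for later analytics and reporting.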
3.2.2.2 Example: File Types
This table lists the file types you may find in the metafile.
File Type File Property
Metafile Not declared in the metafile
Format file FORMAT
Control file CONTROL
Dimension file DIMENSION
Hierarchy file HIERARCHY
Hierarchy level file LEVEL
Variable file VARIABLE
Key figure file MEASURE
Hierarchy structure file PARENTAGE
Attribute value file ATTRIBUTE_VALUE
Sales data file DATA
3.2.2.3 Example: Configurable CSV Format for Uploading Retail Panel Data
Procedure
1. Create a process definition and steps specifically for configurable CSV formatted files in Customizing for Cross-Application Components under Demand Data Foundation > Data Upload > Define Processes and Steps.
Use the following direct update process chains for retail panel data:

Step Number  Description                                                 Process Chain ID             Activate
10           Extract Market Research Hierarchy Metadata                  /DDF/HIER_MR_EXTRACT         Yes
20           Extract Market Research Hierarchy Level Metadata            /DDF/HIER_LVL_MR_EXTRACT
30           Extract Market Research Location Hierarchy                  /DDF/LOC_HIER_MR_EXTRACT
40           Extract Market Research Product Hierarchy                   /DDF/PROD_HIER_MR_EXTRACT
101          Extract Location for Market Research Data                   /DDF/LOC_MR_EXTRACT
102          Extract Product for Market Research Data                    /DDF/PROD_MR_EXTRACT
103          Extract Location Name/Value for Market Research Data        /DDF/LOC_NV_MR_EXTRACT
104          Extract Product Name/Value for Market Research Data         /DDF/PROD_NV_MR_EXTRACT
105          Extract Market Research Time String from File System        /DDF/MRTIME_FILE
110          Extract Market Research Retail Panel Exm. from File System  /DDF/MRTPN_FILE
201          Stage Locations with PFC                                    /DDF/LOC_STAGE_2
202          Stage Products with PFC                                     /DDF/PROD_STAGE_2
203          Stage Locations Name/Value Pairs with PFC                   /DDF/LOC_NV_STAGE
204          Stage Products Name/Value Pairs with PFC                    /DDF/PROD_NV_STAGE
207          Stage Market Research Time String with PFC                  /DDF/MRTIME_STAGE
211          Stage Market Research Retail Panel Example with PFC         /DDF/MRTPN_STAGE
221          Stage Market Research Location Hierarchy with PFC           /DDF/LOC_HIER_STAGE
222          Stage Market Research Product Hierarchy with PFC            /DDF/PROD_HIER_STAGE
2. In the SAP Easy Access Menu under Administrator > Data Upload, define the following attributes, which you can then assign to your data delivery agreement:
○ Data origins
○ Data providers
○ Contexts
Note
We recommend you create a new context. If you reuse an existing context, it may affect the harmonization results or final reporting.
3. Create a data delivery agreement specifically for configurable CSV formatted files in the SAP Easy Access Menu under Administrator > Data Upload > Define Data Delivery Agreements (transaction /DDF/DDAGR).
Note
Use the same syntax as the parent folder of the inbound/process/archive file folder structure when you define the name of the data delivery agreement.
4. Make the following settings in the data delivery agreement attributes:
○ Select the agreement type Retail Panel Data.
○ Select the data format Configurable CSV.
○ Select the data upload method Automatic Upload with Folder Scanner.
Note
The Template checkbox does not have any impact on a retail panel data delivery agreement type with the Configurable CSV data format.
5. Enter the process definition you created in step 1 and add the data origin, data provider, and context you defined.
Note
Do not select the checkbox Activate if the configuration for the data delivery agreement, file set, file format, and mapping is not yet done.
6. You must maintain the DataSource name for the data set types and steps that upload the data from the files to the acquisition layer. Add the data sets in the table below:
Data Set Definition

Data Set Type  Description                         Step Number  DataSource Name
H_HIER         Hierarchy Header                    10           /DDF/HIERARCHY_DX
H_LEVEL        Hierarchy Level                     20           /DDF/HIER_LEVEL_DX
L_ATTR_COL     Location Attribute Column           101          /DDF/LOC_ATTR_COL_DX
L_ATTR_NV      Location Attribute Name/Value Pair  103          /DDF/LOC_ATTR_NV_DX
L_HIER         Location Hierarchy                  30           /DDF/LOC_HIER_DX
P_ATTR_COL     Product Attribute Column            102          /DDF/PROD_ATTR_COL_DX
P_ATTR_NV      Product Attribute Name/Value Pair   104          /DDF/PROD_ATTR_NV_DX
P_HIER         Product Hierarchy                   40           /DDF/PROD_HIER_DX
RP_DATA        Retail Panel Transactional Data     110          /DDF/MRTPN_EXAMPLE
T_ATTR_COL     Time Attribute Column               105          /DDF/TIME_ATTR_COL_DX
7. Go to the SAP User Menu and choose Administrator > Data Upload > Data Delivery Agreements for Market Research Data > Enhance Data Delivery Agreements. Enhance the data delivery agreement with the following information (see the field help for more details):
○ Region
○ Normalized unit of measure
○ Source value dimension product: enter PROD.
○ Source value dimension location: enter GEOG.
○ Source value dimension time: enter TIME.
○ Priority calculation type: keep the default selection Calculation Uses DEFAULT Value for Empty Hierarchy Level.
○ Time granularity: depends on your data, for example, 0CWEEK for weekly deliveries or 0CMONTH for monthly deliveries.
○ 1st day of week: depends on your data.
○ 1st week of year: depends on your data.
Note
Ensure the data delivery agreement is inactive.
8. Drop the files in the Inbound folder.
9. Define the file set by going to the UI for Manage Agreements using either the SAP Fiori Launchpad, the SAP NetWeaver Business Client, or by navigating from the SAP User Menu.
○ Select the data delivery agreement you created and choose Edit mode.
○ In the File Set tab page, insert all the files that are available in your inbound folder and belong to the data delivery.
○ Make the following entries:
  ○ Selecting a row in the chosen file displays the file content in the preview
  ○ Add a description
  ○ Change the file name to a file name pattern to ensure future files are loaded
  ○ Assign data set types
  ○ Choose file, thousand, and decimal separators
  ○ Enter the number of header rows
  ○ Define whether the file is optional in the data delivery
  ○ Optional: check the Delete button function
10. In the File Format tab page, select the files one after the other and do the following:
○ Check the original external field name and modify the system proposals for external field names
○ Introduce conversion exits, if needed
○ Choose which attribute should be included in a name/value pair data set
11. In the Mapping tab page, make the following settings:
○ Select a non-name/value pair data set and assign attributes in one of the following ways:
  ○ Attribute Value (Custom)
  ○ External Field Value
  ○ Fixed Value
○ Select a name/value pair data set and assign one of the three attributes described above to the following:
  ○ Key fields
  ○ Attribute fields
Result
Check your settings as described in Checks After Setting Up Configurable CSV Data Format [page 133]
3.2.2.4 Example: Checking the Set-Up for Configurable CSV Format (Retail Panel)
Note
Optional
The system does not let you activate data delivery agreements for which there are Customizing inconsistencies. Therefore, you can run the report Check Consistency of Configuration for Data Upload (/DDF/ADU_CUST_CONSIST_CHECK) to check for any inconsistencies and correct them so that you are able to activate the data delivery agreement.
1. Activate the data delivery agreement in the UI for Manage Agreements.
2. Start the folder scanner and the process dispatcher.
3. Make sure that both the folder scanner and process dispatcher jobs are running in the Monitor Jobs UI under Status Overview.
4. Monitor the process in the Monitor Deliveries UI to ensure that the upload process is successful.
5. In the test system in the SAP User Menu, under Data Upload Supervisor > Monitor Deliveries, search for the data delivery agreement and monitor the last process until all its steps have been processed and the data delivery agreement process has the status Completed.
6. Make a note of the process ID. Then check the uploaded data in the acquisition DataStore objects (DSOs). Once the process chains have been processed and have the status Completed, the data has already been uploaded to SAP Business Warehouse (SAP BW).
7. Go to Data Warehousing Workbench: Modeling (transaction RSA1).
8. Select the InfoProvider node and expand the node Demand Data Foundation to perform the checks on the following DSOs:
DSOs for Checking
Technical Name Description
/DDF/DS21 Location Acquisition
/DDF/DS41 Product Acquisition
/DDF/DS05 Time String Acquisition
/DDF/DS04 Market Research Retail Panel Acquisition
To perform checks, select each DSO and right-click to select Display Data.
9. Enter the process ID you noted previously, prefixed with leading zeros (0000).
3.2.2.5 Self-Descriptive Exchange Format
The self-descriptive exchange (SDX) format is a data format that you can use to upload file sets that contain the following:
● Transaction data
● Master data (attribute values or hierarchy data)
● Hierarchy structures
● Hierarchy metadata
● Metadata on the key figures and the desired aggregation behavior of the key figures
● Technical metadata on the file set itself
The metafile is a prerequisite for data deliveries in SDX format. The metafile provides the following information:
● A complete list of all the files belonging to the file set
● A file name for all files
● A file property field that specifies the type of data contained in the set
3.2.2.5.1 Hierarchy Upload
Use
Market research data can contain hierarchies in the dimensions product, market (location), and time; for example:
● Product data hierarchy: Category, subcategory, segment, brand, product
● Market (location) data hierarchy: Market, country, region, zip code
● Time data hierarchy: Year, half year, quarter, and month
You can analyze the data on all levels of the hierarchies, if the following preconditions are fulfilled:
● The hierarchy levels must be available as master data records identified with a qualifier (node or leaf).
● The provided hierarchical structure must be converted into a hierarchy over the relevant InfoObject (for example, /DDF/PRODUCT and /DDF/LOCATION).
● The priorities are calculated to enable correct reporting of pre-aggregated totals.
In the standard system the hierarchy upload is implemented for the product and location dimension using the following DataSources in SAP Business Warehouse:
● Hierarchy Metadata: Contains external IDs and long texts of external hierarchies maintained in the default language of the data delivery agreement
● Hierarchy Level Metadata: Contains the level ID, the level number, and a long text for the levels maintained in the default language for each hierarchy of the data delivery agreement
● Hierarchy Data: Contains the parent-child relationship that must be converted into a BW hierarchy
Process
1. Extract hierarchy and hierarchy level information from the delivered retail panel data.
Create a data delivery agreement for market research data and the mapping of external fields.
The source values of the three dimensions used in the source data are stored (for example, PROD for the dimension product). This is necessary to be able to distinguish between the different hierarchies. During data upload, the names of the hierarchies themselves could be identical; for example, the source data could contain a product hierarchy called H1 and a location hierarchy called H1. If there are no hierarchy levels for a dimension and the priority calculation type of the data delivery agreement is set to Calculation Uses DEFAULT Value for Empty Hierarchy Level, the report creates default entries for all dimensions with empty levels.
2. Revise hierarchy descriptions and hierarchy levels manually, if necessary. For example, you can adapt the descriptions of hierarchy levels to your needs.
Changes to the descriptions or the hierarchy names apply after a reload of the hierarchies.
Changes to the numeric hierarchy level require a complete reload of the hierarchy and the transaction data.
Note
To update the hierarchy data, you have to update the mapping.
3. Upload the retail panel data for the first time (initial upload).
○ Hierarchy data
The system identifies the hierarchy for each dimension using the source value of the dimension that is assigned to the data delivery agreement. The numeric hierarchy level is determined by the system for each element of the dimensions product, location (market), and time. The result is stored in the corresponding master data InfoObject.
○ Transaction data
The system derives numeric hierarchy levels from the corresponding master data records belonging to the transaction data and multiplies the values to calculate the priority.
4. Subsequent upload of retail panel data.
○ Hierarchy data
During metadata upload of hierarchies and hierarchy levels, the system determines the changes of the current delivery compared to the existing settings for hierarchies.
The settings are updated automatically; old entries are deactivated and new entries created (for example, because a new hierarchy is provided). Afterwards, the numeric hierarchy levels are recalculated for each element in the three dimensions.
Note
It may happen that hierarchy data or hierarchy level data is missing in a delivery that previously contained this data. In this case, the system uses the existing settings to determine the numeric hierarchy levels and the priority calculation.
○ Transaction data
The system derives numeric hierarchy levels from the corresponding master data records belonging to the transaction data and multiplies the values to calculate the priority.
More Information
Priority Calculation [page 136]
Time Derivation [page 148]
3.2.2.5.2 Priority Calculation
Use
Market research data often contains key figures that are provided as so-called pre-aggregated totals, for example, percentage values like Numeric Distribution or Weighted Distribution. For these non-cumulative totals on different hierarchy levels and aggregation levels, you cannot use standard aggregation methods like sum. To ensure correct reporting, you must use an exception aggregation, for example, First Value with respect to a reference characteristic.
As market research data can contain hierarchies not only for products but also for the dimensions location (market) and time, you cannot use one hierarchy level, for example, one level of the product hierarchy, as the reference characteristic. Instead, you have to combine all possible hierarchy levels in an additional characteristic that can be used as a reference characteristic for all delivered key figures in reporting. The InfoObject /DDF/MRPRIO (Market Research Priority for Exception Aggregation) is used as the reference characteristic for this purpose. For the key figures, the exception aggregation First Value is used.
Prerequisites
The data delivery agreement has the following properties:
● Each dimension element must have a numeric hierarchy level.
If no hierarchy is available for a dimension, you can force the system to create a default entry for the calculation of priorities by using the corresponding priority calculation type for your data delivery agreement. If no numeric hierarchy level is available for an element, priority calculation is not possible at all.
● The numeric hierarchy levels must be ascending from top to bottom.
The key figures use the exception aggregation First Value. This means that a dimension element that is higher in the hierarchy must have a lower number for the numeric hierarchy level (1 is the top level). The level numbering can have gaps from one level to the next. This may be necessary if multiple hierarchies with different level numbers are available for one dimension.
● A hierarchy level must be unique for a dimension, independent of all hierarchies.
If a dimension element belongs to more than one hierarchy, the hierarchy level must be the same in all hierarchies.
You can use transaction Revise Hierarchy Data to check these settings and to define exceptions.
Example
There are two product hierarchies with the following hierarchy levels (in parentheses, the semantic meaning of the level):
● H1: L1 (Category) - L2 (Manufacturer) - L3 (Brand) - L4 (Sub-Brand) - L5 (Product)
● H2: L6 (Category) - L7 (Manufacturer) - L8 (Segment) - L9 (Brand) - L10 (Sub-Brand) - L11 (Product)
The level Brand has a different meaning in hierarchy H1 and in hierarchy H2. In H1 it is directly below Manufacturer whereas in H2 it is below Segment and Segment is below Manufacturer.
The numeric hierarchy levels are defined as follows:
● H1: 1 (Category) - 2 (Manufacturer) - 4 (Brand) - 5 (Sub-Brand) - 6 (Product)
● H2: 1 (Category) - 2 (Manufacturer) - 3 (Segment) - 4 (Brand) - 5 (Sub-Brand) - 6 (Product)
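The two prerequisites can be illustrated with a small sketch. Python is used here purely for illustration; the dictionaries H1 and H2 and the check_levels helper are hypothetical and not part of the SAP data model:

```python
# Hypothetical sketch of the prerequisites for numeric hierarchy levels,
# using the H1/H2 example above. Not the SAP implementation.

H1 = {"Category": 1, "Manufacturer": 2, "Brand": 4, "Sub-Brand": 5, "Product": 6}
H2 = {"Category": 1, "Manufacturer": 2, "Segment": 3, "Brand": 4,
      "Sub-Brand": 5, "Product": 6}

def check_levels(*hierarchies):
    """Check that levels ascend from top to bottom within each hierarchy
    (gaps are allowed) and that a level shared by several hierarchies
    carries the same number everywhere."""
    seen = {}
    for hierarchy in hierarchies:
        numbers = list(hierarchy.values())
        # Ascending from top to bottom; gaps (such as H1 skipping 3) are allowed.
        assert numbers == sorted(numbers), "levels must ascend from top to bottom"
        for name, level in hierarchy.items():
            # A level must be unique for the dimension, independent of the hierarchy.
            assert seen.setdefault(name, level) == level, f"{name} has conflicting levels"
    return True

check_levels(H1, H2)  # passes: Brand is 4 in both hierarchies, H1 skips level 3
```

Note how H1 leaves level 3 unused so that Brand keeps the same number (4) in both hierarchies, which is exactly the gap described above.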
Process
1. The system determines the numeric hierarchy level for each element of the three dimensions (attribute /DDF/HNUMLVL of InfoObjects /DDF/PRODUCT, /DDF/LOCATION, and /DDF/TIMEREF).
The three numeric hierarchy levels are multiplied for each record and stored as a priority. This value can be used as a reference to report on pre-aggregated values.
2. The system executes the priority calculation during the upload of transaction data by multiplying the numeric hierarchy levels of each dimension. This is done in a transformation for each data record before the result is stored as a numeric value in InfoObject /DDF/MRPRIO in DataStore object (DSO) /DDF/DS14:
Priority of each transaction data record (/DDF/MRPRIO) =
Numeric hierarchy level of the product (/DDF/PRODUCT-HNUMLVL) *
Numeric hierarchy level of the location (/DDF/LOCATION-HNUMLVL) *
Numeric hierarchy level of the time (/DDF/TIMEREF-HNUMLVL)
Example
You receive the following master data:
● A market hierarchy with the market Total Market and the two submarkets Region North and Region South.
● A product hierarchy with the category Cake and two products, Lemon Cupcake and Raspberry Muffin.
● No time hierarchy is defined, so the time hierarchy level for all records is 1.
You receive the transaction data with the percentage value for numeric distribution as displayed in the following table:
Market | Product | Value | Market Level | Product Level | Priority
Total Market | Cake | 61% | 1 | 1 | 1
Total Market | Lemon Cupcake | 53% | 1 | 2 | 2
Total Market | Raspberry Muffin | 52% | 1 | 2 | 2
Region North | Cake | 69% | 2 | 1 | 2
Region North | Lemon Cupcake | 61% | 2 | 2 | 4
Region North | Raspberry Muffin | 72% | 2 | 2 | 4
Region South | Cake | 56% | 2 | 1 | 2
Region South | Lemon Cupcake | 51% | 2 | 2 | 4
Region South | Raspberry Muffin | 59% | 2 | 2 | 4
The data of the first three columns is delivered by a market research company.
The hierarchy levels for markets and products are determined by the system during master data upload.
The system calculates the priority during the upload of transaction data.
Key figures with a reference to the calculated priority and the exception aggregation First Value provide the correct pre-aggregated values for reporting.
In this example, you report on the numeric distribution of Lemon Cupcakes in the Total Market. The system does not summarize the records for Region North and Region South but takes the value with the
highest priority (lowest number). The records for Lemon Cupcake on regional level have priority 4, whereas the Total Market record has priority 2; because the lower number means the higher priority, the Total Market record is taken. Its numeric distribution of 53% is the correct value.
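Computed step by step, the example looks as follows. This is a Python sketch for illustration only; the record layout and helper names are hypothetical, and in the product the priority is calculated in a BW transformation and stored in /DDF/MRPRIO:

```python
# Illustrative sketch of the priority calculation and the First Value
# selection for the Lemon Cupcake example above.

records = [
    # (market, market_level, product, product_level, time_level, value_percent)
    ("Total Market", 1, "Lemon Cupcake", 2, 1, 53),
    ("Region North", 2, "Lemon Cupcake", 2, 1, 61),
    ("Region South", 2, "Lemon Cupcake", 2, 1, 51),
]

def priority(market_level, product_level, time_level):
    # /DDF/MRPRIO = product level * location level * time level
    return market_level * product_level * time_level

# Exception aggregation First Value: instead of summing the pre-aggregated
# percentages, take the record with the highest priority (lowest number).
best = min(records, key=lambda r: priority(r[1], r[3], r[4]))
print(best[0], best[5])  # Total Market 53 (priority 2 beats the regional priority 4)
```

Summing the three percentages would give a meaningless 165%; selecting by lowest priority number reproduces the pre-aggregated total of 53%.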
3.2.2.6 DataSources and Extract Structures for Retail Panel Data
Use
In SAP BW, you need to distinguish between the different types of information (data sets) that a delivery contains.
When you extract retail panel data using the self-descriptive exchange format, there is a DataSource for each data set. This table shows which DataSources are linked with which extract structures. The function module used for these is /DDF/ADU_SAPDX_EXTRACT_GENERIC.
DataSource | Description | Extract Structure | Required Data Format
/DDF/PROD_ATTR_COL_DX | Product Attribute Column | /DDF/S_MR_PROD_ATTR_COL_DX | SDX
/DDF/PROD_ATTR_NV_DX | Product Attribute Name/Value Pair | /DDF/S_MR_PROD_ATTR_NV_DX | SDX
/DDF/PROD_HIER_DX | Product Hierarchy | /DDF/S_MR_PROD_HIER_DX | SDX
/DDF/LOC_ATTR_COL_DX | Location Attribute Column | /DDF/S_MR_LOC_ATTR_COL_DX | SDX
/DDF/LOC_ATTR_NV_DX | Location Attribute Name/Value Pair | /DDF/S_MR_LOC_ATTR_NV_DX | SDX
/DDF/LOC_HIER_DX | Location Hierarchy | /DDF/S_MR_LOC_HIER_DX | SDX
/DDF/TIME_ATTR_COL_DX | Time Attributes | /DDF/S_MR_TIME_ATTR_COL_DX | SDX
/DDF/TD_DX_TEMPLATE | SDX Transaction Data Template | /DDF/S_MR_TD_DX_TEMPLATE | SDX
/DDF/HIERARCHY_DX | Hierarchies | /DDF/S_MR_HIERARCHY_DX | SDX
/DDF/HIER_LEVEL_DX | Hierarchy Levels | /DDF/S_MR_HIER_LVL_DX | SDX
/DDF/MRTPN_EXAMPLE | Retail Panel Example | /DDF/S_MRTPN_EXAMPLE | Configurable CSV
More Information
DataSources and Extract Structures for Retailer (POS) Data [page 158]
3.2.2.6.1 Parallelizing Extraction Using Multiple InfoPackages
Use
To improve performance of the extraction of transactional data in self-descriptive exchange (SDX) format, you can parallelize the extraction.
To do this, you distribute the data of a delivery that is to be extracted equally across multiple InfoPackages that can be processed in parallel.
You use the control structure of the following DataSources to define the index and the total number of InfoPackages:
● /DDF/TD_DX_TEMPLATE SDX Transaction Data Template
● /DDF/MRTPN_EXAMPLE Retail Panel Example
If the control structure is not used, the system extracts the data sequentially.
Procedure
1. Create InfoPackages for the DataSource with the following selection parameters:
○ PARAL_PROC_NR contains the total number of InfoPackages used
○ PARAL_PROC_IDX contains the sequence number of each InfoPackage
2. Create a process chain that executes the InfoPackages in parallel and then calls the subordinate process chain /DDF/MRTPN1.
3. Assign the process chain to the process step that uploads SDX transactional data in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Define Processes and Steps.
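The effect of the two selection parameters can be pictured with a simplified sketch. The round-robin distribution rule below is an assumption for illustration only; only the parameter names PARAL_PROC_NR and PARAL_PROC_IDX come from the text:

```python
# Simplified sketch: splitting the records of one delivery evenly across
# parallel extraction packages. The round-robin rule is assumed for
# illustration; PARAL_PROC_NR and PARAL_PROC_IDX mirror the selection
# parameters described above.

def records_for_package(records, paral_proc_idx, paral_proc_nr):
    """Return the slice of the delivery that InfoPackage number
    paral_proc_idx (1-based) out of paral_proc_nr extracts."""
    return [r for i, r in enumerate(records)
            if i % paral_proc_nr == paral_proc_idx - 1]

delivery = [f"record_{i}" for i in range(10)]
parts = [records_for_package(delivery, idx, 3) for idx in (1, 2, 3)]

# Every record of the delivery lands in exactly one InfoPackage.
assert sorted(sum(parts, [])) == sorted(delivery)
```

Whatever distribution rule the extractor applies, the key property is the one asserted at the end: the InfoPackages partition the delivery, so extracting them in parallel processes every record exactly once.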
More Information
BI content documentation:
3.2.2.7 Enhancing Data Delivery Agreements for Retail Panel Data
Use
You enhance data delivery agreements with additional attributes that are necessary for uploading retail panel data from market research data providers.
Prerequisites
You have defined time granularities in Customizing under Cross-Application Components Demand Data Foundation Data Upload Settings for Retail Panel Data Define Time Granularity .
You have defined a data delivery agreement with the following properties:
● A process definition is assigned that refers to process chains that are created for DataSources for retail panel data.
● Agreement type is Retail Panel Data
● Data format is Self-Descriptive Exchange Format
● Data upload method is Automatic Upload with Folder Scanner
● Data sets for incoming files in self-descriptive exchange (SDX) format are defined and associated with process steps and the DataSources that are used for extraction.
Activities
1. On the SAP Easy Access screen, choose SAP Menu Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Enhance Data Delivery Agreements or use transaction Enhance Data Delivery Agreements /DDF/DDAGR_MRD. Select a data delivery agreement.
2. Specify the dimensions that can be used to upload hierarchies for the dimensions product, location, and time.
For example, the hierarchies of dimension Product can be defined for product category, subcategory, segment, brand, and product.
3. Specify the normalized unit of measure that is used to enrich market research data during data upload if no normalized unit of measure is provided.
The normalized unit of measure typically has the dimension weight or volume for reporting in kilogram (KG) or liter (L).
4. Choose the priority calculation type.
The priority calculation type defines whether a priority calculation is executed during data upload and how the system reacts if hierarchy levels are empty, for example, because they have not been provided with the retail panel data.
5. Choose the time granularity and dependent settings for time data.
The time granularity defines the time periods that the data provided in the data deliveries can be subdivided into, for example, calendar weeks or periods of four weeks.
6. Define additional metafiles.
If you want to load any additional files that are not contained in the metafile of the self-descriptive exchange (SDX) format, you can create a record for these files here. The records in this table are handled exactly as if they were contained in the SDX metafile.
7. Define additional format files.
If you want to load any additional files that are not explained in the SDX format file, you can create your own explanations by creating records in this table. The records in this table are handled exactly as if they were contained in the SDX format file.
8. Define synonyms.
Synonyms allow you to reduce the workload of updating multiple entries to updating a single entry. The records in this table are handled exactly as if they were contained in the SDX synonym file.
Note
Not all attributes that are mandatory for data delivery agreements are required for defining a template. On the UI, you create the dependent mappings and extraction filters and activate the data delivery agreements.
9. If you want to activate the data delivery agreement and do not want to use it as a template, proceed as follows:
1. (Optional) Use report Check Consistency of Configuration for Data Upload /DDF/CUST_CHECK to check whether all settings for data delivery agreements are consistent.
2. Map external fields using report Map Fields for Market Research Data /DDF/SDX_MAP or on the Manage Agreements UI.
3. Activate the data delivery agreement.
Result
You have created a template or an active data delivery agreement for retail panel data from a market research company.
All active data delivery agreements are taken into account when the system scans the inbound folder for new files. All files with an active data delivery agreement are uploaded.
More Information
Self-Descriptive Exchange Format [page 134]
Defining Data Delivery Agreements [page 106]
Mapping External Fields for Retail Panel Data [page 143]
Defining Extraction Filters [page 145]
Revising Hierarchy Descriptions [page 150]
Defining Files and File Sets [page 111]
Related Settings
The following settings can be defined dependent on data delivery agreements:
Configuring Quality Validation [page 203]
Defining Harmonization Groups [page 230]
3.2.2.8 Mapping External Fields for Retail Panel Data
Use
You map external fields to fields in the extract structure based on a data delivery agreement.
This mapping configuration allows you to take market research data from external sources and map the information to the fields in SAP Demand Signal Management, version for SAP BW/4HANA where you can then analyze and perform reporting on the data.
If you want to use a new field in analytics and reporting, you have to define the mapping.
The mapping also defines the hierarchy and hierarchy level information and the priority calculation.
Prerequisites
You have created inactive data delivery agreements for market research data.
You have assigned common fields to data set types in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Settings for Retail Panel Data Assign Common Fields to Data Set Types.
Examples for common fields are Process ID, Language, or Currency.
Activities
1. Open the Management of Data Delivery Agreements Web user interface (Web UI).
2. Select one or more data delivery agreements and choose Create Mapping. For existing mappings, you can do the following:
○ Update existing mappings
○ Delete the mappings that are disabled
○ Delete existing mappings
○ Delete all existing mappings and create new mappings
Note
You can also use report Map Fields for Market Research Data /DDF/ADU_MAP_CUST_SDX to create or delete the mapping for a single data delivery agreement.
The system creates the mapping of the external fields to extractor fields used in SAP NetWeaver Business Warehouse.
3. Adjust the field mapping for one or multiple data delivery agreements, if necessary. You can choose any field to which the external field can be mapped from the input help.
Note
You can also change the mappings in transaction Change Mappings of External Fields /DDF/SDX_MAP_MAINT:
1. On the SAP Easy Access screen, choose SAP Menu Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Change Mappings of External Fields.
2. Select a data delivery agreement and choose Map All Fields. Adjust the individual settings, if necessary.
3. Activate or disable the mapping for a specific data set type.
Recommendation
You can create the initial mapping in a test system and synchronize it with the production system at a later point in time.
Proceed as follows:
1. Deactivate the data delivery agreement.
2. Place all the files into the inbound folder.
3. Create the mapping.
4. Adjust the mapping if necessary.
Result
The system has created the mapping and the hierarchies. You can change the mappings and revise the hierarchy data afterwards.
Note
If there are no hierarchies for a dimension (product, market, or time) and the priority calculation type of the data delivery agreement is set to Calculation Uses DEFAULT for Empty Hierarchy Level, default entries are created for all the empty dimensions.
3.2.2.9 Defining Extraction Filters
Use
You can define and modify your own extraction filters for data delivery agreements to limit the amount of data that is loaded into SAP Demand Signal Management, version for SAP BW/4HANA to what is most relevant for your business needs. Extraction filters are applied to transaction data only. Master data is always uploaded in its entirety into the system to ensure consistent master data and hierarchies.
You can define an extraction filter at hierarchy level or attribute level for the following dimension types:
● Location (market)
● Product
● Time
Note
If a data delivery in self-descriptive exchange (SDX) format contains a new dimension type and the transactional data is to be filtered by the new dimension type, you need to define an additional dimension. For more information on how to set up the extraction filtering of transactional data for an additional dimension type, see also SAP Note 2086125.
Prerequisites
You have created mapping entries for the data set types for the following dimensions to ensure the extraction filter works at runtime:
● Product with dimension subtype Attribute Name/Value Pair
● Location with dimension subtype Attribute Name/Value Pair
● Time with dimension subtype Attribute Column-Based
The appropriate files and file sets are defined.
The data delivery agreement is not activated.
The files from the retail panel data delivery are in the inbound folder and ready to be uploaded.
Note
For more information, see Roles for SAP Demand Signal Management, version for SAP BW/4HANA [page 530].
Activities
1. Open the Web UI Management of Data Delivery Agreements.
2. Select a data delivery agreement and open the Filters assignment block.
3. In the Filters assignment block, choose the dimension for which you want to edit the filters.
4. Add new rows to the filter definition and enter the data. If you assign a group number, the system evaluates rows that belong to the same group number with AND. All groups, and all entries without a group number, are evaluated with OR.
○ Define filters by attributes. You can use all the attributes that are defined in the format file.
Example
Example of Product Filter by Attributes for a Data Delivery Agreement
Activated | Group Number | Field Name | Operator | Field Value
Yes | 1 | Brand | EQ | Best Muffin
Yes | 1 | Manufacturer | EQ | Best Manufacturer
Yes | 2 | Brand | EQ | Sweet Dreams
Yes | 2 | Type | NE | Cookies
In this example, the rule is interpreted as follows:
All products are considered when (Brand is equal to “Best Muffin” AND Manufacturer is equal to “Best Manufacturer”) OR (Brand is equal to “Sweet Dreams” AND Type is not equal to “Cookies”).
○ Define filters by hierarchies
You can define filters by hierarchies for all hierarchies that are defined for the dimensions Product and Location.
Filters by hierarchies can consist of several rules. Each rule is evaluated as EQUAL TO and applied to entire hierarchies. Multiple rules are logically linked with OR.
Example
You enter a filter with hierarchy name H_1.
In this example, all products that belong to product hierarchy H_1 are a positive match.
If you define a filter by attributes and a filter by hierarchy for a dimension, the rules are combined.
Example
For a data delivery agreement that has filters Product by Attributes and Product Hierarchy Filters defined as above, the system evaluates all products when ((Brand is equal to “Best Muffin” AND Manufacturer is equal to “Best Manufacturer”) OR (Brand is equal to “Sweet Dreams” AND Type is not equal to “Cookies”)) OR Hierarchy is equal to H_1.
5. Activate the filters.
6. Choose Simulate Filters to simulate the filters for each dimension.
For the selected dimension, the system extracts master data from the files according to the extraction filters you created. A dialog box is displayed and you can download the results log to check whether the filter results are as you would have expected.
Note
You can only simulate filters for inactive data delivery agreements.
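The combined evaluation described in step 4 (AND within a group, OR between groups, and OR with hierarchy filters) can be sketched as follows. The rule and product representations are hypothetical and serve illustration only:

```python
# Illustrative sketch of how the filter rules above combine. Only the
# AND-within-group / OR-between-groups logic and the OR with hierarchy
# filters come from the text; the data structures are invented.

attribute_rules = [
    # (group, field, operator, value)
    (1, "Brand", "EQ", "Best Muffin"),
    (1, "Manufacturer", "EQ", "Best Manufacturer"),
    (2, "Brand", "EQ", "Sweet Dreams"),
    (2, "Type", "NE", "Cookies"),
]
hierarchy_rules = ["H_1"]  # each hierarchy rule is evaluated as EQUAL TO

def rule_matches(product, field, op, value):
    return (product.get(field) == value) if op == "EQ" else (product.get(field) != value)

def passes_filter(product, hierarchies):
    groups = {}
    for group, field, op, value in attribute_rules:
        groups.setdefault(group, []).append(rule_matches(product, field, op, value))
    by_attribute = any(all(results) for results in groups.values())  # AND in group, OR between groups
    by_hierarchy = any(h in hierarchies for h in hierarchy_rules)    # hierarchy rules linked with OR
    return by_attribute or by_hierarchy                              # both filter kinds combined with OR

p = {"Brand": "Sweet Dreams", "Type": "Muffins"}
print(passes_filter(p, hierarchies=[]))  # True: group 2 matches
```

A product that matches neither attribute group still passes if it belongs to hierarchy H_1, which is the combined rule stated in the example above.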
Example
As a data upload supervisor, you are responsible for the data upload process, data quality, and the correct selection of data. You want to prevent your database from getting filled up with aggregated data for the following reasons:
● To maintain a low total cost of ownership
● To optimize data upload and reporting
Your marketing department informs you that they want to analyze and report on your main markets and products only. You start to define your own extraction filters for your main markets and product categories by hierarchy and by attribute. You simulate the filters using the provided files to ensure that the results are as you would expect. Afterwards, you activate the data delivery agreement and proceed with the data upload.
3.2.2.10 Defining Time Derivation
You define how the system derives start date, end date, and if possible, all standard time attributes like 0CALYEAR or 0CALMONTH from the source time strings.
If the standard time derivations do not deliver the desired output (for example, if you have special formats or customer-specific needs), you can create a Business Add-In (BAdI) implementation for BAdI: Time Derivation. After activating the implementation, the filter value is available in the input help for the field Time Derivation Method. Select your filter value and save the record to assign your conversion routine for this time derivation.
Note
You can add custom attributes, such as time or date. To do this, create a custom Business Add-In (BAdI) implementation.
Prerequisites
You have defined time granularities in Customizing under Cross-Application Components Demand Data Foundation Data Upload Settings for Retail Panel Data Define Time Granularity .
You have assigned the time granularity to a data delivery agreement.
Activities
1. Run report Create Standard Time Derivations /DDF/TIMETYPE_EXAMPLE to create standard entries of combinations of time derivation types and time granularities that typically exist in data deliveries and that you can use for time derivation.
On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Create Standard Time Derivations or use transaction Create Standard Time Derivations /DDF/DEF_TD_EXP.
2. On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Define Time Derivation or use transaction Define Time Derivation /DDF/DEF_TD.
3. Enter the time granularity using the input help.
4. If you want to define the time derivation for a specific context, enter a context using the input help.
5. Create a regular expression that matches your external time strings.
6. Specify the source time string that is provided with the market research data.
7. If the standard derivation methods do not support the delivered time string, enter a filter value to identify the time derivation method.
8. Specify the offsets that determine the position of time information in the time string.
More Information
Time Derivation [page 148]
Example: Defining Time Derivation [page 149]
3.2.2.10.1 Time Derivation
Use
The system derives start date, end date, and if possible, all standard time attributes like 0CALYEAR or 0CALMONTH from the source time strings. The result is posted to InfoObject /DDF/TIMEREF.
This is done during data upload from DataStore object /DDF/DS05 to DataStore object /DDF/DS15 as follows:
1. The system determines the time granularity and the context assigned to the data delivery agreement.
2. The system determines all possible time derivations for the given combination of context and time granularity.
If no entries are defined, the system collects all time derivation types for the given time granularity, regardless of the context. If the time granularity of the data delivery agreement is set to MULTIPLE, all time granularities are taken into account. Again, the system first checks whether a context is assigned and, if no matches are found, a context-independent search is run.
3. The system evaluates the source time string and checks the regular expression for all identified time derivations until one regular expression matches. If no time derivation is possible, the upload process is stopped.
4. The system converts the source time string into a start date and end date and determines all standard time attributes, for example 0CALYEAR or 0CALMONTH, if possible. The conversion is done using implemented methods that evaluate the various offset parameters assigned to the time derivation. These parameters define where the time information can be found in the source time string.
More Information
Defining Time Derivation [page 147]
Example: Defining Time Derivation [page 149]
3.2.2.10.2 Example: Defining Time Derivation
You receive a data delivery with the time strings M201210, M201211, M201212, M201301, and so on.
The source time strings represent calendar months. They are built according to the following pattern of seven characters:
● The first character M stands for month
● The next four digits describe the calendar year
● The last two digits describe the calendar month
Definition of a Regular Expression
A regular expression to validate the source time strings can be defined as follows: [M][2][0][0123456789]{2}[01][0123456789].
● The allowed set of values for each single digit is denoted within the square brackets. The first three characters are always M20, assuming that you receive only time strings for calendar years beginning with 20.
● The next two digits can contain any number between 0 and 9. Instead of entering the expression [0123456789] twice for each digit, you can use the abbreviated expression [0123456789]{2} indicating that the next two concatenated digits have the same value set.
● The sixth digit can have only the values 0 or 1.
● The last digit can contain any number between 0 and 9.
Using this regular expression, the system searches the time string for any sequence of seven characters that follows this expression. Any string that contains characters before or after this pattern, for example ABCM201302 or DEFM201303GH, is also accepted by the system. To limit the expression so that it accepts exactly seven characters, you must define the start and end of the string using \< and \>, for example: \<[M][2][0][0123456789]{2}[01][0123456789]\>.
For more details about regular expressions, see documentation for the following ABAP keywords:
● Syntax of Regular Expressions
● Character String Patterns
● Special Characters in Regular Expressions
Definition of Offsets
The four-digit year (in the example string 2012) starts at position 2 and the two-digit calendar month (in the example 09) starts at position 6. Therefore, the corresponding offsets of 1 and 5 are defined to identify the year and month within the source time string. All other offsets are set to 99 to indicate that no other time information is provided.
Method of Time Derivation
The time derivation is implemented as a standard routine. Therefore, no filter value is needed to identify a BAdI implementation. The system automatically determines the start date and end date.
Result of Time Derivation
As a result, the system derives the following values for the attributes of /DDF/TIMEREF for source time string M201209:
● 0DATEFROM = 20120901
● 0DATETO = 20120930
● 0CALMONTH = 201209
● 0CALMONTH2 = 09
● 0CALQUARTER = 20123
● 0CALQUART1 = 3
● 0CALYEAR = 2012
All other attributes, for example 0CALWEEK, are undefined and cannot be determined.
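The derivation for this example can be sketched in Python. The ABAP word-boundary anchors \< and \> are replaced here by ^ and $, and the derive helper is illustrative only, not the SAP implementation:

```python
import re
import calendar

# Sketch of the time derivation for source strings like M201209.
PATTERN = re.compile(r"^M20[0-9]{2}[01][0-9]$")  # Python equivalent of the ABAP expression
YEAR_OFFSET, MONTH_OFFSET = 1, 5                 # offsets defined above

def derive(time_string):
    """Derive start date, end date, and standard time attributes."""
    assert PATTERN.match(time_string), "no time derivation possible"
    year = int(time_string[YEAR_OFFSET:YEAR_OFFSET + 4])
    month = int(time_string[MONTH_OFFSET:MONTH_OFFSET + 2])
    last_day = calendar.monthrange(year, month)[1]  # last day of the month
    return {
        "0DATEFROM": f"{year:04d}{month:02d}01",
        "0DATETO": f"{year:04d}{month:02d}{last_day:02d}",
        "0CALMONTH": f"{year:04d}{month:02d}",
        "0CALQUARTER": f"{year:04d}{(month - 1) // 3 + 1}",
        "0CALYEAR": f"{year:04d}",
    }

print(derive("M201209")["0DATETO"])  # 20120930
```

A string such as ABCM201302 fails the anchored pattern and would stop the upload, which mirrors the behavior of the limited ABAP expression described above.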
3.2.2.11 Revising Hierarchy Descriptions
Use
You can revise and change the source descriptions for the uploaded hierarchy data, if necessary.
Prerequisites
You have mapped external fields for a data delivery agreement.
Activities
1. On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Revise Hierarchy Descriptions or use transaction Revise Hierarchy Descriptions /DDF/MR_HIER.
2. Select the data delivery agreement you want to use.
3. (Optional) On subview Revise Hierarchy Descriptions, overwrite the name or the description of the hierarchy that is displayed in the InfoObject of the corresponding dimension (product or location). Determine which hierarchies should be transferred to the InfoObject of the corresponding dimension.
4. On subview Revise Level Descriptions, overwrite the description of the hierarchy levels provided in the source data.
5. On subview Display Hierarchy History and Display Level History, check for old, inactive versions of a hierarchy or hierarchy level. The Data Delivery ID column is empty for active hierarchies. For inactive hierarchies, it contains the delivery ID that deactivated it, for example, because this hierarchy is no longer part of the delivered files.
Example
Transfer only some hierarchies and change descriptions
You receive three product hierarchies. However, only two of them should be transferred to /DDF/PRODUCT. You therefore disable the transfer for the third hierarchy. In addition, you change the descriptions of the hierarchy levels of the two relevant hierarchies.
More Information
Mapping External Fields for Retail Panel Data [page 143]
Hierarchy Upload [page 135]
Priority Calculation [page 136]
3.2.2.12 Revising Hierarchy Levels
Use
You can revise and change the numerical levels of the source data for the uploaded hierarchy data, if necessary.
Prerequisites
You have mapped external fields for market research data.
Activities
1. On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Revise Hierarchy Levels or use transaction Revise Hierarchy Levels /DDF/MR_PRIO.
2. Select the context you want to use.
3. On subview Revise Product Hierarchy Level or Revise Location Hierarchy Level, change the source numeric hierarchy level for a complete level of the product or location hierarchy. You can enter a user-defined value or add an offset to adapt the determination of numeric levels as a precondition for correct priority calculation.
4. On subview Define Level for Specific Product or Define Level for Specific Location, change the source numeric levels not for a complete hierarchy level but for a single product or location by adding a user-defined value or an offset.
Example
Change Numeric Hierarchy Levels for a Complete Hierarchy
You receive two market hierarchies that have semantically the same top node. The first market hierarchy has lower levels separating the markets into regions and subregions. The other hierarchy separates the markets into different channels, for example drugstore, discounter, and supermarkets with different store sizes. To ensure correct reporting based on the pre-aggregated totals of those hierarchies, you can shift all levels of one hierarchy so that the top nodes of both hierarchies are not aggregated. Only the node with the higher priority (lower numeric hierarchy level) is taken into account.
Change Numeric Hierarchy Level for a Single Dimension Element
You receive retail panel data for 10 markets without any hierarchy. However, you know that the markets have a hierarchical dependency and you would like to report on the pre-aggregated totals accordingly. In this case, you can manually define a numeric level for each market.
More Information
Mapping External Fields for Retail Panel Data [page 143]
Revising Hierarchy Descriptions [page 150]
Hierarchy Upload [page 135]
Priority Calculation [page 136]
3.2.2.13 Defining Product Level Descriptions
Use
You can define descriptive names for product levels. You use product level descriptions in global reporting, especially in the Define Consolidation user interface.
Prerequisites
You have mapped external fields for a data delivery agreement.
Activities
1. (Optional) Run the report Copy Product Level Descriptions /DDF/HIELVL2PLEVEL to copy hierarchy level descriptions that are defined for a specific data delivery agreement:
   1. On the SAP Easy Access screen, choose SAP Menu Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Copy Product Level Descriptions, or use transaction Copy Product Level Descriptions /DDF/MR_COPYLVL.
   2. Specify the context and the language, and execute the report.
2. On the SAP Easy Access screen, choose SAP Menu Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Define Product Level Descriptions, or use transaction Define Product Level Descriptions /DDF/MR_LVL.
3. Select the context.
4. On the subview Define Product Hierarchy Level, define the descriptions.
5. Start the process chain Stage Market Research Product Hierarchy Level Texts /DDF/PHLVL_STAGE using the Data Warehousing Workbench (transaction RSPC) to make sure that the texts are transferred to InfoObject /DDF/PHLVL and are visible on the user interface.
More Information
Mapping External Fields for Retail Panel Data [page 143]
Revising Hierarchy Descriptions [page 150]
Revising Hierarchy Levels [page 152]
Hierarchy Upload [page 135]
Priority Calculation [page 136]
BI content documentation: (/DDF/PHLVL_STAGE)
3.2.3 Configuring the Data Upload for Retailer Data
Use
Note
If a new delivery ID has been pushed into a persistent staging area (PSA), the scanner detects this and creates an entry in the table so that you can choose a PSA-specific template from which you can then create new data delivery agreements.
1. Create a folder structure with an inbound folder for the new data delivery agreements in accordance with the required file path.
Note
Define folder names containing 10 or fewer characters, as the name of a data delivery agreement is limited to 10 characters.
2. Place data deliveries in the inbound folders so that the new folder can be detected.
3. Create a data delivery agreement in one of the following ways:
○ In the UI for Manage Agreements.
○ In the SAP User Menu under Administrator Data Upload Data Delivery Agreements Define Data Delivery Agreements (transaction /DDF/DDAGR).
4. Complete the following steps for the data delivery agreement:
○ Define a file set.
○ Define the file format.
○ Create the mapping for external fields.
5. In the UI for Manage Agreements, choose New Master Data to create and assign the following new master data to the data delivery agreement:
○ Data provider
○ Data origin
○ Context
○ Region
More Information
Example: Configurable CSV Format for Uploading POS Data [page 155]
3.2.3.1 Example: Configurable CSV Format for Uploading POS Data
Procedure
1. Create a process definition specifically for configurable CSV formatted files in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Define Processes and Steps. Use the following direct update process chains for POS data:
Example: Process Definition CSF_POS

Step No.  Description                                   Process Chain ID     Activate
101       Extract Locations from CSV File               /DDF/LOC_POS         Yes
102       Extract Products from CSV File                /DDF/PROD_POS
103       Extract Location Attributes NV from CSV File  /DDF/LOC_NV_POS
104       Extract Product Attributes NV from CSV File   /DDF/PROD_NV_POS
105       Extract Aggregated Sales from CSV File        /DDF/SALES_POS
106       Extract Stock Snapshot from CSV File          /DDF/STOCK_POS
201       Stage Locations with PFC                      /DDF/LOC_STAGE_2
202       Product Staging (Direct Update)               /DDF/PROD_STAGE_2
203       Stage Products with PFC                       /DDF/LOC_NV_STAGE
204       Stage Products Name Value Pairs with PFC      /DDF/PROD_NV_STAGE
205       Stage Aggregated Sales with PFC               /DDF/SALES_STAGE
206       Stage Stock Snapshot with PFC                 /DDF/STOCK_STAGE
2. Create a data delivery agreement specifically for configurable CSV formatted files in the SAP Easy Access Menu under Administrator Data Upload Define Data Delivery Agreements (transaction /DDF/DDAGR):
○ Select the agreement type Retailer Data.
○ Select the data format Configurable CSV.
○ Deselect the Activate Agreement checkbox.
3. Define the data sets for the data delivery agreement.
Note
It is mandatory to assign DataSources for data delivery agreements that use the configurable CSV format.
Example: Data Set Definition for Agreement CSF_POS

Data Set Type  Description                         Step No.  DataSource Name
L_ATTR_COL     Location Attribute Column           101       /DDF/LOC_ATTR_COL_POS
L_ATTR_NV      Location Attribute Name/Value Pair  102       /DDF/LOC_ATTR_NV_POS
POS_SALES      POS Sales Data                      103       /DDF/SALES_POS
POS_STOCK      POS Stock Data                      104       /DDF/STOCK_POS
P_ATTR_COL     Product Attribute Column            105       /DDF/PROD_ATTR_COL_POS
P_ATTR_NV      Product Attribute Name/Value Pair   106       /DDF/PROD_ATTR_NV_POS
4. Create a folder structure on the application server for the new data delivery agreement with read/write access and place the files in the inbound folder.
5. Configure the file set by completing the following steps:
○ Go to the UI Manage Agreements and select the new data delivery agreement CSF_POS.
○ In Edit mode, choose Insert.
○ Define the following attributes for the file set and the individual files that belong to the file set:

Note
Change the file name pattern to ++++ before selecting the file row in the table of the file set tab page.
○ File descriptions
○ File separators
○ Number of header rows
○ Data set types for which the file is relevant
For example, you assign the data set types P_ATTR_COL and P_ATTR_NV to a product file.
The file preview helps you to identify the file separator, escape characters, number of header lines, decimal separators, and thousand separators.
For more information about these attributes, see the individual field help on the UI.
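The effect of these file-set attributes can be illustrated with a short, hypothetical Python sketch that skips the configured header rows and splits columns at the configured separator. The file content and parameter names are invented for the example; this is not SAP code.

```python
# Hypothetical sketch of what the file-set attributes control when a
# configurable CSV file is read: separator, escape character, and the
# number of header rows to skip.
import csv

def read_configurable_csv(raw, separator=";", escape=None, header_rows=1):
    """Skip `header_rows` lines, then parse the rest with the given separator."""
    lines = raw.splitlines()[header_rows:]
    reader = csv.reader(lines, delimiter=separator, escapechar=escape)
    return [row for row in reader]

# Invented sample file with one header row and ';' as separator.
raw = "PRODUCT;DESCRIPTION;PRICE\nP100;Shampoo;2,99\nP200;Soap;1,49\n"
rows = read_configurable_csv(raw, separator=";", header_rows=1)
print(rows[0])  # ['P100', 'Shampoo', '2,99']
```

Note that the decimal comma in PRICE is kept as text here; in the system, the decimal and thousand separators are handled by the corresponding file-set attributes.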
6. Configure the file format on the File Format tab page by selecting the files you want to configure.
You can change the external field names, which act like a key and are used similarly to the attribute name (ATTR_NAME) in name/value pairs. Select the attributes that you want to include in the name/value pair extraction.
7. Configure the mapping by selecting a data set type and then choosing one of the following mapping options:
○ External field name: the system assigns the value that comes directly from the incoming source file
○ Fixed attribute value: you can enter a fixed value for the system to use
○ Custom attribute value: the system uses a calculated value, which comes from the implementation of the Business Add-In BAdI: Mapping Enhancements (/DDF/BADI_ADU_EXTR_MAPPING)
8. Activate the data delivery agreement.
9. Start the folder scanner and the process dispatcher.
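The three mapping options can be pictured with a small Python sketch (not SAP code): the target value either comes from an external field of the source row, is a fixed value, or is calculated, which is the role the BAdI implementation plays in the system. All field names below are invented for illustration.

```python
# Illustrative sketch of the three mapping options for a data set type.

def map_row(source_row, mapping):
    """mapping: target field -> ("external", field) | ("fixed", value) | ("custom", fn)."""
    target = {}
    for field, (kind, spec) in mapping.items():
        if kind == "external":
            target[field] = source_row[spec]   # value straight from the source file
        elif kind == "fixed":
            target[field] = spec               # constant entered by the user
        elif kind == "custom":
            target[field] = spec(source_row)   # calculated value (BAdI role)
    return target

row = {"STORE": "1000", "QTY": "5"}
mapping = {
    "LOCATION_ID": ("external", "STORE"),
    "SALES_ORG":   ("fixed", "ORG1"),
    "QTY_INT":     ("custom", lambda r: int(r["QTY"])),
}
print(map_row(row, mapping))  # {'LOCATION_ID': '1000', 'SALES_ORG': 'ORG1', 'QTY_INT': 5}
```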
Result
Check your settings as described in Example: Checking the Set-Up for Configurable CSV Format (POS) [page 157].
3.2.3.2 Example: Checking the Set-Up for Configurable CSV Format (POS)
Note
This check is optional.
The system does not let you activate data delivery agreements for which there are Customizing inconsistencies. Therefore, you can run the report Check Consistency of Configuration for Data Upload (/DDF/ADU_CUST_CONSIST_CHECK) to check for any inconsistencies and correct them so that you are able to activate the data delivery agreement.
1. Activate the data delivery agreement in the UI for Manage Agreements.
2. Start the folder scanner and the process scheduler.
3. Make sure that both the folder scanner and process scheduler jobs are running in the Monitor Jobs under Status Overview.
4. Monitor the process in the Monitor Deliveries UI to ensure that the upload process is successful.
5. In the test system, in the Monitor Deliveries UI, search for the data delivery agreement and monitor the last process until all its steps have been processed and the data delivery agreement process has the status Completed.
6. Make a note of the process ID. Then check the uploaded data in the acquisition DataStore objects (DSOs). Once the process chains have been processed and have the status Completed, the data has already been uploaded to SAP Business Warehouse (SAP BW).
7. Go to the Data Warehousing Workbench: Modeling (transaction RSA1).
8. Select the InfoProvider node and expand the node Demand Data Foundation to perform the checks on the following DSOs:
DSOs for Checking

Technical Name  Description
/DDF/DS21       Location Acquisition
/DDF/DS41       Product Acquisition
/DDF/DS05       Time String Acquisition
/DDF/DS01       Aggregated Sales Acquisition
/DDF/DS02       Stock Snapshot Acquisition

To perform the checks, select each DSO and right-click to select Display Data.
9. Enter the process ID you noted previously with preceding 0000's.
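The process ID is stored in the DSOs as a zero-padded string. The following minimal sketch pads a noted process ID before using it as a selection value; the total width of 10 characters is an assumption made for the example, not a documented SAP field length.

```python
# Illustrative helper (not SAP code): pad a noted process ID with leading
# zeros before entering it as a filter value. The width of 10 characters
# is an assumption for this example.

def pad_process_id(process_id, width=10):
    return str(process_id).zfill(width)

print(pad_process_id(4711))  # 0000004711
```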
3.2.3.3 DataSources and Extract Structures for Retailer (POS) Data
Use
In SAP BW, you need to distinguish between different types of information.
When you extract POS data using the configurable CSV format, there is a DataSource for each data set. The following table shows which DataSources are linked with which extract structures. All of these DataSources use the function module /DDF/ADU_SAPDX_EXTRACT_GENERIC.
DataSource              Description                             Extract Structure         Required Data Format
/DDF/PROD_ATTR_COL_POS  Product Attribute Column POS            /DDF/S_POS_PROD_ATTR_COL  Configurable CSV
/DDF/PROD_ATTR_NV_POS   Product Attribute Name/Value Pair POS   /DDF/S_POS_PROD_ATTR_NV   Configurable CSV
/DDF/LOC_ATTR_COL_POS   Location Attribute Column POS           /DDF/S_POS_LOC_ATTR_COL   Configurable CSV
/DDF/LOC_ATTR_NV_POS    Location Attribute Name/Value Pair POS  /DDF/S_POS_LOC_ATTR_NV    Configurable CSV
/DDF/SALES_POS          Sales POS                               /DDF/S_POS_SALES          Configurable CSV
/DDF/STOCK_POS          Stock POS                               /DDF/S_POS_STOCK          Configurable CSV
More Information
DataSources and Extract Structures for Retail Panel Data [page 139]
3.2.3.4 Defining File Sets
Procedure
1. In the Manage Agreements UI, search for the data delivery agreement that you want to configure.
This data delivery agreement should be inactive, have no mapping, and have configurable CSV format as the assigned data format.
2. Make the following settings using the input help for each of the files listed in the tab page for file sets:
○ Adjust the file name pattern that the system proposes.
○ Enter a description.
○ Assign data set types.
○ Enter separators.
○ Enter the number of header rows that should be skipped before the system starts reading the file data.
3. (Optional) Enter the following:
○ Escape character
○ Optional
○ Code page
○ Replacement character
More Information
For more information, see the field help on the UI.
3.2.3.5 Defining the File Format
1. In the UI for Manage Agreements, after configuring the files in the file set, go to the File Format tab page.
All the original external fields from the file are listed in the table together with proposals for external fields. The proposals for external fields are created by the system automatically. The external fields are cleansed versions of the original external fields, which you can adjust yourself.
Note
The external field is used in further steps such as mapping and data harmonization. The external field must, therefore, pass the following system checks that are executed upon saving:
○ Cannot have an empty value
○ Cannot be duplicated in the same file
○ The maximum field length is 30 characters
○ The field name contains valid characters for harmonization purposes
○ The field name uses upper case characters
Any of the following characters are considered to be valid: <ABCDEFGHIJKLMNOPQRSTUVWXYZ_0123456789#$%&*-/;<=>?@^{|}>
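These checks can be expressed compactly in Python. The character set below is taken from the list above; the function itself is an illustrative sketch of the documented rules, not SAP code.

```python
# Sketch of the saving-time checks on external field names described in
# the Note above. The valid character set comes from the documentation;
# the function and message texts are invented for illustration.

VALID_CHARS = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ_0123456789#$%&*-/;<=>?@^{|}")

def check_external_fields(fields, max_len=30):
    """Return a list of (field, problem) pairs; an empty list means all checks pass."""
    problems = []
    seen = set()
    for f in fields:
        if not f:
            problems.append((f, "empty value"))
            continue
        if f in seen:
            problems.append((f, "duplicated in the same file"))
        seen.add(f)
        if len(f) > max_len:
            problems.append((f, "longer than 30 characters"))
        if f != f.upper():
            problems.append((f, "not upper case"))
        elif not set(f) <= VALID_CHARS:
            problems.append((f, "invalid characters"))
    return problems

print(check_external_fields(["ATTR_NAME", "PRICE"]))  # []
print(check_external_fields(["price"]))               # [('price', 'not upper case')]
```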
2. The Original External Field, which comes from the incoming source files, is displayed for information purposes and cannot be edited. However, you can edit the External Field, which is filled automatically with a cleansed version of the original external field.
3. Select a conversion exit for localization of formatting.
When you select a specific file in a file set, the file preview shows the first 20 values of the selected file in table format.
3.2.3.6 Mapping Configurable CSV Formatted Files
1. In the UI for Manage Agreements, go to the Mapping tab page and select a data set type that is not used for name/value pairs.
2. Select one of the following mapping options for each row in the table:
○ Attribute Value (Custom)
○ External Field Value
○ Fixed Value
For more information about these mapping options, see the individual field help.
3. Select a name/value pair data set type.
Note
Check the following when creating the mapping for name/value pair-related data set types:
○ All the attributes that have been marked as included for the name/value pair data set type are present in the attribute mapping for name/value pairs table
Check the following when creating the mapping for any data set types:
○ Only one assignment type has been made for each field
○ At least one key field has been mapped
○ The BAdI: Mapping Enhancements has been implemented if you are assigning the Attribute Value (Custom) mapping option
4. Save your changes.
3.2.3.7 Handling Changes to Incoming Configurable CSV Formatted Files
The example scenarios below describe what you should do in the UI for Manage Agreements when changes are made to the configurable CSV formatted files.
Select the relevant data delivery agreement and click Edit.
Define a New File

New File Added to Data Delivery

File Set:
1. Click Insert.
2. Enter the file description.
3. Modify the file name pattern, assign data set types and separators, and make all other entries necessary for the new file.

File Format:
1. Select the file.
2. The original external field names are displayed in sequence.
3. The system proposes external field names that you can overwrite.

Mapping:
1. Select the data set type.
2. Select the external field names.

Mapping Name/Value Pairs:
1. Select the data set type.
2. Under Attribute Mapping for Name/Value Pairs, the system displays all the external field names in the same order as they are displayed in the File Format tab page.
Delete an Existing File

File No Longer Included in Data Delivery

File Set:
1. Click Delete.
2. The file is removed from the list.
3. Click Save.

File Format:
The file is no longer available for selection.

Mapping:
When you select the data set type, the external field names of the deleted file are no longer available and the mapping is no longer available.

Mapping Name/Value Pairs:
When you select the data set type, the attribute external field names of the deleted file are no longer available and the mapping is no longer available.
Change File Format

Changes to File Attributes (External Field Names)

File Format:
1. Select the file whose attributes you need to update.
2. Click Update External Fields.
3. A pop-up displays the new external field names that will be added to the file and the current external field names that will be disabled. Confirm or discard the proposed updates to external field names in the pop-up.

Mapping:
1. Select the data set type.
2. Map the newly updated external field names.

Mapping Name/Value Pairs:
1. Select the data set type.
2. Under Attribute Mapping for Name/Value Pairs, the system displays all the external field names in the same order as they are displayed in the File Format tab page.
Removal of File Attributes (External Field Names)

File Format:
1. Select the file whose attributes you need to update.
2. Click Update External Fields.
3. A pop-up displays the new external field names that will be added to the file and the current external field names that will be disabled. Confirm or discard the proposed updates to external field names in the pop-up.

Mapping:
1. Select the data set type.
2. Mapped extractor fields that were assigned before to these external fields are unassigned, and the external field names are no longer available for selection.

Mapping Name/Value Pairs:
1. Select the data set type.
2. Under Attribute Mapping for Name/Value Pairs, the system displays all the external field names in the same order as they are displayed in the File Format tab page, and the deleted external field names are removed.
Disable External Field Names

File Format:
1. Select a field in the list and select Disable, or choose Update External Fields to have the system automatically detect that a previously present field is no longer present in the new file.
2. Click Update Preview; the file preview no longer contains the disabled external field name.

Note
If there are more columns in the actual file than in the file format, the system displays the header columns from the file as columns in the preview of the file.

Mapping:
1. Select the data set type.
2. Existing assignments of extractor fields to external fields are renamed.

Mapping Name/Value Pairs:
1. Select the data set type.
2. Under Attribute Mapping for Name/Value Pairs, the system displays all the external field names in the same order as they are displayed in the File Format tab page, and the disabled external field names are removed.
Exclude External Field Names From Name/Value Pairs

File Format:
1. Select the file whose attributes you need to update.
2. Set the relevant field to Not Included.

Mapping Name/Value Pairs:
1. Select the data set type.
2. Under Attribute Mapping for Name/Value Pairs, the system displays all the external field names in the same order as they are displayed in the File Format tab page, and the excluded external field names are removed.
Re-Enable External Field Names

File Format:
1. Select a field in the list and deselect Disable, or choose Update External Fields to have the system automatically detect that a previously disabled field is now present in the new file.
2. Click Update Preview; the file preview contains the re-enabled external field name.

Mapping:
1. Select the data set type.
2. The re-enabled external field names are available again for selection.

Mapping Name/Value Pairs:
1. Select the data set type.
2. Under Attribute Mapping for Name/Value Pairs, the system displays all the external field names in the same order as they are displayed in the File Format tab page, and the re-enabled external field names are available again.
Modify External Field Names

File Format:
1. Change the proposed external field name.
2. Click Update Preview; the file preview contains the modified external field name.

Mapping:
1. Select the data set type.
2. The modified external field names are available for selection.

Mapping Name/Value Pairs:
1. Select the data set type.
2. Under Attribute Mapping for Name/Value Pairs, the system displays all the external field names in the same order as they are displayed in the File Format tab page, and the modified external field names are available.
3.2.4 One-Time Uploads Before Recurring Regular Data Uploads Are Started
Upload of Aggregated Sales or Stock Snapshot Data from Retailers
If no time master data is available in the propagation layer, time characteristics cannot be assigned properly to transactional data. Therefore, point-of-sale data from retailers is not uploaded from the acquisition layer to the propagation layer.
Before you upload aggregated sales or stock snapshot data from retailers, you upload time master data to the /DDF/TIME InfoObject as follows:
1. You define an InfoPackage with generic time data (daily and weekly) for DataSource /DDF/TIME_ATTR.
You specify the period of time for which generic time data with appropriate time characteristics as attributes is required in reporting by using the fields VALIDFROM and VALIDTO. We recommend doing this at
the same time that you set up year-specific fiscal year variants, and that it covers a similar number of years into the future.
2. You execute the InfoPackage.
3. You execute the /DDF/TIME process chain to upload the time master data from the persistent staging area to the /DDF/TIME InfoObject.
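Conceptually, the time master data provides one record per day of the VALIDFROM–VALIDTO period, carrying derived time characteristics as attributes. The following Python sketch illustrates this idea; the field names are invented for the example and do not claim to match the attribute names of InfoObject /DDF/TIME.

```python
# Simplified illustration (not SAP code) of generic time master data:
# one record per day between VALIDFROM and VALIDTO with derived time
# characteristics (week, month, year). Field names are invented.
from datetime import date, timedelta

def generate_time_rows(valid_from, valid_to):
    rows = []
    day = valid_from
    while day <= valid_to:
        iso_year, iso_week, _ = day.isocalendar()
        rows.append({
            "DATE":  day.strftime("%Y%m%d"),
            "WEEK":  f"{iso_year}{iso_week:02d}",  # calendar year/week
            "MONTH": day.strftime("%Y%m"),
            "YEAR":  day.strftime("%Y"),
        })
        day += timedelta(days=1)
    return rows

rows = generate_time_rows(date(2018, 1, 1), date(2018, 1, 3))
print(rows[0]["WEEK"])  # 201801
```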
Upload of Aggregated Sales Data from Retailers
Before you upload aggregated sales data from retailers, you upload the types of sales data that are defined in Customizing under Cross-Application Components Demand Data Foundation Data Model Define Types of Sales Data to the /DDF/SALGRP InfoObject as follows:
1. You execute the Type of Sales Data Attributes InfoPackage.
2. You execute the Type of Sales Data Texts InfoPackage.
3. You execute the /DDF/SALGRP_ATTR -> /DDF/SALGRP data transfer process.
4. You execute the /DDF/SALGRP_TEXT -> /DDF/SALGRP data transfer process.
Upload of Stock Snapshot Data from Retailers
Before you upload stock snapshot data from retailers, you upload the stock types that are defined in Customizing under Cross-Application Components Demand Data Foundation Data Model Define Stock Types to the /DDF/STOCKTYPE InfoObject as follows:
1. You execute the Stock Types InfoPackage.
2. You execute the /DDF/STOCKTYPE_TEXT -> /DDF/STOCKTYPE data transfer process.
Upload of Market Research Data
If you have defined regions for market research data, you must upload these to the /DDF/MRCOUNTRY InfoObject before you can upload market research data.
1. You create an InfoPackage for DataSource /DDF/MRCOUNTRY_TEXT.
2. You execute the data transfer process /DDF/MRCOUNTRY_TEXT -> /DDF/MRCOUNTRY.
If you use global reporting of market research data, you assign countries as attributes of global locations. In order to be able to report retail panel data by these countries, you have to upload the country texts as follows:
1. If the data flow below InfoObject 0COUNTRY does not appear in the list of InfoSources in transaction RSA1, you first have to install the transfer structure of the corresponding data flow from the content. To do so, proceed as follows:
   1. You migrate DataSource 0COUNTRY_TEXT using transaction RSDS.
   2. In transaction RSOR, you choose Transfer Rules under the 3.x Types in the list of all object types.
   3. You select DataSource 0COUNTRY_TEXT and make sure that you selected Only Necessary Objects as the grouping criterion.
   4. You choose Install.
2. In transaction RSA1, you create an InfoPackage for DataSource 0COUNTRY_TEXT. On the Processing tab, you choose PSA and then into InfoObject (Package by Package).
3. You schedule the InfoPackage with update method Full Update.
Upload of Transactional Data (POS Data or Market Research Data)
If you want to use semantic partitioning of DataStore objects by defining partitions for semantically partitioned objects (SPO), you first have to upload the texts of the partition values to InfoObject /DDF/DDA_PART.
Execute the data transfer process /DDF/PARTVAL_TEXT/SRDCLNT001 to /DDF/DDA_PART.
3.2.5 Setting Up Automatic Upload with Folder Scanner
Use
You use the folder scanner to detect new file-based data automatically.
Activities
1. Allocate the appropriate disk space for the volume of incoming data you expect to receive.
2. Configure the file system and the folders to enable the correct file movement as files move through the inbound, process, and archive folders.
3. Check that all the file directories have the appropriate access, modification, and execution privileges (for example, read, copy, rename, and so on).
4. Configure a data delivery agreement to set how the system should detect and upload new files.
5. Define an active data delivery agreement with the data upload method Automatic Upload with Folder Scanner and the data format Comma-separated values (CSV) or Self-descriptive exchange format (SDX). Only files for active data delivery agreements are detected.
6. Configure the folder scanner job and the process scheduler in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings Configure Jobs and ensure that the jobs have the status Started.
3.2.5.1 Scanning Static CSV Files
When the folder scanner scans the inbound folder, it identifies which data deliveries are complete by checking that all the mandatory Comma-Separated Values (CSV) files are present.
It detects which files are mandatory from the settings for file sets and files. Those files for which the Optional checkbox has not been selected in the file definition Customizing are considered by the folder scanner to be mandatory and must therefore be present for the delivery to be considered as complete.
Files that are associated with steps in the file definition Customizing are moved to the file processing folder. Files that are not associated with a step in the file definition Customizing are skipped; their status is changed from New to Skipped. For example, files with a file extension that marks them as being completely uploaded or finished (such as .fin) never change to the status Ready.
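The completeness and skipping rules above can be mimicked with a short Python sketch. It is not the SAP implementation; the file names, the file-definition structure, and the status values used for steps are modeled on the description for illustration only.

```python
# Conceptual sketch of the folder-scanner checks: a delivery is complete
# only when all files marked as mandatory in the file definition are
# present, and files without an assigned step are skipped.

def classify_delivery(present_files, file_definitions):
    """file_definitions: name -> {"optional": bool, "step": int or None}."""
    mandatory = {n for n, d in file_definitions.items() if not d["optional"]}
    complete = mandatory <= set(present_files)
    status = {}
    for name in present_files:
        definition = file_definitions.get(name)
        if definition is None or definition["step"] is None:
            status[name] = "Skipped"   # no step assigned, e.g. a .fin marker file
        elif complete:
            status[name] = "Ready"
        else:
            status[name] = "New"       # wait until the delivery is complete
    return complete, status

# Invented file definitions for the example.
defs = {
    "products.csv": {"optional": False, "step": 101},
    "sales.csv":    {"optional": False, "step": 105},
    "done.fin":     {"optional": True,  "step": None},
}
complete, status = classify_delivery(["products.csv", "done.fin"], defs)
print(complete, status["done.fin"])  # False Skipped
```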
3.2.5.2 Scanning SDX Files
Use
When the folder scanner scans the inbound folder, it identifies which data deliveries are complete by checking that all the mandatory files are present.
When data is delivered in the self-descriptive exchange (SDX) format, the folder scanner detects a data delivery as being complete when the metafile is present along with all the files described in the metafile.
A valid metafile must have a format and a dimension file.
Files in the SDX format are not associated with any steps in the file definition.
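The SDX completeness rule reduces to a simple set check: the metafile must be present, and so must every file the metafile describes. The sketch below uses invented file names; it is an illustration, not SAP code.

```python
# Minimal sketch of the SDX completeness rule described above.

def sdx_delivery_complete(present_files, metafile_name, described_files):
    """True when the metafile and all files it describes are present."""
    present = set(present_files)
    return metafile_name in present and set(described_files) <= present

# Invented example: the metafile describes two files, one is still missing.
print(sdx_delivery_complete(
    ["meta.xml", "format.csv"], "meta.xml",
    ["format.csv", "dimension.csv"]))  # False
```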
More Information
Self-Descriptive Exchange Format [page 134]
3.2.5.3 Example: Folder Scanner Failure
The system may not be able to detect and upload incoming data deliveries for various reasons. Here is a checklist of possible reasons for data detection failure:
Reason: You have insufficient disk space.
Action Required: Running out of disk space can prevent the load manager from completing operations such as data detection. You should free disk space, for example, by deleting files. For more information about archiving, deletion, and housekeeping, see Housekeeping [page 91].

Reason: The file system and folders have not been configured correctly in the following Customizing settings:
● Logical File Path (transaction FILE)
● Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings General Settings
Action Required: For more information about how to set up the file system and folders correctly, see the following documentation:
● Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings General Settings
● SAP Demand Signal Management Application Operations Guide on SAP Service Marketplace

Reason: The jobs are not running.
Action Required:
● Check your Customizing settings for the daemon framework in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings Configure Jobs.
● Check the Monitor Jobs to determine the job statuses.

Reason: The same data delivery comes twice.
Action Required: The data is detected as duplicate data and an error message is logged in the Monitor Deliveries UI. The duplicate files are not uploaded.

Reason: The files do not match the file name pattern.
Action Required: If you configured a pattern for your files (for example, DOC++++++.csv), files with an incorrect pattern are not processed. The file name pattern must be adjusted manually. Check your file name pattern in the settings for files and file sets.

Reason: The same file name pattern was used twice.
Action Required: If the delivery instance table already contains a delivery with file name pattern 12345 and a second data delivery arrives for the same data delivery agreement with the file name pattern 12345, the load manager ignores the data delivery. Only one data delivery with the file name pattern that matches the file name pattern associated with the relevant data delivery agreement is allowed.

Reason: The files are detected and marked as being ready for upload. Some of the files are uploaded successfully. However, one of the remaining files fails to be uploaded.
Action Required: Not all mandatory files were delivered. No more files can be uploaded until all mandatory files are present. The scanner keeps checking for a complete data delivery.

Reason: Missing authorization.
Action Required: Contact your system administrator to gain full access for all the sub-folders that are relevant for the data delivery.
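For the file name pattern reason above: the pattern uses + as a single-character placeholder (for example, DOC++++++.csv). The following Python sketch shows one plausible way to test file names against such a pattern; the exact matching rules of the load manager may differ, so treat this as an illustration only.

```python
# Hedged sketch (not SAP code): translate a file name pattern in which '+'
# stands for exactly one character into a regular expression.
import re

def matches_pattern(pattern, filename):
    """Translate each '+' to 'exactly one character' and anchor the match."""
    regex = "".join("." if c == "+" else re.escape(c) for c in pattern)
    return re.fullmatch(regex, filename) is not None

print(matches_pattern("DOC++++++.csv", "DOC181130.csv"))  # True
print(matches_pattern("DOC++++++.csv", "INV181130.csv"))  # False
```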
3.2.6 Setting Up Automatic Upload with PSA Scanner
Use
You use the Persistent Staging Area (PSA) scanner to detect new data in the Service Delivery PSA.
Activities
1. Define an active data delivery agreement with the data upload method Automatic Upload with PSA Scanner and the data format Tables.
2. Define a delivery DataSource for the data delivery agreement.
If you do not define a delivery DataSource name, then the default delivery DataSource is used to detect any new data deliveries for this data delivery agreement. The default delivery DataSource for the parameter DELIVERY_INFO_DATASOURCE_NAME is defined in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings General Settings.
3. Configure the PSA scanner job in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings Configure Jobs and ensure that the jobs have the status Started.
You can configure how frequently you want the PSA scanner to scan the Service Delivery PSA for new data.
3.2.7 Manual Data Upload
Use
You upload data manually in the following cases:
● You want to upload company-internal data from an ERP system like SAP ERP
● The folder or Persistent Staging Area (PSA) scanner fails
● You want to do a test upload for data from a new DataSource, such as a new retailer
SAP NetWeaver Business Warehouse (SAP NetWeaver BW) allows you to upload data by executing the appropriate InfoPackage manually.
Prerequisites
You have configured the following:
● An active data delivery agreement
You may wish to define a separate data delivery agreement for each individual data set: one agreement for product, one agreement for location, and one for sales data.
You may also reuse an existing data delivery agreement.
● The process scheduler job
Ensure that it has the status In Process so that subsequent steps can be triggered automatically by the process scheduler.
● The process definition
Ensure that it has a step with a process chain assigned for transferring data from the Acquisition DataStore object (DSO) to the Propagation DSO.
● The step definition
You must assign the step to an upload stage for manual uploads. Ensure that you select the correct upload stage setting, as this setting determines which steps should and should not be instantiated in the manual upload.
● If you upload data by manually triggering a process chain and you want the subsequent steps to be triggered automatically by the process flow control (PFC), you must assign the process type /DDF/STATU at the end of the process chain.
● If you want to use the same data delivery agreement for a manual upload as you have used for an automatic upload with the folder scanner, you have to define upload stages in Customizing.
Process
1. You log on to SAP NetWeaver BW Data Warehousing Workbench (transaction RSPC) and navigate to the appropriate InfoPackage.
2. You trigger the data upload by doing one of the following:
○ You execute the InfoPackage to trigger the data upload to the relevant PSA.
If you upload a file, you must include the file name in the InfoPackage file name routine. If you upload company-internal data from SAP ERP, you use an SAP ERP DataSource. The PSA gets the relevant data delivery agreement from the file or SAP ERP.
○ You execute the Data Transfer Process (DTP) to do one of the following:
  ○ Bring the data from a PSA request to the Acquisition DSO.
  ○ Bring the data from any DSO to the Acquisition DSO.
○ You execute the process chain that brings the data to the Acquisition DSO.
The process chain must have the process type /DDF/STATU at the end of it.
3. The system creates a delivery ID and a process ID.
4. The system also creates an additional step with step number 000000 and initial status Manual.
This status is for the step only and is not propagated. While the step has the status Manual, you can find it in the Monitor Deliveries UI under the status category Running for the related process.
If you did the manual upload without a process chain, then this step has neither a process chain nor a process chain log ID. You must set the status manually to Completed, as there is no PFC interaction.
If you used a process chain in the manual upload, then this step is instantiated with a process chain and a process chain log ID. The process type /DDF/STATU sets the step status to Completed.
5. The system creates the subsequent steps.
Using the delivery and process created in the previous step, the subsequent steps, which are assigned to upload stages other than Data Acquisition, are created with the status Ready. The PFC executes the subsequent steps once the additional step (with step number 000000 and initial status Manual) has the status Completed.
6. You check in the Monitor Deliveries UI whether the upload was successful.
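Steps 3 to 5 of this procedure can be sketched as follows. This is an illustrative Python model of the described behavior, not SAP code; the function and field names are invented for the example.

```python
# Illustrative model of manual-upload step instantiation (not SAP code).
# An additional step 000000 is created with status "Manual"; steps assigned to
# upload stages other than "Data Acquisition" are created with status "Ready".

def create_manual_delivery(step_definitions):
    """step_definitions: list of (step_name, upload_stage) pairs
    taken from the process definition."""
    steps = {"000000": {"stage": None, "status": "Manual"}}
    for name, stage in step_definitions:
        if stage != "Data Acquisition":  # acquisition steps are not instantiated
            steps[name] = {"stage": stage, "status": "Ready"}
    return steps

def pfc_can_start_subsequent_steps(steps):
    """The PFC runs the Ready steps only once step 000000 is Completed."""
    return steps["000000"]["status"] == "Completed"

# A manual upload of the product file: only the non-acquisition steps
# are instantiated alongside the manual step 000000.
steps = create_manual_delivery([
    ("PRODUCT_FILE", "Data Acquisition"),
    ("LOCATION_FILE", "Data Acquisition"),
    ("PRODUCT_STAGE", "Quality Validation"),
    ("LOCATION_STAGE", "Data Propagation"),
])
```

Once the manual step is set to Completed (either manually or by /DDF/STATU), `pfc_can_start_subsequent_steps` returns True and the Ready steps may run.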
Example
If you want to use the same data delivery agreement for a manual upload as you have used for an automatic upload with the folder scanner, you have to define upload stages in Customizing as displayed in the following table.
Attribute | Value
Agreement Type | Retailer Data
Upload Method | Automatic Upload With Folder Scanner
Data Format | Comma-Separated Values
The process definition has the following steps assigned:
● PRODUCT_FILE with upload stage Data Acquisition
● LOCATION_FILE with upload stage Data Acquisition
● PRODUCT_STAGE with upload stage Quality Validation
● LOCATION_STAGE with upload stage Data Propagation
You decide to upload the product file manually, and you can use the same data delivery agreement for this.
The system creates an additional step with step number 000000 and status Manual for the manual upload, and it creates instances for the steps that are not relevant for data acquisition (PRODUCT_STAGE and LOCATION_STAGE).
The system does not create an instance for the step LOCATION_FILE because you want to upload only the PRODUCT_FILE in the manual upload.
Although no location data is uploaded, the step LOCATION_STAGE is nevertheless executed. This does not cause any conflicts because the Acquisition DataStore object does not contain any location data with the delivery ID that was created from the manual upload.
More Information
Data Reload [page 184]
3.2.8 Checking Completeness of Data Deliveries
Use
You can check on the Monitor Deliveries UI whether a process was created. You can also see whether the status of the process has changed to Ready. If the status is Ready, all mandatory content (files or data sets) is present and ready to upload.
The criteria for a data delivery to be considered as complete depend on the data format and on the tool used for detecting the data:
● Self-descriptive exchange (SDX) format for retail panel data
A data delivery in SDX format is considered complete when the metafile and all the files described in the metafile are present in the inbound folder.
● Comma-Separated Values (CSV) file format for point-of-sale (POS) data
A data delivery in CSV format is considered complete when all the mandatory files are present in the inbound folder, as defined in Customizing.
If you receive large volumes of files and you always receive the files in the same sequence in the inbound folder, you can indicate that the data delivery is complete, for example, by defining a mandatory empty file (with file extension .fin) that is used as a marker that the data delivery is complete. The folder scanner detects when all the mandatory files are available in the inbound folder.
Since you cannot guarantee the correct sequence when you manually copy or transfer files to the inbound folder, you can use an automatic mechanism to ensure that all files are transferred before the mandatory empty file (.fin) is transferred.
Example
You use an empty file with file extension .fin as a marker for completeness, as in the following table:
File Number | File Name Pattern | Optional
10 | TopMarket++++++++.loc | Yes
20 | TopMarket++++++++.pro | No
30 | TopMarket++++++++.sal | Yes
40 | TopMarket++++++++.stk | Yes
50 | TopMarket++++++++.fin | No
● The PSA scanner considers a data delivery to be complete once all the mandatory data sets are present in the PSA.
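The completeness rules above can be sketched as a simple check. This is a hypothetical Python illustration, not the actual scanner implementation; here the `+` placeholders of the file name patterns are treated as single-character wildcards.

```python
import fnmatch

# Illustrative completeness check (not the actual folder scanner): a delivery is
# complete once every non-optional file name pattern has a match in the folder.
def is_delivery_complete(inbound_files, file_definitions):
    """file_definitions: list of (glob_pattern, optional) as defined in Customizing."""
    return all(
        any(fnmatch.fnmatch(f, pattern) for f in inbound_files)
        for pattern, optional in file_definitions
        if not optional
    )

# Patterns from the example table, with '+' rewritten as the '?' wildcard.
definitions = [
    ("TopMarket????????.loc", True),
    ("TopMarket????????.pro", False),
    ("TopMarket????????.sal", True),
    ("TopMarket????????.stk", True),
    ("TopMarket????????.fin", False),  # empty marker file signals completeness
]
```

With these definitions, a folder containing only the `.pro` file is incomplete; once the `.fin` marker arrives as well, the delivery counts as complete.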
More Information
For more information about how you can influence how the system detects whether a data delivery is complete, see the documentation in Customizing for Cross-Application Components under Demand Data Foundation > Business Add-Ins (BAdIs) > BAdI: Data Delivery Enhancements.
3.2.8.1 Errors in Data Deliveries
Use
If a file causes a severe error, such as a core dump, the folder scanner aborts but starts again automatically.
If there is a severe error in the data delivery, the Persistent Staging Area scanner aborts but starts again automatically.
You can see what errors have occurred by checking the Monitor Deliveries UI or, in the case of jobs, the Monitor Jobs UI, where all errors for data uploads are displayed. Errors are displayed for the following:
● Processes
● Steps
● Process chains
● Jobs
● General system errors
More Information
Monitor Deliveries [page 180]
3.2.8.2 Data Consistency
Use
In the rare case that discrepancies occur between object statuses in different instance tables, you can execute a report (/DDF/ADU_CANCEL_INSTANCE) that sets the instance statuses to Canceled.
It is necessary to run this report because otherwise the data upload process is blocked.
The report uses the data delivery ID to select all the related instance statuses.
You can choose one of the following actions:
● Cancel delivery
You can then reload the data delivery.
● Archive delivery
Use this option if there are discrepancies in the system, such as files having been deleted, that prevent the Administrative Tasks job from marking the instances for archiving. In this case, you can run this report to archive the instances that could not be archived by the job.
The prerequisite for running the report with this setting is that the delivery status is set to Canceled or Completed.
If you select both checkboxes in the report, the system first sets the instance statuses to Canceled and then marks the instances for archiving.
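The two report actions can be modeled as follows. The report name /DDF/ADU_CANCEL_INSTANCE is taken from the text; the Python data structures and function are invented to illustrate the described logic.

```python
# Illustrative model of the /DDF/ADU_CANCEL_INSTANCE actions (not SAP code).
def run_cancel_instance_report(instances, cancel=False, archive=False):
    """instances: the instance records selected via one data delivery ID.
    With both flags set, statuses are set to Canceled first, then the
    instances are marked for archiving."""
    if cancel:
        for inst in instances:
            inst["status"] = "Canceled"
    if archive:
        # Archiving requires the statuses to be Canceled or Completed already.
        if all(i["status"] in ("Canceled", "Completed") for i in instances):
            for inst in instances:
                inst["archived"] = True
    return instances
```

Selecting only the archive option on a delivery that is still in process therefore has no effect, while selecting both options first cancels and then archives.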
More Information
For more information about this report, see the documentation of the report Set Instance Statuses to Canceled in the system.
Data Reload [page 184]
3.2.9 Archiving
Use
To archive data upload instances in SAP Demand Signal Management, version for SAP BW/4HANA, implement your SAP Business Warehouse (SAP BW) archiving strategy. Data and objects should, therefore, be extracted to SAP BW in DataStore objects (DSOs) that have SAP BW time characteristics (0CALDAY, for example). You set up the archiving of your data in transaction Edit Data Archiving Process (RSDAP).
You use the Clean Archived Instances job to delete objects from the system. We recommend deleting objects after one to two years; the standard parameter is two years (730 days). There is no reason to delete the objects any earlier. Once the objects are deleted, the Monitor Deliveries and Data Delivery Monitor no longer display them, and there will be inconsistencies if parts of the upload process or data harmonization process still try to access objects that have been deleted. To access historical data that has already been cleaned up from tables in SAP Demand Signal Management, version for SAP BW/4HANA, implement BI reports.
More Information
Clean Archived Instances Job [page 93]
3.3 Supervising the Data Upload
3.3.1 Process Flow Control
Use
The process flow control allows you to sequence and monitor how data is uploaded for consumption by analytical applications.
Each process consists of a sequence of steps that are connected to process chains in SAP NetWeaver Business Warehouse.
The process flow control runs periodically, evaluates which process steps can be started, and calculates the status of each process and step.
The status is displayed in the Monitor Deliveries web user interface, where you can monitor all processes and resolve issues.
Upload Stages
You can use upload stages to differentiate steps of the overall upload process. These upload stages include, for example, the following:
● Data acquisition
● Quality validation
● Data propagation
● Data enrichment
● Data harmonization
You configure the upload stages by assigning them to individual steps.
In most cases, the usage of upload stages is optional. If you want to use the same data delivery agreement for a manual upload as you have used for an automatic upload with the folder scanner, you have to define upload stages in Customizing.
Parallelization
You can parallelize the execution of steps to reduce the time it takes to upload data. The steps remain independent of one another but are executed in parallel.
You can group steps for parallelization according to the following rules:
● A group contains only subsequent steps
● A group does not contain any steps that trigger the same process chain
● A group contains steps of the same upload stage, which are independent of each other
Recommendation
Parallelize the following steps:
○ Steps of the upload stage Data Acquisition
○ Steps of the upload stage Data Propagation for master data
○ Steps of the upload stage Data Propagation for transaction data
○ Steps of the upload stage Data Enrichment for transaction data
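A group definition could be validated against the three rules like this. This is an illustrative sketch, not SAP code; "subsequent steps" is interpreted here as consecutive step numbers, which is an assumption.

```python
# Illustrative validation of a parallelization group (not SAP code).
def is_valid_parallel_group(steps):
    """steps: list of dicts with 'number', 'upload_stage', and 'process_chain'."""
    numbers = sorted(s["number"] for s in steps)
    # Rule 1 (assumed reading): steps are consecutive, with no gaps.
    consecutive = numbers == list(range(numbers[0], numbers[-1] + 1))
    # Rule 2: no two steps trigger the same process chain.
    chains = [s["process_chain"] for s in steps]
    no_shared_chain = len(chains) == len(set(chains))
    # Rule 3: all steps belong to the same upload stage.
    same_stage = len({s["upload_stage"] for s in steps}) == 1
    return consecutive and no_shared_chain and same_stage
```

For example, two consecutive Data Acquisition steps with different process chains form a valid group, whereas mixing upload stages does not.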
Prerequisites
You have defined processes and steps in Customizing for Cross-Application Components under Demand Data Foundation > Data Upload > Define Processes and Steps.
More Information
Examples of Process Definitions for Uploading Master Data from Harmonization [page 69]
Status Management [page 178]
Monitor Deliveries [page 180]
For more information on the data flow see:
● Data Flow [page 36]
● Data Flow for Master Data [page 37]
● Data Flow for Transaction Data [page 39]
● Process Chains in the Data Flow [page 41]
For more information on the process flow in the data upload see:
● Process Flow in Data Upload [page 177]
● Manual Data Upload [page 118]
● Data Reload [page 184]
3.3.1.1 Process Flow in Data Upload
Use
The following figure shows how the background jobs and the process flow control (PFC) interact with one another to upload data to SAP Demand Signal Management, version for SAP BW/4HANA:
Process Flow in Data Upload
Process
1. One of the following steps is performed:
   1. The folder scanner detects files in the inbound folder and checks for completeness.
   2. The Persistent Staging Area (PSA) scanner detects new deliveries in the Service Delivery PSA and checks for completeness.
2. The instance creation for the detected files or PSA tables is triggered.
3. Instances of the data delivery, process, step, file, and data set are created in instance tables.
4. The process scheduler reads the instance tables and settings and detects new data deliveries.
The process scheduler always executes the next step that has the status Ready.
5. Before a step is executed, the process scheduler changes the status from Ready to In Process.
6. The process scheduler triggers the process chain assigned to the step.
7. At the end of the execution, the process chain changes the step status to Completed.
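Steps 4 to 7 can be sketched as one scheduler run. This is a hypothetical Python model of the described behavior, not SAP code.

```python
# Illustrative model of one process scheduler run (not SAP code).
def scheduler_tick(steps, run_chain):
    """steps: ordered list of dicts with 'status' and 'chain' keys.
    run_chain: callable that executes a process chain; returns True on success."""
    for step in steps:
        if step["status"] == "Ready":
            step["status"] = "In Process"     # changed before execution (step 5)
            if run_chain(step["chain"]):
                step["status"] = "Completed"  # set at the end of the chain (step 7)
            return step
    return None  # no step with status Ready found in this run
```

Each periodic run picks only the next Ready step; later Ready steps wait for a subsequent run.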
3.3.2 Status Management
Use
Status management does the following:
● Manages the status of each step in the process of uploading data to SAP Demand Signal Management, version for SAP BW/4HANA and enables the system to trigger the right events to either continue or abort data processing
● Ensures that all steps are executed in the right order and under the right conditions
● Assigns statuses to the following object instances:
○ Steps
○ Processes
○ Data deliveries
● Propagates statuses based on predefined rules (cannot be adjusted).
Only one user can modify the status of an object at a time.
Based on the current status of processes and steps, the process flow control (PFC) determines the next possible step for execution. You can view the current statuses in the Monitor Deliveries, where you can also change the statuses of processes and steps manually, according to the predefined propagation rules.
More Information
Monitor Deliveries [page 180]
3.3.2.1 Automatic Status Propagation
Use
Statuses of objects in SAP Demand Signal Management, version for SAP BW/4HANA are propagated automatically as shown in the figure:
Status Propagation
Object statuses are propagated automatically based on the following rules:
● When any single step moves from status New to Ready, then the file, data set, or process with which that step is associated undergoes a status change to Ready.
● When all the steps that belong to a particular process are completed and their status is set to Completed, then all the processes and data deliveries with which these steps are associated are set to Completed.
● If you assign a step to a file or data set in Customizing, status management propagates the step status to the files and data sets to which that step is assigned.
Files and data sets to which no step is assigned are set to the status Skipped.
Based on the propagation rules, the status of the file, data set, or process changes when the status change information is received.
The process flow control uses this status information to decide on which is the next possible executable step in the process.
Example
The following simplified example shows how propagation rules are used within SAP Demand Signal Management, version for SAP BW/4HANA.
Propagation Rules
Sender | Sender Status: Changes from | Sender Status: Changes to | Listener | Listener Status | Rule
Step | New | Ready | File | Ready | Any
Step | New | Ready | Data Set | Ready | Any
Step | New | Ready | Process | Ready | All
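The Any/All rules in the table can be read as follows. This is an illustrative Python sketch of the rule semantics, not the actual status management framework.

```python
# Illustrative semantics of the propagation rules (not SAP code):
# "Any" - one sender reaching the status is enough to switch the listener;
# "All" - every sender must have reached the status.
def listener_should_change(sender_statuses, target_status, rule):
    if rule == "Any":
        return any(s == target_status for s in sender_statuses)
    if rule == "All":
        return all(s == target_status for s in sender_statuses)
    raise ValueError(f"unknown rule: {rule}")
```

So a file listener becomes Ready as soon as one of its steps is Ready ("Any"), while a process listener waits until all of its steps are Ready ("All").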
3.3.3 Monitor Jobs
Use
You use the Monitor Jobs UI to do the following:
● Start jobs
● Stop jobs
● Display job messages
More Information
Scheduling and Monitoring Jobs for Automatic Data Uploads [page 86]
3.3.4 Monitor Deliveries
Use
The Monitor Deliveries gives you a detailed overview of process statuses.
The following tables show what the process statuses mean and which actions you can take.
Scheduled Processes
Process Status | Description
New | A new data delivery was found and a process instance was created.
Ready | The data delivery is complete and ready for processing.
Running Processes
Process Status | Description
In Process | Data is currently being processed. When you upload data manually, a step is created with the status Manual.
Interrupted Processes
Process Status | Description
Errors | There are multiple errors in the process. Display messages to view error details.
Error – Process Definition Level (All Steps*) | An error occurred at process definition level.
Error – Process Definition Level (Subsequent Steps**) | An error occurred at process definition level.
Error – Data Delivery Agreement Level (All Steps*) | An error occurred at the level of the data delivery agreement.
Error – Data Delivery Agreement Level (Subsequent Steps**) | An error occurred at the level of the data delivery agreement.
On Hold | The process is on hold and can be restarted.
On Hold – Process Definition (All Steps*) | You put on hold all the steps belonging to that process definition.
On Hold – Process Definition (Subsequent Steps**) | You selected a step that you want to put on hold along with all its subsequent steps.
On Hold – Data Delivery Agreement (All Steps*) | You put on hold all the steps belonging to that data delivery agreement.
On Hold – Data Delivery Agreement (Subsequent Steps**) | You selected a step that you want to put on hold along with all its subsequent steps.
Finished Processes
Process Status | Description
Skipped (step level only) | You skipped a step; the process is set to Completed once all the other steps have been executed successfully.
Canceled | The execution is canceled.
Completed | The execution is completed.
*All Steps
All steps of all processes that belong to this process definition or data delivery agreement are affected by the error and cannot be processed until the error is corrected.
**Subsequent Steps
All subsequent steps of all processes that belong to this process definition or data delivery agreement are affected by the error and cannot be processed until the error is corrected.
More Information
Manual Data Upload [page 118]
3.3.4.1 User Actions on Processes
Use
In the Monitor Deliveries UI, you can cancel a process manually, trigger a reload, display messages, and navigate to the quality validation results of the data upload.
Action: Cancel
Prerequisites: The process you selected has one of the following statuses*: New, Ready, Error
Impact: The status of the process is set to Canceled. Canceling prevents any further steps from being executed; their statuses are set to Skipped. Any step that is in the status In Process continues until it is finished, and its status is then set to Completed.
New Status Category: Finished
Action: Reload
Prerequisites: The process you selected has one of the following statuses: Error, Canceled, Completed
Impact: If the process has the status Error, it gets the status Canceled. The system starts a new process with the status New.
New Status Category: Finished for the process selected for reload; Scheduled for the new process

Action: Display Process Messages
Prerequisites: You have selected one process.
Impact: You can view messages relating to all three levels in the hierarchy: data delivery agreements, processes, and steps. There is no time period restriction on the age of the messages.

Action: Display System Messages
Prerequisites: You have selected one process.
Impact: You can view messages that are up to four days old and that relate to general system administration.

Action: Quality Validation
Prerequisites: The button is active for all of the processes that belong to a data delivery if at least one of the following criteria is met:
● The process definition has a step for the upload stage Quality Validation
● The quality validation already has data for the delivery ID of the process that has been selected
Impact: You navigate to the Quality Validation UI, where you can view the results of the quality validation checks.
NoteYou can display messages that are older than the displayed messages using the application log as follows:
1. Go to transaction SLG1 (Analyze Application Log).
2. Enter the object /DDF/ADU.
3. Enter the subobject /DDF/ADU_GENERAL.
4. Select a date range.
5. Execute.
More Information
* Available if you have installed SAP Demand Signal Management, version for SAP BW/4HANA 2.0, FP02 and configured the required user roles and navigation to the Monitor Deliveries UI.
** Available if you have installed SAP Demand Signal Management, version for SAP BW/4HANA 2.0, FP03 and configured the required user roles and navigation to the Monitor Deliveries UI.
3.3.4.2 Data Reload
Use
Reloading a data delivery can be helpful in the following cases:
● A process contains an error that cannot be corrected or repaired
You want to reload a data delivery that has the status Error, where the error cannot be corrected, so the process could not be completed successfully. Before you reload the process, ensure that there are no issues with the system and that it is able to perform a new data upload, for example, by ensuring that the process chain can be executed again. You reload data in the Monitor Deliveries UI by selecting the process and choosing the Reload button. This sets the status of the old process to Canceled and creates a new instance that can then be uploaded.
● The sequence of the data deliveries needs to be corrected, for example, if you received the Tuesday data delivery after the Wednesday data delivery
You want to reload one or several data deliveries to correct the sequence manually so that subsequent processing steps (for example, data harmonization) can be executed and processed correctly. In this event, manually upload the Tuesday data delivery, then the Wednesday data delivery, then the Thursday data delivery, until you have loaded all data deliveries in the correct order. From then on, the data deliveries arrive in the correct sequence and subsequent data deliveries can continue to be processed automatically.
Process
The reload functionality is available for all agreement types and data formats that use the data upload method Automatic Data Upload With Folder Scanner or Automatic Data Upload With PSA Scanner.
To reload a data delivery, do the following:
1. In the Monitor Deliveries UI, select a process that has one of the following statuses:
○ Error
○ Completed
○ Canceled
2. Choose Reload.
3. The system reloads the entire process and all its steps to avoid any inconsistencies:
○ In the case of processes from agreements for file-based deliveries, all the files are copied to the inbound folder and the folder scanner processes them again.
○ In the case of processes from agreements for table-based PSA deliveries, the PSA scanner rereads the PSA tables and processes the delivery IDs.
Result
The status of the old process changes as follows:
Status (Before Reload) | Status Category (Before Reload) | Status (After Reload) | Status Category (After Reload)
Error | Interrupted | Canceled | Finished
Canceled | Finished | Canceled | Finished
Completed | Finished | Completed | Finished
The new statuses are propagated to all the associated steps, process definition, and data delivery agreement levels. You can find the old process in the Status Overview under the status category Finished.
You can find the new process that was just created for the reload under the status category Scheduled, or Running, if it has already started.
After a while, the new process can be found under any other category based on its progress.
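These status transitions can be sketched as follows. This is an illustrative Python model, not SAP code; the dictionary shapes are invented for the example.

```python
# Illustrative model of the Reload action (not SAP code): the old process ends
# up in status category Finished, and a new process starts under Scheduled.
RELOADABLE = {"Error", "Canceled", "Completed"}

def reload_process(old_process):
    if old_process["status"] not in RELOADABLE:
        raise ValueError("process cannot be reloaded")
    if old_process["status"] == "Error":
        old_process["status"] = "Canceled"  # Interrupted -> Finished
    return {"status": "New"}                # the new process, category Scheduled
```

Only Error processes change status on reload; Canceled and Completed processes keep their status, as in the table above.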
3.3.4.3 User Actions on Steps
Use
Select a step on which you want to perform one of the following actions:
Action: Skip
Skip a step or multiple steps* that are either in error or did not start their execution.
Prerequisites: The steps you selected to skip must all have one of the following statuses: New, Error, Ready, On Hold
Note: When multiple steps have been selected to be skipped, all of them have to be eligible for skipping for the Skip Steps button to be enabled.*
Note: If a step has the status In Process, you cannot skip it because the process chain is already running in SAP BW. To resolve this issue, you can navigate to SAP BW and stop the process chain. This turns the status of the process chain to a red traffic light. This error status is propagated to the process flow control by the Administrative Tasks job.
Impact: Once you skip steps, you cannot restart them.

Action: Hold
Switch from status Ready to On Hold.
Prerequisites: The step you selected has the status Ready.
Impact: Setting a step to On Hold prevents any subsequent steps from being executed, as they are on hold as well. Steps that were started before continue to be executed. The process definition or data delivery agreement is interrupted.

Action: Restart
Prerequisites: The step you selected must have one of the following statuses: On Hold, Error*
Impact: All the process chains for this step are executed.
Action: Complete
When you upload data manually, you have to set the manual step to Completed.
Prerequisites: The step you selected has the status Manual.
Impact: Follow-on steps can be executed to bring the new data into the propagation layer.

Action: Display Step Messages
Prerequisites: Select a step.
Impact: The system displays all the messages logged for that step. There is no time period restriction on the age of the messages displayed.

Action: Display Process Chain Messages
Prerequisites: Select a step.
Impact: The system displays all the messages logged for the process chain to which that step is assigned. There is no time period restriction on the age of the messages displayed.

Action: Display Quality Validation Results
Prerequisites: Select a process.
Impact: You can check the key figures for a specific data delivery.
More Information
* Available if you have installed SAP Demand Signal Management, version for SAP BW/4HANA 2.0, FP02 and configured the required user roles and navigation to the Monitor Deliveries UI.
Administrative Tasks Job [page 92]
3.3.4.4 Example: Manual Status Change from “Ready” to “On Hold”
Use
You have a process definition “Retailer North America”. Assigned to this process definition you have:
● Data Delivery Agreement: “TopMarket”
  ○ Data Delivery 1 (process 1)
    ○ Step 101
    ○ Step 102 (You select this step in the Monitor Deliveries and choose Hold.)
    ○ Step 103
  ○ Data Delivery 2 (process 2)
    ○ Step 101
    ○ Step 102
    ○ Step 103
● Data Delivery Agreement: “GreatBuys”
  ○ Data Delivery 1 (process 1)
    ○ Step 101
    ○ Step 102
    ○ Step 103
A delivery is received for each of the data deliveries in the following order:
1. TopMarket Data Delivery 1
2. TopMarket Data Delivery 2
3. GreatBuys Data Delivery 1
Activities
1. In the Monitor Deliveries, you select step 102 and choose Hold.
2. In the dialog box, you choose how you want to impact the process flow:
Scenario | Step Origin | Locking | Impact | New Process Status
1 | Data delivery agreement TopMarket | All steps | All processes that belong to data deliveries from data delivery agreement TopMarket are on hold. | On Hold – Data Delivery Agreement Level (All Steps)
2 | Data delivery agreement TopMarket | Subsequent steps of 102 | All processes that belong to data deliveries from data delivery agreement TopMarket are on hold from step 102. | On Hold – Data Delivery Agreement (Subsequent Steps)
3 | Process definition Retailer North America | All steps | All processes that belong to the process definition Retailer North America are on hold. | On Hold – Process Definition Level (All Steps)
4 | Process definition Retailer North America | Subsequent steps of 102 | All processes that belong to the process definition Retailer North America are on hold from step 102. | On Hold – Process Definition Level (Subsequent Steps)
Result
Processes and Steps On Hold Are Highlighted

● Scenario 1: All processes that belong to data delivery agreement “TopMarket” are on hold.
● Scenario 2: All processes that belong to data delivery agreement “TopMarket” are on hold from step 102.
● Scenario 3: All processes that belong to the process definition “Retailer North America” are on hold.
● Scenario 4: All processes that belong to the process definition “Retailer North America” are on hold from step 102.
3.3.5 Data Upload Monitor
The Data Upload Monitor UI allows you to monitor the status of the following:
● Process definitions that are involved in data upload
● Jobs that are, for example, currently running or are scheduled to run for the purposes of the data upload and integration with SAP Demand Signal Management, version for SAP BW/4HANA
3.3.6 Error Handling
Status management ensures that data is processed accurately. There are predefined rules to propagate a status automatically within the system.
Error handling is integrated within status management, so that when an error occurs during the processing of incoming data, it can be monitored closely using accurate statuses. Once each step of a process has been executed, the step status is propagated to the level of the process. The process flow control evaluates the statuses to assess how to proceed.
● Statuses linked to errors that occur as of a specific step number in the process are assigned only to the steps where the error occurred.
● All the other steps in the process maintain their status to make it easier to identify the source of the error and the steps that should be executed again.
If the user manually sets a step to the status On Hold, the propagation rules work in the same way as for all error statuses. Therefore, processes that have this status are also listed in the status category Interrupted.
You can view the error statuses in the Monitor Deliveries UI. Errors are propagated upwards from step level to process definition level and then to the data delivery agreement level.
Status: Error – Data Delivery Agreement Level (All Steps) (1200)
Impact on Process Flow: The process flow control (PFC) stops all processing for process instances belonging to this particular data delivery agreement.
Status: Error – Data Delivery Agreement Level (Subsequent Steps) (1400)
Impact on Process Flow:
● If a process has a step that contains an error, this status ensures that all the process instances that are assigned to the corresponding data delivery agreement also acquire the status Error.
For example, there are two instances of process definition ABC. Process definition ABC is assigned to data delivery agreement DEF. Each process instance has three steps. If for process instance 1 the status is Data Delivery Agreement – Error From at step 2, then the process flow control executes process instance 2 up to step 1 only. This affects data processing at the level of the data delivery agreement and, therefore, affects all process instances linked with that agreement.
● Steps higher than this step from processes in the same data delivery agreement are not triggered.
Status: On Hold – Data Delivery Agreement (All Steps) (0120)
Impact on Process Flow:
● The user sets this status manually in the Monitor Deliveries UI.
● The PFC holds all the process instances that belong to this particular data delivery agreement.
Status: On Hold – Data Delivery Agreement (Subsequent Steps) (0140)
Impact on Process Flow: Refer to Error – Data Delivery Agreement Level (Subsequent Steps) (1400).
Status: Errors (1000)
Impact on Process Flow:
● The status management framework propagates this status if different steps belonging to the same process have different error statuses assigned to them. For example, process definition ABC has three steps that can be executed in parallel: step 1 has the status Data Delivery Agreement – Error All; step 2 has the status Process Definition – Error From. Status management propagates the status Errors to the corresponding processes and data delivery agreements.
● To know how to proceed, the PFC checks the statuses of the steps and assesses what action to take based on each of their statuses.
3.3.6.1 Errors
Errors are propagated upwards from step level to process definition level and then data delivery agreement level.
Status: Error - Process Definition Level (All Steps) (1100)
Impact on Process Flow: The process flow control (PFC) stops all processing for process instances belonging to this particular process definition.
Status: Error - Process Definition Level (Subsequent Steps) (1300)
Impact on Process Flow: For more information, see the status details for On Hold - Process Definition (Subsequent Steps).
Status: Errors (1000)
Impact on Process Flow:
● The status management framework propagates this status if different steps belonging to the same process have different error statuses assigned to them. For example, process definition “ABC” has three steps that can be executed in parallel: step 1 has the status Process Definition — Error All; step 2 has the status Process Definition — Error From. Status management propagates the status Errors to the corresponding processes and data delivery agreements.
● To know how to proceed, the PFC checks the statuses of the steps and assesses what action to take based on each of their statuses.
Status: On Hold - Process Definition (All Steps) (110)
Impact on Process Flow:
● The user sets this status manually in the Monitor Deliveries UI.
● The PFC holds all the process instances that belong to this particular process definition.
Status: On Hold - Process Definition (Subsequent Steps) (130)
Impact on Process Flow:
● The user sets this status manually in the Monitor Deliveries UI.
● The PFC holds the execution of all the process instances that belong to this particular process definition.
● Steps higher than this step from processes in the same process definition are not triggered.
● For example, there are two instances of process definition “ABC”, and each process instance has three steps. If for process 1 the status Process Definition - Hold From is set for step 2, then the PFC executes process 2 up to step 1 only, since this status is applicable at process definition level.
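The “up to step 1 only” behavior in the example above can be sketched as follows. This is an illustrative sketch under an assumed step/instance model; it is not the product's actual process flow control logic:

```python
# Illustrative sketch of the "Subsequent Steps" cut-off (assumed model):
# if one instance of a process definition fails at step N, sibling
# instances of the same definition are executed only up to step N - 1.

def executable_steps(total_steps, failed_step=None):
    """Return the step numbers the PFC still executes in a sibling
    instance, given the step at which another instance failed."""
    if failed_step is None:
        return list(range(1, total_steps + 1))   # no failure: run all
    return list(range(1, failed_step))           # stop before the failure

# Instance 1 of "ABC" failed at step 2; instance 2 has three steps:
print(executable_steps(3, failed_step=2))  # -> [1]
```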
3.3.6.2 Errors in Data Delivery Agreements
Errors are propagated upwards from step level to process definition level and then data delivery agreement level.
Status: Error - Data Delivery Agreement Level (All Steps) (1200)
Impact on Process Flow: The process flow control (PFC) stops all processing for process instances belonging to this particular data delivery agreement.
Status: Error - Data Delivery Agreement Level (Subsequent Steps) (1400)
Impact on Process Flow:
● If a process has a step that contains an error, this status ensures that all the process instances that are assigned to the corresponding data delivery agreement are put into status Error. For example, there are two instances of process definition “ABC”. Process definition “ABC” is assigned to data delivery agreement “DEF”. Each process instance has three steps. If for process instance 1 the status is Error - Data Delivery Agreement Level (Subsequent Steps) at step 2, then the process flow control executes process instance 2 up to step 1 only. This affects data processing at the level of the data delivery agreement and, therefore, affects all process instances linked with that agreement.
● Steps higher than this step from processes in the same data delivery agreement are not triggered.
Status: On Hold – Data Delivery Agreement (All Steps) (120)
Impact on Process Flow:
● The user sets this status manually in the Monitor Deliveries UI.
● The PFC holds all the process instances that belong to this particular data delivery agreement.
Status: On Hold - Data Delivery Agreement Level (Subsequent Steps) (140)
Impact on Process Flow: Refer to Error - Data Delivery Agreement Level (Subsequent Steps).
Status: Errors (1000)
Impact on Process Flow:
● The status management framework propagates this status if different steps belonging to the same process have different error statuses assigned to them. For example, process definition “ABC” has three steps that can be executed in parallel: step 1 has the status Data Delivery Agreement — Error All; step 2 has the status Process Definition — Error From. Status management propagates the status Errors to the corresponding processes and data delivery agreements.
● To know how to proceed, the PFC checks the statuses of the steps and assesses what action to take based on each of their statuses.
3.3.6.3 Errors in Steps
Use
An error at step level can impact process definition level and/or data delivery agreement level.
Once an SAP NetWeaver BW process chain is triggered by SAP Demand Signal Management, version for SAP BW/4HANA, it runs autonomously and reports a positive or negative status back to SAP Demand Signal Management, version for SAP BW/4HANA. This requires that the process type /DDF/STATU is used at the end of the process chain. If a process chain fails, for example due to an extraction issue or a DTP failure, the step then shows the status Error - Data Delivery Agreement Level (Subsequent Steps) (1400).
Ensure that you have scheduled a periodic job to adjust the step statuses when a process chain fails. You can use the standard job Administrative Tasks Job.
Example
Examples for failed steps are:

● Requests with red statuses in the data target (see Resolving Step Errors (I) [page 194])
● Requests with red statuses in the step (data does not contain errors) (see Resolving Step Errors (I) [page 194])
● Requests with red statuses in the step (data contains errors) (see Resolving Step Errors (II) [page 195])
● Transformation errors (software bug) (see Resolving Step Errors (II) [page 195])
More Information
User Actions on Steps [page 186]
Errors in Data Delivery Agreements [page 192]
Errors [page 191]
Scheduling and Monitoring Jobs for Automatic Data Uploads [page 86]
Administrative Tasks Job [page 92]
3.3.6.3.1 Resolving Step Errors (I)
Use
To fix errors resulting from red statuses in the data target, do the following:

1. Log on to the Monitor Deliveries UI.
2. Choose the erroneous process and mark the step that failed.
3. Choose Display Process Chain Messages, which forwards you to the SAP NetWeaver Business Warehouse logon screen, and go to Data Warehousing Workbench: Modeling (transaction RSPC1).
4. Choose Execute to confirm the suggested process chain and log ID; the process chain log is displayed.
5. Right-click on the Data Transfer Process element and choose Administrate Target.
6. On the Requests tab page, select the lines containing red request statuses and delete them.
○ If the step fails because of a previous red request in the target, delete the red request from this step in the target.
○ If the step fails because its own request turned red and there is no error in the data, solve the issue that turned the request red: this could be any number of things, from data being locked to the database being full, for example.
○ If the step fails because of an error in the transformation, transport a fix in the system.
7. Select the process chain component that failed and right-click to choose Repair from the context menu.
8. Either the repair is successful and the data upload continues, or you need to cancel the processing and reload the data.
○ If the process chain is executed successfully, its status changes to green. The status of the process (which was propagated from the step status) changes to Completed automatically.
○ If you cannot repair the process chain, return to the Monitor Deliveries UI to cancel the corresponding step. This cancels the process. The status of the old process changes from Error to Canceled. You then choose to reload the data and the system creates a new process instance for the same data delivery instance that can be uploaded.
More Information
Errors in Steps [page 193]
3.3.6.3.2 Resolving Step Errors (II)
Use
If the step failed due to one of its own requests turning red and there are errors in the data, proceed as follows:

1. Log on to SAP NetWeaver Business Warehouse and go to Data Warehousing Workbench: Modeling (transaction RSPC).
2. Under Demand Data Foundation, select the process chain that failed.
3. Delete the red request from the target that caused the error, thereby removing corrupt data from the upload.
4. Delete all other requests from this step, even if they are green.
5. Fix the problem with the data in the source of this step.
6. Schedule that process chain synchronously.
7. Restart the entire process chain and step.
More Information
Errors in Steps [page 193]
3.3.7 Planning Data Deliveries
Use
Using the Plan Data Deliveries app, you can plan data deliveries for data delivery agreements of the following agreement types:

● Retailer Data for point of sale data (POS data)
● Retail Panel Data

You have the following options:
● You create a planned delivery as follows:
1. You select a data delivery agreement.
2. In the calendar view, you select a date for which you want to create a new delivery.
3. (Optional) You enter the latest date of the data that is contained in the data delivery (field Contains Data Until).
4. By default, the delivery is planned as an all-day event. You can define a specific time frame in which the data delivery is expected.
5. You save.
● If, for example, a data delivery is rescheduled or the delivery is erroneous, you drop a planned delivery as follows:
1. You select a data delivery agreement.
2. In the calendar view, you select a date for which a delivery is planned. If there are multiple data deliveries planned for a day, you select a planned delivery from the list.
3. You select the Dropped checkbox.
4. You select a reason, the date when the reason was communicated, and optionally enter a comment.

Note
You can use the reasons that are defined in Customizing under Cross-Application Components Demand Data Foundation Data Upload Basic Settings Define Reasons for Dropping Planned Data Deliveries.

● You delete a planned delivery as long as it has not been assigned to a received data delivery.

Recommendation
If you want to have historical information on the data delivery, for example, to track the performance of the data provider, you should only drop and not delete the planned delivery.
More Information
Defining Data Delivery Agreements [page 106]
Assignment of Received Deliveries to Planned Deliveries [page 197]
Monitoring Data Deliveries [page 200]
3.3.7.1 Automatic Assignment of Received Deliveries to Planned Deliveries
Use
1. When a data delivery is received, the system checks whether the date of the planned delivery and the date on which the data delivery was received are the same. If the dates are the same, the system automatically assigns the received data delivery to the planned data delivery.
2. After the upload, the system checks whether the planned data delivery comprises information about the latest date of the data contained in the expected data delivery (attribute Contains Data Until). If the latest date of the contained data falls within the permitted time frame of a planned data delivery for the related data delivery agreement, the system automatically assigns the received data delivery to the planned data delivery.
Note
The allowed deviation is defined in Customizing under Cross-Application Components Demand Data Foundation Data Upload Basic Settings Define Allowed Delivery Deviation.
3. If both checks fail and the system cannot assign the received data delivery to a planned data delivery, the data delivery is marked as Unexpected.
4. In the Data Delivery Monitor app, you can see the current status of received and planned deliveries. For example, you can see which data deliveries are received on time and which are delayed. You can assign unexpected data deliveries to planned data deliveries, or change the assignment.
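The two checks above can be sketched as follows. This is an illustrative Python sketch only; the field names, the data structure for planned deliveries, and the way the deviation is applied are assumptions for illustration, not the actual implementation:

```python
# Illustrative sketch of the automatic assignment checks (assumed
# simplification): check 1 matches on the received date, check 2
# matches the 'Contains Data Until' date within the allowed deviation,
# and if both fail the delivery is marked as Unexpected.
from datetime import date

def assign(received_date, contains_data_until, planned,
           allowed_deviation_days=0):
    """Return the matching planned delivery date, or 'Unexpected'.

    planned: dict mapping planned delivery date -> expected
             'Contains Data Until' date for that delivery.
    """
    # Check 1: received on the same date as a planned delivery
    if received_date in planned:
        return received_date
    # Check 2: 'Contains Data Until' falls within the allowed
    # deviation of a planned delivery's expected data horizon
    if contains_data_until is not None:
        for planned_date, expected_until in planned.items():
            deviation = abs((contains_data_until - expected_until).days)
            if deviation <= allowed_deviation_days:
                return planned_date
    return "Unexpected"

planned = {date(2018, 11, 5): date(2018, 11, 4)}
# Received a day late, but the contained data matches the plan:
print(assign(date(2018, 11, 6), date(2018, 11, 4), planned,
             allowed_deviation_days=1))  # -> 2018-11-05
```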
More Information
Planning Data Deliveries [page 195]
Monitoring Data Deliveries [page 200]
3.3.7.2 Mass Maintenance of Planned Data Deliveries
Use
The user who maintains the planned data deliveries has to maintain all deliveries that the company will receive in a year. Usually such a delivery schedule is provided once a year by the data provider. As a data upload administrator, you can do mass maintenance of existing or new planned data deliveries to save time and effort, using the Data Delivery Monitor app on the SAP Fiori Launchpad. The app provides the following functions for multiple planned data deliveries:
● Create
● Delete● Edit
More Information
Planning Data Deliveries [page 195]
Planning Data Deliveries Using GWM Excel [page 198]
3.3.7.3 Planning Data Deliveries Using GWM Excel
Use
You have the flexibility to plan data deliveries using SAP Gateway for Microsoft: Excel Add-In (GWM Excel) and have all the new or changed data from it also available in the Plan Data Deliveries app.
You can do the following tasks in GWM Excel:
● Enter new planned data deliveries● Edit existing planned data deliveries● Delete existing planned data deliveries
Then you can upload all the data from GWM Excel back to the Plan Data Deliveries app.
Prerequisites
1. Install GWM Excel on your computer.
For more information about installing and configuring this add-in, see SAP Library at http://help.sap.com and choose Technology SAP Gateway SAP Gateway for Microsoft 1.0 Installation and Configuration Information Microsoft Excel Add-In Documentation.
2. Execute the GWM Excel wizard to do the binding of the OData service /DDF/DDEL_PLAN_SRV.

Recommendation
We recommend you use the following properties as filters to fetch records from the Plan Data Deliveries app:
○ Agreement
○ Date From
○ Date To

3. Fill your GWM Excel worksheet with planned data deliveries by clicking Fetch Records and then selecting properties and values in the Select Filter Values pop-up.
○ Properties are displayed in GWM Excel as column headers in your worksheet and may be, for example, Agreement, Date From, and Date To.
○ Values are displayed in GWM Excel as information in the rows; this information comprises the planned data deliveries and the data belonging to them.
Procedure
Caution
Do not insert, change the sequence of, or remove columns from the GWM Excel worksheet, as this breaks the binding. If you make any of these changes, click Bind to redo the binding.
1. Depending on which properties you selected in the Select Filter Values pop-up, you can begin entering new values for these properties in GWM Excel. You can change the column headers if you wish.
2. Enter values for the following mandatory properties:
○ Agreement
○ Date From
○ Date To (mandatory if All Day is not filled; in the majority of cases, Date To is equal to Date From)
○ Time From (mandatory if All Day is not filled)
○ Time To (mandatory if All Day is not filled)
○ Reason for Change (mandatory if the planned delivery is dropped)
3. Click Submit Changes to upload all the changes to the Plan Data Deliveries app.The Submit Change Response pop-up provides a list of statuses for all the new and changed records that you made. Correct any errors in the new or changed values before submitting changes again.
Note
If the information in the Plan Data Deliveries app and GWM Excel becomes desynchronized because, for example, several users are editing the collection of data at once, click Clear Data in GWM Excel and then click Fetch Records to fill the worksheet with the most up-to-date values.
More Information
For more information, go to SAP Community Network at https://scn.sap.com/community/demand-signal-management and choose Content Documents to find the document “How to Create Planned Deliveries Using SAP Gateway for Microsoft: Excel Add-In”.
3.3.8 Monitoring Data Deliveries
Use
You use the Data Delivery Monitor app to get an overview of the statuses of all data deliveries that are planned or received.
You can perform the following actions on received data deliveries:
● Assign Planned Delivery to assign planned deliveries or remove an existing assignment, if the system could not automatically assign the received delivery to a planned delivery
● Release Data to open the Release Data for Global Reporting application. There you can check and change the release status of the current and previous data deliveries.
● Monitor Upload Process to open the Monitor Deliveries Web UI to resolve errors that occurred during the upload process
● Navigate to Quality Validation to check the quality of the uploaded data once the data delivery has been received
The following table displays the possible statuses of a data delivery.
Status Reason
Expected The data delivery has not yet been received; the planned date is in the future.
In Process A data delivery was received, but the upload processes are not yet completed.
Cancelled A data delivery was received, and one of the upload processes has been canceled.
Error A data delivery was received, and one of the upload processes has status Error.
Available The data delivery was received, successfully uploaded, and a planned delivery has been assigned.
Delayed The data delivery has not yet been received; the planned date is in the past.
Unexpected A data delivery has been received that could not be automatically assigned to a planned data delivery. A planned delivery has to be assigned manually.
More Information
Planning Data Deliveries [page 195]
Automatic Assignment of Received Deliveries to Planned Deliveries [page 197]
Release Data for Global Reporting [page 295]
Quality Validation [page 202]
4 Quality Validation
Use
Whenever data is uploaded into the system, the quality validation automatically checks the quality of the uploaded data using specified key figures.
For each key figure, you define in Customizing between which thresholds the values are expected, and in which cases the process step is stopped.
The process step can be stopped in the following cases:
● The defined thresholds are violated.
● A key figure could not be calculated.

The calculation of a key figure can fail due to one of the following reasons:
○ The calculated data was not provided with the current data delivery.
○ Not enough historic data is available for the calculation of a comparative key figure.
Note
If data deliveries are automatically archived after a specific number of days, you have to make sure that enough historic data deliveries are available for comparison. You define in Customizing how many deliveries are needed.
The quality validation is implemented as a separate process step and can be supervised using the process monitor. If the process step ends with an error, the whole process is stopped.
You review the quality validation results in the process monitor. If you want to proceed with the process although the quality validation failed, you can skip the step manually.
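The per-key-figure decision described above can be sketched as follows. This is an illustrative sketch under an assumed simplification of the Customizing settings; the status names mirror the ones listed later in Configuring Quality Validation, but the function itself is not the shipped implementation:

```python
# Illustrative sketch of the quality validation decision for one key
# figure (assumed model): a value outside the configured thresholds, or
# a value that could not be calculated at all, yields the configured
# violation status; otherwise the check succeeds.

def validate(value, lower, upper, status_on_violation="Error"):
    """Return the step status for one key figure.

    value may be None if the key figure could not be calculated, for
    example when not enough historic data is available for a
    comparative key figure.
    """
    if value is None or not (lower <= value <= upper):
        return status_on_violation   # e.g. "Error" stops the process
    return "Success"

# 12% new locations against a configured 0-10% corridor, with the
# violation status set to Warning in Customizing:
print(validate(0.12, lower=0.0, upper=0.10,
               status_on_violation="Warning"))  # -> Warning
```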
More Information
Overview of Key Figures in Quality Validation [page 204]
Process Monitor [page 180]
Reports for Administrating Quality Validation [page 207]
4.1 Configuring Quality Validation
Use
The system performs the quality validation step using sequences of functions that are defined in Customizing. Each function represents a key figure for quality validation.
You make key figure-specific settings and define per data delivery agreement which sequences are used at specified execution times.
Note
You can use the Business Add-In BAdI: Default Values for Set Up of Quality Validation (/DDF/BADI_QV_DEFAULT) to define default values that are only taken into account if no other settings have been specified.
Prerequisites
You have made the following settings for quality validation in Customizing for Cross-Application Components under Demand Data Foundation Quality Validation :
● Set Up Quality Validation● Define Number Range for Key Figure Detail Data
You have defined data delivery agreements. To define data delivery agreements, choose SAP Menu Cross-Application Components Demand Signal Management Data Upload Define Data Delivery Agreements on the SAP Easy Access screen.
Activities
1. On the SAP Easy Access screen, choose SAP Menu Cross-Application Components Demand Signal Management Quality Validation Configure Quality Validation, or use transaction /DDF/QV_DEF (Configure Quality Validation).
2. Select the point in time at which the system executes sequences of functions (view Select Execution Time). In the standard system, the following execution times are used:
○ After Data Acquisition
This execution time is used for all standard sequences.
○ Before Data Extraction to Global Reporting
This execution time is used for data delivery agreements that deal with market research data that is uploaded for global reporting.
3. Select a data delivery agreement, and assign a sequence of functions that is defined in Customizing (view Assign Sequence to Data Delivery Agreement).
You can define different settings for each data delivery agreement that is available in the system. If you do not enter a data delivery agreement, this setting is used as the default for all data delivery agreements for which no settings are defined.
4. For each function contained in the selected sequence, make the following key figure-specific settings (view Define Key Figure-Specific Settings):
1. Specify the key figure type.
○ An absolute key figure contains an absolute value, for example, the number of locations in a data delivery.
○ A relative key figure contains a relative value, for example, the percentage of new or unknown locations relative to the complete number of locations of this data delivery.
2. Define whether the key figure is comparative.
A comparative key figure is used to evaluate the data quality of a data delivery by comparing the data of the current data delivery to a specified number of historic data deliveries. For a comparative key figure, you make the relevant settings for the comparison in Customizing. You can use the Business Add-In BAdI: Number of Deliveries for Comparison in Quality Validation to define per data delivery agreement how many data deliveries are considered for key figures that compare data of multiple data deliveries.
3. Define the lower and upper threshold for the value that the function calculates for the respective key figure.
If the value for the key figure lies between the specified thresholds, the data quality is considered sufficient regarding this key figure.
4. Define the status of the quality validation step that is set in the following cases:
○ The thresholds for this key figure are violated.
○ The key figure could not be calculated.
You can choose one of the following statuses:
○ Error - Stop Processing
○ Warning - Continue Processing
○ Success - Continue Processing (No Message)
4.2 Overview of Key Figures in Quality Validation
In the standard system, the following key figures are defined:

No. of Locations
Description: Number of locations provided with the specified data delivery
Calculation: Absolute value

No. of Products
Description: Number of products provided with the specified data delivery
Calculation: Absolute value

No. of Products with Qty > 0
Description: Average number of point of sale (POS) records for each location with a quantity greater than 0
Calculation: Absolute value
The following relative key figures are calculated as the percentage of the number of affected records in the current data delivery relative to the total number of records in the current data delivery:

No. of New Locations (Relative): Number of new or unknown locations
No. of New Products (Relative): Number of new or unknown products
No. of Locations w/o GLN (Relative): Number of locations without global location number (GLN)
No. of Products w/o GTIN (Relative): Number of products without global trade item number (GTIN)
No. of POS Rec. w/o Prd. Master (Rel.): Number of POS records that do not refer to a known product
No. of POS Rec. w/o Loc. Master (Rel.): Number of POS records that do not refer to a known location
No. of Stock Rec. w/o Prd. Master (Rel.): Number of stock (STK) records that do not refer to a known product
No. of Stock Rec. w/o Loc. Master (Rel.): Number of stock records that do not refer to a known location

No. of Products (Comparative)
Description: The number of products in the current delivery compared to the number of products in a specified number of historic data deliveries
Calculation: Division of the number of products in the current delivery (numerator) by the average number of products in the historic data deliveries (denominator)

No. of Products with Qty > 0 (Comp.)
Description: The number of products in the current delivery with a quantity greater than 0 compared to the number of products with a quantity greater than 0 in a specified number of historic data deliveries
Calculation: Division of the number of products in the current delivery (numerator) by the average number of products in the historic data deliveries (denominator)
Data Delivery Is Not in Sequence
Description: The latest date of the POS data in the current delivery is not before the latest date in the historic data deliveries
Calculation: The latest date of the sales POS data (stock data is not taken into account) in the current delivery is compared to the latest date in the historic data deliveries of the exact same data delivery agreement in the sales acquisition layer. If the latest date in the current delivery is less than (before) the latest date in the historic data, the delivery is “not in sequence” and the function returns 0. If the latest date in the current delivery is equal to or greater than (after) the latest date in the historic data, the delivery is “in sequence” and the function returns 1. Valid threshold values in the quality validation settings are 0 and 1.

Market Coverage of Consolidation
Description: Percentage of sales value that is covered by the consolidated products relative to the market total
Calculation: (Sum over all sales values of all consolidated products at all consolidated locations) divided by (sum over the sales values of the product total of the data delivery agreement, as defined in the consolidation settings, at all consolidated locations), multiplied by 100. In the most common case, there is only one consolidated location that represents the market total. In this case, the calculation is as follows: (sum over the sales values of all consolidated products for the market total) divided by the sales value of the product total at the market total, multiplied by 100.

No. of Missing Locations
Description: The number of missing locations provided with the specified delivery_id and process_id compared with the last X deliveries
Calculation: Absolute value
No. of Missing Products
Description: The number of missing products provided with the specified delivery_id and process_id compared with the last X deliveries
Calculation: Absolute value

Change in Tot. Sales wrt Prev X Deliveries
Description: The change in total sales value compared to the previous X deliveries
Calculation: Absolute value

Change in Tot Sales wrt Prev X Weekdays
Description: The change in total sales value compared to the previous X same weekdays (for example, the 5 previous Mondays)
Calculation: Absolute value
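Two of the calculations above can be sketched as follows. This is an illustrative sketch only, not the shipped key figure functions; the function names and signatures are assumptions for illustration:

```python
# Illustrative sketches of two key-figure calculations from the table
# above (assumed simplifications, not the actual implementations).
from datetime import date

def comparative_ratio(current_count, historic_counts):
    """No. of Products (Comparative): the count in the current delivery
    divided by the average count over the historic deliveries."""
    average = sum(historic_counts) / len(historic_counts)
    return current_count / average

def in_sequence(current_latest, historic_latest):
    """Data Delivery Is Not in Sequence: returns 1 if the latest POS
    date of the current delivery is not before the historic latest
    date, and 0 otherwise."""
    return 1 if current_latest >= historic_latest else 0

# 110 products now versus an average of 100 over the last deliveries:
print(comparative_ratio(110, [100, 100, 100]))            # -> 1.1
# Current data reaches one day past the historic latest date:
print(in_sequence(date(2018, 11, 5), date(2018, 11, 4)))  # -> 1
```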
4.3 Reports for Administrating Quality Validation
The following reports are available for quality validation:

Report: Cleanup Detail Data of QV Key Figures
Technical Name: /DDF/FDH_REORG_QV_KPI_CONTEXT
Transaction Code: /DDF/FDH_REORG_QV
Purpose: You can use this report to remove obsolete or outdated detail data for key figures in quality validation. The report removes all detail data for key figures that are related to delivery IDs that no longer exist, or all detail data that is older than a specified date.
5 Data Harmonization
Use
Different external and internal data providers usually use different master data attributes, such as keys, number ranges or descriptions. To ensure that you have a consistent set of master data for consuming applications, and for analytics and reporting across different data providers, the data must be harmonized.
ExampleA certain product produced by the manufacturer “Taste” is called “Chili Chocolate” in-house at the manufacturer. The different retailers and syndicated data providers might have other names for the product, such as “CHILI_CHOC_1254”, “Chocolate_Taste_Chilli”, or “03-TAS-chil”.
You use data harmonization to link all data that refers to the same object (for example, the same product or the same retailer store) to one single data record which exclusively identifies the object, no matter in which data source it appeared and independent from its description or ID that was provided. This harmonized object record contains consistent harmonized attributes. It also contains a link to all data records that were taken into account for harmonization of this record.
Process
Data Harmonization Process
The process of data harmonization can be segmented into the following phases:
1. As an administrator, you check and adapt the Customizing and configuration of data harmonization.
2. If only specific attribute values are allowed for harmonized objects, you restrict attribute values.
3. The data is uploaded. During the upload process, the system maps and harmonizes the uploaded data for all object types for which automatic harmonization is turned on.

Note
You can view these harmonized object types in the Harmonization Status app to get an overview of the status of the object types.
For issues that are not resolved automatically, the system creates work items.
4. As a harmonization user, you perform the following manual tasks:
   ○ For source attribute values that are derived from attribute values of source objects, you define derivation instructions for source objects.
   ○ For harmonized attribute values that are derived from attribute values of source objects, you define derivation instructions for harmonized objects.
   ○ You monitor the worklists and resolve the issues.
   ○ You review and edit the mapping of source objects to harmonized objects.
   ○ You review and edit harmonized objects in single editing or mass change mode.
   ○ You define unit conversions from the base unit of measure (UoM) of source products to the base UoM of harmonized products.
   ○ For objects that are relevant for global reporting, you release the harmonized object for global reporting.
5.1 Harmonized Objects and Source Objects
In data harmonization we refer to all incoming master data from the various data origins as “source objects”.
Harmonized objects contain all attributes of an object that are relevant in the business context. All attributes are stored with harmonized values.
A harmonized object also contains the reference to all source objects from the various data origins that refer to the same “real” object.
The aim of data harmonization is to have similar source objects grouped together and assigned to one unique harmonized object with harmonized attributes that can be used for consistent reporting.
Harmonized Object with Assigned Source Objects
5.1.1 Example: Harmonized Product with Assigned Source Products
The following graphic shows a harmonized product GP00001 that has five source products from various data origins assigned.
Example: Harmonized Product with Assigned Source Products
5.2 Object Types Used in Data Harmonization
Use
In the standard system data harmonization supports the master data object types Product and Location.
For the storage of allowed attribute values, an additional object type Attribute Value is defined.
Other object types can be associated with each object type. Associated objects contain additional data for the objects, but are handled separately during harmonization.
● Location (FDH_LOC)
  Associated object type: Work item (FDH_WI). Work items are created as soon as a situation occurs for an object that needs manual interaction.
● Product (FDH_PROD)
  Associated object types: Work item (FDH_WI); Unit conversions from source objects to harmonized objects (FDH_CONV). The conversion from the base unit of measure (UoM) of a source product to the base UoM of a harmonized product is stored as a harmonized object of type FDH_CONV that is associated with the source product.
● Attribute value (FDH_ATT_V)
  Associated object type: Attribute hierarchy (FDH_HIER). Objects of object type FDH_HIER define how allowed attribute values hierarchically depend on each other.
More Information
Work Items [page 212]
Units of Measure in Data Harmonization [page 214]
Defining Allowed Attribute Values [page 229]
5.3 Work Items
Use
During data harmonization the system automatically creates work items to indicate that a specified step of the harmonization process needs manual interaction. A work item contains a description of the issue that occurred and the affected harmonized object or source object.
The system collects all work items for each object type in one worklist per work item type.
Example: For a new source product, the system has found a similar harmonized product to which the source product could be assigned. However, the similarity score is not above the threshold that is defined for automatic harmonization in Customizing. The assignment has to be done manually.
If you do not want to process a work item, you can also disregard it. The system does not resolve the work item; the problem that caused the work item can persist.
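The decision in the example above (automatic mapping versus a manual work item) can be sketched as follows. This is a hypothetical illustration: the threshold value and all IDs are invented, and the real thresholds come from Customizing, not from a constant.

```python
# Sketch of the automatic-mapping decision: if the best similarity score
# stays below the Customizing threshold, a "Map Manually" work item is
# created instead of an automatic assignment. Values are invented examples.

AUTO_MAP_THRESHOLD = 0.90  # assumed Customizing value, for illustration

def harmonize(source_id, candidates, threshold=AUTO_MAP_THRESHOLD):
    """candidates: harmonized object ID -> similarity score.
    Returns ('mapped', id) or ('work_item', reason)."""
    if not candidates:
        return ("work_item", "01 Map Manually: no matching harmonized object")
    best_id, best_score = max(candidates.items(), key=lambda kv: kv[1])
    if best_score >= threshold:
        return ("mapped", best_id)
    return ("work_item", "01 Map Manually: score below threshold")

print(harmonize("CHILI_CHOC_1254", {"GP00001": 0.95}))  # ('mapped', 'GP00001')
print(harmonize("03-TAS-chil", {"GP00001": 0.72}))      # work item is created
```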
The following table provides an overview of the possible work item types and how they can be resolved manually:
Work Item Type Possible Reason Procedure
01 Map Manually
● No matching harmonized object could be determined.
● The minimal score for automatic mapping that is set in Customizing has not been reached.
● Multiple matching harmonized objects were determined.
● The source object has been unassigned from the harmonized object manually.
Map data manually.
02 Confirm Mapping
The Customizing settings require confirmation of the mapping, and the system user who created the mapping does not have sufficient authorization.
Review the mapped objects and confirm the mapping.
03 Revise Object

Attribute values of the harmonized object or the source object that affect dependent data (for example, unit conversions) were changed.

Note: In the standard system, these work items are created in the following situations:
● The base unit of measure (UoM) of a harmonized object to which a source object is assigned has been changed.
● The source object is assigned to a different harmonized object.
In these cases, the unit conversion from the base unit of measure of the harmonized object to the base UoM of the source object has to be revised and adapted, if necessary.
Open the object in edit mode, revise the dependent data, and save. Delete the work item manually.
04 Copy Attribute Values
The system tried to copy a new attribute value to a harmonized object, but the Customizing settings require changes to be approved.
Note: No work item of type Copy Attribute Values is created if the harmonized object is new (that is, it has not yet been saved in the database).
Open the worklist for this work item type, review the current attribute value in the harmonized object, and choose Resolve.
Note: For each work item, the worklist shows the technical name of the affected attribute and the value to be copied.
05 Perform Pending Mapping
The harmonized object to which a source object was mapped could not be locked and the mapping could not be saved.
Retry to perform the mapping or reject it in the worklist.
Note: If you want to use this work item type for manual changes as well, you can set the user-specific parameter Use Work Item Queue accordingly.
06 Perform Pending Update
The source object could not be locked during data import and could not be updated.
Retry to perform the update in the worklist.
07 Release for Global Reporting
A source object that is relevant for global reporting has been assigned to the harmonized object.
Change the value of attribute Status for Global Reporting to Released if you want to use the object in global reporting or to Revoked if not.
08 Recalculate Status for Global Reporting
The harmonized object to which a source object that is relevant for global reporting was assigned could not be locked and the attribute Status for Global Reporting could not be updated.
Retry to perform the update in the worklist.
More Information
Monitoring Work Items [page 254]
Resolving or Disregarding Work Items in Manual Mapping [page 256]
5.4 Units of Measure in Data Harmonization
Use
Alternative Units of Measure
Each product can have one base unit of measure (UoM) and other alternative UoMs. In data harmonization, alternative UoMs for the dimensions mass and volume are stored with the product. If additional UoMs are provided with the source data, these can be stored as name/value pairs that are associated to the source product.
For harmonized products you can define the conversion from the base UoM to the alternative UoMs for dimensions mass and volume. To do so, you enter the denominator and numerator for the conversion into the alternative UoM in the harmonized product.
Base Unit Conversions
If a source product is mapped to a harmonized product, the base units of measure (UoM) can be different. In that case, you define a conversion that is used to convert the quantity in the base UoM of the source product to the base UoM of the harmonized product.
The unit conversion is associated to the source product.
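The base unit conversion described above can be sketched with a numerator/denominator pair, as used for the alternative UoM definitions earlier in this section. This is an illustrative sketch under the assumption that the converted quantity is quantity × numerator / denominator; the values are invented.

```python
from fractions import Fraction

# Sketch of converting a quantity from the base UoM of a source product
# to the base UoM of the harmonized product it is mapped to.
# Assumed semantics: quantity_harmonized = quantity_source * num / den.

def convert_base_uom(quantity, numerator, denominator):
    # Fraction avoids floating-point error in the ratio itself.
    return float(Fraction(numerator, denominator) * Fraction(quantity))

# Invented example: source base UoM "piece", harmonized base UoM "carton"
# of 12 pieces, so 36 pieces correspond to 3 cartons.
print(convert_base_uom(36, 1, 12))  # 3.0
```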
More Information
Object Types Used in Data Harmonization [page 211]
Name/Value Pairs [page 215]
Editing Harmonized Objects [page 260]
5.5 Name/Value Pairs
The data that has been imported into data harmonization may contain attributes that cannot be mapped to the internal attribute structures.
The system stores the names and values of these additional attributes in dynamic name/value pairs. Additionally, attributes of imported data can be explicitly mapped into name/value pairs with the mapping instruction sets.
You can display the name/value pair data with the source object.
Name/value pairs are not harmonized.
If you also want to use these attributes in harmonized objects, you first have to enhance the attribute structure. After that, you can transfer the name/value pairs into static attributes using report Transfer Name/Value Pairs to Object Attributes (/DDF/FDH_COPY_NVP_TO_ATTR).
Attributes that are stored as name/value pairs can also be used as source attributes for the derivation of attribute values for harmonized objects or source objects.
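The split between internal attributes and name/value pairs can be sketched as follows. This is a hypothetical illustration: the internal attribute names and the imported fields are invented, and the real mapping is controlled by the mapping instruction sets.

```python
# Sketch: attributes that fit the internal structure become static
# attributes; everything else is kept as name/value pairs attached to
# the source object. Name/value pairs are stored but not harmonized.

INTERNAL_ATTRIBUTES = {"product_id", "description", "base_uom"}  # invented

def split_attributes(imported):
    static, name_value_pairs = {}, {}
    for name, value in imported.items():
        if name in INTERNAL_ATTRIBUTES:
            static[name] = value
        else:
            name_value_pairs[name] = value  # kept with the source object
    return static, name_value_pairs

static, nvp = split_attributes({
    "product_id": "03-TAS-chil",
    "description": "Chili Choc",
    "shelf_position": "A-17",  # unknown to the internal structure
})
print(nvp)  # {'shelf_position': 'A-17'}
```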
5.6 Search Strategy
Use
The search strategy defines how the system performs a search for objects in the database where the object data is stored.
Depending on whether or not you use an SAP HANA database, the available search functions differ.
Prerequisites
Search strategies are defined in Customizing as sequences of functions for the purpose Search Functions FDH_SEARCH. The functions define which object attributes can be used as search criteria and which operators are available.
You define search sequences in Customizing for Cross-Application Components under Demand Data Foundation > Data Harmonization > Technical Settings > Define Sequences for Data Harmonization.
Features
In the standard system the following search strategies are defined:
● Default search (All Attributes) for the object types Product and Location
  This search strategy finds objects using the standard search on the database.
● Fuzzy-Enabled Search for the object types Product and Location
  With this search strategy, you can use the fuzzy search for text-like attributes by choosing the operator contains. The fuzzy search also finds similar words or phrases and can therefore cope with spelling mistakes, for example. For all attributes that are not text-like and for all other operators, the default search is performed.
● Search Work Items by Object Attributes for the object type Work Item
  This search strategy finds work items that exist for objects with the specified object attribute. It is only available on user interfaces that contain work items.
● Search Including Data Delivery Agreement
  This search strategy allows you to use a data delivery agreement as a search criterion instead of the attributes Data Provider, Data Origin, and Context. The system determines the Data Provider, Data Origin, and Context assigned to the selected data delivery agreement and uses them as search criteria.
● Search Manually Changed Harm. Record
  This search strategy finds harmonized records for which attribute values have been changed or protected manually.
● Search Manually Changed Source Record
  This search strategy finds source records for which attribute values have been changed manually.

  Note: It is currently not possible to change source records manually with the standard harmonization UIs.
● Search by Distributed Source Record Value
  You can use this search strategy to get an overview of the assignment of source objects to harmonized objects, based on the attribute values (of source objects) of a specified attribute. You can use this overview, for example, to evaluate whether it makes sense to perform the harmonization based on the specified attribute. This search strategy finds records that meet the following criteria:
  ○ The system determines all attribute values of existing source record attributes. Initial values are also taken into account.
  ○ For all source objects that have the same attribute value (for the specified attribute), the system checks how many harmonized objects these source objects are assigned to.
  ○ If all source objects are assigned to the same harmonized object, the system ignores those objects.
  ○ If the source objects are assigned to multiple harmonized objects, the system adds the records to the result list.
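The criteria above amount to a group-and-filter pass, which can be sketched as follows. This is an illustrative sketch only; the attribute name, IDs, and data are invented.

```python
from collections import defaultdict

# Sketch of "Search by Distributed Source Record Value": group source
# objects by the value of one attribute, then report only the values whose
# source objects are spread over multiple harmonized objects.

def distributed_values(source_objects, attribute):
    groups = defaultdict(set)  # attribute value -> harmonized object IDs
    for obj in source_objects:
        groups[obj.get(attribute)].add(obj["harmonized_id"])
    # Values whose source objects all map to one harmonized object are ignored.
    return {value: ids for value, ids in groups.items() if len(ids) > 1}

sources = [
    {"brand": "Taste", "harmonized_id": "GP00001"},
    {"brand": "Taste", "harmonized_id": "GP00002"},  # spread over two objects
    {"brand": "Other", "harmonized_id": "GP00003"},
]
print(distributed_values(sources, "brand"))  # only 'Taste' is reported
```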
5.7 User-Specific Parameters
You can use user-specific parameters on the user interfaces (UIs) for data harmonization. These settings are stored with your user profile and are always applied when you open a UI. Depending on the UI, not all user-specific parameters are available.
The following table lists all parameters that can be available.
Parameter Description
Aggregate Messages If you aggregate the messages, only the most important messages for actions performed on the UI are displayed.
Data Origin for Labels If specific descriptions are defined for specific attributes depending on data origins in Customizing, you can change the labels of fields or columns on the UI accordingly.
Example: One of your retailers uses the term “article” instead of “product”. The administrator has defined alternative descriptions that use “article” for all attributes whose description contains the term “product” (see Defining the Appearance of Object Attributes [page 221]). You can then easily switch to the terminology used by the retailer.
Field Length in Forms You can define the display length of input fields in forms. The system calculates the display length depending on the technical length of the displayed attribute.
Default Score You can specify the default score that is used for determining mapping proposals.
Use Work Item Queue If you select this checkbox, the system creates work items for specific pending activities that could not be performed during harmonization, because the involved objects were locked. The system then stores these work items in a queue that can be processed automatically or manually.
Highlight Color You can define the color in which explicit values are highlighted in a derivation instruction set.
Columns for Source Values You can choose how many columns are used to display source attribute values in a derivation instruction set and which information is displayed there.
Parameter Description
Search Table If you want to search in the table that displays derivation instructions for a specific set, you can choose whether the system also searches in collapsed nodes, or only in expanded nodes. To improve performance, search only expanded nodes if the derivation instruction set contains a huge number of instructions.
5.8 Support of Global Reporting in Data Harmonization
Source objects can be marked as Relevant for Global Reporting when they are uploaded into SAP Business Warehouse. When a source object that is relevant for global reporting is assigned to a harmonized object, the harmonized object is also marked as relevant for global reporting.
You review and release harmonized objects that are relevant for global reporting.
NoteProducts can contain additional attributes that provide attribute values that are used globally (Manufacturer, Brand, Product Category, and Product Subcategory). These attributes can also be harmonized.
Status for Global Reporting:

● Not Relevant
  The object is harmonized using the standard harmonization processes.
● Relevant
  Source object: The source object is relevant for global reporting.
  Harmonized object: A work item of work item type Release for Global Reporting exists. The harmonized object has to be released. You can do this either by changing the status in the global object or by using the worklist.
● Released
  Source object: Not available.
  Harmonized object: You set this status manually to make the object available for global reporting.
● Revoked
  Source object: Not available.
  Harmonized object: This status is set automatically when all source objects that are relevant for global reporting have been unassigned. You can also set the status manually if you want to exclude the object from global reporting.
5.9 Configuring and Administrating Data Harmonization
Use
The following table shows the tasks you have as an administrator for data harmonization, and explains in which cases and where you make the individual settings. Depending on your system landscape, you make Customizing settings in the development system, and all other settings in the production system.
Task: Set up data harmonization in Customizing
When: Implementation phase
Where: Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization

Task: Define appearance of object attributes on user interfaces
When: Implementation phase
Where: Customizing activity Cross-Application Components > Demand Data Foundation > Data Harmonization > Technical Settings > Make Attribute-Specific Settings
When: Implementation phase; whenever a new data origin is introduced in the system for which you do not want to use the default settings
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization /DDF/FDH_SETUP

Task: Set up automatic harmonization
When: Implementation phase
Where: Customizing activity Cross-Application Components > Demand Data Foundation > Data Harmonization > Technical Settings > Define Sequences for Data Harmonization
When: Implementation phase; whenever a new data origin is introduced in the system for which you do not want to use the default settings
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization /DDF/FDH_SETUP

Task: Define priorities for copying attribute values
When: Whenever you want to change the configuration for object attributes (for example, because new object attributes were implemented); whenever a new data origin is introduced in the system for which you do not want to use the default settings
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization /DDF/FDH_SETUP

Task: Define restrictions for harmonized attribute values
When: Implementation phase; whenever you want to change which object attributes are considered (for example, because new object attributes were implemented)
Where: UI Select Attributes; UI Define Allowed Attribute Values

Task: Define harmonization groups and assign them to data delivery agreements
When: Implementation phase; whenever a new data delivery agreement is introduced in the system for which you do not want to use the default settings
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization /DDF/FDH_SETUP

Task: Schedule background jobs
When: Implementation phase
Where: Scheduling Background Jobs for Data Harmonization [page 235]

Task: Activate conversion exits for Product and Location
When: Implementation phase
Where: Activate Conversion Routines for MATERIAL, EANUPC, GLN and DATA_LOAD_ID [page 236]

Task: Activate the action log
When: Implementation phase
Where: Activate the Action Log [page 237]

Task: Activate DOASO
When: Implementation phase
Where: Activate Data Origin per Attribute for Source Objects [page 238]

Task: Activate distinct ABBIDs
When: Implementation phase
Where: Activate Distinct ABBIDs in Data Harmonization UIs [page 239]

Task: Activate the action log and DOASO for user-created object types
When: Implementation phase
Where: Activate the Action Log and DOASO for User-Created Object Types [page 240]

Task: Maintain general harmonization parameters
When: Implementation phase
Where: Maintaining General Harmonization Parameter [page 241]
Note: Access to Web user interfaces (Web UIs) is granted by authorization roles (PFCG roles) that contain the Web UI in the assigned menu.
To be able to access Web UIs, for example, from the user menu or in the SAP NetWeaver Business Client (NWBC), you must have a corresponding authorization role assigned to your system user.
For more information, see Roles for SAP Demand Signal Management, version for SAP BW/4HANA [page 530].
5.9.1 Set Up Data Harmonization
Use
You use transaction Set Up Data Harmonization /DDF/FDH_SETUP to make the following configuration settings:

1. Define the appearance of object attributes on user interfaces
2. Set up automatic harmonization
3. Define priorities for copying attribute values
4. Define harmonization groups and assign them to data delivery agreements

On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization.
More Information
Defining the Appearance of Object Attributes [page 221]
Setting Up Automatic Harmonization [page 223]
Defining Priorities for Copying Attribute Values into Harmonized Objects [page 224]
Defining Harmonization Groups [page 230]
5.9.2 Defining the Appearance of Object Attributes
Use
For each object type, you define how object attributes are displayed on the following Web Dynpro user interfaces:
1. Mapping UI
2. Detail UI for source objects
3. Detail UI for harmonized objects
4. Search UI for source objects
5. Search and mass change UI for harmonized objects
6. POWL
You can define different settings for each data origin that is available in the system. If you do not enter a data origin for a specific setting, this setting is used as the default for all data origins for which no settings are defined.
Note: You can use Business Add-In BAdI: Default Values for Set Up of Data Harmonization (/DDF/BADI_FDH_HARMON_DEFAULTS) to define default values that are used when no values have been defined under Set Up Data Harmonization (/DDF/FDH_SETUP).
Prerequisites
1. You have defined object types for which harmonization is turned on in Customizing for Cross-Application Components under Demand Data Foundation > Data Harmonization > Technical Settings > Define Object Types (see Object Types Used in Data Harmonization [page 211]).
2. You have defined attribute bundles, groups, and sets that control which attributes are displayed on the Web Dynpro user interfaces, and with which properties, in Customizing for Cross-Application Components under Demand Data Foundation > Data Harmonization > Technical Settings > Make Attribute-Specific Settings.
3. You have defined data origins. To define data origins, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Data Origins on the SAP Easy Access screen, or use transaction Define Data Origins /DDF/DORIGIN.
Activities
1. On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization /DDF/FDH_SETUP.
2. On the view Assign Attribute Sets to Attribute Values, assign attribute sets to attribute values.
   For example, you can use different attribute sets for different product categories. For each object type, you select one attribute that is used to select the relevant attribute set, and you assign the attribute sets to specific values. If a data origin uses specific attribute values that are not covered by the standard values, or if the same value means different things for different data origins, you can make this assignment dependent on the data origin.

   Example: The attribute Product Category is defined as the selecting attribute. You are working with the product categories Food and Beverages. You define different attribute sets FOOD and BEVERAGE with the relevant attributes and assign these attribute sets to the product categories. Data origin 1 uses the attribute value Liquids for the product category Beverages. You assign the attribute set BEVERAGE to the attribute value Liquids.

3. On the view Select Object Type, select the object type.
4. On the view Select Object Attributes, select an attribute.
5. On the view Define Alternative Descriptions of Attributes, define alternative descriptions for attributes.
   For each data origin, you can define how attributes are named in this area. If you then choose a specific data origin in the user-specific parameters, all field labels are displayed using the description defined for this data origin (see User-Specific Parameters [page 217]).
5.9.3 Setting Up Automatic Harmonization
Use
For each object type, you define how automatic data harmonization is performed.

You can define different settings for each data origin that is available in the system. If you do not enter a data origin for a specific setting, this setting is used as the default for all data origins for which no settings are defined.
Note: You can use Business Add-In BAdI: Default Values for Set Up of Data Harmonization (/DDF/BADI_FDH_HARMON_DEFAULTS) to define default values.
Prerequisites
You have defined object types for which harmonization is turned on in Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization > Technical Settings > Define Object Types (see Object Types Used in Data Harmonization [page 211]).

You have defined mapping instructions that control how attributes are imported, converted, and exported before and after harmonization in Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization > Technical Settings > Define Object Types.

You have defined the sequences for automatic harmonization in Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization > Technical Settings > Define Sequences for Data Harmonization.

You have defined data origins. To define data origins, on the SAP Easy Access screen choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Data Origins, or use transaction Define Data Origins /DDF/DORIGIN.
SAP Demand Signal Management, version for SAP BW/4HANAData Harmonization C O N F I D E N T I A L 223
Activities
1. On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization /DDF/FDH_SETUP.
2. On the view Select Object Type, select the object type for which you want to change the settings.
3. On the view Define General Parameters, define the following:
   ○ Specify whether the system automatically harmonizes data provided by a selected data origin at all.
   ○ Specify whether the system automatically creates new harmonized objects for all objects that cannot be mapped to an existing harmonized object.
   ○ Specify whether the system creates a work item for approval whenever data provided by the specified data origin is mapped automatically.
4. On the view Set Up Automatic Harmonization, select the harmonization sequence that is used to determine similar objects for objects provided by the specified data origin, and define the minimal score that a record must reach to be included in automatic harmonization.
   ○ The sequence that is defined for the target record type Source is only used to determine mapping proposals on the Web user interface for manual mapping.
   ○ The sequence that is defined for the target record type Harmonized is used to determine mapping proposals on the Web user interface for manual mapping and for the automatic mapping of source objects to harmonized objects.
5. On the view Define Weighting for Harmonization Score, define how the individual functions of the selected harmonization sequence are weighted for the calculation of the score.
6. If you want to use different mapping instructions for attributes provided by specific data origins, assign the mapping instruction to the data origin:
   1. On the view Select Mapping Instruction Sets, select the set that contains the mapping instruction you want to use.
   2. On the view Assign Mapping Instructions to Data Origins, choose the data origins using the input help and assign a mapping instruction. If you do not enter a data origin for a specific setting, this setting is used as the default for all data origins for which no settings are defined. For each data origin, you can use only one mapping instruction, but you can assign one mapping instruction to multiple data origins.
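The score weighting described above can be sketched as a weighted average of the partial scores returned by the functions of the harmonization sequence. This is an illustrative sketch only; the function names, weights, and partial scores are invented, and the actual functions and weights come from Customizing.

```python
# Sketch: combine the partial similarity scores of the sequence functions
# into one harmonization score using per-function weights (invented values).

def weighted_score(partial_scores, weights):
    total_weight = sum(weights.values())
    return sum(partial_scores[name] * w for name, w in weights.items()) / total_weight

# Hypothetical sequence: a fuzzy description match weighted 1,
# an exact EAN match weighted 3.
score = weighted_score(
    {"description_fuzzy": 0.80, "ean_exact": 1.00},
    {"description_fuzzy": 1, "ean_exact": 3},
)
print(round(score, 2))  # 0.95
```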
5.9.4 Defining Priorities for Copying Attribute Values into Harmonized Objects
Use
For each object type, you define the priorities with which attribute values are copied from source objects into harmonized objects.

If you also define a derivation instruction for harmonized objects for a specific attribute, the attribute values are derived before the priorities are taken into account.

You can define different settings for each data origin that is available in the system. If you do not enter a data origin for a specific setting, this setting is used as the default for all data origins for which no settings are defined.
224 C O N F I D E N T I A LSAP Demand Signal Management, version for SAP BW/4HANA
Data Harmonization
Note: You can use Business Add-In BAdI: Default Values for Set Up of Data Harmonization (/DDF/BADI_FDH_HARMON_DEFAULTS) to define default values that are used when no settings have been made under Set Up Data Harmonization (/DDF/FDH_SETUP).
Prerequisites
You have defined and activated object types that support source and harmonized record types, for which harmonization is turned on and the Copy Attribute Values option is active, in Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization > Technical Settings > Define Object Types (see Object Types Used in Data Harmonization [page 211]).

You have defined data origins. To define data origins, on the SAP Easy Access screen choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Data Origins, or use transaction Define Data Origins /DDF/DORIGIN.
Activities
1. On the SAP Easy Access screen, choose Cross-Application Components Demand Signal ManagementData Harmonization Set Up Data Harmonization , or use transaction Set Up Data
Harmonization /DDF/FDH_SETUP.2. On view Select Object Type, select the object type for which you want to change the settings.3. On view Prioritize Data Origins, define the default priorities of data origins that are used if no specific
settings are used for the individual attributes.If attribute values are provided by multiple data origins, the value with the highest priority is copied to the harmonized object during automatic harmonization. If at a later point in time a value is provided by a data origin with an even higher priority, the attribute value of the harmonized object is overwritten.You can also exclude a data origin from attribute value harmonization in general. This means that attribute values provided by this data origin are never copied to a harmonized object. If, for example, the data quality of a specific data origin is insufficient, you can exclude the data origin or define a low priority.You can decide whether initial values of attributes are taken into account. You can prevent the copy of initial values per data origin by selecting the No Initials checkbox. In this case, initial values are ignored even if they have the highest priority. The system determines the next valid source record with the highest priority and copies its value.
4. On view Prioritize Data Origins per Attribute, narrow down the prioritization for specific attributes.
You can define the data origin priorities for each attribute that is contained in the object structure.
5. On view Prioritize Data Origins per Attribute Value, narrow down the prioritization for specific attribute values.
The priorities defined at data origin or attribute level can be overruled by priorities defined at attribute value level. You can enter only negative values, which reduce the priority of mapping the attribute value to the level 2 harmonized record.
SAP Demand Signal Management, version for SAP BW/4HANA: Data Harmonization (CONFIDENTIAL, page 225)
6. If you change the configuration in the production system, make sure that the new priorities are applied to the harmonized objects by using report Recalculate Priorities of Data Origins for Attribute Values /DDF/FDH_RECALC_ATTRIB_VALUES.
On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Recalculate Priorities per Attribute, or use transaction Recalculate Priorities per Attribute /DDF/FDH_RECALC_ATTV.
7. On view Define Work Item Creation for Attribute Changes, define in which cases the system creates a work item for manual approval whenever the value of a specified attribute of a harmonized object is changed.
Note: Work items of type Copy Attribute Values are created only if the attribute values of saved harmonized objects are changed; changes to new harmonized objects are not taken into account.
You have the following options:
○ A work item for manual approval is always created.
○ A work item is only created if the priority of the data origin that provides the attribute value is lower than (that is, has a number equal to or higher than) the specified threshold value. For example, if you specify the threshold 20, the system creates work items for approval for data origins with priority 20 or higher, whereas changes caused by data origins with a priority up to 19 are performed automatically.
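The priority and threshold rules described in steps 3 to 7 can be sketched as follows. This is an illustrative Python model, not SAP code: it assumes a lower priority number means a higher priority, and that a data origin missing from the priority table is excluded from harmonization.

```python
def pick_value(candidates, priorities, no_initials=()):
    """candidates: list of (data_origin, value) pairs from source records.
    Returns the value provided by the highest-priority data origin."""
    best = None
    for origin, value in candidates:
        prio = priorities.get(origin)
        if prio is None:
            continue  # data origin excluded from attribute value harmonization
        if value in ("", None) and origin in no_initials:
            continue  # No Initials: initial values from this origin are ignored
        if best is None or prio < best[0]:
            best = (prio, value)
    return best[1] if best else None

def needs_work_item(origin_priority, threshold):
    """Step 7: a work item is created if the providing origin's priority
    number is equal to or higher than the threshold (lower priority)."""
    return origin_priority >= threshold
```

For example, with priorities {"POS": 10, "PANEL": 20} (hypothetical origin names) and No Initials set for POS, an empty POS value is skipped and the PANEL value wins.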
5.9.5 Defining Restrictions for Harmonized Attribute Values
Use
For some attributes, only specific attribute values are allowed in harmonized objects; for example, manufacturer names are standardized.
To make sure that only these allowed attribute values are used, you perform the following steps:
1. Select the attributes for which you want to restrict the values.
You can also define hierarchies of attributes whose values depend on each other, for example, manufacturers and brands.
2. Define the allowed values.
3. Define a derivation instruction set for the derivation of harmonized attribute values that contains the restricted attributes.
More Information
Selecting Attributes for Value Restrictions [page 227]
Defining Allowed Attribute Values [page 229]
Defining Derivation Instruction Sets for Harmonized Attribute Values [page 245]
5.9.5.1 Selecting Attributes for Value Restrictions
Use
For each object type, such as Product or Location, you select specific attributes for which you want to restrict the allowed attribute values.
You can define hierarchies for attributes that depend on each other, for example, you define a hierarchy with the attribute Manufacturer with the subordinate attribute Brand.
The standard system landscape contains different systems, for example, a development system in which the Customizing settings and the configuration is done, and a production system in which the company’s data is stored and processed.
Changing the configuration in the production system is in most cases prohibited because it can lead to severe data inconsistencies. However, if for example you introduce a new attribute for an object type, you may want to include this attribute in harmonization in a timely manner.
To make that possible, you define attribute hierarchies in the development system and load them into the production system using a remote function call (RFC). You can review the impact the changes would have on existing data before activating the new hierarchies.
Note
Attribute values for the following attributes cannot be harmonized:
● Attributes that are defined in Customizing as the external key for the object type
● Attributes that are used to identify the object record, for example, Data Provider, Data Origin, Context
● Administrative data
Note
For more information on the roles, see Roles for SAP Demand Signal Management, version for SAP BW/4HANA [page 530].
Activities
1. Open the Web UI Select Attributes for Value Restriction for the object type for which you want to change the settings in a system in which you are able to change the configuration.
2. In the Work Area, enter all attributes for which you want to restrict the allowed attribute values and define their hierarchies.
You can use the list of available attributes that is displayed on the UI as a repository.
Caution
Attributes that depend on each other hierarchically must have the same settings regarding priority and work item creation for the harmonization of attribute values (see Defining Priorities for Copying Attribute Values into Harmonized Objects [page 224]).
3. Check in the Comparison assignment block the impact that the configuration change can have on existing data.
Note
In the standard system landscape, there is no object data available in the source system that would need to be recalculated.
4. Open the Web UI Set Source System in the production system and select, as the source system, the development system from which the configuration is to be loaded. Select the Select Attributes for Value Restriction checkbox and save.
5. Open the Web UI Select Attributes for Value Restriction for the object type for which you want to change the settings.
○ In the assignment block Active Configuration, you see the selection of attributes that is active in the current system.
○ In the assignment block Configuration in Source System, you see the configuration that is retrieved from the source system. This configuration replaces the current configuration once you activate it.
6. Check the impact that an activation of the new selection would have on existing data in the Comparison and Impact Description assignment blocks.
Whenever the selection of attributes for value harmonization is changed, the related attribute values of harmonized objects are recalculated when you save. Additionally, changes can have the following impact on existing data:
○ Allowed attribute values are deleted together with their hierarchy, with the following exceptions:
○ If you remove only the root node of a hierarchy, all allowed values on the lower levels are kept.
○ If you remove a node in the middle of a hierarchy or the last entry, all allowed values on higher levels are kept; entries on lower levels are deleted.
○ If you insert an element in a hierarchy, entries on higher levels are kept; entries on lower levels are deleted.
Note
If you move an attribute up or down in a hierarchy, the system behaves as if a node was removed and then inserted.
○ If an attribute is added to the selection, the attribute values of all harmonized objects are cleared before they are recalculated. This includes attribute values that were changed manually. You can use Business Add-In BAdI: Default Values for Restricted Attributes to define the default values for all restricted attributes for which no allowed value can be determined.
7. Make any changes to the configuration in the source system and reload the configuration to the production system afterwards.
8. If you are sure that the impact on the current data does not lead to severe data inconsistencies, choose Prepare Activation.
The system checks whether processes are running in the system that access master data that needs to be recalculated during activation of the new configuration. If no processes are running, the system locks all affected object records. No harmonization processes can be carried out from this point until the activation is finished.
9. After the system has successfully locked all object records, choose Activate.
The system starts a background job in which the attribute values of the affected harmonized objects are harmonized anew based on the new attribute hierarchies.
All changes are traced in the application log for subobject /DDF/FDH_ADMIN of object /DDF/FDH.
10. After the recalculation is finished, the locks are removed.
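The retention rules for changing an attribute hierarchy (step 6 above) can be sketched as follows. This is an illustrative model, not SAP code: a hierarchy is represented as an ordered list of attribute levels from root to leaf, and dropping a level discards the allowed values stored on the dropped levels. Moving an attribute behaves like a removal followed by an insertion.

```python
def remove_level(levels, index):
    """Apply the documented rules when a node is removed from the hierarchy."""
    if index == 0:
        # Removing only the root node: allowed values on lower levels are kept.
        return levels[1:]
    # Removing a middle node or the last entry: higher levels are kept,
    # the removed level and everything below it is deleted.
    return levels[:index]

def insert_level(levels, index, new_level):
    """Inserting an element: higher levels are kept, lower levels are deleted."""
    return levels[:index] + [new_level]
```

For example, removing Brand from the hierarchy Manufacturer > Brand > SubBrand keeps only the Manufacturer level, while removing the root Manufacturer keeps Brand and SubBrand.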
Result
You have selected the attributes for which the allowed attribute values are restricted. You can now define allowed attribute values for these attributes.
More Information
Defining Allowed Attribute Values [page 229]
Defining Restrictions for Harmonized Attribute Values [page 226]
Defining Derivation Instruction Sets for Harmonized Attribute Values [page 245]
Defining Derivation Instructions for Harmonized Attribute Values [page 248]
5.9.5.2 Defining Allowed Attribute Values
Use
For all attributes for which attribute values are restricted, you define allowed attribute values. For example, you define the list of all the manufacturer names that are to be used in harmonized objects.
If you do not define allowed values, the default values are used as defined in Business Add-In BAdI: Default Values for Restricted Attributes.
If a hierarchy is defined for an attribute, you can only define values dependent on superordinate attributes. For example, you define brand names dependent on manufacturers.
Note
If an input help is defined for an attribute using the Business Add-In BAdI: Additional Attribute Checks and Input Help in Harmonization, you can only define values that are provided by the BAdI implementation.
If a user wants to change attributes of harmonized objects for which the values are restricted, only the allowed values are available.
To make sure that the allowed values are taken into account during automatic harmonization, you have to define derivation instructions for the derivation of harmonized attribute values for all restricted attributes.
Prerequisites
You have selected attributes for value restriction.
Note
For more information on the roles, see Roles for SAP Demand Signal Management, version for SAP BW/4HANA [page 530].
Activities
1. Open the Define Allowed Attributes UI for the object type for which you want to define allowed values.
2. Select an attribute in the list of restricted attributes.
3. Enter all allowed values for the selected attribute in the Allowed Attribute Values list.
Note
You cannot delete allowed attribute values that are already in use in a harmonized object or in a derivation instruction set.
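The dependency between allowed values on different hierarchy levels can be illustrated with a small lookup. This is a hypothetical Python sketch with invented sample values, not shipped content: a Brand value is only allowed under the Manufacturer value it was defined for.

```python
# Allowed values keyed by (attribute, value of the superordinate attribute).
# None as parent value marks a root-level attribute.
ALLOWED = {
    ("Manufacturer", None): {"ACME", "Globex"},
    ("Brand", "ACME"): {"Acme Classic", "Acme Pro"},
    ("Brand", "Globex"): {"Globex One"},
}

def is_allowed(attribute, value, parent_value=None):
    """True if the value is in the allowed list for this attribute,
    given the value of its superordinate attribute (if any)."""
    allowed = ALLOWED.get((attribute, parent_value))
    return allowed is not None and value in allowed
```

So "Acme Pro" is accepted under manufacturer "ACME" but rejected under "Globex", mirroring the dependent value lists described above.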
5.9.6 Defining Harmonization Groups
Use
You use the harmonization group to collect objects whose attributes are handled in data harmonization in the same way. For each harmonization group, you can create derivation instruction sets for harmonized objects that define how attribute values of harmonized objects are derived from source attribute values.
Note
You can use Business Add-In BAdI: Default Values for Set Up of Harmonization Groups /DDF/BADI_FDH_ADU_DEFAULT to define default values that take effect when nothing has been defined under Define Data Delivery Agreements /DDF/DDAGR.
Prerequisites
You have defined data delivery agreements. To define data delivery agreements, on the SAP Easy Access screen choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Define Data Delivery Agreements, or use transaction Define Data Delivery Agreements /DDF/DDAGR.
Activities
1. On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization /DDF/FDH_SETUP.
2. On view Define Harmonization Groups, define all harmonization groups that are available in the system.
3. On view Select Object Type, select the object type for which you want to define the settings for harmonization groups.
4. On view Assign Harmonization Groups to Delivery Agreements, assign harmonization groups to data delivery agreements for the selected object type.
You can use the same harmonization group for multiple data delivery agreements, but you can assign only one harmonization group to a data delivery agreement.
When data is uploaded, the system automatically assigns the harmonization group to all source objects that are provided with the specified data delivery agreement.
If you add a harmonization group without specifying a data delivery agreement, this harmonization group is used as the default for all source objects for which no specific settings are defined.
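The assignment rule in step 4 can be sketched as a lookup with a default fallback. This is an illustrative Python model, not SAP code; the entry stored without a data delivery agreement (key None here) acts as the default.

```python
def resolve_group(assignments, delivery_agreement):
    """assignments: data delivery agreement -> harmonization group;
    the entry under None is the default used when no specific
    settings are defined for the agreement."""
    return assignments.get(delivery_agreement, assignments.get(None))
```

With hypothetical names, resolve_group({"DDA_RETAIL": "GRP_POS", None: "GRP_DEFAULT"}, "DDA_PANEL") falls back to "GRP_DEFAULT".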
5.9.7 Maintain Name Value Pair Filters
The Name Value Pair (NVP) filter allows you to manage large volumes of name/value pairs in the Derive Harmonized Product Attributes user interface.
Name/value pairs are created during data loads or by configuration in data harmonization, and they are used as source attributes for the derivation of attribute values for harmonized objects or source objects. The NVP filter allows you to filter the name/value pairs based on user authorization by attribute group.
Prerequisites
You have defined data delivery agreements. To define data delivery agreements, choose on the SAP Easy Access screen SAP Menu Cross-Application Components Demand Signal Management Data UploadDefine Data Delivery Agreements or use transaction Define Data Delivery Agreements /DDF/DDAGR.
Activities
● On the SAP Easy Access screen, choose SAP Menu Cross-Application Components Demand Signal Management Data Harmonization Set Up Data Harmonization or use transaction Set Up Data Harmonization /DDF/FDH_SETUP.
● On view Assign Harmonization Group for NVP Filter, assign harmonization groups for NVP filter for the selected object type.
● Create a new attribute group. On view Assign Attributes to Attribute Group, assign attributes to the group. Assign this attribute group to the harmonization group.
Note
To assign attribute groups to users, you need authorization for the authorization object DH_NVPFLT.
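The filtering behavior can be sketched as follows. This is an illustrative Python model, not SAP code; in the real system the set of authorized attribute groups is determined by authorization object DH_NVPFLT, and the attribute names used here are invented.

```python
def filter_nvps(nvps, group_of_attribute, authorized_groups):
    """Keep only the name/value pairs whose attribute is assigned to an
    attribute group the user is authorized for."""
    return {name: value for name, value in nvps.items()
            if group_of_attribute.get(name) in authorized_groups}
```

A user authorized only for a hypothetical MARKETING group would, for example, see the COLOR pair but not the WEIGHT pair assigned to LOGISTICS.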
Related Information
Name/Value Pairs [page 215]
5.9.8 Reports for Administrating Data Harmonization
Use
The following reports are available for data harmonization.
● Check Customizing for Data Harmonization (report /DDF/FDH_CUSTOMIZING_CHECK, transaction /DDF/FDH_CHECK_CUST): Check whether the Customizing settings for data harmonization are correct.
● Compare Customizing of Remote System with Local System (report /DDF/FDH_COMPARE_IMG, transaction /DDF/FDH_COMPARE_IMG): Compare the Customizing settings in the current client with the settings in a system that is connected using a remote function call (RFC).
● Compare Customizing of Current Client with Client 000 (report /DDF/FDH_COMPARE_IMG_CLNT000, transaction /DDF/FDH_COMP_CUST): Compare the Customizing settings in the current client with the settings in client 000.
● Transfer Name/Value Pairs to Attributes (report /DDF/FDH_COPY_NVP_TO_ATTR, transaction /DDF/FDH_COPY_NVP): Transfer the name/value pairs into static attributes of the structure used in data harmonization.
● Remove Processed Objects from Change Log (report /DDF/FDH_DEL_CHGLOG, transaction /DDF/FDH_DEL_CHGLOG): Remove processed entries from the change log. The change log contains entries for all objects for which the Customizing settings allow the creation of change pointers.
● Remove Entries for Selected Objects from Change Log (report /DDF/FDH_REORG_CHGLOG, transaction /DDF/FDH_REORG_CL): Set selected entries in the change log to Processed.
● Remove Objects Marked for Deletion (report /DDF/FDH_DEL_OBJ, transaction /DDF/FDH_DEL_OBJ): Remove all objects from the database that are marked for deletion.
● Remove Obsolete Harmonized Records (report /DDF/FDH_DEL_OBSOLETE_HARM_REC, transaction /DDF/FDH_DEL_OBS_HRM): Remove obsolete harmonized objects from the database that do not have a source object assigned and have not been changed during the specified number of days in the past.
● Remove Objects by Data Delivery ID (report /DDF/FDH_DEL_REC_BY_DELID, transaction /DDF/FDH_DEL_BYDELID): Delete all source objects that were created during the upload for a specific data delivery ID. Deletion is performed based on records in the change log and can only be performed before change pointers have been removed.
● Remove Selected Objects (report /DDF/FDH_DEL_RECORDS, transaction /DDF/FDH_DEL_REC): Delete selected objects of object types used in data harmonization.
● Remove Assignment of Source Objects (report /DDF/FDH_UNASSIGN_RECORDS, transaction /DDF/FDH_UNASSIGN): Delete the assignment of source objects to harmonized objects.
● Recalculate Priorities of Data Origins for Attribute Values (report /DDF/FDH_RECALC_ATTRIB_VALUES, transaction /DDF/FDH_RECALC_ATTV): Recalculate the priorities of attribute values after the priorities of data origins have been changed.
● Select Objects for Postprocessing (report /DDF/FDH_SELECT_FOR_PPE, transaction /DDF/FDH_SEL4PPE): Select objects for harmonization postprocessing; during harmonization postprocessing, the selected objects are harmonized.
● Start Postprocessing (report /DDF/FDH_START_PPE, transaction /DDF/FDH_START_PPE): Start the harmonization postprocessing that calculates harmonized objects.
● Start Resolution of Work Items (report /DDF/FDH_START_WI_RESOLVE, transaction /DDF/FDH_START_WIR): Start the resolution of work items that were created during data harmonization.
● Cancel Data Harmonization Process Instances (report /DDF/FDH_CANCEL_PROC_INS, transaction /DDF/FDH_CANCEL_PROC): Cancel process instances that have been created by data harmonization. This is only relevant if direct update is not activated.
● Find Source Attribute Values Assigned to Multiple Harmonized Objects (report /DDF/FDH_FIND_SRC_ATTV_MULT_HR, transaction /DDF/FDH_FIND_SRCATV): Get an overview of the assignment of source objects to multiple harmonized objects, based on the attribute values of a specified attribute and a selected object type. You can use this overview, for example, to evaluate whether it makes sense to perform the harmonization based on the specified attribute. For source objects assigned to multiple harmonized objects, the report shows the object IDs of these source objects with the ID of the assigned harmonized object and the attribute value.
● Create a New Data Delivery for Data Harmonization (report /DDF/FDH_CREATE_NEW_DELIVERY, transaction /DDF/FDH_NEW_DELIVER): Create new data deliveries that are used to integrate the data changed by data harmonization into the standard upload process. This is only relevant if direct update is not activated.
● Release Data Deliveries for Harmonization (report /DDF/FDH_RELEASE_DELIVERIES, transaction /DDF/FDH_REL_DELIVER): Release data deliveries that are used to integrate the data changed by data harmonization into the standard upload process. This is only relevant if direct update is not activated.
● Add Selected Objects to Change Log (report /DDF/FDH_SELECT_FOR_CHGLOG, transaction /DDF/FDH_SEL4CHGLOG): Add selected objects to the change log for further processing.
● Revoke Manual Changes of Harmonized Object Attribute Values (report /DDF/FDH_REVOKE_MANUAL_CHANGES, transaction /DDF/FDH_REVOKE_MCHG): Revoke manual changes of attribute values for selected harmonized objects.
● Revoke Manual Changes of Source Object Attribute Values (report /DDF/FDH_REVOKE_MANUAL_CHG_SRC, transaction /DDF/FDH_REVOKE_SRC): Revoke manual changes of attribute values for selected source objects.
● Delete Obsolete Action Logs (report /DDF/FDH_DEL_OBSOLETE_ACT_LOG, transaction /DDF/FDH_REORG_ALOG): Delete action logs for harmonization object types for which the action log function was disabled in the object type definition.
● Create Records for DOASO for Existing Objects (report /DDF/FDH_CREATE_SAO, transaction /DDF/FDH_CREATE_SAO): Create records of the data origin of attributes for already existing source objects.
● Delete Obsolete Records for DOASO (report /DDF/FDH_DEL_OBSOLETE_SAO, transaction /DDF/FDH_REORG_SAO): Remove obsolete records of the data origin of attributes for source objects.
More Information
For more information, see the documentation of the reports in the back-end system.
Scheduling Background Jobs for Data Harmonization [page 235]
5.9.9 Scheduling Background Jobs for Data Harmonization
You schedule the following background jobs on a regular basis:
● Pass on changes using the standard data upload process if direct update is not activated
If direct update is activated, changes to the standard SAP Business Warehouse InfoObjects are passed on automatically.
To pass on data that is manually changed in data harmonization using the standard data upload process, the system uses data delivery IDs. Whenever data harmonization is started, the latest data delivery of the data delivery agreement with agreement type Harmonization Data is locked. As soon as harmonization is finished, the lock on the data delivery is removed. The data delivery can then be released, the status of the related process instance can be set to Ready, and the Process Scheduler job triggers the execution of the next process steps.
You schedule jobs that create and release data deliveries on a regular basis.
For more information, see Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization > Set Up Data Upload.
● Resolve harmonization work items
The system creates work items whenever objects could not be mapped or changed during harmonization because one of the involved harmonized or source object records was locked. In most cases, these work items can be resolved by the system without user interaction as soon as the locks are removed.
You schedule a job that tries to resolve these work items on a regular basis.
For more information, see Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization > Set Up Resolution of Work Items.
In transaction Set Up Data Harmonization /DDF/FDH_SETUP, you define for each object type how many sessions can be active for this job in parallel and how many records can be processed together.
● Harmonization postprocessing
You schedule a job that starts harmonization.
For more information, see Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization > Set Up Postprocessing.
In transaction Set Up Data Harmonization /DDF/FDH_SETUP, you define for each object type how many sessions can be active for this job in parallel and how many records can be processed together.
Note
Evaluate which settings for the jobs are ideal in your business context.
If direct update is not activated and you want to send the results of data harmonization to SAP Business Warehouse (BW) as quickly as possible, schedule the jobs in short intervals, for example, every 5 minutes. However, because the system creates a separate process instance for each data delivery, this increases the data load on BW significantly. Minimizing the time needed to pass on harmonized data therefore comes at the cost of system performance.
There are other reports available that can be scheduled on a regular basis, for example, to clean up the database. For more information, see Reports for Administrating Data Harmonization [page 232].
5.9.10 Activate Conversion Routines for MATERIAL, EANUPC, GLN and DATA_LOAD_ID
You use report /DDF/FDH_TRANSFER_CONV_EXIT to activate conversion routines that add leading zeros to the data stored in data harmonization for the following standard fields:
● Object type Product: attribute MATERIAL, domain /DDF/FDH_MATERIAL, conversion routine MATLF
● Object type Product: attribute EANUPC, domain /DDF/FDH_GTIN, conversion routine GTINF
● Object type Location: attribute GLN, domain /DDF/FDH_GLN, conversion routine GLNF1
● All object types: attribute DATA_LOAD_ID, domain /DDF/FDH_DATA_LOAD_ID, conversion routine DALOI
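The effect of these routines on stored keys can be approximated as follows. This is a simplified Python sketch, not the actual ABAP conversion exits; it assumes the ALPHA-style rule that purely numeric values are padded with leading zeros to the field length, while other values are stored unchanged.

```python
def add_leading_zeros(value, field_length):
    """Approximate input conversion: pad numeric keys with leading zeros
    to the field length; leave non-numeric keys unchanged."""
    value = value.strip()
    if value.isdigit():
        return value.zfill(field_length)
    return value
```

For example, a material number entered as "4711" would be stored padded to the full 18-character field, while an alphanumeric key such as "ABC-1" stays as entered.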
5.9.11 Activate the Action Log
Use
You can track changes of assignments from source to harmonized records as well as changes of the global reporting status.
Activities
You can activate the action log for object types which are relevant for global reporting or which can be harmonized. When you activate the action log, the system tracks the following actions:
● Assignment of source records to harmonized records
● Unassignment of source records
● Changes of the global reporting status
Tracked actions are displayed on the Action Log tab on the detail screens for the objects of an object type for which the action log is active.
Note
Relevant actions, such as assignments performed before the action log was activated, are not reflected in the log. If you switch off the action log for a data harmonization object type, you can execute the report Delete Obsolete Action Logs (transaction /DDF/FDH_REORG_ALOG) to delete the obsolete action log entries for this object type.
To activate the action log for object types which are relevant for global reporting or which can be harmonized:
1. In Customizing for Cross-Application Components under Demand Data Foundation > Data Harmonization > Technical Settings > Define Object Types, in the Dialog Structure, select the node Define Object Types.
2. To activate the action log for products:
1. In the Obj.Type column, select the line with the value FDH_PROD.
2. In the Act. Log column, select the checkbox.
3. To activate the action log for locations:
1. In the Obj.Type column, select the line with the value FDH_LOC.
2. In the Act. Log column, select the checkbox.
4. Choose Save.
More Information
Activate the Action Log and DOASO for User-Created Object Types [page 240]
5.9.12 Activate Data Origin per Attribute for Source Objects
Use
You can store and view the data origin per attribute for source objects (DOASO) in data harmonization.
You can activate DOASO for each object type in Customizing for Cross-Application Components under Demand Data Foundation > Data Harmonization > Technical Settings > Define Object Types. Select the option DOASO Act for the relevant object type.
For example, for products and locations you perform the following steps:
1. In the Dialog Structure, select the node Define Object Types.
2. To activate DOASO for products:
1. In the column Obj.Type, select the line with the value FDH_PROD.
2. In the column DOASO Act, select the checkbox.
3. To activate DOASO for locations:
1. In the column Obj.Type, select the line with the value FDH_LOC.
2. In the column DOASO Act, select the checkbox.
4. Choose Save.
If you activate DOASO for a data harmonization object type, the following features are available:
● During the import of source records into data harmonization, the imported attribute values are saved in the database.
● On the Manage Products and Manage Locations detail UIs, the Data Origin per Attribute tab provides information about each attribute (name/value pairs are not included) of the source object, its current value, the last imported value, and its value source. The value source can be:
○ None – No attribute value has been imported or derived
○ Imported – Attribute value has been imported
○ Enriched – Attribute value is taken from subobjects or has been added by object-specific logic, for example, the attribute Pending Update
○ Derived – Attribute value has been derived with an Attribute Value Derivation Set for source objects
○ Determined by System – Attribute value has been set by the system, for example, for technical fields
○ Status – Attribute value is determined by the rules of the status of the object
After you have activated DOASO for an object type, you should execute report Create Records for DOASO for Existing Objects (transaction /DDF/FDH_CREATE_SAO) to create DOASO records for the object type.
Note
You should execute the report in all systems and clients in which DOASO is activated and in which harmonization data exists.
If you have deactivated DOASO for a data harmonization object type, you can execute report Delete Obsolete Records for DOASO (transaction /DDF/FDH_REORG_SAO) to delete the obsolete source attribute origin records.
Using DOASO in User-Created Application Configurations
If you want to use DOASO in the UIs for Manage Objects that you have created for your own object types, you must set the parameter FDH_RECORD_TYPE in the application configurations you have created yourself.
More Information
Activate the Action Log and DOASO for User-Created Object Types [page 240]
5.9.13 Activate Distinct ABBIDs in Data Harmonization UIs
Use
You can use distinct Application Building Block IDs (ABBIDs) per object type in Data Harmonization UIs. This reduces the size of the metadata provider during runtime and improves performance. However, you must activate the functionality before it can be used.
When an application is called for a given object type, the metadata provider will be set up for all object types sharing the same ABBID. If no ABBID has been specified for this object type, the metadata provider will be set up for all object types maintained in Customizing and the default ABBID DDF_FDH is used. SAP delivers the ABBIDs DDF_FDH_P for object type FDH_PROD (products) and DDF_FDH_L for object type FDH_LOC (locations).
You can also use your own ABBIDs for these object types or for your own object types. These ABBIDs must be created in SPI in Customizing for Cross-Application Components under Processes and Tools for Enterprise Applications > Settings for BO Framework and Navigation > BO Framework > Define Application Building Blocks.
Activities
To activate distinct Application Building Block IDs (ABBIDs) per object type in Data Harmonization UIs:
● Specify an ABBID in Customizing for Cross-Application Components under Demand Data Foundation > Technical Settings > Define Object Types. You can specify the same ABBID for one or several object types.
5.9.14 Activate the Action Log and DOASO for User-Created Object Types
Use
You can enable the Action Log tab and the Data Origin per Attribute for Source Objects tab in the object UIs that you have created for your own object types. To do this, you must make sure that on the corresponding FPM Floorplan Configuration, the component FPM_TABBED_UIBB has been used to display the tabs that are to be used on the UI.
Activities
To enable the Action Log tab and the Data Origin per Attribute for Source Objects tab in the object UIs that you have created for your own object types:
Step 1
1. Run transaction SE80.
2. Choose package /DDF/UI_FDH_OBJ.
3. Choose WebDynpro FPM Applications > /DDF/WDA_FDH_OBJ_OVP > FPM Application Configurations.
4. Double-click an application configuration that has been created for your own object type.
5. Open the FPM Floorplan Configuration that has been entered in the application configuration.
6. Open the Overview Page Schema. There should be one section where the component FPM_FORM_UIBB_GL2 has been added to display the object attributes.
7. If the component FPM_TABBED_UIBB is not already used in the second section, replace the UIBBs added there with the component FPM_TABBED_UIBB, which will be used to display the UIBBs currently available in this second section.

Note
A similar FPM_TABBED_UIBB should already be used for the Search and mass change: Harmonized Object UI or the Search: Source Object UI for your object type.
The simplest way to create a suitable FPM_TABBED_UIBB for the object UI is to copy one as described in Step 2.
Step 2
1. Run transaction SE80.
2. Choose either package /DDF/UI_FDH_MCH or the package in which your FPM_TABBED_UIBB has been created.
3. Choose WebDynpro > FPM Layout > Component Configurations.
4. Double-click the component configuration that is being used for your own object type.
5. Choose Start Configurator.
6. Choose Copy Configuration.
7. In the popup, specify the new configuration ID and description.
8. Open Tabbed UIBB Schema.
9. Remove the Master UIBB.
10. Remove the tab used for the attributes, as it is not needed on the object UI.
11. Check that the UIBBs displayed in the second section of the FPM Floorplan Configuration specified in step 1 have all been added to this new configuration.
12. In General Settings, deselect the option Master/Detail.
13. Choose GUIBB Settings and, in the drop-down list, select Application Controller Settings.
14. Ensure that the class /DDF/CL_FDH_TABBED_APPCC has been entered for the Web Dynpro Component / Class.
15. Choose OK.
16. Choose Save.
Step 3
1. Run transaction SE80.
2. Open the FPM Floorplan Configuration that was entered in the application configuration in step 1.
3. Open Overview Page Schema.
4. Replace the UIBBs added to the second section with the FPM_TABBED_UIBB created in step 2.
5. Choose Save.
5.9.15 Maintaining General Harmonization Parameter
With this Customizing activity, you maintain the optional Harmonization parameters.
To maintain the settings, proceed as follows:
During initial data loads with a very high data volume, you can set the following parameter to X to stop the BW direct update on an ad hoc basis. Once the harmonization process is complete, you can manually run the BW direct update program to synchronize the records:
SAP_DSIM_HARM_DISABLE_BW_DUP
If the ratio of source records to harmonized records is high (more than 30,000:1), loading all relevant source records into memory to perform attribute value derivation can result in memory and locking issues. In such cases, you can maintain an optimal packet size in the following parameter:
SAP_DSIM_HARM_PACK_SIZE
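The effect of a packet size can be illustrated with a short sketch. This is illustrative only: the chunk size plays the role of the value maintained in SAP_DSIM_HARM_PACK_SIZE, while the function names and the derivation step are hypothetical placeholders, not the actual DDF logic.

```python
# Illustrative sketch of why a packet size helps: source records are
# processed in bounded chunks instead of being loaded into memory all at
# once. The chunk size plays the role of the value maintained in
# SAP_DSIM_HARM_PACK_SIZE; the derivation step itself is a hypothetical
# placeholder, not the actual DDF logic.
def derive_attribute_values(packet):
    # Placeholder for attribute value derivation on one packet of records.
    return [record.upper() for record in packet]

def process_in_packets(source_records, packet_size):
    derived = []
    for start in range(0, len(source_records), packet_size):
        packet = source_records[start:start + packet_size]  # bounded memory
        derived.extend(derive_attribute_values(packet))
    return derived
```

With a smaller packet size, only one chunk of source records is held in memory at a time, at the cost of more processing rounds.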
For more information, see the Customizing documentation for Cross-Application Components under Demand Data Foundation > Data Harmonization > Maintain General Harmonization Parameter.
5.10 Automatic Data Harmonization
Use
When data is uploaded, the system can automatically harmonize all incoming master data.
Depending on the Customizing and configuration settings for the object type, the system harmonizes master data automatically during the upload process.
In the standard system, master data objects are automatically harmonized when the uploaded data is extracted from the Data Acquisition Layer to the Data Propagation Layer in SAP Business Warehouse using process chains (see Steps in the Data Flow for Master Data [page 38] and Process Chains in the Data Flow [page 41]).
Note
If the direct update is not activated, you can disable data harmonization for specific data delivery agreements or data origins.
● If you disable harmonization for a specific data delivery agreement, no data provided with this data delivery agreement is harmonized.
To change data delivery agreements, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Data Delivery Agreements on the SAP Easy Access screen, or use transaction /DDF/DDAGR (Define Data Delivery Agreements).
● If you disable harmonization for a specific data origin, the data is not harmonized automatically. Nevertheless, you can harmonize the objects manually.
To change this setting, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization on the SAP Easy Access screen, or use transaction /DDF/FDH_SETUP (Set Up Data Harmonization).
If direct update is activated, you cannot disable data harmonization for standard objects.
Process
1. For each new source object, the system searches for a harmonized object to which the source object can be mapped.
For this, the system applies derivation instructions for source objects, if any are defined, and then compares the new source object to all existing source objects. If a similar source object is found that is already assigned to a harmonized object, the system assigns the new source object to the same harmonized object.
The system uses the harmonization sequence defined in the configuration settings. It also calculates a score that specifies how well the source and harmonized objects match.
If it cannot find a suitable harmonized object and the Customizing settings allow the creation of new harmonized objects, the system creates a new harmonized object.
2. In the following cases, the system creates a work item:
○ The system cannot determine a suitable harmonized object and cannot create a new one (work item type Map Manually).
○ The system determines more than one suitable harmonized object (work item type Map Manually).
○ The mapping needs manual approval (work item type Confirm Mapping).
○ One of the objects involved is locked (work item type Perform Pending Mapping).
3. A user processes the work item manually, or it is resolved automatically by a background job.
4. The system maps the source object to the harmonized object.
5. The system calculates for each attribute which attribute value is to be copied into the harmonized object as follows:
1. The system checks whether an attribute of the harmonized object is manually changed or protected.
○ If so, the system does not change the attribute value; no values are copied for those attributes.
2. The system checks whether an active derivation instruction set for harmonized objects is defined that has the attribute as target attribute and is defined for the harmonization group of one of the assigned source objects.
○ If so, the system derives the attribute value from source attribute values as defined in the derivation instruction for harmonized objects. Attribute values of source objects that have a harmonization group for which no active derivation instruction set for harmonized objects is defined are ignored.
○ If no active derivation instruction set for harmonized objects is defined, the attribute values of all source objects are taken into account.
3. The system determines the priority of the data origin for all attribute values that were determined in the previous step, and the priority of the data origin of the current attribute value of the harmonized object.
4. The system calculates which attribute value is to be copied into the harmonized object as follows:
○ If the attribute value of a source record is initial and the system was set up to ignore initial values of this data origin, this value is ignored even if its data origin has the highest priority. The system continues with the attribute value of the next data origin.
○ If the data origin of the current attribute value of the harmonized object has the highest priority, the system keeps the attribute value.
○ If the highest priority of all attribute values is as high as the priority of the current attribute value, the system copies this value.
○ If there is an attribute value that has a higher priority than the current attribute value, the system copies the attribute value with the highest priority into the harmonized object.
6. Depending on Customizing settings, the system creates a work item of type Copy Attribute Values for attribute changes to be confirmed.
Note
Work items of type Copy Attribute Values are created if the attribute values for saved harmonized objects are changed; changes to new harmonized objects are not taken into account.
7. The user manually confirms the attribute change.
8. The harmonized object is updated.
9. In the following cases, the system creates new work items:
○ The system cannot update the harmonized object because the harmonized object is locked (work item type Perform Pending Update).
○ The system changes the base unit of the harmonized product, so the unit conversion has to be revised for all source objects that are assigned to the harmonized object (work item type Revise Object).
○ The source object is relevant for global reporting (work item type Release for Global Reporting).
10. The user processes the remaining work items.
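The priority rules in step 5 of the process can be sketched as follows. This is a deliberately simplified model: each candidate value carries its data-origin priority and an ignore-initial flag as plain tuples, whereas the real system reads priorities and the ignore-initial setting from Customizing, so everything here is an illustrative stand-in, not the actual DDF implementation.

```python
# Simplified sketch of the decision in step 5: which attribute value is
# copied into the harmonized object. Each candidate is a
# (value, priority, ignore_initial) tuple for one data origin; in the real
# system, priorities and the ignore-initial setting come from Customizing.
def choose_value(current_value, current_priority, candidates):
    best_value, best_priority = None, -1
    for value, priority, ignore_initial in candidates:
        if value in (None, "") and ignore_initial:
            continue  # initial values of this data origin are skipped
        if priority > best_priority:
            best_value, best_priority = value, priority
    if current_priority > best_priority:
        return current_value  # current origin outranks all candidates: keep
    return best_value  # equal or higher candidate priority: copy
```

Note that on a priority tie between the current value and the best candidate, the candidate value is copied, matching the rule that a value with a priority "as high as" the current one replaces it.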
More Information
Setting Up Automatic Harmonization [page 223]
Scheduling Background Jobs for Data Harmonization [page 235]
Work Items [page 212]
Defining Derivation Instruction Sets for Harmonized Attribute Values [page 245]
Defining Priorities for Copying Attribute Values into Harmonized Objects [page 224]
5.11 Manual Tasks in Data Harmonization
Use
The following table shows the tasks you have as a harmonization user, and which user interfaces are provided.
● Task: Define derivation instruction sets and the individual derivation instructions.
Where: Search: Derivation Instruction Sets for Products or Search: Derivation Instruction Sets for Locations. You can search for an existing set or create a new one.
See also: Defining Derivation Instruction Sets for Harmonized Attribute Values [page 245]

● Task: Monitor the work items and resolve issues.
Where: Worklist (for object type Product) or Worklist (for object type Location).
See also: Monitoring Work Items [page 254]

● Task: Review and edit the mapping of source objects to harmonized objects.
Where: Manage Products or Manage Locations.
See also: Manual Mapping [page 255]

● Task: Create, change, or delete single harmonized objects.
Where: Manage Products or Manage Locations. You can also use the detail page for a single object.
See also: Manual Mapping [page 255]; Editing Harmonized Objects [page 260]

● Task: Display or edit multiple harmonized objects with the assigned source objects.
Where: Manage Products or Manage Locations.
See also: Editing Harmonized Objects [page 260]

● Task: Display multiple source objects. Only for object type Product: you define the conversion of the base unit of measure of the source product to the base unit of measure of the assigned harmonized product.
Where: Manage Products or Manage Locations. You can also use the detail page for a single object.
See also: Editing Source Objects [page 261]

● Task: Review the data origin of each harmonized object attribute.
Where: Detail page for a single object.
See also: Editing Harmonized Objects [page 260]

● Task: Release for global reporting.
Where: Manage Products or Manage Locations. You can also use the detail page for a single object or the work area of the UI for manual mapping.
See also: Support of Global Reporting in Data Harmonization [page 218]
Note
For more information on the roles, see Roles for SAP Demand Signal Management, version for SAP BW/4HANA [page 530].
Note
Access to Web user interfaces (Web UIs) is granted by authorization roles (PFCG roles) that contain the Web UI in the assigned menu.
To be able to access Web UIs, for example, from the user menu or in the SAP NetWeaver Business Client (NWBC), you must have a corresponding authorization role assigned to your system user.
For more information, see Roles for SAP Demand Signal Management [page 530].
5.11.1 Defining Derivation Instruction Sets for Harmonized Attribute Values
Use
You set up the attribute value derivation for harmonized objects by defining derivation instruction sets for the usage AVH. In the first step you define a header.
The header contains the following main parts:
● Information on whether or not the set is active
● Object type and harmonization group
● Hierarchical list of source attributes from which attribute values can be derived
● List of target attributes for which values are derived from source attribute values
Note
Attribute values for the following attributes cannot be derived:
● Attributes that are defined in Customizing as external key for the object type
● Attributes that are used to identify the object record, for example, Data Provider, Data Origin, Context
● Administrative data
Activities
1. Open the Search: Derivation Instruction Sets for Products (or Locations) UI for your object type.
You create a new header or select an existing one and navigate to the header definition.
2. Specify the identifier and description, and the harmonization group for which the derivation instruction set is valid.
You can define several derivation instruction sets for the same object type and harmonization group.
3. Select the source attributes from which the attribute values are derived and define their hierarchy. Although the list is depicted as a flat table, the sequence of source attributes defines the hierarchy.
You can use all object attributes and attributes that are stored as name/value pairs as source attributes.
You cannot change the source attribute hierarchy when there are already derivation instructions defined. You can only add new source attributes or delete the lowest hierarchy node. If you delete a node, all existing derivations for this source attribute will also be deleted.
4. If you want to group specific source attribute values, define groupings using various operators.
If a value fulfills the grouping criteria for multiple groupings, the value is sorted into the first group.
Example
You can define a grouping A* for attribute values starting with A and a second grouping B* for attribute values starting with B. The attribute values are then grouped under three nodes in the derivation instruction set: A*, B*, and one group for all other values.
5. Select the target attributes.
You can use all attributes that are available in the internal structure of the selected object type as target attributes, with regard to the following constraints:
○ If you have already defined an attribute hierarchy for the restriction of attribute values, you have to make sure that all attributes on which a target attribute hierarchically depends are also selected as target attributes.
○ A target attribute can only be used in one active derivation instruction set for harmonized objects for the same object type and harmonization group. If you add a target attribute that is already selected in a different active set for harmonized objects, you cannot activate the set.
6. Define for each target attribute which derivation types are allowed:
○ Explicit value
The value for the target attribute can be entered explicitly in the derivation instruction. For example, from the value “TST Inc.” of source attribute Company, you enter the explicit value “Taste Inc.” for the target attribute Manufacturer.
To allow explicit values, you select the related checkbox.
○ Copied value
The target attribute value is copied from an explicitly specified reference attribute. For example, if the source attribute Company has the value “TST Inc.”, the attribute value for target attribute Product Category is always copied from the source attribute Segment, regardless of the value it contains.
To allow copied values, you add the reference attributes from which values can be copied to the target attribute. You can use only reference attributes that have the same type as the target attribute.
Note
In the Target attribute table, the Compatibility column contains an icon to indicate the compatibility of the assigned reference attributes with the associated target attributes, as follows:
○ A green icon indicates that all the reference attributes are compatible.
○ A yellow icon indicates that there is a potential incompatibility with the reference attributes.
Note
Attributes that are in the same attribute hierarchy for the restriction of attribute values must have the same settings for explicit and copied values.
7. Save the header.
8. Choose Open Derivation Instructions and define the derivation instructions.
9. Activate the derivation instruction set header. The system recalculates all target attributes of harmonized objects accordingly.
Note
The system recalculates the attribute values of harmonized objects in the Harmonization Postprocessing background job.
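The grouping behavior described in step 4 of the activities above — each value is sorted into the first grouping whose criteria it fulfills, and non-matching values form one shared group — can be sketched as follows. Shell-style wildcard patterns stand in for the operators offered in the UI; the group names are illustrative.

```python
import fnmatch

# Sketch of the grouping rule: each source attribute value is sorted into
# the FIRST grouping whose criteria it fulfills; values matching no
# grouping fall into one shared group for all other values. Shell-style
# wildcard patterns stand in for the operators offered in the UI.
def group_values(values, groupings, other="<other values>"):
    groups = {pattern: [] for pattern in groupings}
    groups[other] = []
    for value in values:
        for pattern in groupings:  # order matters: first match wins
            if fnmatch.fnmatchcase(value, pattern):
                groups[pattern].append(value)
                break
        else:
            groups[other].append(value)
    return groups
```

For instance, with groupings A* and B*, the values Apple, Berry, and Cocoa end up under the nodes A*, B*, and the shared group, respectively.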
More Information
Defining Derivation Instructions for Harmonized Attribute Values [page 248]
Defining Restrictions for Harmonized Attribute Values [page 226]
Selecting Attributes for Value Restrictions [page 227]
Scheduling Background Jobs for Data Harmonization [page 235]
5.11.1.1 Defining Derivation Instructions for Harmonized Attribute Values
Use
1. Search for a derivation instruction set and choose Open Derivation Instructions.
The system determines which source attribute values for the selected source attributes are available and displays a two-dimensional matrix:
○ The selected target attributes define the columns.
○ The number of rows is calculated as the sum of the number of source attribute values on each hierarchy level (1 + x1 + x2 + ... + xn, where 1 stands for the Default Values row, xi for the number of source attribute values on hierarchy level i, and n for the maximum number of hierarchy levels). If a grouping is defined for a source attribute, the rows are grouped accordingly.
While the number of columns is always the same (as long as the selection of target attributes is unchanged), the number of rows may grow when new source attribute values are determined in a new data delivery.
2. Define the default values that are inherited by all derivation instructions for which no other value is defined manually.
3. Define the individual instructions for the available source attribute values.
You have the following options:
○ You enter nothing.
If there is an explicit or copied value defined on a higher hierarchy level (including the default value), the attribute value is inherited from the higher level. If there is no derivation instruction defined for this target attribute on a higher hierarchy level, the default value is inherited.
Example
For the combination of the attribute values “TST Inc.” of source attribute Company and “Crnchy” for source attribute Brand, the derivation instruction inherits the value “Taste Inc.” for the target attribute Manufacturer from the derivation instruction on the higher level. In addition, it contains an explicit value “Crunchy” for target attribute Brand.
○ You enter an explicit value.
If the possible values for the target attribute are restricted, you can only select allowed attribute values.
Note
You can enter an empty value by choosing Set Empty Value in the context menu.
○ You select a reference attribute from which the target attribute value is copied.
Derivation instructions that are defined manually are highlighted using a user-specific highlight color. You can filter the derivation instructions that are already defined by source attributes, or you can show only instructions that have values that are defined manually.
Note
For source attribute values that are no longer contained in the source data, the system stores the unused derivation instructions, in case they are needed again in the future. You can choose Unused Instructions to display derivation instructions that have been defined for source data that is no longer available.
4. Save. If the derivation instruction set header is active, the system recalculates all target attributes of harmonized objects accordingly.
Note
The system recalculates the attribute values of harmonized objects in the Harmonization Postprocessing background job.
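The inheritance rule for empty cells (the "You enter nothing" option in step 3) can be sketched as a walk from the most specific hierarchy level up to the Default Values row. The dictionaries and attribute names below mirror the Company/Brand example from the text and are purely illustrative; the real matrix is maintained in the UI, not in code.

```python
# Sketch of the inheritance rule for empty derivation instruction cells:
# starting from the most specific hierarchy level, the first level that
# defines a value for the target attribute wins; otherwise the Default
# Values row applies. The attribute names mirror the Company/Brand example
# from the text and are purely illustrative.
def resolve(levels, target_attribute, default_values):
    # levels: list of instruction dicts, most specific level last
    for instruction in reversed(levels):
        if target_attribute in instruction:
            return instruction[target_attribute]
    return default_values.get(target_attribute)

# Level for Company = "TST Inc." defines Manufacturer explicitly;
# the lower level for Brand = "Crnchy" defines only Brand.
company_level = {"Manufacturer": "Taste Inc."}
brand_level = {"Brand": "Crunchy"}
defaults = {"Manufacturer": "Others", "Brand": "Others"}
```

Resolving Brand on the lowest level yields the explicit value, while Manufacturer is inherited from the higher Company level, and anything undefined falls back to the Default Values row.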
Example
The following list shows an example of a derivation instruction set. Each entry gives the source attribute values, the derived values for the target attributes Material Group [Explicit], Manufacturer [Reference Attribute or Explicit], and Brand [Explicit], and a comment; values not defined on an entry's own level are inherited from a higher level, as the comments indicate.

● Default Values
Material Group: Others | Manufacturer: ATTRIBUTE_1 | Brand: Others
Comment: For the target attributes Material Group and Brand, the explicit value “Others” is defined. You know that the value for attribute Manufacturer is always provided in field Attribute 1. This value shall be copied by default.

● Material Group: (not filled)
Material Group: Others | Manufacturer: ATTRIBUTE_1 | Brand: Others
Comment: For all objects where the attribute Material Group is not filled, the default values are inherited.

● Material Group: Chocolate
Material Group: Sweets | Manufacturer: ATTRIBUTE_1 | Brand: Others
Comment: The categorization of material groups is different, so you translate these into your own groups.

● Material Group: Chocolate, Brand: Choc1
Material Group: Sweets | Manufacturer: ATTRIBUTE_1 | Brand: Tasty Chocolate
Comment: The values for Brand are translated into the brand names you use.

● Material Group: Chocolate, Brand: Choc2
Material Group: Sweets | Manufacturer: ATTRIBUTE_1 | Brand: Creamy Chocolate

● Material Group: Ice Cream
Material Group: Sweets | Manufacturer: ATTRIBUTE_1 | Brand: Others
Comment: The categorization of material groups is different, so you translate these into your own groups.

● Material Group: Others
Material Group: Others | Manufacturer: [Empty Value] | Brand: Others
Comment: You know that for the Material Group “Others”, ATTRIBUTE_1 is not provided with a meaningful value. You set an empty value for this target attribute.
More Information
Defining Restrictions for Harmonized Attribute Values [page 226]
Selecting Attributes for Value Restrictions [page 227]
Scheduling Background Jobs for Data Harmonization [page 235]
5.11.2 Defining Derivation Instruction Sets for Source Attribute Values
Use
You set up the attribute value derivation for source objects by defining derivation instruction sets for the usage AVS. In the first step you define a header.
The header contains the following main parts:
● Information on whether or not the set is active
● Object type and context
● Hierarchical list of source attributes from which attribute values can be derived
● List of target attributes for which values are derived from source attribute values
Note
Attribute values for the following attributes cannot be derived:
● Attributes that are defined in Customizing as external key for the object type
● Attributes that are used to identify the object record, for example, Data Provider, Data Origin, Context
● Administrative data
Activities
1. Open the Manage Derivation Instruction Sets UI for your object type and select the correct usage from the dropdown.
You create a new header or select an existing one and navigate to the header definition.
2. Specify the identifier and description, and the context for which the derivation instruction set is valid.
You can define several derivation instruction sets for the same object type and context.
3. Select the source attributes from which the attribute values are derived and define their hierarchy.
You can use all object attributes and attributes that are stored as name/value pairs as source attributes.
You cannot change the source attribute hierarchy when there are already derivation instructions defined. You can only add new source attributes or delete the lowest hierarchy node. If you delete a node, all existing derivations for this source attribute will also be deleted.
4. If you want to group specific source attribute values, define groupings using various operators.
If a value fulfills the grouping criteria for multiple groupings, the value is sorted into the first group.
Example
You can define a grouping A* for attribute values starting with A and a second grouping B* for attribute values starting with B. The attribute values are then grouped under three nodes in the derivation instruction set: A*, B*, and one group for all other values.
5. Select the target attributes.
You can use all attributes that are available in the internal structure of the selected object type as target attributes, with regard to the following constraint:
○ A target attribute can only be used in one active derivation instruction set for source objects for the same object type and context. If you add a target attribute that is already selected in a different active set for source objects, you cannot activate the set.
6. Define for each target attribute which derivation types are allowed:
○ Explicit value
The value for the target attribute can be entered explicitly in the derivation instruction. For example, from the value “TST Inc.” of source attribute Company, you enter the explicit value “Taste Inc.” for the target attribute Manufacturer.
To allow explicit values, you select the related checkbox.
○ Copied value
The target attribute value is copied from an explicitly specified reference attribute. For example, if the source attribute Company has the value “TST Inc.”, the attribute value for target attribute Product Category is always copied from the source attribute Segment, regardless of the value it contains.
To allow copied values, you add the reference attributes from which values can be copied to the target attribute. You can use only reference attributes that have the same type as the target attribute.
Note
In the Target attribute table, the Compatibility column contains an icon to indicate the compatibility of the assigned reference attributes with the associated target attributes, as follows:
○ A green icon indicates that all the reference attributes are compatible.
○ A yellow icon indicates that there is a potential incompatibility with the reference attributes.
7. Save the header.
8. Choose Open Derivation Instructions and define the derivation instructions.
9. Activate the derivation instruction set header. The system recalculates all target attributes of source objects accordingly.
Note
The system recalculates the attribute values of source objects and adjusts the source objects in the Harmonization Postprocessing background job.
More Information
Defining Derivation Instructions for Source Attribute Values [page 253]
Scheduling Background Jobs for Data Harmonization [page 235]
5.11.2.1 Defining Derivation Instructions for Source Attribute Values
Use
1. Search for a derivation instruction set for source objects and choose Open Derivation Instructions.
The system determines which source attribute values for the selected source attributes are available and displays a two-dimensional matrix:
○ The selected target attributes define the columns.
○ The number of rows is calculated as the sum of the number of source attribute values on each hierarchy level (1 + x1 + x2 + ... + xn, where 1 stands for the Default Values row, xi for the number of source attribute values on hierarchy level i, and n for the maximum number of hierarchy levels). If a grouping is defined for a source attribute, the rows are grouped accordingly.
While the number of columns is always the same (as long as the selection of target attributes is unchanged), the number of rows may grow when new source attribute values are determined in a new data delivery.
2. Define the default values that are inherited by all derivation instructions for which no other value is defined manually.
3. Define the individual instructions for the available source attribute values.
You have the following options:
○ You enter nothing.
If there is an explicit or copied value defined on a higher hierarchy level (including the default value), the attribute value is inherited from the higher level. If there is no derivation instruction defined for this target attribute on a higher hierarchy level, the default value is inherited.
Example
For the combination of the attribute values “TST Inc.” of source attribute Company and “Crnchy” for source attribute Brand, the derivation instruction inherits the value “Taste Inc.” for the target attribute Manufacturer from the derivation instruction on the higher level. In addition, it contains an explicit value “Crunchy” for target attribute Brand.
○ You enter an explicit value.
Note
You can enter an empty value by choosing Set Empty Value in the context menu.
○ You select a reference attribute from which the target attribute value is copied.
Derivation instructions that are defined manually are highlighted using a user-specific highlight color. You can filter the derivation instructions that are already defined by source attributes, or you can show only instructions that have values that are defined manually.
Note
For source attribute values that are no longer contained in current deliveries, the system stores the unused derivation instructions, in case they are needed again in the future. You can choose Unused Instructions to display derivation instructions that have been defined for source data that is no longer available.
4. Save. If the derivation instruction set header is active, the system recalculates all target attributes of source objects accordingly.
Note
The system recalculates the attribute values of source objects in the Harmonization Postprocessing background job.
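The row-count formula for the derivation instruction matrix described in step 1 can be checked with a one-line sketch; the sample level counts are illustrative.

```python
# Sketch of the row-count formula for the derivation instruction matrix:
# rows = 1 + x1 + x2 + ... + xn, where the leading 1 is the Default Values
# row and xi is the number of distinct source attribute values on
# hierarchy level i. The sample counts below are illustrative.
def matrix_rows(values_per_level):
    return 1 + sum(values_per_level)
```

For example, two hierarchy levels with 3 and 5 distinct source attribute values produce a matrix with 9 rows; with no levels at all, only the Default Values row remains.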
More Information
Scheduling Background Jobs for Data Harmonization [page 235]
5.11.3 Monitoring Work Items
Use
You can use worklists to display all work items that exist in the system for specified object types and work item types. For each work item type, some simple actions are available as displayed in the following table.
Note
You can also use the Event Monitor app to get an overview of existing work items.
Work Item Type Possible Actions
01 Map Manually Retry the mapping
Delete the work item
02 Confirm Mapping Confirm Mapping
Reject Mapping
03 Revise Object Delete the work item
04 Copy Attribute Values Resolve the work item
When you resolve a work item that was created for approval of attribute changes, the system copies the attribute values.
Delete the work item
When you delete the work item, you reject the value; the current value of the harmonized object's attribute is not changed. To prevent further work items of this type from being generated for this attribute of the referenced harmonized object, the system sets the attribute value of the harmonized object to manually protected. The work item is deleted.
05 Perform Pending Mapping Retry to perform the mapping
Reject the mapping
06 Perform Pending Update Retry the update.
07 Release for Global Reporting Release for global reporting
Set status to Revoked
08 Recalculate Status for Global Reporting Retry the update
More Information
Work Items [page 212]
Resolving or Disregarding Work Items in Manual Mapping [page 256]
5.11.4 Manual Mapping
As a harmonization user you want to edit the mapping of source objects to harmonized objects manually in one of the following situations:
● The system could not create a mapping during automatic harmonization.● You want to change an existing mapping.● You want to review or confirm a mapping that was created automatically.
NoteYou can map multiple source objects to the same harmonized object, but you can only map one harmonized object to a source object.
On the user interface you see the following assignment blocks:
● Work area
Contains the following objects:
○ All harmonized objects to which you have assigned source objects in the current session
○ All harmonized objects you have added directly to the work area
You can see all source objects that are currently assigned to a harmonized object by expanding the node.
Any changes you make in the work area are saved as soon as you save your work. If you remove objects from the work area, the changes are still temporarily stored in the buffer, and are also saved.
● Work items
Contains all work items for the object type that can be resolved in this application. You can change the selection by using the search. If a work item is completed, it is also automatically removed from the list.
● Proposal lists
Contain the mapping proposals that are determined by the system.
● Search result lists for harmonized and source objects
Contain harmonized objects or source objects for which you searched manually.
Note
You can change the layout of the UI according to your needs. You can rearrange or hide assignment blocks using the standard personalization options. For example, if you do not want to use system proposals, you can hide the corresponding assignment blocks.
You can perform the following actions:
● Resolve or disregard work items
● Determine mapping proposals
● Compare selected objects
● Assign source objects to an existing harmonized object
● Create a new harmonized object and assign source objects
● Change assignments
● Delete harmonized objects
If a harmonized object has no source objects assigned, you can delete it.
5.11.4.1 Resolving or Disregarding Work Items in Manual Mapping
Use
You can let the system resolve the work items in your selection. Depending on the work item type, the system performs the action that is needed to resolve the problem that caused the creation of the work item. If the action succeeds, the work item is deleted.
You can also disregard a work item. The system does not resolve the work item; the problem that caused the work item can persist or occur again. The work item will then be recreated.
The following table shows the system actions that are performed when you choose Resolve or Disregard in the Manual Mapping user interface:
Map Manually
Resolve: The system retries to map the source object.
Disregard: The system deletes the work item.
Confirm Mapping
Resolve: You confirm the mapping that needs approval; the mapping is stored on the database.
Disregard: You reject the mapping; the system removes the assignment of the source object and deletes the work item. A new work item for manual mapping is created.
Perform Pending Mapping
Resolve: The system retries to save the mapping on the database.
Copy Attribute Values
Resolve: If the work item has been created for approval of attribute changes, the system copies the attribute values.
Disregard: You reject the value. The current value of the harmonized object's attribute is not changed. To prevent further work items of this type from being generated for this attribute of the referenced harmonized object, the system sets the attribute value of the harmonized object to manually protected. The work item is deleted.
More Information
Work Items [page 212]
Monitoring Work Items [page 254]
5.11.4.2 Determining Mapping Proposals
You select an object and choose Update Proposals.
If you have selected a source object or a work item, the system generates two lists with mapping proposals for the related object. The system searches for proposals of similar harmonized objects to which the source object could be mapped, and proposals of similar source objects that could be mapped to the same harmonized object.
If you have selected a harmonized object, the system only proposes similar source objects that could also be mapped to the harmonized object.
All proposals are rated by a score; a score of 100 means that the object data matches exactly. To get more proposals, you can lower the minimum score in the assignment block or specify a lower default score in the user-specific parameters.
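The score-based selection described above can be sketched as follows. This is an illustrative Python sketch, not SAP code; the proposal structure and the function name are assumptions made for the example.

```python
# Illustrative sketch: filtering mapping proposals by a minimum score,
# where a score of 100 means the object data matches exactly.
def filter_proposals(proposals, min_score=80):
    """Return proposals whose score meets the minimum, best match first."""
    kept = [p for p in proposals if p["score"] >= min_score]
    return sorted(kept, key=lambda p: p["score"], reverse=True)

proposals = [
    {"object": "HP-001", "score": 100},  # exact match
    {"object": "HP-002", "score": 85},
    {"object": "HP-003", "score": 60},
]

# Lowering min_score returns more (but weaker) proposals, as described above.
best = filter_proposals(proposals)[0]
```

Lowering the minimum score trades precision for recall: more candidate mappings appear, but with weaker matches at the bottom of the list.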
5.11.4.3 Comparing Selected Objects
You can compare multiple objects that you have selected in the various screen areas.
Using the comparison, you can evaluate to what extent the selected objects match each other before mapping.
The system opens a dialog box containing all selected objects and highlights identical and differing attribute values in different colors.
You can select which objects you want to keep selected when you close the dialog box.
5.11.4.4 Assigning Source Objects to a Harmonized Object
To create a new mapping of source objects to a harmonized object you have the following options:
● Assign the source object directly to a harmonized object that you have selected in any of the screen areas
● Create a new harmonized object to which the source object is assigned
● Search for a harmonized object and assign the source object
You can also change existing assignments or undo the assignment.
Note
You can also use drag and drop to create or change an assignment.
5.11.4.5 Example: Manual Mapping Using Proposals
Use
You want to map all source products that are not mapped to a harmonized product.
You use a layout with the following assignment blocks:
● Work Items
● Proposals: Harmonized Products
● Proposals: Source Products
● Work Area
Process
1. In the assignment block Work Items you create a worklist for source products. For example, you select only work items that refer to data from a specified data provider.
2. You select a work item and generate a list of mapping proposals by choosing Update Proposals.
3. You review the proposals in the assignment block Proposals: Harmonized Products. The harmonized products in this list are all similar to the selected source product for which you generated the proposal list. The score indicates to what extent the two products match.
4. You select the best proposal.
5. In the Work Items assignment block, you choose Assign to Harmonized Product in Proposals for the selected work item. The system assigns the source product that is referred to by the work item to the selected harmonized product.
6. The system deletes the work item. It is removed from the worklist.
7. The new mapping is added to the work area.
8. You select the harmonized product in the work area and update the proposals of source products (assignment block Proposals: Source Products).
9. You select all source products that can be added to the mapping and choose Assign to Harmonized Product in Work Area.
10. You save.
5.11.4.6 Example: Manual Mapping Using Manual Search
Use
You want to map all source products that are not mapped to a harmonized product.
You use a layout with the following assignment blocks:
● Work Items
● Search Results: Harmonized Products
● Search Results: Source Products
● Work Area
Process
1. In the assignment block Work Items you create a worklist for source products. For example, you select only work items that refer to data from a specified data provider.
2. In the assignment block Search Results: Harmonized Products, you search for similar harmonized products.
3. You review the result list and select the harmonized product that fits best.
4. In the Work Items assignment block, you choose Assign to Harmonized Product in Search Result for the selected work item. The system assigns the source product that is referred to by the work item to the selected harmonized product.
5. The system deletes the work item. It is removed from the worklist.
6. The new mapping is added to the work area.
7. In the assignment block Search Results: Source Products, you search for similar source products.
8. You select all source products that can be added to the mapping and choose Assign to Harmonized Product in Work Area.
9. You save.
5.11.4.7 Example: Manual Mapping Without Using Work Items
Use
You want to map as many similar source products to a harmonized product as possible.
You use a layout with the following assignment blocks:
● Search Results: Harmonized Products
● Search Results: Source Products or Proposals: Source Products
● Work Area
Process
1. In the assignment block Search Results: Harmonized Products you search for a harmonized product and add it to the work area.
2. In the assignment block Search Results: Source Products, you search for similar source products; or in the Proposals: Source Products assignment block, you update the proposals for the selected harmonized product.
3. You select all source products that can be added to the mapping and choose Assign to Harmonized Product in Work Area.
4. You save.
5.11.5 Editing Harmonized Objects
If you want to change harmonized objects, you can either open a single object for editing or use the mass change.
You have the following options:
● Change attribute values, except technical fields
● Revoke manually changed or manually protected attribute values
● Undo assignments of source objects to the harmonized object
● Delete the harmonized object, if no source objects are assigned (only in single edit mode or in the Manual Mapping user interface)
● Review the data origin of the values for each attribute
5.11.6 Editing Source Objects
Source objects are uploaded to data harmonization and cannot be edited directly in the manual mapping or detail source object UI. However, you can define derivation instruction sets for source objects to derive values for certain attributes.
Name/value pairs of the source object cannot be edited or derived.
You can edit the conversion from the base unit of measure of the source product to the base unit of the harmonized product only for the object type Product with the associated object type Unit Conversion.
6 Integration with Machine Learning
Source product attribute values can be derived or predicted based on other attributes using machine learning.
Prerequisite
You have a license for SAP Leonardo, MEH Environment, and have received the end point URLs for Authentication, Data Manager, Model Manager, and Interface.
6.1 Configuration
Setting up the environment in Machine Learning.
Establish HTTP connections for the given end point URLs with the following path prefixes:
Connections Path Prefix
Authentication /oauth/token?grant_type=client_credentials
Data Manager /v1
Model Manager /v1
Interface /v1/inference
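The full connection URLs are composed of the end point URLs you received and the path prefixes in the table above. The following sketch illustrates this composition; the base URLs shown are placeholders, not real SAP endpoints.

```python
# Hypothetical sketch: joining a received end point URL with its
# documented path prefix. The base URLs used below are examples only.
PATH_PREFIXES = {
    "authentication": "/oauth/token?grant_type=client_credentials",
    "data_manager": "/v1",
    "model_manager": "/v1",
    "interface": "/v1/inference",
}

def connection_url(base_url, connection):
    """Append the documented path prefix to an end point URL."""
    return base_url.rstrip("/") + PATH_PREFIXES[connection]

auth_url = connection_url("https://auth.example.com/", "authentication")
inference_url = connection_url("https://api.example.com", "interface")
```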
6.2 Set up Model Definition
Use transaction /DDF/ML_SETUP to make the following configuration settings:
● Model definition and connection details (model name, connection, and confidence)
● Model structure details and class type (inbound and outbound attributes are combined to form a model structure; the class type Features represents inbound attributes, and Class represents outbound attributes)
● Classification data (source objects to be identified and sent to Machine Learning)
6.3 Build Machine Learning Environment
After maintaining the configuration in the Set up Machine Learning program (transaction /DDF/ML_SETUP), use the program Build Machine Learning Model (transaction /DDF/ML_BUILD) to set up the environment in Machine Learning.
1. Enter the model name and choose Build Model Structure to create the dataset schema.
2. Send training data to create the dataset.
3. The version status shows the status of the dataset. If the dataset status is Ready, you train the dataset to create models. Activate the model to use it in Machine Learning.
4. Deactivate the model if you do not want to use the Machine Learning services.
Note
At any point in time, only one model can be active.
During data uploads, if a new source object meets the conditions maintained in the classification dataset, a work item is created and the data is sent to Machine Learning to predict the attribute values. If the probability of each predicted value is greater than the threshold value, the system automatically updates the source record and adds it to the PPE to propagate the values to the harmonized record. If any probability value is less than the threshold, the system waits for user action, which can be taken in the Machine Learning UI.
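The threshold decision described above can be sketched as follows. This is an illustrative Python analogy, not SAP code; the data structure and function name are assumptions made for the example.

```python
# Illustrative sketch of the decision above: predicted attribute values
# are applied automatically only if every prediction's probability
# exceeds the confidence threshold; otherwise the record waits for a
# user decision in the Machine Learning UI.
def apply_predictions(predictions, threshold):
    """predictions: mapping of attribute -> (value, probability)."""
    if all(prob > threshold for _, prob in predictions.values()):
        # Auto-update the source record; the values are then
        # propagated to the harmonized record.
        return {attr: value for attr, (value, _) in predictions.items()}
    # At least one prediction is below the threshold: leave the
    # work item open for user action.
    return None

auto = apply_predictions({"brand": ("ACME", 0.97), "category": ("Snacks", 0.91)}, 0.8)
manual = apply_predictions({"brand": ("ACME", 0.97), "category": ("Snacks", 0.55)}, 0.8)
```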
7 Data Enrichment
Use
Data enrichment enables you to enhance the uploaded external data from retailers with additional information for reporting. The analysis of the reports helps to give you insights into past and current out-of-shelf and out-of-stock situations at retailer stores. Based on the reports you can resolve the current issues and analyze the trends and problem areas that you should focus on. Reducing the number of out-of-shelf and out-of-stock situations results in a higher on-shelf availability which in turn helps to improve sales figures, consumer satisfaction, and brand loyalty.
Data enrichment in SAP Demand Signal Management, version for SAP BW/4HANA provides two specific enrichment sequences for the following purposes:
● Sales data enrichment
The main focus is on determining out-of-shelf situations for zero sales using aggregated sales data.
● Stock data enrichment
The main focus is on aggregated sales and stock snapshot data to determine out-of-stock situations based on the days of supply.
The rules for data enrichment are contained in an open and extensible framework enabling you to implement your own rules. Data enrichment is orchestrated and integrated into the data flow of SAP Demand Signal Management, version for SAP BW/4HANA which is based on SAP Business Warehouse using process chains and the process flow control.
In the data enrichment process new data records are created for the DataStore objects in the Data Propagation Layer. The new records are separated from the retailer data using a reserved value for the Type of Sales and for the Stock Type. The following values are available:
● 0001: Result of Sales Data Enrichment
● 00: Result of Stock Data Enrichment
Prerequisites
● You have made the settings for Data Enrichment in Customizing under Cross-Application Components → Demand Data Foundation → Data Enrichment.
● For sales data enrichment, you have uploaded sales data to the DataStore object (DSO) Aggregated Sales Propagation (/DDF/DS11). For stock data enrichment, you have uploaded stock snapshot data to the DSO Stock Snapshot Propagation (/DDF/DS12).
● To get meaningful data enrichment results, we strongly recommend that before you start enriching your data, you first execute an initial upload of sales and stock history data excluding the process steps for data enrichment. This ensures that the data enrichment algorithms run on a solid data foundation that enables proper analysis and meaningful KPIs. A zero sales record, for example, can only result in a significant out-of-shelf incident if the sales history for the respective product-location combination is taken into consideration. A solid data foundation requires a sales and stock history of 90 days or more.
This applies to all KPIs calculated during data enrichment, for example outlier detection, zero sales generation, average calculation. We therefore recommend that you include the process step for data enrichment at a later phase.
● Data enrichment is only supported if the sales and/or stock data of the data delivery contains all available product-location combinations for the provided time intervals.
Data Flow
The system calls the data enrichment process from within the corresponding DataSource. The result of the enrichment is written to the Persistent Staging Area and flows back to the corresponding DSO.
Example: Data Flow for Sales Data Enrichment
More Information
Process Flow Control [page 174]
Handling Transaction Data from Retailers [page 29]
7.1 Sales Data Enrichment
Use
The purpose of sales data enrichment is to provide you with insights into current out-of-shelf situations, lost sales due to out-of-shelf situations, or durations of out-of-shelf situations. In addition, it enables you to get an overview of out-of-shelf numbers and out-of-shelf rates from the past.
Prerequisites
You have defined periods with sales deviations and assigned them to locations. For more information, see the Customizing for Data Enrichment under Cross-Application Components → Demand Data Foundation.
You have defined a location calendar and assigned it to the locations. This enables the generation of zero sales records. For more information, see the Customizing for Data Model under Cross-Application Components → Demand Data Foundation.
Features
You can define the following sequence of functions for sales data enrichment:
● Initialize Worklist
Selects the data you want to enrich from the Aggregated Sales Propagation DataStore object (DSO) and generates a worklist.
● Pre-Cleansing (optional)
Marks a data record in the worklist as a non-outlier for the given combination of location and date if the date is within a period with sales deviations and the period of sales deviations is assigned to the location. Records that are marked as non-outliers are ignored in the subsequent function (Outlier Detection).
● Outlier Detection (optional)
Marks a data record in the worklist as an outlier. This function is based on the interquartile range test that detects significant deviations depending on the threshold value for significance defined in Customizing.
Note
Outlier detection should be used only if a sufficient sales history is available.
● Generate Zero Sales
Generates zero sales records in the worklist if an expected sales record is missing and the location was open for business. If a certain number of days with zero sales is reached (defined in Customizing), the product is most likely not sold anymore and no new zero sales records are created.
● Calculate Long-Term Average
Determines the long-term average for every product-location combination by calculating the arithmetic mean. The calculation is based on the number of days defined as a threshold in Customizing.
● Calculate Out-of-Shelf
Determines whether zero sales records should be marked as out-of-shelf situations. Two different functions can be used: simple ratios and normal distribution. Both functions are based on the long-term average that was calculated in the previous step. The result of the function is compared to the corresponding threshold value defined in Customizing to decide if an out-of-shelf situation is given. The threshold value controls how many out-of-shelf signals are generated.
● Calculate Average Lost Sales
Calculates the average quantity, gross value, and net value of the identified sales records as average lost sales for records that are marked as out-of-shelf situations.
● Get Result
Posts the enriched data records back so that they can be transferred to the DSO Aggregated Sales Propagation.
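The core calculations in this sequence can be sketched as follows. This is an illustrative Python sketch, not the shipped SAP HANA stored procedures; in particular, the simple-ratio reading of the out-of-shelf check is an assumption made for the example, and the data and threshold values are invented.

```python
# Illustrative sketch of three of the functions above: an
# interquartile-range outlier test, the long-term arithmetic mean,
# and a simple-ratio out-of-shelf check based on that average.
from statistics import mean, quantiles

def is_outlier(history, value, significance=1.5):
    """IQR test: flag values far outside the interquartile range.
    `significance` plays the role of the Customizing threshold."""
    q1, _, q3 = quantiles(history, n=4)
    iqr = q3 - q1
    return value < q1 - significance * iqr or value > q3 + significance * iqr

def long_term_average(history, days):
    """Arithmetic mean over the most recent `days` sales records."""
    return mean(history[-days:])

def is_out_of_shelf(avg_daily_sales, threshold):
    """Assumed simple-ratio reading: a zero sales record counts as an
    out-of-shelf signal when the long-term average daily sales are
    above the threshold (i.e., sales were clearly expected)."""
    return avg_daily_sales >= threshold

# Invented daily sales history for one product-location combination.
history = [10, 12, 11, 9, 10, 13, 11, 12]
```

A lower threshold generates more out-of-shelf signals, which matches the statement above that the threshold value controls how many signals are generated.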
7.1.1 Periods with Sales Deviations
Depending on your business, you expect high sales around a public holiday or a sports event. Sales figures could also drop, for example, during an unexpected period of bad weather. In both cases, you can identify these periods as periods with sales deviations, which are relevant for data enrichment. The data enrichment process is used to ensure that the data records from these periods are not interpreted as outliers.
You can define periods with sales deviations and group them in a period group using transaction /DDF/PER. Based on this definition, the system generates days.
You can use transaction /DDF/PERP to display the generated periods.
In transaction /DDF/PERDO, you can assign the period group to a combination of data origin, country, and region and create a default assignment for several locations.
During data upload with the process chain Stage Locations with PFC (/DDF/LOC_STAGE), the system evaluates the attributes data origin, country, and region of each location and assigns the locations to the period groups you defined. You can display the assignments using transaction /DDF/PERLOC.
During the execution of the function Pre-Cleansing in sales data enrichment, the system evaluates the days of periods with sales deviation assigned to a location and indicates that the corresponding data records should be ignored in the function Outlier Detection.
Note
For more information and direct access to all transactions related to the periods with sales deviations, see the Customizing for Data Enrichment under Cross-Application Components → Demand Data Foundation.
7.1.2 Location Calendar
Retailers usually do not send a data record for days when a product was not sold. To get correct values for calculations like zero sales calculations, the missing data records must be added. To know whether a product was not sold because the location was closed for a particular period of time, you have to define the periods
when locations are open for business. If the product was not sold even though the location was open, the system creates a zero sales record during sales data enrichment.
To determine the hours of operation for a location, you define a location calendar using transaction /DDF/CAL. Based on this calendar, the system generates days and weeks. These can be displayed using transaction /DDF/CALP.
You can assign the location calendar to a combination of data origin, country, and region and make this the default assignment for several locations using transaction /DDF/CALDO.
During the upload of locations in the process chain Stage Locations with PFC (/DDF/LOC_STAGE) the system evaluates the attributes data origin, country, and region for each location and assigns the locations to the defined location calendar. These assignments can be displayed in transaction /DDF/CALLOC.
Note
For more information and direct access to all transactions related to the location calendar, see the Customizing for Data Model under Cross-Application Components → Demand Data Foundation.
7.2 Stock Data Enrichment
Use
The purpose of stock data enrichment is to give you insights into current out-of-stock situations at a retailer. It also provides numbers and rates of out-of-stock situations from the past.
Prerequisites
You have executed sales data enrichment before starting stock data enrichment. The results from sales data enrichment are taken into account for the calculation of out-of-stock situations (average lost sales and out-of-shelf calculations).
Features
You can define the following sequence of functions for stock data enrichment:
1. Initialize Worklist
Selects the data to be enriched from the Stock Snapshot Propagation DSO and generates a worklist.
2. Data Completion
Completes missing stock values by stock type and marks the records as generated records to avoid incorrect stock values in reporting. Data completion consists of the following steps:
○ If some products in a single location have delivered values while values for others are missing, the system sets the stock quantities for the missing values to zero.
○ If all stock values for a location are missing in a data delivery, it is assumed that the stock did not change. The stock quantities from the previous data delivery for this location are transferred.
○ If no stock data is delivered at all but enriched sales data is available, the stock quantities of the previous delivery are transferred.
If a future data delivery provides the missing stock quantities, the generated records are overwritten and the marker is removed.
3. Calculate Out-of-Stock Situation
Independent of the stock type, the system calculates the days of supply as the delivered stock quantity divided by the average lost sales quantity that was calculated during sales data enrichment. The result is compared to the threshold value defined in Customizing to decide whether an out-of-stock situation exists.
4. Get Result
Posts the enriched data records back so that they can be uploaded to the Stock Snapshot Propagation DSO.
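The days-of-supply calculation in step 3 can be sketched as follows. This is an illustrative Python sketch under the formula stated above (delivered stock quantity divided by average lost sales quantity); the function names, the handling of zero demand, and the threshold values are assumptions made for the example.

```python
# Illustrative sketch of the out-of-stock check: days of supply =
# delivered stock quantity / average lost sales quantity (from sales
# data enrichment); an out-of-stock situation is flagged when the
# result falls below the Customizing threshold.
def days_of_supply(stock_qty, avg_lost_sales_qty):
    if avg_lost_sales_qty <= 0:
        # Assumed behavior: with no measurable demand, the stock
        # effectively never runs out.
        return float("inf")
    return stock_qty / avg_lost_sales_qty

def is_out_of_stock(stock_qty, avg_lost_sales_qty, threshold_days):
    return days_of_supply(stock_qty, avg_lost_sales_qty) < threshold_days

# 6 units on hand with 3 units of average lost sales per day gives
# 2 days of supply; against a 5-day threshold this is out of stock.
result = is_out_of_stock(6, 3, 5)
```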
More Information
Sales Data Enrichment [page 266]
7.3 Overview of Key Figures in Data Enrichment
The following table provides an overview of the functions and key figures calculated during data enrichment:
Purpose Function Key Figure Description
Sales Data Enrichment Outlier Detection /DDF/OUTLNUM Number of Outlier Signals
Sales Data Enrichment Outlier Detection /DDF/OUTLIER Outliers
Sales Data Enrichment Out-Of-Shelf Calculation /DDF/OOSHNUM Number of Out-of-Shelf Signals
Sales Data Enrichment Out-Of-Shelf Calculation /DDF/OOSH Out-of-Shelf
Sales Data Enrichment Average Lost Sale Calculation /DDF/AVGLOSSQ Average Lost Sales Quantity of Retailer in SUoM
Sales Data Enrichment Average Lost Sale Calculation /DDF/AVGLOSST Average Lost Sales Gross Value of Retailer in LC
Sales Data Enrichment Average Lost Sale Calculation /DDF/AVGLOSSV Average Lost Sales Net Value of Retailer in LC
Sales Data Enrichment Out-Of-Shelf Calculation /DDF/OOSHDUR Duration of Out-of-Shelf Situation
Stock Data Enrichment Data Completion /DDF/ENRICHED Result of Stock Data Completion
Stock Data Enrichment Data Completion /DDF/ENRUM Number of Records from Stock Data Completion
Stock Data Enrichment Calculate OoSt Situations /DDF/OOST Out-of-Stock
Stock Data Enrichment Calculate OoSt Situations /DDF/OOSTNUM Number of Out-of-Stock Signals
Stock Data Enrichment Calculate OoSt Situations /DDF/OOSTDUR Duration of Out-of-Stock Situation
7.4 Executing Data Enrichment
Use
You have two options for executing data enrichment:
● Black Box Approach
Data enrichment is called as a sequence that contains only one step. The function that represents this step executes all tasks required for enrichment. In standard sales enrichment, the function ALL_STEPS calls a bundled stored procedure from the ABAP layer that performs the calculation of all necessary key figures.
● White Box Approach
Data enrichment is called as a sequence that contains several enrichment steps. The functions assigned to the corresponding steps call individual stored procedures that calculate one or more key figures that are required for the corresponding enrichment purpose (sales data enrichment or stock data enrichment). The white box approach allows you to replace specific individual calculations of key figures in the standard sequence with customer-specific algorithms.
Steps
Data enrichment is performed in enrichment steps. An enrichment step carries out one or more specific tasks of data enrichment, for example, calculating a specific key figure or creating records that are missing in the data delivery, such as zero sales records.
Sequences
The steps are organized in sequences. SAP delivers predefined sequences for sales data enrichment and stock data enrichment. The system triggers one step after the other according to the sequence you have activated in Customizing.
Example
Possible sequence for stock data enrichment:
1. Initialize Worklist
2. Complete Data
3. Calculate Out-of-Stock Situations
4. Get Result
Functions
Each enrichment step is represented by a function of a specific function type. A function type is, for example, the calculation of out-of-shelf signals during sales data enrichment. To calculate out-of-shelf signals, you can define several functions that use different algorithms. In standard Customizing you have a function that uses a simple ratio method (OOSH_SIMPLE_RATIO) and a function that uses a more sophisticated normal distribution approach (OOSH_NORM). Each function representing an enrichment step has an ABAP class assigned to it that contains the coding to trigger the execution of the corresponding tasks. All these ABAP classes implement the interface /DDF/IF_FES_ENRICHMENT. They are subclasses of class /DDF/CL_ENR. The standard classes assigned to the standard functions call stored procedures in the SAP HANA database.
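The design described above, where each step is a class with an execute method and the active sequence triggers one step after the other, can be sketched in Python as an analogy. This is not ABAP and not SAP code; the class and method names are illustrative stand-ins for subclasses of /DDF/CL_ENR implementing /DDF/IF_FES_ENRICHMENT.

```python
# Python analogy of the step/sequence design: each enrichment step
# implements a common interface, and a sequence runner triggers the
# steps one after the other on a shared worklist.
from abc import ABC, abstractmethod

class EnrichmentStep(ABC):
    @abstractmethod
    def execute(self, worklist):
        """Perform this step's task and return the updated worklist."""

class InitializeWorklist(EnrichmentStep):
    def execute(self, worklist):
        # Stand-in for selecting the records to enrich from a DSO.
        return list(worklist)

class MarkZeroSales(EnrichmentStep):
    def execute(self, worklist):
        # Stand-in for a key-figure calculation on each record.
        return [dict(rec, zero_sale=(rec["qty"] == 0)) for rec in worklist]

def run_sequence(steps, records):
    for step in steps:
        records = step.execute(records)
    return records

result = run_sequence([InitializeWorklist(), MarkZeroSales()],
                      [{"qty": 0}, {"qty": 5}])
```

Swapping one step for a customer-specific one then amounts to registering a different class at that position in the sequence, which mirrors the white box approach.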
More Information
For a detailed description of the steps, sequences, and functions, see the documentation Define Enrichment Sequences in Customizing for Data Enrichment under Cross-Application Components → Demand Data Foundation.
Data Enrichment [page 264]
Enhancing the Data Enrichment Process [page 271]
7.5 Enhancing the Data Enrichment Process
Use
If the enrichments executed by the sequences in standard Customizing do not fit your specific business needs, you have the following options:
● Create a new enrichment sequence with customer-specific logic
● Replace a single step within a sequence of individual steps with a customer-specific function, or add a new individual step to calculate a customer-specific new key figure (requires changing the data model)

Recommendation
● Do not create new purposes. The purposes in standard Customizing should be sufficient.
● Do not change the function types, functions, or sequences in standard Customizing.
● If you need to adjust the enrichment, create new sequences. If necessary, create new functions and function types.
Creating a New Enrichment Sequence
If the overall logic of the enrichment meets your requirements, but you want to reduce the number of calculated key figures or calculate one or more key figures in a slightly different way, you can create a new enrichment sequence. To reduce the number of key figures, add the enrichment steps you need to your new sequence and skip the ones you do not need. Activate your new sequence in Customizing.
Replacing a Single Step
If you want to calculate one or more key figures in a slightly different way, check how the standard logic works:
● If the customer-specific algorithm you need works with the same input/output, do the following:
1. Add all steps of the standard enrichment sequence that meet your business requirements to your sequence.
2. Create a stored procedure for the key figure you want to calculate differently and make the system call your procedure instead of the standard one.
3. Implement the Business Add-In BAdI: Stored Procedures in Data Enrichment and return the name of your stored procedure.
● If the customer-specific algorithm you need requires a different input, do the following:
1. Add all of the steps in the standard enrichment sequence that meet your business requirements to your own sequence.
2. Create a new function for an existing function type.
3. Create an ABAP class that implements the interface /DDF/IF_FES_ENRICHMENT. Let the class inherit from /DDF/CL_ENR if the helper methods defined there seem beneficial to you. However, you can also call the helper methods without inheriting from this class.
4. Assign the name of your class to your new function in the Customizing activity Define Functions under Function Class Name.
5. Create a customer-specific stored procedure that implements the algorithm you need.
6. In the EXECUTE method of your ABAP class, enter the code to call this stored procedure.
7. Add your new function to your enrichment sequence at the place where it fits.
The wrapper class of the sales data enrichment calls the EXECUTE method of your class at the place in the sequence that you have defined in Customizing.
More Information
For more information about enhancements to data enrichment, see the document Enhancing the Data Enrichment Process on the SAP Community Network (SCN) at http://scn.sap.com/community/demand-signal-management .
7.6 Configuring and Administrating Data Enrichment
As an administrator for data enrichment you have the following tasks:
● Set up data enrichment in Customizing. For more information, see the Customizing documentation for Data Enrichment under Cross-Application Components → Demand Data Foundation → Data Enrichment.
● Configure data enrichment in the SAP Easy Access Menu.
These configuration steps are required if you use the pre-cleansing function of sales data enrichment for POS data.
Task: Set up data enrichment in Customizing
When? Implementation phase
Where? Customizing under Cross-Application Components > Demand Data Foundation > Data Enrichment

Task: Define period groups
When? Before you upload locations of a new retailer
Where? On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Enrichment > Periods of Sales Deviations > Define Period Groups

Task: Define periods with sales deviations
When? Before you upload locations of a new retailer
Where? On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Enrichment > Periods of Sales Deviations > Define Periods with Sales Deviations

Task: Define default assignment of periods with sales deviations
When? Before you upload locations of a new retailer
Where? On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Enrichment > Periods of Sales Deviations > Define Default Assignment of Periods of Sales Deviation

Task: Assign locations to period groups
When? On demand, to check the assignments for a data origin and to create or change assignments if necessary
Where? On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Enrichment > Periods of Sales Deviations > Assign Locations to Period Groups
7.6.1 Defining Periods with Sales Deviations
Use
Special events throughout the year - like public holidays, festivals, or sports events - can influence consumer sales data. However, it is not only predictable, planned events like these that have an impact on consumer behavior. Unforeseeable incidents, such as unexpected periods of extreme weather, can also lead to periods with sales deviations. These periods with sales deviations vary across companies, regions, and countries and therefore sometimes affect the sales data of only certain locations.
To mark days or periods with known deviations in your sales data, you can define periods with sales deviations and assign these periods to locations with or without reference to a data origin, country, or region. Using those periods helps the consuming applications to produce more accurate results in sales data analysis.
The periods with sales deviations are grouped in a period group. A period group can be assigned to data origins, countries, and regions. The system automatically assigns locations to period groups during data upload. Periods with sales deviations and the related assignments are treated like master data.
Example
During data enrichment of aggregated sales data from a retailer, you would like to run an outlier analysis to mark unexpectedly high or low sales figures so that this data can be excluded from certain calculations, such as the determination of average sales. However, based on previous business experience, you expect a higher sales volume for certain periods, for example a number of days before a public holiday. The data records from this period reflect the real sales volume and should not be excluded from analysis by an outlier detection algorithm.
Prerequisites
For all activities related to periods with sales deviations you need authorization for the authorization object DDF_ADMIN (Administration Demand Data Foundation).
Activities
Setting up Periods with Sales Deviations
1. In the view Define Period Groups, define a period group with one or several periods with sales deviations.
2. In the transaction Define Periods with Sales Deviations, verify the generated days of the period group and regenerate or add periods, if necessary.
Assigning Locations to Period Groups
1. In the view Define Default Assignments of Periods of Sales Deviations, assign your period group.
2. Load locations into the system, for example into InfoObject /DDF/LOCATION.
3. In the program Assign Locations to Period Groups, verify the generated assignments.
More Information
Periods with Sales Deviations [page 267]
8 Global Reporting
Use
Global reporting is used to analyze the market share of products compared to the most important competitors in different countries. Working with data from various databases to create a global reporting base poses a number of challenges:
● Defining and Selecting Relevant Data
For global reporting, not all data is relevant at the level of detail provided in the database. Data providers compile a wide variety of details (for example, regional information, store size, and sales channel) and information about competitors and products that might be interesting from a local perspective but not for global reporting. In global reporting the focus is generally on the total market for a country, so the responsible user (role Global Market Data Provider) has to pick the data that is relevant for global analyses and consolidate it.
● Managing Heterogeneous Data
Different market research data providers deliver databases to the manufacturers. The content and structure of these databases varies from data provider to data provider and from country to country. Even databases from the same data provider and country can vary.
○ Databases have different delivery cycles and time granularities, and they are not always available at the same time.
Example: Databases with bimonthly periods are delivered six times a year, whereas databases with monthly periods are delivered twelve times a year.
SAP Demand Signal Management, version for SAP BW/4HANA still enables monthly reporting in this situation by offering an extrapolation function. Missing data records or databases of poor quality can be temporarily replaced by extrapolated values until the next delivery or a corrected delivery is available.
○ Databases have different time granularities, for example, weekly or bimonthly. These granularities have to be converted into a common reporting granularity using the time split function.
○ Databases use different taxonomies. The classification of attributes for a database that you use for detailed country-specific reporting can be different from the one you use for global reporting. It is the task of the Global Market Data Provider to harmonize these global attributes.
Example: A competitor who is very strong in one specific country but is not active in other countries can be ignored in a global analysis. You therefore assign it to a general node like Other Competitors.
● Ensuring the Quality of Data
Usually, several people from different departments are involved in buying, uploading, and using market research data. The local or regional marketing departments buy market research databases to perform a detailed analysis for the specific country.
Then the data must be transformed for global marketing using a number of processing steps (consolidation, extrapolation, time split, and harmonization). The Global Market Data Supervisor has to check the quality of the transformed data for each data delivery from a global perspective.
● Scheduling Availability of Global Reports
The Global Market Data Supervisor checks when new data from the various databases is available for global reporting and schedules the publishing dates. Finding the right point in time for publishing can be challenging: you want to publish your data as early as possible, but at the same time you have to ensure that all data deliveries are available and not too many values are extrapolated.
Activities
Provide Data
1. Data Consolidation
You define selection rules to pick only those data records (master data and transaction data) that are relevant for global reporting.
2. Extrapolation
Depending on the database, the reporting-relevant data is provided in different granularities. Therefore, it is possible that at a scheduled reporting date not all data for a specific reporting period has been delivered yet. You can then extrapolate data into future periods to enable reporting even if some data deliveries are still missing. The missing data is temporarily replaced by extrapolated data.
3. Time Split
To provide a common time granularity for reporting, data records in different source time granularities must be split and converted into a common target time granularity.
4. Global Harmonization
For the data that is relevant for global reporting, you define global attributes using data harmonization. The degree of harmonization is visible when you release your data later on in the process.
Supervise Data
1. Releasing Global Data
The uploaded data from a new delivery must be checked and released manually before it can be used in global reporting. Only a limited set of users is authorized to perform this check. If the results of the check are satisfactory, the complete delivery can be released. The data is still not visible to the marketing department at this point.
2. Publishing Global Data
After the individual data deliveries have been released, you have to decide whether you want to make the data available for reporting. This is not done for a complete delivery but for certain global category and country combinations across several data deliveries. As long as the new data has not been released and published for reporting, only the Global Market Data Supervisor can see it.
More Information
Support of Global Reporting in Data Harmonization [page 218]
8.1 Provide Data for Global Reporting
Use
The following steps have to be performed for each new market research database that is uploaded and used for global reporting. The list also describes integration aspects between the data model, data upload, and data harmonization.
Provide Data for Global Reporting
1. You define a data delivery agreement.
2. You set up the quality validation process that checks the data before it is extracted to the global layer.
Quality validation for global reporting checks automatically whether the subset of products and locations you have selected during data consolidation covers the total market sufficiently. It is based on the check function that has been configured for data consolidation. If the check fails, the upload to global reporting is not carried out. As long as you have not maintained any consolidation rules, for example for the first upload, the upload to global reporting stops at this point.
3. The system uploads the data to the local layer. At this point, the data can be harmonized and used in local reports. The first upload stops during quality validation because no data consolidation has been defined yet.
4. On the Data Consolidation UI, you define which master data and transaction data is relevant for global reporting. All products and locations that meet the selection rules defined in data consolidation are passed as Relevant for Global Reporting to data harmonization. Only the harmonized products and harmonized locations that are assigned to these products and locations appear as global products and global locations in global reporting. Furthermore, only the retail panel data that is assigned to global products and global locations and has the consolidated source time granularity can be used for global reporting. Thus, the amount of data in global reporting is usually a subset of the data that has been uploaded to the local layer.
5. You define extrapolation and time split profiles and assign them to a data delivery agreement. You define how the system performs extrapolation and time split and how seasonal deviations should be taken into account.
6. You restart the process step of quality validation to trigger the system upload to global reporting. The system executes the check, which now no longer fails because consolidation rules are available. Extrapolation and time split are carried out automatically by the system as part of the upload to global reporting. The extrapolation function generates additional local time strings that receive time characteristics as attributes for reporting. The time split function generates global time strings that receive a reduced set of time characteristics as attributes for global reporting. These time strings correspond to the system's calendar definition. Both time master data and retail panel data are consistently available for global reporting.
7. In data harmonization, you maintain global attributes for all the global products and global locations. Once you have done that, you mark the global products and global locations as Released for Global Reporting. This status is passed to the Release Data for Global Reporting UI to indicate the completion rate in data harmonization.
More Information
● Data Consolidation [page 279]
● Time Split [page 285]
● Extrapolation [page 281]
● Support of Global Reporting in Data Harmonization [page 218]
8.1.1 Data Consolidation
Use
During data consolidation you select which master data records and transactional data records are relevant for global reporting.
To compare the totals, you first select the overall market value that was delivered by the data provider and afterwards the single products or nodes of the product hierarchy. The values are aggregated by the system and you can verify if the total matches the delivered pre-aggregated value.
Prerequisites
You have set up the check function for data consolidation in Customizing for Cross-Application Components under Demand Data Foundation > Data Upload Settings for Retail Panel Data > General Settings for Global Reporting.
You have defined a sequence of functions, including threshold values, for the execution time Before Data Extraction to Global Reporting. On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Quality Validation > Configure Quality Validation.
Activities
You define the consolidation settings for each data delivery agreement:
Define Total Market
Each transactional record is based on the three dimensions time, product, and location:
● Time
The time granularity of the data delivery agreement is preselected. If the data delivery agreement contains several granularities, you can choose which one you would like to use for global reporting (for example, month).
● Product
The uploaded files can contain several hierarchies. You select the hierarchy and the Product Total that is used for the totals comparison. The system shows you the top nodes of the selected hierarchy sorted by sales value to support your decision.
● Location
You then define the total market you want to report on. Again, the available locations for the product total are sorted by sales value. In the Location Analytics area you can switch between different views:
○ Bar chart displaying location totals. Click on the desired bar to select it for data consolidation.
○ List view displaying location totals. Choose Select to select a location for data consolidation.
○ List view displaying the source locations. Choose Select to select a location for data consolidation.
Note
In exceptional cases you can select several locations. Examples include:
● One data delivery contains data for multiple countries.
● The data delivery does not contain a total market. Several individual locations have to be combined to form the market.
If you select several locations, the follow-on steps have to be executed for every single location.
Select Products
In the second step you select the products that contribute to the total market selected in the previous step. By defining selection rules you can specify the relevant products. The rules should be defined in a way that they can be reused for several data deliveries. It is therefore recommended to define your rules based on common attributes (for example, product hierarchy level) instead of selecting single products by product number. If later on a new data delivery contains a product with the same attribute, the rule can be reused.
Under Source Data by Level you can switch between different views:
● Bar chart displaying the aggregated values of each level.
● List view displaying the aggregated values of each level.
○ Click on Add Hierarchy Level to Consolidation to add the level as a rule.
○ Click on Compare to Hierarchy Level Below to display all products that have no children in the next lower level. This function allows you to identify products that are available only on a higher level in the product hierarchy and would not be selected if your consolidation rule selected only products of the lowest hierarchy level.
○ Click on Show Details for this Level to display all single products of the corresponding level. Here you can get detailed data for each product and add individual products.
Compare Market Total with Aggregated Products
To ensure that you have selected all the relevant data records, you can compare the totals of your selection with the pre-aggregated totals delivered by the market research company. The totals usually do not match completely. You can therefore define what degree of deviation is acceptable. The function FILTER_MKT_COVERAGE in quality validation checks your selection at the execution time Before Extraction to Global Reporting. The system displays the deviation between the different totals along with the sales value of the totals and additional information, like the threshold range and the deviation, in the check results.
If the threshold is exceeded, you have to go back and revise your product consolidation definition.
If the check for a data delivery fails, the data selection must be adjusted before the data is made available for global reporting.
In addition, the check provides the number of selected products and the number of selected products with sales data. You can use this information to ensure a consistent end-to-end upload process when you compare against the number of harmonized records of your data delivery in data harmonization or against the product statistics in the Release Data for Global Reporting UI.
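The totals comparison boils down to aggregating the selected records and checking the deviation from the delivered pre-aggregated total against a threshold. The following is a hypothetical sketch of that check, not the actual FILTER_MKT_COVERAGE implementation; the function names and numbers are made up for illustration.

```python
# Illustrative totals comparison for data consolidation.
# Hypothetical helpers; the real check runs inside quality validation.

def coverage_deviation(selected_totals, delivered_total):
    """Percentage deviation of the aggregated selection from the
    pre-aggregated total delivered by the market research company."""
    aggregated = sum(selected_totals)
    return abs(aggregated - delivered_total) / delivered_total * 100.0

def coverage_ok(selected_totals, delivered_total, threshold_pct):
    """True if the deviation stays within the configured threshold."""
    return coverage_deviation(selected_totals, delivered_total) <= threshold_pct

# Selection covers 980 of a delivered total of 1000 -> 2% deviation
deviation = coverage_deviation([500.0, 480.0], 1000.0)
```

With a 5% threshold this selection would pass; with a 1% threshold it would fail and the product consolidation definition would have to be revised.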
More Information
Configuring Quality Validation [page 203]
Overview of Key Figures in Quality Validation [page 204]
8.1.2 Extrapolation
Use
As different databases have different delivery schedules and different time granularities, it is rarely the case that all new data records for a particular reporting period are available at the same time. However, reporting should be possible as soon as the most important data is available without the need to wait for the reporting period to be complete. You can therefore use historic data and generate extrapolated values for the missing time periods that have not been delivered yet. The results of the extrapolation can be simulated and displayed on the UI.
Example
All monthly deliveries that cover the month of January are available in the middle of February. However, databases with bimonthly data covering the period of January and February are delivered in March. You would like to report on the January data in early February. You therefore extrapolate the bimonthly data for January using historic data.
Extrapolation Based on Weighting Factors
To reflect seasonal deviations in product sales you can define weighting factors for the corresponding periods. These weighting factors can be independent of the year or you can define exceptional factors for single calendar years. The weighting factors are grouped in profiles and then assigned to the data delivery agreement. In the weighting profile you enter absolute numbers for the days, weeks, or months to indicate how they contribute to the overall figures.
You can specify how many historic periods should be taken into account and how many future periods shall be extrapolated.
The results of the extrapolation can be simulated and displayed on the UI. During data upload the system calculates the extrapolated values for each product-location combination taking the weighting factors and the corresponding historic values into account.
The extrapolated values are calculated based on the following formula:
expol_j = gew_j * ( sum(sales_hist) / sum(gew_hist) )
The parameters used in this formula have the following meaning:
● sum(sales_hist) is the sum of all historic sales
● sum(gew_hist) is the sum of all historic weighting factors
● j is the period to be extrapolated
● gew_j is the weighting factor of the period j
● expol_j is the extrapolated value of period j
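The formula can be sketched as a small function. The data values below are invented for illustration; only the formula itself comes from the documentation.

```python
# The documented extrapolation formula:
#   expol_j = gew_j * ( sum(sales_hist) / sum(gew_hist) )

def extrapolate(weight_j, hist_sales, hist_weights):
    """Extrapolated value for period j from historic sales and
    the corresponding historic weighting factors."""
    return weight_j * (sum(hist_sales) / sum(hist_weights))

# Three historic periods with their weighting factors (example data);
# period j has weighting factor 1.2, e.g. a seasonally strong month.
hist_sales = [100.0, 110.0, 90.0]   # sum = 300
hist_weights = [1.0, 1.1, 0.9]      # sum = 3.0
expol_j = extrapolate(1.2, hist_sales, hist_weights)  # about 120.0
```

Intuitively, the quotient is the average sales per unit of weight over the historic periods, which is then scaled by the weight of the period to be extrapolated.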
Generate Profiles and Weighting Factors
You can automatically generate profiles and weighting factors for each data delivery agreement using the time series of one product of the delivery as a reference, for example, the product representing the total market. The system generates the weighting factors for the reference product using the Triple Exponential Smoothing algorithm of the SAP HANA Predictive Analysis Library (PAL). To adjust the weighting factors to the most current data, you can regenerate the weighting factors automatically each time an extrapolation is executed during data upload. You can also regenerate weighting factors manually from time to time, when required.
Simple Extrapolation without Weighting Factors
Alternatively, you can use the extrapolation method Simple Extrapolation which provides product-specific extrapolation based on PAL Triple Exponential Smoothing where applicable and extrapolation based on weighting factors for all other products. If you choose this method the system executes the following steps for each product during data upload:
● Check if the time series of the product is complete.
● If the time series is complete, perform a season/linear trend analysis for this time series and check if the unexplained rest is less than 50%.
● If the unexplained rest is less than 50%, extrapolate this time series with Triple Exponential Smoothing.
All other products (products with incomplete time series or products with complete time series with an unexplained rest of more than 50% after season/linear trend analysis) are extrapolated using weighting factors that were created during run time with the product total defined in data consolidation as a reference.
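The decision flow above can be sketched as follows. The seasonal-fit measure here is a crude stand-in for the PAL season/linear trend analysis, and all function names are hypothetical; only the branching logic and the 50% threshold follow the documentation.

```python
# Sketch of the Simple Extrapolation method selection per product.
# unexplained_rest() is a simplistic stand-in for the PAL analysis.
from statistics import mean

def unexplained_rest(series, season_len):
    """Share of total variance not explained by simple seasonal means."""
    seasonal = [mean(series[i::season_len]) for i in range(season_len)]
    residuals = [x - seasonal[i % season_len] for i, x in enumerate(series)]
    total_var = sum((x - mean(series)) ** 2 for x in series)
    if total_var == 0:
        return 0.0
    return sum(r * r for r in residuals) / total_var

def choose_method(series, season_len, expected_len):
    # incomplete time series -> fall back to weighting factors
    if len(series) < expected_len or any(v is None for v in series):
        return "weighting_factors"
    # complete series with a well-explained seasonal pattern -> PAL
    if unexplained_rest(series, season_len) < 0.5:
        return "triple_exponential_smoothing"
    return "weighting_factors"

# Two years of data with a clean repeating seasonal pattern
seasonal_series = [10, 20, 30, 40] * 6
method = choose_method(seasonal_series, season_len=4, expected_len=24)
```

A product with a clean seasonal pattern is routed to Triple Exponential Smoothing, while incomplete or noisy series fall back to weighting factors, mirroring the steps listed above.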
Simple Extrapolation allows each product to be extrapolated with its individual seasonal trend pattern, as long as the product fulfills the above-mentioned conditions. Another advantage of this method is that no weighting factors must be maintained, generated, or persisted in the system. If you use the method Weighting Factors, you have one set of factors (derived from the reference product) that is applied to all products of the data delivery.
For products that are no longer sold or that are in the process of being discontinued, no extrapolation values or only few extrapolated values are displayed.
Note
The time series of the data delivery must contain the data of at least two years. A data consolidation check for the respective data delivery agreement must be performed beforehand. It is recommended to use an extrapolation horizon (the number of future periods) of three to six months to facilitate global reporting for the respective database.
For more information on the algorithm, search for HANA Triple Exponential Smoothing at http://help.sap.com/analytics .
Simulation of Extrapolation
You can simulate the extrapolation results for both extrapolation methods. By default, the product with the highest sales volume is displayed first. Most likely, this is the product total or the top node of the product hierarchy. To simulate the extrapolation for products that are selected in data consolidation as relevant for global reporting, you can restrict the products using the Display Consolidated Products function in the search help. The list of consolidated products is sorted by sales volume. Within the simulation you can navigate to the next or previous product of the sorted list using the arrow buttons.
More Information
Rules for Defining Profiles [page 292]
8.1.3 Example: Extrapolation
Use
On April 12 the global marketing manager needs a report of the countries Spain, France, and Italy covering the month of March. The time granularity and the delivery schedule for the data vary from country to country. In our example the granularities are as follows:
● Spain: Weekly data; last delivery on April 10, next delivery on May 9
● France: Four-weekly data; last delivery on April 4, next delivery on May 2
● Italy: Bimonthly data; last delivery on March 8, next delivery on May 8
On April 12 the reporting period March is covered completely for Spain whereas the data for France is only partly available. The missing data must be extrapolated.
For Italy all data must be extrapolated because the data for March will only be delivered on May 8 as part of the bimonthly delivery for March and April.
Extrapolation is executed according to the granularity of the source data.
Example
Figure: Available Data
Figure: Extrapolation Based on Granularity of the Source Data
Figure: Extrapolated Data
8.1.4 Time Split
Use
To prepare multi-country retail panel data for global reporting, you usually have to combine data with different time granularities, depending on the data provider and the country from which the database originates. Common granularities in the source data include weekly, monthly, four-weekly, or bimonthly granularities.
The granularity used for global reporting, however, is either calendar week or calendar month. Therefore, numbers like sales value, sales volume, and sales quantity provided in the source data must be transformed from the source granularity into the corresponding values of the reporting granularity.
Example
Global reporting in your company is done by calendar month. You receive databases with monthly, bimonthly, and four-weekly source granularities. These must be transformed into monthly periods using the time split function. You define how to break down the source values or aggregate them to match the granularity of the reporting period.
Time Split Profiles
To reflect seasonal deviations, you define time split profiles that determine how data in a given source granularity is distributed into a common reporting granularity. Depending on the source data and the reporting granularity, you define one or two of the following split profiles:
● Monthly profile
Monthly profiles contain weighting factors for each month of a year. This allows you, for example, to define how the source values of bimonthly periods are split into the corresponding two calendar months.
● Weekly profile
Weekly profiles contain weighting factors for the weeks of a year. This allows you, for example, to split four-weekly source data into the corresponding weeks.
● Weekday profile
Weekday profiles define weighting factors for each day of the week. This allows you to reflect typical sales deviations within a week. You can, for example, indicate that you expect higher sales on Fridays and Saturdays compared to the days at the beginning of a week. This profile is needed to decide how to distribute the sales values of a week that starts at the end of a month and ends in the following month.
You can either define split profiles independent of the year or as exceptions for particular years.
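The weekday profile's role can be sketched for the boundary case it is designed for: a week whose first days fall into one calendar month and whose remaining days fall into the next. The function and the weighting factors below are hypothetical example values, not a documented profile.

```python
# Hypothetical weekday-profile split of a week that straddles a
# month boundary. Weights are example values, ordered Mon..Sun.

def split_boundary_week(week_sales, weekday_weights, days_in_first_month):
    """Distribute a week's sales across two months: the first
    `days_in_first_month` weekdays belong to the earlier month."""
    total_weight = sum(weekday_weights)
    first_share = sum(weekday_weights[:days_in_first_month]) / total_weight
    return week_sales * first_share, week_sales * (1.0 - first_share)

# Higher weights on Thu/Fri/Sat, as suggested in the profile text;
# here Mon-Wed (3 days) fall into the earlier month.
weights = [1.0, 1.0, 1.0, 1.5, 2.0, 2.0, 1.5]   # sum = 10.0
earlier, later = split_boundary_week(1000.0, weights, 3)
```

Because the heavier-weighted Thursday-Saturday falls into the later month, that month receives the larger share of the week's sales value.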
Prerequisites
You have defined the time granularities that exist in your market research data. For more information, see the Customizing documentation for retail panel data under Cross-Application Components > Demand Data Foundation > Data Upload Settings for Retail Panel Data > Define Time Granularity.
You have assigned a time granularity to the data delivery agreement in the SAP Easy Access Menu under Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements for Market Research Data > Enhance Data Delivery Agreements.
Features
Define Time Split
You define the time split for a data delivery agreement or in exceptional cases for locations of a data delivery agreement. You enter the desired reporting granularity, either week or month, and assign a split profile.
Simulate Time Split
You can simulate the time split for a data delivery. The system displays the records as stacked columns in the source time granularity, for example, weeks. In a second diagram, the system displays the data records in the target time granularity, for example in the granularity Months that is used in global reporting. You can use the tooltips to verify which sales value of the source period contributes to which target period or to verify which fraction of the target sales value has its origin in which source period.
More Information
Rules for Defining Profiles [page 292]
8.1.5 Time Split: Example for Weekday Split Profile
Global reporting in your company is based on the reporting granularity Calendar Month. You receive market research databases with the following source granularities:
● Spain: Weekly data
● France: Four-weekly data
● Italy: Bimonthly data
You want to prepare your data for the reporting period February and the country Spain.
Figure: Weeks Belonging to Two Calendar Months Are Split
The following weekday split profile is assigned to the data delivery agreement of the Spanish database. The weighting factors you enter for the days indicate that on Thursdays, Fridays, and Saturdays sales are generally higher than on the other days:
Figure: Distributed Sales Values After Time Split
8.1.6 Time Split: Example for Weekly Split Profile
Use
Global reporting in your company is based on the reporting granularity Calendar Month. You receive market research databases with the following source granularities:
● Spain: Weekly data
● France: Four-weekly data
● Italy: Bimonthly data
You want to prepare your data for the reporting period February and the country France.
SAP Demand Signal Management, version for SAP BW/4HANAGlobal Reporting C O N F I D E N T I A L 289
Figure: Weeks Belonging to Two Calendar Months Are Split
The following weekly split profile is assigned to the data delivery agreement of the French database:
Figure: Distributed Sales Values After Time Split
After the time split into weeks has been executed, the system executes a second split for the two overlapping weeks 201205 and 201209. This is done using a weekday profile.
More Information
Time Split: Example for Weekday Split Profile [page 287]
8.1.7 Time Split: Example for Monthly Split Profile
Global reporting in your company is based on the reporting granularity Calendar Month. You receive market research databases with the following source granularities:
● Spain: Weekly data
● France: Four-weekly data
● Italy: Bimonthly data
You want to prepare your data for the reporting period February and the country Italy.
Bimonthly Delivery
Monthly Split Profile Applied to Sales Value
8.1.8 Rules for Defining Profiles
The number and type of profile you need depends on the source granularity and on the desired target granularity of your data.
The following rules apply for extrapolation profiles:
Source Granularity | Extrapolation Based on Weekly Profile | Extrapolation Based on Monthly Profile
Weekly Data | X | -
Monthly Data | - | X
Bimonthly Data | - | X
Four-weekly Data | X | -
The following rules apply for time split profiles:
Source Granularity | Reporting Granularity: Week | Reporting Granularity: Month
Weekly data | Weekday profiles | Weekday profiles
Monthly data | Weekly profiles, weekday profiles | No profiles required; 1:1 mapping
Bimonthly data | Weekly profiles, weekday profiles | Monthly profiles
Four-weekly data | Weekly profiles, weekday profiles | Weekly profiles, weekday profiles
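The two rule tables above can be encoded as simple lookups. This is only a sketch for clarity; the keys and profile names are illustrative, not system identifiers.

```python
# The profile rules encoded as lookup tables (illustrative names).

EXTRAPOLATION_PROFILE = {
    # source granularity -> profile the extrapolation is based on
    "weekly": "weekly profile",
    "monthly": "monthly profile",
    "bimonthly": "monthly profile",
    "four-weekly": "weekly profile",
}

TIME_SPLIT_PROFILES = {
    # (source granularity, reporting granularity) -> required profiles
    ("weekly", "week"): ["weekday profile"],
    ("weekly", "month"): ["weekday profile"],
    ("monthly", "week"): ["weekly profile", "weekday profile"],
    ("monthly", "month"): [],  # 1:1 mapping, no profile required
    ("bimonthly", "week"): ["weekly profile", "weekday profile"],
    ("bimonthly", "month"): ["monthly profile"],
    ("four-weekly", "week"): ["weekly profile", "weekday profile"],
    ("four-weekly", "month"): ["weekly profile", "weekday profile"],
}

def required_profiles(source, reporting):
    """Profiles needed to split data of the given source granularity."""
    return TIME_SPLIT_PROFILES[(source, reporting)]
```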
8.1.9 Delete Unused Profiles
You can use transaction /DDF/MR_PROF_DEL to delete profiles that are no longer used. This transaction is available in the SAP Easy Access menu under Cross-Application Components > Demand Signal Management > Data Model > Global Reporting > Delete Unused Profiles.
You can delete profiles of the following types:
● Extrapolation profile
● Time split profile
● Monthly profile
● Weekly profile
● Weekday profile
If you do not enter any selection criteria, the system deletes all profiles that are no longer used. Before you execute the deletion, you can choose Simulate to see which profiles will be deleted and which profiles are still in use and therefore cannot be deleted.
Delete Individual Profiles Manually
You can also delete individual profiles directly on the Extrapolation Profiles or Time Split Profiles UIs. Select the profiles that are no longer needed in the list and choose Delete Profile.
8.2 Supervise Data for Global Reporting
Use
Once the data has been consolidated, converted to the desired time granularity, and potential gaps have been filled with extrapolated data, you must explicitly release and publish your data for global reporting. The preparatory steps for supervising the data are performed in the planning phase. The harmonization of global attributes as well as the releasing and publishing of data deliveries take place in the operating phase.
Supervise Data for Global Reporting
Planning Phase
The planning phase typically takes place at the beginning of each year. You plan your deliveries and define the date by which the deliveries will contain data. With the help of publishing groups, you define which combinations of countries and global categories must be available to the end user by which date. Based on the date by which all data deliveries contributing to a publishing group are complete, you define your publishing dates.
Operating Phase
The steps in the operating phase are performed for every new database that you upload in the course of the year. You classify or revise the automatic classification of global attributes in data harmonization. Then you check the quality and the degree of harmonization before releasing the data for global reporting. At the end of the process, you publish the data for the end user. All deliveries contributing to a publishing group and publishing date must be released before that.
Integration between Data Model, Data Upload, and Data Harmonization
In data harmonization, you maintain global attributes for all global products and global locations. Once this process is complete, you mark the global products and global locations as Released for Global Reporting. This status is passed to the Release Data for Global Reporting UI to indicate the completion rate in data harmonization. Each delivery of a market research database is represented by a process ID in the InfoCube Market Research Global Retail Panel (/DDF/C01).
The key figures for each process ID are calculated as a delta based on the previous delivery of the same database. This historization of market research key figures allows for reporting on released data as well as on the most recent changes of data that has not been released yet.
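The delta-based historization can be sketched as follows. This is a simplified illustration of the principle, not the actual InfoCube logic; the data shapes are assumptions.

```python
# Each delivery (process ID) stores only the change versus the previous
# delivery of the same database. Summing the deltas up to a given process
# ID therefore reconstructs the key figures as reported by that delivery.

def to_deltas(snapshots):
    """snapshots: list of {key: value} dicts per delivery, oldest first."""
    deltas, previous = [], {}
    for snap in snapshots:
        keys = set(snap) | set(previous)
        deltas.append({k: snap.get(k, 0) - previous.get(k, 0) for k in keys})
        previous = snap
    return deltas

def as_of(deltas, n):
    """Reconstruct the reported state after the first n deliveries."""
    state = {}
    for delta in deltas[:n]:
        for k, v in delta.items():
            state[k] = state.get(k, 0) + v
    return {k: v for k, v in state.items() if v != 0}

# Second delivery restates January and adds February
deliveries = [
    {("P1", "2018-01"): 100},
    {("P1", "2018-01"): 120, ("P1", "2018-02"): 80},
]
deltas = to_deltas(deliveries)
```

This is what allows reporting on released data and, separately, on the most recent unreleased changes: a query either includes the latest delta or stops at the last released process ID.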
All deliveries that have been uploaded to global reporting form the worklist of the Release Data for Global Reporting UI. Releasing the data changes the attribute Release Status of the InfoObject Process ID (/DDF/LOAD_ID). Once a publishing group has been published, all combinations of country and global category contained in this publishing group are marked as Published in the InfoObject Country (Publishing) (/DDF/PUBLISH).

The InfoCube Market Research Global Retail Panel, the InfoObject Country (Publishing), and the InfoObjects Global Product and Global Location are combined in the CompositeProvider Global Reporting (/DDF/CP01). This combination enables reporting of market research key figures by global attributes while the global attributes remain flexibly changeable in data harmonization, without the need for realignment.

You can restrict the visibility of the data to the end user by using the InfoObjects Country (Publishing) and Process ID in queries, for example queries on the CompositeProvider Global Reporting. End users can only see data that has been released and published. Users providing and supervising data for global reporting can see the data at any time.
More Information
● Release Data for Global Reporting [page 295]
● Manage Publishing Groups [page 297]
● Publish Data [page 299]
8.2.1 Release Data for Global Reporting
Use
Before you publish your data for global reporting you must check the quality of each data delivery and then decide whether you want to release or reject it.
The application Release Data for Global Reporting supports you in answering the following questions:
● Does the data look reasonable? Do I understand the changes that were made?
● Has the data been transformed correctly?
● Are the data harmonization process and the classification of global attributes complete?
Worklist
The worklist on the left gives you an overview of all data deliveries. You can search for a name, a date, and a percentage value from data harmonization. This completion rate from harmonization indicates the percentage
of records that were manually set to Released for Global Reporting in data harmonization. This value is displayed in red, yellow, or green to indicate whether the harmonization status is satisfactory.
Additional details like the number of days until the publishing date help you prioritize your data deliveries. The worklist is always grouped by status of delivery. By default, all pending deliveries are displayed.
The buttons at the bottom of the screen allow you to sort and filter the data deliveries in the worklist. You can sort the list by delivery status, completion rate in harmonization, name of the data delivery agreement, and number of days to publishing date. You can filter the list of data deliveries using the following criteria:
● Status (Pending, Released, Published, Rejected, Previously rejected, now pending, Previously rejected, now released, Previously rejected, now published)
● Countries (of locations relevant for global reporting)
● Global categories (of products relevant for global reporting)
● Publishing groups (to which data deliveries contribute)
● Time stamps of data delivery (Uploaded At, Released At, Published At)
General
This screen contains the details for the selected data delivery. It shows which countries and global categories are included in the data delivery as well as the status and time information when the data delivery was uploaded, released and published. You can see the period that is covered by this delivery.
You can see which publishing groups the data is relevant for and directly navigate to these publishing groups.
In addition, the status and time information of the previous delivery is displayed. This helps you understand how much the two deliveries overlap in time and how they are related with regard to their statuses.
Harmonization
This screen shows you the completion rate from data harmonization weighted by sales value. It also shows the harmonization status by different attributes:
● Country
● Global Manufacturer
● Global Brand
● Global Subcategory
● Global Category
Changes
This screen allows you to compare the sales value of the current delivery with the previous delivery to check if there are unexpected changes in the history of the data. Different views (table, line chart, column chart) are available to compare the changes. You can also switch to full-screen mode.
Statistics
This screen allows you to analyze the following:
● Number of products of the current delivery
● Number of products of the previous delivery
● Number of new products comparing current and previous delivery
● Number of missing products comparing current and previous delivery
These numbers enable you to verify if there are changes in the product master data of the new delivery that require your attention.
You can choose between different views (pie chart, column chart, table) and group the products by Global Category, Global Brand, Global Manufacturer, or Global Subcategory. By clicking any item in the chart or table, you get a detailed list of all corresponding global products. The list offers a text search and a download function. You can navigate directly from a product to the Search and Mass Change Harmonized Products UI in data harmonization.
Release/Reject
After reviewing the status of your data delivery you can either release or reject the delivery for publishing.
If you decide that the quality of your data delivery is not acceptable you can reject the data delivery and enter a reason for your rejection.
Example: The delivery contains incorrect product categories.
If you reject a data delivery its status changes to Rejected. Once a new delivery of the same data delivery agreement has been uploaded to global reporting, the status of the delivery changes to Previously rejected, now pending. If you release this new delivery, the previously rejected delivery is released as well. Releasing a data delivery always releases all previous data deliveries that have not been released yet.
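The cascading release described above can be sketched as a small state transition. The statuses come from the worklist described earlier; the simplified logic itself is an assumption, not the shipped implementation.

```python
# Sketch of the release cascade: releasing a delivery also releases every
# earlier delivery of the same data delivery agreement that has not been
# released yet. Previously rejected deliveries move to the corresponding
# "Previously rejected, now released" status.

def release(deliveries, index):
    """deliveries: list of statuses, oldest first. Release delivery
    `index` and cascade to all earlier non-released deliveries."""
    releasable = ("Pending", "Rejected", "Previously rejected, now pending")
    for i in range(index + 1):
        if deliveries[i] in releasable:
            deliveries[i] = ("Previously rejected, now released"
                             if "rejected" in deliveries[i].lower()
                             else "Released")
    return deliveries

# Releasing the newest delivery also releases the rejected middle one
statuses = ["Released", "Rejected", "Pending"]
release(statuses, 2)
```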
The system allows a number of predefined status transitions. If you want to allow additional status transitions, you can define them in Customizing. These additional status transitions are interpreted with priority when the system determines the correct target status of a data delivery. For more information, see the Customizing for retail panel data under Cross-Application Components > Demand Data Foundation > Data Upload > Settings for Retail Panel Data > Define Additional Release Status Transitions for Global Reporting.
Revoke Release
You can cancel the release of a data delivery. This sets the status of the data delivery to Pending. The status of all other data deliveries is not changed. You can only revoke the release of a data delivery if the following prerequisites are met:
● The data delivery has not had the status Published yet.
● There is no later data delivery of the data delivery agreement that has already been released. If that is the case, you must revoke the later data delivery first.
More Information
Manage Publishing Groups [page 297]
8.2.2 Manage Publishing Groups
Use
This application allows you to plan which combination of countries and product categories you want to publish together and on which date you want to publish the data of a reporting month. It provides an overview of the
data delivery agreements that contribute to a publishing group with detailed information about all relevant current deliveries to decide if publishing for a given reporting month is possible.
Worklist
The worklist gives you an overview of the existing publishing groups and shows the number of planned publishing dates for each group.
You can create a publishing group by selecting Add at the bottom of the worklist. You enter a name and a descriptive note (optional), and add the countries and global categories that you would like to assign to your publishing group. After saving the publishing group, you can later edit or delete the group using the functions at the bottom of the screen. You can sort the worklist by name or descriptive note of the publishing group.
Dates
On the Dates tab in the Publishing Group Details area, you see a list of the scheduled publishing dates of the corresponding reporting months. You see the status, for example, that shows whether a date is overdue, still planned, or whether the reporting month is already published. Here you can define new publishing dates or delete existing ones. For existing dates you can navigate to the Publishing Date Details screen.
Once the publishing date has been created, you can click the reporting month to navigate to the Reporting Month Details screen. You can navigate from all tabs of the Publishing Group Details to the Release Data for Global Reporting app. Only the data delivery agreements of the relevant publishing groups are listed in the worklist of this application.
Define Publishing Date
You select a publishing date for your reporting month by choosing Define New Date on the Dates tab. Based on the planned delivery dates for each combination of countries, global categories, and data delivery agreements, the system then determines whether the deliveries related to this publishing group will be available or whether parts of the month, or even the entire month, have to be extrapolated.
At the same time, the system also determines the corresponding values for the previous and following delivery dates that are relevant for this publishing group. By moving the publishing date, you can see what percentage of extrapolated data would be available on a particular publishing date and then decide whether the selected date makes sense.
You can reduce the number of entries in the list by using a filter on the combinations of countries and product categories. You could, for example, only display combinations that contain extrapolated data or have a certain source time granularity.
Publishing Date Details
When you create a publishing date, the system determines, based on the current planned delivery dates for each combination of countries, global categories, and data delivery agreements, whether the deliveries related to this publishing group will be available or whether parts of the month, or even the entire month, have to be extrapolated.
Reporting Month Details
The system evaluates and displays the rate of extrapolated sales values for all currently released deliveries of the publishing group. This supports you in deciding whether the publishing of data for the reporting month is reasonable because almost all values are available and only a small part of data is extrapolated. If you see, for example, that important deliveries are due soon, you might want to wait for these deliveries to reduce the percentage of extrapolated values.
You have different options to display the sales values (for example, full stacked chart, vertical stacked chart, horizontal stacked chart, or table). If you decide to publish the data, you can directly navigate from this screen to the Publish Data app.
Countries and Global Categories
By clicking Countries or Categories, you see which combinations of countries and global categories are assigned to the publishing group. By default, all combinations are displayed for which retail panel data exists in the system and that have originally been selected. However, you can also deselect individual combinations of countries and global categories if they are not relevant for the publishing group. This can be done by using the Switch On/Off function for a specific combination in the Country Details or Category Details screen.
Data Delivery Agreements
The Data Deliveries tab gives you an overview of the status of the relevant data deliveries grouped by country. It shows you the number of data deliveries that are not yet released and allows you to navigate to the data delivery agreement details. Here all single data delivery agreements and the current delivery statuses are listed with additional information like upload time, sales value, and so on.
From the details you can then navigate further to the Release Data for Global Reporting app with the specific data delivery agreement as a filter.
More Information
Publish Data [page 299]
8.2.3 Publish Data
This application allows you to publish data for global reporting. It gives you an overview of all scheduled publishing dates and shows all the information you need to complete the publishing process.
You can see whether a publishing date is overdue and whether the group is ready to be published because all related data deliveries have been released.
You can see how many deliveries are expected or pending. The status Pending indicates that the deliveries have been uploaded but not yet released for global reporting. To see the details of the status, you can navigate to the applications Delivery Monitor or Release Data for Global Reporting.
Change Calendar Views
You can choose between different calendar views:
● Three weeks
● Four weeks
● Three months
Publish Data
Once all the relevant data deliveries for a publishing group have been released, you can choose Publish to make the data available for global reporting.
All the released data deliveries that contain data for any combination of country and global category that is part of the publishing group are set to the status Completed.
Postpone Publishing
If it is not possible to publish your data on the scheduled date, you can choose Postpone and select a new publishing date.
Unpublish Data
You can also revert published data to the last published version. Once the data has been published and the status changes to Published, you can select the item and choose Unpublish.
Caution: The application does not allow you to unpublish data that has not been published.
8.3 Global Market Share Analysis
Use
As a Global Marketing Manager you evaluate the market share of your global brands looking at various aspects. You are looking for answers to the following questions:
● What is the short-term or long-term trend of my brand in the different countries?
● Which brand is gaining or losing market share in a country?
● Why does a brand gain or lose market share?
● What are the key products that drive the change?
Prerequisites
You have defined a market group. For more information, see Define Market Groups [page 72].
The data you want to report on has been published.
The system displays the market shares for the last reporting month that was published for all combinations of countries and global categories in your market group.
For more information on configuration, see Configuring and Administrating Global Market Share Analysis [page 72].
Global Market Share Overview
You typically start with an overview of market shares of your global brand in all countries and global categories that you are interested in. To analyze the short-term performance, you compare the past three months (P3M) with the same period of the previous year. For a long-term trend evaluation you compare the last year with the same period of the previous year (moving annual total (MAT)). Positive and negative market share trends are highlighted by the system in a way that you can easily identify the country-category combination that requires your attention.
Gainer/Loser Overview
In the next step you want to see for one specific country-category combination which brands gained or lost market share and how the overall market performed in comparison. You would like to analyze the history of the sales data of your brand as well as your competitor brands. For gainer brands you would like to understand what went well, whereas for loser brands you need to analyze what caused the negative trend.
Root Cause Analysis
In the last step you would like to know whether the positive or negative trend of your brand or a competing brand is caused by a significant change in distribution, prices, or the introduction of new products. If possible, you would like to drill down to the specific product level to see which products the market share changes can be attributed to.
More Information
Global Market Share Overview [page 301]
Gainer/Loser Overview [page 303]
Root Cause Analysis [page 304]
8.3.1 Global Market Share Overview
Use
Before entering the Global Market Share Overview, you select a market group. The subsequent screen gives you an overview of the market shares for all available combinations of countries and global categories in the selected market group. In addition, you see the market share changes compared to the same period of the previous year.
You have the following options to select market share values:
● Value: P3M (market share by sales value in the last three months up to the last published month)
● Volume: P3M (market share by sales quantity in the last three months up to the last published month)
● Value: MAT (market share by sales value for moving annual total, meaning the last 12 months up to the last published month)
● Volume: MAT (market share by sales quantity for moving annual total)
In the header you always see the current selection.
By clicking any of the values for a country or category, you can navigate to the Gainer/Loser Overview that provides you with further details for this value.
For direct access to the global market share analysis of a particular market group, you can save dedicated tiles for these market groups using the Save as Tile function.
Determination of the Last Published Month
Global Market Share Analysis is based on the latest data that has been published. The system determines the last published month for every combination of country and global category. The latest month that is published for all country-category combinations of the market group is used as the last period for P3M (past three months) and MAT (moving annual total).
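The determination described above can be sketched as a small function: take the latest published month per country-category combination, then the minimum of those, so that the resulting period is published for every combination in the market group. The data shape is an assumption for illustration.

```python
# Sketch of the last-published-month determination. "YYYY-MM" strings sort
# chronologically, so max/min can be used directly.

def last_published_month(published):
    """published: {(country, category): [published months as 'YYYY-MM']}.

    Returns the latest month published for ALL combinations, or None if
    any combination has no published data at all."""
    latest_per_combo = [max(months) for months in published.values() if months]
    if len(latest_per_combo) < len(published):
        return None
    return min(latest_per_combo)

# Spain is published through March, France only through February:
data = {
    ("ES", "Beverages"): ["2018-01", "2018-02", "2018-03"],
    ("FR", "Beverages"): ["2018-01", "2018-02"],
}
```

For this example, February is the last period usable for P3M and MAT, because March is not yet published for France.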
Key Figures
For each global category, you see the brand market share in the left column and the brand market share growth in the right column. These key figures are calculated as follows:
● Brand market share is the market share of the selected brand compared to the total market. The total market is defined as the total sales value or sales quantity of the global category in a specific country in the selected time frame.
● Brand market share growth shows the difference of the brand market share in the selected time frame and the brand market share in the comparison time frame.
Negative trends are marked in red, positive are marked in green to allow easy identification of the areas where special attention is required.
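The two key figures above can be expressed as a minimal sketch. The sales numbers are assumed examples; only the formulas follow the definitions in the text.

```python
# Brand market share as a percentage of the total market, and market share
# growth as the difference in percentage points between the selected time
# frame and the comparison time frame.

def brand_market_share(brand_sales, market_sales):
    """Share of the brand in the total market (sales value or quantity)."""
    return 100.0 * brand_sales / market_sales

# P3M sales value of a brand vs. the whole global category in one country
share_current = brand_market_share(250.0, 1000.0)     # selected time frame
share_comparison = brand_market_share(200.0, 1000.0)  # same period last year
share_growth = share_current - share_comparison       # in percentage points
```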
By clicking any of the values for a country or category, you can navigate to the Gainer/Loser Overview that provides you with further details for this value.
More Information
Gainer/Loser Overview [page 303]
Global Market Share Analysis [page 300]
8.3.2 Gainer/Loser Overview
Use
You navigate to this screen based on the global brand, global category, reporting time frame, comparison time frame and the key figure type (value or volume) you selected in the Global Market Share Overview. In the top section of the overview screen you see the following key figures:
● Brand: Market Share: Market share of the global brand
● Brand: Market Share Growth: Market share growth of the global brand
● Brand Sales Growth (Value/Volume): Sales growth of the global brand
● Market Sales Growth (Value/Volume): Sales growth of the overall market
For more information on how these key figures are calculated, see Key Figures in Global Market Share Analysis [page 307].
History Charts
You can navigate horizontally using the arrow buttons to display charts with the complete history for the following key figures:
● Brand: Market Share History: Brand market share
● Brand: Sales History (Value/Volume): Brand sales
● Market: Sales History (Value/Volume): Market sales
The time frame covered by these charts is typically much longer than the selected P3M or MAT time frame. It contains all months for which published data exists, allowing you to visualize long term trends.
By clicking the Sales History (Value/Volume) chart, you can choose between different representations of the history data and compare the data with one or multiple other competing brands.
By clicking the Market: Sales History (Value/Volume) chart, you can compare the market sales with the sales curve of one of the global brands.
Filter Brand Lists by Market Share Coverage
A global category may contain hundreds of brands, many of which are irrelevant for the assessment of your selected brand. Therefore, you can set a threshold for the percentage of the total market share that should be covered by the leading brands of the market.
In addition, you can restrict the number of brands to be displayed.
These are personalized settings that are saved for each user.
List of Gainer/Loser Brands
To see how your brand performed compared to other brands in the market, you can analyze the lists of gainer brands and loser brands at the bottom of the screen based on the following key figures:
● Brand: Market Share Growth shows the difference of the brand market share in the selected time frame and the brand market share in the comparison time frame.
● Brand: Market Share is the market share of the gainer or loser brand compared to the total market. The total market is defined as the total sales value or sales quantity of the global category in a specific country in the selected time frame.
Both lists are sorted by Market Share Growth.
To gain further insights into the performance of individual brands, you can click the brand to navigate to the Root Cause Analysis screen.
More Information
Global Market Share Analysis [page 300]
Global Market Share Overview [page 301]
Root Cause Analysis [page 304]
8.3.3 Root Cause Analysis
Use
This screen provides information about the average price and average price growth of the brand and lists a number of influencing factors that could be relevant for the brand's change in market share. A number of positive and negative influences are evaluated.
You can use the filter function to define which products should be considered. For a detailed description of how the key figures for positive and negative influences are calculated, see Influencing Factors for Root Cause Analysis [page 305].
You can navigate to this screen from the Gainer/Loser Overview. The values you see are based on the gainer/loser brand, country, global category, time frame, comparison time frame, and key figure type (value or volume) you selected.
At the top of the screen you see the following key figures:
● Brand: Market Share
● Brand: Market Growth
● Brand: Sales Growth (Value/Volume)
● Average Price
● Average Price Growth
For more information on how these key figures are calculated, see Key Figures in Global Market Share Analysis [page 307].
History Charts
You can navigate horizontally using the arrow buttons to display charts with the complete history for the following key figures:
● Brand market share
● Brand sales value/volume
● Market sales value/volume
The time frame covered by these charts is typically much longer than the selected P3M or MAT time frame. It contains all months for which published data exists, allowing you to visualize long term trends.
By clicking these charts you open a dialog window which allows you to perform a more detailed analysis of the values.
Influencing Factors for Market Share Change
At the bottom of the screen you see a list of positive and negative influencing factors. For products that contribute significantly (depending on threshold values) to a market share change, the following influences are evaluated by the algorithms described below:
● New Product (can appear only as positive influence)
● Weighted Distribution
● Pricing Strategy to Increase Sales (Value)
● Pricing Strategy to Increase Sales (Volume)
● Others
Products that contribute significantly to the market share change but cannot be assigned to one of the above-mentioned influences are grouped under the influence Others. When you navigate from the Gainer/Loser Overview to this screen, the system evaluates whether a product is assigned to one of the influences.
By clicking an influencing factor you can drill down to the product level and see to which products the influences can mainly be attributed.
For a detailed description of how the key figures for positive and negative influences are calculated, see Influencing Factors for Root Cause Analysis [page 305].
More Information
Global Market Share Analysis [page 300]
Gainer/Loser Overview [page 303]
8.3.3.1 Influencing Factors for Root Cause Analysis
The influencing factors in Root Cause Analysis are defined and calculated as follows:
New Products
A product is considered as new if no sales occurred for this product for a specific period of time. You can define a threshold for this period by selecting a number of months. The system calculates this period as a range of the number of months defined by the threshold, ending at the last month of the current period, as defined in the Global Market Share Overview.
Accordingly, a global product is considered a new product if the following condition holds throughout the defined period:
Sum (Sales Value of Product) = 0
By default, the threshold for new products is set to 12 months. This means that a product is considered as a new product if it has a zero sales value for the MAT.
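The check can be sketched as follows. The 12-month default is taken from the text above; the data shape and the example values are assumptions.

```python
# Sketch of the new-product check: a product counts as new if its sales
# value sums to zero throughout the threshold window of months.

def is_new_product(monthly_sales, threshold_months=12):
    """monthly_sales: sales values per month, oldest first, ending at the
    last month of the period determined in the Global Market Share
    Overview. Returns True if Sum(Sales Value) = 0 over the window."""
    window = monthly_sales[-threshold_months:]
    return sum(window) == 0

no_history = [0.0] * 12               # zero sales for the whole window -> new
established = [0.0] * 6 + [10.0] * 6  # sales within the window -> not new
```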
Weighted Distribution
Weighted distribution is considered a root cause in one of the two following cases (either positive or negative influence):
● If the weighted distribution minus the previous weighted distribution is bigger than the distribution threshold, this is considered a positive influence:
Weighted Distribution – Previous Weighted Distribution > Distribution Threshold = Positive Influence
● If the previous weighted distribution minus the weighted distribution is bigger than the distribution threshold, this is considered a negative influence:
Previous Weighted Distribution – Weighted Distribution > Distribution Threshold = Negative Influence
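The two distribution conditions can be sketched as one function. The threshold and distribution values are assumed examples.

```python
# Sketch of the weighted-distribution rule: the change versus the previous
# period must exceed the distribution threshold in either direction to
# count as an influence; otherwise distribution is not a root cause.

def distribution_influence(current, previous, threshold):
    """Return 'positive', 'negative', or None per the rules above."""
    if current - previous > threshold:
        return "positive"
    if previous - current > threshold:
        return "negative"
    return None
```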
Pricing Strategy to Increase Sales (Value)
The price index is calculated according to the formula:
Price Index = Average Price (Product) / Average Price (Market Group)
Pricing Strategy to Increase Sales (Value) is considered a root cause if the price index changes as follows:
Price Index – Previous Price Index > Price Change by Value Threshold (in percentage points)
The system counts the root cause as a positive influence, if the product has increased in market share at the same time. Otherwise, it counts the root cause as a negative influence. You can define thresholds for both values.
Prices are calculated as sales values divided by sales units. Sales volumes and their units of measure are not taken into account.
Pricing Strategy to Increase Sales (Volume)
The price index is calculated according to the formula:
Price Index = Average Price (Product) / Average Price (Market Group)
Pricing Strategy to Increase Sales (Volume) is considered a root cause if the price index changes as follows:
Previous Price Index – Price Index > Price Change by Volume Threshold (in percentage points)
The system counts the root cause as a positive influence, if the product has increased in market share at the same time. Otherwise, it counts the root cause as a negative influence. You can define thresholds for both values.
Prices are calculated as sales values divided by sales units. Sales volumes and their units of measure are not taken into account.
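The price-index conditions for both pricing-strategy root causes (value and volume) can be sketched together as follows. This is an illustrative Python sketch; the function names, parameter names, and sample thresholds are assumptions:

```python
def price_index(avg_price_product, avg_price_market_group):
    """Price Index = Average Price (Product) / Average Price (Market Group)."""
    return avg_price_product / avg_price_market_group

def pricing_root_cause(pi, prev_pi, value_threshold, volume_threshold,
                       market_share_increased):
    """Return (root_cause, influence) or (None, None).
    An index increase beyond the value threshold points to a pricing
    strategy to increase sales value; a decrease beyond the volume
    threshold points to a strategy to increase sales volume. The sign of
    the influence follows the market share movement."""
    influence = "positive" if market_share_increased else "negative"
    if pi - prev_pi > value_threshold:
        return ("value", influence)
    if prev_pi - pi > volume_threshold:
        return ("volume", influence)
    return (None, None)

pi_now = price_index(12.0, 10.0)   # 1.2
pi_prev = price_index(10.0, 10.0)  # 1.0
print(pricing_root_cause(pi_now, pi_prev, 0.1, 0.1, True))
```

Note that, as stated above, the prices fed into the index are always sales values divided by sales units; sales volumes play no role in the index itself.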
Others
All of the preselected global products that do not satisfy any of the other root causes are counted as Others.
The product details of the various root causes are listed together with the following key figures:
● Market Share
● Market Share Growth = Market Share – Previous Market Share (in percentage points)
● Price
● Price Growth = (Price – Previous Price) / Previous Price (in percent)
● Distribution = Weighted Distribution
● Distribution Growth = Weighted Distribution – Previous Weighted Distribution (in percentage points)
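The product-level key figures listed above can be sketched as follows (illustrative Python; the function name and sample numbers are assumptions):

```python
def product_key_figures(market_share, prev_market_share, price, prev_price,
                        weighted_dist, prev_weighted_dist):
    """Compute the key figures shown in the root-cause product details."""
    return {
        "market_share": market_share,
        # growth in percentage points
        "market_share_growth_pp": market_share - prev_market_share,
        "price": price,
        # growth in percent of the previous price
        "price_growth_pct": (price - prev_price) / prev_price * 100,
        "distribution": weighted_dist,
        "distribution_growth_pp": weighted_dist - prev_weighted_dist,
    }

kf = product_key_figures(4.5, 4.0, 11.0, 10.0, 75.0, 70.0)
print(kf["market_share_growth_pp"])           # 0.5
print(round(kf["price_growth_pct"], 6))       # 10.0
```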
8.3.4 Key Figures in Global Market Share Analysis
Brand: Market Share
Brand market share is the market share of the selected brand compared to the total market, meaning the total sales value or sales quantity of the global category in the selected country and time frame.
Brand: Market Share Growth
Brand market share growth is the difference of the brand market share for the selected time frame and the brand market share for the comparison time frame.
Brand: Sales Growth (Value/Volume)
Brand sales growth is the growth of the sales value or sales quantity compared to the sales values or sales quantities of the comparison time frame.
Market: Sales Growth (Value/Volume)
Market sales growth is the growth of the sales value or sales quantity for the entire global category and country compared to the sales values or sales quantities of the comparison time frame.
Average Price
Average price is the brand sales value divided by the brand sales quantity within the given time frame for the global category and country.
Average Price Growth
Average price growth is the difference between the average price and the average price of the comparison time frame, divided by the average price of the comparison time frame, expressed in percent.
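The key-figure definitions in this section can be sketched together as follows (illustrative Python; the function name and the sample numbers are assumptions, with "_cp" denoting the comparison time frame):

```python
def brand_key_figures(brand_sales, brand_qty, market_sales,
                      brand_sales_cp, brand_qty_cp, market_sales_cp):
    """Brand market share/growth, sales growth, and average price figures."""
    share = brand_sales / market_sales * 100          # in percent
    share_cp = brand_sales_cp / market_sales_cp * 100
    avg_price = brand_sales / brand_qty
    avg_price_cp = brand_sales_cp / brand_qty_cp
    return {
        "market_share_pct": share,
        "market_share_growth_pp": share - share_cp,   # percentage points
        "brand_sales_growth_pct":
            (brand_sales - brand_sales_cp) / brand_sales_cp * 100,
        "market_sales_growth_pct":
            (market_sales - market_sales_cp) / market_sales_cp * 100,
        "avg_price": avg_price,
        "avg_price_growth_pct":
            (avg_price - avg_price_cp) / avg_price_cp * 100,
    }

kf = brand_key_figures(200.0, 100.0, 1000.0, 150.0, 100.0, 1000.0)
print(round(kf["market_share_growth_pp"], 1))  # 5.0
```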
8.3.5 Calculated Key Figures for Global Market Share Analysis
Use
The following calculated key figures contribute to the two queries /DDF/CP03_Q0001 and /DDF/CP03_Q0002:
Technical Name | Name | Calculation
/DDF/CP03_CK_AVGPRICE | Average Price | /DDF/CP03_RK_BRANDSALES / /DDF/CP03_RK_BRANDSALQUA
/DDF/CP03_CK_AVGPRICE_CP | Average Price CP | /DDF/CP03_RK_BRANDSALES_CP / /DDF/CP03_RK_BRANDSALQUA_CP
/DDF/CP03_CK_AVGPRICE_GROWTH | Average Price Growth | (/DDF/CP03_CK_AVGPRICE - /DDF/CP03_CK_AVGPRICE_CP) %A /DDF/CP03_CK_AVGPRICE_CP
/DDF/CP03_CK_BRANDSALES_GROWTH | Brand Growth Value % | (/DDF/CP03_RK_BRANDSALES - /DDF/CP03_RK_BRANDSALES_CP) %A /DDF/CP03_RK_BRANDSALES_CP
/DDF/CP03_CK_BRANDSALQUA_GROWT | Brand Sales Quantity Growth % | (/DDF/CP03_RK_BRANDSALQUA - /DDF/CP03_RK_BRANDSALQUA_CP) %A /DDF/CP03_RK_BRANDSALQUA_CP
/DDF/CP03_CK_BRANDSALQUA_SHARE | Brand Sales Quantity Share % | /DDF/CP03_RK_BRANDSALQUA %A /DDF/CP03_RK_MKTSALQUA
/DDF/CP03_CK_BRANDSALES_SHARE | Brand Sales Share % | /DDF/CP03_RK_BRANDSALES %A /DDF/CP03_RK_MKTSALES
/DDF/CP03_CK_BRANDSLS_SHARE_CP | Brand Sales Share CP % | /DDF/CP03_RK_BRANDSALES_CP %A /DDF/CP03_RK_MKTSALES_CP
/DDF/CP03_CK_BRANDSLS_SHARE_DI | Brand Sales Share Diff | /DDF/CP03_CK_BRANDSALES_SHARE - /DDF/CP03_CK_BRANDSLS_SHARE_CP
/DDF/CP03_CK_BRANDQUA_SHARE_CP | Brand Sales Volume Share CP | /DDF/CP03_RK_BRANDSALQUA_CP %A /DDF/CP03_RK_MKTSALQUA_CP
/DDF/CP03_CK_BRANDQUA_SHARE_DI | Brand Sales Volume Share Difference | /DDF/CP03_CK_BRANDSALQUA_SHARE - /DDF/CP03_CK_BRANDQUA_SHARE_CP
/DDF/CP03_CK_MKTSALES_GROWTH | Market Sales Growth % | (/DDF/CP03_RK_MKTSALES - /DDF/CP03_RK_MKTSALES_CP) %A /DDF/CP03_RK_MKTSALES_CP
/DDF/CP03_CK_MKTSALQUA_GROWTH | Market Sales Quantity Growth % | (/DDF/CP03_RK_MKTSALQUA - /DDF/CP03_RK_MKTSALQUA_CP) %A /DDF/CP03_RK_MKTSALQUA_CP
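In the calculation formulas, "/" denotes division and "%A" is the BEx percentage-share operator. Assuming the usual BEx reading of %A (operand 1 as a percentage of operand 2), the Average Price Growth calculation can be sketched as:

```python
def pct_share(a, b):
    """Rough Python equivalent of the BEx '%A' operator:
    the percentage share of operand a in operand b (100 * a / b)."""
    return 100.0 * a / b

# Average Price Growth = (AVGPRICE - AVGPRICE_CP) %A AVGPRICE_CP
avgprice, avgprice_cp = 12.0, 10.0
growth = pct_share(avgprice - avgprice_cp, avgprice_cp)
print(growth)  # 20.0
```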
More Information
The BI content documentation for the following queries:
8.3.6 Restricted Key Figures for Global Market Share Analysis
Use
The following restricted key figures contribute to the two queries /DDF/CP03_Q0001 and /DDF/CP03_Q0002:
Technical Name | Name | Restriction
/DDF/CP03_RK_BRANDSALES | Brand Sales | Key figure /DDF/GMSVAL restricted by calendar month /DDF/I_MFT and brand /DDF/PBRAND
/DDF/CP03_RK_BRANDSALES_CP | Brand Sales CP | Key figure /DDF/GMSVAL restricted by calendar month /DDF/I_CPMFT and brand /DDF/PBRAND
/DDF/CP03_RK_BRANDSALQUA | Brand Sales Quantity | Key figure /DDF/GMSQUA restricted by calendar month /DDF/I_MFT and brand /DDF/PBRAND
/DDF/CP03_RK_BRANDSALQUA_CP | Brand Sales Quantity CP | Key figure /DDF/GMSQUA restricted by calendar month /DDF/I_CPMFT and brand /DDF/PBRAND
/DDF/CP03_RK_MKTSALES | Market Sales | Key figure /DDF/GMSVAL restricted by calendar month /DDF/I_MFT
/DDF/CP03_RK_MKTSALES_CP | Market Sales CP | Key figure /DDF/GMSVAL restricted by calendar month /DDF/I_CPMFT
/DDF/CP03_RK_MKTSALQUA | Market Sales Quantity | Key figure /DDF/GMSQUA restricted by calendar month /DDF/I_MFT
/DDF/CP03_RK_MKTSALQUA_CP | Market Sales Quantity CP | Key figure /DDF/GMSQUA restricted by calendar month /DDF/I_CPMFT
More Information
The BI content documentation for the following queries:
8.3.7 Configuring and Administrating Global Market Share Analysis
As a data provider for Global Market Share Analysis, you have the following tasks:
● Define Market Groups [page 72]
● Include Images in Global Market Share Analysis User Interfaces [page 73]
● Define Path for Image Files [page 74] or Mass Definition of Image Paths [page 75]
8.3.7.1 Define Market Groups
Use
You use this transaction to define for each global brand the combinations of countries and global categories that you would like to report on.
These combinations are then stored as market groups in the InfoObject /DDF/MRMGRP. The assigned countries are stored in the InfoObject /DDF/MRMGRPM and the assigned global categories are stored in /DDF/MRMGRPC.
All defined market groups are available for selection when you start the application Global Market Share Analysis. The transaction Define Market Groups is available in the SAP Easy Access Menu under Cross-Application Components > Demand Signal Management > Analytics and Reporting > Global Market Share Analysis.
You can define several market groups for one global brand to reflect different responsibilities for global categories and countries. To determine the corresponding countries and global categories, you can either use a publishing group or explicitly assign individual countries and global categories to a market group. You can also combine both options.
You can change, display, or delete market groups. To add or remove a country or a global category, simply list all the countries and global categories that you want to include in the selection options.
Note: The number of combinations of countries and global categories is limited due to restrictions on the length of a URL. You can include up to 8 global categories and 25 countries. If you reduce the number of global categories, you can extend the list of countries.
Note: If you change a publishing group, for example, by replacing one country with another one, you must adjust the market group that corresponds to the publishing group.
More Information
Manage Publishing Groups [page 297]
8.3.7.2 Include Images in Global Market Share Analysis User Interfaces
Use
For the following object types you can include images in the user interfaces of Global Market Share Analysis:
● Country
● Global Brand
● Global Category
Note: Only the MIME types image/png and image/jpeg are supported, so you can use only PNG and JPG files.
Process
1. You upload the corresponding image files into the MIME repository of your system.
2. You define the path for each image file to the corresponding object key of the InfoObject using the transaction Define Path for Image Files (/DDF/MIME) or Mass Definition of Image Paths (/DDF/MIMEDEF).
Note: Be aware of copyright. You are responsible for the images you use.
More Information
Define Path for Image Files [page 74]
Mass Definition of Image Paths [page 75]
8.3.7.2.1 Define Path for Image Files
Use
You use the transaction Define Path for Image Files (/DDF/MIME) to assign image files stored in the MIME repository to the corresponding object keys to make the images visible in the application Global Market Share Analysis. The following object types are supported:
● Country
● Global brand
● Global category
For each object type you must specify the InfoObject that provides the key values that you would like to link to an image path. In the SAP standard the following InfoObjects are used:
Object Type InfoObject
Country 0COUNTRY
Global Brand /DDF/BRAGM_TXT
Global Category /DDF/PCAGM_TXT
Select one object type and define, for the relevant object keys, the path to the image file.
Note: You can define the paths before you import the image files into the MIME repository. If a file does not exist yet, the system issues a warning, but you can save the path definition anyway.
Alternatively, you can use transaction /DDF/MIMEDEF for a convenient mass definition of image paths.
More Information
Mass Definition of Image Paths [page 75]
8.3.7.2.2 Mass Definition of Image Paths
Use
You can use transaction /DDF/MIMEDEF to define the image paths for multiple object keys of a given object type at once.
You have the option to scan a specified folder of the MIME repository for existing image files. The system searches for images with file names that are equal to the selected object keys. If a file name (excluding the file extension) matches an object key, the corresponding assignment of the object key to the path is saved.
Alternatively, you first create the assignments of object keys to image paths. The system then creates the path as a concatenation of the specified path, the object key, and the file extension depending on the chosen image type. Afterwards, you can upload the image files into the MIME repository with the corresponding file name.
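Both variants can be sketched as follows. This is an illustrative Python sketch; the function names and the sample repository folder are assumptions, not the actual transaction logic:

```python
def build_image_path(base_path, object_key, image_type):
    """Assignment-first variant: concatenate the specified folder, the
    object key, and the extension for the chosen image type."""
    ext = {"image/png": ".png", "image/jpeg": ".jpg"}[image_type]
    return base_path.rstrip("/") + "/" + object_key + ext

def match_files_to_keys(file_names, object_keys):
    """Scan variant: map each object key to a file whose name, excluding
    the file extension, equals the key."""
    stems = {name.rsplit(".", 1)[0]: name for name in file_names}
    return {key: stems[key] for key in object_keys if key in stems}

print(build_image_path("/example/images", "DE", "image/png"))
# /example/images/DE.png
print(match_files_to_keys(["DE.png", "FR.jpg"], ["DE", "US"]))
# {'DE': 'DE.png'}
```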
More Information
Define Path for Image Files [page 74]
9 Analytics and Reporting
The analytics and reporting UIs of SAP Demand Signal Management, version for SAP BW/4HANA allow different business users, such as key account managers, product managers, and demand and supply planners to display and analyze the data that is stored in SAP Demand Signal Management, version for SAP BW/4HANA.
Different types of data are available in SAP Demand Signal Management, version for SAP BW/4HANA, including company external data, such as retailer demand data and market research data (retail panel), as well as company internal data, such as master data and trade promotion data. From certain reports, it is also possible to navigate directly to SAP ERP to check if there are any open orders or outbound deliveries for a product. From the Promotion Analysis report, it is possible to navigate directly to SAP Trade Promotion Management to view and edit trade promotion data. It is also possible to integrate SAP Demand Signal Management, version for SAP BW/4HANA with a provider of social media data and analyze the impact of social media on sales.
The reports that are available can be used directly or can be modified to meet your requirements. For more information, see Extensibility of Analytics and Reporting [page 519].
Different software, such as SAP BusinessObjects Dashboards and SAP BusinessObjects Analysis for Microsoft Office, can be used on top of SAP Demand Signal Management, version for SAP BW/4HANA to perform additional analytics and reporting.
UI Frameworks Used to Build the Analytics and Reporting UI
The analytics and reporting UIs are built using the following UI technologies:
● UI development toolkit for HTML5 (SAPUI5)
A user interface technology for building and adapting client applications. The SAPUI5 runtime is a client-side HTML5 rendering library with a rich set of UI controls for building both desktop and mobile applications. For more information, enter the keyword UI Frameworks based on HTML5, JavaScript and CSS in the documentation for SAP NetWeaver on SAP Help Portal at http://help.sap.com .
● Web Dynpro for ABAP
The SAP standard UI technology for developing Web applications in the ABAP environment. The following UI frameworks are used:
○ Floorplan Manager (FPM) for Web Dynpro ABAP
A highly configurable UI framework based on Web Dynpro ABAP. FPM overview page components are used in the canvas area of the analytics and reporting UI. They consist of a main assignment block and a set of assignment blocks that show related information.
○ Web Dynpro ABAP Page Builder
A framework for creating Web Dynpro applications, which can include home pages and side panels, and for defining and structuring CHIPs (Collaborative Human Interface Parts) in various layouts. On the analytics and reporting UI, CHIPs are used in the side panel to include the BusinessGraphics as well as in the Page Builder overview work centers.
○ Business Context Viewer (BCV)
A framework that allows SAP Business Suite applications to integrate different types of additional information into the context of the applications. BCV is used to display the side panel area that includes BusinessGraphics that are relevant to the specific analytics and reporting UI.
○ Personal Object Worklist
A framework that provides centralized, personalized access to a worklist with a general overview of the work environment of business users. It allows business users to define queries and view their results every time they navigate to the worklist or to the overview work center that contains the worklist.
For more information, enter the keyword UI Frameworks Based on Application Server ABAP in the documentation for SAP NetWeaver on SAP Help Portal at http://help.sap.com .
9.1 Global Market Share Analysis
Use
As a Global Marketing Manager you evaluate the market share of your global brands looking at various aspects. You are looking for answers to the following questions:
● What is the short-term or long-term trend of my brand in the different countries?
● Which brand is gaining or losing market share in a country?
● Why does a brand gain or lose market share?
● What are the key products that drive the change?
Prerequisites
You have defined a market group. For more information, see Define Market Groups [page 72].
The data you want to report on has been published.
The system displays the market shares for the last reporting month that was published for all combinations of countries and global categories in your market group.
For more information on configuration, see Configuring and Administrating Global Market Share Analysis [page 72].
Global Market Share Overview
You typically start with an overview of market shares of your global brand in all countries and global categories that you are interested in. To analyze the short-term performance, you compare the past three months (P3M) with the same period of the previous year. For a long-term trend evaluation you compare the last year with the same period of the previous year (moving annual total (MAT)). Positive and negative market share trends are highlighted by the system in a way that you can easily identify the country-category combination that requires your attention.
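The P3M and MAT reporting windows and their previous-year comparison windows can be sketched as follows (illustrative Python, working on (year, month) tuples; the function names are assumptions):

```python
def month_range(last_month, length):
    """Return the (year, month) tuples for a window of `length` months
    ending at `last_month`, in chronological order."""
    year, month = last_month
    months = []
    for _ in range(length):
        months.append((year, month))
        month -= 1
        if month == 0:
            year, month = year - 1, 12
    return list(reversed(months))

def comparison_window(window):
    """The same period of the previous year."""
    return [(y - 1, m) for (y, m) in window]

p3m = month_range((2018, 10), 3)    # past three months
mat = month_range((2018, 10), 12)   # moving annual total
print(p3m)                           # [(2018, 8), (2018, 9), (2018, 10)]
print(comparison_window(p3m))        # [(2017, 8), (2017, 9), (2017, 10)]
```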
Gainer/Loser Overview
In the next step you want to see, for one specific country-category combination, which brands gained or lost market share and how the overall market performed in comparison. You would like to analyze the history of the sales data of your brand as well as your competitor brands. For gainer brands you would like to understand what went well, whereas for loser brands you need to analyze what caused the negative trend.
Root Cause Analysis
In the last step you would like to know whether the positive or negative trend of your brand or a competing brand is caused by a significant change in distribution, prices, or the introduction of new products. If possible, you would like to drill down to the specific product level to see which products the market share changes can be attributed to.
More Information
Global Market Share Overview [page 301]
Gainer/Loser Overview [page 303]
Root Cause Analysis [page 304]
SAP Fiori-based documentation for
9.1.1 Global Market Share Overview
Use
Before entering the Global Market Share Overview, you select a market group. The subsequent screen gives you an overview of the market shares for all available combinations of countries and global categories in the selected market group. In addition, you see the market share changes compared to the same period of the previous year.
You have the following options to select market share values:
● Value: P3M (market share by sales value in the last three months up to the last published month)
● Volume: P3M (market share by sales quantity in the last three months up to the last published month)
● Value: MAT (market share by sales value for moving annual total, meaning the last 12 months up to the last published month)
● Volume: MAT (market share by sales quantity for moving annual total)
In the header you always see the current selection.
By clicking any of the values for a country or category, you can navigate to the Gainer/Loser Overview that provides you with further details for this value.
For direct access to the global market share analysis of a particular market group, you can save dedicated tiles for these market groups using the Save as Tile function.
Determination of the Last Published Month
Global Market Share Analysis is based on the latest data that has been published. The system determines the last published month for every combination of country and global category. The latest month that is published for all country-category combinations of the market group is used as the last period for P3M (past three months) and MAT (moving annual total).
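The determination described above picks the latest month that is published for every combination, which is the minimum of the per-combination maxima. A minimal sketch (illustrative Python; names and sample data are assumptions):

```python
def last_published_month(published_by_combo):
    """published_by_combo maps (country, category) to the latest published
    (year, month) for that combination. The last period used for P3M and
    MAT is the minimum across combinations, so that every combination of
    the market group has data for it."""
    return min(published_by_combo.values())

published = {
    ("DE", "BEVERAGES"): (2018, 10),
    ("FR", "BEVERAGES"): (2018, 9),
    ("DE", "SNACKS"): (2018, 10),
}
print(last_published_month(published))  # (2018, 9)
```

Tuples of (year, month) compare chronologically here because Python compares them element by element.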
Key Figures
For each global category, you see the brand market share in the left column and the brand market share growth in the right column. These key figures are calculated as follows:
● Brand market share is the market share of the selected brand compared to the total market. The total market is defined as the total sales value or sales quantity of the global category in a specific country in the selected time frame.
● Brand market share growth shows the difference of the brand market share in the selected time frame and the brand market share in the comparison time frame.
Negative trends are marked in red and positive trends in green, so that you can easily identify the areas where special attention is required.
By clicking any of the values for a country or category, you can navigate to the Gainer/Loser Overview that provides you with further details for this value.
More Information
Gainer/Loser Overview [page 303]
Global Market Share Analysis [page 300]
9.1.2 Gainer/Loser Overview
Use
You navigate to this screen based on the global brand, global category, reporting time frame, comparison time frame and the key figure type (value or volume) you selected in the Global Market Share Overview. In the top section of the overview screen you see the following key figures:
● Brand: Market Share: Market share of the global brand
● Brand: Market Share Growth: Market share growth of the global brand
● Brand Sales Growth (Value/Volume): Sales growth of the global brand
● Market Sales Growth (Value/Volume): Sales growth of the overall market
For more information on how these key figures are calculated, see Key Figures in Global Market Share Analysis [page 307].
History Charts
You can navigate horizontally using the arrow buttons to display charts with the complete history for the following key figures:
● Brand: Market Share History: Brand market share
● Brand: Sales History (Value/Volume): Brand sales
● Market: Sales History (Value/Volume): Market sales
The time frame covered by these charts is typically much longer than the selected P3M or MAT time frame. It contains all months for which published data exists, allowing you to visualize long-term trends.
By clicking the Sales History (Value/Volume) chart, you can choose between different representations of the history data and compare the data with one or multiple other competing brands.
By clicking the Market: Sales History (Value/Volume) chart, you can compare the market sales with the sales curve of one of the global brands.
Filter Brand Lists by Market Share Coverage
A global category may contain hundreds of brands, many of which are irrelevant for the assessment of your selected brand. Therefore, you can set a threshold for the percentage of the total market share that should be covered by the leading brands of the market.
In addition, you can restrict the number of brands to be displayed.
These are personalized settings that are saved for each user.
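The coverage filter described above can be sketched as follows. This is an illustrative Python sketch; the selection order and names are assumptions (leading brands are taken in descending order of market share until the cumulative share reaches the threshold):

```python
def leading_brands(shares, coverage_threshold, max_brands=None):
    """shares: dict of brand -> market share in percent.
    Keep the largest brands until their cumulative share reaches the
    coverage threshold, optionally capped at max_brands brands."""
    ranked = sorted(shares.items(), key=lambda kv: kv[1], reverse=True)
    selected, cumulative = [], 0.0
    for brand, share in ranked:
        if cumulative >= coverage_threshold:
            break
        if max_brands is not None and len(selected) >= max_brands:
            break
        selected.append(brand)
        cumulative += share
    return selected

shares = {"A": 40.0, "B": 25.0, "C": 15.0, "D": 10.0, "E": 5.0}
print(leading_brands(shares, 75.0))                # ['A', 'B', 'C']
print(leading_brands(shares, 75.0, max_brands=2))  # ['A', 'B']
```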
List of Gainer/Loser Brands
To see how your brand performed compared to other brands in the market, you can analyze the lists of gainer brands and loser brands at the bottom of the screen based on the following key figures:
● Brand: Market Share Growth shows the difference between the brand market share in the selected time frame and the brand market share in the comparison time frame.
● Brand: Market Share is the market share of the gainer or loser brand compared to the total market. The total market is defined as the total sales value or sales quantity of the global category in a specific country in the selected time frame.
Both lists are sorted by Market Share Growth.
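The split into the two sorted lists can be sketched as follows (illustrative Python; the function name is an assumption, as is the sort direction: strongest gainers and strongest losers are assumed to come first in their respective lists):

```python
def gainer_loser_lists(growth_by_brand):
    """Split brands into gainers (positive market share growth) and
    losers (negative growth), each sorted by market share growth."""
    gainers = sorted(((b, g) for b, g in growth_by_brand.items() if g > 0),
                     key=lambda kv: kv[1], reverse=True)
    losers = sorted(((b, g) for b, g in growth_by_brand.items() if g < 0),
                    key=lambda kv: kv[1])
    return gainers, losers

growth = {"A": 1.5, "B": -0.8, "C": 0.3, "D": -2.1}
gainers, losers = gainer_loser_lists(growth)
print(gainers)  # [('A', 1.5), ('C', 0.3)]
print(losers)   # [('D', -2.1), ('B', -0.8)]
```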
To gain further insights into the performance of individual brands, you can click the brand to navigate to the Root Cause Analysis screen.
More Information
Global Market Share Analysis [page 300]
Global Market Share Overview [page 301]
Root Cause Analysis [page 304]
9.1.3 Root Cause Analysis
Use
This screen provides information about the average price and average price growth of the brand and lists a number of influencing factors that could be relevant for the brand's change in market share. A number of positive and negative influences are evaluated.
You can use the filter function to define which products should be considered. For a detailed description of how the key figures for positive and negative influences are calculated, see Influencing Factors for Root Cause Analysis [page 305].
You can navigate to this screen from the Gainer/Loser Overview. The values you see are based on the gainer/loser brand, country, global category, time frame, comparison time frame, and key figure type (value or volume) you selected.
At the top of the screen you see the following key figures:
● Brand: Market Share
● Brand: Market Share Growth
● Brand: Sales Growth (Value/Volume)
● Average Price
● Average Price Growth
For more information on how these key figures are calculated, see Key Figures in Global Market Share Analysis [page 307].
History Charts
You can navigate horizontally using the arrow buttons to display charts with the complete history for the following key figures:
● Brand market share
● Brand sales value/volume
● Market sales value/volume
The time frame covered by these charts is typically much longer than the selected P3M or MAT time frame. It contains all months for which published data exists, allowing you to visualize long term trends.
By clicking these charts you open a dialog window which allows you to perform a more detailed analysis of the values.
Influencing Factors for Market Share Change
At the bottom of the screen you see a list of positive and negative influencing factors. For products that contribute significantly (depending on threshold values) to a market share change, the following influences are evaluated by the algorithms described below:
● New Product (can appear only as positive influence)
● Weighted Distribution
● Pricing Strategy to Increase Sales (Value)
● Pricing Strategy to Increase Sales (Volume)
● Others
Products that contribute significantly to the market share but cannot be assigned to one of the above-mentioned influences are grouped under the influence Others. When you navigate from the Gainer/Loser Overview to this screen, the system evaluates whether a product is assigned to one of the influences.
By clicking an influencing factor you can drill down to the product level and see to which products the influences can mainly be attributed.
For a detailed description of how the key figures for positive and negative influences are calculated, see Influencing Factors for Root Cause Analysis [page 305].
More Information
Global Market Share Analysis [page 300]
Gainer/Loser Overview [page 303]
9.1.3.1 Influencing Factors for Root Cause Analysis
The influencing factors in Root Cause Analysis are defined and calculated as follows:
New Products
A product is considered new if no sales occurred for it during a specific period of time. You can define a threshold for this period by selecting a number of months. The system calculates the period as a range covering the threshold number of months, ending at the last month of the current period, as defined in the Global Market Share Overview.
Accordingly, a global product is considered a new product if the following condition holds throughout the defined period:
Sum (Sales Value of Product) = 0
By default, the threshold for new products is set to 12 months. This means that a product is considered new if it has a zero sales value for the MAT (moving annual total).
Weighted Distribution
Weighted distribution is considered a root cause in one of the two following cases (either positive or negative influence):
● If the weighted distribution minus the previous weighted distribution is bigger than the distribution threshold, this is considered a positive influence:
Weighted Distribution – Previous Weighted Distribution > Distribution Threshold = Positive Influence
● If the previous weighted distribution minus the weighted distribution is bigger than the distribution threshold, this is considered a negative influence:
Previous Weighted Distribution – Weighted Distribution > Distribution Threshold = Negative Influence
Pricing Strategy to Increase Sales (Value)
The price index is calculated according to the formula:
Price Index = Average Price (Product) / Average Price (Market Group)
Pricing Strategy to Increase Sales (Value) is considered a root cause if the price index changes as follows:
Price Index – Previous Price Index > Price Change by Value Threshold (in percentage points)
The system counts the root cause as a positive influence, if the product has increased in market share at the same time. Otherwise, it counts the root cause as a negative influence. You can define thresholds for both values.
Prices are calculated as sales values divided by sales units. Sales volumes and their units of measure are not taken into account.
Pricing Strategy to Increase Sales (Volume)
The price index is calculated according to the formula:
Price Index = Average Price (Product) / Average Price (Market Group)
Pricing Strategy to Increase Sales (Volume) is considered a root cause if the price index changes as follows:
Previous Price Index – Price Index > Price Change by Volume Threshold (in percentage points)
The system counts the root cause as a positive influence, if the product has increased in market share at the same time. Otherwise, it counts the root cause as a negative influence. You can define thresholds for both values.
Prices are calculated as sales values divided by sales units. Sales volumes and their units of measure are not taken into account.
Others
All of the preselected global products that do not satisfy any of the other root causes are counted as Others.
The product details of the various root causes are listed together with the following key figures:
● Market Share
● Market Share Growth = Market Share – Previous Market Share (in percentage points)
● Price
● Price Growth = (Price – Previous Price) / Previous Price (in percent)
● Distribution = Weighted Distribution
● Distribution Growth = Weighted Distribution – Previous Weighted Distribution (in percentage points)
9.1.4 Key Figures in Global Market Share Analysis
Brand: Market Share
Brand market share is the market share of the selected brand compared to the total market, meaning the total sales value or sales quantity of the global category in the selected country and time frame.
Brand: Market Share Growth
Brand market share growth is the difference of the brand market share for the selected time frame and the brand market share for the comparison time frame.
Brand: Sales Growth (Value/Volume)
Brand sales growth is the growth of the sales value or sales quantity compared to the sales values or sales quantities of the comparison time frame.
Market: Sales Growth (Value/Volume)
Market sales growth is the growth of the sales value or sales quantity for the entire global category and country compared to the sales values or sales quantities of the comparison time frame.
Average Price
Average price is the brand sales value divided by the brand sales quantity within the given time frame for the global category and country.
Average Price Growth
Average price growth is the difference between the average price and the average price of the comparison time frame, divided by the average price of the comparison time frame, expressed in percent.
9.1.5 Calculated Key Figures for Global Market Share Analysis
Use
The following calculated key figures contribute to the two queries /DDF/CP03_Q0001 and /DDF/CP03_Q0002:
Technical Name | Name | Calculation
/DDF/CP03_CK_AVGPRICE | Average Price | /DDF/CP03_RK_BRANDSALES / /DDF/CP03_RK_BRANDSALQUA
/DDF/CP03_CK_AVGPRICE_CP | Average Price CP | /DDF/CP03_RK_BRANDSALES_CP / /DDF/CP03_RK_BRANDSALQUA_CP
/DDF/CP03_CK_AVGPRICE_GROWTH | Average Price Growth | (/DDF/CP03_CK_AVGPRICE - /DDF/CP03_CK_AVGPRICE_CP) %A /DDF/CP03_CK_AVGPRICE_CP
/DDF/CP03_CK_BRANDSALES_GROWTH | Brand Growth Value % | (/DDF/CP03_RK_BRANDSALES - /DDF/CP03_RK_BRANDSALES_CP) %A /DDF/CP03_RK_BRANDSALES_CP
/DDF/CP03_CK_BRANDSALQUA_GROWT | Brand Sales Quantity Growth % | (/DDF/CP03_RK_BRANDSALQUA - /DDF/CP03_RK_BRANDSALQUA_CP) %A /DDF/CP03_RK_BRANDSALQUA_CP
/DDF/CP03_CK_BRANDSALQUA_SHARE | Brand Sales Quantity Share % | /DDF/CP03_RK_BRANDSALQUA %A /DDF/CP03_RK_MKTSALQUA
/DDF/CP03_CK_BRANDSALES_SHARE | Brand Sales Share % | /DDF/CP03_RK_BRANDSALES %A /DDF/CP03_RK_MKTSALES
/DDF/CP03_CK_BRANDSLS_SHARE_CP | Brand Sales Share CP % | /DDF/CP03_RK_BRANDSALES_CP %A /DDF/CP03_RK_MKTSALES_CP
/DDF/CP03_CK_BRANDSLS_SHARE_DI | Brand Sales Share Diff | /DDF/CP03_CK_BRANDSALES_SHARE - /DDF/CP03_CK_BRANDSLS_SHARE_CP
/DDF/CP03_CK_BRANDQUA_SHARE_CP | Brand Sales Volume Share CP | /DDF/CP03_RK_BRANDSALQUA_CP %A /DDF/CP03_RK_MKTSALQUA_CP
/DDF/CP03_CK_BRANDQUA_SHARE_DI | Brand Sales Volume Share Difference | /DDF/CP03_CK_BRANDSALQUA_SHARE - /DDF/CP03_CK_BRANDQUA_SHARE_CP
/DDF/CP03_CK_MKTSALES_GROWTH | Market Sales Growth % | (/DDF/CP03_RK_MKTSALES - /DDF/CP03_RK_MKTSALES_CP) %A /DDF/CP03_RK_MKTSALES_CP
/DDF/CP03_CK_MKTSALQUA_GROWTH | Market Sales Quantity Growth % | (/DDF/CP03_RK_MKTSALQUA - /DDF/CP03_RK_MKTSALQUA_CP) %A /DDF/CP03_RK_MKTSALQUA_CP

In the Calculation column, X %A Y denotes the BEx percentage share operator, that is, 100 * (X / Y); X / Y denotes plain division.
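The calculations use BEx formula notation. Assuming %A denotes the BEx percentage share operator (X %A Y = 100 * X / Y), the pattern behind the share and growth key figures can be sketched as:

```python
# Sketch of the calculation pattern, assuming %A is the BEx percentage share
# operator: x %A y = 100 * x / y. Variable names mirror the restricted key
# figures above but are plain Python floats here.

def pct_share(x: float, y: float) -> float:
    """x %A y in BEx notation."""
    return 100.0 * x / y

brand_sales, brand_sales_cp = 1200.0, 1000.0  # current and comparison period
mkt_sales = 4800.0

brand_sales_share = pct_share(brand_sales, mkt_sales)                         # 25.0 %
brand_sales_growth = pct_share(brand_sales - brand_sales_cp, brand_sales_cp)  # 20.0 %
```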
More Information
The BI content documentation for the queries /DDF/CP03_Q0001 and /DDF/CP03_Q0002.
9.1.6 Restricted Key Figures for Global Market Share Analysis
Use
The following restricted key figures contribute to the two queries /DDF/CP03_Q0001 and /DDF/CP03_Q0002:
Technical Name | Name | Restriction
/DDF/CP03_RK_BRANDSALES | Brand Sales | Key figure /DDF/GMSVAL restricted by calendar month /DDF/I_MFT and brand /DDF/PBRAND
/DDF/CP03_RK_BRANDSALES_CP | Brand Sales CP | Key figure /DDF/GMSVAL restricted by calendar month /DDF/I_CPMFT and brand /DDF/PBRAND
/DDF/CP03_RK_BRANDSALQUA | Brand Sales Quantity | Key figure /DDF/GMSQUA restricted by calendar month /DDF/I_MFT and brand /DDF/PBRAND
/DDF/CP03_RK_BRANDSALQUA_CP | Brand Sales Quantity CP | Key figure /DDF/GMSQUA restricted by calendar month /DDF/I_CPMFT and brand /DDF/PBRAND
/DDF/CP03_RK_MKTSALES | Market Sales | Key figure /DDF/GMSVAL restricted by calendar month /DDF/I_MFT
/DDF/CP03_RK_MKTSALES_CP | Market Sales CP | Key figure /DDF/GMSVAL restricted by calendar month /DDF/I_CPMFT
/DDF/CP03_RK_MKTSALQUA | Market Sales Quantity | Key figure /DDF/GMSQUA restricted by calendar month /DDF/I_MFT
/DDF/CP03_RK_MKTSALQUA_CP | Market Sales Quantity CP | Key figure /DDF/GMSQUA restricted by calendar month /DDF/I_CPMFT
More Information
The BI content documentation for the queries /DDF/CP03_Q0001 and /DDF/CP03_Q0002.
9.1.7 Configuring and Administrating Global Market Share Analysis
As a data provider for Global Market Share Analysis, you have the following tasks:
● Define Market Groups [page 72]
● Include Images in Global Market Share Analysis User Interfaces [page 73]
● Define Path for Image Files [page 74] or Mass Definition of Image Paths [page 75]
9.1.7.1 Define Market Groups
Use
You use this transaction to define for each global brand the combinations of countries and global categories that you would like to report on.
These combinations are then stored as market groups in the InfoObject /DDF/MRMGRP. The assigned countries are stored in the InfoObject /DDF/MRMGRPM and the assigned global categories are stored in /DDF/MRMGRPC.
All defined market groups are available for selection when you start the Global Market Share Analysis application. The transaction Define Market Groups is available in the SAP Easy Access menu under Cross-Application Components > Demand Signal Management > Analytics and Reporting > Global Market Share Analysis.
You can define several market groups for one global brand to reflect different responsibilities for global categories and countries. To determine the corresponding countries and global categories, you can either use a publishing group or explicitly assign individual countries and global categories to a market group. You can also combine both options.
You can change, display, or delete market groups. To add or remove a country or a global category, simply list all countries and global categories that you want to include in the selection options.
Note: The number of combinations of countries and global categories is limited due to restrictions on the length of a URL. You can include up to 8 global categories and 25 countries. If you reduce the number of global categories, you can extend the list of countries.
Note: If you change a publishing group, for example by replacing one country with another, you must adjust the market group that corresponds to the publishing group.
More Information
Manage Publishing Groups [page 297]
9.1.7.2 Include Images in Global Market Share Analysis User Interfaces
Use
For the following object types you can include images in the user interfaces of Global Market Share Analysis:
● Country
● Global Brand
● Global Category
Note: Only the MIME types image/png and image/jpeg are supported; you can therefore use only PNG and JPG files.
Process
1. Upload the corresponding image files into the MIME repository of your system.
2. Define the path for each image file for the corresponding object key of the InfoObject, using the transactions Define Path for Image Files (/DDF/MIME) or Mass Definition of Image Paths (/DDF/MIMEDEF).
Note: Be aware of copyright restrictions; you are responsible for the images that you use.
More Information
Define Path for Image Files [page 74]
Mass Definition of Image Paths [page 75]
9.1.7.2.1 Define Path for Image Files
Use
You use the transaction Define Path for Image Files (/DDF/MIME) to assign image files stored in the MIME repository to the corresponding object keys to make the images visible in the application Global Market Share Analysis. The following object types are supported:
● Country
● Global brand
● Global category
For each object type you must specify the InfoObject that provides the key values that you would like to link to an image path. In the SAP standard the following InfoObjects are used:
Object Type | InfoObject
Country | 0COUNTRY
Global Brand | /DDF/BRAGM_TXT
Global Category | /DDF/PCAGM_TXT
Select one object type and define, for the relevant object keys, the path to the image file.
Note: You can define the paths before you import the image files into the MIME repository. If a file does not exist yet, the system issues a warning, but you can save the path definition anyway.
Alternatively, you can use transaction /DDF/MIMEDEF for a convenient mass definition of image paths.
More Information
Mass Definition of Image Paths [page 75]
9.1.7.2.2 Mass Definition of Image Paths
Use
You can use transaction /DDF/MIMEDEF to define the image paths for multiple object keys of a given object type at once.
You have the option to scan a specified folder of the MIME repository for existing image files. The system searches for images with file names that are equal to the selected object keys. If a file name (excluding the file extension) matches an object key, the corresponding assignment of the object key to the path is saved.
Alternatively, you first create the assignments of object keys to image paths. The system then creates the path as a concatenation of the specified path, the object key, and the file extension depending on the chosen image type. Afterwards, you can upload the image files into the MIME repository with the corresponding file name.
More Information
Define Path for Image Files [page 74]
9.2 Key Figures for SAP Demand Signal Management, version for SAP BW/4HANA
The following key figures are available in SAP Demand Signal Management, version for SAP BW/4HANA analytics and reporting:
For each key figure, the definition, the calculation (where applicable), and the reports in which it is available are listed below.

● Actual Sell-Out Sales Quantity / Actual Sell-Out Sales Value
Definition: The actual sell-out sales quantity or value is sent by the retailer as part of the point-of-sale (POS) data.
Available in: Promotion Analysis

● Average Price
Definition: Indicates the average price for which a product is sold.
Calculation: The average price is equal to the sum of the net sales values for the product at different locations divided by the sum of the sales quantities for the product at these locations.
Available in: Product Launch, Sales Analysis

● Baseline Sell-In Sales Quantity / Baseline Sell-In Sales Value
Definition: Retrieved from the Sales and Trade Promotions (Planned+Actual) in SU (0CP_SLSCA) and TPM Base Sales Quantity (0CP_SLSC9) InfoCubes.
Available in: Promotion Analysis

● Calculated Baseline Sales Quantity / Calculated Baseline Sales Value
Definition: Calculated baseline sell-out sales quantity or value at trade promotion, store, region, city, or product level.
Calculation: At trade promotion level, the calculated baseline sell-out sales quantity or value is equal to the average weekly sell-out sales quantity or value for all products and stores that are part of the trade promotion for a reference period before the trade promotion takes place (for example, two weeks), multiplied by the number of weeks in which the trade promotion is running. At store, region, or city level, it is equal to the calculated baseline sales quantity or value at trade promotion level multiplied by the store, region, or city weight; this weight is equal to the actual sales quantity or value for the store, region, or city for the reference period divided by the actual sales quantity or value of all stores, regions, or cities for the reference period. At product level, it is equal to the average sell-out sales quantity or value for the product that is part of the trade promotion for the reference period multiplied by the number of weeks in which the trade promotion is running.
Available in: Promotion Analysis
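The calculated baseline logic can be sketched as follows; the data values and function names are illustrative assumptions, not DSiM objects:

```python
# Sketch of the calculated baseline described above; values are made up.

def baseline_at_promo_level(ref_weekly_sales: list[float], promo_weeks: int) -> float:
    """Average weekly sell-out sales in the reference period times promotion length."""
    avg_week = sum(ref_weekly_sales) / len(ref_weekly_sales)
    return avg_week * promo_weeks

def store_weight(store_ref_sales: float, all_stores_ref_sales: float) -> float:
    """Share of one store in the total reference-period sales."""
    return store_ref_sales / all_stores_ref_sales

# Two reference weeks before a three-week promotion:
promo_baseline = baseline_at_promo_level([100.0, 120.0], promo_weeks=3)  # 330.0
store_a_baseline = promo_baseline * store_weight(55.0, 220.0)            # 82.5
```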
● Gross Sales Value / Net Sales Value
Definition: Indicates the sales amount for a selected period of time, including or excluding taxes. The sales values are sent by the retailer as part of the point-of-sale (POS) data.
Available in: Product Launch, Sales Analysis

● Lost Sales Quantity
Definition: Indicates the sales quantity that a retailer loses per day as a result of an out-of-shelf situation.
Calculation: The lost sales quantity is equal to the average sales quantity when there is an out-of-shelf situation, and is equal to 0 when there is no out-of-shelf situation.
Available in: On-Shelf Availability, On-Shelf Availability Analysis, Out-of-Stock Situations, Out-of-Stock Analysis

● Net Lost Sales Value
Definition: Indicates the sales value, excluding taxes, that a retailer loses per day as a result of an out-of-shelf situation.
Calculation: The lost sales value is equal to the average sales value when there is an out-of-shelf situation, and is equal to 0 when there is no out-of-shelf situation.
Available in: On-Shelf Availability, On-Shelf Availability Analysis, Out-of-Stock Situations, Out-of-Stock Analysis

● Number of Products
Definition: The number of products that meet the selection criteria and for which stock data is calculated.
Available in: Stock Analysis by Location

● Number of Stores Listing Products
Definition: Indicates the number of stores that are selling the selected products.
Available in: Product Launch, Sales Analysis
● Out-of-Shelf Duration
Definition: For all products and locations that meet the selection criteria, the sum of the out-of-shelf duration for each product and location divided by the number of out-of-shelf signals for the product and location.
Calculation (example): If two products are selling at a particular location, and the first product is out-of-shelf on day 1 and day 2 while the second product is out-of-shelf on day 1, day 2, and day 3, the out-of-shelf duration for the location is equal to the sum of the out-of-shelf durations divided by the number of out-of-shelf signals, that is, (1+2+1+2+3)/5 = 1.8.
Available in: On-Shelf Availability, On-Shelf Availability Analysis

● Out-of-Shelf Rate
Definition: Indicates the percentage rate at which a product is out-of-shelf at a selected location in the selected period of time.
Calculation: The out-of-shelf rate is equal to the number of out-of-shelf signals for the selected products within the selected period of time divided by the number of sales signals (that is, the product of selling products, selling days, and selling stores).
Available in: On-Shelf Availability, On-Shelf Availability Analysis

● Out-of-Stock Rate
Definition: Indicates the percentage rate at which a product is out-of-stock at a selected location in the selected period of time.
Calculation: The out-of-stock rate is equal to the number of out-of-stock signals for the selected products within the selected period of time divided by the number of stock signals.
Available in: Out-of-Stock Situations, Out-of-Stock Analysis, Stock Analysis by Product

● Out-of-Shelf Signals
Definition: Indicates the total number of out-of-shelf signals in the selected period.
Available in: On-Shelf Availability, On-Shelf Availability Analysis

● Out-of-Stock Duration
Definition: For all products and locations that meet the selection criteria, the sum of the out-of-stock duration for each product and location divided by the number of out-of-stock signals for the product and location.
Calculation (example): If two locations are selling a particular product, and the first location is out-of-stock on day 1 and day 2 while the second location is out-of-stock on day 1, day 2, day 3, and day 4, the out-of-stock duration for the product is equal to the sum of the out-of-stock durations divided by the number of out-of-stock signals, that is, (1+2+1+2+3+4)/6 ≈ 2.2.
Available in: Out-of-Stock Situations, Out-of-Stock Analysis, Stock Analysis
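The duration averages above can be sketched in a few lines. Note that the exact quotient in the out-of-shelf worked example is (1+2+1+2+3)/5 = 1.8; the list layout below is an assumed representation of the per-day signal durations:

```python
# Sketch of the out-of-shelf / out-of-stock duration average: the sum of all
# recorded signal durations divided by the total number of signals.

def average_duration(signal_durations: list[list[float]]) -> float:
    """signal_durations: one inner list of running daily durations per item."""
    all_signals = [d for per_item in signal_durations for d in per_item]
    return sum(all_signals) / len(all_signals)

# Worked example: product 1 signals durations 1 and 2, product 2 signals
# 1, 2, and 3 -> (1+2+1+2+3) / 5 = 1.8
location_duration = average_duration([[1, 2], [1, 2, 3]])
```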
● Out-of-Stock Signals
Definition: Indicates the total number of out-of-stock signals in the selected period.
Available in: Out-of-Stock Situations, Out-of-Stock Analysis, Stock Analysis

● Percentage of Stores Listing Products
Definition: Indicates the percentage of stores that meet the selection criteria and are selling the selected products.
Available in: Product Launch, Sales Analysis

● Planned Sell-In Sales Quantity / Planned Sell-In Sales Value
Definition: Retrieved from the Sales and Trade Promotions (Planned+Actual) in SU (0CP_SLSCA) and TPM Base Sales Quantity (0CP_SLSC9) InfoCubes.
Available in: Promotion Analysis

● Products Out-of-Stock
Definition: Indicates the number of source products that are out-of-stock at the selected location in the selected period.
Available in: Stock Analysis by Location

● Selling Days
Definition: Indicates the number of days in the selected period for which sales data is available.
Available in: On-Shelf Availability

● Selling Products
Definition: Indicates the number of source products that are available for sale in the selected store in the selected period.
Available in: On-Shelf Availability Analysis by Location

● Selling Stores
Definition: Indicates the number of stores that are selling the selected product in the selected period.
Available in: On-Shelf Availability Analysis by Product

● Stock (Store)
Definition: Indicates the latest available stock value for the selected product at the selected store in the selected period. It can also indicate the sum of the latest available stock values for all selected products at the selected store in the selected period.
Available in: On-Shelf Availability, On-Shelf Availability Analysis, Out-of-Stock Situations, Out-of-Stock Analysis, Stock Analysis

● Stores Out-of-Stock
Definition: Indicates the number of stores where a product that meets the selection criteria is out-of-stock.
Available in: Stock Analysis
● Uplift
Definition: Indicates the difference between the actual sales quantity or value and the calculated baseline sales quantity or value.
Calculation: The uplift is equal to the actual sales quantity or value minus the calculated baseline sales quantity or value.
Available in: Promotion Analysis

● Uplift (%)
Definition: Indicates the percentage of the actual uplift compared to the calculated baseline sales quantity or value.
Calculation: The uplift (%) is equal to the uplift divided by the calculated baseline sales quantity or value.
Available in: Promotion Analysis
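A minimal sketch of the two uplift key figures, with assumed example values (the percentage is expressed here with a factor of 100):

```python
# Sketch of Uplift and Uplift (%): actual sales versus calculated baseline.

def uplift(actual: float, baseline: float) -> float:
    return actual - baseline

def uplift_pct(actual: float, baseline: float) -> float:
    """Uplift as a percentage of the calculated baseline."""
    return uplift(actual, baseline) / baseline * 100.0

u = uplift(450.0, 330.0)      # 120.0
p = uplift_pct(450.0, 330.0)  # about 36.36 percent
```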
10 Integration with SAP Integrated Business Planning
Use
You can transfer POS data from SAP Demand Signal Management (SAP DSiM) to SAP Integrated Business Planning (IBP). This data can be used in SAP Integrated Business Planning for demand to improve the quality of short-term forecasting.
The data model in SAP DSiM differs from the one used in IBP. To enable data integration, the data model in SAP Demand Signal Management, version for SAP BW/4HANA is enhanced by the object manufacturer DC, which is used in IBP as the basis for short-term forecasting.
POS data of retailers uploaded to SAP Demand Signal Management, version for SAP BW/4HANA typically represents the following:
● Sales in retailer stores
● Promotional sales in retailer stores
● Withdrawals from retailer DCs
● Stock in retailer stores
● Stock in retailer DCs
POS data is provided on a daily or weekly basis.
Short-term forecasting in SAP Integrated Business Planning for demand is typically done on the level of the manufacturer DC that supplies one or several IBP customers. The term IBP customer denotes the retailers in the supply chain of the manufacturer. It is therefore not identical with the customer in SAP DSiM.
Demand signals are required in weekly buckets. Therefore, POS sales data or stock data must be aggregated and transformed in SAP Demand Signal Management, version for SAP BW/4HANA to comply with the required planning level in IBP. The following planning levels are used:
● Manufacturer DC
● IBP customer
● Product
● Week
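The aggregation of daily POS records to this planning level can be sketched as follows; the record layout and the ISO-week simplification are assumptions (the actual IBP week definition comes from Customizing):

```python
# Sketch: daily POS records rolled up to the IBP planning level
# (manufacturer DC, IBP customer, product, week).
from collections import defaultdict
from datetime import date

def iso_week(d: date) -> str:
    # Assumption: ISO weeks; the real first weekday is defined in Customizing.
    year, week, _ = d.isocalendar()
    return f"{year}-W{week:02d}"

def aggregate(records):
    buckets = defaultdict(float)
    for rec in records:
        key = (rec["manuf_dc"], rec["ibp_customer"], rec["product"], iso_week(rec["day"]))
        buckets[key] += rec["qty"]
    return dict(buckets)

daily = [
    {"manuf_dc": "DC01", "ibp_customer": "RET_A", "product": "P1",
     "day": date(2018, 11, 26), "qty": 10.0},
    {"manuf_dc": "DC01", "ibp_customer": "RET_A", "product": "P1",
     "day": date(2018, 11, 28), "qty": 5.0},
]
weekly = aggregate(daily)  # both days fall into the same weekly bucket
```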
Figure: Integration between SAP Integrated Business Planning for demand and SAP Demand Signal Management, version for SAP BW/4HANA
More Information
Configuring the Integration with IBP [page 336]
10.1 Configuring the Integration with IBP
To configure the integration with SAP Integrated Business Planning (IBP), you have to perform the following steps:
1. Install the SAP BW Content for IBP.
   Run the report /DSR/BW_CONT_INST_AND_CHECK, as described in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA. Afterwards, check that the following SAP BW metadata is active:
   ○ Process chains:
     ○ /DDF/IBP
     ○ /DDF/LOC_IBP_POS
     ○ /DDF/LOC_IBP_STAGE
     ○ /DDF/MD_SOURCE (enhanced by local process chain /DDF/LOC33)
   ○ SAP HANA CompositeProviders:
     ○ /DDF/CP04
     ○ /DDF/CP05
     ○ /DDF/CP07
     ○ /DDF/CP08
2. In the Data Warehousing Workbench, create a source system of the type ODP_BW. Enter a source system type starting with S and a release identifier of your choice. Once you have created the source system, choose Replicate DataSources from the context menu. Confirm the popup window to display a list of all Operational Data Providers (ODPs) with the context BW. These ODPs are based on DSOs and InfoObjects. Deselect all ODPs except for DS91_F and DS92_F and confirm the list. The system then creates the two ODPs that you can import to an HCI DataStore. For more information, see the SAP HANA Cloud Integration Guide for SAP Integrated Business Planning on the SAP Help Portal at http://help.sap.com/ibp60.
3. Create a source system for the manual upload of company-internal data (materials). This may be either an ECC system or another BW system. If you upload materials from an ECC system and you have not yet connected an ECC system to SAP Demand Signal Management, version for SAP BW/4HANA, carry out the following steps:
   1. Install the relevant source-system-dependent SAP BW content. For more information, see the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
   2. Set up the manual upload of manufacturer products to SAP Demand Signal Management, version for SAP BW/4HANA. In particular, create the data delivery agreement for manual upload, as described in this documentation. Using this agreement, you can import ECC materials as source products to SAP DSiM and make them known to data harmonization.
4. Adjust the setup of data harmonization for ECC materials so that the following conditions are met:
   ○ The retailer's source product and the manufacturer's source product (material) are assigned to the same harmonized product (GPID). You can achieve this, for example, by giving data harmonization by GTIN the highest score when setting up automatic harmonization in the transaction /DDF/FDH_SETUP. For more information, see Setting Up Automatic Harmonization [page 223].
   ○ The base unit of measure of the harmonized record, which corresponds to the preferred unit of measure in SAP BW, and the conversion factors between the two source records and the harmonized record are always maintained. This is the case if you have not defined the base unit of measure as a restricted attribute.
   ○ The base unit of measure delivered with the retailer's source product file does not deviate from the sales unit of the retailer's sales data file or from the base unit of measure of the retailer's stock data file. If these units of measure are not delivered with the retailer's files, the system always uses the consumer unit of measure.
5. Perform the steps for activating the direct update of master data. This is required because the new attributes IBP Customer and Manufacturer DC of the harmonized location have to be updated. For more information, see Activating the Direct Update of Master Data [page 54].
6. Enhance your mapping instructions of the mapping instruction sets SAP_BW_HARMONIZE_LOCAL_RECORD and SAP_EXP_HARM_OBJECT_LOCAL with the new fields for IBP Customer and Manufacturer DC or copy these mapping instructions from the SAP reference client.
7. Make the new location attributes for IBP integration (IBP Customer and Manufacturer DC) available as an attribute bundle in the attribute-specific settings in data harmonization, and add the attributes of the bundle to the set of editable attributes. This enables you to manually change the assignment of certain source locations to an IBP customer or manufacturer DC in the application Mass Change of Harmonized Locations.
8. If you have semantically partitioned your POS sales/stock data using SPOs, ensure that your SAP BW data model fits the assignment of IBP customers and manufacturer DCs that you prepared in the previous step:
   1. For each combination of IBP customer and manufacturer DC that you have assigned in the location assignment file, check that the assigned source locations are either all contained in the same SPO or all contained in various different DSOs. If this is not the case, either rearrange your semantic partitions in SAP DSiM or rearrange the corresponding IBP customers and assignments in the location assignment file, for example, by splitting them.
   2. Once your SAP BW data model fits the assignment of IBP customers and manufacturer DCs, create your own CompositeProviders for checking and extracting POS sales and stock data to IBP.
9. Define the types of sales data and stock data that are relevant for IBP in Customizing. For more information, see the Customizing documentation under Demand Data Foundation > Integration with other SAP Components > Integrated Business Planning.
10. Make the IBP week definition known in SAP DSiM by defining the first weekday of the IBP week in Customizing. For more information, see the Customizing documentation under Demand Data Foundation > Integration with other SAP Components > Integrated Business Planning > General Settings.
10.2 Assigning Retailer Locations to Manufacturer DC and IBP Customer
Use
For the data integration with SAP Integrated Business Planning (IBP) you model the manufacturer DC and IBP customer as InfoObjects in SAP Demand Signal Management, version for SAP BW/4HANA and assign retailer locations (stores and DCs) to the combination of manufacturer DC and IBP customer in DSOs.
Figure: Assignment of Retailer Locations to Manufacturer DC and IBP Customer
For this assignment you use a flat file that can be uploaded to SAP Demand Signal Management, version for SAP BW/4HANA. You therefore need to execute the following steps:
1. Define a process definition for the file upload of location assignments, consisting of two steps:
   1. Using the process chain /DDF/LOC_IBP_POS, upload the data into the DSO /DDF/DS23.
   2. Using the process chain /DDF/LOC_IBP_STAGE, upload the data into the DSO /DDF/DS33. During this transformation, the system checks the InfoObject Internal Source Location (/DDF/LOCATION) to evaluate the internal source delivery location and to enrich assignments depending on the source data before calling data harmonization.
2. Define a separate context, data origin, and data provider.
3. Create a data delivery agreement with the context, data origin, data provider, and the process definition you set up earlier. Choose the following attributes for your data delivery agreement:
   ○ Agreement type: Retailer Data
   ○ Data format: Configurable CSV
   ○ Data upload method: Automatic Upload with Folder Scanner
   Assign a data set type to the data delivery agreement and enter /DDF/LOCATION_IBP_ATTR_POS as the DataSource.
4. Assign a new file set to the data delivery agreement with the data set type you just assigned.
5. Define the mapping for your data delivery agreement for the following extractor fields:
   ○ Context
   ○ Source Location
   ○ Source Location Node Qualifier
   ○ Location Type
   ○ IBP Customer
   ○ Manufacturer Distribution Center
   Optionally, you can define mappings for the text fields of IBP Customer and Manufacturer Distribution Center.
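As a hypothetical illustration, such a location assignment file could be produced as a semicolon-separated CSV carrying the extractor fields; the column names and sample values below are assumptions, not the delivered field names:

```python
# Sketch of building a location assignment file; all field names and
# values are illustrative assumptions.
import csv
import io

fields = ["CONTEXT", "SOURCE_LOCATION", "SOURCE_LOCATION_NODE_QUALIFIER",
          "LOCATION_TYPE", "IBP_CUSTOMER", "MANUFACTURER_DC"]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=fields, delimiter=";")
writer.writeheader()
writer.writerow({"CONTEXT": "RETAILER_A", "SOURCE_LOCATION": "DC_NORTH",
                 "SOURCE_LOCATION_NODE_QUALIFIER": "", "LOCATION_TYPE": "DC",
                 "IBP_CUSTOMER": "RET_A", "MANUFACTURER_DC": "MDC01"})
content = buf.getvalue()
```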
6. Prepare a location assignment file for upload using the data delivery agreement you defined earlier. The file must provide the listed extractor fields. To do so, perform the following steps:
   1. Check which agreements for the automatic file upload of POS data come with retailer DC information, and assign the retailer DCs to IBP customers and manufacturer DCs. The system creates additional assignments for the delivered stores during the transformation.
   2. For those POS agreements that come without retailer DC information, check whether you can assign the single source locations directly to IBP customers and manufacturer DCs. You can do so by entering each source location in the file or by changing the harmonized locations resulting from this agreement later on in the UI Mass Change of Harmonized Locations. To make the search for those locations easier, assign a separate harmonization group to each of the agreements.
   3. If you have POS agreements for which these approaches cannot be used, or if you have sales or stock data that comes with units of measure that cannot be converted to the harmonized product's base unit of measure and to the material's base unit of measure, you can omit the entries for the source locations in the location assignment file.
   Note: The sales and stock data of these POS agreements is not transferred to IBP and therefore cannot contribute to the planned demand in IBP.
7. Run an automatic upload of the location assignment file, using the agreement you have just created.
8. Change the assignments of locations to manufacturer DC and IBP customer using the UI Mass Change of Harmonized Locations, if needed.
Result
All locations (/DDF/GLID) have entries for the attributes Manufacturer DC (/DDF/MANUF_DC) and IBP Customer (/DDF/IBPCUST).
10.3 Releasing Data for IBP
Use
POS data for a certain period can be delivered by a retailer several times. For example, at the end of a week the retailer sometimes provides a correction file for sales data that has already partly been uploaded. Therefore, before releasing POS data for planning, you have to check it and decide whether the data is complete and ready to be used in SAP Integrated Business Planning (IBP).
SAP Demand Signal Management, version for SAP BW/4HANA provides the two apps Release Data for IBP and Mass Release to IBP to release one or several combinations of the following objects for the transfer from SAP Demand Signal Management, version for SAP BW/4HANA to SAP Integrated Business Planning:
● Manufacturer DC
● IBP customer
● Period (week)
More Information
SAP Fiori-based documentation for
SAP Fiori-based documentation for
10.3.1 Release Data for IBP
Use
This application allows you to analyze, for each available combination of manufacturer DC and IBP customer, whether the data is complete and ready to be released to SAP Integrated Business Planning (IBP).
Worklist
The Worklist gives you an overview of the existing combinations of manufacturer DC and IBP customer, and of the periods that are not yet released.
You can sort the worklist by customer, manufacturer DC, and number of periods, and filter it by customer and manufacturer DC.
Periods
The Periods tab shows for the selected combination of manufacturer DC and IBP customer a list of all periods for which data exists for the selected key figure. As additional information, you see the relevant number of locations.
By default, the key figure Sales Quantity is analyzed. The following key figures are available:
● Sales quantity
● Stock quantity
● Promotional sales quantity
● DC withdrawals
● DC stock
SAP Demand Signal Management, version for SAP BW/4HANAIntegration with SAP Integrated Business Planning C O N F I D E N T I A L 341
For each period, the release status is shown. By default, the filter is set to periods with the status Pending. The following statuses exist:
Status | Meaning
Pending | The period has not been released yet.
Released | The period has been released and there is no new delivery for this period.
Outdated | The period was released, but a new delivery that affects this period has been uploaded since.
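The status derivation in the table above can be sketched as follows; the timestamp-based logic is an assumption for illustration, not the system's actual implementation:

```python
# Sketch: derive the release status of a period from an (assumed) release
# timestamp and the timestamp of the last delivery affecting the period.
from typing import Optional

def period_status(released_at: Optional[int], last_delivery_at: int) -> str:
    if released_at is None:
        return "Pending"        # never released
    if last_delivery_at > released_at:
        return "Outdated"       # a new upload affects this period after release
    return "Released"           # released and no newer delivery
```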
The number of periods taken into account depends on the number of weeks defined as relevant for IBP in Customizing. For more information, see the Customizing documentation under Demand Data Foundation > Integration with other SAP Components > Integrated Business Planning > General Settings.
You can select the periods and release them. You can release a period multiple times, for example, if a correction of sales data was provided and uploaded. You can navigate to the period history for each period, by clicking on the arrow at the end of each line.
To delete outdated data, use the Delete function in the Share menu at the bottom.
Week Details
To display the Week Details, click on the link for the relevant period. Depending on the selected key figure, you see a chart showing the number of locations per period. For daily deliveries you can see whether the data for the week is complete. Alternatively, you can display the data in a table format by clicking on Display as Table.
Statistics
On the Statistics tab you see a chart showing the complete history of the selected key figure for each available unit of measure. Using the drop-down menu, you can switch to a different key figure. An extreme decrease in the numbers could be an indicator of incomplete periods that cannot be released. Alternatively, you can display the data in table format.
Messages
On the Messages tab you see warning and error messages resulting from the checks that the system executes for the selected combination of manufacturer DC and IBP customer. Solve the described issue before releasing any period. For example, it is mandatory that each retailer source product and the corresponding company-internal material are mapped to the same harmonized product (for example, using the EAN/GTIN). The system also checks, for example, whether a base unit of measure is assigned to each source product and whether conversions are defined for the unit of measure.
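The product-mapping consistency check described above can be sketched as follows (an illustrative sketch, not SAP code; all names are hypothetical):

```python
# A retailer source product and the company-internal material must resolve to
# the same harmonized product (for example, keyed by EAN/GTIN). This sketch
# returns the pairs that violate that rule.
def check_product_mapping(source_to_harmonized: dict,
                          internal_to_harmonized: dict,
                          pairs: list) -> list:
    """Return (source_product, internal_material) pairs whose harmonized
    products differ; an empty list means the check passes."""
    errors = []
    for source, internal in pairs:
        if source_to_harmonized.get(source) != internal_to_harmonized.get(internal):
            errors.append((source, internal))
    return errors
```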
More Information
SAP Fiori-based documentation for
10.3.2 Mass Release to IBP
Use
This app allows you to release several combinations of manufacturer DC, IBP customer, and periods at once.
The Worklist shows you a table of all combinations for periods with the statuses Pending or Outdated.
You can filter the worklist by IBP customer, manufacturer DC, and IBP period by using the Filter function at the top. You can store your filter as a variant and define one variant as the standard filter that is used whenever you open the app.
You can select individual combinations or all data in the list and release the data. Data that is released disappears from the worklist.
Note
This application does not execute any checks and does not provide any details to support the decision whether the data should be released. It is intended as a fast release tool for situations in which you are sure that all the relevant data is complete.
More Information
SAP Fiori-based documentation for
10.4 Transferring Data to IBP
The data transfer from SAP Demand Signal Management, version for SAP BW/4HANA to SAP Integrated Business Planning consists of two steps:
1. Aggregating POS sales and stock data into the DSOs Sales for IBP (/DDF/DS91) and Stock for IBP (/DDF/DS92) in SAP Demand Signal Management, version for SAP BW/4HANA
2. Sending the data from SAP Demand Signal Management, version for SAP BW/4HANA to SAP Integrated Business Planning using SAP HANA Cloud Integration (SAP HCI)
Aggregating Data for IBP
In SAP Demand Signal Management, version for SAP BW/4HANA, schedule the process chain Load Transactional Data for IBP (/DDF/IBP) to run periodically, for example, daily. This process chain executes the following process chains:
● Load Sales Data for IBP (/DDF/SALES_IBP)
This process chain extracts sales data from the sales propagation DSO (/DDF/DS11) using a DataSource. Only released data is taken into account. During the transformation into the target DSO Sales for IBP (/DDF/DS91) the system considers only the types of sales data that are marked as relevant for IBP in Customizing. Daily sales data is aggregated to weeks using the definition of the first day of the week from Customizing. Weekly data is copied 1:1 to the corresponding week regardless of the start day of the week.
● Load Stock Data for IBP (/DDF/STOCK_IBP)
This process chain extracts stock data from the stock snapshot propagation DSO (/DDF/DS11) using a DataSource. Only released data is taken into account. During the transformation into the target DSO Stock for IBP (/DDF/DS92) the system considers only stock types that are marked as relevant for IBP in Customizing.
Data of internal source locations is aggregated to the assigned combination of manufacturer DC and IBP customer. Sales and stock data is now prepared and ready to be sent to IBP.
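The daily-to-weekly aggregation rule described above can be sketched as follows (an illustrative sketch, not SAP code; the first day of the week stands in for the value defined in Customizing):

```python
# Daily sales quantities are summed into weeks whose boundary is the
# Customizing-defined first day of the week; weekly records would be copied
# unchanged and are not shown here.
from collections import defaultdict
from datetime import date, timedelta

def week_start(day: date, first_weekday: int) -> date:
    """First day of the week containing `day` (first_weekday: 0=Monday ... 6=Sunday)."""
    return day - timedelta(days=(day.weekday() - first_weekday) % 7)

def aggregate_daily_to_weeks(daily: dict, first_weekday: int) -> dict:
    """Sum a {date: quantity} mapping into a {week_start_date: quantity} mapping."""
    weekly = defaultdict(float)
    for day, qty in daily.items():
        weekly[week_start(day, first_weekday)] += qty
    return dict(weekly)
```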
Sending Data to IBP
In SAP HANA Cloud Integration (SAP HCI), schedule the task for transferring POS data from SAP Demand Signal Management, version for SAP BW/4HANA to SAP Integrated Business Planning. The periodicity of the run should be aligned with the process chain Load Transactional Data for IBP (/DDF/IBP), but with some lag behind it, so that the chain finishes before the task in SAP HCI starts. For more information about scheduling in SAP HCI, see the SAP HANA Cloud Integration Guide for SAP Integrated Business Planning on the SAP Help Portal at http://help.sap.com/ibp60.
Whenever new manufacturer DCs are uploaded to IBP or new IBP customers are planned, revise the location assignment file and upload it again.
11 ATMA Interface
There are several possible sources of data that you can prepare and load to DMF. With the DSiM solution, the data directly populates the ATMA Interface Objects. In case of flat-file loads or internal customer data loads, you must maintain the Sales History Acquisition DSO as a central object beforehand.
This section describes the following:
● Load Data from DSiM to DMF Staging
● Load Data from Flat Files to DMF Staging
11.1 Configuring the ATMA Interface
Execute the following steps after installing the ATMA Interface in your BW system.
11.1.1 Installing SAP BW Content
Access the transaction using the following navigation path: Transaction RSA1 -> Modeling -> InfoObjects. On the Data Warehousing Workbench: Modeling screen, check whether the InfoObjects of the standard content are active:
Info Object Description
0CALDAY Calendar Day
0CALMONTH Calendar Year/Month
0CALMONTH2 Calendar month
0CALQUART1 Quarter
0CALQUARTER Calendar Year/Quarter
0CALWEEK Calendar Year/Week
0CALYEAR Calendar Year
0CURRENCY Currency Key
0DATE Date
0DATEFROM Valid from
0DATETO Valid to
0DATE_ZONE Time zone
0FACTCAL_ID Factory Calendar ID
0FISCPER Fiscal year / period
0FISCPER3 Posting period
0FISCVARNT Fiscal year variant
0FISCYEAR Fiscal year
0HALFYEAR1 Half year
0LANGU Language key
0RECORDMODE BW Delta Process: Update Mode
0REQUID Request ID
0TCTTIMSTMP UTC Time Stamp
Info Object Description
0UNIT Unit of Measure
0WEEKDAY1 Weekday
0NUMDAY Number of Days
0NUMWDAY Number of Workdays
0TCAACTVT Activity in Analysis Authorizations
If the InfoObjects mentioned above are not active yet, activate them from the Business Content. On the Data Warehousing Workbench: Modeling screen, choose BI Content on the left side.
● Use the Find Objects button to search for the required InfoObjects.
● Right-click on the InfoObject and choose Insert Objects for Collection. Repeat this for all required InfoObjects.
● Choose the Install button to install the missing BI Content.
11.1.2 Installing the ATMA Interface Content
Access the transaction using the following navigation path: Transaction RSA1 -> Modeling -> BI Content. Make sure to activate all related objects for the ATMA Interface. The following objects are mandatory:
ADSOs
Object Type Technical Name
ADSO /DDF/TMALOHNA
ADSO /DDF/TMALOHHT
ADSO /DDF/TMALOHHD
ADSO /DDF/TMALOHND
ADSO /DDF/TMALOHNT
ADSO /DDF/TMALOASS
ADSO /DDF/TMALOTXT
ADSO /DDF/TMALOATT
ADSO /DDF/TMAPRHHT
ADSO /DDF/TMAPRHHD
ADSO /DDF/TMAPRHND
ADSO /DDF/TMAPRHNT
ADSO /DDF/TMAPRTXT
ADSO /DDF/TMAPRASS
ADSO /DDF/TMAPRATT
ADSO /DDF/TMAPRUOM
ADSO /DDF/TMAPLHDR
ADSO /DDF/TMAPLMCP
ADSO /DDF/TMAPLMSP
ADSO /DDF/TMAPLPRD
ADSO /DDF/TMASHDTA
ADSO /DDF/TMASHCON
ADSO /DDF/TMASHTCT
ADSO /DDF/TMASHACL
ADSO /DDF/TMASHUDI
Data Sources
Object Type Technical Name
Data Source /DDF/TMA_LOCATION
Data Source /DDF/TMA_LOCATION_HIER_ASS
Data Source /DDF/TMA_LOCATION_HIER_NODE
Data Source /DDF/TMA_PRODUCT
Data Source /DDF/TMA_PRODUCT_HIER_ASS
Data Source /DDF/TMA_PRODUCT_HIER_NODE
Data Source /DDF/TMA_PRODUCT_UOM
Data Source /DDF/TMA_PRODUCT_LOC_COGS
Data Source /DDF/TMA_PRODUCT_LOC_SALES
Data Source /DDF/TMA_PRODUCT_LOC_PRICE
Data Source /DDF/TMA_SALES_HISTORY
Info Sources
Object Type Technical Name
Info Source /DDF/TMASHDTA
Info Source /DDF/TMASHCON
Info Source /DDF/TMASHTCT
Info Source /DDF/TMASHACL
Info Source /DDF/TMASHUDIF
Info Source /DDF/TMALOHHT
Info Source /DDF/TMALOHHD
Info Source /DDF/TMALOHND
Info Source /DDF/TMALOHNT
Info Source /DDF/TMALOTXT
Info Source /DDF/TMALOATT
Info Source /DDF/TMAPRTXT
Info Source /DDF/TMAPRASS
Info Source /DDF/TMAPRATT
Info Source /DDF/TMAPRUOM
Info Source /DDF/TMAPLHDR
Info Source /DDF/TMAPLMCP
Info Source /DDF/TMAPLMSP
Info Source /DDF/TMAPLPRD
Info Source /DDF/TMAPRHHT
Info Source /DDF/TMAPRHHD
Info Source /DDF/TMAPRHND
Info Source /DDF/TMAPRHNT
Info Source /DDF/TMALOASS
Info Source /DDF/TMASHACL_A
Info Source /DDF/TMASHDTA_A
Info Source /DDF/DS14_A
Info Source /DDF/TMA_SALES_HISTORY_A
Info Source /DDF/TMALOHHD_A
Info Source /DDF/TMALOASS_A
Info Source /DDF/TMALOHND_A
Info Source /DDF/TMALOATT_A
Info Source /DDF/TMAPLHDR_A
Info Source /DDF/TMAPRATT_A
Info Source /DDF/TMAPRHHD_A
Info Source /DDF/TMAPRASS_A
Info Source /DDF/TMAPRHND_A
Info Objects
Object Type Technical Name
Characteristic /DDF/TMAPRBR
Characteristic /DDF/TMAPRDG
Characteristic /DDF/DDAGR
Characteristic /DDF/TMAPLWP
Characteristic /DDF/TMADIFID
Characteristic /DDF/TMADIFAT
Characteristic /DDF/TMALDCH
Characteristic /DDF/TMALODC
Characteristic /DDF/TMAGTIN
Characteristic /DDF/TMAPLFN
Characteristic /DDF/TMAPLFW
Characteristic /DDF/TMAFUCD
Characteristic /DDF/TMAGNID
Characteristic /DDF/TMAGRAN
Characteristic /DDF/TMATXTM
Characteristic /DDF/TMAISOU
Characteristic /DDF/TMAISOL
Characteristic /DDF/TMAISOV
Characteristic /DDF/TMAISOW
Characteristic /DDF/TMAPRHL
Characteristic /DDF/TMALID
Characteristic /DDF/TMALOCD
Characteristic /DDF/TMATXTL
Characteristic /DDF/TMALOHI
Characteristic /DDF/TMALOHN
Characteristic /DDF/TMALOHT
Characteristic /DDF/TMALOID
Characteristic /DDF/TMALOOD
Characteristic /DDF/TMALOTY
Characteristic /DDF/TIMEREF
Characteristic /DDF/TMAORCH
Characteristic /DDF/TMALONP
Characteristic /DDF/TMAPRNP
Characteristic /DDF/TMAPRHI
Characteristic /DDF/TMAPRHN
Characteristic /DDF/TMAPHNT
Characteristic /DDF/TMAPRHT
Characteristic /DDF/TMAPRID
Characteristic /DDF/TMAPUOM
Characteristic /DDF/TMALOPO
Characteristic /DDF/TMALOSO
Characteristic /DDF/TMATACT
Characteristic /DDF/TMATATP
Characteristic /DDF/TMATCID
Characteristic /DDF/TMALUOM
Characteristic /DDF/TMAVUOM
Characteristic /DDF/TMAWUOM
Characteristic /DDF/TMATIFR
Characteristic /DDF/TMATITO
Characteristic /DDF/TMAVALF
Characteristic /DDF/TMAVALT
Characteristic /DDF/TMAACTV
Characteristic /DDF/TMATREF
Key Figure /DDF/TMA_DAF
Key Figure /DDF/TMA_DWF
Key Figure /DDF/TMA_FWD
Key Figure /DDF/TMA_PRO
Key Figure /DDF/TMA_TPR
Key Figure /DDF/TMAACV
Key Figure /DDF/TMACOGS
Key Figure /DDF/TMAUDNM
Key Figure /DDF/TMAACVT
Key Figure /DDF/TMAPPUG
Key Figure /DDF/TMAGSAM
Key Figure /DDF/TMAGSUN
Key Figure /DDF/TMAPHGT
Key Figure /DDF/TMAPLEN
Key Figure /DDF/TMAMSPR
Key Figure /DDF/TMAPPUN
Key Figure /DDF/TMANSAM
Key Figure /DDF/TMANSUN
Key Figure /DDF/TMATACD
Key Figure /DDF/TMAUNUM
Key Figure /DDF/TMAPGWT
Key Figure /DDF/TMAPVOL
Key Figure /DDF/TMARGPU
Key Figure /DDF/TMARNPU
Key Figure /DDF/TMAUNSA
Key Figure /DDF/TMAVATF
Key Figure /DDF/TMAPWDH
Application Component
Object Type Technical Name
Application Component /DDF/TMAPRBR
11.1.3 Authorization
To upload data using the ATMA Interface, special authorization is necessary. To run the report /DDF/TPO_START_UPLOAD, the following authorization object needs to be assigned:
Object Field Value

DDF_ATMA ACTVT 16
11.1.4 Customizing
The following Customizing needs to be done after you have installed the ATMA Interface in your BW system.
11.1.4.1 Maintaining Load Types
A load type is a set of customizable steps that are executed by the interface. The load type has no function other than to group steps together; therefore, it has only a technical name and a description.
Access the transaction using transaction code SM30:
On the Maintain Table Views: Initial screen, insert the following entry:
Field Name Entry Value
Title /DDF/TPO_LOTYP
On the View TPO – Load Type Overview screen, insert the following entries:

Field Name Entry Value

Load Type An alphanumeric load type (for example, ALL)

Description A load type description (for example, Load All)

For example, you can use the following entries:
Load Type Description
ALL Load All
LOCM Location Master
LOHI Loc Hierarchy
PRHI Prod Hierarchy
PRLO Product/ Location
PRMA Product Master
SAHI Sales History
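The relationship between load types and load steps (maintained in the next section) can be sketched as follows (an illustrative sketch, not SAP code): a load type only groups steps, and steps sharing a load type run in ascending step-number order.

```python
# A load type is just a grouping key; executing it means running its load
# steps in ascending step-number order, as the Customizing tables describe.
def steps_for_load_type(load_steps: list, load_type: str) -> list:
    """Return the step records of one load type, ordered by step number."""
    return sorted((s for s in load_steps if s["load_type"] == load_type),
                  key=lambda s: s["step"])
```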
11.1.4.2 Maintaining Load Steps
The load steps define the transfer of an object from the corresponding DSiM DSOs to the RFC function module (FM) on the DMF server.
Access the transaction using transaction code SM30:
On the Maintain Table Views initial screen, insert the following entry:
Field Name Entry Value
Title /DDF/TPO_LOSTP
On the TPO – Load Steps Overview screen, insert the following entries:

Field Name Entry Value

Load Type Technical key for the load type

Load Step The step number. Steps with the same load type are executed in ascending order.

Class/Interface The class that implements the interface /TMA/TPO_IF_RFC_CALLER in order to send the data to the RFC FM defined in the same Customizing record. The currently shipped classes are:

TPO – RFC Location Hierarchy /TMA/TPO_RFC_LOCHIERARCHY
TPO – RFC Location Master /TMA/TPO_RFC_LOCMASTER
TPO – RFC Product Hierarchy /TMA/TPO_RFC_PRODHIERARCHY
TPO – RFC Product/Location /TMA/TPO_RFC_PRODLOCMASTER
TPO – RFC Product Master /TMA/TPO_RFC_PRODMASTER
TPO – RFC Sales History /TMA/TPO_RFC_SALESHISTORY
RFC-Mode The mode in which the RFC FM is called. All shipped classes allow only synchronous calls.
RFC-Destination The RFC connection name for the TPO server as configured in the transaction SM59.
Abs. Val. An optional parameter of the Sales History interface. This parameter is handed over to the RFC FM on the TPO server.

Logical system The logical system name of the target DMF system.

Sales Data An optional parameter of the Sales History interface. This parameter is handed over to the RFC FM on the TPO server.

Yes/No An optional parameter for all interfaces. This parameter is handed over to the RFC FM on the TPO server.
As examples, you can use the following entries with your RFC destination and logical system (you can verify them in the SAP DMF system using transaction SM30 in view /DMF/LOG_SYS):
See Appendix: Load Steps
11.1.5 Loading with DSiM
To load data from DSiM to DMF, you must use DSiM Data Providers and DSiM Objects as the data sources.
You can find further information on the SAP Demand Signal Management, version for SAP BW/4HANA solution on the SAP Help Portal at: https://help.sap.com/dsimbw4h.
11.1.5.1 Configuring DSiM Load
11.1.5.1.1 Global Parameters
Access the transaction using transaction code SM30:
On the Maintain Table Views initial screen, insert the following entry and choose Maintain.
Field Name Entry Value
Table/View /DDF/M_GLOBAL
In the Source for Product/Location Keys and Demand Group Creation view, create the following entries and choose Save.
Field Name Entry Value
Use harmonized data? Choose Y (Yes) if you want to use harmonized data. Choose N (No) to use source master data.
Product Attributes
Demand Group Insert the characteristic from which the demand group is derived (for example, BRAND). From this characteristic, all demand groups are derived. You can use a newly created InfoObject (for example, ZDEMANDGR) to fill the characteristic from which the demand group is derived for each product during harmonization (see also OP1 chapter “3.2 Harmonization of Product Master Data”).
Location Attributes
Factory Calendar ID The characteristic that contains the factory calendar ID (for example, 0FACTCAL_ID).

Time Zone The characteristic that contains the time zone (for example, 0DATE_ZONE).

Sales Organization The characteristic that contains the sales organization (for example, /DDF/TPOLOSO).

Purchase Organization The characteristic that contains the purchase organization (for example, /DDF/TPOLOPO).

Distribution Channel The characteristic that contains the distribution channel (for example, /DDF/TPOLODC).

Distribution Chain The characteristic that contains the distribution chain (for example, /DDF/TPOLDCH).

Location Open Date The characteristic that contains the location open date (for example, /DDF/TPOLOOD).

Location Close Date The characteristic that contains the location close date (for example, /DDF/TPOLOCD).

Location Currency The characteristic that contains the location currency (for example, 0CURRENCY).
11.1.5.2 Sales History
11.1.5.2.1 Creation of Transformation
Access the transaction using transaction code RSA1.
On the Data Warehousing Workbench: Modeling screen, select InfoProvider and go to Sales History (/DDF/TMA_IF_SALES) using the following navigation path: Demand Data Foundation -> ATMA Interface -> ATMA Interface - Sales History.
Navigate to InfoSource Sales History Acquisition (/DDF/TMASHACL_A) in the data flow of the Sales History Propagation DSO (/DDF/TMASHDTA). Data flow downwards:
● Sales History Propagation (TRCS /DDF/TMASHDTA) -> Sales History Data Propagation (ADSO /DDF/TMASHDTA)
● Sales History Acquisition (TRCS /DDF/TMASHACL_A) -> TRCS /DDF/TMASHDTA
Right-click on InfoSource Sales History Acquisition (/DDF/TMASHACL_A), select Create Transformation, and insert the following entries:
Field Name Entry Value
Object Type ODSO DataStore Object
Name Name of your Market Research DSO (for example, /DDF/DS14)
Choose the Create Transformation button.
Create the transformation to fill the InfoObjects of InfoSource Sales History Acquisition (/DDF/TMASHACL_A) from your Market Research DSO. If you want to use the Market Research Retail Panel Propagation DSO (/DDF/DS14), you can use the following field mapping (source InfoObject -> logic -> target InfoObject in Sales History Data Extractor (/DDF/TMASHDTA_EXTR)):

● /DDF/LOAD_ID (Process ID) -> direct assignment -> /DDF/LOADID (Load ID)
● /DDF/LOCATION (Internal Source Location) -> direct assignment -> /DDF/TMALOID (Location Identifier)
● Constant = 1040 -> /DDF/TMALOTY (Location Type Code)
● /DDF/PRODUCT (Internal Source Product) -> direct assignment -> /DDF/TMAPRID (Product ID)
● ROUTINE -> 0TCTTIMSTMP (UTC Time Stamp)
● Constant = W -> /DDF/TMAGRAN (Granularity Type for Time Series Data)
● /DDF/MKCSQUA (Market Research: Sales Qty (Units)) -> direct assignment -> /DDF/TMAUNSA (Unit Sales)
● Constant = PC -> 0UNIT (Unit of Measure)
● /DDF/MKCSVAL (Market Research: Sales (Value)) -> direct assignment -> /DDF/TMAGSAM (Gross Sales Amount)
● /DDF/MKCSVAL (Market Research: Sales (Value)) -> direct assignment -> /DDF/TMANSAM (Net Sales Amount)
● /DDF/MKCSQUA and /DDF/MKCSVAL -> = /1DD/S_MKCSVAL / /1DD/S_MKCSQUA -> /DDF/TMAGSUN (Gross Sales per Unit)
● /DDF/MKCSQUA and /DDF/MKCSVAL -> = /1DD/S_MKCSVAL / /1DD/S_MKCSQUA -> /DDF/TMANSUN (Net Sales per Unit)
● /DDF/MKCSQUA and /DDF/MKCSVAL -> = /1DD/S_MKCSVAL / /1DD/S_MKCSQUA -> /DDF/TMARGPU (Regular Gross Price of a Product)
● /DDF/MKCSQUA and /DDF/MKCSVAL -> = /1DD/S_MKCSVAL / /1DD/S_MKCSQUA -> /DDF/TMARNPU (Regular Net Price of a Product)
● 0LOC_CURRCY (Local currency) -> direct assignment -> /SAPMI/0CURRENCY (Currency Key)
● /DDF/MKCDWEI (Market Research Weighted Distribution) -> = /1DD/S_MKCDWEI / 100 -> /DDF/TMAACV (All Commodity Volume)
● /DDF/TMATCID (Tactics ID)
● /DDF/MKCSQUA and /DDF/MKCSVAL -> = /1DD/S_MKCSVAL / /1DD/S_MKCSQUA -> /DDF/TMAPPUG (Gross Promotion Price of a Product per Unit)
● /DDF/MKCSQUA and /DDF/MKCSVAL -> = /1DD/S_MKCSVAL / /1DD/S_MKCSQUA -> /DDF/TMAPPUN (Net Promotion Price of a Product per Unit)
● /DDF/TIMEREF (Market Research Time String (external)) -> direct assignment -> /SAPMI/TIMEREF (Market Research Time Reference)
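The ratio logic used in several rows of the mapping above (= /1DD/S_MKCSVAL / /1DD/S_MKCSQUA, and = /1DD/S_MKCDWEI / 100) can be sketched as follows (an illustrative sketch, not SAP code; the zero-quantity guard is an assumption, since the actual transformation behavior for zero quantities is not specified here):

```python
# Per-unit key figures divide the sales value by the sales quantity; the
# weighted distribution is scaled from percent to a fraction.
def per_unit(sales_value: float, sales_qty: float) -> float:
    """Sales value per unit; returns 0 when no quantity was sold (assumption)."""
    return sales_value / sales_qty if sales_qty else 0.0

def acv_fraction(weighted_distribution_percent: float) -> float:
    """All Commodity Volume as a fraction of 1 (= value / 100)."""
    return weighted_distribution_percent / 100.0
```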
Example of a routine to fill InfoObject 0TCTTIMSTMP:

Code Syntax
METHOD compute_0TCTTIMSTMP.
*   IMPORTING
*     request type rsrequest
*     datapackid type rsdatapid
*     SOURCE_FIELDS-/1DD/TIMEREF__DATEFROM TYPE /BI0/OIDATEFROM
*   EXPORTING
*     RESULT type _ty_s_TG_1-TCTTIMSTMP
    DATA: MONITOR_REC TYPE rsmonitor.
*$*$ begin of routine - insert your code only below this line        *-*
*   ... "insert your code here
*-- fill table "MONITOR" with values of structure "MONITOR_REC"
*-  to make monitor entries
*   ... "to cancel the update process
*   raise exception type CX_RSROUT_ABORT.
*   ... "to skip a record
*   raise exception type CX_RSROUT_SKIP_RECORD.
*   ... "to clear target fields
*   raise exception type CX_RSROUT_SKIP_VAL.
    CONCATENATE SOURCE_FIELDS-/1dd/timeref__datefrom '000000' INTO RESULT.
*$*$ end of routine - insert your code only before this line         *-*
  ENDMETHOD.                    "compute_0TCTTIMSTMP
Caution
If you want to use another source DSO, you can use the mapping above as a guide.
11.1.5.2.2 Creation of Data Transfer Process
Access the transaction with transaction code RSA1:
● On the Data Warehousing Workbench: Modeling screen, select InfoProvider.
● Navigate to InfoArea ATMA Interface – Sales History (/DDF/TPO_IF_SALES) using the following navigation path: Demand Data Foundation -> ATMA Interface -> Sales History.
● Navigate to InfoSource Sales History Data Extractor (/DDF/TMA_IF_SALES) in the data flow of the Sales History Propagation DSO (/DDF/TPOSHDTA). Data flow downwards: Sales History Propagation (TRCS /DDF/TMASHDTA) -> Sales History Data Propagation (ADSO /DDF/TMASHDTA); Sales History Acquisition (TRCS /DDF/TMASHACL_A) -> TRCS /DDF/TMASHDTA.
● Right-click on InfoSource Sales History Acquisition (/DDF/TMASHACL_A), select Create Data Transfer Process, and insert the following entries:
Field Name Entry Value
Data Transfer Proc. A DTP name (for example, /DDF/DS14 -> /DDF/TMASHCON)
DTP Type Standard (Can Be Scheduled)
Object Type ODSO DataStore Object
Name Name of your Market Research DSO (for example, /DDF/DS14)
● Choose the Continue (Enter) button.
Select the extraction mode F Full and create the following filters:
Field Name Entry Value
Demand Group ! = null
Distribution Chain ! = null
Process ID Filter Routine
Example of a filter routine that selects only the current Process ID:

Code Syntax
form c_/1DD/S_LOAD_ID
  tables   l_t_range structure rssdlrange
  using    i_r_request type ref to IF_RSBK_REQUEST_ADMINTAB_VIEW
           i_fieldnm type RSFIELDNM
  changing p_subrc like sy-subrc.
* Insert source code to current selection field
*$*$ begin of routine - insert your code only below this line        *-*
  call method /ddf/cl_bw_dtp_filter=>filter_by_known_process_id
    exporting
      ir_request = i_r_request
      i_fieldnm  = i_fieldnm
    importing
      e_subrc    = p_subrc
    changing
      ct_range   = l_t_range[].
*$*$ end of routine - insert your code only before this line         *-*
endform.
11.1.5.3 Tactic Consolidation
11.1.5.3.1 Creation of Transformation
● Access the transaction with transaction code RSA1.
● On the Data Warehousing Workbench: Modeling screen, select InfoProvider.
● Navigate to InfoArea ATMA Interface – Sales History (/DDF/TMA_IF_SALES) using the following navigation path: Demand Data Foundation -> ATMA Interface -> ATMA Interface - Sales History.
● Navigate to InfoSource Sales History Acquisition (/DDF/TMASHACL_A) in the data flow of the Tactic Consolidation DSO (/DDF/TMASHCON). Data flow downwards: Tactic Consolidation (TRCS /DDF/TMASHCON) -> Tactic Consolidation (ADSO /DDF/TMASHCON); Sales History Acquisition (TRCS /DDF/TMASHACL_A) -> TRCS /DDF/TMASHCON.
● Right-click on InfoSource Sales History Acquisition (/DDF/TMASHACL_A) and choose Create Transformation.
● Insert the following entries:
Field Name Entry Value
Object Type ODSO DataStore Object
Name Name of your Market Research DSO (for example, /DDF/DS14)
● Choose the Create Transformation button.
● Create the transformation to fill the InfoObjects of DSO Tactic Consolidation (/DDF/TMASHCON) from your Market Research DSO or other sources with information about tactics.
11.1.5.3.2 Creation of Data Transfer Process
● Access the transaction with transaction code RSA1.
● On the Data Warehousing Workbench: Modeling screen, select InfoProvider.
● Navigate to InfoArea ATMA Interface – Sales History (/DDF/TMA_IF_SALES) using the following navigation path: Demand Data Foundation -> ATMA Interface -> ATMA Interface - Sales History.
● Navigate to InfoSource Sales History Acquisition (/DDF/TMASHACL_A) in the data flow of the Tactic Consolidation DSO (/DDF/TMASHCON). Data flow downwards: Tactic Consolidation (TRCS /DDF/TMASHCON) -> Tactic Consolidation (ADSO /DDF/TMASHCON); Sales History Acquisition (TRCS /DDF/TMASHACL_A) -> TRCS /DDF/TMASHCON.
● Right-click on DSO Tactic Consolidation (/DDF/TMASHCON), select Create Data Transfer Process, and insert the following entries:
Field Name Entry Value
Object Type ODSO DataStore Object
Name Name of your Market Research DSO (for example, /DDF/DS14)
● Choose the Continue button.
● Select the extraction mode F Full and create the following filters:
Field Name Entry Value
Demand Group ! = null
Distribution Chain ! = null
Process ID Filter Routine
Example of a filter routine that selects only the current Process ID:

Code Syntax
form c_/1DD/S_LOAD_ID
  tables   l_t_range structure rssdlrange
  using    i_r_request type ref to IF_RSBK_REQUEST_ADMINTAB_VIEW
           i_fieldnm type RSFIELDNM
  changing p_subrc like sy-subrc.
* Insert source code to current selection field
*$*$ begin of routine - insert your code only below this line        *-*
  call method /ddf/cl_bw_dtp_filter=>filter_by_known_process_id
    exporting
      ir_request = i_r_request
      i_fieldnm  = i_fieldnm
    importing
      e_subrc    = p_subrc
    changing
      ct_range   = l_t_range[].
*$*$ end of routine - insert your code only before this line         *-*
endform.
11.1.5.4 Maintaining Tactic and Tactic Type Information in Customizing
● Access the transaction with transaction code SM30.
● Enter /DDF/C_KEYTACT in the Table/View field and choose Maintain.
● Open the SAP DMF system to look up the defined tactics and tactic types.
Enter the mappings of the ACV key figures to the DMF tactics and tactic types, either for a specific delivery agreement or as a default with an empty delivery agreement.
11.1.5.5 Process Chain
11.1.5.5.1 Creation of Process Chain
● Access the transaction using transaction code RSPC.
● On the Data Warehousing Workbench: Modeling screen, navigate to the display component ATMA Interface -> DSiM Load.
● Right-click on the display component DSiM Load and select Create Process Chain. Insert the following entries:
Field Name Entry Value
Process Chain A process chain name (for example, Z_DSiM_CONNECT)
Long description A process chain description (for example, Process chain to connect DSiM DSOs to Interface)
● On the next screen, Insert Start Process, choose the Create button and insert the following entries:
Field Name Entry Value
Process Variants A variant name (for example, Z_DSiM_CONNECT_START)

Long description A variant description (for example, Start variant to connect DSiM DSOs to Interface)
● Select Start Using Meta Chain or API from Scheduling Options.
● Choose Save and then the Back button.
● On the Insert Start Process screen, the newly created start variant is selected. Press the Continue (Enter) button.
● Open the Load Process and Postprocessing area on the Process Chain Maintenance Modified Version screen and drag and drop Data Transfer Process into the process chain.
● Insert the data transfer process that was created to fill the Tactic Consolidation DSO as described in chapter Creation of Data Transfer Process [page 363].
● Insert the data transfer process that was created to fill the Sales History DSO as described in chapter Creation of Data Transfer Process [page 363].
● Choose Save and activate the process chain.
11.1.5.5.2 TPO Meta Chain
● Access the transaction using transaction code RSPC.
● On the Data Warehousing Workbench: Modeling screen, navigate to Display Component ATMA Interface -> DSiM Load.
● Open ATMA Interface – Master Data Process Chain (/DDF/TMA_IF_MAIN) by double-clicking it.
● Choose the Display/Change button or press CTRL+F9 to switch to edit mode.
● Open the General Services area on the Process Chain Maintenance Modified Version screen and drag and drop Local Process Chain into the master process chain.
● Insert the process chain that you created previously as described in the preceding chapter (for example, Z_DSiM_CONNECT).
● Connect the start variant to your newly inserted process chain and connect your process chain to the next object. Your process chain should be executed before all other steps.
11.1.5.6 SAP DSiM Process Flow Control (PFC)
If you want to execute the TPO upload every time you load market research data, you can insert the process chain into your process definition for the SAP DSiM data upload. The transfer of DSiM data to SAP TPO is then executed automatically within your upload.
● Access the transaction using transaction code SPRO.
● Choose the SAP Reference IMG button and access the activity using the following navigation path: SAP Customizing Implementation Guide -> Cross-Application Components -> Demand Data Foundation -> Data Upload -> Define Processes and Steps.
On the Process Definition screen, modify your process to load market research data. Select your process definition and double-click Step Definition in the dialog structure.
● Choose the New Entries button to insert your process chain as the last step, and make the following entries:
Field Name Entry Value
Step Number A step number higher than all others (for example, 300)
Description A description (for example, data transfer to SAP TPO)
Parallelization Group A parallelization group if you want to execute this step in parallel with another step.
Reference Key A reference key
Process Chain ID The process chain ID of the Master Process Chain (for example, /DDF/TPO_IF_MAIN)
Upload Stage An upload stage
366 C O N F I D E N T I A LSAP Demand Signal Management, version for SAP BW/4HANA
ATMA Interface
Field Name Entry Value
Activate Step Select the checkbox if you want to activate the step.
● Choose Save.
11.1.5.7 Start Load
The transfer program deletes all zero-sales entries and all entries with negative unit sales, and cuts sales so that each product has the same sales end date. It then sends the sales data from SAP Demand Signal Management, version for SAP BW/4HANA to SAP DMF.
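The cleansing rules above can be sketched in a few lines; this is a hypothetical Python illustration of the logic, not the actual ABAP implementation of the transfer program.

```python
from datetime import date

# Hypothetical sales records: (product, sales_date, unit_sales)
sales = [
    ("CBP_P_002", date(2018, 1, 1), 135),
    ("CBP_P_002", date(2018, 1, 8), 0),     # zero sales -> dropped
    ("CBP_P_002", date(2018, 1, 15), 122),
    ("CBP_P_003", date(2018, 1, 1), -5),    # negative unit sales -> dropped
    ("CBP_P_003", date(2018, 1, 8), 80),
]

# Rules 1 and 2: delete zero-sales entries and entries with negative unit sales.
cleaned = [r for r in sales if r[2] > 0]

# Rule 3: cut sales so that each product has the same sales end date,
# i.e. truncate every history at the earliest per-product end date.
last_per_product = {}
for product, d, _ in cleaned:
    last_per_product[product] = max(last_per_product.get(product, d), d)
common_end = min(last_per_product.values())
cleaned = [r for r in cleaned if r[1] <= common_end]
```

In this sample, the common end date becomes the earliest last-sales date across products, so later entries of other products are cut off as well.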
11.1.5.7.1 Schedule Process Chain
● To schedule process chains, use transaction code RSPC.
● On the Data Warehousing Workbench: Modeling screen, navigate to the display component DSiM Load – /DDF/TMA_DSIM and select the process chain ATMA Interface – Master Data Process Chain (/DDF/TMA_IF_MAIN).
● Double-click to open the process chain.
● Choose Schedule (F8).
Note
You can either schedule the process chain immediately or set up an SAP DSiM process chain containing the main process chain. Then you can see the status of the upload in your SAP DSiM PFC and schedule the upload every time you get new data.
11.1.5.7.2 Execute Transfer Program
Use transaction SE38 to run the transfer program.
On the initial screen of the ABAP Editor, enter /DDF/TPO_START_UPLOAD in the Program field and press F8 to execute.
On the Program /DDF/TPO_START_UPLOAD screen, make the following entries:
Field Name Entry Value
LoadID The LoadID of your data delivery.
Loadtype A LoadType (for example, ALL)
Choose Execute ( F8 ).
To run this report, special authorizations are needed:
Object Field Value
DDF_ATMA ACTVT 16
11.1.6 Load Flat File
This chapter describes how to create the necessary flat files and load them to BW.
11.1.6.1 Flat File Load Configuration
To load the data from a flat file, you must execute several steps in a specified sequence. From a business process perspective, you start the load with the sales data. This generates the current load ID and bundles the data that will be loaded to DMF.
The load ID is stored and assigned to your DDAGR in the table /DDF/TMA_LOADID and flagged as the current load.
This load ID acts as a filter for all other load processes within the ATMA interface to make sure that no data is mixed. Technically, it is the filter in every DTP within the interface.
If you need to reload another load ID or different load IDs, you can maintain the load ID and the current load flag in the table /DDF/TMA_LOADID using transaction SM30.
11.1.6.1.1 Load ID Table
The Load ID table (/DDF/TMA_LOADID) has the following structure:

Field | Reason | Example 1 | Example 2
DDAGR | Determines the data delivery agreement. | Nielsen | IRI
Load ID | Set by the system, but can be changed manually in case of reloads. Bundles the dataset. | 100 | 101
Active | Set by the system, but can be changed manually in case of reloads. Sets the DTP filters. | X |
The table controls and regulates different datasets. The main objects are the data delivery agreement (DDAGR), the load ID, and the current load.
● The DDAGR identifies the data source, for example, IRI, Nielsen, or internal data. There are no naming restrictions for the DDAGR. It is used at several different places of the data upload.
● The load ID is the separator between datasets. With the load ID, several datasets can be stored in BW. A load ID can be used only once, regardless of the DDAGR.
● The Active field identifies the correct load ID. The marked load ID is processed in the system.
For every load, the table must be filled, either automatically by the system, or manually.
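The control logic of this table can be illustrated with a small Python sketch (hypothetical data; in the system, the filter is applied in the DTPs via /DDF/CL_TRANSFORMATION=>GET_LOAD_ID):

```python
# Hypothetical contents of /DDF/TMA_LOADID: one active load ID per DDAGR.
loadid_table = [
    {"ddagr": "NIELSEN", "load_id": 100, "active": False},
    {"ddagr": "NIELSEN", "load_id": 101, "active": True},
    {"ddagr": "IRI",     "load_id": 102, "active": True},
]

def get_load_id(ddagr: str) -> int:
    """Return the load ID flagged as the current load for a DDAGR."""
    for row in loadid_table:
        if row["ddagr"] == ddagr and row["active"]:
            return row["load_id"]
    raise ValueError(f"no active load ID for DDAGR {ddagr}")

# A DTP-style filter keeps only records belonging to the current load.
records = [
    {"ddagr": "NIELSEN", "load_id": 100, "unit_sales": 135},
    {"ddagr": "NIELSEN", "load_id": 101, "unit_sales": 122},
]
current = get_load_id("NIELSEN")
filtered = [r for r in records if r["load_id"] == current]
```

Records from older load IDs are simply excluded from the data flow, which is why the current-load flag must be correct before each load.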
11.1.6.1.2 DTP Filter Get Load ID
The method /DDF/CL_TRANSFORMATION=>GET_LOAD_ID is implemented in the following DTPs of the ATMA Interface.
The routine ensures that only the selected load ID is processed inside the data flow.
DTP
Location Hierarchy Assignments /DDF/TMALOHNA PC_FILE - Location Hierarchy Assignment
Location Hierarchy Descriptions / Texts /DDF/TMALOHHT PC_FILE – Location Hierarchy
Location Hierarchy Header / Attributes /DDF/TMALOHHD PC_FILE – Location Hierarchy Header
Location Hierarchy Node /DDF/TMALOHND PC_FILE – Location Hierarchy Node
Location Hierarchy Node Text /DDF/TMALOHNT PC_FILE – Location Hierarchy Node Text
Location Assignments /DDF/TMALOASS PC_FILE – Location Hierarchy Assignment
Location Descriptions / Texts /DDF/TMALOTXT PC_FILE – Location Descriptions
Location Header / Attributes /DDF/TMALOATT PC_FILE – Location Header
Product Hierarchy Descriptions / Texts /DDF/TMAPRHHT PC_FILE – Product Hierarchy Descriptions
Product Hierarchy Header /DDF/TMAPRHHD PC_FILE – Product Hierarchy Header
Product Hierarchy Node /DDF/TMAPRHND PC_FILE – Product Hierarchy Node
Product Hierarchy Node Text /DDF/TMAPRHNT PC_FILE – Product Hierarchy Node Text
Product Descriptions / Texts /DDF/TMAPRTXT PC_FILE – Product Descriptions
Product Hierarchy Assignments /DDF/TMAPRASS PC_FILE – Product Hierarchy Assignment
Product Master / Attributes /DDF/TMAPRATT PC_FILE – Product Master
Product UOM /DDF/TMAPRUOM PC_FILE – Product UOM
Tactics / Causals /DDF/TMASHTCT ATMA – Tactics
Product/Location Header / Attributes /DDF/TMAPLHDR ATMA – Product/Locations
11.1.6.1.3 Sales History Acquisition
The primary function of the Sales History Acquisition DSO (/DDF/TMASHACL) is to stage the sales history data from the flat files and to start the business process. Once the data is staged in the DSO, the tactic and user diff creation is started. The DSO has the same structure as the sales history flat file.
11.1.6.1.4 Table /DDF/C_KEYTACT
The table /DDF/C_KEYTACT is the basis for transposing the key figure model from the sales file into an account model. The account model is used in four DSOs:
DSO
Sales History Propagation
Tactic Consolidation
Tactics / Offers
User Diff Data
● Access the transaction using transaction code SM30.
● Enter /DDF/C_KEYTACT in the Table/View field and choose Maintain.
Enter the tactics and tactic types for your DDAGR. Depending on the DDAGR, the key figures are converted into an account model.
Each tactic must be in its own row.
Field | Content
DDAGR | Data delivery agreement
KEYFIG | Technical name of the key figure in which the tactic is stored in the ATMA interface acquisition layer DSO
TACTIC | Tactic into which the key figure is transferred (as a tactic / tactic type combination)
TACTICTYPE | Tactic type into which the key figure is transferred
TACTIC CLASS | 'TCT' or 'UDIF'. The tactic is created either for the Tactics DSO or for the User Diffs DSO.
DENOMINATOR | Constant '0'
NUMERATOR | Constant '0'
MIN_ACV_THRES | Minimum 1. Only values greater than this value are passed to the Tactic Consolidation DSO. DMF cannot handle tactics with 0 unit sales.
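The transposition from the key figure model to the account model can be sketched as follows (hypothetical field and key figure names; the actual conversion is driven by the /DDF/C_KEYTACT entries):

```python
# Hypothetical /DDF/C_KEYTACT entries for DDAGR "TMA".
keytact = [
    {"ddagr": "TMA", "keyfig": "KF_DISPLAY", "tactic": "9001", "tactictype": "1",
     "tactic_class": "TCT", "min_acv_thres": 1},
    {"ddagr": "TMA", "keyfig": "KF_UDIFF", "tactic": "9002", "tactictype": "1",
     "tactic_class": "UDIF", "min_acv_thres": 1},
]

# One sales record in the key figure model: each tactic is a separate column.
record = {"ddagr": "TMA", "product": "CPT_F_002", "location": "300790",
          "KF_DISPLAY": 70, "KF_UDIFF": 0}

# Transpose into the account model: one row per tactic / tactic type combination.
account_rows = []
for cfg in keytact:
    value = record.get(cfg["keyfig"], 0)
    # Values not greater than MIN_ACV_THRES are skipped; DMF cannot
    # handle tactics with 0 unit sales.
    if cfg["ddagr"] == record["ddagr"] and value > cfg["min_acv_thres"]:
        account_rows.append({
            "product": record["product"], "location": record["location"],
            "tactic": cfg["tactic"], "tactictype": cfg["tactictype"],
            "tactic_class": cfg["tactic_class"], "value": value,
        })
```

The zero-valued key figure is dropped by the threshold, so only one account-model row is produced for this record.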
11.1.6.1.5 Sales History Data Flow
The Sales History Acquisition DSO is the source for the Tactic Consolidation DSO and the Sales History Propagation DSO. To load your own tactics, you must enhance the InfoSources and the corresponding transformations:
● Enhance the InfoSource Sales History Acquisition with your customer tactics.
● Use the corresponding field assignments in the transformation.
11.1.6.2 Prepare Flat File
To manage the different loads, it is recommended to organize the flat files in the following folder structure.
The actual flat files are placed in the main upload folder: DMF-Staging Files.
Store the already loaded flat files with the corresponding load ID in the other folders. You can prepare the Windows folder structure as follows:
● DMF-Staging Files
● <System-ID>-LoadID_1
● <System-ID>-LoadID_2
● <System-ID>-LoadID_3
● <System-ID>-LoadID_n
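A small script can prepare this folder layout; this is only a sketch, and the system ID "BW1" and base folder name are hypothetical examples.

```python
from pathlib import Path

def prepare_staging_folders(base: Path, system_id: str, load_ids: list) -> None:
    """Create the main upload folder and one archive folder per load ID."""
    (base / "DMF-Staging Files").mkdir(parents=True, exist_ok=True)
    for load_id in load_ids:
        (base / f"{system_id}-LoadID_{load_id}").mkdir(parents=True, exist_ok=True)

# Example: archive folders for load IDs 1 to 3 of a system "BW1".
prepare_staging_folders(Path("staging_demo"), "BW1", [1, 2, 3])
```

After a load completes, the processed files can then be moved from DMF-Staging Files into the folder of their load ID.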
11.1.6.2.1 Location Hierarchy Assignments
The flat file describes the assignment of locations to the location hierarchy nodes. It has one row for each location. The flat file must contain the following fields in a fixed order.
Field Example Explanation
Location Hierarchy ID R3-CRM-01 Needs to match the Hierarchy in DMF
Location Hierarchy Node Identifier 300790/CPF1C100
Location Identifier 300790
11.1.6.2.2 Location Header
The flat file describes the header data of each location. It has one row for each location.
The flat file must contain the following fields in a fixed order.
Field | Example | Explanation
Location Identifier | 300790 |
Factory Calendar ID | US | The factory calendar has an impact on the seasonality effect. It must match the factory calendar in DMF. The Customizing for the factory calendar can be found under: SAP Customizing Implementation Guide > SAP NetWeaver > General Settings > Maintain Calendar > Factory Calendar
Time Zone | EST | Must match one of the time zones in DMF under: SAP Customizing Implementation Guide > SAP NetWeaver > General Settings > Time Zones > Maintain Time Zones
Sales Organization | US_UNITED_STATES | Must match the sales organization for the correct logical system ID in DMF under: SAP Customizing Implementation Guide > Demand Management Foundation > Organizational Data > Maintain Sales Organization
Purchase Organization | CENTRAL_PURCHASE_ORG | Must match the purchase organization for the correct logical system ID in DMF under: SAP Customizing Implementation Guide > Demand Management Foundation > Organizational Data > Maintain Purchase Organization
Distribution Channel | US_UNITED_STATES | Must match the distribution channel for the correct logical system ID in DMF under: SAP Customizing Implementation Guide > Demand Management Foundation > Organizational Data > Maintain Distribution Channel
Distribution Chain | US_UNITED_STATES | Must match the distribution chain for the correct logical system ID in DMF under: SAP Customizing Implementation Guide > Demand Management Foundation > Organizational Data > Maintain Distribution Chain
Location Open Date | 01/01/1900 |
Location Closing Date | 31/12/2500 |
Currency Key | GBP |
Location Description | Rema HQ / PO Box 1050 / NEW YORK NY 1000 |
Language Key | E |
11.1.6.2.3 Location Hierarchy Node
The flat file builds up the location hierarchy: you start with the root node and add several nodes to it. The lowest level is still a hierarchy node. The assignment between a hierarchy node and a location is built in the location assignments flat file. The file has several rows, depending on how the hierarchy is built.
The flat file must contain the following fields in a fixed order.
Field | Example 1 | Example 2 | Explanation
Location Hierarchy ID | R3-CRM-01 | R3-CRM-01 | Must match the hierarchy in DMF
Hierarchy Description | R3-CRM-01 | R3-CRM-01 |
Location Hierarchy Node ID | R3-CRM-01 | 300790/CPF1C100 | Builds up the hierarchy
Parent Node ID | | R3-CRM-01 |
Hierarchy Node Description | R3-CRM-01 | Rema HQ / PO Box 1050 / NEW YORK NY 1000 |
Language Key | E | E |
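The node/parent rows above define a tree. As a sketch, the hierarchy can be reconstructed from such rows like this (hypothetical node IDs taken from the examples):

```python
# Rows of the location hierarchy node file: (node_id, parent_id or None).
rows = [
    ("R3-CRM-01", None),                 # root node of the hierarchy
    ("300790/CPF1C100", "R3-CRM-01"),    # child node below the root
]

# Build a parent -> children map; actual locations are attached to the
# leaf nodes later via the location assignments file.
children = {}
for node, parent in rows:
    children.setdefault(parent, []).append(node)

def depth(node, parents=dict(rows)):
    """Distance of a node from the root of the hierarchy."""
    d = 0
    while parents[node] is not None:
        node = parents[node]
        d += 1
    return d
```

A root row has an empty parent; every other row must reference an existing node, otherwise the hierarchy is broken.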
11.1.6.2.4 Product Header
The flat file includes the description of the products. It has one row for each product.
The flat file must contain the following fields in a fixed order.
Field Example Explanation
Product ID CBP_P_002
Unit of Measure EA
0LANGU E
Description Hair Repair Shampoo 1 X 10 500ml
11.1.6.2.5 Product Hierarchy Assignments
The flat file describes the assignment of the products to the hierarchy nodes. It has one row for each product.
The flat file must contain the following fields in a fixed order.
Field Example Explanation
Product ID CBP_P_002
Product Hierarchy ID R3PRODHIER Must match the hierarchy in DMF
Product Hierarchy Node Identifier 00100
11.1.6.2.6 Product Hierarchy Node
The flat file builds up the product hierarchy. You start with the root node and add several nodes to it. The lowest level is still a hierarchy node. The assignment between a hierarchy node and a product is built in the product hierarchy assignments flat file. The file has several rows, depending on how the hierarchy is built.
The flat file must contain the following fields in a fixed order.
Field | Example 1 | Example 2 | Explanation
Product Hierarchy ID | R3PRODHIER | R3PRODHIER | Must match the hierarchy in DMF
Product Hierarchy Description | R3PRODHIER | R3PRODHIER |
Product Hierarchy Node Identifier | ROOT | 00100 | Builds up the hierarchy
Level ID | L0 | L1 | Builds up the hierarchy
Parent Node Identifier | | ROOT |
Node Description | ROOT | TMA Products |
Language Key | E | E |
11.1.6.2.7 Product UoM
The flat file determines the unit of measure conversion for each product. It has one row for each product.
The flat file must contain the following fields in a fixed order.
Field | Example | Explanation
Product ID | CBP_P_002 |
Product Unit of Measure | EA |
ISO Code for Unit of Measure | EA |
Numerator for UOM Conversion | 1 | Constant
Denominator for UOM Conversion | 1 | Constant
Length of a Product | 0 | Constant
Height of a Product | 0 | Constant
Width of a Product | 0 | Constant
Unit of Measure for Length | | Empty
ISO Unit of Measure for Length | | Empty
Product Volume | 0 | Constant
Unit of Measure for Volume | | Empty
ISO Unit of Measure for Volume | EA |
Product Gross Weight | 0 | Constant
Unit of Measure for Weight | | Empty
ISO Unit of Measure for Weight | | Empty
11.1.6.2.8 Product Location Price Data
The flat file determines the prices per product location. It has one price row for each product location.
The flat file must contain the following fields in a fixed order.
Field | Example | Explanation
Product ID | CBP_P_002 |
Location ID | 300790 |
Valid From | 19000101 | Format yyyymmdd
Valid To | 99991231 | Format yyyymmdd
Regular Gross Price | 7.99 | Constant
Regular Net Price | 7.99 | Constant
Currency | EUR | Constant
Unit | EA | Constant
11.1.6.2.9 Product Location Manufacturer Sales Price
The flat file determines the sales prices per product location. It has one price row for each product location.
The flat file must contain the following fields in a fixed order.
Field | Example | Explanation
Product ID | CBP_P_002 |
Location ID | 300790 |
Valid From | 19000101 | Format yyyymmdd
Valid To | 99991231 | Format yyyymmdd
Manufacturer Sales Price | 3.20 |
Currency | EUR |
Unit | EA |
11.1.6.2.10 Product Location Manufacturer COGS
The flat file determines the manufacturer cost of goods sold per product location. It has one row for each product location.
The flat file must contain the following fields in a fixed order.
Field | Example | Explanation
Product ID | CBP_P_002 |
Location ID | 300790 |
Valid From | 19000101 | Format yyyymmdd
Valid To | 99991231 | Format yyyymmdd
Manufacturer Cost of Goods Sold | 6.39 |
Currency | EUR |
Unit | EA |
11.1.6.2.11 Sales History Data
The Sales History flat file represents the time series sales data. For each product location, different key figures are maintained on a week or day level. This file has the largest number of rows.
The flat file must contain the following fields in a fixed order:
Field | Example 1 | Example 2 | Explanation
DDAGR | TMA | TMA | Data delivery agreement, to distinguish between different data sources (IRI, internal, Nielsen, and so on)
Location ID | 300790 | 300790 |
Product ID | CPT_F_002 | CPT_F_002 |
UTC Time Stamp | 20141229000000 | 20150105000000 |
Time Granularity | W | W | W = week; D = day
Unit Sales | 135 | 122 |
Unit | EA | EA |
Gross Sales Amount | 291.45 | 263.70 | Unit sales * sales per unit
Net Sales Amount | 291.45 | 263.70 | Unit sales * sales per unit
Regular Gross Price of a Product | 2.29 | 2.29 | If the promoted price is lower than the regular price by 10% or more, DMF will infer a TPR. If this behavior is not desired, implement the BAdI /DMF/TPO_OFFER_FILTER in class /DMF/CL_TPO_OFFER_FILTER.
Regular Net Price of a Product | 2.29 | 2.29 |
Gross Sales per Unit | 2.16 | 2.16 |
Net Sales per Unit | 2.16 | 2.16 |
Currency | USD | USD |
ACV – All Commodity Volume | 96 | 96 | All commodity volume / store ACV
Gross Promotion Price | 2.16 | 2.16 | If the promoted price is lower than the regular price by 10% or more, DMF will infer a TPR.
Net Promotion Price | 2.16 | 2.16 | If this behavior is not desired, implement the BAdI /DMF/TPO_OFFER_FILTER in class /DMF/CL_TPO_OFFER_FILTER.
Currency | EA | |
Store ACV – All Commodity Volume (0–100) | 100% | 89% |
TPR | 100 | 0 | Temporary list price reduction: 0 for 'off' and 100 for 'on'
Tactic 1 … n ("Print Add 9001 – 1") | 0 | 70 | Example of a custom-defined tactic. You can add as many as you need in additional columns. The tactic must match a tactic in DMF: SAP Customizing Implementation Guide > Cross-Application Components > Demand Management Foundation > Promotions > Maintain Promotion Specific Tactics
Cannibalization 1 … n (1CANN) | 0 | 100 | Cannibalization is handled as a tactic and must be maintained under SAP Customizing Implementation Guide > Cross-Application Components > Demand Management Foundation > Promotions > Maintain Promotion Specific Tactics. Cannibalization is similar to TPR, that is, 100 for 'on' and 0 for 'off'
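The 10% rule for TPR inference mentioned in the price fields can be expressed as a small check. This is only a sketch of the rule as stated above; the actual inference happens inside DMF and can be suppressed via the named BAdI.

```python
def infers_tpr(regular_price: float, promotion_price: float,
               threshold: float = 0.10) -> bool:
    """DMF infers a temporary price reduction (TPR) when the promoted
    price is lower than the regular price by 10% or more."""
    return promotion_price <= regular_price * (1 - threshold)
```

With the example values from the file above (regular gross price 2.29, gross promotion price 2.16), the discount is only about 5.7%, so no TPR would be inferred.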
11.1.6.2.12 Promo ID
It is possible to add the promo ID to the sales history file and subsequently show it in the analytics results.
To enable this, changes must be made in the following objects:
Type Name
DSO Tactic Consolidation /DDF/TPOSHCON
DSO Tactics / Causals /DDF/TPOSHTCT
Method Expert Routine from /DDF/TPOSHDTA to /DDF/TPOSHTCT EXP_TO_TPOSHTCT
RFC-Handler /DDF/TPO_RFC_SALESHISTORY
11.1.6.3 Load Flat Files
11.1.6.3.1 Data Sources
Data Source Flat File
Location Header /DDF/TMA_LOCATION Location Header.xlsx
Location Hierarchy Assignments /DDF/TMA_LOCATION_HIER_ASS Location Hierarchy Assignments.xlsx
Location Hierarchy Nodes /DDF/TMA_LOCATION_HIER_NODE Location Hierarchy Node.xlsx
Product Header /DDF/TMA_PRODUCT Product Header.xlsx
Product Hierarchy Assignments /DDF/TMA_PRODUCT_HIER_ASS Product Hierarchy Assignments.xlsx
Product Hierarchy Node /DDF/TMA_PRODUCT_HIER_NODE Product Hierarchy Node.xlsx
Product UOM /DDF/TMA_PRODUCT_UOM Product UOM.xlsx
Product/Location Manufacturer COGS /DDF/TMA_PRODUCT_LOC_COGS Product Location Man COGS.xlsx
Product/Location Manufacturer Sales Prices
/DDF/TMA_PRODUCT_LOC_SALES Product Location Man Sales Price.xlsx
Product/Location Price Data /DDF/TMA_PRODUCT_LOC_PRICE Product Location Prices Data.xlsx
Sales History Data /DDF/TMA_SALES_HISTORY Sales History Data.xlsx
11.1.6.3.2 Sales History
You must start by loading the sales history data to assign a new load ID to the current dataset.
To load the sales history data, follow these steps:
● Load the data to the Sales History Acquisition DSO (/DDF/TMASHACL).
● Check if the request is green.
Caution
The content of the DSO should be the same as the sales history flat file.
● Delete the contents of the Tactic Consolidation DSO (/DDF/TMASHCON).
● Execute the DTP to the Tactic Consolidation DSO:
DTP ATMA – Tactic Consolidation
● Activate Tactic Consolidation (/DDF/TMASHCON).
● Check if the request is green and the filter shows the load ID from file.
Caution
The amount of data in the DSO must be higher than in the sales history file. Check the content of the DSO and verify that a tactic is created in this DSO for every tactic in the sales history file.
The basis for the creation of the tactics is the table /DDF/C_KEYTACT [page 370].
Make sure the InfoSource Tactic Consolidation (/DDF/TMASHCON) is enhanced with the tactic key figures.
● Execute the DTP from the Sales History Acquisition DSO into the Sales History DSO. Check if the request is green and the filter shows the load ID from the file.
Check the content of the DSO. It must be identical to the content of the Sales History Acquisition DSO.
● Execute the DTP from the Sales History Propagation DSO to the Tactics / Offers DSO. Check if the request is green and the filter shows the load ID from the file.
Note
The data volume must be identical to the Tactic Consolidation DSO. Check that the tactic IDs of the Tactics / Offers DSO are also stored in the Sales History Propagation DSO.
● The DSO Content will be as follows:
Sales History DSO
Time Product Location Tactic ID
01.01.2018 CPT_F_002 300790 1
08.01.2018 CPT_F_002 300790 2
Tactics / Causals
Tactic ID Tactic Type Tactic
3 1
20 22
70 71
80 81
80 82
● Execute the DTP from the Sales History Propagation DSO to the User Diff Data DSO. Check if the request is green and the filter shows the load ID from the file.
Caution
The volume of data must be identical to the Tactic Consolidation DSO. Check that the tactic IDs of the User Diff DSO are also stored in the Sales History Propagation DSO.
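The checks listed in the steps above can be summarized in a short sketch (hypothetical record sets standing in for the DSO contents):

```python
# Hypothetical DSO contents, represented as lists of records.
sales_history_propagation = [
    {"time": "2018-01-01", "product": "CPT_F_002", "location": "300790", "tactic_id": 1},
    {"time": "2018-01-08", "product": "CPT_F_002", "location": "300790", "tactic_id": 2},
]
tactic_consolidation = [{"tactic_id": 1}, {"tactic_id": 2}]
tactics_offers = [{"tactic_id": 1}, {"tactic_id": 2}]

# The data volume of Tactics / Offers must match Tactic Consolidation.
volumes_match = len(tactics_offers) == len(tactic_consolidation)

# Every tactic ID in Tactics / Offers must also appear in the
# Sales History Propagation DSO.
prop_ids = {r["tactic_id"] for r in sales_history_propagation}
ids_consistent = all(r["tactic_id"] in prop_ids for r in tactics_offers)
```

If either check fails, the load should be investigated before the data is transferred to DMF.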
11.1.6.3.3 Master Data
Load the data from the following data sources into the related DSOs by executing the DTPs in the table below. As the DSOs are write-optimized, no activation is needed. Check whether the request is green. Compare the content of the DSO with the content of the file; they must be identical.
Note
It is recommended to build a process chain for the DTPs.
Data Source | DTP | ADSO
Location Header (/DDF/TMA_LOCATION) | PC_FILE – Location Header | Location Header (/DDF/TMALOATT)
Location Header (/DDF/TMA_LOCATION) | PC_FILE – Location Header – texts | Location Header – texts (/DDF/TMALOTXT)
Location Hierarchy Assignments (/DDF/TMA_LOCATION_HIER_ASS) | PC_FILE – Location Hierarchy Assignments | Location Hierarchy Assignments (/DDF/TMALOHNA)
Location Hierarchy Nodes (/DDF/TMA_LOCATION_HIER_NODE) | PC_FILE – Location Hierarchy ID | Location Hierarchy ID (/DDF/TMALOHHD)
Location Hierarchy Nodes (/DDF/TMA_LOCATION_HIER_NODE) | PC_FILE – Location Hierarchy ID – texts | Location Hierarchy ID – texts (/DDF/TMALOHHT)
Location Hierarchy Nodes (/DDF/TMA_LOCATION_HIER_NODE) | PC_FILE – Location Hierarchy Nodes | Location Hierarchy Nodes (/DDF/TMALOHND)
Location Hierarchy Nodes (/DDF/TMA_LOCATION_HIER_NODE) | PC_FILE – Location Hierarchy Nodes – texts | Location Hierarchy Nodes – texts (/DDF/TMALOHNT)
Product Header (/DDF/TMA_PRODUCT) | PC_FILE – Product Header | Product Header (/DDF/TMAPRATT)
Product Header (/DDF/TMA_PRODUCT) | PC_FILE – Product Header – texts | Product Header – texts (/DDF/TMAPRTXT)
Product Hierarchy Assignments (/DDF/TMA_PRODUCT_HIER_ASS) | PC_FILE – Product Hierarchy Assignments | Product Hierarchy Assignments (/DDF/TMAPRASS)
Product Hierarchy Node (/DDF/TMA_PRODUCT_HIER_NODE) | PC_FILE – Product Hierarchy ID | Product Hierarchy ID (/DDF/TMAPRHHD)
Product Hierarchy Node (/DDF/TMA_PRODUCT_HIER_NODE) | PC_FILE – Product Hierarchy ID – texts | Product Hierarchy ID – texts (/DDF/TMAPRHHT)
Product Hierarchy Node (/DDF/TMA_PRODUCT_HIER_NODE) | PC_FILE – Product Hierarchy Nodes | Product Hierarchy Nodes (/DDF/TMAPRHND)
Product Hierarchy Node (/DDF/TMA_PRODUCT_HIER_NODE) | PC_FILE – Product Hierarchy Nodes – texts | Product Hierarchy Nodes – texts (/DDF/TMAPRHNT)
Product UOM (/DDF/TMA_PRODUCT_UOM) | PC_FILE – Product UOM | Product UOM (/DDF/TMAPRUOM)
Product/Location Manufacturer COGS (/DDF/TMA_PRODUCT_LOC_COGS) | PC_FILE – Product/Location Manufacturer Cost of Goods | Product/Location Manufacturer Cost of Goods (/DDF/TMAPLMCP)
Product/Location Manufacturer Sales Prices (/DDF/TMA_PRODUCT_LOC_SALES) | PC_FILE – Product/Location Manufacturer Sales | Product/Location Manufacturer Sales (/DDF/TMAPLMSP)
Product/Location Price Data (/DDF/TMA_PRODUCT_LOC_PRICE) | PC_FILE – Sales History Acquisition | Product/Location Price Data (/DDF/TMAPLPRD)
Sales History Data (/DDF/TMA_SALES_HISTORY) | PC_FILE – Sales History Acquisition | Sales History Acquisition (/DDF/TMASHACL)
11.1.6.3.4 Product Location
To create the Product Location data from the Sales History data, you must execute the following steps:
Load Product/Location Header
● Execute the Sales History Data DTP. Check if the request is green and the filter shows the load ID from the file.
The transformation creates the product locations from the Sales History DSO.
11.1.7 Upload to DMF
11.1.7.1 Clean DMF Staging Data
In most cases, it is necessary to delete existing data from the DMF tables. This can affect staging tables, production tables, or both, and must be clarified with the data scientist beforehand.
Deletion of data from staging tables:
● Log on to the DMF system.
● Launch transaction SE38, run program /DMF/DEL_STAGING_DATA, and select the tables you want to delete data from.
● Choose Schedule ( F8 ).
Data Deletion from Production Tables
● Log on to the DMF system.
● Launch transaction SE38 and run program /DMF/TS_DELETE.
● Select the deletion type:
○ Forecast Data
○ Model Data
○ Time Series Data
○ Orphan Data
● Maintain parameters of the report for forecast data deletion:
Logical system System where the data will be deleted
Forecast ID Optional
Product ID Choose the products you want to delete; leave empty if you want to delete all products
Location ID Choose the locations you want to delete; leave empty if you want to delete all locations
Start Date Optional
End Date Optional
Maintain the parameters of the report for model data deletion:
Logical system System where the data will be deleted
Model ID Optional
Product ID Choose the products you want to delete; leave empty if you want to delete all products
Location ID Choose the locations you want to delete; leave empty if you want to delete all locations
Maintain the parameters of time series data deletion:
Logical system System where the data will be deleted
Key Figure Parameter Optional
Product ID Choose the products you want to delete; leave empty if you want to delete all products
Location ID Choose the locations you want to delete; leave empty if you want to delete all locations
Start Date Optional
End Date Optional
Select the data type for orphan deletion:
Forecast data
Model data
Time Series Data
If you need to delete the master data of the production tables, you must implement SAP Note 2574423 – Delete Data from DMF Tables.
Caution
These reports can delete the actual productive model. Deleting master data tables from the productive system can also leave the model inconsistent and broken. Use these reports with care.
11.1.7.2 Upload Data from BW to DMF Staging
● Launch transaction SE38 and execute program /DDF/TPO_START_UPLOAD.
● On the Program screen, make the following entries and press F8:
Field Name Entry Value
Load ID The Load ID of your data delivery.
Load Type A load type (for example, ALL)
Choose Execute (F8).
Caution
To execute this program, you may need special authorizations:
Object Field Value
DDF_ATMA ACTVT 16
11.1.7.3 DMF Stage to Production Load
● Log on to the DMF system.
● Run transaction NWBC to launch the NetWeaver Business Client.
● On the launch page of the NetWeaver Business Client, select Cockpit. If this option is not available, choose SAP_QAP_SAP_ISR_DDF_ROLES_MENU.
● Navigate to Services > Monitor Imports > Manual Processing.
● A pop-up window opens with the possible types of data loads. Mark your selection and choose Execute.
Note
You can only flag one selection at a time.
The loads must be executed in the following order.
Process Step
Location
Location Hierarchies
Product Hierarchies
Product
Product Location
Syndicated POS Data
In the monitor, you can watch the number of data records decrease while the load is running. The system should auto-refresh every n seconds. You can also refresh the screen manually. The job is finished once the number of records for your data type counts down to zero.
12 Configuration and Administration
Use
During the implementation phase you make all necessary settings in Customizing and configure the individual components for running SAP Demand Signal Management.
Depending on your system landscape, you make Customizing settings, for example, in the development system and all other settings in the production system.
You can access the individual settings at the following places:
● Customizing activities: Open Customizing and choose Cross-Application Components > Demand Data Foundation or Cross-Application Components > Demand Signal Management.
● Configuration transactions: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management.
● Web user interfaces for configuration
NoteAccess to Web user interfaces (Web UIs) is granted by authorization roles (PFCG roles) that contain the Web UI in the assigned menu.
To be able to access Web UIs, for example, from the user menu or in the SAP NetWeaver Business Client (NWBC), you must have a corresponding authorization role assigned to your system user.
For more information, see Roles for SAP Demand Signal Management [page 530].
When the system is already running and used in production mode, you need to change the configuration or define new settings from time to time. The following table explains in which cases you have to adapt the configuration in the production system.
When? Task See Also
You receive new files for retailer data (POS data) from an external data provider for which no data delivery agreement is defined yet.
Define new data delivery agreements for retailer data and all dependent objects like data origins, data providers, regions, and contexts
Defining Data Delivery Agreements [page 106]
You receive POS data from a new retailer. Define location calendars Defining Location Calendars [page 44]
Define periods with sales deviations for the pre-cleansing function of sales data enrichment
Defining Periods with Sales Deviations [page 274]
A new data delivery agreement for retailer data has been created.
Set up semantic partitioning Setting Up Semantic Partitioning [page 45]
Define files and file sets Defining Files and File Sets [page 111]
Define harmonization groups Defining Harmonization Groups [page 230]
Configure quality validation Configuring Quality Validation [page 203]
You receive new files for retail panel data (market research data) from an external data provider for which no data delivery agreement is defined yet.
Define new data delivery agreements for retail panel data and all dependent objects like data origins, data providers, regions, and contexts.
Defining Data Delivery Agreements [page 106]
Enhancing Data Delivery Agreements for Retail Panel Data [page 141]
A new data delivery agreement for retail panel data has been created.
Set up semantic partitioning Setting Up Semantic Partitioning [page 45]
Map external fields and adapt the mapping
Mapping External Fields for Retail Panel Data [page 143]
Define extraction filters Defining Extraction Filters [page 145]
Revise hierarchy descriptions Revising Hierarchy Descriptions [page 150]
Define harmonization groups Defining Harmonization Groups [page 230]
Configure quality validation Configuring Quality Validation [page 203]
When a new data delivery agreement for retail panel data with a new context for retail panel data has been created in the system:
● Set up semantic partitioning. See Setting Up Semantic Partitioning [page 45].
● Define time derivation. See Defining Time Derivation [page 147].
● Revise hierarchy levels. See Revising Hierarchy Levels [page 152].

When a new data delivery agreement for retail panel data with a new context for retail panel data has been created in the system that is relevant for global reporting:
● Define product level descriptions for global reporting. See Defining Product Level Description [page 153].

When a new data delivery agreement and the dependent objects were created in a separate system (for example, a test system) and you want to take over the settings into the production system:
● Synchronize data delivery agreements and the dependent objects between a source and a target system.

When a new data origin has been created:
● Set up automatic harmonization. See Setting Up Automatic Harmonization [page 223].
● Define the appearance of object attributes on user interfaces. See Defining the Appearance of Object Attributes [page 221].
● Define priorities for copying attribute values. See Defining Priorities for Copying Attribute Values into Harmonized Objects [page 224].

When new object attributes are available and you want them to be considered in data harmonization:
● Define priorities for copying attribute values. See Defining Priorities for Copying Attribute Values into Harmonized Objects [page 224].
● Define the appearance of object attributes on user interfaces. See Defining the Appearance of Object Attributes [page 221].
● Define restrictions for harmonized attribute values. See Defining Restrictions for Harmonized Attribute Values [page 226].

When you want to define new allowed attribute values (for example, new manufacturer names) for data harmonization:
● Define allowed attribute values. See Defining Allowed Attribute Values [page 229].
When you want to update the time master data for aggregated sales or stock snapshot data from retailers:
● Upload the time master data to the relevant InfoObject. See One-Time Uploads Before Recurring Regular Data Uploads Are Started [page 164].

When you have defined new stock types or types of sales data for POS data in Customizing, or you have defined new regions:
● Upload the entities to the relevant InfoObject.

When you want to use the analytics and reporting Web Dynpro UIs:
● Configure the Web Dynpro reports. See Configuring the Web Dynpro Reports [page 507].

When you want to use the Sales Performance Analysis and Market Share Analysis analytics dashboards:
● Configure product and location hierarchies for the analytics dashboards. See Configuring Product and Location Hierarchies for the Analytics Dashboards [page 509].
12.1 Configuring and Administering the Data Model
As an administrator for the data model, you have the following tasks:
● Set up the data model in Customizing. For more information, see the Customizing documentation for Data Model under Cross-Application Components Demand Data Foundation Data Model.
● Configure the location calendar in the SAP Easy Access menu
● Set up semantic partitioning
● Delete account assignments from InfoObjects
● Activate the direct update of master data
● Check Customizing
Set up data model in Customizing
When: Implementation phase
Where: Customizing under Cross-Application Components Demand Data Foundation Data Model

Define set of operating times
When: Before you upload locations of a new retailer
Where: On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Location Calendar Define Set of Operating Times.

Define location calendar
When: Before you upload locations of a new retailer
Where: On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Location Calendar Define Location Calendar.

Define periods of location calendar
When: Before you upload locations of a new retailer
Where: On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Location Calendar Define Periods of Location Calendar.

Define default assignment of calendar
When: Before you upload locations of a new retailer
Where: On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Location Calendar Define Default Assignment of Calendar.

Assign locations to calendars
When: On demand, to check the assignments, and to create or change assignments if necessary
Where: On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Location Calendar Assign Locations to Calendar.
Set up semantic partitioning
When: When you create a new data delivery agreement or a new context
Where: Customizing under Cross-Application Components Demand Data Foundation Data Model. For more information, see Setting Up Semantic Partitioning [page 45].

Delete account assignments
When: On demand, in the rare case that an account represents an individual person and the records for this person must be deleted
Where: On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Account Reference Delete Account Assignments.

Activate direct update of master data
When: Implementation phase
Where: On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Account Reference Update InfoObjects from Change Log. For more information, see Activating the Direct Update of Master Data [page 54].

Check Customizing
When: On demand
Where: On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Model Account Reference Check Customizing for Data Model.
12.1.1 Defining Location Calendars
Use
Within the Demand Data Foundation, you can define location calendars and assign them to locations to indicate the operating times of these locations (for example, stores or distribution centers of a retail company).
For SAP Demand Signal Management, version for SAP BW/4HANA it is important to know the operating days and sometimes even operating hours of a retailer's store. This information is needed to obtain precise results during the data enrichment of aggregated sales data. Location calendars can vary for different companies, countries, and regions. Therefore it is possible to define various location calendars and assign them to the respective locations. Determining the correct calendar and evaluating the calendar information (for example, to find out if a certain day is a working day or a public holiday, or to check operating hours of a location for a given day or week), can impact performance. Therefore, the system generates periods for each location calendar and persists the relevant information as a time stream. This time stream can be evaluated quickly by the consuming applications.
Depending on the application and data using the location calendar, you can define the required periods, configuring a calculation rule. Optionally, you can also define a set of operating times per weekday and assign it to a location calendar.
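To picture how a pre-generated time stream makes calendar evaluation fast, consider the following Python sketch. It is purely illustrative: the weekday operating times, holiday set, and function names are hypothetical assumptions, not part of the SAP implementation, which persists the time stream in the database.

```python
from datetime import date, time, timedelta

# Hypothetical sample data: set of operating times per weekday (0 = Monday).
# Sunday (6) has no entry, meaning the location is closed.
OPERATING_TIMES = {
    0: (time(8), time(20)),
    1: (time(8), time(20)),
    2: (time(8), time(20)),
    3: (time(8), time(20)),
    4: (time(8), time(22)),
    5: (time(9), time(18)),
}
PUBLIC_HOLIDAYS = {date(2018, 12, 25), date(2018, 12, 26)}

def generate_time_stream(start, end):
    """Pre-generate the periods of a location calendar once, so that later
    lookups (working day? operating hours?) are simple dictionary reads."""
    stream = {}
    day = start
    while day <= end:
        open_close = OPERATING_TIMES.get(day.weekday())
        stream[day] = None if day in PUBLIC_HOLIDAYS else open_close
        day += timedelta(days=1)
    return stream

stream = generate_time_stream(date(2018, 12, 24), date(2018, 12, 31))

def is_working_day(d):
    # A day is a working day if the persisted stream has operating times for it.
    return stream.get(d) is not None
```

Consuming applications then evaluate the persisted stream instead of re-deriving calendar rules for every record.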
Note
The location calendar, the set of operating times, and the assignment of location calendars to locations are treated like master data. This data needs to be adjusted from time to time. It cannot be transported; you have to set it up from scratch in every system. Setting up location calendars is an administrative task that uses some Customizing data, for example, the calculation rule.
If you change the settings for location calendars, operating times, or assignments to locations, the changes are effective only for newly created data. If a reevaluation is required, you must reprocess the corresponding data to make the changes to a location calendar effective.
Prerequisites
For all activities around the location calendar you need the authorization DDF_ADMIN (Administration Demand Data Foundation).
Activities
Setting Up Location Calendars
1. In the Customizing activity Maintain Calendar, define a holiday and factory calendar. For more information, see the Customizing for Calendars under SAP NetWeaver General Settings Maintain Calendar .
2. In the Customizing activity Define Calculation Rules for Periods of Location Calendar, define a period calculation rule. For more information, see the Customizing for Data Model under Cross-Application Components Demand Data Foundation Data Model Define Calculation Rules for Periods of Location Calendar .
3. Define a set of operating times (optional).
4. Define a location calendar and assign:
○ A factory calendar
○ A period calculation rule
○ A set of operating times (optional)
Assigning Locations to Location Calendars
1. Assign your calendar in the view Define Default Assignment of Location Calendar.
2. Load locations into the system, for example into InfoObject /DDF/LOCATION.
3. In the program Assignments of Locations to Calendars, verify the generated assignments.
For more detailed information, see the individual program documentation.
12.1.2 Delete Account Assignments
If data privacy regulations require the deletion of accounts that represent private persons, you can use this report to delete account assignments from InfoObjects.
12.1.3 Setting Up Semantic Partitioning
Use
If you receive very large amounts of POS data or retail panel data over time, you may not be able to upload all this data to a single DataStore object (DSO). You can use semantically partitioned objects (SPOs) to distribute the data over several smaller physical DSOs. SAP Demand Signal Management, version for SAP BW/4HANA supports the semantic partitioning of your data model by data delivery agreement for transactional data and by context for master data. If the upload of one semantic partition fails, the uploads of other partitions are not affected.
If you receive several master data files from different retailers at the same time, you usually want to make this data available for reporting as fast as possible. In this case, you can also use SPOs to upload the data from the different contexts in parallel to the different physical DSOs of the SPO. You do this, for example, if the system cannot parallelize the data upload of a single data delivery because the master data files are too small.
Implementation Considerations
● Consider semantic partitioning of transactional data whenever you expect that the amount of data uploaded to a single standard DSO will exceed 500 million records over time. Do not create more semantic partitions than necessary because this could lead to a decrease in query performance.
● Consider semantic partitioning of master data when you frequently receive small portions of master data which need to be uploaded in parallel.
● We highly recommend using SPOs for semantic partitioning. If you instead create your own copies of each standard DSO, you need to carry out additional activities to configure and operate your system. You must implement your own logic to direct the data of each data delivery to the right copy of the standard DSO. Depending on this logic, you adjust your implementation or data model when onboarding a new data provider. For master data, you must also create your own implementation of the Business Add-In (BAdI) /DDF/MD_ACCESS. If you work with SPOs instead, you can use the default implementation of this BAdI.
Activities
Perform the following steps to be able to use semantic partitioning in SAP Demand Signal Management, version for SAP BW/4HANA:
1. Maintain partition objects to automatically include the new partition as part of the calculation view
2. Set up the data model by partition values for transactional data
3. Set up the data model by partition values for bill of material data
4. Set up the data model by partition values for master data
5. Make sure that the correct partitions are accessed by the system
More Information
Maintaining Partition Objects [page 47]
Setting Up the Data Model by Partition Values for Transactional Data [page 47]
Setting Up the Data Model by Partition Values for Master Data [page 48]
Accessing the Correct Partitions [page 49]
For more information about using semantic partitioning, see SAP Help Portal at http://help.sap.com under SAP NetWeaver SAP NetWeaver Platform Application Help Function-Oriented View English
Business Warehouse Data Warehousing Modeling Enterprise Data Warehouse Layer Using Semantic Partitioning .
For more information about SAP HANA, see SAP Help Portal at http://help.sap.com under SAP In-Memory Computing SAP HANA SAP HANA Platform Development Information SAP HANA Developer Guide .
12.1.3.1 Maintaining Partition Objects
Within the Demand Data Foundation, you can maintain the settings in the Maintain Partition Objects view to automatically include the new partition as part of the calculation view.
This functionality uses a flexible framework to automatically handle the new partitions without creating a new set of procedures for Quality Validation, Enrichment and Reporting. To maintain this partitioning, proceed as follows:
1. Create a partition and include it in your DSO.
2. Specify the DSO as the Main Object in this Customizing activity.
3. Maintain the partitions as Partition Object under the main object.
4. Choose the attributes that should be included in the calculation view.
5. Select the checkbox for the Main Object and Partition Object that you want to include, and choose the Execute button.
6. Check on the SAP HANA side that the calculation view is generated with the partitions and DSO attributes included in the Customizing.
For more information, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Data Model Maintain Partition Objects.
12.1.3.2 Maintaining Partition Filter for BOM
Within the Demand Data Foundation, you can define the source partition names in the propagation layer for the target Bill of Material split DataStore object (DSO). The standard DSOs for transactional data are used as templates.
For this partitioning, the following applies:
1. The source partition refers to the DSO with the Bill of Material-relevant sales or stock data.
2. The target partition refers to the Bill of Material split DSO, which stores data at a lower level of granularity.
For more information, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Data Model Maintain Partition Filter for Bill of Materials.
12.1.3.3 Setting Up the Data Model by Partition Values for Transactional Data
Use
The criterion for defining the partitions of a semantically partitioned object (SPO) must be both flexible and scalable, because new data delivery agreements are created at different points in time and the data deliveries may have different sizes. Every time you create a new data delivery agreement, you decide to which partition value you want to assign it. To set up the data model, proceed as follows:
1. Create partition values in Customizing.
2. Assign the partition values to data delivery agreements during mass creation and synchronization.
3. Make the partition values available as characteristic values of InfoObject /DDF/DDA_PART in SAP NetWeaver Business Warehouse.
4. Use the InfoObject /DDF/DDA_PART as a criterion for defining partitions of SPOs based on the structure of the following standard DSOs:
○ /DDF/DS04 (Market Research Retail Panel Acquisition)
○ /DDF/DS14 (Market Research Retail Panel Propagation)
○ /DDF/DS01 (Aggregated Sales Acquisition)
○ /DDF/DS11 (Aggregated Sales Propagation)
○ /DDF/DS02 (Stock Snapshot Acquisition)
○ /DDF/DS12 (Stock Snapshot Propagation)
5. You can define the settings in Customizing to automatically include the new partition as part of the calculation view. For more information, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Data Model Maintain Partition Objects .
For more information, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Data Model Define Partition Values for Transactional Data .
More Information
Setting Up Semantic Partitioning [page 45]
Setting Up the Data Model by Partition Values for Master Data [page 48]
Accessing the Correct Partitions [page 49]
12.1.3.4 Setting Up the Data Model by Partition Values for Master Data
Use
You assign the partition value for master data to a context to make sure that each master data record is available in only one partition of the semantically partitioned object (SPO).
Every time you create a new context, proceed as follows, before you upload master data for that context:
1. Create partition values in Customizing.
2. Assign the partition values to the context. If you synchronize contexts from different systems, make sure that the partition values are synchronized as well.
3. Make the partition value available as a characteristic value of InfoObject /DDF/CTX_PART in SAP NetWeaver Business Warehouse by executing the corresponding data transfer process (DTP).
4. Use the InfoObject /DDF/CTX_PART as a criterion for defining partitions of SPOs based on the structure of the following standard DSOs:
○ /DDF/DS41 (Product Acquisition)
○ /DDF/DS51 (Product Propagation)
○ /DDF/DS43 (Product Name/Value Pairs Acquisition)
○ /DDF/DS53 (Product Name/Value Pairs Propagation)
○ /DDF/DS21 (Location Acquisition)
○ /DDF/DS31 (Location Propagation)
○ /DDF/DS22 (Location Name/Value Pairs Acquisition)
○ /DDF/DS32 (Location Name/Value Pairs Propagation)
○ /DDF/DS61 (Attribute Reference)
○ /DDF/DS05 (Time String Acquisition Market Research)
○ /DDF/DS15 (Time String Propagation Market Research)
○ /DDF/DS71 (Hierarchy Metadata)
○ /DDF/DS72 (Hierarchy Level Metadata)
○ /DDF/DS44 (Product Hierarchy Acquisition)
○ /DDF/DS54 (Product Hierarchy Propagation)
○ /DDF/DS24 (Location Hierarchy Market Research Acquisition)
○ /DDF/DS34 (Location Hierarchy Market Research Propagation)
Recommendation
Create as many partitions as you need to cover as many contexts as possible in the early implementation phase in order to keep the changes to the SPOs and the necessary uploads to InfoObject /DDF/CTX_PART at a minimum.
For more information, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Data Model Define Partition Values for Master Data .
More Information
Setting Up Semantic Partitioning [page 45]
Setting Up the Data Model by Partition Values for Transactional Data [page 47]
Accessing the Correct Partitions [page 49]
12.1.3.5 Accessing the Correct Partitions
Use
When you use semantically partitioned objects (SPOs), you have to adjust the standard data flows so that the data upload feeds the correct SPOs. To do this, you copy the standard data transfer processes (DTPs). This affects all transformations from the Persistent Staging Area (PSA) to the acquisition DataStore objects (DSOs) and from there into the propagation DSOs.
When you define the partitions of an SPO, you can create DTPs for the upload to the different DSOs of the SPO, using a standard DTP as a template. By doing so, you reuse the standard transformations for the upload from the PSA to the data acquisition layer, and from the data acquisition layer to the data propagation layer. These transformations fill the partition value (/DDF/DDA_PART for transactional data or /DDF/CTX_PART for master data) as key field of the target DSO, according to the data delivery agreement or context that is known during the upload process. This allows the system to direct the data to the correct partition of the target SPO.
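The routing idea described above can be pictured with a small Python sketch. The data delivery agreements, partition values, and DSO names below are purely illustrative assumptions, not SAP standard content:

```python
# Hypothetical sketch: a transformation fills the partition value (modeled after
# /DDF/DDA_PART) as a key field, so each record can be directed to the physical
# DSO of its partition. All names below are illustrative.
PARTITION_OF_DDA = {            # assigned when the data delivery agreement is created
    "DDA_RETAILER_A": "PART_01",
    "DDA_RETAILER_B": "PART_01",
    "DDA_RETAILER_C": "PART_02",
}
PHYSICAL_DSO_OF_PARTITION = {   # the partitions (physical DSOs) of the SPO
    "PART_01": "ZDS01_1",
    "PART_02": "ZDS01_2",
}

def route_record(record):
    """Fill the partition value from the record's data delivery agreement and
    return the target physical DSO together with the enriched record."""
    part = PARTITION_OF_DDA[record["dda"]]
    enriched = {**record, "dda_part": part}
    return PHYSICAL_DSO_OF_PARTITION[part], enriched

target_dso, enriched = route_record({"dda": "DDA_RETAILER_C", "product": "4711", "qty": 5})
```

Because the partition value is a key field of the target DSO, two data delivery agreements mapped to the same partition value simply share one physical DSO, while a new partition value only requires a new entry and a new DSO of the SPO.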
Example
If you do not assign partition values, the system uploads product master data to the product acquisition DSO /DDF/DS41.
If the current data flow for master data contains a DSO with SPOs as data target, you can assign a partition value to a specific context. The system then uploads the master data to the corresponding SPO for this partition value, and you avoid having duplicate entries with different partition values for the same master data in the standard acquisition DSO.
If the master data from different retailers does not have the same structure, you can use multiple SPOs for one acquisition DSO.
To make sure that the system reads the data from the correct SPO in the acquisition layer, you assign the SPOs that store data in the acquisition layer to the standard acquisition DSO in Customizing for Cross-Application Components under Demand Data Foundation Data Model Assign Semantically Partitioned Objects to DataStore Objects .
Note
If you want to implement your own logic for how the system determines the correct SPO, you can implement BAdI: Semantic Partitioning of Master Data (/DDF/MD_ACCESS) accordingly. For more information, see Customizing for Cross-Application Components under Demand Data Foundation Business Add-Ins (BAdIs) BAdI: Semantic Partitioning of Master Data.
Additionally, in the following steps of the upload processes, the standard DSOs are accessed by SAP HANA procedures:
● Quality validation● Data enrichment● Extraction to global reporting● Data consolidation
For performance reasons, the system delegates this time-consuming part of the upload process to SAP HANA procedures. For each data delivery, the system has to tell SAP HANA which semantic partitions contain the POS data or retail panel data.
For each of the upload steps, there are SAP HANA procedure templates that contain the physical DSO from which the data is read as a parameter. If you use semantic partitioning, proceed as follows to access the correct SPO:
● For each procedure template that is contained in the SAP HANA content package sap.ddf.dsimddf, create a procedure template instance for every physical DSO of your SPO. This SPO represents the layer from which the business logic of the procedure template has to read the data.
● Implement BAdI: Stored Procedure (/DDF/CALL_PROCEDURE) to determine the physical DSO based on the data delivery agreement and the assigned partition value or a superset of partition values. Based on this DSO, the name of the procedure template instance is determined. You can assign the standard DSO to the SPO that stores the data in Customizing for Cross-Application Components under Demand Data Foundation Data Model Assign Semantically Partitioned Objects to DataStore Objects. After that, you can use the example implementation of the BAdI to determine the correct physical DSO.
For more details about the procedure templates used in SAP Demand Signal Management, version for SAP BW/4HANA and the implementation of the BAdI, see the Customizing documentation for Cross-Application Components under Demand Data Foundation Business Add-Ins (BAdIs) BAdI: Stored Procedures .
If semantic partitioning is not needed, the system calls standard procedure template instances, where the parameter of the procedure template is replaced by the standard DSO.
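Conceptually, such a BAdI implementation maps a partition value to a physical DSO and from there to a procedure template instance name. The following Python sketch illustrates only this determination logic; the DSO names, partition values, and the instance naming convention are assumptions for the example, not SAP standard content:

```python
# Hypothetical sketch of the determination logic performed by an implementation
# of BAdI: Stored Procedure. All names and the naming convention are assumed.
DSO_OF_PARTITION_VALUE = {      # physical DSOs of the SPO, per partition value
    "PART_01": "ZDS01_1",
    "PART_02": "ZDS01_2",
}
STANDARD_DSO = "DDF_DS01"       # stand-in for the standard DSO /DDF/DS01

def determine_procedure_instance(template, partition_value=None):
    """Determine the physical DSO from the assigned partition value and derive
    the name of the procedure template instance to call. Without semantic
    partitioning, the instance for the standard DSO is used."""
    dso = DSO_OF_PARTITION_VALUE.get(partition_value, STANDARD_DSO)
    return f"{template}__{dso}"

instance = determine_procedure_instance("QUALITY_VALIDATION", "PART_02")
```

The key point is the fallback: if no partition value is assigned, the standard instance is called, which matches the behavior described above for systems without semantic partitioning.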
More Information
Setting Up Semantic Partitioning [page 45]
Setting Up the Data Model by Partition Values for Transactional Data [page 47]
Setting Up the Data Model by Partition Values for Master Data [page 48]
12.1.4 Direct Update of Master Data
Concept
You can use this feature to make the changes that were made to your master data objects in data harmonization directly visible and available for reporting.
To make the changes to your products and locations available for reporting, you must transfer them to the corresponding InfoObjects.
Update of Master Data If Direct Update is Not Activated
If the direct update feature is not activated, this process runs as follows:
1. You save your changes in data harmonization.
2. A scheduled daemon from the process flow control (PFC) picks up the changes. The system uses a process definition from a data delivery agreement of the type Harmonization Data. The assigned process steps use process chains that extract the relevant changes and finally update the InfoObjects using DTPs.
Update of Master Data InfoObjects Without the Direct Update Feature
To see your changes in the InfoObjects, you must wait until the daemon has started and the process has completed successfully.
Update of Master Data If Direct Update Is Activated
If you activate the direct update of master data, the changes made in data harmonization are transferred directly to the InfoObjects without the help of the daemons from the process flow control (PFC). The update is executed by the corresponding APIs.
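The difference between the two paths can be sketched in Python. Function names and statuses below are illustrative assumptions, not the actual SAP APIs; in the real system, entries that could not be updated directly are retried by the scheduled report /DDF/BW_MD_UPD:

```python
# Hypothetical sketch of the direct update path: change pointers written by data
# harmonization are processed immediately via an API call instead of waiting for
# a PFC daemon. All names and statuses are illustrative.
change_log = [
    {"id": 1, "infoobject": "/DDF/GPID", "value": "4711", "status": "New"},
    {"id": 2, "infoobject": "/DDF/GLID", "value": "0815", "status": "New"},
]

def update_infoobject(entry):
    """Stand-in for the API-based InfoObject update; in reality this can fail,
    for example because of a lock conflict."""
    return True

def direct_update(log):
    """Process all new change pointers right away. Entries that fail keep their
    status so that a periodically scheduled run can pick them up again."""
    processed = 0
    for entry in log:
        if entry["status"] == "New" and update_infoobject(entry):
            entry["status"] = "Processed"
            processed += 1
    return processed

count = direct_update(change_log)
```

Without direct update, the same change pointers would wait until the scheduled daemon triggers the process chains and DTPs; with it, the harmonized values become visible in reporting as soon as the save completes.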
Direct Update Using APIs
To avoid conflicts with the data upload (for example, File -> Acquisition DSO -> Propagation DSO -> InfoObject), the system must not use DTPs but APIs for both processes. The APIs are encapsulated in an SAP BW process type, /DDF/ATTR (Update InfoObject), which replaces, in the corresponding process chains, the DTPs that are used when direct update is not active.
Update of Master Data InfoObjects Using the Direct Update Feature
The following InfoObjects and attributes with texts are updated:
● Internal Source Product (/DDF/PRODUCT)
● Product (/DDF/GPID)
○ Brand (/DDF/BRAG_TXT)
○ Manufacturer (/DDF/MANG_TXT)
○ Category (/DDF/PCAG_TXT)
○ Subcategory (/DDF/PSCG_TXT)
● Global Product (/DDF/GMPID)
○ Global Brand (/DDF/BRAGM_TXT)
○ Global Manufacturer (/DDF/MANGM_TXT)
○ Global Category (/DDF/PCAGM_TXT)
○ Global Subcategory (/DDF/PSCGM_TXT)
● Internal Source Location (/DDF/LOCATION)
● Location (/DDF/GLID)
● Global Location (/DDF/GMLID)
Prerequisites
You have activated the direct update of master data. For more information, see Activating the Direct Update of Master Data [page 54].
More Information
Update InfoObjects from Change Log [page 70]
Migrate Master Data Attributes [page 71]
Examples of Process Definitions for Uploading Point of Sale Data [page 64]
Examples of Process Definitions for Uploading Market Research Data [page 66]
12.1.4.1 Activating the Direct Update of Master Data
Use
The activation process comprises the following mandatory steps:
1. Ensure that you have defined a data delivery agreement of the type Harmonization Data and that this data delivery agreement has been activated. For more information, see Define Data Delivery Agreement for Harmonization Data [page 56].
2. Ensure that you have run report /DDF/FDH_TRANSFER_CONV_EXIT, as described in SAP note 2044743 . It is important that you enter the new conversion sequences DALOI_INPUT, DALOI, and GTINF_INPUT (DALOI_INPUT, DAL10, and GLNF1_INPUT) for the field mapping of the source product/source location in the mapping instruction sets for the object type FDH_PROD/FDH_LOC. For a detailed description, see SAP note 2036128 . The required adaptation of the field mappings is described in step 8 of this list.
3. Ensure that the following BAdI implementations are active:
○ Check in Customizing under Demand Data Foundation Data Harmonization Business Add-Ins (BAdIs) BAdI: Logic During Save in Data Harmonization that the implementation /DDF/BW_MD_UPD has the Active in IMG Activity checkbox selected.
○ Check in Customizing under Demand Data Foundation Data Harmonization Business Add-Ins (BAdIs) BAdI: Additional Attribute Checks and Input Help that the BAdI implementations /DDF/BADI_FDH_BO_LOC_BW and /DDF/BADI_FDH_BO_PROD_BW have the Active in IMG Activity checkbox selected.
If you have other active implementations of these BAdIs, deactivate them in Customizing. Alternatively, you can copy the code of the above default implementations into your own implementations. Do not deactivate the code of the default implementations.
4. Assign mapping instructions to data origins. For more information, see Assign Mapping Instructions to Data Origins [page 57].
5. Create number range intervals. For more information, see Create Number Range Intervals [page 58].
6. Define InfoObject-specific settings. For more information, see Define InfoObject-Specific Settings [page 59].
7. Define the general settings for data harmonization in Customizing under Cross-Application Components Demand Data Foundation Data Harmonization General Settings . Enter the interval for the direct
update ID that you have created before. If you do not have any specific requirements, you can use the values you defined for your data delivery agreement of the agreement type Harmonization Data.
8. Define settings for the update of master data. For more information, see Define General Settings for the Update of Master Data [page 61].
Note
In the case of a new installation of SAP Demand Signal Management, version for SAP BW/4HANA, you can skip the steps 11 to 18 and start directly with the execution of the report Update InfoObjects from Change Log.
9. Adapt field mappings for specific SAP BW attributes. For more information, see Adapt Field Mappings for Specific Attributes [page 61].
10. Execute transaction Check Customizing for Data Model (/DDF/BW_CUST_CHECK) to ensure that your Customizing settings are consistent.
11. Execute the transaction /DDF/MD_MIGRATE_ATTR to upload those attributes to data harmonization that are only available in SAP BW. For more information, see Migrate Master Data Attributes [page 71].
12. Replace steps in process definitions. For more information, see Replace Steps in Process Definitions [page 63].
13. Deactivate steps of process definitions. For more information, see Deactivate Steps of Process Definition [page 64].
14. Stop the following standard jobs:
○ Create Harm. Data Delivery ID
○ Release Harm. Data Delivery ID
15. If you have data delivery agreements for master data upload with the Disable Harmonization checkbox selected, deselect the checkbox for those data delivery agreements. The direct update of master data is based on data harmonization and does not work if harmonization is disabled.
16. Wait until all processes in the process flow control (PFC) which had been started before the process definitions were changed, are finished. This way you can ensure that all DTPs for updating SAP BW master data are finished.
17. Execute the report /DDF/FDH_SET_CHG_LOG2PROCESSED. This report sets the status of the change pointers of all completed data deliveries to Processed.
18. Release those data deliveries of type Harmonization Data that still have the status New by running the report /DDF/FDH_RELEASE_DELIVERIES.
19. Schedule the report /DDF/BW_MD_UPD (Update InfoObjects from Change Log [page 70]) to switch on the direct update. Select the Save Message Log checkbox. As the update of the SAP BW InfoObjects for
products and locations was stopped when the corresponding process chains were removed from the process definitions, this report processes all change pointers that have been created since then and carries out the pending updates. Only those changes that have not been transferred to SAP BW yet are processed because the report /DDF/FDH_SET_CHG_LOG2PROCESSED has set the change log entries to Processed. Therefore, you should expect a longer run time for the initial run of this report.
20. After the report Update InfoObjects from Change Log has finished, check the application log (object /DDF/FDH, subobject /DDF/FDH_BW_MD_UPD).
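The selection logic of this initial run can be sketched as follows. This is an illustrative Python model, not the actual ABAP implementation; the record fields and status values are simplified.

```python
# Illustrative sketch of the change-pointer selection used by the
# report /DDF/BW_MD_UPD: only entries that are not yet "Processed" are
# picked up, so pointers handled earlier by
# /DDF/FDH_SET_CHG_LOG2PROCESSED are skipped. The record structure
# here is invented, not the real DDIC structure.

def select_unprocessed(change_log):
    """Return the change pointers that still need a direct update."""
    return [cp for cp in change_log if cp["status"] != "Processed"]

change_log = [
    {"id": 1, "object": "PRODUCT",  "status": "Processed"},  # handled before cut-over
    {"id": 2, "object": "PRODUCT",  "status": "New"},        # pending update
    {"id": 3, "object": "LOCATION", "status": "New"},        # pending update
]

pending = select_unprocessed(change_log)
print([cp["id"] for cp in pending])  # -> [2, 3]
```

The initial run is slow simply because every pointer created since the process chains were removed is still unprocessed and therefore selected.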
Activities After the Direct Update Has Been Activated
1. Remove the step that contains the process chain /DDF/MD_EXTRACT from the process definition you use in the data delivery agreement of the type Harmonization Data. Once the direct update is active, the upload processes for that process definition are no longer triggered automatically; they have to be scheduled instead. After removing this step, only one step with the process chain /DDF/MD_STAGE remains in the process definition. This process chain also updates the quantity conversion DSO, which cannot be updated directly.
2. Schedule the process chain /DDF/MD_EXTRACT for periodic execution. We recommend scheduling this process chain no more than once per hour for the harmonization data deliveries. The process chain /DDF/MD_EXTRACT creates process flow control instances of your data delivery agreement of the type Harmonization Data and triggers the process chain /DDF/MD_STAGE as an automatic follow-up step.
3. Schedule the report /DDF/BW_MD_UPD for periodic execution. This report processes all change pointers that the direct update could not process successfully, for example, because of locking. Note that this report only sets the status of a change pointer to Processed if the extraction of the master data record to the corresponding propagation DSO has been carried out successfully by the process chain /DDF/MD_STAGE.
4. Schedule the report /DDF/MD_BW_CHGLOG_DELETE for periodic execution. This report cleans up SAP BW change pointers. The corresponding change pointers from data harmonization have already been cleaned up at this point.
5. Consider switching on the direct deletion of master data. This is done in Customizing for the data model under Cross-Application Components Demand Data Foundation Data Model General Settings . Only switch on the direct deletion if you do not want to archive harmonized master data that has been marked for deletion and if you have not used one of the following InfoObjects in your own InfoProviders:
○ /DDF/GPID
○ /DDF/GLID
○ /DDF/GMPID
○ /DDF/GMLID
Note
The runtime of the report /DDF/BW_MD_UPD increases if you switch on the direct deletion of master data.
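The two conditions for switching on direct deletion can be summarized in a small decision helper. This is an illustrative sketch only; the actual decision is a manual Customizing setting, and the function name is invented.

```python
# Illustrative decision helper for the direct-deletion switch described
# in step 5 above. Direct deletion is only safe if no archiving of
# deleted harmonized master data is wanted and none of the
# harmonized-ID InfoObjects are used in customer-defined InfoProviders.

PROTECTED_INFOOBJECTS = {"/DDF/GPID", "/DDF/GLID", "/DDF/GMPID", "/DDF/GMLID"}

def may_enable_direct_deletion(archiving_wanted, custom_infoobjects):
    """Return True if both preconditions for direct deletion hold."""
    uses_protected = bool(PROTECTED_INFOOBJECTS & set(custom_infoobjects))
    return not archiving_wanted and not uses_protected

print(may_enable_direct_deletion(False, ["/DDF/PRODUCT"]))  # -> True
print(may_enable_direct_deletion(False, ["/DDF/GMPID"]))    # -> False
```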
More Information
Direct Update of Master Data [page 51]
Update InfoObjects from Change Log [page 70]
Examples of Process Definitions for Uploading Market Research Data [page 66]
Examples of Process Definitions for Uploading Point of Sale Data [page 64]
12.1.4.1.1 Define Data Delivery Agreement for Harmonization Data
If you have carried out a new installation of SAP Demand Signal Management, version for SAP BW/4HANA, set up a data delivery agreement of the type Harmonization Data as follows:
1. Create a new process definition for data harmonization in Customizing for Demand Data Foundation under Data Upload Define Processes and Steps . The new process definition must have two steps: one referring to the process chain /DDF/MD_EXTRACT and the other referring to the process chain /DDF/MD_STAGE.
2. Create a new data origin for data harmonization in the area menu under Administrator Data Upload Data Delivery Agreements Define Data Origins .
3. Create a new data provider for data harmonization in the area menu under Administrator Data Upload Data Delivery Agreements Define Data Providers .
4. Create a new context for data harmonization in the area menu under Administrator Data Upload Data Delivery Agreements Define Contexts .
5. Create a new data delivery agreement for data harmonization in the area menu under Administrator Data Upload Data Delivery Agreements Define Data Delivery Agreements with the following specifics:
○ Agreement type: Harmonization Data
○ Data format: Tables
○ Data upload method: Manual Upload – Harmonization
○ Data origin, data provider, context, and process definition, as defined in the previous steps
6. Activate the data delivery agreement.
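The agreement assembled in the steps above can be pictured as one record that combines the fixed specifics with the objects defined earlier. The following Python sketch is purely illustrative; the dict structure, function, and example names are invented and do not correspond to a real DDF API.

```python
# Hypothetical representation of a data delivery agreement of the type
# Harmonization Data. The fixed field values follow the specifics
# documented above; everything else is illustrative.

def build_harmonization_dda(origin, provider, context, process_definition):
    """Assemble the agreement with the fixed specifics for harmonization data."""
    return {
        "agreement_type": "Harmonization Data",
        "data_format": "Tables",
        "data_upload_method": "Manual Upload - Harmonization",
        "data_origin": origin,
        "data_provider": provider,
        "context": context,
        "process_definition": process_definition,
        "active": True,  # step 6: activate the data delivery agreement
    }

# Example names are invented placeholders.
dda = build_harmonization_dda("HARM_ORIGIN", "HARM_PROVIDER",
                              "HARM_CONTEXT", "ZHARM_PD")
print(dda["agreement_type"])  # -> Harmonization Data
```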
12.1.4.1.2 Assign Mapping Instructions to Data Origins
Use transaction /DDF/FDH_SETUP to assign mapping instructions for the default origin (initial value) of the following mapping instruction sets:
Object Type Mapping Instruction Set
FDH_LOC SAP_EXP_HARM_OBJECT_GLOBAL
FDH_LOC SAP_EXP_HARM_OBJECT_LOCAL
FDH_LOC SAP_EXP_SOURCE_OBJECT
FDH_LOC SAP_BW_LOCAL_NVP_AFTER_FDH
FDH_PROD SAP_EXP_HARM_OBJECT_LOCAL
FDH_PROD SAP_EXP_BRAND_LOCAL
FDH_PROD SAP_EXP_HARM_OBJECT_GLOBAL
FDH_PROD SAP_BW_LOCAL_NVP_AFTER_FDH
FDH_PROD SAP_EXP_MANUFCT_GLOBAL
FDH_PROD SAP_EXP_MANUFCT_LOCAL
FDH_PROD SAP_EXP_MATLGRP_LOCAL
FDH_PROD SAP_EXP_PRODCAT_GLOBAL
FDH_PROD SAP_EXP_PRODCAT_LOCAL
FDH_PROD SAP_EXP_PRODSCAT_GLOBAL
FDH_PROD SAP_EXP_PRODSCAT_LOCAL
FDH_PROD SAP_EXP_SOURCE_BRAND
FDH_PROD SAP_EXP_SOURCE_MANUFCT
FDH_PROD SAP_EXP_SOURCE_MATLGRP
FDH_PROD SAP_EXP_SOURCE_OBJECT
FDH_PROD SAP_EXP_SOURCE_PRODCAT
FDH_PROD SAP_EXP_SOURCE_PRODSCAT
If you have carried out a new installation of SAP Demand Signal Management, version for SAP BW/4HANA, assign mapping instructions for the default origin to the following mapping instruction sets:
Object Type Mapping Instruction Set
FDH_LOC SAP_BW_HARMONIZE_LOCAL_RECORD
FDH_LOC SAP_BW_IMP_SOURCE_OBJECT_GM
FDH_LOC SAP_BW_LOCAL_RECORD_AFTER_FDH
FDH_PROD SAP_BW_HARMONIZE_LOCAL_RECORD
FDH_PROD SAP_BW_IMP_SOURCE_OBJECT_GM
FDH_PROD SAP_BW_LOCAL_RECORD_AFTER_FDH
12.1.4.1.3 Create Number Range Intervals
Create number range intervals with the ID 01 for the following number range objects:
Number Range Object | Where?
/DDF/ATTR | Customizing under Cross-Application Components Demand Data Foundation Data Model Define Number Range for Attribute IDs of Length 10
/DDF/ATT18 | Customizing under Cross-Application Components Demand Data Foundation Data Model Define Number Range for Attribute IDs of Length 18
/DDF/ATT9 | Customizing under Cross-Application Components Demand Data Foundation Data Model Define Number Range for Attribute IDs of Length 9
/DDF/FDHDI | Customizing under Cross-Application Components Demand Data Foundation Data Harmonization Technical Settings Define Number Ranges for Direct Update ID
12.1.4.1.4 Define InfoObject-Specific Settings
Ensure that the following entries are available in the Customizing activity InfoObject-Specific Settings for Direct Update. For more information, see the Customizing for Data Model under Cross-Application Components Demand Data Foundation Data Model InfoObject-Specific Settings for Direct Update .
InfoProvider | Object Type | Mapping Instruction Set | Number Range Object | Number Range | Text InfoObject
/DDF/BRANDG | FDH_PROD | SAP_EXP_BRAND_LOCAL | /DDF/ATT18 | 01 | /DDF/BRAG_TXT
/DDF/BRANDGM | FDH_PROD | SAP_EXP_BRAND_GLOBAL | /DDF/ATT18 | 01 | /DDF/BRAGM_TXT
/DDF/BRANDL | FDH_PROD | SAP_EXP_SOURCE_BRAND | /DDF/ATT18 | 01 | /DDF/BRAND_TXT
/DDF/GLID | FDH_LOC | SAP_EXP_HARM_OBJECT_LOCAL | - | - | -
/DDF/GMLID | FDH_LOC | SAP_EXP_HARM_OBJECT_GLOBAL | - | - | -
/DDF/GMPID | FDH_PROD | SAP_EXP_HARM_OBJECT_GLOBAL | - | - | -
/DDF/GPID | FDH_PROD | SAP_EXP_HARM_OBJECT_LOCAL | - | - | -
/DDF/LOCATION | FDH_LOC | SAP_EXP_SOURCE_OBJECT | - | - | -
/DDF/MANUFACTG | FDH_PROD | SAP_EXP_MANUFCT_LOCAL | /DDF/ATTR | 01 | /DDF/MANG_TXT
/DDF/MANUFACTL | FDH_PROD | SAP_EXP_SOURCE_MANUFCT | /DDF/ATTR | 01 | /DDF/MANUF_TXT
/DDF/MANUFCTGM | FDH_PROD | SAP_EXP_MANUFCT_GLOBAL | /DDF/ATTR | 01 | /DDF/MANGM_TXT
/DDF/MATL_GRPG | FDH_PROD | SAP_EXP_MATLGRP_LOCAL | /DDF/ATT9 | 01 | /DDF/MGRG_TXT
/DDF/MATL_GRPL | FDH_PROD | SAP_EXP_SOURCE_MATLGRP | /DDF/ATT9 | 01 | /DDF/MATGR_TXT
/DDF/PRODCATG | FDH_PROD | SAP_EXP_PRODCAT_LOCAL | /DDF/ATT18 | 01 | /DDF/PCAG_TXT
/DDF/PRODCATGM | FDH_PROD | SAP_EXP_PRODCAT_GLOBAL | /DDF/ATT18 | 01 | /DDF/PCAGM_TXT
/DDF/PRODCATL | FDH_PROD | SAP_EXP_SOURCE_PRODCAT | /DDF/ATT18 | 01 | /DDF/PCAT_TXT
/DDF/PRODSCATG | FDH_PROD | SAP_EXP_PRODSCAT_LOCAL | /DDF/ATT18 | 01 | /DDF/PSCG_TXT
/DDF/PRODSCATL | FDH_PROD | SAP_EXP_SOURCE_PRODSCAT | /DDF/ATT18 | 01 | /DDF/PSCAT_TXT
/DDF/PRODSCTGM | FDH_PROD | SAP_EXP_PRODSCAT_GLOBAL | /DDF/ATT18 | 01 | /DDF/PSCGM_TXT
/DDF/PRODUCT | FDH_PROD | SAP_EXP_SOURCE_OBJECT | - | - | -
Note
These entries for attribute InfoObjects are necessary if you use the standard content version of these InfoObjects that contains texts. Consequently, a direct update of these text tables is required.
If the texts are stored as keys of these InfoObjects, without updating text tables in addition, those entries are not required. As a minimum, the following settings are required:
InfoProvider | Object Type | Mapping Instruction Set | Number Range Object | Number Range | Text InfoObject
/DDF/GMLID | FDH_LOC | SAP_EXP_HARM_OBJECT_GLOBAL | - | - | -
/DDF/GLID | FDH_LOC | SAP_EXP_HARM_OBJECT_LOCAL | - | - | -
/DDF/LOCATION | FDH_LOC | SAP_EXP_SOURCE_OBJECT | - | - | -
/DDF/GMPID | FDH_PROD | SAP_EXP_HARM_OBJECT_GLOBAL | - | - | -
/DDF/GPID | FDH_PROD | SAP_EXP_HARM_OBJECT_LOCAL | - | - | -
/DDF/PRODUCT | FDH_PROD | SAP_EXP_SOURCE_OBJECT | - | - | -
12.1.4.1.5 Define General Settings for the Update of Master Data
Define the settings for the update of master data in Customizing under Cross-Application Components Demand Data Foundation Data Model General Settings . The settings you make here control the mass processing of master data change pointers.
Field | Recommendation
Maximum Package Size in Dialog | Set this to a number similar to the maximum number of records that result from a search in one of the UIs in data harmonization and that are changed together. A typical number would be 200.
Maximum Package Size in Background | Set this to a number similar to the package size of the DTPs that upload master data from the acquisition layer to the propagation layer when the direct update is not activated. This package size should be smaller than the default package size of other DTPs in SAP BW. A typical number would be 5,000.
Maximum Block Size | Set a block size that is smaller than the Maximum Package Size in Background multiplied by the number of parallel processes. Different blocks of change pointers are processed sequentially, while the change pointers within a block are divided into packages that are processed in parallel.
Number of Parallel Processes | Set this to a number similar to the number of parallel processes configured for the execution of DTPs in the SAP BW batch manager. This number depends on the overall number of background processes available in your system.
Waiting Time in Seconds | Allow for a sufficient waiting time. This helps to minimize locking issues that may occur because of concurrent direct updates of the same master data record in SAP BW. Note that master data activation locks an InfoObject globally. A typical waiting time would be 20 seconds.
Direct Deletion | Choose whether the system directly deletes SAP BW master data that is no longer used. Do not switch on the direct deletion of master data before activating the direct update feature.
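The relationship between these settings can be expressed as a small consistency check. This is an illustrative sketch only; the function and its warning texts are not part of the product.

```python
# Sketch of a consistency check over the mass-processing settings
# recommended above. Field names and the check itself are illustrative;
# in the system these are manual Customizing entries.

def check_settings(pkg_dialog, pkg_background, block_size, parallel, wait_s):
    """Return warnings for settings that contradict the recommendations."""
    warnings = []
    # Recommendation: block size < background package size x parallel processes
    if block_size >= pkg_background * parallel:
        warnings.append("Block size should be smaller than "
                        "package size x parallel processes.")
    if wait_s <= 0:
        warnings.append("Allow a waiting time (e.g. 20 s) to reduce "
                        "lock conflicts during master data activation.")
    return warnings

# Values following the typical recommendations: 200 / 5,000 / 20 s
print(check_settings(200, 5000, 15000, 4, 20))  # -> []
print(check_settings(200, 5000, 30000, 4, 0))   # -> two warnings
```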
12.1.4.1.6 Adapt Field Mappings for Specific Attributes
If you use the direct update, all attributes of /DDF/PRODUCT and /DDF/LOCATION must be passed to data harmonization. Otherwise, the system initializes the attributes that are not available in data harmonization when updating the InfoObjects.
Therefore, you have to adapt the existing field mappings for all attributes that are available in SAP Business Warehouse but not in data harmonization. For more information, see the Customizing documentation under Cross-Application Components Demand Data Foundation Data Harmonization Technical Settings Define Object Types .
The following tables list the attributes that must be adapted.
Field Mapping for Object Type FDH_PROD
Mapping Instruction Set SAP_BW_HARMONIZE_LOCAL_RECORD
Define Mapping Instruction SAP_BW_HARMONIZE_LOCAL_RECORD
Counter | Mapping Type | Import Field | Field Name | Conversion Sequence | Export Field
104 | Standard Mapping | /1DD/S_HNUMLVL | HNUMLVL | - | /1DD/S_HNUMLVL
109 | Standard Mapping | /1DD/S_PHLVL | PHLVL | - | /1DD/S_PHLVL
111 | Standard Mapping | /1DD/S_TXT_LONG | DESCRIPTION_LONG | - | /1DD/S_TXT_LONG
112 | No Mapping | /1DD/S_TXTLG1 | - | - | -
113 | No Mapping | /1DD/S_TXTLG2 | - | - | -
114 | Standard Mapping | - | DESCRIPTION_LONG | GETCHAR_1_TO_60 | /1DD/S_TXTLG1
115 | Standard Mapping | - | DESCRIPTION_LONG | GETCHAR_61_TO_120 | /1DD/S_TXTLG2
116 | Standard Mapping | GROSS_WT | GROSS_WT | - | GROSS_WT
118 | Standard Mapping | TXTMD | DESCRIPTION_MEDIUM | - | TXTMD
119 | Standard Mapping | TXTSH | DESCRIPTION_SHORT | - | TXTSH
125 | Standard Mapping | /1DD/S_PRODLEVEL | PRODLEVEL | - | /1DD/S_PRODLEVEL
Field Mapping for Object Type FDH_LOC
Mapping Instruction Set SAP_BW_HARMONIZE_LOCAL_RECORD
Define Mapping Instruction SAP_BW_HARMONIZE_LOCAL_RECORD
Counter | Mapping Type | Import Field | Field Name | Conversion Sequence | Export Field
20 | Standard Mapping | /1DD/S_LHLVL | LHLVL | - | /1DD/S_LHLVL
21 | Standard Mapping | /1DD/S_HNUMLVL | HNUMLVL | - | /1DD/S_HNUMLVL
105 | Standard Mapping | /1DD/S_DLVLOC | DLVLOC_SOURCE | - | /1DD/S_DLVLOC
111 | Standard Mapping | /1DD/S_LOC_GUID | LOC_GUID | - | /1DD/S_LOC_GUID
112 | Standard Mapping | /1DD/S_MRCOUNTRY | MRCOUNTRY | - | /1DD/S_MRCOUNTRY
113 | Standard Mapping | /1DD/S_SALESAREA | SALESAREA | - | /1DD/S_SALESAREA
114 | Standard Mapping | /1DD/S_UN_SAREA | UN_SAREA | - | /1DD/S_UN_SAREA
115 | Standard Mapping | /1DD/S_TXT_LONG | DESCRIPTION_LONG | - | /1DD/S_TXT_LONG
116 | No Mapping | /1DD/S_TXTLG1 | - | - | -
117 | No Mapping | /1DD/S_TXTLG2 | - | - | -
118 | Standard Mapping | - | DESCRIPTION_LONG | GETCHAR_1_TO_60 | /1DD/S_TXTLG1
119 | Standard Mapping | - | DESCRIPTION_LONG | GETCHAR_61_TO_120 | /1DD/S_TXTLG2
120 | Standard Mapping | TXTMD | DESCRIPTION_MEDIUM | - | TXTMD
121 | Standard Mapping | TXTSH | DESCRIPTION_SHORT | - | TXTSH
Note
The counter can differ depending on your Customizing settings.
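The two conversions GETCHAR_1_TO_60 and GETCHAR_61_TO_120 used in the mappings above split the long description into the two 60-character target fields /1DD/S_TXTLG1 and /1DD/S_TXTLG2. The following Python sketch mirrors that behavior; the function names follow the conversion IDs, but the implementation is illustrative.

```python
# Illustrative sketch of the GETCHAR conversions: a long description
# (up to 120 characters) is split into two 60-character target fields.

def getchar_1_to_60(description_long):
    """Characters 1-60 of the long description."""
    return description_long[:60]

def getchar_61_to_120(description_long):
    """Characters 61-120 of the long description."""
    return description_long[60:120]

text = "A" * 60 + "B" * 60
print(getchar_1_to_60(text) == "A" * 60)    # -> True
print(getchar_61_to_120(text) == "B" * 60)  # -> True
```

A description shorter than 61 characters simply leaves the second field empty.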
12.1.4.1.7 Replace Steps in Process Definitions
Use
If you use the standard process chains in SAP Demand Signal Management, version for SAP BW/4HANA, replace all steps in your process definitions that point to the following process chains with new process chains:
Old Process Chain New Process Chain
/DDF/PROD_STAGE /DDF/PROD_STAGE_2
/DDF/LOC_STAGE /DDF/LOC_STAGE_2
/DDF/GMPID_STAGE /DDF/GMPID_STAGE_2
/DDF/GMLID_STAGE /DDF/GMLID_STAGE_2
Alternatively, you can create new process definitions. However, if you decide to do so, you have to change the process definition in all your data delivery agreements.
More Information
Examples of Process Definitions for Uploading Point of Sale Data [page 64]
Examples of Process Definitions for Uploading Market Research Data [page 66]
12.1.4.1.8 Deactivate Steps of Process Definition
Deactivate all steps of the process definition you use in the data delivery agreements that point to the following process chains:
● /DDF/GLID_EXTRACT
● /DDF/GPID_EXTRACT
● /DDF/LOC_EXTRACT
● /DDF/PROD_EXTRACT
● /DDF/GMPID_EXTRACT
● /DDF/GMLID_EXTRACT
● /DDF/GMPROD_EXTRACT
● /DDF/GMLOC_EXTRACT
These steps are no longer needed once the direct update is active.
Master data propagation DSOs cannot be updated directly, but they are still required by certain applications. Therefore, you have to add the following steps to this process definition in the data delivery agreements of the type Harmonization Data:
● /DDF/MD_EXTRACT
● /DDF/MD_STAGE
Note
The exact way of changing the process definition of your data delivery agreement of type Harmonization Data depends on whether or not you have used semantic partitioning for your master data. If you use semantically partitioned objects (SPOs) instead of the DSOs /DDF/DS31 and /DDF/DS51, you should create multiple copies of the process chain /DDF/MD_STAGE, depending on the number of semantic partitions, and multiple steps in the corresponding process definition.
To enable the system to choose the correct step at runtime, you must implement the BAdI /DDF/BADI_ADU_STEP.
Do not deactivate the data delivery agreement of type Harmonization Data.
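The routing that a /DDF/BADI_ADU_STEP implementation has to provide can be sketched as follows. This is a hypothetical Python model of the BAdI logic, not ABAP; the partition keys, step numbers, and process chain copy names are invented for illustration.

```python
# Hypothetical sketch of the runtime step selection when SPOs are used:
# pick the process-definition step whose copy of /DDF/MD_STAGE matches
# the semantic partition of the current delivery.

STEP_BY_PARTITION = {
    "EUROPE":   101,  # step running an example copy such as /DDF/MD_STAGE_EU
    "AMERICAS": 102,  # step running an example copy such as /DDF/MD_STAGE_AM
}

def choose_step(partition_key):
    """Return the process-definition step for the delivery's partition."""
    try:
        return STEP_BY_PARTITION[partition_key]
    except KeyError:
        raise ValueError(f"No step configured for partition {partition_key}")

print(choose_step("EUROPE"))  # -> 101
```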
12.1.4.1.9 Examples of Process Definitions for Uploading Point of Sale Data
The following examples show process definitions that define the required steps for uploading point of sale data. These process definitions are defined in Customizing. For more information, see Customizing for Data Upload under Cross-Application Components Demand Data Foundation Data Upload Define Processes and Steps .
The first table shows an example of a process definition when the direct update of master data is not active. The second table shows an example after the direct update has been activated.
Process Definition When Direct Update Is Not Active
Step | Description | Parallelize Group | Process Chain ID | Upload Stage
101 | Location Acquisition | MD1A | /DDF/LOC_FILE | Data Acquisition
102 | Product Acquisition | MD1A | /DDF/PROD_FILE |
103 | Sales Acquisition | MD1A | /DDF/SALES_FILE |
104 | Stock Acquisition | MD1A | /DDF/STOCK_FILE |
150 | Quality Validation | n/a | /DDF/QV_PERFORM | Quality Validation
161 | Upload Additional Locations | MD1X | /DDF/LOC_ADD | Master data creation during Data Acquisition. Steps 161 and 162 are only required if the delivery includes new products or new locations for which no master data has been created yet.
162 | Upload Additional Products | MD1X | /DDF/PROD_ADD |
201 | Location Staging | MD2A | /DDF/LOC_STAGE | Data Propagation
202 | Product Staging | MD2A | /DDF/PROD_STAGE |
203 | Sales Staging | TD2 | /DDF/SALES_STAGE |
204 | Stock Staging | TD2 | /DDF/STOCK_STAGE |
205 | Sales Enrichment | TD3 | /DDF/SALES_ENRICHMT_STAGE | Data Enrichment
206 | Stock Enrichment | TD3 | /DDF/STOCK_ENRICHMT_STAGE |
Process Definition When Direct Update Is Active
Step | Description | Parallelize Group | Process Chain ID | Upload Stage
101 | Location Acquisition | MD1A | /DDF/LOC_FILE | Data Acquisition
102 | Product Acquisition | MD1A | /DDF/PROD_FILE |
103 | Sales Acquisition | MD1A | /DDF/SALES_FILE |
104 | Stock Acquisition | MD1A | /DDF/STOCK_FILE |
150 | Quality Validation | n/a | /DDF/QV_PERFORM | Quality Validation
161 | Upload Additional Locations | MD1X | /DDF/LOC_ADD | Master data creation during Data Acquisition. Steps 161 and 162 are only required if the delivery includes new products or new locations for which no master data has been created yet.
162 | Upload Additional Products | MD1X | /DDF/PROD_ADD |
201 | Location Staging | MD2A | /DDF/LOC_STAGE_2 | Data Propagation
202 | Product Staging | MD2A | /DDF/PROD_STAGE_2 |
203 | Sales Staging | TD2 | /DDF/SALES_STAGE |
204 | Stock Staging | TD2 | /DDF/STOCK_STAGE |
205 | Sales Enrichment | TD3 | /DDF/SALES_ENRICHMT_STAGE | Data Enrichment
206 | Stock Enrichment | TD3 | /DDF/STOCK_ENRICHMT_STAGE |
12.1.4.1.10 Examples of Process Definitions for Uploading Market Research Data for Global Reporting
The following examples show process definitions that define the required steps for uploading market research data for global reporting. These process definitions are defined in Customizing. For more information, see Customizing for Data Upload under Cross-Application Components Demand Data Foundation Data Upload Define Processes and Steps .
The first table shows an example of a process definition when the direct update of master data is not active. The second table shows an example after the direct update has been activated.
Process Definition When Direct Update Is Not Active
Step | Description | Parallelize Group | Process Chain ID | Upload Stage
101 | Hierarchy Metadata Extraction | | /DDF/HIER_MR_EXTRACT | Data Acquisition
102 | Hierarchy Level Metadata Extraction | | /DDF/HIER_LVL_MR_EXTRACT |
103 | Location Hierarchy Extraction | HD1 | /DDF/LOC_HIER_MR_EXTRACT |
104 | Product Hierarchy Extraction | HD1 | /DDF/PROD_HIER_MR_EXTRACT |
111 | Location Extraction | MD12 | /DDF/LOC_MR_EXTRACT |
112 | Product Extraction | MD1 | /DDF/PROD_MR_EXTRACT |
113 | Location Name Value Extraction | MD2 | /DDF/LOC_NV_MR_EXTRACT |
114 | Product Name Value Extraction | MD2 | /DDF/PROD_NV_MR_EXTRACT |
120 | Time String Extraction | | /DDF/MRTIME_FILE |
130 | Retail Panel Extraction | | /DDF/MRTPN_FILE |
198 | Location Staging | MD3 | /DDF/LOC_STAGE | Data Propagation
199 | Product Staging | MD3 | /DDF/PROD_STAGE |
201 | Location Hierarchy Staging | HD2 | /DDF/LOC_HIER_STAGE |
202 | Product Hierarchy Staging | HD2 | /DDF/PROD_HIER_STAGE |
213 | Location Name Value Staging | MD4 | /DDF/LOC_NV_STAGE |
214 | Product Name Value Staging | MD4 | /DDF/PROD_NV_STAGE |
220 | Time String Staging | | /DDF/MRTIME_STAGE |
230 | Retail Panel Staging | | /DDF/MRTPN_STAGE |
350 | Quality Validation Before Global Layer | | /DDF/GMQV_PERFORM | Quality Validation
401 | Upload Locations for Global Reporting | MD7 | /DDF/GMLID_STAGE | Data Propagation
402 | Upload Products for Global Reporting | MD7 | /DDF/GMPID_STAGE |
430 | Global Retail Panel Staging | | /DDF/GMRTPN_STAGE |
Process Definition When Direct Update Is Active
Step | Description | Parallelize Group | Process Chain ID | Upload Stage
101 | Hierarchy Metadata Extraction | | /DDF/HIER_MR_EXTRACT | Data Acquisition
102 | Hierarchy Level Metadata Extraction | | /DDF/HIER_LVL_MR_EXTRACT |
103 | Location Hierarchy Extraction | HD1 | /DDF/LOC_HIER_MR_EXTRACT |
104 | Product Hierarchy Extraction | HD1 | /DDF/PROD_HIER_MR_EXTRACT |
111 | Location Extraction | MD12 | /DDF/LOC_MR_EXTRACT |
112 | Product Extraction | MD1 | /DDF/PROD_MR_EXTRACT |
113 | Location Name Value Extraction | MD2 | /DDF/LOC_NV_MR_EXTRACT |
114 | Product Name Value Extraction | MD2 | /DDF/PROD_NV_MR_EXTRACT |
120 | Time String Extraction | | /DDF/MRTIME_FILE |
130 | Retail Panel Extraction | | /DDF/MRTPN_FILE |
198 | Location Staging | MD3 | /DDF/LOC_STAGE_2 | Data Propagation
199 | Product Staging | MD3 | /DDF/PROD_STAGE_2 |
201 | Location Hierarchy Staging | HD2 | /DDF/LOC_HIER_STAGE |
202 | Product Hierarchy Staging | HD2 | /DDF/PROD_HIER_STAGE |
220 | Time String Staging | | /DDF/MRTIME_STAGE |
230 | Retail Panel Staging | | /DDF/MRTPN_STAGE |
350 | Quality Validation Before Global Layer | | /DDF/GMQV_PERFORM | Quality Validation
401 | Upload Locations for Global Reporting | MD7 | /DDF/GMLID_STAGE_2 | Data Propagation
402 | Upload Products for Global Reporting | MD7 | /DDF/GMPID_STAGE_2 |
430 | Global Retail Panel Staging | | /DDF/GMRTPN_STAGE |
12.1.4.1.11 Examples of Process Definitions for Uploading Master Data from Harmonization
The following examples show process definitions that define the required steps for uploading master data from data harmonization. These process definitions are defined in Customizing. For more information, see Customizing for Data Upload under Cross-Application Components Demand Data Foundation Data Upload Define Processes and Steps .
The first table shows an example of a process definition when the direct update of master data is not active. The second table shows an example after the direct update has been activated.
Process Definition When Direct Update Is Not Active
Step | Description | Parallelize Group | Process Chain ID | Upload Stage
101 | Extract Harmonized Location | MD1 | /DDF/GLID_EXTRACT |
102 | Extract Harmonized Product | MD1 | /DDF/GPID_EXTRACT |
103 | Extract Harmonized Location to Global Reporting | MD1 | /DDF/GMLID_EXTRACT |
104 | Extract Harmonized Product to Global Reporting | MD1 | /DDF/GMPID_EXTRACT |
201 | Extract Source Location after Harmonization | MD2 | /DDF/LOC_EXTRACT |
202 | Extract Source Product after Harmonization | MD2 | /DDF/PROD_EXTRACT |
203 | Extract Global Location Assignment | MD2 | /DDF/GMLOC_EXTRACT |
204 | Extract Global Product Assignment | MD2 | /DDF/GMPROD_EXTRACT |
Process Definition When Direct Update Is Active
Step | Description | Parallelize Group | Process Chain ID | Upload Stage
101 | Stage Data After Harmonization | | /DDF/MD_STAGE |
12.1.4.2 Update InfoObjects from Change Log
Use
This program is used to switch on the direct update. It determines all change pointers for objects that were changed in data harmonization and that could not be transferred to corresponding InfoObjects, for example due to a locking problem.
Scheduling this program regularly ensures that all master data updates reach the corresponding InfoObjects and are available for reporting.
Prerequisites
You have made your settings for the update of master data in Customizing under Demand Data Foundation Data Model General Settings . The parallelization settings from Customizing are used as defaults for this transaction. However, you can overwrite them for a specific program run, if required.
Features
Object Selection
The standard selection type is Selection by Unprocessed Change Pointers. You can also select change pointers for products or for locations only or you can narrow the selection by date or time.
Message Log
You can choose whether you want to generate a message log.
Display of Check Results
You can choose how the check results are displayed and whether the system displays them at all.
Result
Once you have executed this program and the system has activated this new method for updating the affected InfoObjects, the previous update method is no longer available.
12.1.4.3 Migrate Master Data Attributes
Use
You use transaction /DDF/MD_MIGRATE_ATTR to migrate product and location attributes from SAP Business Warehouse (SAP BW) to data harmonization before activating the direct update of master data. The program transfers attribute values for source products or source locations from the SAP BW InfoObjects /DDF/PRODUCT and /DDF/LOCATION to the corresponding objects in data harmonization. This is necessary because, for the direct update of master data, all attributes of /DDF/PRODUCT and /DDF/LOCATION must be passed to data harmonization. Otherwise, the system initializes the attributes that are not available in data harmonization when updating the InfoObjects.
Prerequisites
You must ensure that, for the object type FDH_PROD, the product attributes listed below are mapped as import and export fields with mapping type Standard Mapping to the corresponding field names of data harmonization in the mapping instruction set SAP_BW_HARMONIZE_LOCAL_RECORD. The same applies to the listed location attributes for the object type FDH_LOC in the mapping instruction set SAP_BW_HARMONIZE_LOCAL_RECORD.
Features
For testing purposes, you can restrict the migration to a specific data origin or to specific object IDs of source products or source locations from data harmonization.
The following product attributes are migrated:
● /DDF/HNUMLVL Numerical Hierarchy Level
● /DDF/PHLVL Source Hierarchy Level
● /DDF/PRODLEVEL Source Product Level
● /DDF/GROSS_WT Gross Weight
● /DDF/DESCRIPTION_MEDIUM Medium Description of Object
● /DDF/DESCRIPTION_SHORT Short Description of Object
The following location attributes are migrated:
● /DDF/HNUMLVL Numerical Hierarchy Level
● /DDF/LHLVL Source Hierarchy Level
● /DDF/LOC_GUID Source Location Reference GUID
● /DDF/SALESAREA Selling Area (Floor Space)
● /DDF/UN_SAREA Selling Area (Floor Space) Unit
● /DDF/DLVLOC Internal Source Delivery Location
● /DDF/MRCOUNTRY Delivery Region
● /DDF/DESCRIPTION_MEDIUM Medium Description of Object
● /DDF/DESCRIPTION_SHORT Short Description of Object
All master data migration processes can be run in parallel mode. You can overwrite the default settings, if required.
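The attribute transfer performed by /DDF/MD_MIGRATE_ATTR can be pictured as a field-by-field copy from the SAP BW record to the harmonization record. The following sketch is illustrative only; the mapping follows the product attributes listed above, but the record structures and the function itself are invented.

```python
# Illustrative sketch of the product-attribute migration: each SAP BW
# attribute of /DDF/PRODUCT is copied to the corresponding
# data-harmonization field. Attributes not supplied are skipped.

PRODUCT_ATTR_MAP = {
    "/DDF/HNUMLVL": "HNUMLVL",
    "/DDF/PHLVL": "PHLVL",
    "/DDF/PRODLEVEL": "PRODLEVEL",
    "/DDF/GROSS_WT": "GROSS_WT",
    "/DDF/DESCRIPTION_MEDIUM": "DESCRIPTION_MEDIUM",
    "/DDF/DESCRIPTION_SHORT": "DESCRIPTION_SHORT",
}

def migrate_product(bw_record):
    """Copy the migrated attributes of one SAP BW product record into a
    harmonization record."""
    return {harm: bw_record[bw]
            for bw, harm in PRODUCT_ATTR_MAP.items()
            if bw in bw_record}

bw = {"/DDF/GROSS_WT": "1.250", "/DDF/DESCRIPTION_SHORT": "Cereal"}
print(migrate_product(bw))  # -> {'GROSS_WT': '1.250', 'DESCRIPTION_SHORT': 'Cereal'}
```

The location attributes are handled the same way with their own mapping for the object type FDH_LOC.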
12.1.5 Check the Customizing for Data Model
You use transaction /DDF/BW_CUST_CHECK to check whether the Customizing settings for the data model are correct, for example, the parameters for direct update or the field mappings for products or locations in data harmonization. When you start the transaction, several selection options are available:
● All Checks: All checks are performed.
● General Settings: The system checks the Customizing of the general settings for the data model.
● Product: The system performs only product-relevant checks.
● Location: The system performs only location-relevant checks.
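The way the selection options partition the checks can be sketched as a simple dispatcher. This is purely illustrative; the individual check names below are invented and do not correspond to real function modules.

```python
# Sketch of how the selection options of /DDF/BW_CUST_CHECK group the
# checks. Check names are invented placeholders.

CHECKS = {
    "General Settings": ["check_general_settings"],
    "Product": ["check_product_mappings"],
    "Location": ["check_location_mappings"],
}

def checks_for(option):
    """Return the checks performed for a given selection option."""
    if option == "All Checks":
        # All Checks runs every group
        return [c for group in CHECKS.values() for c in group]
    return CHECKS[option]

print(len(checks_for("All Checks")))  # -> 3
```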
12.2 Configuring and Administrating the Data Upload
Use
The following table shows the tasks you have as an administrator for the data upload and explains in which cases and where you make the individual settings. Depending on your system landscape, you make Customizing settings in the development system and all other settings in the production system.
Note
For more information, see Roles for SAP Demand Signal Management, version for SAP BW/4HANA [page 530].
Task When? Where?
Set up data upload in Customizing Implementation Phase Customizing under Cross-Application
Components Demand Data
Foundation Data Upload.
SAP Demand Signal Management, version for SAP BW/4HANAConfiguration and Administration C O N F I D E N T I A L 425
Task: Define data origins, data providers, regions, and contexts
When: You receive new files from an external data provider for which the data is not defined yet.
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements.

Task: Define new data delivery agreements
When: You receive new files from an external data provider for which no data delivery agreement is defined yet.
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Data Delivery Agreements, or use transaction Define Data Delivery Agreements (/DDF/DDAGR).

Task: Enhance data delivery agreements for retail panel data
When: You receive new files for retail panel data (market research data) from an external data provider for which no data delivery agreement is defined yet.
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements for Market Research Data > Enhance Data Delivery Agreements, or use transaction Enhance Data Delivery Agreements (/DDF/DDAGR_MRD).

Task: Define files and file sets
When: A new data delivery agreement has been created.
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Files and File Sets, or use transaction Define Files and File Sets (/DDF/FILESET).

Task: Map external fields
When: A new data delivery agreement for retail panel data has been created.
Where: UI Manage Agreements
Task: Change mapping of external fields
When: A new data delivery agreement for retail panel data has been created, and the mapping has been defined.
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements for Market Research Data > Change Mappings for External Fields, use transaction Change Mappings (/DDF/SDX_MAP_MAINT), or use the UI Manage Agreements.

Task: Define extraction filters
When: A new data delivery agreement for retail panel data has been created.
Where: UI Manage Agreements

Task: Define time derivation
When: A new context for retail panel data has been created in the system.
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements for Market Research Data > Define Time Derivation, or use transaction Define Time Derivation (/DDF/DEF_TD).

Task: Revise hierarchy descriptions
When: A new data delivery agreement for retail panel data has been created in the system.
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements for Market Research Data > Revise Hierarchy Descriptions, or use transaction Revise Hierarchy Descriptions (/DDF/MR_HIER).

Task: Revise hierarchy levels
When: A new context for retail panel data has been created in the system.
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements for Market Research Data > Revise Hierarchy Levels, or use transaction Revise Hierarchy Levels (/DDF/MR_PRIO).
Task: Define product level descriptions
When: A new context for retail panel data has been created in the system that is used in global reporting.
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements for Market Research Data > Define Product Level Descriptions, or use transaction Define Product Level Descriptions (/DDF/MR_LVL).

Task: Perform one-time uploads
When: New stock types or types of sales data have been defined in Customizing, or new regions have been created.
Where: See One-Time Uploads Before Recurring Regular Data Uploads Are Started [page 164].

Task: Schedule background jobs for automatic uploads
When: Implementation phase
Where: See Scheduling and Monitoring Jobs for Automatic Data Uploads [page 86].

Task: Monitor the upload processes and jobs
Where: UI Monitor Deliveries and UI Monitor Jobs
12.2.1 Enabling the Upload of Name/Value Pairs With Special Characters
Use
If you receive data that contains name/value pairs with special characters, you must execute the mapping report and implement the Business Add-In (BAdI) described below.
The mapping report Copy External Field Names (/DDF/ADU_ORIG_EXT_FNAME_COPY) uses the BAdI Mapping External Field Names (/DDF/BADI_MRD_NVP) which, if required, removes unsupported characters, truncates the character string to the maximum 30 characters, and ensures the string remains unique. The original external field name must be persisted beforehand.
If you have data delivery agreements for which external field names have already been uploaded and for which you want to implement the BAdI Mapping External Field Names and persist the original external field name, this report allows you to copy the values in the external field name column to the original external field name.
Procedure
1. Execute the mapping report
You use this report to automatically copy the values in the Map All Fields table (/DDF/C_FIELDMAP) from the External Field Name column (EXTFIELDNAME) to the Original External Field Name column (ORIG_EXTFNAME).
2. Create mappings using one of the following:
○ Execute the report Map Fields for Market Research Data (/DDF/ADU_MAP_CUST_SDX). For more information, see the report documentation.
○ Navigate to the user interface Manage Agreements and choose Create Mappings. For more information, see SAP Library in SAP Help Portal at https://help.sap.com/dsimbw4h.
3. Go to transaction Data Delivery Agreements for Retail Panel Data (/DDF/SDX_MAP_MAINT) and check that the original external field name is filled.
4. Implement the BAdI
Before you upload external field names, particularly in the case of runtime mapping for data delivery agreements from countries whose languages contain special characters, such as Russia and China, note the following: To enable the upload of external field names beyond the initial load to the persistent staging area, unsupported characters and external field names that are longer than 30 characters must be modified, otherwise dumps occur. To prevent such dumps, implement the BAdI Mapping External Field Names. This BAdI allows you to change the external field names before they are saved to the mapping table. The modification of the external field names involves replacing or removing unsupported characters in the attribute name/value pair data type. The BAdI does the following:
○ Modifies the external field names of data set types of subtype Attribute Name/Value Pair
○ Disables the external field name checks by data delivery agreement before the name/value pair upload
5. Disable checks on external field names for specific data delivery agreements and extend this to exclude name/value pairs that you have already uploaded.
You can implement this BAdI without affecting existing name/value pairs. The modified external field names are stored in the Map All Fields table in transaction Data Delivery Agreements for Retail Panel Data (/DDF/SDX_MAP_MAINT). The external field name of data set types of subtype Attribute Name/Value Pair is mapped to the ATTR_NAME extractor field during the name/value pair upload. The following checks are performed by default for the attribute name/value pair (external field name):
○ It cannot be longer than 30 characters
○ It cannot contain special characters (such as spaces)
○ It cannot be duplicated for the same data delivery agreement or the same data set type
When a check for the external field name fails, the upload process for the name/value pairs is stopped.
For more information, see the BAdI documentation in Customizing for Cross-Application Components under Demand Data Foundation > Business Add-Ins > BAdI: Mapping External Field Names.
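The transformation the BAdI performs on external field names (removing unsupported characters, truncating to the 30-character maximum, and keeping the result unique) can be sketched as follows. This is an illustrative Python sketch, not the ABAP BAdI implementation; the character whitelist and the numeric-suffix scheme for uniqueness are assumptions for the example.

```python
import re

MAX_LEN = 30  # external field names are limited to 30 characters

def sanitize_field_name(name: str, used: set) -> str:
    """Sketch of the BAdI logic: clean, truncate, and de-duplicate a field name."""
    # Assumption: anything outside A-Z, 0-9, and underscore counts as unsupported
    cleaned = re.sub(r"[^A-Za-z0-9_]", "_", name).upper()[:MAX_LEN]
    candidate, suffix = cleaned, 1
    # Append a numeric suffix until the name is unique within the agreement,
    # keeping the 30-character limit intact
    while candidate in used:
        tail = f"_{suffix}"
        candidate = cleaned[:MAX_LEN - len(tail)] + tail
        suffix += 1
    used.add(candidate)
    return candidate

used = set()
print(sanitize_field_name("Größe (cm)", used))  # special characters replaced
print(sanitize_field_name("X" * 40, used))      # truncated to 30 characters
print(sanitize_field_name("X" * 40, used))      # made unique with a suffix
```

The real BAdI runs per data delivery agreement; the `used` set here stands in for the uniqueness check across the agreement's already-mapped field names.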
12.2.2 Configuring the Data Upload for Retail Panel Data
Use
Whenever you receive retail panel data from providers of market research data, you create new data delivery agreements. You can either create single data delivery agreements or you can create multiple data delivery agreements at the same time.
Activities
1. Create a folder structure with an inbound folder for the new data delivery agreements in accordance with the file path required.
Note
Define folder names containing 10 or fewer characters, as the name of a data delivery agreement is limited to 10 characters.
2. Place data deliveries or empty metafiles in the inbound folders so that the new folder can be detected.
3. You can create a single data delivery agreement for retail panel data (see Enhancing Data Delivery Agreements for Retail Panel Data [page 141]).
4. Define a file set that only contains the metafile and has no process step assigned (see Defining Files and File Sets [page 111]).
5. Create the mapping for external fields and adjust the automatically generated mapping manually, if necessary (see Mapping External Fields for Retail Panel Data [page 143]).
6. Define extraction filters (see Defining Extraction Filters [page 145]).
7. Define the time derivation (see Defining Time Derivation [page 147]).
8. Revise the hierarchy data (see Revising Hierarchy Descriptions [page 150] and Revising Hierarchy Levels [page 152]).
9. Define product level descriptions (see Defining Product Level Descriptions [page 153]).
10. Activate the data delivery agreement.
11. Make sure that standard background jobs are running (see Scheduling and Monitoring Jobs for Automatic Data Uploads [page 86]).
12.2.2.1 Name/Value Pairs in Retail Panel Data
SAP Demand Signal Management, version for SAP BW/4HANA provides name/value pairs for any unknown attributes that are included in retail panel data deliveries. Any unknown attributes that have no target for storage must be mapped for future analytics and reporting.
To ensure that product and location attributes are mapped correctly, the Persistent Staging Area (PSA) contains the following key field names and values:
Field Name Field Value
ATTR_NAME /DDF/ATTRIBUTE_NAME
ATTR_VALUE /DDF/ATTRIBUTE_VALUE
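The idea can be illustrated with a small sketch: attributes that have no dedicated target field are pivoted into generic name/value records keyed by ATTR_NAME and ATTR_VALUE. This is a hedged Python illustration, not SAP code; the KNOWN_FIELDS set and the record layout are invented for the example.

```python
# Assumption for illustration: these attributes have a dedicated target field
KNOWN_FIELDS = {"PRODUCT_ID", "BRAND"}

def to_name_value_pairs(record: dict) -> list:
    """Pivot unknown attributes of one record into name/value pair rows."""
    return [
        {"ATTR_NAME": name, "ATTR_VALUE": value}
        for name, value in record.items()
        if name not in KNOWN_FIELDS
    ]

rows = to_name_value_pairs(
    {"PRODUCT_ID": "4711", "BRAND": "ACME", "PACK_SIZE": "6", "FLAVOR": "Lemon"}
)
print(rows)  # one generic row per unknown attribute
```

Each unknown attribute thus survives the upload as a generic row instead of being dropped, which is what keeps it available for later analytics.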
12.2.2.2 Example: File Types
This table lists the file types you may find in the metafile.
File Type File Property
Metafile Not declared in the metafile
Format file FORMAT
Control file CONTROL
Dimension file DIMENSION
Hierarchy file HIERARCHY
Hierarchy level file LEVEL
Variable file VARIABLE
Key figure file MEASURE
Hierarchy structure file PARENTAGE
Attribute value file ATTRIBUTE_VALUE
Sales data file DATA
12.2.2.3 Example: Configurable CSV Format for Uploading Retail Panel Data
Procedure
1. Create a process definition and steps specifically for configurable CSV formatted files in Customizing for Cross-Application Components under Demand Data Foundation > Data Upload > Define Processes and Steps. Use the following direct update process chains for retail panel data:
Step Number | Description | Process Chain ID | Activate
10 | Extract Market Research Hierarchy Metadata | /DDF/HIER_MR_EXTRACT | Yes
20 | Extract Market Research Hierarchy Level Metadata | /DDF/HIER_LVL_MR_EXTRACT
30 | Extract Market Research Location Hierarchy | /DDF/LOC_HIER_MR_EXTRACT
40 | Extract Market Research Product Hierarchy | /DDF/PROD_HIER_MR_EXTRACT
101 | Extract Location for Market Research Data | /DDF/LOC_MR_EXTRACT
102 | Extract Product for Market Research Data | /DDF/PROD_MR_EXTRACT
103 | Extract Location Name/Value for Market Research Data | /DDF/LOC_NV_MR_EXTRACT
104 | Extract Product Name/Value for Market Research Data | /DDF/PROD_NV_MR_EXTRACT
105 | Extract Market Research Time String from File System | /DDF/MRTIME_FILE
110 | Extract Market Research Retail Panel Example from File System | /DDF/MRTPN_FILE
201 | Stage Locations with PFC | /DDF/LOC_STAGE_2
202 | Stage Products with PFC | /DDF/PROD_STAGE_2
203 | Stage Locations Name/Value Pairs with PFC | /DDF/LOC_NV_STAGE
204 | Stage Products Name/Value Pairs with PFC | /DDF/PROD_NV_STAGE
207 | Stage Market Research Time String with PFC | /DDF/MRTIME_STAGE
211 | Stage Market Research Retail Panel Example with PFC | /DDF/MRTPN_STAGE
221 | Stage Market Research Location Hierarchy with PFC | /DDF/LOC_HIER_STAGE
222 | Stage Market Research Product Hierarchy with PFC | /DDF/PROD_HIER_STAGE
2. In the SAP Easy Access Menu under Administrator > Data Upload, define the following attributes, which you can then assign to your data delivery agreement:
○ Data origins
○ Data providers
○ Contexts

Note
We recommend you create a new context. If you reuse an existing context, it may affect the harmonization results or final reporting.
3. Create a data delivery agreement specifically for configurable CSV formatted files in the SAP Easy Access Menu under Administrator > Data Upload > Define Data Delivery Agreements (transaction /DDF/DDAGR).

Note
Use the same syntax as the parent folder of the inbound/process/archive file folder structure when you define the name of the data delivery agreement.
4. Make the following settings in the data delivery agreement attributes:
○ Select the agreement type Retail Panel Data.
○ Select the data format Configurable CSV.
○ Select the data upload method Automatic Upload with Folder Scanner.

Note
The Template checkbox does not have any impact on a retail panel data delivery agreement type with the Configurable CSV data format.
5. Enter the process definition you created in step 1 and add the data origin, data provider, and context you defined.
Note
Do not select the Activate checkbox if the configuration for the data delivery agreement, file set, file format, and mapping is not yet complete.
6. You must maintain the DataSource name for the data set types and steps that upload the data from the files to the acquisition layer. Add the data sets as shown in the following table:
Data Set Definition

Data Set Type | Description | Step Number | DataSource Name
H_HIER | Hierarchy Header | 10 | /DDF/HIERARCHY_DX
H_LEVEL | Hierarchy Level | 20 | /DDF/HIER_LEVEL_DX
L_ATTR_COL | Location Attribute Column | 101 | /DDF/LOC_ATTR_COL_DX
L_ATTR_NV | Location Attribute Name/Value Pair | 103 | /DDF/LOC_ATTR_NV_DX
L_HIER | Location Hierarchy | 30 | /DDF/LOC_HIER_DX
P_ATTR_COL | Product Attribute Column | 102 | /DDF/PROD_ATTR_COL_DX
P_ATTR_NV | Product Attribute Name/Value Pair | 104 | /DDF/PROD_ATTR_NV_DX
P_HIER | Product Hierarchy | 40 | /DDF/PROD_HIER_DX
RP_DATA | Retail Panel Transactional Data | 110 | /DDF/MRTPN_EXAMPLE
T_ATTR_COL | Time Attribute Column | 105 | /DDF/TIME_ATTR_COL_DX
7. Go to the SAP User Menu and choose Administrator > Data Upload > Data Delivery Agreements for Market Research Data > Enhance Data Delivery Agreements. Enhance the data delivery agreement with the following information (see the field help for more details):
○ Region
○ Normalized unit of measure
○ Source value dimension product: Enter PROD.
○ Source value dimension location: Enter GEOG.
○ Source value dimension time: Enter TIME.
○ Priority calculation type: Keep the default selection Calculation Uses DEFAULT Value for Empty Hierarchy Level.
○ Time granularity: Depends on your data, for example, 0CWEEK for weekly deliveries or 0CMONTH for monthly deliveries.
○ 1st day of week: Depends on your data.
○ 1st week of year: Depends on your data.

Note
Ensure the data delivery agreement is inactive.
8. Drop the files in the Inbound folder.
9. Define the file set by going to the UI for Manage Agreements using either the SAP Fiori Launchpad, the SAP NetWeaver Business Client, or by navigating from the SAP User Menu.
○ Select the data delivery agreement you created and choose Edit mode.
○ In the File Set tab page, insert all the files that are available in your inbound folder and belong to the data delivery.
○ Make the following entries:
○ Select a row to display the file content in the preview
○ Add a description
○ Change the file name to a file name pattern to ensure that future files are loaded
○ Assign data set types
○ Choose file, thousand, and decimal separators
○ Enter the number of header rows
○ Define whether the file is optional in the data delivery
○ Optional: check the Delete button function
10. In the File Format tab page, select the files one after the other and do the following:
○ Check the original external field name and modify the system proposals for external field names
○ Introduce conversion exits, if needed
○ Choose which attribute should be included in a name/value pair data set
11. In the Mapping tab page, make the following settings:
○ Select a non-name/value pair data set and assign attributes in one of the following ways:
○ Attribute Value (Custom)
○ External Field Value
○ Fixed Value
○ Select a name/value pair data set and assign one of the three attributes described above to the following:
○ Key fields
○ Attribute fields
Result
Check your settings as described in Checks After Setting Up Configurable CSV Data Format [page 133]
12.2.2.4 Example: Checking the Set-Up for Configurable CSV Format (Retail Panel)
Note
This check is optional.

The system does not let you activate data delivery agreements for which there are Customizing inconsistencies. Therefore, you can run the report Check Consistency of Configuration for Data Upload (/DDF/ADU_CUST_CONSIST_CHECK) to check for any inconsistencies and correct them so that you can activate the data delivery agreement.
1. Activate the data delivery agreement in the UI for Manage Agreements.
2. Start the folder scanner and the process dispatcher.
3. Make sure that both the folder scanner and process dispatcher jobs are running in the Monitor Jobs UI under Status Overview.
4. Monitor the process in the Monitor Deliveries UI to ensure that the upload process is successful.
5. In the test system in the SAP User Menu, under Data Upload Supervisor > Monitor Deliveries, search for the data delivery agreement and monitor the last process until all its steps have been processed and the data delivery agreement process has the status Completed.
6. Make a note of the process ID. Then check the uploaded data in the acquisition DataStore objects (DSOs). Once the process chains have been processed and have the status Completed, the data has already been uploaded to SAP Business Warehouse (SAP BW).
7. Go to the Data Warehousing Workbench: Modeling (transaction RSA1).
8. Select the InfoProvider node and expand the node Demand Data Foundation to perform the checks on the following DSOs:
DSOs for Checking
Technical Name Description
/DDF/DS21 Location Acquisition
/DDF/DS41 Product Acquisition
/DDF/DS05 Time String Acquisition
/DDF/DS04 Market Research Retail Panel Acquisition
To perform checks, select each DSO and right-click to select Display Data.
9. Enter the process ID you noted previously, preceded by leading zeros.
12.2.2.5 Self-Descriptive Exchange Format
The self-descriptive exchange (SDX) format is a data format that you can use to upload file sets that contain the following:
● Transaction data
● Master data (attribute values or hierarchy data)
● Hierarchy structures
● Hierarchy metadata
● Metadata on the key figures and the desired aggregation behavior of the key figures
● Technical metadata on the file set itself
The metafile is a prerequisite for data deliveries in SDX format. The metafile provides the following information:
● A complete list of all the files belonging to the file set
● A file name for all files
● A file property field that specifies the type of data contained in the set
12.2.2.5.1 Hierarchy Upload
Use
Market research data can contain hierarchies in the dimensions product, market (location), and time; for example:
● Product data hierarchy: Category, subcategory, segment, brand, product
● Market (location) data hierarchy: Market, country, region, zip code
● Time data hierarchy: Year, half year, quarter, and month
You can analyze the data on all levels of the hierarchies, if the following preconditions are fulfilled:
● The hierarchy levels must be available as master data records identified with a qualifier (node or leaf).
● The provided hierarchical structure must be converted into a hierarchy over the relevant InfoObject (for example, /DDF/PRODUCT and /DDF/LOCATION).
● The priorities are calculated to enable correct reporting of pre-aggregated totals.
In the standard system the hierarchy upload is implemented for the product and location dimension using the following DataSources in SAP Business Warehouse:
● Hierarchy Metadata: Contains external IDs and long texts of external hierarchies maintained in the default language of the data delivery agreement
● Hierarchy Level Metadata: Contains the level ID, the level number, and a long text for the levels maintained in the default language for each hierarchy of the data delivery agreement
● Hierarchy Data: Contains the parent-child relationship that must be converted into a BW hierarchy
Process
1. Extract hierarchy and hierarchy level information from the delivered retail panel data.
Create a data delivery agreement for market research data and the mapping of external fields. The source values of the three dimensions used in the source data are stored (for example, PROD for the dimension product). This is necessary to be able to distinguish between the different hierarchies. During data upload, the names of the hierarchies themselves could be identical; for example, the source data could contain a product hierarchy called H1 and a location hierarchy called H1. If there are no hierarchy levels for a dimension and the priority calculation type of the data delivery agreement is set to Calculation Uses DEFAULT Value for Empty Hierarchy Level, the report creates default entries for all dimensions with empty levels.
2. Revise hierarchy descriptions and hierarchy levels manually, if necessary. For example, you can adapt the descriptions of hierarchy levels to your needs. Changes to the descriptions or the hierarchy names apply after a reload of the hierarchies. Changes to the numeric hierarchy level require a complete reload of the hierarchy and the transaction data.

Note
To update the hierarchy data, you have to update the mapping.
3. Upload the retail panel data for the first time (initial upload).
○ Hierarchy data
The system identifies the hierarchy for each dimension using the source value of the dimension that is assigned to the data delivery agreement. The numeric hierarchy level is determined by the system for each element of the dimensions product, location (market), and time. The result is stored in the corresponding master data InfoObject.
○ Transaction data
The system derives numeric hierarchy levels from the corresponding master data records belonging to the transaction data and multiplies the values to calculate the priority.
4. Subsequent upload of retail panel data
○ Hierarchy data
During the metadata upload of hierarchies and hierarchy levels, the system determines the changes of the current delivery compared to the existing settings for hierarchies. The settings are updated automatically: old entries are deactivated and new entries are created (for example, because a new hierarchy is provided). Afterwards, the numeric hierarchy levels are recalculated for each element in the three dimensions.

Note
It may happen that hierarchy data or hierarchy level data is missing in a delivery that previously contained this data. In this case, the system uses the existing settings to determine the numeric hierarchy levels and the priority calculation.

○ Transaction data
The system derives numeric hierarchy levels from the corresponding master data records belonging to the transaction data and multiplies the values to calculate the priority.
More Information
Priority Calculation [page 136]
Time Derivation [page 148]
12.2.2.5.2 Priority Calculation
Use
Market research data often contains key figures that are provided as so-called pre-aggregated totals, for example, percentage values such as Numeric Distribution or Weighted Distribution. For these non-cumulative totals on different hierarchy levels and aggregation levels, you cannot use standard aggregation methods such as sum. To ensure correct reporting, you must use an exception aggregation, for example, First Value with respect to a reference characteristic.
As market research data can contain hierarchies not only for products but also for the dimensions location (market) and time, you cannot use one hierarchy level, for example, one level of the product hierarchy, as the reference characteristic. Instead, you have to combine all possible hierarchy levels in an additional characteristic that can be used as a reference characteristic for all delivered key figures in reporting. The
InfoObject /DDF/MRPRIO (Market Research Priority for Exception Aggregation) is used as the reference characteristic for this purpose. For the key figures the exception aggregation First Value is used.
Prerequisites
The data delivery agreement has the following properties:
● Each dimension element must have a numeric hierarchy level.
If no hierarchy is available for a dimension, you can force the system to create a default entry for the calculation of priorities using the corresponding priority calculation type for your data delivery agreement. If no numeric hierarchy level is available for an element, priority calculation is not possible at all.
● The numeric hierarchy levels must be ascending from top to bottom.
The key figures use the exception aggregation First Value. This means that a dimension element that is higher in the hierarchy must have a lower number for the numeric hierarchy level (1 is the top level). The level numbering can have gaps from one level to the next. This may be necessary if multiple hierarchies are available for one dimension with different level numbers.
● A hierarchy level must be unique for a dimension, independent of all hierarchies.
If a dimension element belongs to more than one hierarchy, the hierarchy level must be the same in all hierarchies.
You can use transaction Revise Hierarchy Data to check these settings and to define exceptions.
Example
There are two product hierarchies with the following hierarchy levels (the semantic meaning of each level is given in parentheses):
● H1: L1 (Category) - L2 (Manufacturer) - L3 (Brand) - L4 (Sub-Brand) - L5 (Product)
● H2: L6 (Category) - L7 (Manufacturer) - L8 (Segment) - L9 (Brand) - L10 (Sub-Brand) - L11 (Product)
The level Brand has a different meaning in hierarchy H1 and in hierarchy H2. In H1 it is directly below Manufacturer whereas in H2 it is below Segment and Segment is below Manufacturer.
The numeric hierarchy levels are defined as follows:
● H1: 1 (Category) - 2 (Manufacturer) - 4 (Brand) - 5 (Sub-Brand) - 6 (Product)
● H2: 1 (Category) - 2 (Manufacturer) - 3 (Segment) - 4 (Brand) - 5 (Sub-Brand) - 6 (Product)
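The uniqueness rule from the prerequisites (an element that appears in several hierarchies of one dimension must carry the same numeric level everywhere) can be checked with a small sketch. This is a Python illustration using the example data above, not a transcription of transaction Revise Hierarchy Data.

```python
# Numeric hierarchy levels from the example (1 = top, ascending downwards)
h1 = {"Category": 1, "Manufacturer": 2, "Brand": 4, "Sub-Brand": 5, "Product": 6}
h2 = {"Category": 1, "Manufacturer": 2, "Segment": 3, "Brand": 4,
      "Sub-Brand": 5, "Product": 6}

def levels_consistent(*hierarchies) -> bool:
    """True if every shared element has the same numeric level in all hierarchies."""
    merged = {}
    for hierarchy in hierarchies:
        for element, level in hierarchy.items():
            # setdefault records the first level seen; a mismatch violates the rule
            if merged.setdefault(element, level) != level:
                return False
    return True

print(levels_consistent(h1, h2))  # True: Brand is level 4 in both hierarchies
```

Assigning Brand level 3 in H1 while keeping level 4 in H2 would make the check fail, which is exactly the situation the gap at level 3 in H1 avoids.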
Process
1. The system determines the numeric hierarchy level for each element of the three dimensions (attribute /DDF/HNUMLVL of InfoObjects /DDF/PRODUCT, /DDF/LOCATION, and /DDF/TIMEREF). The three numerical hierarchy levels are multiplied for each record and stored as a priority. This value can be used as a reference to report on pre-aggregated values.
2. The system executes the priority calculation during the upload of transaction data by multiplying the numeric hierarchy levels of each dimension. This is done in a transformation for each data record before the result is stored as a numeric value in InfoObject /DDF/MRPRIO in DataStore Object (DSO) /DDF/DS14:
Priority of each transaction data record (/DDF/MRPRIO) =
Numerical hierarchy level product (/DDF/PRODUCT-HNUMLVL) *
Numerical hierarchy level location (/DDF/LOCATION-HNUMLVL) *
Numerical hierarchy level time (/DDF/TIMEREF-HNUMLVL)
Example
You receive the following master data:
● A market hierarchy with the market Total Market and the two submarkets Region North and Region South.
● A product hierarchy with the category Cake and two products Lemon Cupcake and Raspberry Muffin.● No time hierarchy is defined, so the time hierarchy level for all records is 1.
You receive the transaction data with the percentage value for numeric distribution as displayed in the following table:
Market Product Value Market Level Product Level Priority
Total Market Cake 61% 1 1 1
Total Market Lemon Cupcake 53% 1 2 2
Total Market Raspberry Muffin 52% 1 2 2
Region North Cake 69% 2 1 2
Region North Lemon Cupcake 61% 2 2 4
Region North Raspberry Muffin 72% 2 2 4
Region South Cake 56% 2 1 2
Region South Lemon Cupcake 51% 2 2 4
Region South Raspberry Muffin 59% 2 2 4
The data of the first three columns is delivered by a market research company.
The hierarchy levels for markets and products are determined by the system during master data upload.
The system calculates the priority during the upload of transaction data.
Key figures with a reference to the calculated priority and the exception aggregation First Value provide the correct pre-aggregated values for reporting.
In this example, you report on the numeric distribution of Lemon Cupcakes in the Total Market. The system does not summarize the records for Region North and Region South but takes the value with the
highest priority (lowest number). The records for Lemon Cupcake on regional level have priority 4. Compared to priority 2 for the Total Market, 2 is the higher priority. The numeric distribution of 53% is the correct value.
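The example can be reproduced with a short sketch: the priority is the product of the numeric hierarchy levels, and reporting mimics the First Value exception aggregation by taking the record with the lowest priority number. This is an illustrative Python sketch, not BW transformation logic; the data is taken from the table above and the function names are invented.

```python
# (market, market_level, product, product_level, numeric distribution in %)
records = [
    ("Total Market", 1, "Cake", 1, 61),
    ("Total Market", 1, "Lemon Cupcake", 2, 53),
    ("Total Market", 1, "Raspberry Muffin", 2, 52),
    ("Region North", 2, "Cake", 1, 69),
    ("Region North", 2, "Lemon Cupcake", 2, 61),
    ("Region North", 2, "Raspberry Muffin", 2, 72),
    ("Region South", 2, "Cake", 1, 56),
    ("Region South", 2, "Lemon Cupcake", 2, 51),
    ("Region South", 2, "Raspberry Muffin", 2, 59),
]

TIME_LEVEL = 1  # no time hierarchy is delivered, so the time level is 1

def priority(market_level: int, product_level: int) -> int:
    # priority = product of the numeric hierarchy levels of the three dimensions
    return market_level * product_level * TIME_LEVEL

def numeric_distribution(product: str) -> int:
    """First Value over the priority: the lowest number wins (highest priority)."""
    rows = [
        (priority(m_lvl, p_lvl), value)
        for _, m_lvl, p, p_lvl, value in records
        if p == product
    ]
    return min(rows)[1]

print(numeric_distribution("Lemon Cupcake"))  # 53 (Total Market row, priority 2)
```

The regional Lemon Cupcake rows carry priority 4, so the Total Market row with priority 2 is selected instead of a (wrong) sum over the regions.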
12.2.2.6 DataSources and Extract Structures for Retail Panel Data
Use
In SAP BW, you need to distinguish between different types of information.
When you extract retail panel data using the self-descriptive exchange format, there is a DataSource for each data set. This table shows which DataSources are linked with which extract structures. The function module used for these is /DDF/ADU_SAPDX_EXTRACT_GENERIC.
DataSource | Description | Extract Structure
(Required data format for all DataSources: SDX or Configurable CSV)

/DDF/PROD_ATTR_COL_DX | Product Attribute Column | /DDF/S_MR_PROD_ATTR_COL_DX
/DDF/PROD_ATTR_NV_DX | Product Attribute Name/Value Pair | /DDF/S_MR_PROD_ATTR_NV_DX
/DDF/PROD_HIER_DX | Product Hierarchy | /DDF/S_MR_PROD_HIER_DX
/DDF/LOC_ATTR_COL_DX | Location Attribute Column | /DDF/S_MR_LOC_ATTR_COL_DX
/DDF/LOC_ATTR_NV_DX | Location Attribute Name/Value Pair | /DDF/S_MR_LOC_ATTR_NV_DX
/DDF/LOC_HIER_DX | Location Hierarchy | /DDF/S_MR_LOC_HIER_DX
/DDF/TIME_ATTR_COL_DX | Time Attributes | /DDF/S_MR_TIME_ATTR_COL_DX
/DDF/TD_DX_TEMPLATE | SDX Transaction Data Template | /DDF/S_MR_TD_DX_TEMPLATE
/DDF/HIERARCHY_DX | Hierarchies | /DDF/S_MR_HIERARCHY_DX
/DDF/HIER_LEVEL_DX | Hierarchy Levels | /DDF/S_MR_HIER_LVL_DX
/DDF/MRTPN_EXAMPLE | Retail Panel Example | /DDF/S_MRTPN_EXAMPLE
More Information
DataSources and Extract Structures for Retailer (POS) Data [page 158]
12.2.2.6.1 Parallelizing Extraction Using Multiple InfoPackages
Use
To improve performance of the extraction of transactional data in self-descriptive exchange (SDX) format, you can parallelize the extraction.
To do this, you distribute the data of a delivery that is to be extracted equally across multiple InfoPackages that can be processed in parallel.
You use the control structure of the following DataSources to define the index and the total number of InfoPackages:
● /DDF/TD_DX_TEMPLATE SDX Transaction Data Template● /DDF/MRTPN_EXAMPLE Retail Panel Example
If the control structure is not used, the system extracts the data sequentially.
Procedure
1. Create InfoPackages for the DataSource with the following selection parameters:
○ PARAL_PROC_NR contains the total number of InfoPackages used
○ PARAL_PROC_IDX contains the sequence number of each InfoPackage
2. Create a process chain that executes the InfoPackages in parallel and subsequently calls the subordinate process chain /DDF/MRTPN1.
3. Assign the process chain to the process step that uploads SDX transactional data in Customizing for Cross-Application Components under Demand Data Foundation > Data Upload > Define Processes and Steps.
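The partitioning idea behind the two selection parameters can be sketched as follows. This is an illustrative Python sketch, not extractor code; the modulo-based split is an assumption for the example, and PARAL_PROC_IDX is treated as 1-based.

```python
def records_for_package(records: list, paral_proc_nr: int, paral_proc_idx: int) -> list:
    """Return the slice of the delivery that InfoPackage paral_proc_idx extracts.

    paral_proc_nr  -- total number of InfoPackages (PARAL_PROC_NR)
    paral_proc_idx -- 1-based sequence number of this package (PARAL_PROC_IDX)
    """
    return [
        rec
        for pos, rec in enumerate(records)
        if pos % paral_proc_nr == paral_proc_idx - 1
    ]

delivery = list(range(10))
parts = [records_for_package(delivery, 3, idx) for idx in (1, 2, 3)]
print(parts)  # each record of the delivery lands in exactly one InfoPackage
```

Whatever the actual distribution key is, the important property is the one the sketch demonstrates: the slices are disjoint and together cover the whole delivery, so the parallel InfoPackages extract each record exactly once.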
More Information
BI content documentation:
12.2.2.7 Enhancing Data Delivery Agreements for Retail Panel Data
Use
You enhance data delivery agreements with additional attributes that are necessary for uploading retail panel data from market research data providers.
Prerequisites
You have defined time granularities in Customizing under Cross-Application Components Demand Data Foundation Data Upload Settings for Retail Panel Data Define Time Granularity .
You have defined a data delivery agreement with the following properties:
● A process definition is assigned that refers to process chains that are created for DataSources for retail panel data.
● Agreement type is Retail Panel Data
● Data format is Self-Descriptive Exchange Format
● Data upload method is Automatic Upload with Folder Scanner
● Data sets for incoming files in self-descriptive exchange (SDX) format are defined and associated with process steps and the DataSources that are used for extraction.
Activities
1. On the SAP Easy Access screen, choose SAP Menu Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Enhance Data Delivery Agreements or use transaction Enhance Data Delivery Agreements /DDF/DDAGR_MRD. Select a data delivery agreement.
2. Specify the dimensions that can be used to upload hierarchies for the dimensions product, location, and time. For example, the hierarchies of dimension Product can be defined for product category, subcategory, segment, brand, and product.
3. Specify the normalized unit of measure that is used to enrich market research data during data upload if no normalized unit of measure is provided. The normalized unit of measure typically has the dimension weight or volume for reporting in kilogram (KG) or liter (L).
4. Choose the priority calculation type. The priority calculation type defines whether a priority calculation is executed during data upload and how the system reacts if hierarchy levels are empty, for example, because they have not been provided with the retail panel data.
5. Choose the time granularity and dependent settings for time data.
The time granularity defines the time periods that the data provided in the data deliveries can be subdivided into, for example, calendar weeks or periods of four weeks.
6. Define additional metafiles. If you want to load any additional files that are not contained in the metafile of the self-descriptive exchange (SDX) format, you can create a record for these files here. The records in this table are handled in exactly the same way as they would be if they were contained in the SDX metafile.
7. Define additional format files. If you want to load any additional files that are not explained in the SDX format file, you can create your own explanations by creating records in this table. The records in this table are handled in exactly the same way as they would be if they were contained in the SDX format file.
8. Define synonyms. Synonyms allow you to reduce the workload of updating multiple entries to updating a single entry. The records in this table are handled in exactly the same way as they would be if they were contained in the SDX synonym file.
Note
Not all attributes that are mandatory for data delivery agreements are required for defining a template. On the UI, you create the dependent mappings and extraction filters and activate the data delivery agreements.
9. If you want to activate the data delivery agreement and do not want to use it as a template, you proceed as follows:
1. (Optional) You use report Check Consistency of Configuration for Data Upload /DDF/CUST_CHECK to check whether all settings for data delivery agreements are consistent.
2. You map external fields using report Map Fields for Market Research Data /DDF/SDX_MAP or on the Manage Agreements UI.
3. You activate the data delivery agreement.
Result
You have created a template or an active data delivery agreement for retail panel data from a market research company.
All active data delivery agreements are taken into account when the system scans the inbound folder for new files. All files with an active data delivery agreement are uploaded.
More Information
Self-Descriptive Exchange Format [page 134]
Defining Data Delivery Agreements [page 106]
Mapping External Fields for Retail Panel Data [page 143]
Defining Extraction Filters [page 145]
Revising Hierarchy Descriptions [page 150]
Defining Files and File Sets [page 111]
Related Settings
The following settings can be defined dependent on data delivery agreements:
Configuring Quality Validation [page 203]
Defining Harmonization Groups [page 230]
12.2.2.8 Mapping External Fields for Retail Panel Data
Use
You map external fields to fields in the extract structure based on a data delivery agreement.
This mapping configuration allows you to take market research data from external sources and map the information to the fields in SAP Demand Signal Management, version for SAP BW/4HANA where you can then analyze and perform reporting on the data.
If you want to use a new field in analytics and reporting, you have to define the mapping.
The mapping also defines the hierarchy and hierarchy level information and the priority calculation.
Prerequisites
You have created inactive data delivery agreements for market research data.
You have assigned common fields to data set types in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Settings for Retail Panel Data Assign Common Fields to Data Set Types.
Examples for common fields are Process ID, Language, or Currency.
Activities
1. Open the Management of Data Delivery Agreements Web user interface (Web UI).
2. Select one or more data delivery agreements and choose Create Mapping. For existing mappings, you can do the following:
○ Update existing mappings
○ Delete the mappings that are disabled
○ Delete existing mappings
○ Delete all existing mappings and create new mappings
Note
You can also use report Map Fields for Market Research Data /DDF/ADU_MAP_CUST_SDX to create or delete the mapping for a single data delivery agreement.
The system creates the mapping of the external fields to extractor fields used in SAP NetWeaver Business Warehouse.
3. Adjust the field mapping for one or multiple data delivery agreements, if necessary. You can choose any field to which the external field can be mapped from the input help.
Note
You can also change the mappings in transaction Change Mappings of External Fields /DDF/SDX_MAP_MAINT:
1. On the SAP Easy Access screen, choose SAP Menu Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Change Mappings of External Fields.
2. Select a data delivery agreement and choose Map All Fields. Adjust the individual settings, if necessary.
3. Activate or disable the mapping for a specific data set type.
Recommendation
You can create the initial mapping in a test system and synchronize it with the production system at a later point in time.
Proceed as follows:
1. Deactivate the data delivery agreement.
2. Place all the files into the inbound folder.
3. Create the mapping.
4. Adjust the mapping if necessary.
Result
The system has created the mapping and the hierarchies. You can change the mappings and revise the hierarchy data afterwards.
Note
If there are no hierarchies for a dimension (product, market, or time) and the priority calculation type of the data delivery agreement is set to Calculation Uses DEFAULT for Empty Hierarchy Level, default entries for all the empty dimensions are created.
12.2.2.9 Defining Extraction Filters
Use
You can define and modify your own extraction filters for data delivery agreements to limit the amount of data that is loaded into SAP Demand Signal Management, version for SAP BW/4HANA to what is most relevant for your business needs. Extraction filters are applied to transaction data only. Master data is always uploaded in its entirety into the system to ensure consistent master data and hierarchies.
You can define an extraction filter at hierarchy level or attribute level for the following dimension types:
● Location (market)● Product● Time
Note
If a data delivery in self-descriptive exchange (SDX) format contains a new dimension type and the transactional data is to be filtered by the new dimension type, you need to define an additional dimension. For more information on how to set up the extraction filtering of transactional data for an additional dimension type, see also SAP Note 2086125.
Prerequisites
You have created mapping entries for the data set types for the following dimensions to ensure the extraction filter works at runtime:
● Product with dimension subtype Attribute Name/Value Pair● Location with dimension subtype Attribute Name/Value Pair● Time with dimension subtype Attribute Column-Based
The appropriate files and file sets are defined.
The data delivery agreement is not activated.
The files from the retail panel data delivery are in the inbound folder and ready to be uploaded.
Note
For more information, see Roles for SAP Demand Signal Management, version for SAP BW/4HANA [page 530].
Activities
1. Open the Web UI Management of Data Delivery Agreements.
2. Select a data delivery agreement and open the Filters assignment block.
3. In the Filters assignment block, choose the dimension for which you want to edit the filters.
4. Add new rows to the filter definition and enter the data. If you assign a group number, the system evaluates rows that belong to the same group number with AND. All groups or entries without a group number are evaluated with OR.
○ Define filters by attributes. You can use all the attributes that are defined in the format file.
Example
Example of Product Filter by Attributes for a Data Delivery Agreement
Activated | Group Number | Field Name | Operator | Field Value
Yes | 1 | Brand | EQ | Best Muffin
Yes | 1 | Manufacturer | EQ | Best Manufacturer
Yes | 2 | Brand | EQ | Sweet Dreams
Yes | 2 | Type | NE | Cookies
In this example, the rule is interpreted as follows:
All products are considered when (Brand is equal to “Best Muffin” AND Manufacturer is equal to “Best Manufacturer”) OR (Brand is equal to “Sweet Dreams” AND Type is not equal to “Cookies”).
○ Define filters by hierarchies.
You can define filters by hierarchies for all hierarchies that are defined for the dimensions Product and Location. Filters by hierarchies can consist of several rules. Each rule is evaluated as EQUAL TO and applied to entire hierarchies. Multiple rules are logically linked with OR.
Example
You enter a filter with hierarchy name H_1.
In this example, all products that belong to product hierarchy H_1 are a positive match.
If you define a filter by attributes and a filter by hierarchy for a dimension, the rules are combined.
Example
For a data delivery agreement that has filters Product by Attributes and Product Hierarchy Filters defined as above, the system evaluates all products when ((Brand is equal to “Best Muffin” AND Manufacturer is equal to “Best Manufacturer”) OR (Brand is equal to “Sweet Dreams” AND Type is not equal to “Cookies”)) OR Hierarchy is equal to H_1.
5. Activate the filters.
6. Choose Simulate Filters to simulate the filters for each dimension.
For the selected dimension, the system extracts master data from the files according to the extraction filters you created. A dialog box is displayed and you can download the results log to check whether the filter results are as you would have expected.
Note
You can only simulate filters for inactive data delivery agreements.
Example
As a data upload supervisor, you are responsible for the data upload process, data quality, and the correct selection of data. You want to prevent your database from getting filled up with aggregated data for the following reasons:
● Maintains a low total cost of ownership
● Optimizes data upload and reporting
Your marketing department informs you that they want to analyze and report on your main markets and products only. You start to define your own extraction filters for your main markets and product categories by hierarchy and by attribute. You simulate the filters using the provided files to ensure that the results are as you would expect. Afterwards, you activate the data delivery agreement and proceed with the data upload.
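The attribute and hierarchy filter semantics described above can be sketched as follows. This is illustrative Python, not SAP code: the product records and the `matches` helper are hypothetical, but the logic follows the documented rules, rows with the same group number are combined with AND, groups are combined with OR, and hierarchy filters add a further OR branch.

```python
# Illustrative sketch (not SAP code) of the filter semantics: rows sharing
# a group number are ANDed, groups are ORed, and hierarchy rules add an
# extra OR branch. Field names and sample products are hypothetical.

OPS = {"EQ": lambda a, b: a == b, "NE": lambda a, b: a != b}

def matches(product, attribute_rules, hierarchy_names):
    # attribute_rules: {group_number: [(field, operator, value), ...]}
    group_hit = any(
        all(OPS[op](product.get(field), value) for field, op, value in rules)
        for rules in attribute_rules.values()
    )
    hierarchy_hit = any(h in product.get("hierarchies", ()) for h in hierarchy_names)
    return group_hit or hierarchy_hit

rules = {
    1: [("Brand", "EQ", "Best Muffin"), ("Manufacturer", "EQ", "Best Manufacturer")],
    2: [("Brand", "EQ", "Sweet Dreams"), ("Type", "NE", "Cookies")],
}
product = {"Brand": "Sweet Dreams", "Type": "Muffins", "hierarchies": []}
assert matches(product, rules, ["H_1"])            # group 2 matches
assert not matches({"Brand": "Other"}, rules, [])  # no group, no hierarchy match
```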
12.2.2.10 Defining Time Derivation
You define how the system derives start date, end date, and if possible, all standard time attributes like 0CALYEAR or 0CALMONTH from the source time strings.
If the standard time derivations do not deliver the desired output (for example, if you have special formats or customer-specific needs), you can create a Business Add-In (BAdI) implementation for BAdI: Time Derivation. After activating the implementation, the filter value is available in the input help for the field Time Derivation Method. Select your filter value and save the record to assign your conversion routine for this time derivation.
Note
You can add custom attributes, such as time or date. To do this, create a custom Business Add-In (BAdI) implementation.
Prerequisites
You have defined time granularities in Customizing under Cross-Application Components Demand Data Foundation Data Upload Settings for Retail Panel Data Define Time Granularity .
You have assigned the time granularity to a data delivery agreement.
Activities
1. Run report Create Standard Time Derivations /DDF/TIMETYPE_EXAMPLE to create standard entries of combinations of time derivation types and time granularities that typically exist in data deliveries and that you can use for time derivation. On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Create Standard Time Derivations or use transaction Create Standard Time Derivations /DDF/DEF_TD_EXP.
2. On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Define Time Derivation or use transaction Define Time Derivation /DDF/DEF_TD.
3. Enter the time granularity using the input help.
4. If you want to define the time derivation for a specific context, enter a context using the input help.
5. Create a regular expression that matches your external time strings.
6. Specify the source time string that is provided with the market research data.
7. If the standard derivation methods do not support the delivered time string, enter a filter value to identify the time derivation method.
8. Specify the offsets that determine the position of time information in the time string.
More Information
Time Derivation [page 148]
Example: Defining Time Derivation [page 149]
12.2.2.10.1 Time Derivation
Use
The system derives start date, end date, and if possible, all standard time attributes like 0CALYEAR or 0CALMONTH from the source time strings. The result is posted to InfoObject /DDF/TIMEREF.
This is done during data upload from DataStore object /DDF/DS05 to DataStore object /DDF/DS15 as follows:
1. The system determines the time granularity and the context assigned to the data delivery agreement.
2. The system determines all possible time derivations for the given combination of context and time granularity. If no entries are defined, the system collects all time derivation types for the given time granularity, regardless of the context. If the time granularity of the data delivery agreement is set to MULTIPLE, all time granularities are taken into account. Again, the system first checks if a context is assigned and, if no matches are found, a context-independent search is run.
3. The system evaluates the source time string and checks the regular expression for all identified time derivations until one regular expression matches. If no time derivation is possible the upload process is stopped.
4. The system converts the source time string into a start date and end date and determines all standard time attributes, for example 0CALYEAR or 0CALMONTH, if possible. The conversion is done using implemented methods that evaluate the various offset parameters assigned to the time derivation. These parameters define where the time information can be found in the source time string.
More Information
Defining Time Derivation [page 147]
Example: Defining Time Derivation [page 149]
12.2.2.10.2 Example: Defining Time Derivation
You receive a data delivery with the time strings M201210, M201211, M201212, M201301, and so on.
The source time strings represent calendar months. They are built according to the following pattern of seven characters:
● The first character M stands for month
● The next four digits describe the calendar year
● The last two digits describe the calendar month
Definition of a Regular Expression
A regular expression to validate the source time strings can be defined as follows: [M][2][0][0123456789]{2}[01][0123456789].
● The allowed set of values for each single digit is denoted within the square brackets. The first three characters are always M20, assuming that you receive only time strings beginning with M for calendar years starting with 20.
● The next two digits can contain any number between 0 and 9. Instead of entering the expression [0123456789] twice for each digit, you can use the abbreviated expression [0123456789]{2} indicating that the next two concatenated digits have the same value set.
● The sixth digit can have only the values 0 or 1.
● The last digit can contain any number between 0 and 9.
Using this regular expression the system searches the time string for any sequence of seven characters following this expression. Any string that contains characters before or after this pattern, for example ABCM201302 or DEFM201303GH is also accepted by the system. To limit the expression so that it accepts exactly seven characters, you must define the start and end of the string using \< and \>, for example: \<[M][2][0][0123456789]{2}[01][0123456789]\>.
For more details about regular expressions, see documentation for the following ABAP keywords:
● Syntax of Regular Expressions
● Character String Patterns● Special Characters in Regular Expressions
Definition of Offsets
The four-digit year (in the example string 2012) starts at position 2 and the two-digit calendar month (in the example 09) starts at position 6. Therefore, the corresponding offsets of 1 and 5 are defined to identify the year and month within the source time string. All other offsets are set to 99 to indicate that no other time information is provided.
Method of Time Derivation
The time derivation is implemented as a standard routine. Therefore, no filter value is needed to identify a BAdI implementation. The system automatically determines the start date and end date.
Result of Time Derivation
As a result, the system derives the following values for the attributes of /DDF/TIMEREF for source time string M201209:
● 0DATEFROM = 20120901
● 0DATETO = 20120930
● 0CALMONTH = 201209
● 0CALMONTH2 = 09
● 0CALQUARTER = 20123
● 0CALQUART1 = 3
● 0CALYEAR = 2012
All other attributes, for example 0CALWEEK, are undefined and cannot be determined.
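For the example time string format above, the derivation can be sketched as follows. This is illustrative Python, not the SAP implementation; the `derive` helper is hypothetical, but it applies the documented regular expression (in Python syntax, with \A and \Z standing in for \< and \>) and reads the year and month at the documented offsets 1 and 5.

```python
# Illustrative sketch (not SAP code): validate the source time string with
# the regular expression, then read year and month at the configured
# offsets (1 and 5, zero-based) to derive the time attributes.
import calendar
import re

PATTERN = re.compile(r"\AM20[0-9]{2}[01][0-9]\Z")  # Python form of \<[M][2][0][0-9]{2}[01][0-9]\>
YEAR_OFFSET, MONTH_OFFSET = 1, 5

def derive(time_string):
    if not PATTERN.match(time_string):
        raise ValueError("no time derivation possible; the upload process would stop")
    year = int(time_string[YEAR_OFFSET:YEAR_OFFSET + 4])
    month = int(time_string[MONTH_OFFSET:MONTH_OFFSET + 2])
    last_day = calendar.monthrange(year, month)[1]  # last calendar day of the month
    quarter = (month - 1) // 3 + 1
    return {
        "0DATEFROM": f"{year}{month:02d}01",
        "0DATETO": f"{year}{month:02d}{last_day}",
        "0CALMONTH": f"{year}{month:02d}",
        "0CALMONTH2": f"{month:02d}",
        "0CALQUARTER": f"{year}{quarter}",
        "0CALQUART1": str(quarter),
        "0CALYEAR": str(year),
    }

assert derive("M201209")["0DATETO"] == "20120930"
assert derive("M201209")["0CALQUARTER"] == "20123"
```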
12.2.2.11 Revising Hierarchy Descriptions
Use
You can revise and change the source descriptions for the uploaded hierarchy data, if necessary.
Prerequisites
You have mapped external fields for a data delivery agreement.
Activities
1. On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Revise Hierarchy Descriptions or use transaction Revise Hierarchy Descriptions /DDF/MR_HIER.
2. Select the data delivery agreement you want to use.
3. (Optional) On subview Revise Hierarchy Descriptions, overwrite the name or the description of the hierarchy that is displayed in the InfoObject of the corresponding dimension (product or location). Determine which hierarchies should be transferred to the InfoObject of the corresponding dimension.
4. On subview Revise Level Descriptions, overwrite the description of the hierarchy levels provided in the source data.
5. On subview Display Hierarchy History and Display Level History, check for old, inactive versions of a hierarchy or hierarchy level. The Data Delivery ID column is empty for active hierarchies. For inactive hierarchies, it contains the delivery ID that deactivated it, for example, because this hierarchy is no longer part of the delivered files.
Example
Transfer only some hierarchies and change descriptions
You receive three product hierarchies. However, only two of them should be transferred to /DDF/PRODUCT. You therefore disable the transfer for the third hierarchy. In addition, you change the descriptions of the hierarchy levels of the two relevant hierarchies.
More Information
Mapping External Fields for Retail Panel Data [page 143]
Hierarchy Upload [page 135]
Priority Calculation [page 136]
12.2.2.12 Revising Hierarchy Levels
Use
You can revise and change the numerical levels of the source data for the uploaded hierarchy data, if necessary.
Prerequisites
You have mapped external fields for market research data.
Activities
1. On the SAP Easy Access screen, choose Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Revise Hierarchy Levels or use transaction Revise Hierarchy Levels /DDF/MR_PRIO.
2. Select the context you want to use.
3. On subviews Revise Product Hierarchy Level or Revise Location Hierarchy Level, change the source numeric hierarchy level for a complete level of the product or location hierarchy. You can enter a user-defined value or add an offset to adapt the determination of numeric levels as a precondition for correct priority calculation.
4. On subview Define Level for Specific Product or Define Level for Specific Location, change the source numeric levels not for a complete hierarchy level but for a single product or location by adding a user-defined value or an offset.
Example
Change Numeric Hierarchy Levels for a Complete Hierarchy
You receive two market hierarchies that have semantically the same top node. The first market hierarchy has lower levels separating the markets into regions and subregions. The other hierarchy separates the markets into different channels, for example drugstore, discounter, and supermarkets with different store sizes. To ensure correct reporting based on the pre-aggregated totals of those hierarchies, you can shift all levels of one hierarchy so that the top nodes of both hierarchies are not aggregated. Only the node with the higher priority (lower numeric hierarchy level) is taken into account.
Change Numeric Hierarchy Level for a Single Dimension Element
You receive retail panel data for 10 markets without any hierarchy. However, you know that the markets have a hierarchical dependency and you would like to report on the pre-aggregated totals accordingly. In this case, you can manually define a numeric level for each market.
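The level shift from the first example can be sketched as follows. This is illustrative Python, not SAP code; the hierarchy contents and the `shift_levels` helper are hypothetical, but it shows how adding an offset moves one hierarchy's top node off the numeric level of the other, so that only the node with the lower numeric level (higher priority) remains on top.

```python
# Illustrative sketch (not SAP code): shift all numeric levels of one market
# hierarchy by an offset so its top node no longer collides with the top
# node of another hierarchy. Hierarchy names and levels are hypothetical.

def shift_levels(hierarchy_levels, offset):
    """hierarchy_levels maps node name -> numeric level (lower = higher priority)."""
    return {node: level + offset for node, level in hierarchy_levels.items()}

regions = {"Total Market": 1, "Region North": 2, "Subregion North-East": 3}
channels = shift_levels({"Total Market": 1, "Drugstore": 2}, 1)

# After the shift, only the region hierarchy's top node keeps numeric
# level 1, so only that node (the higher priority) is counted on top.
top_level_nodes = [n for h in (regions, channels) for n, lvl in h.items() if lvl == 1]
assert top_level_nodes == ["Total Market"]
```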
More Information
Mapping External Fields for Retail Panel Data [page 143]
Revising Hierarchy Descriptions [page 150]
Hierarchy Upload [page 135]
Priority Calculation [page 136]
12.2.2.13 Defining Product Level Descriptions
Use
You can define descriptive names for product levels. You use product level descriptions in global reporting, especially in the Define Consolidation user interface.
Prerequisites
You have mapped external fields for a data delivery agreement.
Activities
1. (Optional) Run report Copy Product Level Descriptions /DDF/HIELVL2PLEVEL to copy hierarchy level descriptions that are defined for a specific data delivery agreement.
1. On the SAP Easy Access screen, choose SAP Menu Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Copy Product Level Descriptions or use transaction Copy Product Level Descriptions /DDF/MR_COPYLVL.
2. You specify the context and the language and execute the report.
2. On the SAP Easy Access screen, choose SAP Menu Cross-Application Components Demand Signal Management Data Upload Data Delivery Agreements for Market Research Data Define Product Level Descriptions or use transaction Define Product Level Descriptions /DDF/MR_LVL.
3. Select the context.
4. On subview Define Product Hierarchy Level, define the descriptions.
5. Start the process chain Stage Market Research Product Hierarchy Level Texts /DDF/PHLVL_STAGE using the Data Warehousing Workbench (transaction RSPC) to make sure that the texts are transferred to InfoObject /DDF/PHLVL and are visible on the user interface.
More Information
Mapping External Fields for Retail Panel Data [page 143]
Revising Hierarchy Descriptions [page 150]
Revising Hierarchy Levels [page 152]
Hierarchy Upload [page 135]
Priority Calculation [page 136]
BI content documentation: (/DDF/PHLVL_STAGE)
12.2.3 Configuring the Data Upload for Retailer Data
Use
Note
If a new delivery ID has been pushed into a persistent staging area (PSA), the scanner detects this and creates an entry in the table so that you can choose a PSA-specific template from which you can then create new data delivery agreements.
1. Create a folder structure with an inbound folder for the new data delivery agreements in accordance with the file path required.
Note
Define folder names containing 10 or fewer characters, as the name of a data delivery agreement is limited to 10 characters.
2. Place data deliveries in the inbound folders so that the new folder can be detected.
3. Create a data delivery agreement in one of the following ways:
○ In the UI for Manage Agreements.
○ In the SAP User Menu under Administrator Data Upload Data Delivery Agreements Define Data Delivery Agreements (transaction /DDF/DDAGR).
4. Complete the following steps for the data delivery agreement:
○ Define a file set.
○ Define the file format.
○ Create the mapping for external fields.
5. In the UI for Manage Agreements, choose New Master Data to create and assign the following new master data to the data delivery agreement:
○ Data provider
○ Data origin
○ Context
○ Region
More Information
Example: Configurable CSV Format for Uploading POS Data [page 155]
12.2.3.1 Example: Configurable CSV Format for Uploading POS Data
Procedure
1. Create a process definition specifically for configurable CSV formatted files in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Define Processes and Steps. Use the following direct update process chains for POS data:
Example: Process Definition CSF_POS
Step No. | Description | Process Chain ID | Activate
101 | Extract Locations from CSV File | /DDF/LOC_POS | Yes
102 | Extract Products from CSV File | /DDF/PROD_POS |
103 | Extract Location Attributes NV from CSV File | /DDF/LOC_NV_POS |
104 | Extract Product Attributes NV from CSV File | /DDF/PROD_NV_POS |
105 | Extract Aggregated Sales from CSV File | /DDF/SALES_POS |
106 | Extract Stock Snapshot from CSV File | /DDF/STOCK_POS |
201 | Stage Locations with PFC | /DDF/LOC_STAGE_2 |
202 | Product Staging (Direct Update) | /DDF/PROD_STAGE_2 |
203 | Stage Products with PFC | /DDF/LOC_NV_STAGE |
204 | Stage Products Name Value Pairs with PFC | /DDF/PROD_NV_STAGE |
205 | Stage Aggregated Sales with PFC | /DDF/SALES_STAGE |
206 | Stage Stock Snapshot with PFC | /DDF/STOCK_STAGE |
2. Create a data delivery agreement specifically for configurable CSV formatted files in the SAP Easy Access Menu under Administrator Data Upload Define Data Delivery Agreements (transaction /DDF/DDAGR).
○ Select the agreement type Retailer Data.
○ Select the data format Configurable CSV.
○ Deselect the Activate Agreement checkbox.
3. Define the data sets for the data delivery agreement.
Note
It is mandatory to assign DataSources for data delivery agreements that use the configurable CSV format.
Example: Data Set Definition for Agreement CSF_POS
Data Set Type | Description | Step No. | DataSource Name
L_ATTR_COL | Location Attribute Column | 101 | /DDF/LOC_ATTR_COL_POS
L_ATTR_NV | Location Attribute Name/Value Pair | 102 | /DDF/LOC_ATTR_NV_POS
POS_SALES | POS Sales Data | 103 | /DDF/SALES_POS
POS_STOCK | POS Stock Data | 104 | /DDF/STOCK_POS
P_ATTR_COL | Product Attribute Column | 105 | /DDF/PROD_ATTR_COL_POS
P_ATTR_NV | Product Attribute Name/Value Pair | 106 | /DDF/PROD_ATTR_NV_POS
4. Create a folder structure on the application server for the new data delivery agreement with read/write access and place the files in the inbound folder.
5. Configure the file set by completing the following steps:
○ Go to the UI Manage Agreements and select the new data delivery agreement CSF_POS.
○ In Edit mode, choose Insert.
○ Define the following attributes for the file set and the individual files that belong to the file set:
Note
Change the file name pattern to ++++ before selecting the file row in the table of the file set tab page.
○ File descriptions
○ File separators
○ Number of header rows
○ Data set types for which the file is relevant
For example, you assign the data set types P_ATTR_COL and P_ATTR_NV to a product file. The file preview helps you to identify the file separator, escape characters, number of header lines, decimal separators, and thousand separators. For more information about these attributes, see the individual field help on the UI.
6. Configure the file format on the File Format tab page by selecting the files you want to configure. You can change the external field names, which act like a key and are used similarly to the attribute name (ATTR_NAME) in name/value pairs. Select the attributes that you want to have included in the name/value pair extraction.
7. Configure the mapping by selecting a data set type and then choosing one of the following mapping options:
○ External field name: the system assigns the value that comes directly from the incoming source file
○ Fixed attribute value: you can enter a fixed value for the system to use
○ Custom attribute value: the system uses a calculated value, which comes from the implementation of the Business Add-In BAdI: Mapping Enhancements (/DDF/BADI_ADU_EXTR_MAPPING)
8. Activate the data delivery agreement.
9. Start the folder scanner and the process dispatcher.
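The three mapping options from step 7 can be sketched as follows. This is illustrative Python, not SAP code: the `load` helper, the field names, and the sample file content are hypothetical, but the three branches correspond to the external field name, fixed attribute value, and custom attribute value options, with the callable standing in for the BAdI mapping enhancement.

```python
# Illustrative sketch (not SAP code) of the three mapping options: values
# taken from the source file, fixed values, and calculated values. The
# separator, header count, field names, and sample rows are hypothetical.
import csv
import io

def load(file_text, separator, header_rows, mapping):
    """mapping: target field -> ("external", source column) |
    ("fixed", value) | ("custom", callable applied to the source row)."""
    lines = file_text.splitlines()[header_rows - 1:]  # last header row holds the column names
    reader = csv.DictReader(io.StringIO("\n".join(lines)), delimiter=separator)
    result = []
    for row in reader:
        record = {}
        for target, (kind, spec) in mapping.items():
            if kind == "external":
                record[target] = row[spec]       # value straight from the file
            elif kind == "fixed":
                record[target] = spec            # constant value
            else:                                # "custom": stands in for the BAdI
                record[target] = spec(row)
        result.append(record)
    return result

text = "PRODUCT;QTY\nP-100;5\nP-200;3"
mapping = {
    "PRODUCT_ID": ("external", "PRODUCT"),
    "UNIT": ("fixed", "PC"),
    "QTY_INT": ("custom", lambda row: int(row["QTY"])),
}
rows = load(text, ";", 1, mapping)
assert rows[0] == {"PRODUCT_ID": "P-100", "UNIT": "PC", "QTY_INT": 5}
```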
Result
Check your settings as described in Example: Checking the Set-Up for Configurable CSV Format (POS) [page 157].
12.2.3.2 Example: Checking the Set-Up for Configurable CSV Format (POS)
Note
(Optional) The system does not let you activate data delivery agreements for which there are Customizing inconsistencies. Therefore, you can run the report Check Consistency of Configuration for Data Upload (/DDF/ADU_CUST_CONSIST_CHECK) to check for any inconsistencies and correct them so that you are able to activate the data delivery agreement.
1. Activate the data delivery agreement in the UI for Manage Agreements.
2. Start the folder scanner and the process scheduler.
3. Make sure that both the folder scanner and process scheduler jobs are running in Monitor Jobs under Status Overview.
4. Monitor the process in the Monitor Deliveries UI to ensure that the upload process is successful.
5. In the test system, in the Monitor Deliveries UI, search for the data delivery agreement and monitor the last process until all its steps have been processed and the data delivery agreement process has the status Completed.
6. Make a note of the process ID. Then check the uploaded data in the acquisition DataStore objects (DSOs). Once the process chains have been processed and have the status Completed, the data has been uploaded to SAP Business Warehouse (SAP BW).
7. Go to the Data Warehousing Workbench: Modeling (transaction RSA1).
8. Select the InfoProvider node and expand the node Demand Data Foundation to perform the checks on the following DSOs:
DSOs for Checking
Technical Name Description
/DDF/DS21 Location Acquisition
/DDF/DS41 Product Acquisition
/DDF/DS05 Time String Acquisition
/DDF/DS01 Aggregated Sales Acquisition
/DDF/DS02 Stock Snapshot Acquisition
To perform checks, select each DSO and right-click to select Display Data.
9. Enter the process ID that you noted previously, padded with leading zeros (0000…).
12.2.3.3 DataSources and Extract Structures for Retailer (POS) Data
Use
In SAP BW, you need to distinguish between different types of information.
When you extract POS data using the configurable CSV format, there is a DataSource for each data set. This table shows which DataSources are linked with which extract structures. The function module used for these is also /DDF/ADU_SAPDX_EXTRACT_GENERIC.
DataSource | Description | Extract Structure | Required Data Format
/DDF/PROD_ATTR_COL_POS | Product Attribute Column POS | /DDF/S_POS_PROD_ATTR_COL | Configurable CSV
/DDF/PROD_ATTR_NV_POS | Product Attribute Name/Value Pair POS | /DDF/S_POS_PROD_ATTR_NV | Configurable CSV
/DDF/LOC_ATTR_COL_POS | Location Attribute Column POS | /DDF/S_POS_LOC_ATTR_COL | Configurable CSV
/DDF/LOC_ATTR_NV_POS | Location Attribute Name/Value Pair POS | /DDF/S_POS_LOC_ATTR_NV | Configurable CSV
/DDF/SALES_POS | Sales POS | /DDF/S_POS_SALES | Configurable CSV
/DDF/STOCK_POS | Stock POS | /DDF/S_POS_STOCK | Configurable CSV
More Information
DataSources and Extract Structures for Retail Panel Data [page 139]
12.2.3.4 Defining File Sets
Procedure
1. In the Manage Agreements UI, search for the data delivery agreement that you want to configure.
This data delivery agreement should be inactive, have no mapping, and have configurable CSV format as the assigned data format.
2. Make the following settings using the input help for each of the files listed on the file set tab page:
○ Adjust the file name pattern that the system proposes.
○ Enter a description.
○ Assign data set types.
○ Enter separators.
○ Enter the number of header rows that should be skipped before the system starts reading the file data.
3. Optional: Enter the following:
○ Escape character
○ Optional indicator
○ Code page
○ Replacement character
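The file set settings above (separator, number of header rows, escape character) determine how a raw file is split into data rows. A minimal Python sketch of such a reader, outside the actual system; the function name and defaults are hypothetical:

```python
import csv

def read_file_rows(text, separator=",", header_rows=1, escape_char=None):
    """Read raw file content according to hypothetical file set settings:
    skip the configured number of header rows, then split the remaining
    data rows on the configured separator, honoring an optional escape
    character."""
    lines = text.splitlines()[header_rows:]
    reader = csv.reader(lines, delimiter=separator, escapechar=escape_char)
    return [row for row in reader]
```

For instance, a semicolon-separated file with one header row would be read with `separator=";"` and `header_rows=1`.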
More Information
For more information, see the field help on the UI.
12.2.3.5 Defining the File Format
1. In the UI for Manage Agreements, after configuring the files in the file set, go to the File Format tab page.
All the original external fields from the file are listed in the table, together with proposals for external fields that the system creates automatically. These proposed external fields are cleansed versions of the original external fields, which you can adjust yourself.
Note
The external field is used in further steps such as mapping and data harmonization. The external field must, therefore, pass the following system checks, which are executed upon saving:
○ Cannot have an empty value
○ Cannot be duplicated in the same file
○ The maximum field length is 30 characters
○ The field name contains valid characters for harmonization purposes
○ The field name uses upper case characters
Any of the following characters are considered to be valid: <ABCDEFGHIJKLMNOPQRSTUVWXYZ_0123456789#$%&*-/;<=>?@^{|}>
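The system checks listed in the note can be modeled compactly. This hypothetical Python validator mirrors the documented rules (non-empty, unique within the file, at most 30 characters, upper case, valid character set) for illustration only:

```python
# Valid characters as listed in the documentation.
VALID_CHARS = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ_0123456789#$%&*-/;<=>?@^{|}")

def check_external_fields(fields):
    """Apply the documented checks to a list of external field names and
    return a list of human-readable violations (empty list = all valid)."""
    errors = []
    seen = set()
    for name in fields:
        if not name:
            errors.append("empty field name")
            continue
        if name in seen:
            errors.append(f"duplicate: {name}")
        seen.add(name)
        if len(name) > 30:
            errors.append(f"too long: {name}")
        if name != name.upper():
            errors.append(f"not upper case: {name}")
        if not set(name) <= VALID_CHARS:
            errors.append(f"invalid characters: {name}")
    return errors
```

A field list such as `["PROD_ID", "COLOR"]` passes all checks, while an empty, duplicated, or lower-case name produces a violation per rule.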
2. The Original External Field, which comes from the incoming source files, is displayed for information purposes and cannot be edited. However, you can edit the External Field, which is filled automatically with a cleansed version of the original external field.
3. Select a conversion exit for localization of formatting.
When you select a specific file in a file set, the file preview shows the first 20 values of the selected file in table format.
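A conversion exit of this kind essentially normalizes locale-specific number formats. A minimal sketch, assuming configured decimal and thousand separators; the function name is hypothetical and not part of the actual system:

```python
def parse_localized_number(value, decimal_sep=",", thousand_sep="."):
    """Convert a localized numeric string (for example German '1.234,56')
    to a float, based on the configured decimal and thousand separators."""
    # Strip grouping separators first, then normalize the decimal mark.
    normalized = value.replace(thousand_sep, "").replace(decimal_sep, ".")
    return float(normalized)
```

The same routine handles both German-style ("1.234,56") and English-style ("1,234.56") input, depending on which separators are configured for the file.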
12.2.3.6 Mapping Configurable CSV Formatted Files
1. In the UI for Manage Agreements, go to the Mapping tab page and select a data set type that is not used for name/value pairs.
2. Select one of the following mapping options for each row in the table:
○ Attribute Value (Custom)
○ External Field Value
○ Fixed Value
For more information about these mapping options, see the individual field help.
3. Select a name/value pair data set type.
Note
Check the following when creating the mapping for name/value pair-related data set types:
○ All the attributes that have been marked as included for the name/value pair data set type are present in the attribute mapping for name/value pairs table
Check the following when creating the mapping for any data set types:
○ Only one assignment type has been made for each field
○ At least one key field has been mapped
○ The BAdI: Mapping Enhancements has been implemented if you are using the Attribute Value (Custom) mapping option
4. Save your changes.
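The three mapping options can be sketched as follows. This is an illustrative Python model in which the `('custom', …)` entry stands in for the value calculated by the BAdI implementation; all names are hypothetical:

```python
def apply_mapping(row, mapping, custom_funcs=None):
    """Resolve extractor field values from one source row, based on a
    hypothetical mapping definition. Each target field maps to one of:
      ('external', field) - value taken from the source row
      ('fixed', value)    - a constant configured in the agreement
      ('custom', name)    - a calculated value (in the real system this
                            comes from the BAdI /DDF/BADI_ADU_EXTR_MAPPING)
    """
    custom_funcs = custom_funcs or {}
    result = {}
    for target, (kind, arg) in mapping.items():
        if kind == "external":
            result[target] = row.get(arg)
        elif kind == "fixed":
            result[target] = arg
        elif kind == "custom":
            result[target] = custom_funcs[arg](row)
    return result
```

The check that only one assignment type exists per field corresponds here to each target having exactly one `(kind, arg)` entry.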
12.2.3.7 Handling Changes to Incoming Configurable CSV Formatted Files
The example scenarios below describe what you should do in the UI for Manage Agreements when changes are made to the configurable CSV formatted files.
Select the relevant data delivery agreement and click Edit.
Define a New File
New File Added to Data Delivery
File Set:
1. Click Insert.
2. Enter the file description.
3. Modify the file name pattern, assign data set types and separators, and make all other entries necessary for the new file.

File Format:
1. Select the file.
2. The original external field names are displayed in sequence.
3. The system proposes external field names that you can overwrite.

Mapping:
1. Select the data set type.
2. Select the external field names.

Mapping Name/Value Pairs:
1. Select the data set type.
2. Under Attribute Mapping for Name/Value Pairs, the system displays all the external field names in the same order as they are displayed on the File Format tab page.
Delete an Existing File
File No Longer Included in Data Delivery
File Set:
1. Click Delete.
2. The file is removed from the list.
3. Click Save.

File Format:
The file is no longer available for selection.

Mapping:
When you select the data set type, the external field names of the deleted file are no longer available and the mapping is no longer available.

Mapping Name/Value Pairs:
When you select the data set type, the attribute external field names of the deleted file are no longer available and the mapping is no longer available.
Change File Format
Changes to File Attributes (External Field Names)
File Format:
1. Select the file whose attributes you need to update.
2. Click Update External Fields.
3. A pop-up displays the new external field names that will be added to the file and the current external field names that will be disabled. Confirm or discard the proposed updates to external field names in the pop-up.

Mapping:
1. Select the data set type.
2. Map the newly updated external field names.

Mapping Name/Value Pairs:
1. Select the data set type.
2. Under Attribute Mapping for Name/Value Pairs, the system displays all the external field names in the same order as they are displayed on the File Format tab page.
Removal of File Attributes (External Field Names)
File Format:
1. Select the file whose attributes you need to update.
2. Click Update External Fields.
3. A pop-up displays the new external field names that will be added to the file and the current external field names that will be disabled. Confirm or discard the proposed updates to external field names in the pop-up.

Mapping:
1. Select the data set type.
2. Mapped extractor fields that were previously assigned to these external fields are unassigned, and the external field names are no longer available for selection.

Mapping Name/Value Pairs:
1. Select the data set type.
2. Under Attribute Mapping for Name/Value Pairs, the system displays all the external field names in the same order as they are displayed on the File Format tab page, and the deleted external field names are removed.
Disable External Field Names
File Format:
1. Select a field in the list and select Disable, or choose Update External Fields to have the system automatically detect that a previously present field is no longer present in the new file.
2. Click Update Preview; the file preview no longer contains the disabled external field name.

Note
If there are more columns in the actual file than in the file format, the system displays the header columns from the file as columns in the file preview.

Mapping:
1. Select the data set type.
2. Existing assignments of extractor fields to the disabled external fields are removed.

Mapping Name/Value Pairs:
1. Select the data set type.
2. Under Attribute Mapping for Name/Value Pairs, the system displays all the external field names in the same order as they are displayed on the File Format tab page, and the disabled external field names are removed.
Exclude External Field Names From Name/Value Pairs
File Format:
1. Select the file whose attributes you need to update.
2. Set the relevant field to Not Included.

Mapping Name/Value Pairs:
1. Select the data set type.
2. Under Attribute Mapping for Name/Value Pairs, the system displays all the external field names in the same order as they are displayed on the File Format tab page, and the excluded external field names are removed.
Re-Enable External Field Names
File Format:
1. Select a field in the list and deselect Disable, or choose Update External Fields to have the system automatically detect that a previously disabled field is now present in the new file.
2. Click Update Preview; the file preview contains the re-enabled external field name.

Mapping:
1. Select the data set type.
2. The re-enabled external field names are available again for selection.

Mapping Name/Value Pairs:
1. Select the data set type.
2. Under Attribute Mapping for Name/Value Pairs, the system displays all the external field names in the same order as they are displayed on the File Format tab page, and the re-enabled external field names are available again.
Modify External Field Names
File Format:
1. Change the proposed external field name.
2. Click Update Preview; the file preview contains the modified external field name.

Mapping:
1. Select the data set type.
2. The modified external field names are available for selection.

Mapping Name/Value Pairs:
1. Select the data set type.
2. Under Attribute Mapping for Name/Value Pairs, the system displays all the external field names in the same order as they are displayed on the File Format tab page, and the modified external field names are available.
12.2.4 One-Time Uploads Before Recurring Regular Data Uploads Are Started
Upload of Aggregated Sales or Stock Snapshot Data from Retailers
If no time master data is available in the propagation layer, time characteristics cannot be assigned properly to transactional data. Therefore, point-of-sale data from retailers is not uploaded from the acquisition layer to the propagation layer.
Before you upload aggregated sales or stock snapshot data of retailers, you upload time master data to the /DDF/TIME InfoObject as follows:
1. You define an InfoPackage with generic time data (daily and weekly) for DataSource /DDF/TIME_ATTR.
You specify the period of time for which generic time data with appropriate time characteristics as attributes is required in reporting by using the fields VALIDFROM and VALIDTO. We recommend doing this at the same time that you set up year-specific fiscal year variants, and that it covers a similar number of years into the future.
2. You execute the InfoPackage.
3. You execute the /DDF/TIME process chain to upload the time master data from the persistent staging area to the /DDF/TIME InfoObject.
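The generic daily time data loaded here can be pictured as one record per calendar day between VALIDFROM and VALIDTO, carrying time characteristics as attributes. An illustrative Python sketch; the field names are modeled loosely on BW time characteristics and are not the actual /DDF/TIME structure:

```python
from datetime import date, timedelta

def generate_time_records(valid_from, valid_to):
    """Generate one record per day between VALIDFROM and VALIDTO with
    basic time characteristics (calendar day, ISO week, calendar year),
    roughly analogous to the generic time data loaded into /DDF/TIME."""
    records = []
    day = valid_from
    while day <= valid_to:
        iso_year, iso_week, _ = day.isocalendar()
        records.append({
            "CALDAY": day.strftime("%Y%m%d"),
            "CALWEEK": f"{iso_year}{iso_week:02d}",
            "CALYEAR": str(day.year),
        })
        day += timedelta(days=1)
    return records
```

Generating the records several years into the future, as recommended above, is then just a matter of choosing a suitably distant VALIDTO.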
Upload of Aggregated Sales Data from Retailers
Before you upload aggregated sales data of retailers, you upload the types of sales data that are defined in Customizing under Cross-Application Components Demand Data Foundation Data Model Define Types of Sales Data to the /DDF/SALGRP InfoObject as follows:
1. You execute the Type of Sales Data Attributes InfoPackage.
2. You execute the Type of Sales Data Texts InfoPackage.
3. You execute the /DDF/SALGRP_ATTR -> /DDF/SALGRP data transfer process.
4. You execute the /DDF/SALGRP_TEXT -> /DDF/SALGRP data transfer process.
Upload of Stock Snapshot Data from Retailers
Before you upload stock snapshot data from retailers, you upload the stock types that are defined in Customizing under Cross-Application Components Demand Data Foundation Data Model Define Stock Types to the /DDF/STOCKTYPE InfoObject as follows:
1. You execute the Stock Types InfoPackage.
2. You execute the /DDF/STOCKTYPE_TEXT -> /DDF/STOCKTYPE data transfer process.
Upload of Market Research Data
If you have defined regions for market research data, you must upload these to the /DDF/MRCOUNTRY InfoObject before you can upload market research data.
1. You create an InfoPackage for DataSource /DDF/MRCOUNTRY_TEXT.
2. You execute the data transfer process /DDF/MRCOUNTRY_TEXT -> /DDF/MRCOUNTRY.
If you use global reporting of market research data, you assign countries as attributes of global locations. In order to be able to report retail panel data by these countries, you have to upload the country texts as follows:
1. If the data flow below InfoObject 0COUNTRY does not appear in the list of InfoSources in transaction RSA1, you first have to install the transfer structure of the corresponding data flow from the content. To do so, proceed as follows:
1. You migrate DataSource 0COUNTRY_TEXT using transaction RSDS.
2. In transaction RSOR, you choose Transfer Rules under 3.x Types in the list of all object types.
3. You select DataSource 0COUNTRY_TEXT and make sure that you have selected Only Necessary Objects as the grouping criterion.
4. You choose Install.
2. In transaction RSA1, you create an InfoPackage for DataSource 0COUNTRY_TEXT. On the Processing tab, you choose PSA and then into InfoObject (Package by Package).
3. You schedule the InfoPackage with the update method Full Update.
Upload of Transactional Data (POS Data or Market Research Data)
If you want to use semantic partitioning of DataStore objects by defining partitions for semantically partitioned objects (SPO), you first have to upload the texts of the partition values to InfoObject /DDF/DDA_PART.
Execute the data transfer process /DDF/PARTVAL_TEXT/SRDCLNT001 to /DDF/DDA_PART.
12.2.5 Setting Up Automatic Upload with Folder Scanner
Use
You use the folder scanner to detect new file-based data automatically.
Activities
1. Allocate the appropriate disk space for the volume of incoming data you expect to receive.
2. Configure the file system and the folders to enable the correct file movement as files move through the inbound, process, and archive folders.
3. Check that all the file directories have the appropriate access, modification, and execution privileges (for example, read, copy, rename, and so on).
4. Configure a data delivery agreement to set how the system should detect and upload new files.
5. Define an active data delivery agreement with the data upload method Automatic Upload with Folder Scanner and the data format Comma-separated values (CSV) or Self-descriptive exchange format (SDX). Only files for active data delivery agreements are detected.
6. Configure the folder scanner job and the process scheduler in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings Configure Jobs and ensure that the jobs have the status Started.
12.2.5.1 Scanning Static CSV Files
When the folder scanner scans the inbound folder, it identifies which data deliveries are complete by checking that all the mandatory Comma-Separated Values (CSV) files are present.
It detects which files are mandatory from the settings for file sets and files. Those files for which the Optional checkbox has not been selected in the file definition Customizing are considered by the folder scanner to be mandatory and must therefore be present for the delivery to be considered as complete.
Files that are associated with steps in the file definition Customizing are moved to the file processing folder. Files that are not associated with a step in the file definition Customizing are skipped. Their status is changed from New to Skipped. For example, if you use files with a file extension that marks them as being completely uploaded or finished (by using a file extension .fin, for example), the status of those files does not change to Ready.
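The completeness rule above reduces to a set check: every file not flagged as Optional must be present in the inbound folder. A minimal sketch with hypothetical names:

```python
def delivery_complete(file_defs, present_files):
    """A delivery is complete when every file that is not marked
    Optional in the file definition is present in the inbound folder.

    file_defs: list of (file_name, is_optional) tuples
    present_files: file names currently in the inbound folder
    """
    mandatory = {name for name, optional in file_defs if not optional}
    return mandatory <= set(present_files)
```

Optional files, such as a sales file that is not always delivered, never block the delivery from being considered complete.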
12.2.5.2 Scanning SDX Files
Use
When the folder scanner scans the inbound folder, it identifies which data deliveries are complete by checking that all the mandatory files are present.
When data is delivered in the self-descriptive exchange (SDX) format, the folder scanner detects a data delivery as being complete when the metafile is present along with all the files described in the metafile.
A valid metafile must have a format and a dimension file.
Files in the SDX format are not associated with any steps in the file definition.
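The SDX completeness rule can be sketched the same way: the delivery is complete once the metafile is present and every file it describes is present too. Since the metafile format is not detailed here, its parsing is passed in as a stand-in function; all names are hypothetical:

```python
def sdx_delivery_complete(inbound_files, metafile_name, read_metafile):
    """An SDX delivery is complete when the metafile is present and every
    file it describes is also present in the inbound folder.

    read_metafile: callable returning the list of file names described
    in the metafile (a stand-in for the actual metafile parsing).
    """
    if metafile_name not in inbound_files:
        return False
    described = read_metafile(metafile_name)
    return set(described) <= set(inbound_files)
```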
More Information
Self-Descriptive Exchange Format [page 134]
12.2.5.3 Example: Folder Scanner Failure
The system may not be able to detect and upload incoming data deliveries for various reasons. Here is a checklist of possible reasons for data detection failure:
Reason: You have insufficient disk space.
Action Required: Running out of disk space can prevent the load manager from completing operations such as data detection. Free disk space by deleting files, for example. For more information about archiving, deletion, and housekeeping, see Housekeeping [page 91].

Reason: The file system and folders have not been configured correctly in the following Customizing settings:
● Logical File Path (transaction FILE)
● Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings General Settings
Action Required: For more information about how to set up the file system and folders correctly, see the following documentation:
● Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings General Settings
● SAP Demand Signal Management Application Operations Guide on SAP Service Marketplace

Reason: The jobs are not running.
Action Required:
● Check your Customizing settings for the daemon framework in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings Configure Jobs.
● Check the Monitor Jobs to determine the job statuses.

Reason: The same data delivery comes twice.
Action Required: The data is detected as duplicate data and an error message is logged in the Monitor Deliveries UI. The duplicate files are not uploaded.

Reason: The files do not match the file name pattern.
Action Required: If you configured a pattern for your files (for example, DOC++++++.csv), files with an incorrect pattern are not processed. The file name pattern must be adjusted manually. Check your file name pattern in the settings for files and file sets.

Reason: The same file name pattern was used twice.
Action Required: If the delivery instance table already contains a delivery with file name pattern 12345 and a second data delivery arrives for the same data delivery agreement with the file name pattern 12345, the load manager ignores the data delivery. Only one data delivery with the file name pattern that matches the file name pattern associated with the relevant data delivery agreement is allowed.

Reason: The files are detected and marked as being ready for upload. Some of the files are uploaded successfully. However, one of the remaining files has failed to be uploaded.
Action Required: Not all mandatory files were delivered. No more files can be uploaded until all mandatory files are present. The scanner keeps checking for a complete data delivery.

Reason: Missing authorization.
Action Required: Contact your system administrator to gain full access to all the sub-folders that are relevant for the data delivery.
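The file name pattern check mentioned in the checklist can be pictured as a translation of the pattern into a regular expression. This sketch assumes each '+' placeholder stands for exactly one arbitrary character; that placeholder semantics is an assumption for illustration, not taken from the source:

```python
import re

def matches_pattern(file_name, pattern):
    """Check a file name against a pattern such as 'DOC++++++.csv',
    assuming each '+' stands for exactly one arbitrary character."""
    regex = "^" + "".join(
        "." if c == "+" else re.escape(c)  # literal chars are escaped
        for c in pattern
    ) + "$"
    return re.match(regex, file_name) is not None
```

Files whose names do not match any configured pattern would simply be left unprocessed, as the checklist describes.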
12.2.6 Setting Up Automatic Upload with PSA Scanner
Use
You use the Persistent Staging Area (PSA) scanner to detect new data in the Service Delivery PSA.
Activities
1. Define an active data delivery agreement with the data upload method Automatic Upload with PSA Scanner and the data format Tables.
2. Define a delivery DataSource for the data delivery agreement.
If you do not define a delivery DataSource name, the default delivery DataSource is used to detect any new data deliveries for this data delivery agreement. The default delivery DataSource for the parameter DELIVERY_INFO_DATASOURCE_NAME is defined in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings General Settings.
3. Configure the PSA scanner job in Customizing for Cross-Application Components under Demand Data Foundation Data Upload Basic Settings Configure Jobs and ensure that the jobs have the status Started.
You can configure how frequently you want the PSA scanner to scan the Service Delivery PSA for new data.
12.2.7 Manual Data Upload
Use
You upload data manually in the following cases:
● You want to upload company-internal data from an ERP system like SAP ERP
● The folder or Persistent Staging Area (PSA) scanner fails
● You want to do a test upload for data from a new DataSource, such as a new retailer
SAP NetWeaver Business Warehouse (SAP NetWeaver BW) allows you to upload data by executing the appropriate InfoPackage manually.
Prerequisites
You have configured the following:
● An active data delivery agreement
You may wish to define a separate data delivery agreement for each individual data set: one agreement for product, one agreement for location, and one for sales data. You may also reuse an existing data delivery agreement.
● The process scheduler job
Ensure that it has the status In Process so that subsequent steps can be triggered automatically by the process scheduler.
● The process definition
Ensure that it has a step with a process chain assigned for transferring data from the Acquisition DataStore object (DSO) to the Propagation DSO.
● The step definition
You must assign the step to an upload stage for manual uploads. Ensure that you select the correct upload stage setting, as this setting determines which steps should and should not be instantiated in the manual upload.
● If you upload data by manually triggering a process chain and you want the subsequent steps to be triggered automatically by the process flow control (PFC), you must assign the process type /DDF/STATU at the end of the process chain.
● If you want to use the same data delivery agreement for a manual upload as you have used for an automatic upload with the folder scanner, you have to define upload stages in Customizing.
Process
1. You log on to SAP NetWeaver BW Data Warehousing Workbench (transaction RSPC) and navigate to the appropriate InfoPackage.
2. You trigger the data upload by doing one of the following:
○ You execute the InfoPackage to trigger the data upload to the relevant PSA.
If you upload a file, you must include the file name in the InfoPackage file name routine. If you upload company-internal data from SAP ERP, you use an SAP ERP DataSource. The PSA gets the relevant data delivery agreement from the file or from SAP ERP.
○ You execute the Data Transfer Process (DTP) to do one of the following:
○ Bring the data from a PSA request to the Acquisition DSO.
○ Bring the data from any DSO to the Acquisition DSO.
○ You execute the process chain that brings the data to the Acquisition DSO.
The process chain must have the process type /DDF/STATU at the end of it.
3. The system creates a delivery ID and a process ID.
4. The system also creates an additional step with step number 000000 and initial status Manual.
This status is for the step only and is not propagated. While the step has the status Manual, you can find it in the Monitor Deliveries UI under the status category Running for the related process. If you did the manual upload without a process chain, this step has neither a process chain nor a process chain log ID. You must set the status manually to Completed, as there is no PFC interaction. If you used a process chain in the manual upload, this step is instantiated with a process chain and a process chain log ID. The process type /DDF/STATU sets the step status to Completed.
5. The system creates the subsequent steps.
Using the delivery and process created in the previous step, the subsequent steps, which are assigned to upload stages other than Data Acquisition, are created with the status Ready. The PFC executes the subsequent steps once the additional step (with step number 000000 and initial status Manual) has the status Completed.
6. You check in the Monitor Deliveries UI whether the upload was successful.
Example
If you want to use the same data delivery agreement for a manual upload as you have used for an automatic upload with the folder scanner, you have to define upload stages in Customizing as displayed in the following table.
Step Attribute Value
Agreement Type Retailer Data
Upload Method Automatic Upload With Folder Scanner
Data Format Comma-Separated Values
The process definition has the following steps assigned:
● PRODUCT_FILE with upload stage Data Acquisition
● LOCATION_FILE with upload stage Data Acquisition
● PRODUCT_STAGE with upload stage Quality Validation
● LOCATION_STAGE with upload stage Data Propagation
You decide you want to upload the product file manually and you can use the same data delivery agreement for this.
The system creates an additional step with step number 000000 and status Manual for the manual upload, and instances for the steps that are not relevant for data acquisition (PRODUCT_STAGE and LOCATION_STAGE).
The system does not create an instance for the step LOCATION_FILE because you want to upload only the PRODUCT_FILE in the manual upload.
Although no location data is uploaded, the step LOCATION_STAGE is nevertheless executed. This does not cause any conflicts because the Acquisition DataStore object does not contain any location data with the delivery ID that was created from the manual upload.
More Information
Data Reload [page 184]
12.2.8 Checking Completeness of Data Deliveries
Use
You can check in the Monitor Deliveries UI whether a process was created. You can also see whether the status of the process has changed to Ready. If the status is Ready, all mandatory content (files or data sets) is present and ready to upload.
The criteria for a data delivery to be considered as complete depend on the data format and on the tool used for detecting the data:
● Self-descriptive exchange (SDX) format for retail panel data
A data delivery in SDX format is considered complete when the metafile and all the files described in the metafile are present in the inbound folder.
● Comma-Separated Values (CSV) file format for point-of-sale (POS) data
A data delivery in CSV format is considered complete when all the mandatory files are present in the inbound folder, as defined in Customizing. If you receive large volumes of files and you always receive the files in the same sequence in the inbound folder, you can indicate that the data delivery is complete, for example, by defining a mandatory empty file (with file extension .fin) that is used as a marker that the data delivery is complete. The folder scanner detects when all the mandatory files are available in the inbound folder. Since you cannot guarantee the correct sequence when you manually copy or transfer files to the inbound folder, you can use an automatic mechanism to ensure that all files are transferred before the mandatory empty file (.fin) is transferred.
Example
You use an empty file with file extension .fin as a marker for completeness, as in the following table:
File Number File Name Pattern Optional
10 TopMarket++++++++.loc Yes
20 TopMarket++++++++.pro No
30 TopMarket++++++++.sal Yes
40 TopMarket++++++++.stk Yes
50 TopMarket++++++++.fin No
● The PSA scanner considers a data delivery to be complete once all the mandatory data sets are present in the PSA.
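The automatic mechanism recommended for the .fin marker boils down to ordering: transfer every data file first and the completeness marker last, so the folder scanner can never see a "complete" delivery before all data files have arrived. A trivial sketch of that ordering rule, with hypothetical names:

```python
def transfer_order(data_files, fin_file):
    """Return the order in which files should be moved to the inbound
    folder: all data files first, the empty .fin completeness marker
    last."""
    return [f for f in data_files if f != fin_file] + [fin_file]
```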
More Information
For more information about how you can influence the system detecting whether a data delivery is complete or not, see the documentation in Customizing for Cross-Application Components under Demand Data Foundation Business Add-Ins (BAdIs) BAdI: Data Delivery Enhancements .
12.2.8.1 Errors in Data Deliveries
Use
If a file causes a severe error, such as a core dump, the folder scanner aborts but starts again automatically.
If there is a severe error in the data delivery, the Persistent Staging Area scanner aborts but starts again automatically.
You can see what errors have occurred by checking the Monitor Deliveries UI, or in the case of jobs, the Monitor Jobs UI, where all errors for data uploads are displayed. The errors for the following are displayed:
● Processes● Steps● Process chains● Jobs● General system errors
More Information
Monitor Deliveries [page 180]
12.2.8.2 Data Consistency
Use
In the rare case that discrepancies occur between object statuses in different instance tables, you can execute a report (/DDF/ADU_CANCEL_INSTANCE) that sets the instance statuses to Canceled.
It is necessary to run this report because otherwise the data upload process is blocked.
The report uses the data delivery ID to select all the related instance statuses.
You can choose one of the following actions:
● Cancel delivery
You can then reload the data delivery.
● Archive delivery
Use this option if there are discrepancies in the system, such as files having been deleted, that prevent the Administrative Tasks job from marking the instances for archiving. In this case, you can run this report to archive the instances that could not be archived by the job. The prerequisite for running the report with this setting is that the delivery status is set to Canceled or Completed.
If you select both checkboxes in the report, the system first sets the instance statuses to Canceled and then marks the instances for archiving.
More Information
For more information about this report, see the documentation of the report Set Instance Statuses to Canceled in the system.
Data Reload [page 184]
12.2.9 Archiving
Use
To archive data upload instances in SAP Demand Signal Management, version for SAP BW/4HANA, implement your SAP Business Warehouse (SAP BW) archiving strategy. Data and objects should therefore be extracted to SAP BW into DataStore objects (DSOs) that have SAP BW time characteristics (0CALDAY, for example). You set up the archiving of your data in transaction Edit Data Archiving Process (RSDAP).
You use the Clean Archived Instances job to delete objects from the system. We recommend deleting objects after 1 to 2 years; the standard parameter is 2 years (730 days). There is no reason to delete the objects any earlier. Once the objects are deleted, the Monitor Deliveries and Data Delivery Monitor no longer display them, and inconsistencies occur if parts of the upload process or data harmonization process still try to access the deleted objects. To access historical data that has already been cleaned up from tables in SAP Demand Signal Management, version for SAP BW/4HANA, implement BI reports.
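The retention cutoff described above can be sketched as follows. This is a simplified illustration in Python, not SAP code; the function name is hypothetical and only demonstrates the 730-day default.

```python
from datetime import date, timedelta

def cleanup_cutoff(today, retention_days=730):
    """Objects delivered before this date are candidates for deletion
    by the Clean Archived Instances job (standard parameter: 730 days)."""
    return today - timedelta(days=retention_days)

# With the standard parameter, objects older than roughly two years
# relative to the current run date are deleted.
print(cleanup_cutoff(date(2018, 11, 30)))
```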
More Information
Clean Archived Instances Job [page 93]
12.3 Configuring Quality Validation
Use
The system performs the quality validation step using sequences of functions that are defined in Customizing. Each function represents a key figure for quality validation.
You make key figure-specific settings and define, for each data delivery agreement, which sequences are used at specified execution times.
Note
You can use Business Add-In BAdI: Default Values for Set Up of Quality Validation (/DDF/BADI_QV_DEFAULT) to define default values that are only taken into account if no other settings have been specified.
Prerequisites
You have made the following settings for quality validation in Customizing for Cross-Application Components under Demand Data Foundation Quality Validation :
● Set Up Quality Validation
● Define Number Range for Key Figure Detail Data
You have defined data delivery agreements. To define data delivery agreements, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Define Data Delivery Agreements on the SAP Easy Access screen.
Activities
1. On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Quality Validation > Configure Quality Validation, or use transaction Configure Quality Validation (/DDF/QV_DEF).
2. Select the point in time at which the system executes sequences of functions (view Select Execution Time).
In the standard system, the following execution times are used:
○ After Data Acquisition
This execution time is used for all standard sequences.
○ Before Data Extraction to Global Reporting
This execution time is used for data delivery agreements that deal with market research data that is uploaded for global reporting.
3. Select a data delivery agreement, and assign a sequence of functions that is defined in Customizing (view Assign Sequence to Data Delivery Agreement).
You can define different settings for each data delivery agreement that is available in the system. If you do not enter a data delivery agreement, this setting is used as the default for all data delivery agreements for which no settings are defined.
4. For each function contained in the selected sequence, make the following key figure-specific settings (view Define Key Figure-Specific Settings):
1. Specify the key figure type.
○ An absolute key figure contains an absolute value, for example, the number of locations in a data delivery.
○ A relative key figure contains a relative value, for example, the percentage of new or unknown locations relative to the total number of locations in this data delivery.
2. Define whether the key figure is comparative.
A comparative key figure evaluates the data quality of a data delivery by comparing the data of the current delivery to a specified number of historic data deliveries. For a comparative key figure, you make the relevant comparison settings in Customizing.
You can use Business Add-In BAdI: Number of Deliveries for Comparison in Quality Validation to define, per data delivery agreement, how many data deliveries are considered for key figures that compare data of multiple data deliveries.
3. Define the lower and upper thresholds for the value that the function calculates for the respective key figure.
If the value for the key figure lies between the specified thresholds, the data quality is considered sufficient with regard to this key figure.
4. Define the status of the quality validation step that is set in the following cases:
○ The thresholds for this key figure are violated.
○ The key figure could not be calculated.
You can choose one of the following statuses:
○ Error - Stop Processing
○ Warning - Continue Processing
○ Success - Continue Processing (No Message)
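The threshold and status logic in steps 3 and 4 can be sketched as follows. This is a simplified illustration in Python, not SAP code; the function and variable names are hypothetical and do not correspond to objects in the system.

```python
# Sketch of the quality validation threshold logic described above:
# a key figure value between the thresholds means sufficient quality;
# a violation (or a value that cannot be calculated) yields the
# configured violation status.

STATUSES = ("Error - Stop Processing",
            "Warning - Continue Processing",
            "Success - Continue Processing (No Message)")

def evaluate_key_figure(value, lower, upper, violation_status):
    """Return the quality validation status for one key figure.

    value            -- value calculated by the key figure function,
                        or None if it could not be calculated
    lower, upper     -- configured lower and upper thresholds
    violation_status -- status configured for threshold violations
    """
    if value is None:                # key figure could not be calculated
        return violation_status
    if lower <= value <= upper:      # data quality is sufficient
        return STATUSES[2]
    return violation_status          # thresholds are violated

# A relative key figure: at most 10% new or unknown locations allowed.
print(evaluate_key_figure(4.0, 0.0, 10.0, STATUSES[1]))
print(evaluate_key_figure(25.0, 0.0, 10.0, STATUSES[1]))
```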
12.4 Configuring and Administering Data Harmonization
Use
The following table shows the tasks you have as an administrator for data harmonization, and explains in which cases and where you make the individual settings. Depending on your system landscape, you make Customizing settings in the development system, and all other settings in the production system.
Task: Set up data harmonization in Customizing
When: Implementation phase
Where: Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization

Task: Define appearance of object attributes on user interfaces
When: Implementation phase
Where: Customizing activity Cross-Application Components > Demand Data Foundation > Data Harmonization > Technical Settings > Make Attribute-Specific Settings
When: Implementation phase; whenever a new data origin is introduced in the system for which you do not want to use the default settings
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization (/DDF/FDH_SETUP)

Task: Set up automatic harmonization
When: Implementation phase
Where: Customizing activity Cross-Application Components > Demand Data Foundation > Data Harmonization > Technical Settings > Define Sequences for Data Harmonization
When: Implementation phase; whenever a new data origin is introduced in the system for which you do not want to use the default settings
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization (/DDF/FDH_SETUP)

Task: Define priorities for copying attribute values
When: Whenever you want to change the configuration for object attributes (for example, because new object attributes were implemented); whenever a new data origin is introduced in the system for which you do not want to use the default settings
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization (/DDF/FDH_SETUP)

Task: Define restrictions for harmonized attribute values
When: Implementation phase; whenever you want to change which object attributes are considered (for example, because new object attributes were implemented)
Where: UI Select Attributes; UI Define Allowed Attribute Values

Task: Define harmonization groups and assign them to data delivery agreements
When: Implementation phase; whenever a new data delivery agreement is introduced in the system for which you do not want to use the default settings
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization (/DDF/FDH_SETUP)

Task: Schedule background jobs
When: Implementation phase
Where: Scheduling Background Jobs for Data Harmonization [page 235]

Task: Activate conversion exits for Product and Location
When: Implementation phase
Where: Activate Conversion Routines for MATERIAL, EANUPC, GLN and DATA_LOAD_ID [page 236]

Task: Activate the action log
When: Implementation phase
Where: Activate the Action Log [page 237]

Task: Activate DOASO
When: Implementation phase
Where: Activate Data Origin per Attribute for Source Objects [page 238]

Task: Activate distinct ABBIDs
When: Implementation phase
Where: Activate Distinct ABBIDs in Data Harmonization UIs [page 239]

Task: Activate the action log and DOASO for user-created object types
When: Implementation phase
Where: Activate the Action Log and DOASO for User-Created Object Types [page 240]

Task: Maintain general harmonization parameters
When: Implementation phase
Where: Maintaining General Harmonization Parameter [page 241]
Note
Access to Web user interfaces (Web UIs) is granted by authorization roles (PFCG roles) that contain the Web UI in the assigned menu.
To be able to access Web UIs, for example, from the user menu or in the SAP NetWeaver Business Client (NWBC), you must have a corresponding authorization role assigned to your system user.
For more information, see Roles for SAP Demand Signal Management, version for SAP BW/4HANA [page 530].
12.4.1 Set Up Data Harmonization
Use
You use transaction Set Up Data Harmonization /DDF/FDH_SETUP to make the following configuration settings:
1. Define appearance of object attributes on user interfaces
2. Set up automatic harmonization
3. Define priorities for copying attribute values
4. Define harmonization groups and assign them to data delivery agreements
On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization.
More Information
Defining the Appearance of Object Attributes [page 221]
Setting Up Automatic Harmonization [page 223]
Defining Priorities for Copying Attribute Values into Harmonized Objects [page 224]
Defining Harmonization Groups [page 230]
12.4.2 Defining the Appearance of Object Attributes
Use
For each object type, you define how object attributes are displayed on the following Web Dynpro user interfaces:
1. Mapping UI
2. Detail UI for source object
3. Detail UI for harmonized objects
4. Search UI for source object
5. Search and mass change UI for harmonized object
6. POWL
You can define different settings for each data origin that is available in the system. If you do not enter a data origin for a specific setting, this setting is used as the default for all data origins for which no settings are defined.
Note
You can use Business Add-In BAdI: Default Values for Set Up of Data Harmonization (/DDF/BADI_FDH_HARMON_DEFAULTS) to define default values that are used when no values have been defined under Set Up Data Harmonization (/DDF/FDH_SETUP).
Prerequisites
1. You have defined object types for which harmonization is turned on in Customizing for Cross-Application Components under Demand Data Foundation > Data Harmonization > Technical Settings > Define Object Types (see Object Types Used in Data Harmonization [page 211]).
2. You have defined attribute bundles, groups, and sets that control which attributes are displayed on the Web Dynpro user interfaces, and with which properties, in Customizing for Cross-Application Components under Demand Data Foundation > Data Harmonization > Technical Settings > Make Attribute-Specific Settings.
3. You have defined data origins. To define data origins, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Data Origins on the SAP Easy Access screen, or use transaction Define Data Origins (/DDF/DORIGIN).
Activities
1. On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization (/DDF/FDH_SETUP).
2. On view Assign Attribute Sets to Attribute Values, assign attribute sets to attribute values.
For example, you can use different attribute sets for different product categories. For each object type, you select one attribute that is used to select the relevant attribute set and assign the attribute sets to specific values.
If a data origin uses specific attribute values that are not covered by the standard values, or if the same value means different things for different data origins, you can make this assignment dependent on the data origin.
Example
The attribute Product Category is defined as the selecting attribute. You are working with the product categories Food and Beverages.
You define different attribute sets FOOD and BEVERAGE with the relevant attributes and assign these attribute sets to the product categories.
Data origin 1 uses the attribute value Liquids for the product category Beverages. You assign the attribute set BEVERAGE to the attribute value Liquids.
3. On view Select Object Type, select the object type.
4. On view Select Object Attributes, select an attribute.
5. On view Define Alternative Descriptions of Attributes, define alternative descriptions for attributes.
For each data origin, you can define how attributes are named in this area. If you then choose a specific data origin in the user-specific parameters, all field labels are displayed using the description defined for this data origin (see User-Specific Parameters [page 217]).
12.4.3 Setting Up Automatic Harmonization
Use
You define, for each object type, how automatic data harmonization is performed.
You can define different settings for each data origin that is available in the system. If you do not enter a data origin for a specific setting, this setting is used as the default for all data origins for which no settings are defined.
Note
You can use Business Add-In BAdI: Default Values for Set Up of Data Harmonization (/DDF/BADI_FDH_HARMON_DEFAULTS) to define default values.
Prerequisites
You have defined object types for which harmonization is turned on in Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization > Technical Settings > Define Object Types (see Object Types Used in Data Harmonization [page 211]).
You have defined mapping instructions that control how attributes are imported, converted, and exported before and after harmonization in Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization > Technical Settings > Define Object Types.
You have defined the sequences for automatic harmonization in Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization > Technical Settings > Define Sequences for Data Harmonization.
You have defined data origins. To define data origins, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Data Origins on the SAP Easy Access screen, or use transaction Define Data Origins (/DDF/DORIGIN).
Activities
1. On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization (/DDF/FDH_SETUP).
2. On view Select Object Type, select the object type for which you want to change the settings.
3. On view Define General Parameters, define the following:
○ Specify whether the system automatically harmonizes data provided by a selected data origin at all.
○ Specify whether the system automatically creates new harmonized objects for all objects that cannot be mapped to an existing harmonized object.
○ Specify whether the system creates a work item for approval whenever data provided by the specified data origin is mapped automatically.
4. On view Set Up Automatic Harmonization, select the harmonization sequence that is used to determine similar objects for objects provided by the specified data origin, and define the minimal score that a record must reach to be included in automatic harmonization.
○ The sequence that is defined for the target record type Source is only used to determine mapping proposals on the Web user interface for manual mapping.
○ The sequence that is defined for the target record type Harmonized is used to determine mapping proposals on the Web user interface for manual mapping and for automatic mapping of source objects to harmonized objects.
5. On view Define Weighting for Harmonization Score, define how the individual functions of the selected harmonization sequence are weighted for the calculation of the score.
6. If you want to use different mapping instructions for attributes provided by specific data origins, assign the mapping instruction to the data origin.
1. On view Select Mapping Instruction Sets, select the set that contains the mapping instruction you want to use.
2. On view Assign Mapping Instructions to Data Origins, choose the data origins using the input help and assign a mapping instruction. If you do not enter a data origin for a specific setting, this setting is used as the default for all data origins for which no settings are defined.
For each data origin, you can use only one mapping instruction, but you can assign one mapping instruction to multiple data origins.
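The scoring logic in steps 4 and 5 can be sketched as follows. This is a hypothetical illustration in Python of how a weighted harmonization score and a minimal-score cutoff could work; function names, the 0-100 scale, and the data structures are assumptions, not the actual implementation.

```python
# Sketch: combine the scores of the individual functions of a
# harmonization sequence into one weighted score, then keep only
# records that reach the configured minimal score.

def harmonization_score(function_scores, weights):
    """Weighted average of the scores returned by the individual
    functions of a harmonization sequence (each score in 0..100)."""
    total_weight = sum(weights[name] for name in function_scores)
    return sum(score * weights[name]
               for name, score in function_scores.items()) / total_weight

def candidates_for_automatic_harmonization(records, weights, minimal_score):
    """Keep only records whose combined score reaches the minimal score."""
    return [rec for rec in records
            if harmonization_score(rec["scores"], weights) >= minimal_score]

# Hypothetical similarity functions and weights.
weights = {"name_similarity": 2.0, "gtin_match": 3.0}
records = [
    {"id": "SRC-1", "scores": {"name_similarity": 90, "gtin_match": 100}},
    {"id": "SRC-2", "scores": {"name_similarity": 40, "gtin_match": 0}},
]
print([r["id"] for r in candidates_for_automatic_harmonization(records, weights, 80)])
```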
12.4.4 Defining Priorities for Copying Attribute Values into Harmonized Objects
Use
For each object type, you define the priorities with which attribute values are copied from source objects into the harmonized objects.
If you also define a derivation instruction for harmonized objects for a specific attribute, the attribute values are derived before the priorities are taken into account.
You can define different settings for each data origin that is available in the system. If you do not enter a data origin for a specific setting, this setting is used as the default for all data origins for which no settings are defined.
Note
You can use Business Add-In BAdI: Default Values for Set Up of Data Harmonization (/DDF/BADI_FDH_HARMON_DEFAULTS) to define default values that are used when no settings have been made under Set Up Data Harmonization (/DDF/FDH_SETUP).
Prerequisites
You have defined and activated object types that support source and harmonized record types, for which harmonization is turned on, and for which the Copy Attribute Values option is active, in Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization > Technical Settings > Define Object Types (see Object Types Used in Data Harmonization [page 211]).
You have defined data origins. To define data origins, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Data Origins on the SAP Easy Access screen, or use transaction Define Data Origins (/DDF/DORIGIN).
Activities
1. On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization (/DDF/FDH_SETUP).
2. On view Select Object Type, select the object type for which you want to change the settings.
3. On view Prioritize Data Origins, define the default priorities of data origins that are used if no specific settings are defined for the individual attributes.
If attribute values are provided by multiple data origins, the value with the highest priority is copied to the harmonized object during automatic harmonization. If, at a later point in time, a value is provided by a data origin with an even higher priority, the attribute value of the harmonized object is overwritten.
You can also exclude a data origin from attribute value harmonization in general. This means that attribute values provided by this data origin are never copied to a harmonized object. If, for example, the data quality of a specific data origin is insufficient, you can exclude the data origin or define a low priority.
You can decide whether initial values of attributes are taken into account. You can prevent the copying of initial values per data origin by selecting the No Initials checkbox. In this case, initial values are ignored even if they have the highest priority. The system determines the next valid source record with the highest priority and copies its value.
4. On view Prioritize Data Origins per Attribute, narrow down the prioritization for specific attributes.
You can define the data origin priorities for each attribute that is contained in the object structure.
5. On view Prioritize Data Origins per Attribute Value, narrow down the prioritization for specific attribute values.
The priorities defined at the data origin or attribute level can be overruled by priorities defined at the attribute value level. You can enter only negative values in order to reduce the priority to map the attribute value to the level 2 harmonized record.
6. If you change the configuration in the production system, make sure that the new priorities are applied to the harmonized objects by using report Recalculate Priorities of Data Origins for Attribute Values (/DDF/FDH_RECALC_ATTRIB_VALUES).
On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Harmonization > Recalculate Priorities per Attribute, or use transaction Recalculate Priorities per Attribute (/DDF/FDH_RECALC_ATTV).
7. On view Define Work Item Creation for Attribute Changes, define in which cases the system creates a work item for manual approval of attribute changes whenever the value of a specified attribute of a harmonized object is changed.
Note: Work items of type Copy Attribute Values are created if the attribute values of saved harmonized objects are changed; changes to new harmonized objects are not taken into account.
You have the following options:
○ A work item for manual approval is always created.
○ A work item is only created if the priority of the data origin that provides the attribute value is lower than (that is, has a number equal to or higher than) the specified threshold value.
For example, if you specify the threshold 20, the system creates work items for approval for data origins with priority 20 or higher, whereas changes caused by data origins with a priority up to 19 are performed automatically.
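The priority rules described above can be sketched as follows. This is a hypothetical Python illustration, not SAP code: a lower priority number means a higher priority, initial (empty) values can be skipped per data origin via No Initials, and the work item threshold from the example (20) is modeled directly. All names and structures are assumptions.

```python
# Sketch of priority-based copying of attribute values into a
# harmonized object, including the No Initials option and the
# work item threshold for manual approval.

def pick_attribute_value(candidates, priorities, no_initials, excluded):
    """candidates: {data_origin: value}; priorities: {data_origin: int},
    where a lower number means a higher priority.
    Returns the winning (origin, value), or None if nothing qualifies."""
    valid = [(priorities[o], o, v) for o, v in candidates.items()
             if o not in excluded                       # excluded origins never win
             and not (v in ("", None) and o in no_initials)]  # skip initial values
    if not valid:
        return None
    prio, origin, value = min(valid)    # highest priority = lowest number
    return origin, value

def needs_approval(origin, priorities, threshold):
    """Work item if the origin's priority number reaches the threshold,
    e.g. threshold 20: priorities 20+ need approval, up to 19 do not."""
    return priorities[origin] >= threshold

priorities = {"RETAILER_A": 10, "PANEL_B": 20}
candidates = {"RETAILER_A": "", "PANEL_B": "ACME Corp."}
origin, value = pick_attribute_value(candidates, priorities,
                                     no_initials={"RETAILER_A"}, excluded=set())
print(origin, value)                       # A's initial value is skipped
print(needs_approval(origin, priorities, threshold=20))
```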
12.4.5 Defining Restrictions for Harmonized Attribute Values
Use
For some attributes, only specific attribute values are allowed in harmonized objects; for example, manufacturer names are standardized.
To make sure that only these allowed attribute values are used, you perform the following steps:
1. Select the attributes for which you want to restrict the values.
You can also define hierarchies of attributes for which the values depend on each other, for example, manufacturers and brands.
2. Define the allowed values.
3. Define a derivation instruction set for the derivation of harmonized attribute values that contains the restricted attributes.
More Information
Selecting Attributes for Value Restrictions [page 227]
Defining Allowed Attribute Values [page 229]
Defining Derivation Instruction Sets for Harmonized Attribute Values [page 245]
12.4.5.1 Selecting Attributes for Value Restrictions
Use
For each object type, such as Product or Location, you select specific attributes for which you want to restrict the allowed attribute values.
You can define hierarchies for attributes that depend on each other, for example, you define a hierarchy with the attribute Manufacturer with the subordinate attribute Brand.
The standard system landscape contains different systems, for example, a development system in which the Customizing settings and the configuration are made, and a production system in which the company's data is stored and processed.
Changing the configuration in the production system is in most cases prohibited because it can lead to severe data inconsistencies. However, if, for example, you introduce a new attribute for an object type, you may want to include this attribute in harmonization in a timely manner.
To make that possible, you define attribute hierarchies in the development system and load them into the production system using a remote function call (RFC). You can review the impact the changes would have on existing data before activating the new hierarchies.
Note
Attribute values for the following attributes cannot be harmonized:
● Attributes that are defined in Customizing as the external key for the object type
● Attributes that are used to identify the object record, for example, Data Provider, Data Origin, Context
● Administrative data
Note
For more information on the roles, see Roles for SAP Demand Signal Management, version for SAP BW/4HANA [page 530].
Activities
1. In a system in which you are able to change the configuration, open the Web UI Select Attributes for Value Restriction for the object type for which you want to change the settings.
2. In the Work Area, enter all attributes for which you want to restrict the allowed attribute values and define their hierarchies.
You can use the list of available attributes that is displayed on the UI as a repository.
Caution
Attributes that depend on each other hierarchically must have the same settings regarding priority and work item creation for the harmonization of attribute values (see Defining Priorities for Copying Attribute Values into Harmonized Objects [page 224]).
3. In the Comparison assignment block, check the impact that the configuration change can have on existing data.
Note
In the standard system landscape, there is no object data available in the source system that would need to be recalculated.
4. Open the Web UI Set Source System in the production system and select the development system from which the configuration is to be loaded as the source system.
Select the Select Attributes for Value Restriction checkbox and save.
5. Open the Web UI Select Attributes for Value Restriction for the object type for which you want to change the settings.
○ In the assignment block Active Configuration, you see the selection of attributes that is active in the current system.
○ In the assignment block Configuration in Source System, you see the configuration that is retrieved from the source system.
This configuration replaces the current configuration once you activate it.
6. In the Comparison and Impact Description assignment blocks, check the impact on existing data that an activation of the new selection would have.
Whenever the selection of attributes for value harmonization is changed, the related attribute values of harmonized objects are recalculated when you save. Additionally, changes can have the following impact on existing data:
○ Allowed attribute values are deleted together with their hierarchy, with the following exceptions:
○ If you remove only the root node of a hierarchy, all allowed values on the lower levels are kept.
○ If you remove a node in the middle of a hierarchy or the last entry, all allowed values on higher levels are kept; entries on lower levels are deleted.
○ If you insert an element in a hierarchy, entries on higher levels are kept; entries on lower levels are deleted.
Note
If you move an attribute up or down in a hierarchy, the system behaves as if a node was removed and then inserted.
○ If an attribute is added to the selection, the attribute values of all harmonized objects will be cleared before they are recalculated. This includes attribute values that are changed manually. You can use Business Add-In BAdI: Default Values for Restricted Attributes to define the default values for all restricted attributes for which no allowed value can be determined.
7. Make any changes to the configuration in the source system and reload the configuration to the production system afterwards.
8. If you are sure that the impact on the current data does not lead to severe data inconsistencies, choose Prepare Activation.
The system checks whether there are processes running in the system that access master data that needs to be recalculated during activation of the new configuration. If no processes are running, the system locks all the affected object records. No harmonization processes can be carried out from this point in time until the activation is finished.
9. After the system has successfully locked all object records, choose Activate.
The system starts a background job in which the attribute values of the affected harmonized objects are harmonized anew based on the new attribute hierarchies.
All changes are traced in the application log for subobject /DDF/FDH_ADMIN of object /DDF/FDH.
10. After the recalculation is finished, the locks are removed.
Result
You have selected the attributes for which the allowed attribute values are restricted. You can now define allowed attribute values for these attributes.
More Information
Defining Allowed Attribute Values [page 229]
Defining Restrictions for Harmonized Attribute Values [page 226]
Defining Derivation Instruction Sets for Harmonized Attribute Values [page 245]
Defining Derivation Instructions for Harmonized Attribute Values [page 248]
12.4.5.2 Defining Allowed Attribute Values
Use
For all attributes for which attribute values are restricted, you define allowed attribute values. For example, you define the list of all the manufacturer names that are to be used in harmonized objects.
If you do not define allowed values, the default values are used as defined in Business Add-In BAdI: Default Values for Restricted Attributes.
If a hierarchy is defined for an attribute, you can only define values dependent on superordinate attributes. For example, you define brand names dependent on manufacturers.
Note
If an input help is defined for an attribute using the Business Add-In BAdI: Additional Attribute Checks and Input Help in Harmonization, you can only define values that are provided by the BAdI implementation.
If a user wants to change attributes of harmonized objects for which the values are restricted, only the allowed values are available.
To make sure that the allowed values are taken into account during automatic harmonization, you have to define derivation instructions for the derivation of harmonized attribute values for all restricted attributes.
Prerequisites
You have selected attributes for value restriction.
NoteFor more information on the roles, see Roles for SAP Demand Signal Management, version for SAP BW/4HANA [page 530].
Activities
1. Open the Define Allowed Attributes UI for the object type for which you want to define allowed values.
2. Select an attribute in the list of restricted attributes.
3. Enter all allowed values for the selected attribute in the Allowed Attribute Values list.
Note
You cannot delete allowed attribute values that are already in use in a harmonized object or in a derivation instruction set.
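The dependency between hierarchical attributes described above (for example, brand values defined per manufacturer) can be sketched as follows. This is a minimal hypothetical illustration in Python; the real checks live in the Web UI and the harmonization engine, and all names and values here are invented.

```python
# Sketch: allowed Brand values defined dependent on the superordinate
# Manufacturer attribute. A value is allowed only underneath the
# manufacturer it was defined for.

ALLOWED_BRANDS = {
    "ACME": {"AcmeFresh", "AcmeClassic"},
    "GLOBEX": {"GlobexPure"},
}

def is_allowed(manufacturer, brand):
    """Check whether a brand value is allowed for the given manufacturer."""
    return brand in ALLOWED_BRANDS.get(manufacturer, set())

print(is_allowed("ACME", "AcmeFresh"))    # allowed for this manufacturer
print(is_allowed("GLOBEX", "AcmeFresh"))  # not allowed under GLOBEX
```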
12.4.6 Defining Harmonization Groups
Use
You use the harmonization group to collect objects whose attributes are handled in data harmonization in the same way. For each harmonization group, you can create derivation instruction sets for harmonized objects that define how attribute values of harmonized objects are derived from source attribute values.
Note
You can use Business Add-In BAdI: Default Values for Set Up of Harmonization Groups (/DDF/BADI_FDH_ADU_DEFAULT) to define default values that take effect when nothing has been defined under Define Data Delivery Agreements (/DDF/DDAGR).
Prerequisites
You have defined data delivery agreements. To define data delivery agreements, on the SAP Easy Access screen choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Define Data Delivery Agreements, or use transaction Define Data Delivery Agreements (/DDF/DDAGR).
Activities
1. On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization (/DDF/FDH_SETUP).
2. On the Define Harmonization Groups view, define all harmonization groups that are available in the system.
3. On the Select Object Type view, select the object type for which you want to define the settings for harmonization groups.
4. On the Assign Harmonization Groups to Delivery Agreements view, assign harmonization groups to data delivery agreements for the selected object type.
You can use the same harmonization group for multiple data delivery agreements, but you can only assign one data delivery agreement to a harmonization group. When data is uploaded, the system automatically assigns the harmonization group to all source objects that are provided with the specified data delivery agreement. If you add a harmonization group without specifying the data delivery agreement, this harmonization group is used as the default for all source objects for which no specific settings are defined.
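The assignment logic above can be sketched as a simple lookup. This is an illustrative Python sketch, not SAP code; the agreement and group names are hypothetical. An agreement-specific assignment wins; an entry without a data delivery agreement acts as the default:

```python
# Illustrative sketch (not SAP code) of resolving a harmonization group for
# a source object at upload time.
# data delivery agreement -> harmonization group (None = default entry)
ASSIGNMENTS = {
    "DDA_RETAILER_A": "GROUP_RETAIL",
    "DDA_RETAILER_B": "GROUP_RETAIL",  # one group may serve several agreements
    None: "GROUP_DEFAULT",
}

def resolve_harmonization_group(data_delivery_agreement):
    """Return the agreement-specific group, or the default group if no
    specific assignment exists for the agreement."""
    specific = ASSIGNMENTS.get(data_delivery_agreement)
    if specific is not None:
        return specific
    return ASSIGNMENTS.get(None)

print(resolve_harmonization_group("DDA_RETAILER_A"))  # GROUP_RETAIL
print(resolve_harmonization_group("DDA_UNKNOWN"))     # GROUP_DEFAULT
```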
12.4.7 Maintain Name Value Pair Filters
The Name/Value Pair (NVP) filter allows you to manage large volumes of name/value pairs in the Derive Harmonized Product Attributes user interface.
Name/value pairs are created during data loads or by configuration in data harmonization, and they are used as source attributes for the derivation of attribute values for harmonized objects or source objects. The NVP filter allows you to filter the name/value pairs based on user authorization by attribute group.
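The filtering idea can be sketched as follows. This is an illustrative Python sketch, not SAP code; the attribute group and attribute names are hypothetical, and the real system checks authorization object DH_NVPFLT instead of a plain set of group names:

```python
# Illustrative sketch (not SAP code) of the NVP filter: name/value pairs
# are visible only if their attribute belongs to an attribute group the
# user is authorized for.
ATTRIBUTE_GROUPS = {
    "NUTRITION": {"CALORIES", "SUGAR_G"},
    "LOGISTICS": {"PALLET_QTY"},
}

def filter_nvps(nvps, authorized_groups):
    """Keep only name/value pairs whose attribute is in an authorized group."""
    visible = set()
    for group in authorized_groups:
        visible |= ATTRIBUTE_GROUPS.get(group, set())
    return {name: value for name, value in nvps.items() if name in visible}

nvps = {"CALORIES": "250", "SUGAR_G": "12", "PALLET_QTY": "48"}
print(filter_nvps(nvps, {"NUTRITION"}))  # {'CALORIES': '250', 'SUGAR_G': '12'}
```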
Prerequisites
You have defined data delivery agreements. To define data delivery agreements, on the SAP Easy Access screen choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Define Data Delivery Agreements, or use transaction Define Data Delivery Agreements (/DDF/DDAGR).
Activities
● On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Data Harmonization > Set Up Data Harmonization, or use transaction Set Up Data Harmonization (/DDF/FDH_SETUP).
● On the Assign Harmonization Group for NVP Filter view, assign harmonization groups for the NVP filter for the selected object type.
● Create a new attribute group. On the Assign Attributes to Attribute Group view, assign attributes to the group. Assign this attribute group to the harmonization group.
Note
To assign attribute groups to users, you need authorization for the authorization object DH_NVPFLT.
Related Information
Name/Value Pairs [page 215]
12.4.8 Reports for Administrating Data Harmonization
Use
The following reports are available for data harmonization.
● Check Customizing for Data Harmonization (report /DDF/FDH_CUSTOMIZING_CHECK, transaction /DDF/FDH_CHECK_CUST): Check whether the Customizing settings for data harmonization are correct.
● Compare Customizing of Remote System with Local System (report /DDF/FDH_COMPARE_IMG, transaction /DDF/FDH_COMPARE_IMG): Compare the Customizing settings in the current client with the settings in a system that is connected using a remote function call (RFC).
● Compare Customizing of Current Client with Client 000 (report /DDF/FDH_COMPARE_IMG_CLNT000, transaction /DDF/FDH_COMP_CUST): Compare the Customizing settings in the current client with the settings in client 000.
● Transfer Name/Value Pairs to Attributes (report /DDF/FDH_COPY_NVP_TO_ATTR, transaction /DDF/FDH_COPY_NVP): Transfer the name/value pairs into static attributes of the structure used in data harmonization.
● Remove Processed Objects from Change Log (report /DDF/FDH_DEL_CHGLOG, transaction /DDF/FDH_DEL_CHGLOG): Remove processed entries from the change log. The change log contains entries for all objects for which the Customizing settings allow the creation of change pointers.
● Remove Entries for Selected Objects from Change Log (report /DDF/FDH_REORG_CHGLOG, transaction /DDF/FDH_REORG_CL): Set selected entries in the change log to Processed.
● Remove Objects Marked for Deletion (report /DDF/FDH_DEL_OBJ, transaction /DDF/FDH_DEL_OBJ): Remove all objects from the database that are marked for deletion.
● Remove Obsolete Harmonized Records (report /DDF/FDH_DEL_OBSOLETE_HARM_REC, transaction /DDF/FDH_DEL_OBS_HRM): Remove obsolete harmonized objects from the database that do not have a source object assigned and have not been changed during the specified number of days in the past.
● Remove Objects by Data Delivery ID (report /DDF/FDH_DEL_REC_BY_DELID, transaction /DDF/FDH_DEL_BYDELID): Delete all source objects that have been created during the upload for a specific data delivery ID. Deletion is performed based on records in the change log and can only be performed before change pointers have been removed.
● Remove Selected Objects (report /DDF/FDH_DEL_RECORDS, transaction /DDF/FDH_DEL_REC): Delete selected objects of object types used in data harmonization.
● Remove Assignment of Source Objects (report /DDF/FDH_UNASSIGN_RECORDS, transaction /DDF/FDH_UNASSIGN): Delete the assignment of source objects to harmonized objects.
● Recalculate Priorities of Data Origins for Attribute Values (report /DDF/FDH_RECALC_ATTRIB_VALUES, transaction /DDF/FDH_RECALC_ATTV): Recalculate the priorities of attribute values after the priorities of data origins have been changed.
● Select Objects for Postprocessing (report /DDF/FDH_SELECT_FOR_PPE, transaction /DDF/FDH_SEL4PPE): Select objects for harmonization postprocessing; during harmonization postprocessing, the selected objects are harmonized.
● Start Postprocessing (report /DDF/FDH_START_PPE, transaction /DDF/FDH_START_PPE): Start the harmonization postprocessing that calculates harmonized objects.
● Start Resolution of Work Items (report /DDF/FDH_START_WI_RESOLVE, transaction /DDF/FDH_START_WIR): Start the resolution of work items that were created during data harmonization.
● Cancel Data Harmonization Process Instances (report /DDF/FDH_CANCEL_PROC_INS, transaction /DDF/FDH_CANCEL_PROC): Cancel process instances that have been created by data harmonization. This is only relevant if direct update is not activated.
● Find Source Attribute Values Assigned to Multiple Harmonized Objects (report /DDF/FDH_FIND_SRC_ATTV_MULT_HR, transaction /DDF/FDH_FIND_SRCATV): Get an overview of the assignment of source objects to multiple harmonized objects, based on the attribute values of a specified attribute and a selected object type. You can use this overview, for example, to evaluate whether it makes sense to perform the harmonization based on the specified attribute. For source objects assigned to multiple harmonized objects, the report shows the object IDs of these source objects with the ID of the assigned harmonized object and the attribute value.
● Create a New Data Delivery for Data Harmonization (report /DDF/FDH_CREATE_NEW_DELIVERY, transaction /DDF/FDH_NEW_DELIVER): Create new data deliveries that are used to integrate the data changed by data harmonization into the standard upload process. This is only relevant if direct update is not activated.
● Release Data Deliveries for Harmonization (report /DDF/FDH_RELEASE_DELIVERIES, transaction /DDF/FDH_REL_DELIVER): Release data deliveries that are used to integrate the data changed by data harmonization into the standard upload process. This is only relevant if direct update is not activated.
● Add Selected Objects to Change Log (report /DDF/FDH_SELECT_FOR_CHGLOG, transaction /DDF/FDH_SEL4CHGLOG): Add selected objects to the change log for further processing.
● Revoke Manual Changes of Harmonized Object Attribute Values (report /DDF/FDH_REVOKE_MANUAL_CHANGES, transaction /DDF/FDH_REVOKE_MCHG): Revoke manual changes of attribute values for selected harmonized objects.
● Revoke Manual Changes of Source Object Attribute Values (report /DDF/FDH_REVOKE_MANUAL_CHG_SRC, transaction /DDF/FDH_REVOKE_SRC): Revoke manual changes of attribute values for selected source objects.
● Delete Obsolete Action Logs (report /DDF/FDH_DEL_OBSOLETE_ACT_LOG, transaction /DDF/FDH_REORG_ALOG): Delete action logs for harmonization object types for which the action log function was disabled in the object type definition.
● Create Records for DOASO for Existing Objects (report /DDF/FDH_CREATE_SAO, transaction /DDF/FDH_CREATE_SAO): Create records of the data origin of attributes for already existing source objects.
● Delete Obsolete Records for DOASO (report /DDF/FDH_DEL_OBSOLETE_SAO, transaction /DDF/FDH_REORG_SAO): Remove obsolete records of the data origin of attributes for source objects.
More Information
For more information, see the documentation of the reports in the back-end system.
Scheduling Background Jobs for Data Harmonization [page 235]
12.4.9 Scheduling Background Jobs for Data Harmonization
You schedule the following background jobs on a regular basis:
● Pass on changes using the standard data upload process if direct update is not activated. If direct update is activated, changes to the standard SAP Business Warehouse InfoObjects are passed on automatically.
To pass on data that is manually changed in data harmonization using the standard data upload process, the system uses data delivery IDs. Whenever data harmonization is started, the latest data delivery of the data delivery agreement with agreement type Harmonization Data is locked. As soon as harmonization is finished, the lock on the data delivery is removed. The data delivery can then be released, the status of the related process instance can be set to Ready, and the Process Scheduler job triggers the execution of the next process steps.
You schedule jobs that create and release data deliveries on a regular basis.
For more information, see Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization > Set Up Data Upload.
● Resolve harmonization work items
The system creates work items whenever objects could not be mapped or changed during harmonization because one of the involved harmonized or source object records was locked. In most cases, these work items can be resolved by the system without user interaction as soon as the locks are removed.
You schedule a job that tries to resolve these work items on a regular basis.
For more information, see Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization > Set Up Resolution of Work Items.
SAP Demand Signal Management, version for SAP BW/4HANAConfiguration and Administration C O N F I D E N T I A L 495
In transaction Set Up Data Harmonization (/DDF/FDH_SETUP), you define for each object type how many sessions can be active in parallel for this job and how many records can be processed together.
● Harmonization postprocessing
You schedule a job that starts harmonization.
For more information, see Customizing under Cross-Application Components > Demand Data Foundation > Data Harmonization > Set Up Postprocessing.
In transaction Set Up Data Harmonization (/DDF/FDH_SETUP), you define for each object type how many sessions can be active in parallel for this job and how many records can be processed together.
Note
Evaluate which settings for the jobs are ideal in your business context.
If direct update is not activated and you want to send the results of data harmonization to SAP Business Warehouse (BW) as fast as possible, schedule the jobs in short intervals, for example, every 5 minutes. However, this increases the data load on BW significantly, because the system creates a separate process instance for each data delivery. In other words, minimizing the time needed to pass on harmonized data slows down system performance by increasing the data load on BW.
There are other reports available that can be scheduled on a regular basis, for example, to clean up the database. For more information, see Reports for Administrating Data Harmonization [page 232].
12.4.10 Activate Conversion Routines for MATERIAL, EANUPC, GLN and DATA_LOAD_ID
You use report /DDF/FDH_TRANSFER_CONV_EXIT to activate conversion routines that add leading zeros to the data stored in data harmonization for the following standard fields:
Object Type | Attribute Name | Domain | Conversion Routine
Product | MATERIAL | /DDF/FDH_MATERIAL | MATLF
Product | EANUPC | /DDF/FDH_GTIN | GTINF
Location | GLN | /DDF/FDH_GLN | GLNF1
All Object Types | DATA_LOAD_ID | /DDF/FDH_DATA_LOAD_ID | DALOI
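The effect of such an input conversion can be sketched as follows. This is an illustrative Python sketch, not SAP code; the field lengths are examples, not the actual SAP domain lengths:

```python
# Illustrative sketch (not SAP code) of an "alpha"-style input conversion:
# purely numeric keys are padded with leading zeros to the full field
# length, as the conversion routines above do for MATERIAL, EANUPC, GLN,
# and DATA_LOAD_ID.
def add_leading_zeros(value, length):
    """Pad numeric values to the full field length; leave others unchanged."""
    value = value.strip()
    if value.isdigit():
        return value.zfill(length)
    return value

print(add_leading_zeros("4711", 18))      # 18 characters, zero-padded
print(add_leading_zeros("MAT-4711", 18))  # not purely numeric, unchanged
```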
12.4.11 Activate the Action Log
Use
You can track changes of assignments from source to harmonized records as well as changes of the global reporting status.
Activities
You can activate the action log for object types which are relevant for global reporting or which can be harmonized. When you activate the action log, the system tracks the following actions:
● Assignment of source records to harmonized records
● Unassignment of source records
● Changes of the global reporting status
Tracked actions are displayed on the Action Log tab on the detail screens for the objects of an object type for which the action log is active.
Note
Actions performed before the action log was activated, such as assignments of source records, are not reflected in the log. If you switch off the action log for a data harmonization object type, you can execute the report Delete Obsolete Action Logs (transaction /DDF/FDH_REORG_ALOG) to delete the obsolete action log entries for this object type.
To activate the action log for object types which are relevant for global reporting or which can be harmonized:
1. In Customizing for Cross-Application Components under Demand Data Foundation > Data Harmonization > Technical Settings > Define Object Types, in the Dialog Structure, select the node Define Object Types.
2. To activate the action log for products:
   1. In the Obj.Type column, select the line with the value FDH_PROD.
   2. In the Act. Log column, select the checkbox.
3. To activate the action log for locations:
   1. In the Obj.Type column, select the line with the value FDH_LOC.
   2. In the Act. Log column, select the checkbox.
4. Choose Save.
More Information
Activate the Action Log and DOASO for User-Created Object Types [page 240]
12.4.12 Activate Data Origin per Attribute for Source Objects
Use
You can store and view the data origin per attribute for source objects (DOASO) in data harmonization.
You can activate DOASO for each object type in Customizing for Cross-Application Components under Demand Data Foundation > Data Harmonization > Technical Settings > Define Object Types. Select the option DOASO Act for the relevant object type.
For example, for products and locations you perform the following steps:
1. In the Dialog Structure, select the node Define Object Types.
2. To activate DOASO for products:
   1. In the Obj.Type column, select the line with the value FDH_PROD.
   2. In the DOASO Act column, select the checkbox.
3. To activate DOASO for locations:
   1. In the Obj.Type column, select the line with the value FDH_LOC.
   2. In the DOASO Act column, select the checkbox.
4. Choose Save.
If you activate DOASO for a data harmonization object type, the following features are available:
● During the import of source records into data harmonization, the imported attribute values are saved in the database.
● On the Manage Products and Manage Locations details UIs, the Data Origin per Attribute tab provides information about each attribute of the source object (name/value pairs are not included), its current value, the last imported value, and its value source. The value source can be:
○ None – No attribute value has been imported or derived
○ Imported – Attribute value has been imported
○ Enriched – Attribute value is taken from subobjects or has been added by object-specific logic, for example, the attribute Pending Update
○ Derived – Attribute value has been derived with an Attribute Value Derivation Set for source objects
○ Determined by System – Attribute value has been set by the system, for example, for technical fields
○ Status – Attribute value is determined by the rules of the status of the object
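The per-attribute information exposed on that tab can be sketched as a small record type. This is an illustrative Python sketch, not SAP code; the class and attribute names are hypothetical:

```python
# Illustrative sketch (not SAP code) of the data shown per attribute on the
# Data Origin per Attribute tab: current value, last imported value, and a
# value source such as Imported, Enriched, or Derived.
from dataclasses import dataclass

VALUE_SOURCES = {"None", "Imported", "Enriched", "Derived",
                 "Determined by System", "Status"}

@dataclass
class AttributeOrigin:
    attribute: str
    current_value: str
    last_imported_value: str
    value_source: str

    def __post_init__(self):
        # Reject value sources outside the documented set.
        if self.value_source not in VALUE_SOURCES:
            raise ValueError(f"unknown value source: {self.value_source}")

rec = AttributeOrigin("BRAND", "Acme Pro", "ACME PRO", "Derived")
print(rec.value_source)  # Derived
```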
After you have activated DOASO for an object type, you should execute report Create Records for DOASO for Existing Objects (transaction /DDF/FDH_CREATE_SAO) to create DOASO records for the object type.
Note
You should execute the report in all systems and clients in which the DOASO is activated and in which harmonization data exists.
If you have deactivated DOASO for a data harmonization object type, you can execute report Delete Obsolete Records for DOASO (transaction /DDF/FDH_REORG_SAO) to delete the obsolete source attribute origin records.
Using DOASO in User-Created Application Configurations
If you want to use DOASO in the UIs for Manage Objects that you have created for your own object types, you must set the parameter FDH_RECORD_TYPE in the application configurations you have created yourself.
More Information
Activate the Action Log and DOASO for User-Created Object Types [page 240]
12.4.13 Activate Distinct ABBIDs in Data Harmonization UIs
Use
You can use distinct Application Building Block IDs (ABBIDs) per object type in Data Harmonization UIs. This reduces the size of the metadata provider during runtime and improves performance. However, you must activate the functionality before it can be used.
When an application is called for a given object type, the metadata provider will be set up for all object types sharing the same ABBID. If no ABBID has been specified for this object type, the metadata provider will be set up for all object types maintained in Customizing and the default ABBID DDF_FDH is used. SAP delivers the ABBIDs DDF_FDH_P for object type FDH_PROD (products) and DDF_FDH_L for object type FDH_LOC (locations).
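The lookup described above can be sketched as follows. This is an illustrative Python sketch, not SAP code; the custom object type FDH_CUSTOM is hypothetical:

```python
# Illustrative sketch (not SAP code) of the ABBID lookup: a distinct ABBID
# limits the metadata provider to the object types sharing that ABBID;
# without one, the default DDF_FDH covers all maintained object types.
ABBIDS = {
    "FDH_PROD": "DDF_FDH_P",
    "FDH_LOC": "DDF_FDH_L",
    "FDH_CUSTOM": None,  # maintained, but no distinct ABBID specified
}
DEFAULT_ABBID = "DDF_FDH"

def metadata_provider_scope(object_type):
    """Return the ABBID used and the object types set up with it."""
    abbid = ABBIDS.get(object_type) or DEFAULT_ABBID
    if abbid == DEFAULT_ABBID:
        # No distinct ABBID: the provider covers every maintained object type.
        return abbid, sorted(ABBIDS)
    return abbid, sorted(t for t, a in ABBIDS.items() if a == abbid)

print(metadata_provider_scope("FDH_PROD"))    # ('DDF_FDH_P', ['FDH_PROD'])
print(metadata_provider_scope("FDH_CUSTOM"))  # falls back to DDF_FDH
```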
You can also use your own ABBIDs for these object types or for your own object types. These ABBIDs must be created in SPI in Customizing for Cross-Application Components under Processes and Tools for Enterprise Applications > Settings for BO Framework and Navigation > BO Framework > Define Application Building Blocks.
Activities
To activate distinct Application Building Block IDs (ABBIDs) per object type in Data Harmonization UIs:
● Specify an ABBID in Customizing for Cross-Application Components under Demand Data Foundation > Technical Settings > Define Object Types. You can specify the same ABBID for one or several object types.
12.4.14 Activate the Action Log and DOASO for User-Created Object Types
Use
You can enable the Action Log tab and the Data Origin per Attribute for Source Objects tab in the object UIs that you have created for your own object types. To do this, you must make sure that on the corresponding FPM Floorplan Configuration, the component FPM_TABBED_UIBB has been used to display the tabs that are to be used on the UI.
Activities
To enable the Action Log tab and the Data Origin per Attribute for Source Objects tab in the object UIs that you have created for your own object types:
Step 1
1. Run transaction SE80.
2. Choose package /DDF/UI_FDH_OBJ.
3. Choose WebDynpro > FPM Applications > /DDF/WDA_FDH_OBJ_OVP > FPM Application Configurations.
4. Double-click an application configuration that has been created for your own object type.
5. Open the FPM Floorplan Configuration that has been entered in the application configuration.
6. Open Overview Page Schema. There should be one section where the component FPM_FORM_UIBB_GL2 has been added to display the object attributes.
7. If the component FPM_TABBED_UIBB has not already been used in the second section, replace the UIBBs added there with the component FPM_TABBED_UIBB, which will be used to display the UIBBs currently available in this second section.

Note
A similar FPM_TABBED_UIBB should already be used for the Search and mass change: Harmonized Object UI or the Search: Source Object UI for your object type.
The simplest way to create a suitable FPM_TABBED_UIBB for the object UI is to copy one as described in step 2.
Step 2
1. Run transaction SE80.
2. Choose either package /DDF/UI_FDH_MCH or the package in which your FPM_TABBED_UIBB has been created.
3. Choose WebDynpro > FPM Layout > Component Configurations.
4. Double-click the component configuration that is being used for your own object type.
5. Choose Start Configurator.
6. Choose Copy Configuration.
7. In the popup, specify the new configuration ID and description.
8. Open Tabbed UIBB Schema.
9. Remove the Master UIBB.
10. Remove the tab used for the attributes as it is not needed on the object UI.
11. Check that the UIBBs displayed in the second section of the FPM Floorplan Configuration specified in step 1 have all been added to this new configuration.
12. In General Settings, deselect the option Master/Detail.
13. Choose GUIBB Settings and in the drop-down list, select Application Controller Settings.
14. Ensure that for the Web Dynpro Component / Class, the class /DDF/CL_FDH_TABBED_APPCC has been entered.
15. Choose OK.
16. Choose Save.
Step 3
1. Run transaction SE80.
2. Open the FPM Floorplan Configuration that was entered in the application configuration in step 1.
3. Open Overview Page Schema.
4. Replace the UIBBs added to the second section with the FPM_TABBED_UIBB created in step 2.
5. Choose Save.
12.4.15 Maintaining General Harmonization Parameter
With this Customizing activity, you maintain the optional Harmonization parameters.
To maintain the settings, proceed as follows:
During initial data loads, if the data volume is very high, you can set the following parameter to X to stop BW direct update on an ad hoc basis. Once the harmonization process is complete, you can manually run the BW direct update program to synchronize the records:
SAP_DSIM_HARM_DISABLE_BW_DUP
If the ratio of source records to harmonized records is high (more than 30,000:1), loading all relevant source records into memory to perform attribute value derivation can result in memory leakage and locking issues. In such cases, you can maintain an optimal packet size in the following parameter:
SAP_DSIM_HARM_PACK_SIZE
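The effect of such a packet size can be sketched as follows. This is an illustrative Python sketch, not SAP code; it only shows why bounding the packet size keeps memory usage flat regardless of the number of source records:

```python
# Illustrative sketch (not SAP code): instead of loading all source records
# of a harmonized record into memory at once for attribute value derivation,
# records are processed in packets of a bounded size (the role played by
# SAP_DSIM_HARM_PACK_SIZE above).
def process_in_packets(record_ids, packet_size):
    """Yield successive packets of at most packet_size record IDs."""
    for start in range(0, len(record_ids), packet_size):
        yield record_ids[start:start + packet_size]

records = list(range(7))
print([len(p) for p in process_in_packets(records, 3)])  # [3, 3, 1]
```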
For more information, see the Customizing documentation for Cross-Application Components under Demand Data Foundation > Data Harmonization > Maintain General Harmonization Parameter.
12.5 Configuring and Administrating Data Enrichment
As an administrator for data enrichment you have the following tasks:
● Set up data enrichment in Customizing. For more information, see the Customizing documentation for Data Enrichment under Cross-Application Components > Demand Data Foundation > Data Enrichment.
● Configure data enrichment in the SAP Easy Access Menu.
These configuration steps are required if you use the pre-cleansing function of sales data enrichment for POS data.
Task: Set up data enrichment in Customizing
When: During the implementation phase
Where: Customizing under Cross-Application Components > Demand Data Foundation > Data Enrichment

Task: Define period groups
When: Before you upload locations of a new retailer
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Enrichment > Periods of Sales Deviations > Define Period Groups

Task: Define periods with sales deviations
When: Before you upload locations of a new retailer
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Enrichment > Periods of Sales Deviations > Define Period with Sales Deviations

Task: Define default assignment of periods with sales deviation
When: Before you upload locations of a new retailer
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Enrichment > Periods of Sales Deviations > Define Default Assignment of Periods of Sales Deviation

Task: Assign locations to period groups
When: On demand, to check the assignments for a data origin and to create or change assignments if necessary
Where: On the SAP Easy Access screen, choose Cross-Application Components > Demand Signal Management > Data Enrichment > Periods of Sales Deviations > Assign Locations to Period Groups
12.5.1 Defining Periods with Sales Deviations
Use
Special events throughout the year - like public holidays, festivals, or sports events - can influence consumer sales data. However, not only predictable, planned events like these have an impact on consumer behavior. Unforeseeable incidents like unexpected periods of extreme weather conditions can also lead to periods with sales deviations. These periods with sales deviations vary for different companies, regions, and countries and therefore sometimes only affect the sales data of certain locations.
To mark days or periods with known deviations in your sales data, you can define periods with sales deviations and assign these periods to locations with or without reference to a data origin, country, or region. Using those periods helps the consuming applications to produce more accurate results in sales data analysis.
The periods with sales deviations are grouped in a period group. A period group can be assigned to data origins, countries, and regions. The system automatically assigns locations to period groups during data upload. Periods with sales deviations and the related assignments are treated like master data.
Example
During data enrichment of aggregated sales data from a retailer you would like to run an outlier analysis to mark unexpected high or low sales figures to exclude this data from certain calculations like the determination of average sales. However, based on previous business experience, you expect a higher sales volume for certain periods, for example a number of days before a public holiday. The data records from this period reflect the real sales volume and should not be excluded from analysis due to an outlier detection algorithm.
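The example can be sketched as a simple outlier check. This is an illustrative Python sketch, not the actual enrichment algorithm; the dates, sales figures, and threshold factor are all hypothetical. Days inside a known period with sales deviations are skipped so that an expected pre-holiday peak is not flagged:

```python
# Illustrative sketch (not SAP code) of excluding known deviation periods
# from a naive outlier check on daily sales.
from datetime import date, timedelta

def find_outliers(daily_sales, deviation_days, factor=2.0):
    """Flag days whose sales exceed factor * average of normal days,
    skipping days inside known periods with sales deviations."""
    normal = {d: s for d, s in daily_sales.items() if d not in deviation_days}
    avg = sum(normal.values()) / len(normal)
    return [d for d, s in normal.items() if s > factor * avg]

# Three days before a public holiday are a known deviation period.
holiday_peak = {date(2018, 12, 22) + timedelta(days=i) for i in range(3)}
sales = {date(2018, 12, 20) + timedelta(days=i): 100 for i in range(7)}
sales[date(2018, 12, 23)] = 500  # expected peak inside the deviation period
sales[date(2018, 12, 26)] = 450  # unexpected spike -> flagged as outlier

print(find_outliers(sales, holiday_peak))  # [datetime.date(2018, 12, 26)]
```

Without the deviation period, the expected peak on December 23 would be flagged as well, which is exactly what the assignment of periods with sales deviations prevents.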
Prerequisites
For all activities related to periods with sales deviations you need authorization for the authorization object DDF_ADMIN (Administration Demand Data Foundation).
Activities
Setting up Periods with Sales Deviations
1. In the view Define Period Groups, define a period group with one or several periods with sales deviations.
2. In transaction Define Periods with Sales Deviations, verify the generated days of the period group and regenerate or add periods, if necessary.
Assigning Locations to Period Groups
1. In the view Define Default Assignments of Periods of Sales Deviations, assign your period group.
2. Load locations into the system, for example into InfoObject /DDF/LOCATION.
3. In the program Assign Locations to Period Groups, verify the generated assignments.
More Information
Periods with Sales Deviations [page 267]
12.6 Configuring and Administrating Global Market Share Analysis
As a data provider for Global Market Share Analysis, you have the following tasks:
● Define Market Groups [page 72]
● Include Images in Global Market Share Analysis User Interfaces [page 73]
● Define Path for Image Files [page 74] or Mass Definition of Image Paths [page 75]
12.6.1 Define Market Groups
Use
You use this transaction to define for each global brand the combinations of countries and global categories that you would like to report on.
These combinations are then stored as market groups in the InfoObject /DDF/MRMGRP. The assigned countries are stored in the InfoObject /DDF/MRMGRPM and the assigned global categories are stored in /DDF/MRMGRPC.
All defined market groups are available for selection when you start the application Global Market Share Analysis. The transaction Define Market Groups is available in the SAP Easy Access Menu under Cross-Application Components > Demand Signal Management > Analytics and Reporting > Global Market Share Analysis.
You can define several market groups for one global brand to reflect different responsibilities for global categories and countries. To determine the corresponding countries and global categories, you can either use a publishing group or explicitly assign individual countries and global categories to a market group. You can also combine both options.
You can change, display, or delete market groups. To add or remove a country or a global category, you just have to list all countries and global categories that you would like to include in the selection options.
Note
The number of combinations of countries and global categories is limited due to restrictions on the length of a URL. You can include up to 8 global categories and 25 countries. If you reduce the number of global categories, you can extend the list of countries.
Note
If you change a publishing group, for example, by replacing one country with another one, you must adjust the market group that corresponds to the publishing group.
More Information
Manage Publishing Groups [page 297]
12.6.2 Include Images in Global Market Share Analysis User Interfaces
Use
For the following object types you can include images in the user interfaces of Global Market Share Analysis:
● Country
● Global Brand
● Global Category
Note
Only the MIME types image/png and image/jpeg are supported; thus you can use only JPG and PNG files.
Process
1. You upload the corresponding image files into the MIME repository of your system.
2. You define the path for each image file to the corresponding object key of the InfoObject using the transactions Define Path for Image Files (/DDF/MIME) or Mass Definition of Image Paths (/DDF/MIMEDEF).
Note
Disclaimer: Be aware of copyright. You are responsible for the images you use.
More Information
Define Path for Image Files [page 74]
Mass Definition of Image Paths [page 75]
12.6.2.1 Define Path for Image Files
Use
You use the transaction Define Path for Image Files (/DDF/MIME) to assign image files stored in the MIME repository to the corresponding object keys to make the images visible in the application Global Market Share Analysis. The following object types are supported:
● Country
● Global brand
● Global category
For each object type you must specify the InfoObject that provides the key values that you would like to link to an image path. In the SAP standard the following InfoObjects are used:
Object Type InfoObject
Country 0COUNTRY
Global Brand /DDF/BRAGM_TXT
Global Category /DDF/PCAGM_TXT
Select one object type and define, for the relevant object keys, the path to the image file.
Note
You can define the paths before you import the image files into the MIME repository. If a file does not exist yet, the system displays a warning, but you can save the path definition anyway.
Alternatively, you can use transaction /DDF/MIMEDEF for a convenient mass definition of image paths.
More Information
Mass Definition of Image Paths [page 75]
12.6.2.2 Mass Definition of Image Paths
Use
You can use transaction /DDF/MIMEDEF to define the image paths for multiple object keys of a given object type at once.
You have the option to scan a specified folder of the MIME repository for existing image files. The system searches for images with file names that are equal to the selected object keys. If a file name (excluding the file extension) matches an object key, the corresponding assignment of the object key to the path is saved.

Alternatively, you first create the assignments of object keys to image paths. The system then creates each path as a concatenation of the specified folder path, the object key, and the file extension that corresponds to the chosen image type. Afterwards, you can upload the image files with the corresponding file names into the MIME repository.
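The two assignment modes described above can be sketched as follows. This is an illustrative Python model, not the actual ABAP implementation; the function names, folder layout, and file names are assumptions made for the example:

```python
from pathlib import PurePosixPath

def scan_folder(object_keys, repository_files):
    """Mode 1: scan a MIME repository folder and match file names
    (without extension) against the selected object keys."""
    assignments = {}
    for path in repository_files:
        stem = PurePosixPath(path).stem  # file name without extension
        if stem in object_keys:
            assignments[stem] = path
    return assignments

def build_paths(object_keys, folder, extension):
    """Mode 2: create the assignments first; each path is the concatenation
    of the folder, the object key, and the chosen image type's extension."""
    return {key: f"{folder}/{key}.{extension}" for key in object_keys}

files = ["/SAP/PUBLIC/images/DE.png", "/SAP/PUBLIC/images/readme.txt"]
print(scan_folder({"DE", "FR"}, files))   # only DE.png matches an object key
print(build_paths({"DE", "FR"}, "/SAP/PUBLIC/images", "png"))
```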
More Information
Define Path for Image Files [page 74]
12.7 Configuring Analytics and Reporting
The following sections describe the settings you have to make to configure the Web Dynpro reports and the analytics dashboards created with the UI development toolkit for HTML5.
12.7.1 Configuring the Web Dynpro Reports
Use
You have to make the following configuration settings so that the Web Dynpro reports are displayed correctly.
Procedure
1. Set up the link to the Internet Graphics Service (IGS) so that graphics in the reports are rendered properly. Proceed as follows:
   1. Go to the RFC Destinations (Display/Maintain) (SM59) transaction.
   2. Choose TCP/IP Connections > IGS_RFC_DEST.
   3. Double-click IGS_RFC_DEST and choose Connection Test.
   4. If the server is not responding, contact your system administrator and request a connection to the IGS.
2. Activate business configuration sets (BC sets) by doing the following:
   1. Go to the Activate BC Sets (SCPR20) transaction.
   2. Activate the following BC sets:
      ○ /DSR/BCV_VC_SIN_DS
      ○ /DSR/BCV_VC_QUERY
      ○ /DSR/BCV_VC_UPRVW
      ○ /DSR/BCV_V_MEANT
      ○ /DSR/BCV_V_CLF
      ○ /DSR/BCV_VC_UQVIEW
3. Activate Web Services by doing the following:
   1. Go to the HTTP Service Hierarchy Maintenance (SICF) transaction.
   2. Choose Execute.
   3. Choose default_host > sap > bc > webdynpro > ddf.
   4. Make sure that all UIs are activated. If a UI is grayed out, right-click it, and choose Activate Service.
   5. Choose default_host > sap > bc > webdynpro > dsr and make sure that all UIs are activated.
   6. Make sure that the following services are also activated:
      ○ default_host > sap > public > bc
      ○ default_host > sap > public > bc > icons
      ○ default_host > sap > public > bc > icf
      ○ default_host > sap > public > bc > icons_rtl
      ○ default_host > sap > public > bc > pictograms
      ○ default_host > sap > public > bc > webdynpro
      ○ default_host > sap > public > bc > webdynpro > adobeChallenge
      ○ default_host > sap > public > bc > webdynpro > mimes
      ○ default_host > sap > public > bc > webdynpro > ssr
      ○ default_host > sap > public > bc > webdynpro > ViewDesigner
12.7.1.1 Configuring the Unit of Measure Conversion for the Promotion Analysis Report
1. To convert the SAP Trade Promotion Management (TPM) key figures that are used in the Promotion Analysis report of SAP Demand Signal Management, create a custom SAP NetWeaver Business Warehouse (SAP NetWeaver BW) conversion type based on the Product (0CRM_PROD) InfoObject. Proceed as follows:
   1. Go to the Modeling - DW Workbench (RSA1) transaction.
   2. Find the 0CRM_PROD InfoObject and open it in change mode.
   3. On the Business Explorer tab page, enter 0BASE_UOM as the base unit of measure.
   4. Create a DataStore object by choosing Create next to the Units of Measure for Char. field.
   5. Enter a description and a technical name (in the customer namespace) for the DataStore object.
   6. Save your changes.
The DataStore object is then activated. To perform unit of measure conversions, the DataStore object must be filled with data. To upload data into the DataStore object, you must create a data transfer process and a transformation. For more information, search for the keywords Prerequisites for InfoObject-Specific Quantity Conversion in the documentation for SAP NetWeaver on SAP Help Portal at http://help.sap.com.
2. Create a quantity conversion type based on the Product (0CRM_PROD) InfoObject. It can be applied to TPM key figures in the BEx Query Designer to perform unit of measure conversions. Proceed as follows:
   1. Go to the UOM: BW (RSUOM) transaction.
   2. Enter a technical name and choose Create.
   3. Enter a long and a short description.
   4. On the Conversion Factors tab page, enter 0CRM_PROD as the reference InfoObject.
   5. On the Unit of Meas. tab page, under Target Unit of Measure, select the Fixed Unit of Measure radio button, and enter your target unit of measure.
Note
In this case, a fixed unit of measure is used. However, you can also specify the unit of measure in a query variable. To do this, select the Target Quantity from Variable radio button instead.
   6. Save your changes.
3. Modify the queries for the Promotion Analysis report to apply the unit of measure conversion to a target unit of measure. Proceed as follows:
   1. In the BEx Query Designer, open, for example, the Baseline and Planned Sales Quantity (/DSR/CP09_PRM_Q0001) query.
   2. On the Rows/Columns tab page, in the Columns section, select the quantity key figures that you want to convert to the target unit of measure, in this case, Baseline Sales Quantity (0BASE_QTY) and Planned Sales Quantity (0SCAN_QTY).
   3. On the Conversions tab page, enter the conversion type that you created above.
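Conceptually, an InfoObject-specific conversion applies a product-dependent factor to each key-figure value. The following Python sketch illustrates the idea only; the product names, units, and factors are invented for the example and do not reflect actual DataStore content:

```python
# Conceptual sketch of InfoObject-specific quantity conversion.
# The product-specific factors mimic the data that the generated
# DataStore object would hold; names and values are illustrative only.
conversion_factors = {
    # (product, source_uom, target_uom): factor
    ("PROD_A", "PC", "CSE"): 1 / 12,   # 12 pieces per case
    ("PROD_B", "PC", "CSE"): 1 / 24,   # 24 pieces per case
}

def convert_quantity(product, quantity, source_uom, target_uom):
    """Convert a key-figure value to the target unit of measure using the
    product-specific factor, as a BEx conversion type would."""
    if source_uom == target_uom:
        return quantity
    factor = conversion_factors[(product, source_uom, target_uom)]
    return quantity * factor

print(convert_quantity("PROD_A", 120, "PC", "CSE"))  # 120 pieces of PROD_A in cases
```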
12.7.2 Configuring the Analytics Dashboards Created with SAPUI5
The following sections describe the settings you have to make to configure the analytics dashboards created with the UI development toolkit for HTML5 (SAPUI5). You have to set up your product and market hierarchies to be able to use the Market Share Analysis and Sales Performance Analysis analytics dashboards. You can configure the look and feel of the Market Share Analysis analytics dashboard.
12.7.2.1 Configuring Product and Location Hierarchies for the Analytics Dashboards
Use
You have to set up your product and market hierarchies to be able to use the Market Share Analysis and Sales Performance Analysis analytics dashboards.
Hierarchy Set-Up for Market Share Analysis
You can set up product and market hierarchies either using the default BEx queries delivered by SAP that do not contain any product or location attributes or using custom BEx queries that do contain product or location attributes.
You can set up the hierarchies with or without an SAP BW hierarchy. An SAP BW hierarchy contains data from a hierarchy that is created either manually or automatically by the system during a data upload. You can create an SAP BW hierarchy manually in the transaction Modeling - DW Workbench (RSA1). These hierarchies consist of Source Product (/DDF/PRODREF) nodes for product hierarchies or Source Location (/DDF/LOCREF) nodes for location hierarchies. You can also use text nodes in the hierarchies.
During a data upload, one or more SAP BW hierarchies can be created automatically for products and locations. These hierarchies also consist of Source Product (/DDF/PRODREF) nodes for product hierarchies or Source Location (/DDF/LOCREF) nodes for location hierarchies.
Therefore, product and market hierarchies can be set up in one of the following ways:
● Flat hierarchy
  A hierarchy based on default BEx queries without an SAP BW hierarchy.
● Attribute-based hierarchy
  A hierarchy based on the product or location attributes of custom BEx queries without an SAP BW hierarchy.
● SAP BW hierarchy
  A hierarchy based on default BEx queries with an SAP BW hierarchy. This is the default configuration for the Market Share Analysis analytics dashboard.
● Hybrid hierarchy
  A hierarchy based on the product and location attributes of custom BEx queries with an SAP BW hierarchy.
Note
You can set up the product and market hierarchies in different ways. For example, you can use SAP BW hierarchies with the BEx queries delivered by SAP for the product hierarchies, and hybrid hierarchies with custom BEx queries for the market hierarchies.
By default, the following BEx queries are used for the hierarchies for the Market Share Analysis analytics dashboard:
● Market Share Sales by Product (/DSR/CP11_MKSHARE_V2_Q0008)
● Market Share Sales by Location (/DSR/CP11_MKSHARE_V2_Q0009)
Hierarchy Set-Up for Sales Performance Analysis
The only possible product and location hierarchy configuration for the Sales Performance Analysis analytics dashboard is manually created SAP BW hierarchies with the BEx queries delivered by SAP.
By default, the following BEx queries are used for the hierarchies for the Sales Performance Analysis analytics dashboard:
● KPIs by Location Hierarchy Node (/DSR/CP02_SA2_Q0018)
● KPIs by Product Hierarchy Node (/DSR/CP02_SA2_Q0019)
Product hierarchies are based on the Material (0MATERIAL) InfoObject, which contains manufacturer products.
Location hierarchies are based on the Source Location (/DDF/LOCREF) InfoObject, which contains retailer stores.
More Information
Defining Hierarchy Parameters [page 516]
12.7.2.1.1 Configuring Flat Hierarchies
Use
In this scenario, SAP NetWeaver Business Warehouse (SAP NetWeaver BW) hierarchies do not exist; therefore, the default BEx queries are used to create product and market hierarchies for the Market Share Analysis analytics dashboard. The hierarchies are flat because they consist only of a root node: since no product or location attributes are available, all products and locations appear directly under the root node.
Prerequisites
A custom query for the Market Share Analysis analytics dashboard and for the DEF_GEOG or DEF_PROD query use has not been defined in the Define Custom Queries activity in Customizing for Cross-Application Components under Demand Signal Management > Analytics and Reporting > Product and Location Hierarchies.
Procedure
1. On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Analytics and Reporting > Define Hierarchy Parameters.
2. Create a new entry, for example, for the MKS analytics dashboard, for the GEOG dimension, and make sure that you leave the Hierarchy Name and the Query Name fields empty.
Result
The Market Share Analysis analytics dashboard will show product and market hierarchies based on the default BEx queries.
12.7.2.1.2 Configuring Attribute-Based Hierarchies
Use
In this scenario, an SAP NetWeaver Business Warehouse (SAP NetWeaver BW) hierarchy does not exist. You can create custom BEx queries by creating copies of the default BEx queries and adding product and location attributes to them. For example, you can add the Source Subcategory (/DDF/PRODSCATL) attribute to the product hierarchy.
You can also do the following:
● Define a custom query for a specific data delivery agreement
● Define a custom query for all data delivery agreements that do not have an explicitly specified custom BEx query
Procedure
Define a Custom Query for a Specific Data Delivery Agreement
1. Make sure that a custom query for the Market Share Analysis analytics dashboard and DEF_PROD query use has not been defined in the Define Custom Queries activity in Customizing for Cross-Application Components under Demand Signal Management > Analytics and Reporting > Product and Location Hierarchies.
2. In the BEx Query Designer, open the Market Share Sales by Product (/DSR/CP11_MKSHARE_V2_Q0008) and on the Rows/Columns tab page, in the Rows section, add the Source Subcategory (/DDF/PRODUCT__/DDF/PRODSCATL) characteristic.
3. Save the BEx query with a different technical name.
4. On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Analytics and Reporting > Define Hierarchy Parameters.
5. Create a new entry for the MKS analytics dashboard, for the PROD dimension, enter a data delivery agreement, enter the name of the query you created, and make sure that you leave the Hierarchy Name field empty.
Define a Custom Query for All Data Delivery Agreements
1. In the BEx Query Designer, open the Market Share Sales by Product (/DSR/CP11_MKSHARE_V2_Q0008) and on the Rows/Columns tab page, in the Rows section, add the Source Subcategory (/DDF/PRODUCT__/DDF/PRODSCATL) characteristic.
2. Save the BEx query with a different technical name.
3. Create a new entry in the Define Custom Queries activity in Customizing for Cross-Application Components under Demand Signal Management > Analytics and Reporting > Product and Location Hierarchies for the MKS analytics dashboard, for the DEF_PROD query use, and enter the name of the query you created.
4. On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Analytics and Reporting > Define Hierarchy Parameters.
5. Create a new entry for the MKS analytics dashboard, for the PROD dimension, and make sure that you leave the Hierarchy Name and the Query Name fields empty.
The custom query you created will be used for all data delivery agreements that do not have an explicitly specified custom BEx query in the Define Hierarchy Parameters transaction.
Result
The Market Share Analysis analytics dashboard will show a product hierarchy based on the attributes that are used in your custom BEx query.
12.7.2.1.3 Configuring SAP NetWeaver Business Warehouse Hierarchies
Use
In this scenario, product and market hierarchies are based on default BEx queries (without attributes) with SAP NetWeaver Business Warehouse (SAP NetWeaver BW) hierarchies. This is the default configuration for the Market Share Analysis analytics dashboard.
Prerequisites
A custom query for the Market Share Analysis analytics dashboard and for the DEF_GEOG or DEF_PROD query use has not been defined in the Define Custom Queries activity in Customizing for Cross-Application Components under Demand Signal Management > Analytics and Reporting > Product and Location Hierarchies.
Procedure
1. On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Analytics and Reporting > Define Hierarchy Parameters.
2. Create a new entry, for example, for the MKS analytics dashboard, for the GEOG dimension, enter a hierarchy name and leave the Query Name field empty.
Result
The Market Share Analysis analytics dashboard will show product and market hierarchies based on the SAP NetWeaver BW hierarchy you specified.
12.7.2.1.4 Configuring Hybrid Hierarchies
Use
In this scenario, hierarchies are based on product and location attributes of custom BEx queries with SAP NetWeaver Business Warehouse (SAP NetWeaver BW) hierarchies. You can create custom BEx queries by creating copies of the default BEx queries and adding product and location attributes to them.
You can also do the following:
● Define a custom query for a specific data delivery agreement
● Define a custom query for all data delivery agreements that do not have an explicitly specified custom BEx query
Procedure
Define a Custom Query for a Specific Data Delivery Agreement
1. Make sure that a custom query for the Market Share Analysis analytics dashboard for the DEF_GEOG or DEF_PROD query use has not been defined in the Define Custom Queries activity in Customizing for Cross-Application Components under Demand Signal Management > Analytics and Reporting > Product and Location Hierarchies.
2. In the BEx Query Designer, open the Market Share Sales by Product (/DSR/CP11_MKSHARE_V2_Q0008) or the Market Share Sales by Location (/DSR/CP11_MKSHARE_V2_Q0009) query and on the Rows/Columns tab page, in the Rows section, add product or location attributes.
3. Save the BEx query with a different technical name.
4. On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Analytics and Reporting > Define Hierarchy Parameters.
5. Create a new entry for the MKS analytics dashboard, for the PROD or GEOG dimension, enter a data delivery agreement, enter the name of the SAP NetWeaver BW hierarchy you want to use, and enter the name of the query you created.
Define a Custom Query for All Data Delivery Agreements
1. In the BEx Query Designer, open the Market Share Sales by Product (/DSR/CP11_MKSHARE_V2_Q0008) or the Market Share Sales by Location (/DSR/CP11_MKSHARE_V2_Q0009) query and on the Rows/Columns tab page, in the Rows section, add product or location attributes.
2. Save the BEx query with a different technical name.
3. Create a new entry in the Define Custom Queries activity in Customizing for Cross-Application Components under Demand Signal Management > Analytics and Reporting > Product and Location Hierarchies for the MKS analytics dashboard, for the DEF_PROD or DEF_GEOG query use, and enter the name of the query you created.
4. On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Analytics and Reporting > Define Hierarchy Parameters.
5. Create a new entry for the MKS analytics dashboard, for the PROD or GEOG dimension, enter the name of the SAP NetWeaver BW hierarchy you want to use, and leave the Query Name field empty.
The custom query you created will be used for all data delivery agreements that do not have an explicitly specified custom BEx query in the Define Hierarchy Parameters transaction.
Result
The Market Share Analysis analytics dashboard will show product and market hierarchies based on the attributes defined in the custom query and the SAP NetWeaver BW hierarchy you specified.
Example
If you added the Manufacturer and Category attributes to a copy of the Market Share Sales by Product (/DSR/CP11_MKSHARE_V2_Q0008) query, the product hierarchy will be displayed on the Product view of the Market Share Analysis analytics dashboard as shown in the following figure:
12.7.2.1.5 Defining Hierarchy Parameters
Use
To give the end user a selection of product and location hierarchy data to choose from in the Market Share Analysis and Sales Performance Analysis analytics dashboards, you can either use the default settings or create your own settings. The end user can choose from the following data on the Data Selection screen:

● Product hierarchy description
● Market hierarchy description (known as the location hierarchy description in the backend configuration)
● Product category
● Country (known as a delivery region in the backend configuration)
Optionally, you can specify an SAP BW hierarchy and a custom BEx query that are used to display the product and location data.
Prerequisites
● You have made the required settings in the following activities in Customizing for Cross-Application Components under Demand Signal Management > Analytics and Reporting > Product and Location Hierarchies:
  ○ Define Analytics Dashboards
  ○ Define Dimensions
  You can use the default settings for the product and location hierarchies that are available in the following activities in Customizing for Cross-Application Components under Demand Signal Management > Analytics and Reporting > Product and Location Hierarchies:
  ○ Define Analytics Dashboards
  ○ Define Dimensions
  ○ Define Query Use
  ○ Display Default Queries
  You can set up your own custom hierarchies by making the required settings in the Define Custom Queries activity in Customizing for Cross-Application Components under Demand Signal Management > Analytics and Reporting > Product and Location Hierarchies.
● You have defined data delivery agreements on the SAP Easy Access screen under SAP Menu > Cross-Application Components > Demand Signal Management > Data Upload > Data Delivery Agreements > Define Data Delivery Agreements.
Activities
1. On the SAP Easy Access screen, choose SAP Menu > Cross-Application Components > Demand Signal Management > Analytics and Reporting > Define Hierarchy Parameters, or use transaction code /DSR/V_HIER_PAR.
2. Enter MKS for the Market Share Analysis or SA for the Sales Performance Analysis analytics dashboard.
3. Enter a dimension, either PROD for a product hierarchy or GEOG for a market hierarchy.
If you make an entry for the PROD dimension, you must enter the name of a Source Product (/DDF/PRODREF) hierarchy. If you make an entry for the GEOG dimension, you must enter the name of a Source Location (/DDF/LOCREF) hierarchy.
4. Enter a profile name, which is a unique ID for the entry.
5. Enter a profile description.
The description will be visible on the Data Selection screen of the analytics dashboards, either in the Product Hierarchy or the Market Hierarchy dropdown list depending on the dimension and dashboard.
6. Select the Visible checkbox if you want the hierarchy entry to be available on the Data Selection screen of the analytics dashboard.
7. For the Market Share Analysis analytics dashboard, enter a data delivery agreement, which corresponds to the category in Market Share Analysis. This is not used for Sales Performance Analysis.
8. For the Market Share Analysis analytics dashboard, enter a delivery region, which corresponds to country in Market Share Analysis. This is not used for Sales Performance Analysis.
Note
You can enter an asterisk (*) for the data delivery agreement and for the delivery region. The SAP BW hierarchy or BEx query that you define is then used for all delivery regions and data delivery agreements, depending also on the settings you make in the Hierarchy Name and Query Name fields.
9. The following options are possible for the Hierarchy Name and the Query Name fields:
   ○ Do not enter a hierarchy name or a query name
     Use this option for a flat hierarchy (based on default BEx queries without an SAP BW hierarchy) or an attribute-based hierarchy (based on the product or location attributes of custom BEx queries without an SAP BW hierarchy).
   ○ Enter a hierarchy name and do not enter a query name
     Use this option for an SAP BW hierarchy (based on default BEx queries with an SAP BW hierarchy).
   ○ Do not enter a hierarchy name and enter a query name
     Use this option for an attribute-based hierarchy (based on the product or location attributes of custom BEx queries without an SAP BW hierarchy).
   ○ Enter a hierarchy name and a query name
     Use this option for a hybrid hierarchy (based on product and location attributes of custom BEx queries with an SAP BW hierarchy).
Note
If you do not enter a query name and you have defined a custom query in the Define Custom Queries Customizing activity, the system uses your custom query. If you have not defined a custom query, the system uses the default query defined in the Display Default Queries Customizing activity.
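The decision logic described by these options and the note can be sketched as follows. This is a hypothetical Python model of the lookup behavior, not system code; all names and defaults are illustrative:

```python
def resolve_hierarchy(hierarchy_name, query_name, custom_query=None,
                      default_query="DEFAULT_QUERY"):
    """Sketch of how the Hierarchy Name / Query Name combination maps to a
    hierarchy type. An empty query name falls back to the custom query from
    Define Custom Queries, if defined, and otherwise to the default query."""
    effective_query = query_name or custom_query or default_query
    uses_custom = bool(query_name or custom_query)
    if hierarchy_name:
        kind = "hybrid" if uses_custom else "SAP BW hierarchy"
    else:
        kind = "attribute-based" if uses_custom else "flat"
    return kind, effective_query

print(resolve_hierarchy("", ""))          # flat, falls back to default query
print(resolve_hierarchy("H1", ""))        # SAP BW hierarchy
print(resolve_hierarchy("", "ZQUERY"))    # attribute-based
print(resolve_hierarchy("H1", "ZQUERY"))  # hybrid
```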
Note
When data is uploaded with a data set type for market research data (retail panel data), an entry is automatically created for the data delivery agreement in the Define Hierarchy Parameters view (with the PROD dimension for a product hierarchy or the GEOG dimension for a market hierarchy).
The Generated checkbox is automatically selected to show that the system generated the entry following the data upload. If you create the entry manually, you cannot select the Generated checkbox.
When data is uploaded, SAP BW hierarchies may be created too. These hierarchies are immediately visible on the Data Selection screen of the analytics dashboards, either in the Product Hierarchy or the Market Hierarchy dropdown list depending on the dimension.
More Information
Configuring Product and Location Hierarchies for the Analytics Dashboards [page 509]
12.7.2.2 Configuring the Market Share Analysis Analytics Dashboard
Use
You can change the look and feel of the Market Share Analysis analytics dashboard. You do this by copying properties from the default parameters.properties file for the analytics dashboard to the custom_parameters.properties file and changing their default values.
Procedure
1. Go to the Object Navigator (SE80) transaction.
2. Enter BSP Application, then enter /DSR/DSIM_UI5_MKS, and then choose Display.
3. Choose /DSR/DSIM_UI5_MKS > Page Fragments > WebContent > market_share > common > properties > parameters.properties and copy the properties you want to change to /DSR/DSIM_UI5_MKS > Page Fragments > WebContent > market_share > common > properties > custom_parameters.properties. You can copy one or more of the following properties:
Property Value
mks.node.threshold.negative Contains the threshold for showing the deviation in red. By default, the value is -1. This means that if the deviation is -1% or lower, it will be shown in red.
mks.node.threshold.positive Contains the threshold for showing the deviation in green. By default, the value is 1. This means that if the deviation is 1% or more, it will be shown in green.
mks.topn Contains the maximum number of top-selling markets or products to be displayed in the Relative Market Share for Top-Selling Markets and Relative Market Share for Top-Selling Products charts. By default, the value is 5.
mks.graph.nodesbypage Contains the maximum number of child nodes in a product or market hierarchy that can be expanded or collapsed at once. By default, the value is 3.
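As an illustration only, the two threshold properties could drive the deviation coloring roughly as in the following sketch. The function and its defaults are assumptions based on the table above, not the dashboard's actual code:

```python
# Hypothetical sketch: deviation coloring driven by the threshold
# properties from the table above (defaults mirror the documented values).
def deviation_color(deviation_pct,
                    threshold_negative=-1.0,   # mks.node.threshold.negative
                    threshold_positive=1.0):   # mks.node.threshold.positive
    if deviation_pct <= threshold_negative:
        return "red"
    if deviation_pct >= threshold_positive:
        return "green"
    return "neutral"

print(deviation_color(-2.5))  # red
print(deviation_color(0.4))   # neutral
print(deviation_color(1.0))   # green
```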
Result
The values that you define in the custom_parameters.properties file will be used instead of the default values in the parameters.properties file.
12.8 Extensibility of Analytics and Reporting
You can extend both the Web Dynpro reports and the analytics dashboards created with the UI development toolkit for HTML5 (SAPUI5). The following sections describe how you can do that.
12.8.1 Extensibility of the Web Dynpro Reports
The user interface (UI) frameworks that are used to build the analytics and reporting UI allow you to extend the Web Dynpro reports that are delivered with SAP Demand Signal Management. For example, you can extend the UI by doing the following:
● Create new BEx queries or change the existing ones, for example, by adding new key figures or attributes
● Add a new chart user interface building block (chart UIBB)
12.8.1.1 Adding a New Key Figure to a Web Dynpro Report
Use
If you want to add a new key figure to the On-Shelf Availability Analysis by Location report, for example, average lost sales by product out-of-shelf, which is calculated by dividing the cumulative net lost sales by the number of products out-of-shelf, you have the following two options:
● You can create a new BEx query that includes the new key figure and create a new component configuration. You have to assign the new query and the new component configuration to the corresponding user interface building block (UIBB) component.
● You can change a BEx query delivered by SAP and add the new key figure. Since the BEx query is already assigned to a component configuration, no further steps are required.
Note
If you use this method and a new version of the query is delivered, the modified query cannot be updated automatically. You have to manually merge the modified query with the new version that is delivered.
The steps you have to perform for the first option are described below.
Procedure
1. Go to transaction Object Navigator (SE80) and select the /DSR/UI_CONS package.
2. Under Object Name, choose Web Dynpro > FPM Applications > /DSR/WDA_CONS_OOSF_ANALYSIS (On-Shelf Availability Analysis by Location) > FPM Application Configurations > /DSR/WDA_CONS_OOSF_ANALYSIS__CP (On-Shelf Availability Analysis Configuration) and choose Display Configuration.
3. On the Application Configuration page, in the Assign Web Dynpro Component assignment block, choose the configuration name.
4. On the Component Configuration page, in the Overview Page Schema assignment block, select the search UIBB component and choose Configure UIBB.
5. To display the feeder class and the BEx queries that are used by the search UIBB component, choose Feeder Class Parameters.
6. Go to the BEx Query Designer and search for the queries you found in the previous step.
7. After you identify the BEx query that you have to change, save a local copy with a different technical name.
8. On the Rows/Columns tab page, in the Columns section, define the new key figure and save your changes.
9. On the Component Configuration page, choose Additional Functions > Deep-Copy.
10. You cannot change the configurations that are delivered; you have to save a local copy with a different name. On the Floorplan Manager: Application Hierarchy Browser page, change the target configuration IDs and choose Start Deep-Copy.
11. Go to transaction Object Navigator (SE80) and navigate to the FPM application configuration you created in the previous step and choose Start Configurator.
12. On the Application Configuration page, in the Assign Web Dynpro Component assignment block, choose the configuration name.
13. On the Component Configuration page, in the Overview Page Schema assignment block, select the search UIBB component and choose Configure UIBB.
14. Choose Feeder Class Parameters.
15. Replace the old query with the one you created in the previous steps and save your changes.
Note
You have added the new key figure to the search UIBB component only. If you want to add the new key figure to the hierarchical list UIBB and list UIBB components for the location and product views, you must configure each UIBB component to point to the new configuration. In that case, you must change more than one BEx query, since there are different queries for the hierarchy and list views.
12.8.1.2 Adding a Chart UIBB to a Web Dynpro Report
Use
If you want to add a generic chart user interface building block (chart UIBB) to the On-Shelf Availability Analysis by Location report, for example, you have to create a new component configuration, add the chart UIBB, and assign a generic feeder class to the chart UIBB component. If you want to add a more complex chart to the report, you have to define the feeder class for that first.
Procedure
1. Go to transaction Object Navigator (SE80) and select the /DSR/UI_CONS package.
2. Under Object Name, choose Web Dynpro > FPM Applications > /DSR/WDA_CONS_OOSF_ANALYSIS (On-Shelf Availability Analysis by Location) > FPM Application Configurations > /DSR/WDA_CONS_OOSF_ANALYSIS__CP (On-Shelf Availability Analysis Configuration) and choose Display Configuration.
3. On the Application Configuration page, in the Assign Web Dynpro Component assignment block, choose the configuration name.
4. On the Component Configuration page, choose Additional Functions > Deep-Copy.
5. You cannot change the configurations that are delivered. You have to save a local copy with a different name. On the Floorplan Manager: Application Hierarchy Browser page, change the target configuration IDs and choose Start Deep-Copy.
6. Go to transaction Object Navigator (SE80) and navigate to the FPM application configuration you created in the previous step and choose Display Configuration.
7. On the Application Configuration page, in the Assign Web Dynpro Component assignment block, choose the configuration name.
8. On the Component Configuration page, choose Edit.
9. In the Overview Page Schema assignment block, choose UIBB > Analytics Chart Component.
10. In the Preview assignment block, select the chart UIBB component and choose the wrench button in the top right corner.
11. On the Component Configuration page, choose Edit and then choose Feeder Class.
12. Enter the /DSR/CL_BS_ANLY_CHART_F feeder class and save your changes.
You can now use the new configuration instead of the delivered one.
12.8.1.3 Removing Columns from the Input Help in the Search Criteria
Use
You can remove a column from the input help in the search criteria for the Web Dynpro reports, for example, the Promotion Analysis report.
Procedure
1. Log on to the system where the report is running.
2. Go to the Characteristic maintenance (RSD1) transaction.
3. Enter an InfoObject, for example, 0CRM_MKTELM for the Promotion Analysis report, and choose Maintain.
4. Choose the Attributes tab.
The Order for F4 Help column determines the order of the columns for the input help.
5. Change the value of any attribute that you do not want to be displayed in the input help to zero (0).
6. Save and activate your changes.
12.8.1.4 Displaying Country Names for Market Research Countries
Use
In the transaction Enhance Data Delivery Agreements (/DDF/DDAGR_MRD), you assign delivery regions to data delivery agreements that are relevant for uploading market research data. A delivery region represents a single country, a geographical region, or a bundle of several countries for which a specific data delivery is valid. The following are examples of delivery regions:
● GERMANY represents Germany
● US_EAST represents the east coast of the United States of America
● BENELUX represents Belgium, the Netherlands, and Luxembourg
● NORDIC represents Denmark, Sweden, Norway, and Finland
Integration
By maintaining meaningful descriptions for these delivery regions, you ensure that end users see a list of meaningful country names in the selection criteria when they use the Data Selection to filter the results displayed on the analytics dashboards.
12.8.2 Extensibility of the Analytics Dashboards Created with SAPUI5
You can extend the Sales Performance Analysis and Market Share Analysis analytics dashboards by adding new key figures to their views, for example, the Map, Location, or Product view.
If you want to add new key figures to one of the views, you have to do the following:
1. Save a copy of the corresponding BEx query and add the new key figures to it.
You can display the default BEx query to be used for each analytics dashboard in Customizing for Cross-Application Components under Demand Signal Management > Analytics and Reporting > Product and Location Hierarchies > Display Default Queries.
2. Assign the new BEx query to the analytics dashboard in the Define Custom Queries activity in Customizing for Cross-Application Components under Demand Signal Management > Analytics and Reporting > Product and Location Hierarchies.
3. Create custom properties for the format of the key figure in the Business Server Pages application (BSP application).
If you want to go back to the initial configuration of the analytics dashboards, you have to remove the assignment of the new BEx query to the analytics dashboard in Customizing and remove the custom properties for the key figure in the BSP application.
12.8.2.1 Adding a New Key Figure to the Map View of the Sales Performance Analysis Analytics Dashboard
1. In the BEx Query Designer, open the KPIs by Geography (/DSR/CP02_SA2_Q0020) query and save a local copy with a different technical name.
2. On the Rows/Columns tab page, in the Columns section, right-click on Key Figures and, in the context menu, choose New Formula.
3. Click on the new formula and choose Edit.
4. Enter a description, for example, Double Net Sales, and in the Detail View field, enter the formula, for example, Net Sales * 2.
5. Move the key figure to the bottom of the list of key figures and save your changes.
6. Assign the new query to the analytics dashboard by creating the following new entry in the Define Custom Queries activity in Customizing for Cross-Application Components under Demand Signal Management > Analytics and Reporting > Product and Location Hierarchies:
Field Value
Dashboard Enter SA.
Query Use Enter SA_0001 (KPIs by Geography).
Query Name Enter the technical name of the query you created.
Query InfoProvider Enter the technical name of the InfoProvider for the query you created.
7. Save your entries.
8. Go to the Object Navigator (SE80) transaction.
9. Enter BSP Application, then enter /DSR/DSIM_UI5_SA, and then choose Display.
10. Choose /DSR/DSIM_UI5_SA > Page Fragments > WebContent > sales_analysis > common > properties > custom_parameters.properties.
11. Add the properties for the format of the new key figure as follows:
Property Value
sa.overview_v2.map.kf<N>.type
(<N> is the position of the key figure in the Columns section of the BEx query.)
The following values are possible:
○ currency
For key figures that contain currency.
○ unit_of_measure
For key figures that contain quantities with units of measure.
○ percentage
For key figures that contain numbers displayed as percentages.
○ no_unit
For key figures that contain numbers without units.
sa.overview_v2.map.kf.default Enter the value kf<N>, where <N> is the number of the key figure that will be displayed in the map view by default. For example, if the value is kf4, the key figure in the fourth position in the Columns section of the BEx query will be displayed by default.
For example, the custom_parameters.properties file can be as follows:
sa.overview_v2.map.kf11.type = currency
sa.overview_v2.map.kf.default = kf4
12. Save your entries.
12.8.2.2 Adding a New Key Figure to the Location or Product View of the Sales Performance Analysis Analytics Dashboard
1. In the BEx Query Designer, open the KPIs by Location Hierarchy Node (/DSR/CP02_SA2_Q0018) or the KPIs by Product Hierarchy Node (/DSR/CP02_SA2_Q0019) query and save a local copy with a different technical name.
2. On the Rows/Columns tab page, in the Columns section, right-click on Key Figures and, in the context menu, choose New Formula.
3. Click on the new formula and choose Edit.
4. Enter a description, for example, Double Net Sales, and in the Detail View field, enter the formula, for example, Net Sales * 2.
5. Move the key figure to the bottom of the list of key figures and save your changes.
6. Assign the new query to the analytics dashboard by creating the following new entry in the Define Custom Queries activity in Customizing for Cross-Application Components under Demand Signal Management > Analytics and Reporting > Product and Location Hierarchies:
Field Value
Dashboard Enter SA.
Query Use Enter SA_0006 (KPIs by Location Hierarchy Node) or SA_0007 (KPIs by Product Hierarchy Node).
Query Name Enter the technical name of the query you created.
Query InfoProvider Enter the technical name of the InfoProvider for the query you created.
7. Save your entries.
8. Go to the Object Navigator (SE80) transaction.
9. Enter BSP Application, then enter /DSR/DSIM_UI5_SA, and then choose Display.
10. Choose /DSR/DSIM_UI5_SA > Page Fragments > WebContent > sales_analysis > common > properties > custom_parameters.properties.
11. Add the formatting properties for the new key figure as follows:
Property Value
sa.customer_v3.kf<N>.valueProperty
(<N> is the position of the key figure in the Columns section of the BEx query.)
The following values are possible:
○ value
For key figures that contain a value without formatting.
○ formatted_value
For key figures that contain a value as formatted in the BEx Query Designer.
sa.customer_v3.kf<N>.type The following values are possible:
○ currency
For key figures that contain currency.
○ unit_of_measure
For key figures that contain quantities with units of measure.
○ percentage
For key figures that contain numbers displayed as percentages.
sa.customer_v3.kf<N>.positiveTrend This property is only used when the sa.customer_v3.kf<N>.type property key of the key figure is set to percentage.
The following values are possible:
○ higher_numbers
For key figures for which increasing values indicate a positive trend. For example, if net sales are increasing, a positive trend is shown.
○ lower_numbers
For key figures for which increasing values indicate a negative trend. For example, if net lost sales are increasing, a negative trend is shown.
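Analogous to the map view example, the custom_parameters.properties file could then contain entries like the following. The position kf11, the formatted_value setting, and the currency type are illustrative assumptions for a hypothetical new key figure, not delivered values:

```properties
# Illustrative entries for a hypothetical new key figure
# at position 11 of the Columns section of the BEx query
sa.customer_v3.kf11.valueProperty = formatted_value
sa.customer_v3.kf11.type = currency
```

Since the type in this sketch is currency rather than percentage, no sa.customer_v3.kf11.positiveTrend entry is needed.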
12. Save your entries.
12.8.2.3 Adding a New Key Figure to the Market Share Analysis Analytics Dashboard
Use
You can extend the Market Share Analysis analytics dashboard by adding new key figures to the Market and Product views.
Procedure
1. In the BEx Query Designer, open the Market Share Sales by Location (/DSR/CP11_MKSHARE_V2_Q0009) or the Market Share Sales by Product (/DSR/CP11_MKSHARE_V2_Q0008) query and save a local copy with a different technical name.
2. On the Rows/Columns tab page, in the Columns section, right-click on Key Figures and, in the context menu, choose New Formula.
3. Click on the new formula and choose Edit.
4. Enter a description, for example, Double Sales Value, and in the Detail View field, enter the formula, for example, Sales Value * 2.
5. Move the key figure to the bottom of the list of key figures and save your changes.
Note
Do not change the order of the original key figures.
6. Assign the new query to the analytics dashboard by creating the following new entry in the Define Custom Queries activity in Customizing for Cross-Application Components under Demand Signal Management > Analytics and Reporting > Product and Location Hierarchies:
Field Value
Dashboard Enter MKS.
Query Use Enter DEF_GEOG (Default Location Query) or DEF_PROD (Default Product Query).
Query Name Enter the technical name of the query you created.
Query InfoProvider Enter the technical name of the InfoProvider for the query you created.
7. Save your entries.
8. Go to the Object Navigator (SE80) transaction.
9. Enter BSP Application, then enter /DSR/DSIM_UI5_MKS, and then choose Display.
10. Choose /DSR/DSIM_UI5_MKS > Page Fragments > WebContent > market_share > common > properties > parameters.properties and copy the following properties to /DSR/DSIM_UI5_MKS > Page Fragments > WebContent > market_share > common > properties > custom_parameters.properties.
Property Value
mks.kpi Contains the list of the key figures that can be displayed in the Product and Market views, for example, relative,relativeDeviation,absolute,absoluteDeviation,SalesValue,SalesValueDeviation,SalesQnt,SalesQntDeviation.
Add the new key figure at the end of the list, separated from the other key figures by a comma (,). The name must start with a letter and, if it consists of several words, they must be concatenated without spaces.
For example, the entry can be mks.kpi=relative,relativeDeviation,absolute,absoluteDeviation,SalesValue,SalesValueDeviation,SalesQnt,SalesQntDeviation,DoubleSalesValue.
Note
If you remove a key figure from this list, the values for this key figure in the properties below are ignored.
mks.kpi.<Name of the key figure>.attributeName
Contains the name of the attribute that is used as a placeholder for the new key figure.
By default, the BEx query contains four key figures. You cannot add more than five new key figures since a total of nine key figures is supported. If you add a fifth key figure, the attribute name must be Ext1v; if you add a sixth key figure, the attribute name must be Ext2v, and so on. These are the only possible values for the attribute names.
For example, the entry can be mks.kpi.DoubleSalesValue.attributeName=Ext1v.
mks.kpi.<Name of the key figure>.label Contains the name of the key figure that will appear on the UI. The label can be any string or a key from the /DSR/DSIM_UI5_MKS > Page Fragments > WebContent > market_share > common > properties > parameters.properties file.
For example, the entry can be mks.kpi.DoubleSalesValue.label=Double Sales Value.
mks.kpi.<Name of the key figure>.format
Contains the format of the key figure.
Only one of the following entries is possible:
○ mks.kpi.DoubleSalesValue.format=dsimPercentageTwoDecimals
○ mks.kpi.DoubleSalesValue.format=dsimPercentage
○ mks.kpi.DoubleSalesValue.format=dsimNumberOneDecimal
○ mks.kpi.DoubleSalesValue.format=dsimNumberNoDecimal
For example, the custom_parameters.properties file can be as follows:
mks.kpi=relative,relativeDeviation,absolute,absoluteDeviation,SalesValue,SalesValueDeviation,SalesQnt,SalesQntDeviation,DoubleSalesValue
mks.kpi.DoubleSalesValue.attributeName=Ext1v
mks.kpi.DoubleSalesValue.label=Double Sales Value
mks.kpi.DoubleSalesValue.format=dsimNumberOneDecimal
Note
If you added more than one key figure to the BEx query, you have to add the property entries above for each key figure.
You must only add the property entries once, regardless of whether you added new key figures to the Market Share Sales by Location (/DSR/CP11_MKSHARE_V2_Q0009) or the Market Share Sales by Product (/DSR/CP11_MKSHARE_V2_Q0008) query. However, if you want the result for the new key figures to be displayed on both the Market and Product views, you have to add the new key figures to both queries. If you add the new key figures to the Market Share Sales by Location (/DSR/CP11_MKSHARE_V2_Q0009) query only and not to the Market Share Sales by Product (/DSR/CP11_MKSHARE_V2_Q0008) query, for example, the new key figures will be displayed on the Product view but their values will be zero.
11. Save your entries.
13 Roles for SAP Demand Signal Management, version for SAP BW/4HANA
You can define authorizations and access to the user interface (UI) by means of authorization roles (PFCG roles).
The following types of roles are used in SAP Demand Signal Management, version for SAP BW/4HANA:
1. Business role (BCR)
2. Authorization role
3. Technical role (TCR)
Business Roles
You use business roles to provide access to user interfaces using the SAP Fiori Launchpad.
A business role contains references to business catalogs and business catalog groups. Once you assign the business role to a user, the user interfaces included are available as tiles in the business catalog group on the entry page of the SAP Fiori launchpad. Through the business catalog referenced, users have access to further relevant role-specific user interfaces.
The business catalog group can contain the following types of user interfaces:
1. Apps based on SAP Fiori
2. Web user interfaces (UI) based on the UI development toolkit for HTML5 (SAP UI5)
3. Web UIs based on Web Dynpro
The corresponding authorizations are defined using authorization roles.
You can find a complete list of the business roles and the related authorization roles under Business Roles for SAP Demand Signal Management, version for SAP BW/4HANA [page 531].
Authorization Roles
You use authorization roles to define the authorization for each system user or group of system users. An authorization role contains authorizations for the back-end server. It can also contain references to the corresponding OData services that are required to use applications.
These roles can also be used to provide access to user interfaces based on SAP Graphical User Interface (SAP GUI) or Web Dynpro:
1. User menu in SAP GUI
2. SAP NetWeaver Business Client (NWBC)
3. SAP NetWeaver Portal
You need to load the role from the back-end system to the SAP NetWeaver Portal. For more information, see SAP Note 1685257.
In addition to the authorization roles that are related to the business roles, the following authorization roles are available:
1. Administrators
The tasks of an administrator are to review and adapt the Customizing settings, configure the system, and run reports.
○ SAP_DSIM_CONFIGURATION_2 Administrator for Demand Data Foundation
For more information, see Administrator for Demand Data Foundation [page 533].
○ SAP_DSIM_CONFIGURATION Administrator for Demand Signal Management
2. Support user roles
These roles provide access to all user interfaces in display mode.
○ SAP_DSIM_DISPLAY_3 Support User Role for Demand Data Foundation
○ SAP_DSIM_DISPLAY Support User Role for Demand Data Foundation
Technical Roles
Technical roles contain references to technical catalogs, and they allow users to access the apps contained in these catalogs.
The following technical roles are available:
● SAP_DSIM_TCR_T DSiM Transactional Apps
13.1 Business Roles for SAP Demand Signal Management, version for SAP BW/4HANA
The following table provides an overview of the business roles for SAP Demand Signal Management, version for SAP BW/4HANA and the dependent tasks, authorization roles, business catalogs, and business catalog groups.
Business Role / Description and Tasks / Authorization Role / Business Catalog / Business Catalog Group
Configuration Expert SAP_DSIM3_BCR_CONFIG_EXPERT_T
SAP_DSIM_HARM_STATUS_APP
SAP_DSIM_DELVMONITOR_APP
SAP_DSIM_EVENTMONITOR_APP
SAP_DSIM3_BC_CONFIGURATION
SAP_DSIM3_BCG_CONFIGURATION_T
1. and monitor data deliveries
2. Create data delivery agreements
3. data delivery agreements
4. publishing groups
5. consolidation
6. the data that is relevant for global analyses and consolidate it
7. data and convert data to a common target time granularity
8. Weighting Factors
Business Analyst SAP_DSIM3_BCR_BUS_ANALYST_T
SAP_DSIM_MANAGE_BOM_APP
SAP_DSIM_HARM_STATUS_APP
SAP_DSIM_GMDATARELEASE_APP
SAP_DSIM_GMDATAPUBLISH_APP
SAP_DSIM3_BC_RELEASE_PUBLISH
SAP_DSIM3_BC_PROD_HARM
SAP_DSIM3_BC_RELEASE_PUBLISH_T
SAP_DSIM3_BCG_PRODHARM_T
1. new external data sources
2. errors during the processing of data before it is available for reporting
3. external data to internal company data
4. errors that occur while mapping or uploading of external data
5. cleansing, and synthesizing data
6. insights and opportunities to improve processes
7. inventory and measure the outcome
Harmonization User (relevant for Level 2 harmonization)
SAP_DSIM_HARMONUSER3_APP
SAP_DSIM_HARM_STATUS_APP
SAP_DSIM3_BC_PROD_HARM_LVL2
SAP_DSIM3_BCG_PRODHARM_LVL2_T
1. and resolve work items
2. allowed attribute values and derivation instruction sets for attribute value harmonization
3. and edit mapping of uploaded source products and locations to harmonized products and locations
4. and edit harmonized products and locations
5. harmonized products and locations for global reporting
System Administrator SAP_DSIM3_BCR_SYSTEM_ADMIN_T
SAP_DSIM_HARM_STATUS_APP
SAP_DSIM_DELVMONITOR_APP
SAP_DSIM_EVENTMONITOR_APP
SAP_DSIM3_BC_SYSTEMMONITORING
SAP_DSIM3_BCG_SYSADMIN_T
1. a seamless running of background jobs
2. errors that occur during the execution of background jobs
3. up the File System
4. Installation Backup
5. the Configuration processes
IBP Integration User SAP_DSIM_BCR_IBPINTEGRATION
SAP_DSIM_IBPINTEGRATION_APP
SAP_DSIM_BC_IBPINTEGRATION
SAP_DSIM_BCG_IBPINTEGRATION
1. the quality and status of POS data that is relevant for IBP integration
2. individual release statuses for individual customer/manufacturer DC combinations
3. obsolete release statuses, release dates, and time periods
4. mass releases of POS data to IBP
13.2 Administrator for Demand Data Foundation
SAP_DSIMDDF_CONFIGURATION_2
Use
The tasks of an administrator for Demand Data Foundation are to review and adapt the Customizing settings, configure the system, and run reports.
Activities
Typical activities for this role are:
1. Review and adapt the Customizing for Cross-Application Components under Demand Data Foundation
2. Configure the data flow in SAP Business Warehouse (SAP BW)
3. (Optional) Configure the query schema for SAP Data Services queries
4. Define location calendars
5. Delete account assignments
6. Configure the data upload
7. Define data delivery agreements for retailer and retail panel data, as well as their dependent objects
8. Define configurable CSV formatted files for upload in the UI for Manage Agreements
9. Map external fields for retailer and retail panel data in the UI for Manage Agreements
10. Manage hierarchy levels and descriptions
11. Monitor data upload jobs using the Monitor Jobs UI
12. Synchronize data delivery agreements from the test system to the production system on the Synchronization of Data Delivery Agreements Web UI
13. Configure quality validation
14. Set up data harmonization
15. Set up attribute value harmonization on the Set Source System for Value Restrictions and Select Attributes for Value Restriction Web UIs
16. Run reports for data harmonization
17. Define periods of sales deviations for data enrichment
18. Provide images for Global Market Share Analysis
19. Delete unused profiles
13.3 Data Upload Supervisor
SAP_DSIM_PROCESS_MONITOR_2
Use
The tasks of the data upload supervisor are the following:
1. Control the background jobs that are used in the automatic data upload
The supervisor can do this in the Monitor Jobs UI or in transaction Job Selection (SM37).
2. Monitor the data upload
The supervisor can do this in the Monitor Deliveries UI.
3. Correct any errors in the upload process
The supervisor can monitor the data upload for any errors in the Monitor Deliveries UI and take action to correct those errors in SAP Business Warehouse, for example, in transaction Process Chain Maintenance (RSPC) to repair process chains.
4. Change the statuses of objects in the data upload to control the upload process
The supervisor can change object statuses in the Monitor Deliveries UI.
Activities
Typical activities for this role are monitoring and making manual changes to the data upload in SAP Demand Signal Management, version for SAP BW/4HANA. The Data Upload Supervisor ensures that the file system and file transfer is set up correctly and data can be uploaded successfully.
The data upload supervisor can perform the following activities:
1. Start and stop jobs
2. Monitor application logging and system messages for jobs, processes, process steps, and process chains
3. Cancel processes and process steps
4. Reload data deliveries
5. Skip steps
6. Put steps on hold and restart them
7. Execute steps manually and set them to completed
14 Apps for SAP Demand Signal Management, version for SAP BW/4HANA
As of SAP Demand Signal Management, version for SAP BW/4HANA, all the apps for this solution are available on the SAP Fiori Launchpad. You can access the apps from this launchpad and also through individual transactions. The apps are now categorized under specific catalogs in the launchpad. These catalogs are based on the business roles assigned to each app.
The following transactional and analytical apps that are designed based on SAP UI5 are available for SAP Demand Signal Management, version for SAP BW/4HANA:
● Manage Agreements
● Manage Contexts
● Manage Data Origin
● Manage Data Provider
● Manage Derivation Instruction Sets (Products/Locations)
● Manage Objects (Products/Locations)
● Manage Region
● Manage Time Derivation
● Match External Hierarchy
● Monitor Deliveries
● Plan Data Deliveries
● Restrict Attribute Value
● Harmonization Status
● Manage Bill of Materials
● Monitor Jobs
Related Information
Manage Agreements [page 537]
Manage Contexts [page 539]
Manage Data Origin [page 542]
Manage Data Provider [page 544]
Manage Derivation Instruction Sets (Products/Locations) [page 546]
Manage Objects (Products/Locations) [page 549]
Manage Region [page 551]
Manage Time Derivation [page 554]
Match External Hierarchy [page 556]
Monitor Deliveries [page 558]
Plan Data Deliveries [page 561]
Restrict Attribute Value [page 563]
Harmonization Status [page 565]
Manage Bill of Materials [page 568]
Monitor Jobs [page 570]
14.1 Manage Agreements
Use
With the master data app Manage Agreements you can create and maintain agreements corresponding to deliveries in SAP Demand Signal Management, version for SAP BW/4HANA.
Key Features
● Create and maintain agreements along with data sets, file sets, file formats, filters, mapping and quality validation parameters.
● View a list of existing agreements
● Filter the list of agreements using a filter bar with value helps
● Case-insensitive live search for all value helps
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)
○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
○ Required software components: DSIM4H 100
● UI add-on (front-end components)
○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
○ Required software components: UIDSIM4H 100
Related Apps
Monitor Deliveries [page 558]
Manage Contexts [page 539]
Manage Region [page 551]
Manage Data Provider [page 544]
Manage Data Origin [page 542]
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.1.1 App Implementation: Manage Agreements
Technical Data
The following tables list technical objects specific to the Manage Agreements app:
Back- End Components
OData Service Authorization Role (PFCG Role)
/DDF/DDAGR SAP_DSIM_DDAGR_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Backend Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: UI5 Application
Component Technical Name
UI5 application /DDF/DDAGR
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
For more information, also see the chapter Activate SICF Services for SAP Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_CONF_EXPERT_T
Business Catalog SAP_DSIMBW4H_BC_CONFIGURATION
Business Catalog Group SAP_DSIMBW4H_BCG_CONFIGURATION_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Prerequisites for Configuration
Before implementing the app, you must ensure the following:
TO BE ADDED
More Information
For general information about how to implement SAP Fiori apps, see the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
14.1.2 App Extensibility: Manage Agreements
The Manage Agreements app is not suitable for extension.
14.2 Manage Contexts
With the master data app Manage Contexts you can create and maintain contexts in SAP Demand Signal Management, version for SAP BW/4HANA.
Key Features
● Create new contexts
● View a list of existing contexts
● Maintain existing contexts
● Delete obsolete contexts
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)
○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
○ Required software components: DSIM4H 100
● UI add-on (front-end components)
○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
○ Required software components: UIDSIM4H 100
Related Apps
Manage Agreements [page 537]
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.2.1 App Implementation: Manage Contexts
Technical Data
The following tables list technical objects specific to the Manage Contexts app:
Back-End Components
OData Service Authorization Role (PFCG Role)
/DDF/DDAGR SAP_DSIM_DDAGR_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Back-End Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori
Front-End Server: UI5 Application
Component Technical Name
UI5 application /DDF/CONTEXT
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
Front-End Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_CONF_EXPERT_T
Business Catalog SAP_DSIMBW4H_BC_CONFIGURATION
Business Catalog Group SAP_DSIMBW4H_BCG_CONFIGURATION_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Prerequisites for Configuration
Before implementing the app, you must ensure the following:
TO BE ADDED
More Information
For general information about how to implement SAP Fiori apps, see the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
14.2.2 App Extensibility: Manage Contexts
Manage Contexts is not suitable for extension.
14.3 Manage Data Origin
Use
With the master data app Manage Data Origin you can create and maintain origins and their corresponding harmonization parameters in SAP Demand Signal Management, version for SAP BW/4HANA.
Key Features
● Create and maintain data origins along with harmonization parameters
● View a list of existing data origins
● Filter the list of origins using a filter bar with value helps
● Case-insensitive live search for all value helps
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: DSIM4H 100
● UI add-on (front-end components)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: UIDSIM4H 100
Related Apps
Manage Agreements [page 537]
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.3.1 App Implementation: Manage Data Origin
Technical Data
The following tables list technical objects specific to the Manage Data Origin app:
Back-End Components
OData Service Authorization Role (PFCG Role)
/DDF/DATA_ORIGIN SAP_DSIM_DATAORIGIN_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Backend Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: UI5 Application
Component Technical Name
UI5 application /DDF/DATAORIGIN
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
For more information, also see the chapter Activate SICF Services for SAP Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_CONF_EXPERT_T
Business Catalog SAP_DSIMBW4H_BC_CONFIGURATION
Business Catalog Group SAP_DSIMBW4H_BCG_CONFIGURATION_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Prerequisites for Configuration
Before implementing the app, you must ensure the following:
TO BE ADDED
More Information
For general information about how to implement SAP Fiori apps, see the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
14.3.2 App Extensibility: Manage Data Origin
Manage Data Origin is not suitable for extension.
14.4 Manage Data Provider
Use
With the master data app Manage Data Provider you can create and maintain data providers in SAP Demand Signal Management, version for SAP BW/4HANA.
Key Features
● Create new data providers.
● View a list of existing data providers.
● Maintain (edit/delete) existing data providers.
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: DSIM4H 100
● UI add-on (front-end components)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: UIDSIM4H 100
Related Apps
Manage Agreements [page 537]
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.4.1 App Implementation: Manage Data Provider
Technical Data
The following tables list technical objects specific to the Manage Data Provider app:
Back-End Components
OData Service Authorization Role (PFCG Role)
/DDF/DDAGR SAP_DSIM_DDAGR_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Backend Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: UI5 Application
Component Technical Name
UI5 application /DDF/DATAPROVIDERS
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
For more information, also see the chapter Activate SICF Services for SAP Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_CONF_EXPERT_T
Business Catalog SAP_DSIMBW4H_BC_CONFIGURATION
Business Catalog Group SAP_DSIMBW4H_BCG_CONFIGURATION_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Prerequisites for Configuration
Before implementing the app, you must ensure the following:
TO BE ADDED
More Information
For general information about how to implement SAP Fiori apps, see the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
14.4.2 App Extensibility: Manage Data Provider
Manage Data Provider is not suitable for extension.
14.5 Manage Derivation Instruction Sets (Products/Locations)
Use
With the transactional app Manage Derivation Instruction Sets (Products/Locations) you can create and maintain instruction sets and headers in SAP Demand Signal Management, version for SAP BW/4HANA.
Key Features
● Create and maintain instruction set headers
● View a list of existing instruction set headers
● Filter the list of headers using a filter bar with value helps
● Case-insensitive live search for all value helps
● Maintain the instruction sets in each header in a hierarchical manner
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: DSIM4H 100
● UI add-on (front-end components)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: UIDSIM4H 100
Related Apps
TO BE ADDED
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.5.1 App Implementation: Manage Derivation Instruction Sets
Technical Data
The following tables list technical objects specific to the Manage Derivation Instruction Sets app:
Back-End Components
OData Service Authorization Role (PFCG Role)
/DDF/AVD SAP_DSIM_AVD_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Backend Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: UI5 Application
Component Technical Name
UI5 application /DDF/AVD
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
For more information, also see the chapter Activate SICF Services for SAP Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_CONF_EXPERT_T
Business Catalog SAP_DSIMBW4H_BC_CONFIGURATION
Business Catalog Group SAP_DSIMBW4H_BCG_CONFIGURATION_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Prerequisites for Configuration
Before implementing the app, you must ensure the following:
TO BE ADDED
More Information
For general information about how to implement SAP Fiori apps, see the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
14.5.2 App Extensibility: Manage Derivation Instruction Sets
Manage Derivation Instruction Sets is not suitable for extension.
14.6 Manage Objects (Products/Locations)
Use
With the transactional app Manage Harmonized Objects (Products and Locations) you can maintain source and harmonized objects in SAP Demand Signal Management, version for SAP BW/4HANA.
Key Features
● Maintain source and harmonized objects (both products and locations)
● View a list of objects
● Filter the list of objects using a filter bar with value helps
● Case-insensitive live search for all value helps
● Maintain mappings of source objects to harmonized objects
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: DSIM4H 100
● UI add-on (front-end components)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: UIDSIM4H 100
Related Apps
TO BE ADDED
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.6.1 App Implementation: Manage Objects
Technical Data
The following tables list technical objects specific to the Manage Objects app:
Back-End Components
OData Service Authorization Role (PFCG Role)
/DDF/OBJECT SAP_DSIM_OBJECTS_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Backend Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: UI5 Application
Component Technical Name
UI5 application /DDF/OBJECTS
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
For more information, also see the chapter Activate SICF Services for SAP Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_CONF_EXPERT_T
Business Catalog SAP_DSIMBW4H_BC_CONFIGURATION
Business Catalog Group SAP_DSIMBW4H_BCG_CONFIGURATION_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Prerequisites for Configuration
Before implementing the app, you must ensure the following:
TO BE ADDED
More Information
For general information about how to implement SAP Fiori apps, see the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
14.6.2 App Extensibility: Manage Objects
Manage Objects is not suitable for extension.
14.7 Manage Region
Use
With the master data app Manage Region you can create and maintain regions in SAP Demand Signal Management, version for SAP BW/4HANA.
Key Features
● Create new regions
● View a list of existing regions
● Edit and delete existing regions
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: DSIM4H 100
● UI add-on (front-end components)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: UIDSIM4H 100
Related Apps
Manage Agreements [page 537]
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.7.1 App Implementation: Manage Region
Technical Data
The following tables list technical objects specific to the Manage Region app:
Back-End Components
OData Service Authorization Role (PFCG Role)
/DDF/DDAGR SAP_DSIM_DDAGR_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Backend Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: UI5 Application
Component Technical Name
UI5 application /DDF/REGIONS
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
For more information, also see the chapter Activate SICF Services for SAP Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_CONF_EXPERT_T
Business Catalog SAP_DSIMBW4H_BC_CONFIGURATION
Business Catalog Group SAP_DSIMBW4H_BCG_CONFIGURATION_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Prerequisites for Configuration
Before implementing the app, you must ensure the following:
TO BE ADDED
More Information
For general information about how to implement SAP Fiori apps, see the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
14.7.2 App Extensibility: Manage Region
Manage Region is not suitable for extension.
14.8 Manage Time Derivation
Use
With the master data app Manage Time Derivation you can create and maintain time derivations in SAP Demand Signal Management, version for SAP BW/4HANA.
Key Features
● Create new time derivation rules
● View a list of existing time derivations
● Edit and delete existing time derivations
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: DSIM4H 100
● UI add-on (front-end components)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: UIDSIM4H 100
Related Apps
Manage Agreements [page 537]
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.8.1 App Implementation: Manage Time Derivation
Technical Data
The following tables list technical objects specific to the Manage Time Derivation app:
Back-End Components
OData Service Authorization Role (PFCG Role)
/DDF/DDAGR SAP_DSIM_DDAGR_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Backend Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: UI5 Application
Component Technical Name
UI5 application /DDF/TD
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
For more information, also see the chapter Activate SICF Services for SAP Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_CONF_EXPERT_T
Business Catalog SAP_DSIMBW4H_BC_CONFIGURATION
Business Catalog Group SAP_DSIMBW4H_BCG_CONFIGURATION_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Prerequisites for Configuration
Before implementing the app, you must ensure the following:
TO BE ADDED
More Information
For general information about how to implement SAP Fiori apps, see the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
14.8.2 App Extensibility: Manage Time Derivation
Manage Time Derivation is not suitable for extension.
14.9 Match External Hierarchy
Use
With the transactional app Match External Hierarchy, you can classify records by matching them to suggested record values in SAP Demand Signal Management, version for SAP BW/4HANA.
Key Features
● Classify attribute values based on the suggestions provided by the machine learning engine
● View all suggestions together with their confidence values
● Enter values other than the suggested ones
● Unsend records that were sent to machine learning
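The review workflow described above — accepting high-confidence machine learning suggestions and routing the rest to manual input — can be sketched as follows. The record fields and the 0.8 threshold are illustrative assumptions, not the app's actual data model.

```python
# Sketch of the suggestion-review logic. The field names ("attribute",
# "suggestion", "confidence") and the 0.8 threshold are hypothetical;
# the app's real data model is exposed by the /DDF/EXT_HIER service.

def triage(suggestions, threshold=0.8):
    """Split ML suggestions into auto-accepted and manual-review lists."""
    accepted, review = [], []
    for s in suggestions:
        (accepted if s["confidence"] >= threshold else review).append(s)
    return accepted, review

records = [
    {"attribute": "Category", "suggestion": "Beverages", "confidence": 0.93},
    {"attribute": "Brand", "suggestion": "ACME", "confidence": 0.41},
]
accepted, review = triage(records)
print(len(accepted), len(review))  # → 1 1
```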
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: DSIM4H 100
● UI add-on (front-end components)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: UIDSIM4H 100
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.9.1 App Implementation: Match External Hierarchy
Technical Data
The following tables list technical objects specific to the Match External Hierarchy app:
Back-End Components
OData Service Authorization Role (PFCG Role)
/DDF/EXT_HIER SAP_DSIM_EXT_HIER_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Backend Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: UI5 Application
Component Technical Name
UI5 application /DDF/MATCH_EXTHIER
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
For more information, also see the chapter Activate SICF Services for SAP Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_CONF_EXPERT_T
Business Catalog SAP_DSIMBW4H_BC_CONFIGURATION
Business Catalog Group SAP_DSIMBW4H_BCG_CONFIGURATION_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Prerequisites for Configuration
Before implementing the app, you must ensure the following:
TO BE ADDED
More Information
For general information about how to implement SAP Fiori apps, see the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
14.9.2 App Extensibility: Match External Hierarchy
Match External Hierarchy is not suitable for extension.
14.10 Monitor Deliveries
Use
With the master data app Monitor Deliveries, you can monitor data deliveries and their processes in SAP Demand Signal Management, version for SAP BW/4HANA.
Key Features
● Get an overview of the statuses of received and planned data deliveries
● Assign received data deliveries that were not assigned automatically to planned data deliveries
● View all processes corresponding to a delivery
● View the quality validation report for a process
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA○ Required software components: DSIM4H 100
● UI add-on (front-end components)○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA○ Required software components: UIDSIM4H 100
Related Apps
TO BE ADDED
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.10.1 App Implementation: Monitor Deliveries
Technical Data
The following tables list technical objects specific to the Monitor Deliveries app:
Back-End Components
OData Service Authorization Role (PFCG Role)
/DDF/DDEL_MONITOR SAP_DSIMDDF_DELVMONITOR_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Backend Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
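Delivery statuses exposed by the /DDF/DDEL_MONITOR service can be read with standard OData query options. In the sketch below, the entity set name DeliverySet and the Status property are hypothetical placeholders — take the real names from the service's $metadata document.

```python
from urllib.parse import quote

# Sketch: build an OData query URL against the delivery-monitor service.
# "DeliverySet" and "Status" are hypothetical placeholders; read the
# actual entity sets and properties from the service's $metadata.

def delivery_query(host: str, status: str) -> str:
    """Return a URL selecting deliveries with the given status."""
    base = f"https://{host}/sap/opu/odata/DDF/DDEL_MONITOR/DeliverySet"
    filter_expr = quote(f"Status eq '{status}'")
    return f"{base}?$filter={filter_expr}"

print(delivery_query("backend.example.com", "ERROR"))
# → https://backend.example.com/sap/opu/odata/DDF/DDEL_MONITOR/DeliverySet?$filter=Status%20eq%20%27ERROR%27
```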
Frontend Server: UI5 Application
Component Technical Name
UI5 application /DDF/DEL_MON
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
For more information, also see the chapter Activate SICF Services for SAP Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_SYS_ADMIN_T
Business Catalog SAP_DSIMBW4H_BC_SYSTEMMONITORING
Business Catalog Group SAP_DSIMBW4H_BCG_SYSADMIN_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Prerequisites for Configuration
Before implementing the app, you must ensure the following:
TO BE ADDED
More Information
For general information about how to implement SAP Fiori apps, see the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
14.10.2 App Extensibility: Monitor Deliveries
Monitor Deliveries is not suitable for extension.
14.11 Plan Data Deliveries
Use
With the transactional app Plan Data Deliveries you can plan on which dates data deliveries are expected in SAP Demand Signal Management, version for SAP BW/4HANA.
Key Features
● Get an overview of the planned deliveries
● Plan new data deliveries
● Drop or delete planned data deliveries that are no longer valid
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA○ Required software components: DSIM4H 100
● UI add-on (front-end components)○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA○ Required software components: UIDSIM4H 100
Related Apps
Monitor Deliveries [page 558]
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.11.1 App Implementation: Plan Data Deliveries
Technical Data
The following tables list technical objects specific to the Plan Data Deliveries app:
Back-End Components
OData Service Authorization Role (PFCG Role)
/DDF/DDEL_PLAN SAP_DSIMDDF_PLANDELIVERIES_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Backend Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: UI5 Application
Component Technical Name
UI5 application /DDF/DEL_PLAN
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
For more information, also see the chapter Activate SICF Services for SAP Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_SYS_ADMIN_T
Business Catalog SAP_DSIMBW4H_BC_SYSTEMMONITORING
Business Catalog Group SAP_DSIMBW4H_BCG_SYSADMIN_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Prerequisites for Configuration
Before implementing the app, you must ensure the following:
TO BE ADDED
More Information
For general information about how to implement SAP Fiori apps, see the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
14.11.2 App Extensibility: Plan Data Deliveries
Plan Data Deliveries is not suitable for extension.
14.12 Restrict Attribute Value
Use
With the master data app Restrict Attribute Value you can create and maintain attribute value restrictions in SAP Demand Signal Management, version for SAP BW/4HANA.
Key Features
● Create attribute value restrictions in a hierarchical manner
● Maintain existing restrictions
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: DSIM4H 100
● UI add-on (front-end components)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: UIDSIM4H 100
Related Apps
Manage Derivation Instruction Sets (Products/Locations) [page 546]
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.12.1 App Implementation: Restrict Attribute Value
Technical Data
The following tables list technical objects specific to the Restrict Attribute Value app:
Back-End Components

OData Service | Authorization Role (PFCG Role)
/DDF/AVM | SAP_DSIM_AVM_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Backend Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: UI5 Application
Component Technical Name
UI5 application /DDF/AVM
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
For more information, also see the chapter Activate SICF Services for SAP Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_CONF_EXPERT_T
Business Catalog SAP_DSIMBW4H_BC_CONFIGURATION
Business Catalog Group SAP_DSIMBW4H_BCG_CONFIGURATION_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Prerequisites for Configuration
Before implementing the app, you must ensure the following:
TO BE ADDED
More Information
For general information about how to implement SAP Fiori apps, see the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
14.12.2 App Extensibility: Restrict Attribute Value
Restrict Attribute Value is not suitable for extension.
14.13 Harmonization Status
Use
With the transactional app Harmonization Status, you, in your role as business analyst or brand manager, can view the status of the work items for the various object types that are harmonized.
Harmonization is a complex, multi-step process, and the individual steps are performance-intensive. It is therefore useful to know what the system is currently processing and when the changes will be available for generating the required reports.
Key Features
● View pending work items for different object types
● View, within each object type, the number of relevant process types
● View, for each process, a further break-up of the pending work items grouped by data origin
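The break-up that the app displays — pending work items per object type, per process type, and per data origin — amounts to a two-level grouping. A minimal sketch of that grouping (the sample work items and field layout are invented for illustration and do not reflect the app's internal data model):

```python
from collections import Counter

# Hypothetical pending work items: (object type, process type, data origin)
work_items = [
    ("Product", "Mapping", "RETAILER_A"),
    ("Product", "Mapping", "RETAILER_B"),
    ("Product", "Cleansing", "RETAILER_A"),
    ("Location", "Mapping", "RETAILER_A"),
]

# Level 1: pending items per object type
per_object_type = Counter(obj for obj, _, _ in work_items)
# Level 2: per object type, break-up by process type
per_process = Counter((obj, proc) for obj, proc, _ in work_items)
# Level 3: per process, break-up by data origin
per_origin = Counter(work_items)

print(per_object_type)  # e.g. Counter({'Product': 3, 'Location': 1})
```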
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: DSIM4H 100
● UI add-on (front-end components)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: UIDSIM4H 100
Related Apps
Monitor Jobs [page 570]
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.13.1 App Implementation: Harmonization Status
Technical Data
The following tables list technical objects specific to the Harmonization Status app:
Back-End Components

OData Service | Authorization Role (PFCG Role)
/DDF/HARM_STATUS | SAP_DSIM_HARM_STATUS_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Backend Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: UI5 Application
Component Technical Name
UI5 application /DDF/HARM_STATUS
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
For more information, also see the chapter Activate SICF Services for SAP Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_SYS_ADMIN_T
Business Catalog SAP_DSIMBW4H_BC_SYSTEMMONITORING
Business Catalog Group SAP_DSIMBW4H_BCG_SYSADMIN_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Prerequisites for Configuration
Before implementing the app, you must ensure the following:
TO BE ADDED
More Information
For general information about how to implement SAP Fiori apps, see the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
14.13.2 App Extensibility: Harmonization Status
You can extend the Harmonization Status application according to your business needs for different aspects. For this purpose, the following extensibility options are available:
Back End/ABAP

Design Time: Gateway Entity | Design Time: Extension Include | Run Time: Superclass and Method to Be Redefined
Process | /DDF/INCL_PROCESS | Not applicable
HarmObjectType | /DDF/INCL_OBJECT_TYPE | Not applicable
QueuedItem | /DDF/INCL_QUEUED_ITEM | Not applicable
For more information about Fiori apps, see the Fiori documentation on the SAP Help Portal at https://help.sap.com/dsimbw4h.
SAP Fiori for Business Suite > More SAP Fiori Products for SAP Business Suite > SAP Fiori for SAP Demand Signal Management, version for SAP BW/4HANA > Harmonization Status
14.14 Manage Bill of Materials
Use
With the transactional app Manage Bill of Materials, you as a business analyst can create or change bills of materials (BOMs). Using this application, you can analyze the effectiveness of retail promotions on the sales of products sold as part of a BOM.
Key Features
● Create a new BOM
  This application allows you to create new bills of materials. You can upload data into BW InfoProviders from SAP ECC or as flat files. The three important parameters when uploading a BOM are the Product Reference ID, the Context, and the Transaction Unit of Measure. You can select an item and choose either to copy or to delete the BOM.
● Add, delete, or edit the validities in a BOM
  You can set validities for each BOM and set specific configurations for each of them using the Split Validity feature. These configurations are then taken into consideration during the specified validity period.
● Add, delete, or edit the components within the validities of a BOM
● Create a new BOM by copying information from an existing BOM
● View BOM explosion details
  You can explode the BOM to split the sales and stock data down to a more detailed level.
● View the where-used list of a BOM
  This helps you determine the dependencies of the selected BOM before you decide to edit its validities or delete it.
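Conceptually, a BOM explosion distributes a figure recorded for the bundle down to its components according to the component quantities. The sketch below is a simplified illustration under assumed data — the product names, quantities, and the straight multiplication are examples, not the app's actual algorithm:

```python
# Hypothetical BOM: the bundle "GIFT_SET" consists of these components.
bom = {"SHAMPOO": 2, "SOAP": 3}   # component -> quantity per bundle

def explode(bundle_sales_qty: float, bom: dict) -> dict:
    """Split a sales quantity recorded for the bundle into
    component-level quantities using the component quantities."""
    return {comp: bundle_sales_qty * qty for comp, qty in bom.items()}

# 100 bundles sold -> 200 units of shampoo, 300 bars of soap
print(explode(100, bom))
```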
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: DSIM4H 100
● UI add-on (front-end components)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: UIDSIM4H 100
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.14.1 App Implementation: Manage Bill of Materials
Technical Data
The following tables list technical objects specific to the Manage Bill of Materials app:
Back-End Components

OData Service | Authorization Role (PFCG Role)
/DDF/BOM | SAP_DSIM_MANAGE_BOM_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Backend Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: UI5 Application
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
For more information, also see the chapter Activate SICF Services for SAP Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_BUS_ANALYST_T
Business Catalog SAP_DSIMBW4H_BC_HARMONIZATION
Business Catalog Group SAP_DSIMBW4H_BCG_HARMONIZATION_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
For more information about Fiori apps, see the Fiori documentation on the SAP Help Portal at https://help.sap.com/dsimbw4h.
SAP Fiori for Business Suite > More SAP Fiori Products for SAP Business Suite > SAP Fiori for SAP Demand Signal Management, version for SAP BW/4HANA > Manage Bill of Materials
14.14.2 App Extensibility: Manage Bill of Materials
Manage Bill of Materials is not suitable for extension.
14.15 Monitor Jobs
Use
With the transactional app Monitor Jobs, you as a system administrator can start or stop a job that runs in the back-end system for SAP Demand Signal Management, version for SAP BW/4HANA.
Many processes occur in the backend of the system. You can use this application to track the progress of these processes. It is important to ensure that the Control Job is in Start status before starting any other job.
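The rule that the Control Job must be in Start status before any other job is started can be sketched as a simple guard. The job names, statuses, and the `start_job` helper below are illustrative assumptions, not the app's actual API:

```python
# Hypothetical job registry: job name -> current status.
jobs = {"CONTROL_JOB": "Stopped", "HARMONIZATION": "Stopped"}

def start_job(name: str) -> str:
    """Start a job. Any job other than the Control Job requires the
    Control Job to already be in Start status."""
    if name != "CONTROL_JOB" and jobs["CONTROL_JOB"] != "Start":
        raise RuntimeError("Start the Control Job first")
    jobs[name] = "Start"
    return jobs[name]

start_job("CONTROL_JOB")           # must happen first
print(start_job("HARMONIZATION"))  # now allowed
```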
Key Features
● Start or stop a back-end job
● View details of back-end jobs, including the last five application log entries
Technical Requirements
The following software products must be available in your system landscape:
● Back-end system (business data)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: DSIM4H 100
● UI add-on (front-end components)
  ○ Required product release: SAP Demand Signal Management, version for SAP BW/4HANA
  ○ Required software components: UIDSIM4H 100
Related Apps
Harmonization Status [page 565]
Component for Customer Messages
CA-DS4-UI Demand Signal Management on BW4HANA - Frontend UI
14.15.1 App Implementation: Monitor Jobs
Technical Data
The following tables list technical objects specific to the Monitor Jobs app:
Back-End Components

OData Service | Authorization Role (PFCG Role)
/DDF/JOB_MONITOR | SAP_DSIM_JOB_MONITOR_APP
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Backend Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
For more information about the activation of the UI5 application (ICF service), see the chapter Configuring the OData Services in the Installation Guide for SAP Demand Signal Management, version for SAP BW/4HANA on the SAP Help Portal at https://help.sap.com/dsimbw4h.
For more information, also see the chapter Activate SICF Services for SAP Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
Frontend Server: SAP Fiori Launchpad Components
Component Technical Name
Business Role SAP_DSIMBW4H_BCR_SYS_ADMIN_T
Business Catalog SAP_DSIMBW4H_BC_SYSTEMMONITORING
Business Catalog Group SAP_DSIMBW4H_BCG_SYSADMIN_T
Technical Role SAP_DSIMBW4H_TCR_T
Technical Catalog SAP_DSIMBW4H_TC_T
LPD_CUST Role DSIM
LPD_CUST Instance TRANSACTIONAL
For more information about the steps to be performed, see Setup of Catalogs, Groups, and Roles on the Fiori Launchpad in the configuration information for SAP Fiori for SAP Business Suite on the SAP Help Portal at http://help.sap.com/fiori.
More Information
For more information about Fiori apps, see the Fiori documentation on the SAP Help Portal at https://help.sap.com/dsimbw4h.
SAP Fiori for Business Suite > More SAP Fiori Products for SAP Business Suite > SAP Fiori for SAP Demand Signal Management, version for SAP BW/4HANA > Monitor Jobs
14.15.2 App Extensibility: Monitor Jobs
You can extend the app according to your business needs for different aspects. For this purpose, the following extensibility options are available:
Back End/ABAP
Design Time: Gateway Entity | Design Time: Extension Include | Run Time: Superclass and Method to Be Redefined
Job | /DDF/INCL_JOB_MONITOR | Not applicable
ApplicationLog | /DDF/INCL_APPL_LOG | Not applicable
ApplicationLogDetail | /DDF/INCL_APPL_LOG_DETAIL | Not applicable
Extension Points
The following extension points are available, for example for adding additional UI elements:
View | Extension Point | Use
JobDetails.view.xml | extJobDetailsFooter | Allows you to add a button to the Job Details footer bar
JobList.view.xml | extJobListToolbar | Allows you to add buttons to the Job List toolbar
JobList.view.xml | extJobListTableColumn | Allows you to add columns to the Job List table
JobList.view.xml | extJobListTableItem | Allows you to add fields to the Job List table
For more information about Fiori apps, see the Fiori documentation on the SAP Help Portal at https://help.sap.com/dsimbw4h.
SAP Fiori for Business Suite > More SAP Fiori Products for SAP Business Suite > SAP Fiori for SAP Demand Signal Management, version for SAP BW/4HANA > Monitor Jobs
15 DSIM Fiori Authorization Roles OData
For more information about authorization roles and assigned OData services, see Roles, Users and Authorizations on Back-End Server in the SAP Fiori Overview on the SAP Help Portal at http://help.sap.com/fiori.
Important Disclaimers and Legal Information
Hyperlinks

Some links are classified by an icon and/or a mouseover text. These links provide additional information.
About the icons:
● Links with the icon : You are entering a Web site that is not hosted by SAP. By using such links, you agree (unless expressly stated otherwise in your agreements with SAP) to this:
  ○ The content of the linked-to site is not SAP documentation. You may not infer any product claims against SAP based on this information.
  ○ SAP does not agree or disagree with the content on the linked-to site, nor does SAP warrant its availability and correctness. SAP shall not be liable for any damages caused by the use of such content unless damages have been caused by SAP's gross negligence or willful misconduct.
● Links with the icon : You are leaving the documentation for that particular SAP product or service and are entering an SAP-hosted Web site. By using such links, you agree that (unless expressly stated otherwise in your agreements with SAP) you may not infer any product claims against SAP based on this information.
Beta and Other Experimental Features

Experimental features are not part of the officially delivered scope that SAP guarantees for future releases. This means that experimental features may be changed by SAP at any time for any reason without notice. Experimental features are not for productive use. You may not demonstrate, test, examine, evaluate or otherwise use the experimental features in a live operating environment or with data that has not been sufficiently backed up.
The purpose of experimental features is to get feedback early on, allowing customers and partners to influence the future product accordingly. By providing your feedback (e.g. in the SAP Community), you accept that intellectual property rights of the contributions or derivative works shall remain the exclusive property of SAP.
Example Code

Any software coding and/or code snippets are examples. They are not for productive use. The example code is only intended to better explain and visualize the syntax and phrasing rules. SAP does not warrant the correctness and completeness of the example code. SAP shall not be liable for errors or damages caused by the use of example code unless damages have been caused by SAP's gross negligence or willful misconduct.
Gender-Related Language

We try not to use gender-specific word forms and formulations. As appropriate for context and readability, SAP may use masculine word forms to refer to all genders.
www.sap.com/contactsap
© 2018 SAP SE or an SAP affiliate company. All rights reserved.
No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP SE or an SAP affiliate company. The information contained herein may be changed without prior notice.
Some software products marketed by SAP SE and its distributors contain proprietary software components of other software vendors. National product specifications may vary.
These materials are provided by SAP SE or an SAP affiliate company for informational purposes only, without representation or warranty of any kind, and SAP or its affiliated companies shall not be liable for errors or omissions with respect to the materials. The only warranties for SAP or SAP affiliate company products and services are those that are set forth in the express warranty statements accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty.
SAP and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP SE (or an SAP affiliate company) in Germany and other countries. All other product and service names mentioned are the trademarks of their respective companies.
Please see https://www.sap.com/about/legal/trademark.html for additional trademark information and notices.
THE BEST RUN