
SAP Note 141242 - 4.0B: Article master runtime measurement

Header Data

Version: 23
Validity: 05.05.2006 - active
Language: English
Released On: 05.05.2006 09:59:11
Release Status: Released for Customer
Component: IS-R-BD-ART Articles
Other Components: BC-MID-ALE Integration Technology ALE
Priority: Recommendations / Additional Info
Category: Performance

Symptom

We have measured the runtime for Release 4.0B in an in-house system and the results are described below. If you have a similar data constellation in your system but your figures are much worse and the poor runtime cannot be traced back to database problems (the database statistics of the relevant tables are current, there are no unnecessary full-table scans, there are no irregularities in the data exchange between the database server and the application server and so on), you should perform a closer performance analysis in your system. Create a customer message under the component IS-R-BD-ART.

Other Terms

MM41, MM42, ARTMAS, ALE, data transfer, distribution

Reason and Prerequisites

The runtime measurements are based on Release 4.0B with Support Package 11 and were carried out for dialog maintenance and data transfer using IDoc processing. The values of the runtime measurements apply only to Release 4.0B. However, the recommendations for improving the performance can also be used for Releases 4.5A, 4.5B and subsequent releases.

1. Test data structure

              The measurements were made on the basis of the following test data:

- Six distribution centers (one distribution center is a global reference distribution center)

- 96 stores (one store is a global reference store)

- One storage location for each site

- One warehouse number with a storage type

- Eight distribution chains (one reference store is specific to the distribution chain)

- There is a three-level material group hierarchy above the basic material group

- Two variant-creating characteristics and one information characteristic are assigned to the basic material group

- The characteristics have ten characteristic values each

2. Test scenarios

              The processing times for certain function modules are determined by runtime measurements. The processing times were investigated for a single article, a generic article with one variant, a generic article with 10 variants and a generic article with 20 variants. Each time, the scenario of a new creation (transaction MM41) was tested. Nevertheless, the times of the maintenance scenario (transaction MM42) should be on a similar scale.

              The term 'operation' is used frequently in the following sections. The number of data segments to be created is not a suitable measure for comparing processing times; the processing times are therefore set in relation to the number of operations. What is an operation in the article master? You can maintain several keys (areas of validity, organization levels) for each database table of an article, either in dialog mode or using IDoc processing. The database tables are divided into repeating tables (in dialog mode, using table control, you can maintain several data records for each article; for example, MAKT, MARM, MEAN and so on) and non-repeating tables (every key represents a separate area of validity; for example, MARA, MAW1, MARC, MARD and so on). To avoid impairing performance, the system does not process the table keys one after the other and for each table separately (field selection, reference handling, required field checks, consistency checks). Instead, the data is bundled into 'operations' that process one key for each non-repeating table. If there are, for example, several MARC, MBEW and MVKE keys, the system groups together one MARC key, one MBEW key and one MVKE key into each operation and processes them as a single operation.

              Example: The IDoc contains data for one MARA, three MARC, two MARD, three MBEW and two MVKE keys. The following operations are the result:

a) The MARA key is combined with the first MARC key and the first matching MARD, MBEW and MVKE keys.

b) The MARA key is combined with the second MARC key and the next matching MARD, MBEW and MVKE keys.

c) The MARA key is combined with the third MARC key and the last matching MBEW key.

              If there is a generic article with x variants, the system generates one operation for each article (so a total of x+1 operations).
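For illustration, the bundling logic above can be sketched in plain Python (this is not SAP code; it only models the counting rule: the operation count is the maximum key count over the non-repeating tables):

```python
def count_operations(key_counts):
    """Number of operations = maximum key count over the
    non-repeating tables (MARC, MARD, MBEW, MVKE, ...)."""
    return max(key_counts.values(), default=0)

def total_operations_generic(num_variants):
    # A generic article with x variants is processed as x + 1 articles,
    # i.e. x + 1 operations when one operation is needed per article.
    return num_variants + 1

# The IDoc example above: 1 MARA, 3 MARC, 2 MARD, 3 MBEW, 2 MVKE keys
print(count_operations({"MARC": 3, "MARD": 2, "MBEW": 3, "MVKE": 2}))  # 3
```

The example IDoc therefore yields the three operations a), b) and c) listed above.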

3. Processing times during dialog maintenance

              In dialog maintenance, the function module MATERIAL_UPDATE_ALL_RETAIL is critical in determining performance. Function module MATERIAL_ARRAY_UPDATE_RETAIL_2 does not appear because the update runs in asynchronous mode. The measurements therefore apply only to the first function module. The measured values for the processing times show the waiting time between choosing 'Save' and the display of the message that tells you the article was created.

a) Article data structure

                       We investigated dialog processing with the following article data structure:

- Basic data

- Listing for 100 sites/eight distribution chains

- Logistics data for the listed sites and the two global reference sites

- Sales data/POS data for the listed distribution chains

                       The following data volumes were recorded for each article that was tested:

                  Data structure for single article (EA)

                    Basic data                     Logistics/Sales/POS data
                    Table    #Data records         Table    #Data records
                    ------------------------------------------------------
                    MARA           1               MARC          102
                    MAW1           1               MPOP          102
                    MAKT           2               MARD          102
                    MARM           3               MBEW          102
                    MEAN           3               MLGN            1
                    MLAN           1               MLGT            1
                                                   MVKE            8
                                                   WLK2            8

                  Data structure for generic article with one variant (SA_1VAR)

                    Basic data                     Logistics/Sales/POS data
                    Table    #Data records         Table    #Data records
                    ------------------------------------------------------
                    MARA           2               MARC          105
                    MAW1           2               MPOP          105
                    MAKT           4               MARD          105
                    MARM           6               MBEW          105
                    MEAN           3               MLGN            2
                    MLAN           2               MLGT            2
                                                   MVKE           16
                                                   WLK2           16

                       Comments: MEAN data was created only for the variants. MARC, MPOP, MARD and MBEW data for generic articles can only be maintained for the reference sites.

                  Data structure for generic article with 10 variants (SA_10VAR)

                    Basic data                     Logistics/Sales/POS data
                    Table    #Data records         Table    #Data records
                    ------------------------------------------------------
                    MARA          11               MARC         1023
                    MAW1          11               MPOP         1023
                    MAKT          22               MARD         1023
                    MARM          33               MBEW         1023
                    MEAN          30               MLGN           11
                    MLAN          11               MLGT           11
                                                   MVKE           88
                                                   WLK2           88

                  Data structure for generic article with 20 variants (SA_20VAR)

                    Basic data                     Logistics/Sales/POS data
                    Table    #Data records         Table    #Data records
                    ------------------------------------------------------
                    MARA          21               MARC         2043
                    MAW1          21               MPOP         2043
                    MAKT          42               MARD         2043
                    MARM          63               MBEW         2043
                    MEAN          60               MLGN           21
                    MLAN          21               MLGT           21
                                                   MVKE          168
                                                   WLK2          168

b) Measurement results

                  Processing times [sec] for MATERIAL_UPDATE_ALL_RETAIL

                    Article     Basic data only         Basic data and logistics data
                                Time   #Operations      Time   #Operations
                                       (= #Articles)           (= #MVKE)
                    -----------------------------------------------------------------
                    EA          0.25        1            2.0      8 (1*8)
                    SA_1VAR     0.9         2            6.0     16 (2*8)
                    SA_10VAR    4.0        11           21.0     88 (11*8)
                    SA_20VAR    7.0        21           40.0    168 (21*8)

c) Evaluation

                       From the measurements, we can derive the following rules of thumb:

- If only basic data is created, you can expect a processing time of 0.33 seconds for each article in function module MATERIAL_UPDATE_ALL_RETAIL (this means that a generic article with 20 variants requires about seven seconds).

- If basic data and logistics data is created and the reference sites are only processed in the logistics layers (that is, the validity areas of the non-reference sites do not change), you can expect a processing time of 0.25 seconds for each operation in function module MATERIAL_UPDATE_ALL_RETAIL (this means that about 22 seconds are required to process a generic article with 10 variants and the 88 operations that result from this).
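As a sanity check, the two rules of thumb can be written as a small estimator (Python, illustrative only; the constants are the measured 4.0B values quoted above):

```python
SEC_PER_ARTICLE_BASIC = 0.33   # dialog mode, basic data only
SEC_PER_OPERATION = 0.25       # dialog mode, basic + logistics data

def dialog_time_basic_only(num_articles):
    return num_articles * SEC_PER_ARTICLE_BASIC

def dialog_time_with_logistics(num_operations):
    return num_operations * SEC_PER_OPERATION

# Generic article with 20 variants = 21 articles (basic data only)
print(round(dialog_time_basic_only(21), 1))   # about seven seconds
# Generic article with 10 variants, 8 distribution chains = 88 operations
print(dialog_time_with_logistics(88))         # about 22 seconds
```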

4. Processing times during the distribution of data

              During distribution, the function module MATERIAL_UPDATE_ALL_RETAIL is also critical in determining performance. However, since the data is updated synchronously, the update modules also contribute towards the total runtime. Data distribution is an extreme case because all article data exists 1:1 in the IDoc segments. The number of operations that are processed in MATERIAL_UPDATE_ALL_RETAIL therefore increases in direct proportion to the number of MARC/MVKE segments. In our example, the number of operations is formed from the maximum number of existing MARC or MVKE segments. If, for example, a large number of MLGN or MLGT segments is processed and this number exceeds the number of MARC or MVKE segments, the number of MLGN/MLGT segments determines the maximum number of operations. The processing time for the IDoc is determined by the function module IDOC_INPUT_ARTMAS. The runtime measurement for IDOC_INPUT_ARTMAS includes the COMMIT WORK command. This means that the processing time includes the update modules that do not run until the COMMIT WORK command is executed. The times that are required in the ALE layer for IDoc processing are not included.

a) Article data structure

                       We investigated distribution with the following article data structure:

- Basic data

- Logistics data for 102 sites

- Sales data/POS data for eight distribution chains

                       The same data volume is used as for the determination of processing times in dialog maintenance.

b) Measurement results

                  Processing times [sec] for IDOC_INPUT_ARTMAS

                    Article     Basic data only         Basic data and logistics data
                                Time   #Operations      Time    #Operations
                                       (= #Articles)            (= #MARC)
                    -----------------------------------------------------------------
                    EA           2.0        1            54.0     102 (102)
                    SA_1VAR      4.0        2            55.0     105 (102+3)
                    SA_10VAR    12.0       11           500.0    1023 (10*102+3)
                    SA_20VAR    24.0       21           970.0    2043 (20*102+3)

                  Processing times [sec] for MATERIAL_UPDATE_ALL_RETAIL

                    Article     Basic data and logistics data
                                Time    #Operations (= #MARC)
                    -----------------------------------------
                    EA           42.0      102
                    SA_1VAR      42.0      105
                    SA_10VAR    360.0     1023
                    SA_20VAR    690.0     2043


c) Evaluation

                       From the measurements, we can derive the following rules of thumb:

- If you import only basic data with IDoc processing, you can expect a processing time of one second for each article in function module IDOC_INPUT_ARTMAS (this means that a generic article with 20 variants requires about 21 seconds).

- If you import basic data and logistics data using IDoc processing and the site key list is not used, you can expect a processing time of 0.5 seconds for each operation in function module IDOC_INPUT_ARTMAS (this means that about 500 seconds are required to process an IDoc with 1000 MARC segments during IDoc processing).

- About 70% of the total runtime (IDOC_INPUT_ARTMAS) is used by the function module MATERIAL_UPDATE_ALL_RETAIL.
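The distribution rules of thumb can likewise be checked with a few lines of arithmetic (illustrative Python; the constants and measured values are the 4.0B figures above):

```python
SEC_PER_ARTICLE_BASIC = 1.0    # IDoc processing, basic data only
SEC_PER_OPERATION = 0.5        # IDoc processing, basic + logistics, no key list

def idoc_time_basic_only(num_articles):
    return num_articles * SEC_PER_ARTICLE_BASIC

def idoc_time_with_logistics(num_operations):
    return num_operations * SEC_PER_OPERATION

print(idoc_time_basic_only(21))        # generic article with 20 variants
print(idoc_time_with_logistics(1000))  # IDoc with 1000 MARC segments

# Share of the total runtime spent in MATERIAL_UPDATE_ALL_RETAIL
# (SA_10VAR measured values: 360 s of 500 s)
print(round(360.0 / 500.0, 2))         # about 70%
```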

5. Processing times during data transfer

              Unlike during the distribution of data, you can specify site-specific data for the reference sites in the IDoc during the initial data transfer. To create site-specific data for all other sites, you can use the site key list. The system does not generate an operation in MATERIAL_UPDATE_ALL_RETAIL for the sites specified in the key list. Instead, it creates the data by copying the data of the assigned reference site in function module MATERIAL_ARRAY_UPDATE_RETAIL_2. The processing time for the IDoc is determined in the same way as in the data distribution scenario.

a) Article data structure

                       We investigated data transfer with the following article data structure:

- Basic data

- Logistics data for 102 sites (logistics data is specified only for the three reference sites; the other sites were transferred using the site key list)

- Sales data/POS data for the eight distribution chains

                       The same data volume is used as for the determination of processing times in dialog maintenance.

b) Measurement results

                  Processing times [sec] for IDOC_INPUT_ARTMAS

                    Article     Basic data and logistics data
                                Time    #Operations (= #MVKE)
                    -----------------------------------------
                    EA           19.0      8 (1*8)
                    SA_1VAR      25.0     16 (2*8)
                    SA_10VAR    150.0     88 (11*8)
                    SA_20VAR    275.0    168 (21*8)

                  Processing times [sec] for MATERIAL_UPDATE_ALL_RETAIL

                    Article     Basic data and logistics data
                                Time    #Operations (= #MVKE)
                    -----------------------------------------
                    EA            5.0      8 (1*8)
                    SA_1VAR      11.0     16 (2*8)
                    SA_10VAR     35.0     88 (11*8)
                    SA_20VAR     55.0    168 (21*8)

c) Evaluation

                       From the measurements, we can derive the following rules of thumb:

- If you import basic data or logistics data using an IDoc and you use a site key list, you can achieve a processing time of 1.5 to 2 seconds for each operation (as in our example). Using the site key list reduces the number of operations, which also reduces the total runtime (this means that about 150 seconds are required for an IDoc that leads to 88 operations in which 1000 MARC segments are created).

- Only 25% of the total runtime (IDOC_INPUT_ARTMAS) is consumed by the function module MATERIAL_UPDATE_ALL_RETAIL, and the total runtime is also lower.
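Both figures follow directly from the SA_10VAR measurements above (illustrative Python arithmetic):

```python
# Measured values for SA_10VAR with the site key list
total_time = 150.0   # IDOC_INPUT_ARTMAS, seconds
operations = 88
update_time = 35.0   # MATERIAL_UPDATE_ALL_RETAIL, seconds

print(round(total_time / operations, 2))   # seconds per operation (1.5 - 2 range)
print(round(update_time / total_time, 2))  # share spent in MATERIAL_UPDATE_ALL_RETAIL
```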

6. Comparison of processing times for MATERIAL_UPDATE_ALL_RETAIL (dialog mode/data transfer)

                  Processing times [sec] for MATERIAL_UPDATE_ALL_RETAIL

                    Article     Dialog mode             Data transfer
                                Time   #Operations      Time   #Operations
                                       (= #MVKE)               (= #MVKE)
                    ------------------------------------------------------
                    EA           2.0      8              5.0      8
                    SA_1VAR      6.0     16             11.0     16
                    SA_10VAR    21.0     88             35.0     88
                    SA_20VAR    40.0    168             55.0    168

              In data transfer, the runtime of MATERIAL_UPDATE_ALL_RETAIL is considerably higher than in dialog mode (this is because additional source code is processed during data transfer).

7. Comparison of processing times for IDOC_INPUT_ARTMAS (without/with site key list)

                  Processing times [sec] for IDOC_INPUT_ARTMAS

                    Article     Without key list        With key list
                                Time    #Operations     Time   #Operations
                                        (= #MARC)              (= #MVKE)
                    ------------------------------------------------------
                    EA           54.0     102            19.0      8
                    SA_1VAR      55.0     105            25.0     16
                    SA_10VAR    500.0    1023           150.0     88
                    SA_20VAR    970.0    2043           275.0    168

              If you reduce the number of operations by a factor of 12 (see SA_10VAR or SA_20VAR), the processing time is reduced by a factor of 3.

8. Investigation of other optimization methods

              Certain functions are not required during the initial data transfer. We examined by how much you can reduce the runtime during the first data transfer when you deactivate certain functions. These functions are:

- Calling source determination to determine the default vendor data

- Writing the application log

- Writing the change documents/change pointers

a) The effect of source determination on the processing time

                  Processing times [sec] for IDOC_INPUT_ARTMAS

                    Article     With source determ.     Without source determ.
                                w/o        w.           w/o        w.
                                key list   key list     key list   key list
                    ----------------------------------------------------------
                    EA           54.0       19.0         47.0       17.0
                    SA_1VAR      55.0       25.0         51.0       22.0
                    SA_10VAR    500.0      150.0        455.0      130.0
                    SA_20VAR    970.0      275.0        873.0      240.0

                       Rule of thumb: By deactivating source determination, you can reduce the runtime by about 10% (regardless of whether you use the key list).

b) The effect of the application log on the processing time

                  Processing times [sec] for IDOC_INPUT_ARTMAS

                    Article     With application log    Without application log
                                w/o        w.           w/o        w.
                                key list   key list     key list   key list
                    ----------------------------------------------------------
                    EA           54.0       19.0         49.0       18.0
                    SA_1VAR      55.0       25.0         50.0       24.0
                    SA_10VAR    500.0      150.0        460.0      140.0
                    SA_20VAR    970.0      275.0         n/a        n/a

                       Using the key list reduces the number of messages that are written in the application log. This means that the performance improvement is not so great when the key list is used. Due to the long runtime that would result, we have not performed a separate measurement for SA_20VAR without the application log.

                       Rule of thumb: Removing the application log improves performance by about 10% if no key list is used and by about 5% if the key list is used.

c) The effect of change documents/change pointers on the processing time

                  Processing times [sec] for IDOC_INPUT_ARTMAS

                    Article     With change docs        Without change docs
                                and change pointers     and change pointers
                                w/o        w.           w/o        w.
                                key list   key list     key list   key list
                    ----------------------------------------------------------
                    EA           54.0       19.0         50.0       15.0
                    SA_1VAR      55.0       25.0         50.0       21.0
                    SA_10VAR    500.0      150.0        446.0      108.0
                    SA_20VAR    970.0      275.0         n/a        n/a

                       Due to the long runtime that would result, we have not performed a separate measurement for SA_20VAR without change documents/change pointers.

                       Rule of thumb: Removing the change documents/change pointers improves performance by about 10% if no key list is used and by about 20% if the key list is used.

d) The effect of all three optimization methods on the processing time

                  Processing times [sec] for IDOC_INPUT_ARTMAS

                    Article     W/o optimizations       With optimizations
                                w/o        w.           w/o        w.
                                key list   key list     key list   key list
                    ----------------------------------------------------------
                    EA           54.0       19.0         44.0       12.0
                    SA_1VAR      55.0       25.0         45.0       17.0
                    SA_10VAR    500.0      150.0        400.0       87.0
                    SA_20VAR    970.0      275.0         n/a        n/a

                       If no key list is used, the savings from the three individual measures are not cumulative; the combined saving is lower (about 60-70% of the amount obtained by adding the individual savings).

                       Rule of thumb: By deactivating source determination, the application log, and  change documents/change pointers, you can reduce the runtime by  about 20% (when no key list is used) or by about 35% (when the key list is used). However, the 35% reduction is lower in absolute figures because the total runtime is already lower when you use a key list.
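The "savings do not simply add up" observation can be quantified from the measured SA_10VAR values (illustrative Python):

```python
def combined_saving_ratio(baseline, individual_times, combined_time):
    """Ratio of the actual combined saving to the sum of the
    savings obtained by deactivating one function at a time."""
    sum_of_savings = sum(baseline - t for t in individual_times)
    actual_saving = baseline - combined_time
    return actual_saving / sum_of_savings

# SA_10VAR without key list: baseline 500 s; 455/460/446 s with one
# function deactivated at a time; 400 s with all three deactivated.
print(round(combined_saving_ratio(500.0, [455.0, 460.0, 446.0], 400.0), 2))  # 0.72
```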

9. Result

              The processing time depends exclusively on the number of operations. This is particularly the case during data transfer using IDoc processing. This means that  when you use IDoc processing to transfer data for a large number of sites, distribution chains or variants, the processing time increases in almost direct proportion to the number of resulting operations.

              The best way to optimize the runtime during the data transfer is to use the site key list to create the site-specific data for the non-reference sites. Why do the processing times differ so greatly? If the logistics data is supplied explicitly in the IDoc for each site, the system must run through the following logic for each data segment:

a) Evaluate field selection to determine which fields are not ready for input (according to Customizing), that is, which fields do not require data from the IDoc.

b) Perform reference handling, that is, copy data from the reference site to the dependent site.

c) Merge the IDoc data of the dependent site with the data resulting from reference handling.

d) Check whether all required entry fields are filled.

e) Perform foreign key checks, that is, check whether the field values are valid.

f) Perform special checks to ensure that mutually dependent fields have consistent values.

g) Check whether there are differences between the dependent site and the reference site.

              If the logistics data for the sites in the key list is generated automatically on the basis of the reference site data, it is sufficient to simply copy the data (which in turn is much quicker). This method is also used during dialog processing if the listing is executed within article maintenance and if the data in the logistics views is maintained only for the reference sites. Even when data is created, no lock entries are generated for the data segments created by copying. This improves performance and reduces the load on the  lock table (when several jobs are processed simultaneously, the lock  table can overflow if too many locks are set).

              By deactivating certain functions, you can improve the runtime of the data transfer.

              You can see from test article SA_10VAR that by using all optimization options, you can reduce the runtime from 500 seconds to 87 seconds, which is a reduction of 80% (-> the system is 5.7 times faster); likewise, the number of operations is reduced from 1023 to 88 (a reduction factor of 11.6, which corresponds to a reduction of 91%).
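The SA_10VAR figures quoted above can be verified with a few lines of arithmetic (illustrative Python):

```python
# SA_10VAR with all optimization options (measured values from above)
time_before, time_after = 500.0, 87.0
ops_before, ops_after = 1023, 88

print(round(1 - time_after / time_before, 2))  # runtime reduction (about 80%)
print(round(time_before / time_after, 1))      # speed-up factor (5.7)
print(round(1 - ops_after / ops_before, 2))    # operation reduction (91%)
```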

              

Solution

What can I do to achieve good system performance during the data transfer (which is important because of the amount of data processed)? You should always aim to keep the number of operations as low as possible. If you want to create site-specific data for a large number of sites (for example, more than 10), you must use the site key list. If the logistics data is transferred to all sites in the IDoc individually, this on the one hand results in unacceptable processing times, and on the other hand creates the risk of an extreme number of deviations (table MABW) when the logistics data of the reference site and that of the dependent sites have different values. These deviations then lead to problems during production operation:

- Due to the deviations, changes made to the logistics data of the reference site are no longer automatically transferred to the dependent sites; that is, you must maintain these values manually for each site that is maintained differently.

- If you want to change the logistics data of an article that has a high number of deviations at operation level, it takes much longer to access the logistics data and to save the changed data than when there are no deviations. This is because the system must read the data for all differently maintained areas of validity (organizational levels) to make sure that the data is consistent. Consequently, the more deviations there are, the longer the reading takes. When you save, the system checks all imported data to make sure that it is consistent and that there are no longer any deviations (it is possible that the values at dependent site level have been adjusted to match a change that was made to a value at reference site level). If this is the case, the system corrects these deviations.

           This behavior is the same both during dialog maintenance and when changes are made using IDoc processing.

- If you subsequently try to list an article for new sites and no logistics data yet exists for these sites, the system creates logistics data during the listing. To do this, it uses a function module that is also used during IDoc processing. Consequently, the processing times for the listing are also very poor, since the system must read and check an unnecessarily large amount of data.

            If you use the key list segment, you must set up the IDoc as follows:

- For the reference sites, specify the required logistics data in the IDoc segments E1BPE1MARCRT, E1BPE1MARCRTX, E1BPE1MPOPRT, E1BPE1MPOPRTX, E1BPE1MARDRT, E1BPE1MARDRTX, E1BPE1MBEWRT, E1BPE1MBEWRTX and so on.

- The dependent sites should be entered in the key list segment E1BPE1WRKKEY. In the case of generic articles, logistics data can only be maintained for reference sites. The dependent sites should therefore be entered for all variants. The logistics data is then created for all the variants and sites specified in the IDoc segment E1BPE1WRKKEY.
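The resulting IDoc layout can be sketched as follows (plain Python, not SAP code; the segment names are those from this note, while the field names "PLANT" and "MATERIAL" and the site/article identifiers are simplified placeholders for illustration):

```python
def build_artmas_segments(reference_sites, dependent_sites, variants):
    """Sketch of an ARTMAS segment list that uses the site key list."""
    segments = []
    for site in reference_sites:
        # Explicit logistics data only for the reference sites.
        segments.append({"segment": "E1BPE1MARCRT", "PLANT": site})
        segments.append({"segment": "E1BPE1MARCRTX", "PLANT": site})
    for variant in variants:
        # Dependent sites go into the key list segment, entered for every
        # variant; their data is later created by copying from the assigned
        # reference site, without generating one operation per site.
        for site in dependent_sites:
            segments.append({"segment": "E1BPE1WRKKEY",
                             "MATERIAL": variant, "PLANT": site})
    return segments

segs = build_artmas_segments(
    reference_sites=["DC01"],
    dependent_sites=["ST%03d" % i for i in range(1, 100)],   # 99 stores
    variants=["ART1-%02d" % v for v in range(1, 11)],        # 10 variants
)
print(sum(1 for s in segs if s["segment"] == "E1BPE1WRKKEY"))  # 990 key list entries
```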

In general, it is unlikely that you will be able to maintain the logistic data in the same way for all sites. For this reason, certain fields (for example, reorder point (MARC-MINBE), maximum stock level (MARC-MABST) and so on) are excluded from deviation handling and from the copying logic in standard systems. This means that no deviations occur when you maintain these fields differently. On the other hand, changes to these fields at reference site level are no longer transferred to the dependent sites. Only changes to fields that are not excluded from the copying logic are transferred. These fields, which cannot lead to a deviation, are supplied only during the case of a new creation when they are supplied with a default value from the reference site. There are also fields in the logistics data whose values can be changed by other applications (for example, changes to vendor default data from the vendor master data maintenance, changes to forecast parameters from the forecast or changes in the forecast profile). These fields are also excluded from the copying logic and deviation handling. Depending on your business requirements, there may be other fields that you need to maintain differently. These fields must be excluded from deviation handling and from the copying logic. This ensures that deviations do not exist thus enabling copying for the other fields. Note 142897 describes how to do this. When the relevant fields no longer lead to deviations, you require a simple method to supply them with values without having to create specific IDoc segments (this would cause the described performance problems to occur due to the large number of operations involved). Note 142898 describes this method. The described procedure enables you to transfer data efficiently and prevents unintentional deviations, which cause performance problems or extra maintenance effort during subsequent processes. It is relatively time-consuming to remove existing deviations at a later stage. 
You should therefore aim to keep the number of deviations for each article (that is, the number of entries in table MABW) as low as possible from the start. In addition, take the following points into account:

- If an article is managed in many sites, you should decide whether inventory management on an article basis is really required for all sites, or whether value-based inventory management at warehouse group level or at a higher level of the warehouse group hierarchy is sufficient. If inventory management takes place at the level of the value-only article (followed by warehouse-group-specific inventory management), you do not have to create logistics data for the article/site combinations. This decision has a considerable effect on the runtime of the data transfer.

- If you create IDocs using a non-SAP system, configure IDoc inbound processing in ALE Customizing so that the IDocs generated in the R/3 System are collected and then processed in packages in a later step, instead of being processed immediately. During immediate processing, a separate RFC (Remote Function Call) is made for each IDoc, which causes considerable overhead.

- Collecting the IDocs also allows you to speed up IDoc processing through parallelization: the IDoc packages are processed by parallel work processes, and the total runtime of the data transfer is reduced according to the degree of parallelism. The required settings and the degree of parallel processing to use should be determined by an experienced SAP consultant. If the degree of parallel processing is too high, it can have the opposite of the desired effect and overload the system.
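The collect-and-parallelize pattern above can be sketched generically. This is not the ALE implementation; the function names, package size, and worker count are hypothetical. The point is that one call per package pays the per-call (RFC) overhead once per package instead of once per IDoc, and the packages are then distributed across parallel workers.

```python
from concurrent.futures import ThreadPoolExecutor

def chunk(idocs, package_size):
    """Collect the individual IDocs into packages of at most package_size."""
    return [idocs[i:i + package_size] for i in range(0, len(idocs), package_size)]

def process_package(package):
    # Stand-in for posting one package of IDocs in a single call; in ALE
    # inbound processing this is where the per-call overhead is paid once
    # for the whole package rather than once per IDoc.
    return [f"processed {idoc}" for idoc in package]

def process_collected(idocs, package_size=50, workers=4):
    """Process IDocs in packages using parallel workers instead of making
    one call per IDoc. A workers value that is too high for the system
    would overload it, mirroring the caution in the note."""
    packages = chunk(idocs, package_size)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(process_package, packages)
    return [item for package_result in results for item in package_result]
```

`ThreadPoolExecutor.map` preserves package order, so the combined result corresponds to the original IDoc sequence.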


Additional optimization methods concerning source determination, change documents, and the application log are described in Notes 139706, 141243, and 141244. As of Release 4.6A, there are solutions for these that do not require a modification. To improve the runtime during article creation, it can also be useful to implement Note 148499.

Performance problems are also likely if you use material group hierarchies that contain a large number of characteristics, or a few characteristics with a large number of characteristic values. In such constellations, the processing time in the classification system increases disproportionately. If you need to integrate additional fields into article maintenance to store customer-specific attributes, do not create them as characteristics of the classification system, because this causes the performance problems described above. Instead, add the extra attributes to table MARA by making a customer enhancement to dialog maintenance, as described in the Implementation Guide under 'Configuring the Article Master'. The extra implementation effort is more than made up for by the resulting runtime improvement. You should always implement customer-specific fields as an enhancement of table MARA if the additional attributes are relevant for most articles.

Validity

Software Component   From Rel.   To Rel.   And Subsequent
SAP_APPL             40B         40B
SAP_APPL             45A         45B
SAP_APPL             46A         46B
SAP_APPL             46C         46C
SAP_APPL             470         470
SAP_APPL             500         500
SAP_APPL             600         600

References

This document refers to and is referenced by the following SAP Notes:

1563398   Lock problems due to too many deviations
1122286   IDOC_INPUT_ARTMAS: Poor performance with large IDocs
 213923   ALE: Performance optimization of article master
 148499   ALE/BAPI: Performance article master (data transfer)
 142898   Customer-specific default data during creation
 142897   Customer-specific adjustment of difference handling
 141244   Deactivating the application log during data transfer
 141243   ALE: Deactivating change document creation for data transfer
 139706   ALE: Deactivating source determination for data transfer (performance)
