SNP Optimization Training Slides

February 10, 2017 | Author: Ravindranath Reddy | Category: N/A

Decision variables are unknown before the decision problem is solved; solving the optimization problem assigns values to them. Types of variables in APO:
- continuous variables – may take any floating-point value (real number) in a range, e.g., the starting time of a certain job
- discrete variables – may only take integer values in a range, e.g., the number of vehicles on a particular route
- binary (0/1) variables – represent binary decisions (e.g., setup decisions)
Finding a feasible solution may already be difficult. The objective of SAP APO is to obtain the best feasible (= optimal) solution of a decision problem, or at least a good feasible solution within a given run time.

In many practical situations there is more than one business goal, and with conflicting goals it is difficult to trade the objectives off against each other. In APO, users can assign a different weight to each goal and can determine the weights interactively by running different scenarios. Linear objective function: each variable is multiplied by a constant, and all terms are added together. For example, suppose we want to determine the optimum product mix of two products, A and B, denoted X1 and X2 respectively. If the profit margin from selling one unit of A is $100 and that of B is $400, the overall profit is F(X) = 100 X1 + 400 X2, whose level sets are straight lines on a graph of X1 and X2.

Variable-type constraints reflect the basic properties of the variables, e.g., non-negativity or integrality of production volumes. These constraints usually define the domains of the decision variables (the search space). Functional constraints describe the structural relationships between activities and resources: in allocating resources to activities, the demand must not exceed the availability. Common constraints on resources include:
- Market demand – The market demand ultimately determines the throughput of the firm; the marketplace dictates the product lines and the production plan. Loss of sales is usually caused by the plant's inability to meet customer requirements at the required time, price and quality. Examples of these constraints are customer orders; sometimes there are different priorities between customer orders. In the SAP APO Optimizer, we can either enforce the prioritization of customer orders strictly (i.e., a higher-prioritized customer order may not be delayed in favor of a lower-prioritized one), or we can model these constraints as soft constraints (i.e., penalize lateness according to priorities).
- Material availability – A fundamental requirement for production. If a vendor delivers raw material late or the material is defective, the production process is disrupted.
- Resource capacity – Machines, tools and labor must be synchronized to ensure a smooth and timely flow of the production process. Both capacity and materials may be soft constraints in the sense that extra capacity and materials can be purchased at additional cost in mid-term planning. For short-term detailed scheduling, however, these are hard constraints. Which resource is a bottleneck may also change over time as supply or demand changes.
- Infrastructure logistics – These include temporal constraints and sequencing constraints in operations and routing. For example, after a heat treatment a part may have to wait for a period of time before the next production step can be executed.

H – number of hockey sticks per day (x1); C – number of chess sets per day (x2). The region enclosed by the graphs of the constraints represents the feasible solution space (yellow). The objective function (dotted lines) is plotted at different levels; its level lines can be written as x2 = -1/2 x1 + Z/4, where Z represents a fixed level of the objective value (here Z = 2 x1 + 4 x2). The plotted objective function values are 32 and 64, respectively. For LPs the optimal solution always occurs at a vertex of the feasible region; here it is reached at x1 (H) = 24, x2 (C) = 4, with a maximum profit of $64.
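A quick way to verify this solution is to solve the same LP numerically. The sketch below uses SciPy's `linprog`; the constraint rows and the objective max 2 x1 + 4 x2 are reconstructed from the reported optimum and the shadow-price discussion later in these notes (4H + 6C <= 120, 2H + 6C <= 72, C <= 10), so treat that data as an assumption rather than the original slide graphic.

```python
# Hockey-sticks/chess-sets LP solved with SciPy (a sketch, assumed data).
from scipy.optimize import linprog

c = [-2, -4]              # linprog minimizes, so negate to maximize 2H + 4C
A_ub = [[4, 6],           # resource A: 4H + 6C <= 120
        [2, 6],           # resource B: 2H + 6C <= 72
        [0, 1]]           # resource C: C <= 10
b_ub = [120, 72, 10]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)    # optimal vertex (H, C) and maximum profit
```

The solver confirms the vertex solution H = 24, C = 4 with profit 64.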

Points in the interior of the feasible space cannot be optimal, because the level lines of the objective function can always be moved toward higher objective values. Note that in some special cases the optimal solution may occur at two vertices of the feasible region; then all convex combinations of these two vertices are also optimal solutions of the decision problem. In this case the level line of the objective function is parallel to the 'critical' restriction. In our example this would be the case for the objective function max x1 + 3 x2 => x2 = -1/3 x1 + Z/3. Then we get several optimal solutions, e.g.
- x1 = 24, x2 = 4
- x1 = 12, x2 = 8
- x1 = 6, x2 = 10
The objective function value is always Z = 36.

BACK-UP: SAP TUTOR: EXAMPLE_THEORY.SIM

The constraints are transformed into equality ('=') restrictions by adding slack variables x3, x4, x5.

Idea of the simplex method: exchange a basic variable (here: x3, x4, x5) with a non-basic variable such that
- the objective value increases
- the values of the basic variables remain non-negative
Candidates for entering the basis: non-basic variables with a negative coefficient in the objective function row. The optimal solution is found when all coefficients in the objective function row are non-negative. Starting corner: non-basic variables x1 = x2 = 0.

1st Iteration:
1. Determination of the pivot column: the variable with the most negative coefficient in the objective function row is chosen, because the objective value increases most by bringing this variable into the basis (here: x2 with -4). In the following, the pivot column is denoted by q (here: q = 2).
2. Determination of the pivot row: in general, increasing the new basic variable reduces other basic variables, because otherwise the constraints would be violated. The increase of the new basic variable is therefore limited by the condition that the values of the other basic variables remain non-negative: the new basic variable can only be increased until one of the other variables reaches zero, and that variable leaves the basis. To determine this bottleneck, for all coefficients aiq > 0 of the pivot column q the quotients θi = bi / aiq are calculated, where bi is the value in row i of the solution column. The variable in the row p for which θi takes the lowest non-negative value leaves the basis (here: row 3 with a value of 10). => Pivot element: a32 (row p = 3, column q = 2)
3. Pivot step: creation of the unit vector with a 1 in the pivot row, i.e., a*pq = 1.
4. Optimality criterion: the optimal solution is found if all coefficients in the objective function row are non-negative. Otherwise go back to step 1 (choice of pivot column) -> not fulfilled => 2nd iteration necessary

2nd Iteration:
1. Determination of the pivot column: q = 1
2. Determination of the pivot row: p = 2 => pivot element: a21 (row p = 2, column q = 1)
3. Pivot step: creation of the unit vector with a 1 in the pivot row, i.e., a*pq = 1.
4. Optimality criterion: not fulfilled => 3rd iteration necessary

3rd Iteration:
1. Determination of the pivot column: q = 5
2. Determination of the pivot row: p = 1 => pivot element: a15 (row p = 1, column q = 5)
3. Pivot step: creation of the unit vector with a 1 in the pivot row, i.e., a*pq = 1.
4. Optimality criterion: fulfilled => optimal solution found

Shadow prices -> coefficients of the slack variables in the objective function row:
- u3 = 2/6 = 1/3 => decreasing the first restriction (resource A) by one unit reduces the profit by $0.333 -> 4H + 6C <= 119 => profit = 63.667 (with optimal solution H = 23.5 and C = 4.167)
- u4 = 1/3 => decreasing the second restriction (resource B) by one unit reduces the profit by $0.333 -> 2H + 6C <= 71 => profit = 63.667 (with optimal solution H = 24.5 and C = 3.667)
- u5 = 0 => the third restriction (resource C) is not active
Reduced costs -> coefficients of the decision variables in the objective function row:
- since x1 and x2 are both larger than zero, the corresponding reduced costs are both equal to zero
- if a reduced cost has a positive value, it indicates by how much the corresponding objective function coefficient of this decision variable has to be increased in order to bring the variable into the basis
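The pivot rules and shadow prices above can be reproduced with a minimal dense-tableau simplex. This is a textbook sketch, not SAP's implementation, and it assumes the reconstructed hockey-sticks/chess-sets data (max 2x1 + 4x2 s.t. 4x1 + 6x2 <= 120, 2x1 + 6x2 <= 72, x2 <= 10); it follows the same pivot sequence a32, a21, a15 described in the iterations.

```python
# Dense-tableau simplex sketch (textbook method, assumed problem data).
import numpy as np

# Columns: x1, x2, slacks x3..x5, solution column b; last row = objective.
T = np.array([
    [4., 6., 1., 0., 0., 120.],
    [2., 6., 0., 1., 0.,  72.],
    [0., 1., 0., 0., 1.,  10.],
    [-2., -4., 0., 0., 0.,  0.],   # objective row with negated coefficients
])

while (T[-1, :-1] < -1e-9).any():
    q = int(np.argmin(T[-1, :-1]))            # pivot column: most negative coeff
    ratios = [T[i, -1] / T[i, q] if T[i, q] > 1e-9 else np.inf
              for i in range(len(T) - 1)]     # ratio test over a_iq > 0
    p = int(np.argmin(ratios))                # pivot row
    T[p] /= T[p, q]                           # pivot step: a*_pq = 1
    for i in range(len(T)):
        if i != p:
            T[i] -= T[i, q] * T[p]            # eliminate pivot column elsewhere

print("objective value:", T[-1, -1])          # 64.0
print("shadow prices u3, u4, u5:", T[-1, 2:5])  # approx. [1/3, 1/3, 0]
```

The final objective row carries the shadow prices u3 = u4 = 1/3 and u5 = 0 in the slack-variable columns, exactly as read off the slide.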

Simplex: an algorithm that moves from one vertex of the polyhedron to an adjacent vertex, advancing along one edge at a time.
Interior point method (IPM): an algorithm that moves through the interior of the polyhedron.
Comparison:
1. The optimal solution of an LP problem always lies on a vertex, i.e., an extreme point of the boundary of the feasible region.
2. An algorithm that moves through the interior of a region must take care not to leave the feasible region: approaching the boundary of the feasible region is penalized, and this penalty is dynamically decreased in order to find a solution on the boundary.
3. Interior point methods involve complicated mathematics and use advanced mathematical concepts. A large variety of IPMs has been developed. In linear programming, IPMs are especially well suited for large, sparse problems, where considerable computing-time gains can be achieved.

B&B is a specific enumeration algorithm. In contrast to complete enumeration, only a limited number of sub-problems has to be solved. The B&B algorithm works like a divide-and-conquer strategy. First, the linear relaxation of the integer problem is solved. If that solution is integer, it is also the optimal solution of the integer problem. Otherwise, two new sub-problems are created by branching on a fractional variable. The process is repeated until no further sub-problems can be created, i.e., until every remaining sub-problem yields an integer solution or is infeasible. The determination of good bounds is important so that branching can be stopped early.

Step 1: Linear relaxation (e.g., with the simplex method)
- No feasible solution => the MILP has no feasible solution => END
- Optimal integer solution => END
- Optimal, non-integer solution => step 2

The objective function value of the LP relaxation is an upper bound for the objective function value of the MILP; the initial lower bound is minus infinity.

Step 2: Branching
A non-integer variable X with g < X < g+1 (g integer) is branched on. The set of all solutions is divided into two sub-sets:
(a) solutions with X <= g
(b) solutions with X >= g+1
The other constraints remain unchanged.

Step 3: Bounding
The two sub-problems of step 2 are solved.

The objective function values of the two problems are upper bounds for the objective values of the corresponding sub-sets. If there are still variables that have to be integer but are not, branching must be continued. However, a sub-problem does not need to be analyzed further if
- the bound of its sub-set is lower than the objective value of a known feasible, integer solution, or
- the sub-problem is infeasible.
If all variables are integer and there is more than one sub-problem with a feasible integer solution, the optimal solution is found by comparing the objective function values.

Step 4: Optimality test
The algorithm terminates once all branches have been implicitly or explicitly analyzed. Then either an optimal solution has been found or the problem is infeasible.

Abbreviation: SP = sub-problem; numbers according to sequence of solving the sub-problems

SP5 ends because an integer solution is found. SP6 ends because its objective function value (Z = 14) is lower than the objective value of SP5 (Z = 15), which fulfills all constraints including integrality and is therefore a lower bound for the overall optimal solution. Sub-problems whose objective function value is lower than this bound do not need to be investigated further.

End of sub-problem 5: integer solution; new bound: Z = 15. End of sub-problem 6: the objective function value of the solution, Z = 14, falls below the bound Z = 15. Note: If there are (in at least one sub-set) variables that are not integer, a criterion is needed to determine which sub-problem should be branched next. Within the Dakin algorithm the 'newest bound rule' is usually applied, i.e., the most recently generated sub-set that has not been completely analyzed is branched. If sub-sets are generated simultaneously, the sub-set with the higher upper bound is chosen.
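The B&B steps above can be sketched with LP relaxations solved by SciPy's `linprog`. The ILP below is a generic textbook instance (the data of the SP1–SP6 example is not given in these notes, so this is an assumption); pruning follows exactly the two rules just listed: bound below the incumbent, or infeasible sub-problem.

```python
# Compact branch & bound sketch for a small ILP (assumed example data).
import math
from scipy.optimize import linprog

c = [-5, -4]                        # maximize 5x1 + 4x2 (negated for linprog)
A = [[6, 4], [1, 2]]
b = [24, 6]

best = {"z": -math.inf, "x": None}  # incumbent = lower bound for maximization

def branch_and_bound(bounds):
    res = linprog(c, A_ub=A, b_ub=b, bounds=bounds)
    if not res.success:              # sub-problem infeasible -> prune
        return
    z = -res.fun                     # LP value = upper bound of this sub-set
    if z <= best["z"] + 1e-9:        # bound below incumbent -> prune
        return
    frac = [i for i, v in enumerate(res.x) if abs(v - round(v)) > 1e-6]
    if not frac:                     # integer solution -> new incumbent
        best["z"], best["x"] = z, [int(round(v)) for v in res.x]
        return
    i = frac[0]                      # branch on the first fractional variable
    g = math.floor(res.x[i])
    lo, hi = bounds[i]
    branch_and_bound(bounds[:i] + [(lo, g)] + bounds[i+1:])      # X <= g
    branch_and_bound(bounds[:i] + [(g + 1, hi)] + bounds[i+1:])  # X >= g+1

branch_and_bound([(0, None), (0, None)])
print(best)   # optimal integer solution and objective value
```

For this instance the relaxation gives (3, 1.5); branching on x2 eventually yields the integer optimum x = (4, 0) with Z = 20, and the x2 >= 2 branch is pruned by the bound rule.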

Cutting-plane methods start in the same way as B&B but then move towards a solution by progressively restricting the feasible region defined by the constraints of the relaxed problem. A pure cutting-plane algorithm tends to be slow to reach a solution => in APO, a combination of B&B and the cutting-plane approach is used: the Branch & Cut (B&C) algorithm.

Branch & Cut (B&C) proceeds in steps similar to B&B, but at each node of the tree, rather than only branching on a single variable, the algorithm appends cuts to the problem that cut off parts of the LP feasible region without cutting off any valid integer solution. Standard cuts:
- Gomory
- flow
- cover
- disjunctive
- ...

Constraints Transportation transportation resource  inbound/outbound handling resource

 Production production resource  storage

resource

 Procurement inbound handling resource

Product

Constraints

minimal/fixed safety stock

lot sizes

shelf life

Decision Variables production, transportation, procurement, storage quantities deliveries (fulfilled demands) capacity expansion

Production/transportation/storage capacities: limit the maximum availability of production/transportation/storage resources for each location.
Handling capacity: limits the flow in and out of a location. The optimizer may consider different handling resources for inbound and outbound.
Capacity calendar for resources: the capacity may change during the planning horizon.
Break calendar: breaks can be defined for each resource, e.g., weekends (not: downtimes).
Due dates / deadlines / maximum delay: due dates are considered soft constraints, i.e., lateness is only penalized in the objective function, whereas deadlines are hard constraints, i.e., no delay is allowed. Deadlines may be modeled by limiting the maximum delay to 0.
Safety stock: falling below safety stock is considered a soft constraint, i.e., penalized in the objective function.
Additional:
- Integer lot size: the production or transportation lot size cannot be varied continuously.
- Minimum lot size: some resources may only be used for large lot sizes.

If setup times are small and many products run on a resource per bucket, then a "global" capacity reduction in the resource master data should be used instead of fixed resource consumption in the PPMs of each product.

If the average lot size (derived from demand data) is considerably larger than the minimum lot size, it is not necessary to model minimum lot sizes. For instance, with an average demand of 10 pieces, it actually makes no sense to use a minimum lot size of 1 piece for the SNP Optimizer.

Optimization runs from the final stage of the supply chain (customer level) to the first stage of the supply chain (plant level).

The optimizer has to run on a Windows Server machine; as of SCM 2007, Linux is also supported. On a 32-bit OS, the maximum main memory that can be used is 3 GB (1 GB is reserved for the operating system). On a 64-bit OS, the only restriction is the physical memory size (swapping will greatly reduce performance). The processor should always be as fast as possible: processor performance directly influences the run-time performance of the SNP optimizer, with an almost linear relation, i.e., a run time of 3 hours on a 2 GHz processor drops to about 2 hours on a 3 GHz processor of the same type (across different CPU types the relation is not that simple). Important note: see SAP Note 112403 (addressing 3 GB memory under Windows NT/2000/2003). In case of main memory problems (overflow), specific parameters are available in transaction copt10 to address them. Get in touch with development (via OSS message) when you experience this kind of problem, so that development can recommend which parameters should be set. However, there are upper bounds on the number of variables and constraints for the optimizer. Benchmarks from customer cases will follow later.

How to get the sheet? http://service.sap.com/scm -> mySAP SCM technology -> Performance and Configuration Recommendations -> System Requirements for the SAP APO 3.x and SAP SCM 4.x Optimizer

For a FIXED run time with increasing model detail (i.e., complexity of the model), the gap between the optimal solution and the solution computed by the optimizer increases.
- The business will probably not accept these solutions, since they are too far away from the optimal solution (model too detailed).
- If the model is too rough, the business will also not accept the solution, since the model doesn't correspond to their business requirements.
In the "optimal model detail" region, the model is neither too rough (i.e., it respects the important business requirements) nor too detailed (i.e., it is still solvable in an acceptable run time). Here, acceptable solutions (in terms of business requirements and run time) are found. However, it might also be necessary to use, e.g., decomposition techniques to get the right trade-off between solution quality and run time and to reach business acceptance of the computed solutions.

The customer uses SNP and PP/DS. Nevertheless, they had used a very detailed time bucket profile in SNP, resulting in 82 planning periods and a huge number of (discrete) variables and constraints. It was not possible to solve this model in SNP at all. But since the customer uses PP/DS for detailed scheduling, such a detailed time bucket profile actually made no sense, and it was possible to convince them to use an aggregated time bucket profile with only 39 planning periods. Additionally, the number of transportation lanes was reduced, and with it the number of discrete variables. Minimum lot sizes and fixed resource consumption were not absolutely necessary; on the other hand, transportation lot sizes were introduced. In total, the number of discrete variables was reduced. Nevertheless, a problem with so many (discrete) constraints and variables cannot be solved without a decomposition technique. After testing many problem instances with time and product decomposition (with different parameter settings), the customer now uses product decomposition. The total run time is 10 hours (including data collection and writing back of the results); pure optimization time is set to 7 hours, since that amount of time is available. However, good results are already found after approx. 3 hours.

General remark: cost values shouldn't differ too much; a range from 1 to 10^6-10^8 is o.k.; larger differences may lead to numerical problems.
Central maintenance of costs: Master Data -> Supply Network Planning Master Data -> Maintain Costs (Table) -> Maintain Costs (Directory)
Purpose of the cost profile: global weighting of the different cost factors
- checking the global impact of one cost factor on the solution, e.g., what is the impact if capacity increases are very expensive
- the cost values of all corresponding decision variables are changed
- to increase/decrease specific values, e.g., for producing/transporting/storing a specific product, the individual PPM/transportation/storage costs in the master data have to be changed

Costs for increasing production capacities are defined in the resource master data (quantities/rates definition in capacity variants). Costs for storage expansion are defined in the resource master in the same manner as costs for production expansion. The cost functions can be maintained centrally in transaction /sapapo/snpcosf.

Single-level costs are variable (quantity-dependent) costs incurred when using this production process model in one bucket: the single-level costs occur for each execution of the PPM. If, e.g., a PPM is executed 10 times in a bucket and the single-level costs are equal to 5, then the PPM costs in this bucket are 50. Multi-level costs are not relevant for the SNP optimizer; they are only used for the scheduling heuristics in PP/DS. Cost value in the input log: // ET_PROMO: PCOST – variable cost of the PPM

The costs are calculated according to the cost function in the example as follows:
- 5 + 4*x for 0 <= x <= 10
- 15 + 3*(x-10) for 10 <= x <= 20
- 20 + 2*(x-20) for 20 <= x <= 30
- 25 + 1*(x-30) for 30 <= x <= 999.999.999,999
where x represents the number of PPM executions per bucket. Note that the upper bound of one segment is equal to the lower bound of the subsequent segment; if the number of PPM executions equals such a bound, the cheaper of the two segments is chosen. If in our example x = 10, then the costs will be 15. In general, the costs are computed as: fixed costs + variable costs * (x - 'from' value of the cost function segment). If you want the optimizer to consider the cost function, you must not only maintain it in the PPM but also activate the corresponding field in the optimization profile. If the cost function for the PPM is used, the optimizer ignores the PPM costs. Cost functions for transportation and procurement work with exactly the same logic. Again, you have to use the discrete solver if the optimizer should consider the cost functions. Important note: 'continuous' cost functions reduce the complexity for the optimizer!
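The segment logic above can be sketched as a small function. The segment data is taken from the example; the tie-break at a segment boundary picks the cheaper segment, as described.

```python
# Piecewise PPM cost function sketch (segment data from the example above).
segments = [                     # (from, to, fixed cost, variable cost)
    (0, 10, 5, 4),
    (10, 20, 15, 3),
    (20, 30, 20, 2),
    (30, 999_999_999, 25, 1),
]

def ppm_cost(x):
    """cost = fixed + variable * (x - from); the cheaper segment wins at a bound."""
    candidates = [fix + var * (x - lo)
                  for lo, hi, fix, var in segments if lo <= x <= hi]
    return min(candidates)

print(ppm_cost(10))   # 15 – boundary case: the cheaper of 45 and 15
```

At x = 10 both the first segment (5 + 4*10 = 45) and the second (15 + 3*0 = 15) apply, and the cheaper value 15 is chosen, matching the slide.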

Cost functions in the input log:
// ET_TRANCOS (Transport Cost Functions)
// ET_PRODCOS (Production Cost Functions)
// ET_PROCCOS (Procurement Cost Function)
with
- ORIGN: smallest value of the segment
- FIXCO: fixed cost
- VARCO: variable cost

cost = FIXCO + VARCO * (X - ORIGN)

In the example above (// ET_PRODCOS, // PROID = <PPM name>, one row per segment):

ORIGN                 FIXCO                 VARCO
0,0000000000000E+00   1,0000000000000E+03   5,0000000000000E+00
1,0000000000000E+02   1,5000000000000E+03   4,0000000000000E+00
2,0000000000000E+02   1,9000000000000E+03   2,0000000000000E+00
3,0000000000000E+02   2,1000000000000E+03   1,0000000000000E+00

Transportation costs are calculated as follows:
- Trans. costs: the variable costs for transporting a product via a transportation lane, per unit of measure.
- Mns of Trsp Costs: the distance-dependent costs of a transport method (per km).
The cost function can only be maintained in transaction /sapapo/snpcosf (or: SC Engineer -> Goto -> Cost functions, which leads to this transaction).
Transportation costs in the input log:
- table ET_ARC: TCTYP: variable transportation cost for the fleet -> calculated as Mns of Trsp Costs * Trsp distance
- table ET_ARCMAT: TCOST: variable cost for the material on this lane = Trans. costs
- table ET_TRANCOS: cost function for transportation

Transportation Unit [TRAUNIT] is the unit maintained at the transportation resource and Base Unit of Measure [BMAT] is the unit of measure in the product master. Both have to be maintained consistently, i.e. in the product master the conversion to the transportation unit has to be maintained. Here, both units are kg.

Resulting transportation costs: 180 [BMAT] / 5000 [BMAT/TRAUNIT] * 2 [APO-$/km] * 672.576 [km] + 180 [BMAT] * 100 [APO-$/BMAT] = 18048.425 [APO-$]. The first cost term describes the distance-dependent costs associated with the usage of a transportation resource; the second term describes the quantity-dependent costs of transporting the material. If you don't have a transportation resource, the first cost term is simply multiplied by the transported quantity, i.e., the default is then 1 transportation unit [TRAUNIT]. With a transportation resource, the Mns of Trsp Costs relate to the complete utilization of the resource; without one, they relate to the base unit of measure of the product. The first cost term is bucket-dependent, the second bucket-independent: the first cost term is incurred in each bucket, so if, e.g., a transport starts in one bucket and ends in the next, these costs occur twice, whereas the quantity-dependent costs occur only once. If you use a cost function, you have to specify variable and fixed costs. Note that both cost terms are then multiplied by the transportation distance. Example: with fixed costs of 100 APO-$, variable costs of 10 APO-$, and a transportation distance of 950 km, the fixed costs equal 95,000 APO-$ and the variable costs 9,500 APO-$. You can check this in the input log of the optimizer, in table ET_TRANCOS. For calculating the cost of a transport, the variable costs are multiplied by the transported quantity.
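The first example's arithmetic can be reproduced as follows. Values come from the slide; treating the quantity-dependent rate as APO-$ per base unit of measure is an assumption about the units.

```python
# Transportation cost arithmetic from the slide (units per the example).
qty_bmat   = 180.0      # transported quantity in base unit of measure [BMAT]
conv       = 5000.0     # conversion factor [BMAT per TRAUNIT]
mns_cost   = 2.0        # Mns of Trsp Costs [APO-$/km]
distance   = 672.576    # transport distance [km]
trans_cost = 100.0      # Trans. costs, assumed [APO-$ per BMAT]

resource_term = qty_bmat / conv * mns_cost * distance  # distance-dependent term
quantity_term = qty_bmat * trans_cost                  # quantity-dependent term
total = resource_term + quantity_term

print(round(total, 3))  # 18048.425
```

The distance-dependent term contributes only 48.425 APO-$ here; almost all of the total comes from the quantity-dependent term.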

Procurement costs in the input log, table ET_LOCMAT:
- FCOST: linear procurement cost
- FPERF: flag: procurement permitted = 'X'

Restricted use of shelf life in SNP: please check notes 574321 (SNP in general) and 579556 (SNP OPT). The optimizer only considers shelf life when the flag 'Plng w/ shelf life' is switched on in the product master and a shelf life is maintained (Attributes tab). Additionally, the flag 'Do Not Take Shelf Life into Account' must not be activated in the optimizer profile. Shelf life violation costs in the input log, table ET_LOCMAT:
- WASFL: flag: shelf life penalty is active -> WASFL = 'X'
- WASTE: shelf life: penalty for wasted quantity (same values as for procurement costs FCOST)
- STODU: shelf life: storage duration

Restricted use of shelf life in SNP: please check notes 574321 (SNP in general) and 579556 (SNP OPT). Settings:
- Do Not Take Shelf Life into Account: the optimizer does not take product shelf lives into account during optimization, even if the corresponding master data are maintained.
- Continue Using Expired Product: the optimizer takes into account the shelf life of products as specified in the product master. For expired products (those that have passed their shelf life expiry date), the optimizer plans the continued use of the product, but calculates penalty costs for this continued use.
- Dispose of Expired Product: the optimizer takes into account the shelf life of products as specified in the product master. For expired products, the optimizer plans the disposal of the product, but calculates penalty costs for this disposal.
- Use Penalty Csts that are not Prod-Dep.: this indicator can be activated in connection with either the Continue Using Expired Product or the Dispose of Expired Product indicator. By default, the optimizer considers the location product procurement costs defined on the Procurement tab page in the product master. If you want to define different costs that are not dependent on the product, you can do this in the Penalty Costs: Not Product Dependent field; in this case, you must also activate the Use Penalty Csts that are not Prod-Dep. indicator.

Only the ending inventory of a bucket accrues inventory cost. If you use cost per bucket, it is harder to balance the storage cost against the safety stock penalty cost, which is calculated as a cost per day. In addition, a storage cost per bucket does not account for planning in buckets of different lengths, e.g., weekly versus monthly planning buckets. For LPs it is not necessary to maintain storage costs to get 'JIT behavior' of the optimal solution: internal post-processing in the optimizer ensures that planned orders are generated as late as possible to satisfy the demand. This post-processing is not active for discrete problems, i.e., storage costs have to be maintained to get this JIT result.
Cost calculation:
- Average Stock on Hand: the optimizer calculates the storage costs by multiplying the following values together:
  - stock on hand at the end of the bucket (period)
  - storage costs defined for the location product in the product master (Procurement tab)
  - number of days in the bucket
- Stock on Hand at End of Period: the optimizer calculates the storage costs by multiplying the stock on hand at the end of the bucket (period) with the storage costs defined for the location product.
Storage costs in the input log, table ET_LOCMAT:
- HCOST: storage costs
- BUCFL: storage cost: multiply by bucket length = 'X' -> if the setting Average Stock on Hand is chosen
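The two calculation settings can be sketched in one small helper; the numbers below are hypothetical, chosen only to show the effect of the bucket-length multiplier.

```python
# Sketch of the two storage-cost settings (hypothetical numbers).
def storage_cost(end_stock, cost_per_unit, days_in_bucket, average_stock_on_hand):
    """'Average Stock on Hand' multiplies by the bucket length in days;
    'Stock on Hand at End of Period' does not."""
    cost = end_stock * cost_per_unit
    return cost * days_in_bucket if average_stock_on_hand else cost

print(storage_cost(100, 0.5, 7, True))    # 350.0 – weekly bucket, per-day cost
print(storage_cost(100, 0.5, 7, False))   # 50.0  – bucket-length ignored
```

With a per-day storage cost, only the first setting scales correctly across weekly and monthly buckets, which is the balancing issue mentioned above.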

Prerequisites in the location product master:
- maintained safety stock (target safety stock level)
- safety stock planning method
Take Period Length into Account: the optimizer also takes the period length into account when calculating the penalty costs for falling below the safety stock, i.e., it multiplies the absolute (or relative) shortfall by both the penalty costs and the period length in days. With weekly buckets this means the resulting penalty costs are multiplied by 7 if this checkbox is activated. If this setting is not active, the optimizer does not consider the number of days in a bucket (in which inventory is below safety stock).

The safety stock penalty is calculated as follows:
- Take a Relative Deviation from the Safety Stock into Account: the optimizer calculates penalty costs when the stock falls below the safety stock level. It multiplies the percentage shortfall by the penalty costs defined in the product master (per day).
  Calculation: safety stock penalty cost * [(target safety stock - inventory) / target safety stock] * (if required) period length in days (number of days that inventory is below safety stock)
- Take an Absolute Deviation from Safety Stock into Account: the optimizer calculates penalty costs when the stock falls below the safety stock level. It multiplies the absolute shortfall by the penalty costs defined in the product master (per day).
  Calculation: safety stock penalty cost * amount of stock below the desired safety stock * (if required) period length in days (number of days that inventory is below safety stock)
- Take Period Length into Account: the optimizer also takes the period length into account when calculating the penalty costs for falling below the safety stock, i.e., it multiplies the absolute (or relative) shortfall by both the penalty costs and the period length in days. In the example above with weekly buckets this means the resulting penalty costs are multiplied by 7 if this checkbox is activated. If this setting is not active, the optimizer only considers the number of buckets in which inventory is below safety stock.
Safety stock penalty costs in the input log, table ET_LOCMAT:
- SSPEN: penalty for not covering safety stock
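Both deviation variants and the period-length switch can be sketched in one hypothetical helper; the figures are made up for illustration.

```python
# Sketch of the safety-stock penalty formulas (hypothetical numbers).
def safety_stock_penalty(target, inventory, penalty_per_day,
                         relative=False, days=1):
    shortfall = max(target - inventory, 0.0)   # amount fallen below safety stock
    if relative:
        shortfall /= target                    # percentage shortfall instead
    return penalty_per_day * shortfall * days  # days = 1 if period length ignored

# absolute deviation, weekly bucket with 'Take Period Length into Account'
print(safety_stock_penalty(50, 30, 2.0, relative=False, days=7))  # 280.0
# relative deviation, period length not taken into account
print(safety_stock_penalty(50, 30, 2.0, relative=True))           # 0.8
```

With a 20-unit shortfall against a target of 50, the absolute variant charges 2.0 per unit per day, while the relative variant charges 2.0 per day against the 40% shortfall.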

The penalty cost for not satisfying a demand is interpreted as cost per unit of demand not satisfied. The penalty cost for delay is interpreted as cost per unit of demand per day of delay, e.g., if you are working in monthly buckets, the cost is multiplied by the number of calendar days. Late delivery costs, maximum lateness and non-delivery costs in the input log, // ET_DEMCLTIM (Demand Class with Penalty for Delay/Not Delivering):
- LAPEN: penalty for lateness
- MAXLA: maximum lateness
- NDPEN: penalty for not delivering
Demand data in the input log, table ET_DEMAND, DEMCL = demand class:
- 1 – highest priority (customer demand)
- 2 – priority 2 (corr. forecasts)
- 3 – lowest priority (forecasts)

The resource utilization cost is maintained on capacity variant 2 of the resource, and capacity variant 2 has to be specified as the active variant of the resource. This works for mixed and bucket resources; for mixed resources, you have access to the capacity variant only if you explicitly enter the bucketed capacities. The resource utilization cost is interpreted as cost per unit of resource usage above capacity variant 1; if you only use capacity within capacity variant 1, no costs are incurred. In APO 3.x you have to use capacity variants 1 and 2. Costs for capacity expansion in the input log, // ET_RESFAMC (Resource Family Calendar):
- INCRF: upper bound for increasing the production resource
- INPEN: penalty per unit of volume for increasing capacity

You have to maintain non-delivery or delay costs; otherwise there will be no production at all, since the 'do nothing at all' solution leads to minimal costs (= 0). With storage costs you can control the JIT behavior of the system.

Storage and shipping calendars are only used if the corresponding resources are not maintained. If a storage resource or a handling resource is maintained, the optimizer uses the calendar from that resource. The same is valid for production resources. The stock category group from the SNP tab is not used by the SNP optimizer, and the deployment settings are only used by the deployment optimizer.

The shelf life parameters are on the product master. Shelf life is product-specific, not product-location-specific (e.g., the same shelf life applies at the plant and at the DC). The optimizer only considers shelf life when the flag 'Plng w/ shelf life' is switched on in the product master and a shelf life value is maintained (Attributes tab). Additionally, the corresponding setting in the optimizer profile has to be activated. Only the shelf life field is used by optimization; none of the other fields (maturity, min./max. shelf life) are used. In SNP, only optimization recognizes shelf life. Other SNP solvers and the interactive planning tables do not recognize shelf life, so it may be misleading to see stock in the interactive planning table that is no longer usable. Please check notes 574321 (SNP in general) and 579556 (SNP optimizer) for more detailed explanations. The shelf life costs occur in the bucket where the product has to be discarded due to exceeded shelf life. Shelf life parameters in the input log, table ET_LOCMAT:
 WASFL: flag: shelf life penalty is active -> WASFL = 'X'
 WASTE: shelf life: penalty for wasted quantity
 STODU: shelf life: storage duration

Production storage capacity is a product-specific maximum stock level at this location (per bucket). The maximum lot size from the location product master is not considered during the optimizer run, but it is taken into account for the creation of orders (in liveCache). Example: maximum lot size in location product master data = 500  the optimizer creates an order of e.g. 2000 (also displayed in the interactive planning table)  4 orders of 500 are shown in the detailed view / product view  a Customizing change is necessary; transaction: SPRO -> APO Implementation Guide -> Advanced Planner and Optimizer -> Supply Chain Planning -> Supply Network Planning (SNP) -> Global Settings for SNP Optimizer: activate the checkbox 'create orders' if more than one order should be created, if necessary based on the location product master data settings (for details see consulting note 503222).
To consider the rounding value as a lot size, the flag 'discretization' in the PPM has to be activated. Additionally, the corresponding settings in the optimizer profile for discretization (min. lot size, rounding value) and production storage capacity (general constraints / capacity restrictions: maximum product-specific quantity stored) have to be selected. Lot size parameters in the input log:
 in table ET_PROMO:
 LOTSZ: rounding value or fixed lot size from the location product
 PMINQ: minimum production lot size (from PPM or location product)
 Note: PMAXQ in table ET_PROMO: maximum production lot size (from PPM)
 in table ET_LOCMAT:
 MAXST: maximum stock
 MAXFL: flag: constraint active -> MAXFL = 'X' (optimization profile)
 in table ET_LOCPROD:
 SAFTY: safety stock (Note: FPROD in table ET_LOCPROD describes confirmed production)

Within the production horizon (in days) / stock transfer horizon (in days) the optimizer does not plan production / transfers. That is, the optimizer does not create SNP planned orders / distribution receipts within this horizon, but postpones production / distribution to the first day beyond the specified production / stock transfer horizon. The forecast horizon (in days) defines a horizon in calendar days within which the forecast is not considered part of the total demand. By default, the optimizer does not respect the forecast horizon; if you want it to, apply note 412551. Note 443012 also has to be applied if you are below SP17. Horizons in the input log: table ET_PRODBND for the production horizon, table ET_TRANBND for the stock transfer horizon. There may be an entry in the input log even if no production / stock transfer horizon is maintained: if you have e.g. weekly buckets and you run the optimizer on the second day of the week, no production/distribution orders can be created in this week, i.e. the horizons are active. The same is true for table ET_PROCBND (see below). Procurement parameters in the input log, table ET_LOCMAT:
 FPERF: procurement type
 FCOST: linear procurement cost (variable)
 SSPEN: penalty for not covering safety stock
 HCOST: storage costs

in table ET_PROCCOS: procurement cost function in table ET_PROCBND: planned delivery time => procurement bounds are set to 0.

Maintenance in Product Location Master Data -> Procurement Tab, field: Procurement Type E = In-house Production x = maintained - = not maintained Consulting note: 510559.

Maintenance in Product Location master data -> Procurement Tab, field: Procurement Type F = External Procurement x = maintained - = not maintained

Maintenance in Product Location master data -> Procurement Tab, field: Procurement Type X = In-house Production or External Procurement P = Procurement Planning: External x = maintained - = not maintained * = maintenance not relevant for interpretation

GR/GI parameters in Input log in table ET_LOCMAT:  RECTI: goods receipt processing time  ISSTI: goods issue processing time  CONIN: consumption of input handling capacity (goods receipt) (if inbound handling resource is maintained in location master data: Resources Tab)  CONOU: consumption of output handling capacity (goods issue) (if outbound handling resource is maintained in location master data: Resources Tab)  CACON: storage capacity consumption (if storage resource is maintained in location master data: Resources Tab) Example: If the handling resource can handle 1000 kg per day, and you define the handling capacity consumption as 10 kg per piece, the maximum rate is 100 pieces per day. If you want to consider the GR time for in-house production, then go to APO customizing: supply chain planning -> SNP -> basic settings -> maintain global SNP settings: HEU: Planned Order GR = processing time.

Handling and storage resources in the input log:
 ET_LOC (handling resources at a location):
 LOCID: location ID
 RESIN: handling resource ID for inbound
 INPEN: penalty for increasing the handling capacity
 MAXIN: flag: hard + soft constraint active -> MAXIN = 'X'
 RESOU: handling resource ID for outbound
 OUPEN: penalty for increasing the handling capacity
 MAXOU: flag: hard + soft constraint active -> MAXOU = 'X'
 ET_LOCC (handling resource: normal capacity):
 MAXHA: handling capacity
 ET_LOCUC (handling resource: additional capacity):
 INCHA: upper bound for increasing the handling capacity
 ET_SUBLOC (storage resources of a sublocation):
 SUBID: sublocation ID
 INPEN: penalty for increasing the capacity
 MAXFL: flag: hard + soft constraint active -> MAXFL = 'X'
 ET_SUBLOCC (storage resource: normal capacity):
 MAXSL: storage capacity
 ET_SUBLOCUC (storage resource: additional capacity):
 INCSL: upper bound for increasing the storage capacity

Over-utilization, downtime settings and minimum utilization are not recognized by SNP. There is no cost associated with using the normal capacity. The cost assigned to the resource is the cost per unit of capacity used above the normal capacity (variant 2). Details about cross-period lot sizing will follow later. Production resources in the input log:
 ET_RESOURCE (elementary production resources):
 RESID, RFAID: resource ID and resource family ID
 UNIVO: unitary volume (= 1.0, redundant entry)
 ET_RESFAM (production resources of a resource family):
 RFAID: resource family ID
 INPEN: penalty per unitary volume for increasing capacity
 MAXFL: flag: hard + soft constraint active -> MAXFL = 'X' (finite capacity)
 DISCR: flag for using discrete increase of the resource family
 ET_RESC (production resource: normal capacity, variant 1 or standard capacity):
 MAXRE: production capacity
 ET_RESFAMC (production resource: additional capacity, variant 2):
 INCRF: upper bound for increasing the production capacity
 INPEN: penalty for increasing production capacity

From an economic viewpoint it is sometimes sensible to require a minimum resource utilization even if the demand is not sufficient. We use capacity variants to model the utilization, and the SNP optimizer then takes them into account. If the utilization is below the minimum, the optimizer considers costs, i.e. the minimum capacity is a soft constraint. The minimum, the normal and the maximum resource consumption are realized as capacity variants of a resource. In the capacity view of the SNP planning book, three capacity variants are shown. The minimum resource capacity is shown in the new key figure SUPVAR3 and is computed by macros analogous to those of the first two resource variants (SUPVAR1 and SUPVAR2). Which product(s) are selected to load the resource up to the desired (minimum) capacity depends on the overall cost situation; the decision is made such that the total costs are minimized.
Calculation of costs:
 Non-utilization (minimum capacity): (minimum capacity – used capacity) * costs for falling below minimum capacity = (8 – 6) * 100 = 200
 Resource utilization: MIN(used capacity, normal capacity) * costs for used capacity = MIN(12, 16) * 1 = 12
 Increased utilization (maximum capacity): (used capacity – normal capacity) * costs for increased capacity = (22 – 16) * 10 = 60, plus resource utilization costs = 16 => total costs = 76
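The three cost components above can be reproduced with a short sketch, using the slide's example values (minimum 8, normal 16, costs 100 / 1 / 10); the function returns the components separately so each line matches the slide's calculation:

```python
# Illustrative sketch of the three-part resource cost calculation.
MIN_CAP, NORM_CAP = 8, 16
COST_BELOW_MIN, COST_USED, COST_ABOVE_NORM = 100, 1, 10

def resource_costs(used):
    """Return (below-minimum, utilization, above-normal) cost components."""
    below_min = max(MIN_CAP - used, 0) * COST_BELOW_MIN
    utilization = min(used, NORM_CAP) * COST_USED
    above_norm = max(used - NORM_CAP, 0) * COST_ABOVE_NORM
    return below_min, utilization, above_norm

print(resource_costs(6))   # (200, 6, 0): the below-minimum part is the slide's 200
print(resource_costs(12))  # (0, 12, 0): the utilization part is the slide's 12
print(resource_costs(22))  # (0, 16, 60): 16 + 60 = 76 total, as above
```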

SAP TUTOR: OPT_MIN_RES.SIM

For further information refer to the PDS guide in the Manufacturing section of the service marketplace.

The PPM duration is important for the finish date of an activity. Together with the bucket offset, the PPM duration determines the material availability. The optimizer only takes the fixed resource consumption into account when the corresponding setting in the optimization profile is activated (discrete constraints: fixed consumptions: fixed material and resource consumption).

The minimum lot size is always the minimum lot size per BUCKET. This is also true for minimum transportation lot sizes. The lot size is always the lot size per BUCKET. This is also true for transportation lot sizes. The maximum lot size is always the maximum lot size per WORKING DAY. This is also true for maximum transportation lot sizes. Note that both the minimum and the maximum lot size in the PPM refer to the unit of measure of the location product, not to the unit of measure of the output component in the PPM. Note 503222 describes how order splitting can be activated: in APO 3.0 you have to use a /sapapo/copt10 parameter; in APO 3.1 and 4.0 you can customize this. Note that the SNP heuristic and CTM only use the min./max. lot size from the location product as min./max. lot sizes; the min./max. lot size from the PPM only determines the validity of an order. Example: if there is only one PPM (or one means of transport) with a minimum PPM (transportation) lot size of 5 and a maximum PPM (transportation) lot size of 10 to produce (transport) a specific product, and we have an order/demand of 2 or 20, respectively, then this order/demand cannot be satisfied, since no valid procurement source exists to produce/procure this product.

Note: rounding value, fixed lot size and lot-for-lot are only considered with discretization selected in the PPM and the corresponding setting in the optimization profile. The minimum lot size is only considered if the corresponding setting in the optimization profile (discrete constraints) is selected. The maximum lot size in the location product master is only considered for the determination of the planned production orders, i.e. the produced quantity is split into planned production orders of the size of the maximum lot size in the location product master (or less). In APO 3.0 this is only done if all PPMs of the selection have no fixed resource consumption. If you don't want this splitting at all, it can be switched off (see note 503222, Info on Optimizer Production Order Splitting). Recommendation for the maximum PPM lot size: if you have no specific maximum lot size for your PPM, don't enter a value like 9999999999 in this field. This could lead to floating-point precision problems during optimization, i.e. it could cause bad results and poor performance of the SNP optimizer. In this case we recommend a value that is around 10%-20% higher than the maximum that could be produced in one day. If a rounding value is used, the maximum lot size should be a multiple of the rounding value. Using the rounding value requires that lot-for-lot is activated in the location product master. Output quantity means the production (planned) as displayed in the interactive planning table. The planned production in the optimizer log file describes the number of PPM executions and can be different. Consulting notes: 503294 (Info on Optimizer Production Lot Size) and 448986 (Info on Optimizer Lot Sizes, a collection of related notes).

Note: If fixed lot size is maintained in location product master, then it overwrites the minimum lot size from PPM, i.e. PMINQ.

In both cases, the minimum and maximum lot sizes are not relevant, e.g. PMINQ = 0, PMAXQ = 1000. The examples illustrate the interpretation of the rounding value and the material consumption in the PPM. Note that in the first case the output rate (OUTIN) displayed in the input log is still equal to 7, but internally the optimizer calculates with an output rate of 11.

Definition of lot size profiles for transportation lanes: Supply Network Planning -> Environment -> Current Settings -> Profiles -> Define Supply Network Planning Lot Size Profiles (Transportation Lanes). Example for a discrete transportation method (= truck): demand at a DC: product A: 8 tons; product B: 8 tons; capacity of a truck: 10 tons. Solution without discretization: transportation of 16 tons. Solution with discretization: either a planned transport of 10 tons with 1 truck and a 6-ton backlog for one of the products (which product gets the backlog depends on the late delivery penalty and non-delivery costs), incurring transportation costs for one full truck; or 8 tons of product A and 8 tons of product B with 2 trucks, incurring transportation costs for 2 full trucks. The actual result depends on the cost and master data (late delivery penalty, non-delivery costs, storage costs, warehouse capacities, and so on).
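The effect of discretization in the truck example can be sketched as follows (names are illustrative; the real optimizer makes the truck-versus-backlog trade-off based on the penalty costs, which this sketch does not model):

```python
import math

# Continuous vs. discrete transport for the 10-ton-truck example above.
TRUCK_CAPACITY = 10.0

def trucks_continuous(tons):
    """Without discretization: fractional truck usage is allowed."""
    return tons / TRUCK_CAPACITY

def trucks_discrete(tons):
    """With discretization: only whole trucks can be used."""
    return math.ceil(tons / TRUCK_CAPACITY)

print(trucks_continuous(16))  # 1.6 "trucks" worth of capacity
print(trucks_discrete(16))    # 2 full trucks (or ship 10 t and backlog 6 t)
```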

Consulting note: 511782

To activate discrete transportation fleets, the corresponding setting in the optimizer profile has to be selected (-> transports). Additionally, the checkbox DiscrTrMet has to be selected on the corresponding transportation lane. To activate minimum lot sizes, the corresponding setting in the optimizer profile has to be selected (-> Transp lots). To activate the maximum lot size, the corresponding checkbox in the optimizer profile has to be selected. To get integer lot sizes, the following setting has to be entered in transaction /sapapo/copt10:
 Section: SNP
 Name: DISCRETETRANSPORTLOTHORIZON
 Switch: INTEGER
The column Capacity Consumption (TCONA) describes how much capacity is used by transporting the material on this lane. The available capacity can be found in table ET_FLEET, column TUNIA. The penalty costs for increasing the capacity are displayed in column INPEN. MAXFL describes whether the capacity is finite (-> X) or infinite. Additional entries in the input log:
 ET_FLEETC (fleet resource: normal capacity):
 MAXFL: capacity measured in trucks
 ET_FLEETUC (fleet resource: additional capacity):
 INCFL: upper bound for increasing the fleet capacity

Bucket offset (BO) controls the material availability in a bucket. Values between 0 and 1 can be entered. Its counterpart in the optimization profile is the 'rounding limit' for production (value: 0-100%). The bucket offset from the PPM is leading, i.e., if you have entered a value there, the rounding limit from the optimization profile is not considered. Additional example: if the bucket offset factor is set to 0.5 and the optimizer is run with monthly buckets: if the duration of the operation is less than 15 days (half the month), the material output from the operation is available for the subsequent operation in the same month. If the duration is more than half a month, the output is only available in the next monthly bucket. In the first scenario, the first and the following operation are scheduled in the same month. But note that since the optimizer works in buckets, both operations get a start date on the first of the month; they are not scheduled in sequence. Notes:
 Breaks (e.g. weekends) extend the length of a PPM duration.
 If the PPM exceeds the bucket length, the number of days in the bucket in which the PPM finishes is relevant for the calculation.
Consulting note: 434197
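The availability rule in the monthly example can be sketched like this (a simplification that assumes a fixed bucket length and ignores breaks; the function name is mine, and the behavior at exactly half the bucket is an assumption):

```python
# Sketch of the bucket-offset rule: with offset 0.5 and a 30-day monthly
# bucket, output is available in the same bucket only if the operation
# finishes within the first half of the bucket.

def output_bucket(start_bucket, duration_days, bucket_len_days, offset=0.5):
    """Return the bucket index in which the PPM output becomes available."""
    if duration_days <= offset * bucket_len_days:
        return start_bucket      # available in the same bucket
    return start_bucket + 1      # pushed to the next bucket

print(output_bucket(0, 10, 30))  # 0: 10 days <= 15, same monthly bucket
print(output_bucket(0, 20, 30))  # 1: 20 days > 15, next monthly bucket
```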

Definition of quota arrangements in master data (transaction /sapapo/scc_tq1). Optimizer profile settings:
 Multiple quotas: either Consider First or Consider Last. Defines how the optimizer in Supply Network Planning (SNP) proceeds if multiple quota arrangements exist for a particular period: you define whether the optimizer considers the first or the last quota arrangement in a period. The optimizer then ignores the remaining quota arrangements within this period.
 Not maintained: Treat as Zero or Ignore. Defines how the optimizer in Supply Network Planning (SNP) proceeds if quota arrangements have not been defined for all relevant sources of supply. Treat as Zero: the optimizer does not plan any stock transfers from the sources of supply for which no quota arrangements have been defined. Ignore: the optimizer plans stock transfers from all relevant sources of supply; quota arrangements are considered for those sources of supply where they have been defined.

Product-Group Assignment in Product Master data (Tab: Properties 2) Quota arrangement constrains the total receipt of all products within the group at the given location only.

All three methods arrive at an optimal solution. The main difference in the application of these methods may be the runtime. There is no general rule known for selecting the best method for a given problem (except testing each one of them). A good measure for the application is benchmarking on a test scenario because the optimal choice of the method depends mainly on the structure of the supply chain and less on the given input data. Therefore, in a productive environment, daily benchmarking is not necessary.

Window size defines the number of periods that are fixed after each single run. In the example above, it is set to 3. Standard settings / default values:
 window size is defined by the user (should be as large as possible)
 overlap = window size / 2 (non-integer values are rounded down)
 aggregated: the remaining periods are aggregated into one bucket
 Example: 52 buckets, window size = 26 => delta = 13, aggregated = 1 => a problem with 40 buckets is considered in the first optimization run
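Under the stated defaults (overlap = window size / 2 rounded down, remainder collapsed into one aggregated bucket), the 52-bucket example can be reproduced with a tiny sketch (the function name is mine, not APO's):

```python
# Sketch of the time-decomposition sizing rule for the first run.

def first_run_size(total_buckets, window_size):
    delta = window_size // 2                       # overlap, rounded down
    remaining = total_buckets - window_size - delta
    aggregated = 1 if remaining > 0 else 0         # rest becomes one bucket
    return window_size + delta + aggregated

print(first_run_size(52, 26))  # 26 + 13 + 1 = 40 buckets in the first run
```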

With transaction /sapapo/copt10 you can change these standard settings (section: SNP_TIMEDECO):
 parameter MinOverlap defines the overlap/delta (in the example above it is equal to 3)
 parameter DiscreteOverlap defines the discrete overlap (which is equal to 2 in the example). It has to be smaller than or equal to MinOverlap; if not, it is set equal to MinOverlap.
 parameter AggregationWindowSize defines how many of the remaining buckets are aggregated into one bucket (in the example = 4)
 SWITCH for all parameters: Integer
Don't change the default values without talking to development / expert optimization consulting.

DEMO

Planning window defined by the material flow: consider all resources, intermediate materials and activities that could be used to produce a given product
 reduction of complexity by considering only a subset of the materials, as well as only the subset of other entities that have to be considered for these materials (e.g. resources, transportation lanes)
 after each run for a sub-problem, capacities are reduced according to the individual solution, and these solutions are aggregated iteratively into a consistent solution for the whole problem
 window size represents the percentage of the model size considered in one sub-problem (single LP run)
 gliding window approach => overlaps between sub-problems (only a part of the solution of a specific sub-problem stays fixed)

Columns 1 and 2 are defined by the user, taken from the prioritization profile. Column 3 results from the bill of materials in the PPM (or PDS). From this BOM we get the atomic sub-problems.
Sub-problem 1 results e.g. from the fact that P2 and P3 are input components to produce the output component P1. The other sub-problems result from similar relationships.

Step 1: Column 4, i.e. the priority for the product decomposition, is derived from the priority of the sub-problems.
Step 2: For sub-problems with DIFFERENT priority, there is definitely NO overlap.
Step 3: For sub-problems with the SAME priority, the standard sorting is used. E.g., if a partitioning parameter (window size) of 99% is used, all sub-problems are solved together; if a window size of 0% is used, they are all treated separately.

The iteration limit (number of improvements) should only be used for test purposes: specify an iteration limit of 1 and solve the discrete model without a runtime limit with full search. That gives you an estimate of how long the optimizer takes to find the first feasible solution. Note: changes in the model and/or changes in the demand pattern and quantities may make a big difference to the time needed to solve the model. Then disable the iteration limit (set it to 0), use a reasonable runtime limit instead (take the time you measured during the tests with iteration limit = 1 and multiply it by e.g. 2), and solve the model.

Example: Take the Minimum PPM Lot Size into Account If you have chosen the discrete optimization method, you can specify in this field that you want the optimizer to take into account the minimum lot size (that was defined in the production process model (PPM)) when running the PPM. You can define the horizon for which you want this discrete constraint to be considered. You can either define a specific horizon (in days or weeks, for example), or specify whether the discrete constraint should be considered across all daily buckets, or across all daily and weekly buckets, and so on, based on the buckets defined in the planning buckets profile. The discretization horizon starts from today's date, even if the planning horizon starts in the future or the past. Dependencies  You must activate Discrete Optimization in the optimizer profile header data.  You must define the minimum lot size in the PPM.  If a larger minimum lot size has been defined in the location product master, the optimizer takes into account the value from the location product master, even if this indicator is activated.

Detailed discretization allows different restrictions for daily, weekly or monthly buckets (analogously for quarterly or yearly buckets). If the SNP time buckets consist of weeks and 'days' is chosen as detailed discretization, this choice is ignored in the optimization run and there is no discretization at all. Analogously for monthly SNP bucket sizes: if 'days' or 'weeks' are activated in detailed discretization, there are no discrete variables. Discretization in the input log:
 DISCM: discretization algorithm: K, P or V
 DITER: discr.: maximum number of iterations
 DLAUF: discr.: maximum runtime in seconds
 DTRUN: discr.: rounding limit for transportation variables
 DPRUN: discr.: rounding limit for production variables
 DISRF: end bucket for increasing prod. res. discretely
 DISRZ: end bucket for using fixed prod. res. consumption
 DISTR: end bucket for using a discrete fleet on lanes
 DISPR: end bucket for using a discrete PPM
 TRLOS: end bucket for using transport lots of material
 PRLOS: end bucket for using production lots
 COSTR: end bucket for using the transport cost function
 COSPD: end bucket for using the production cost function
 COSPC: end bucket for using the procurement cost function

Examples:
 10 D => the first 8 buckets are discrete, since days 8-10 fall into the first weekly bucket; afterwards no discretization
 3 Z => the first 21 days are discrete; afterwards no discretization
 7 Z => the first 49 days are discrete; afterwards no discretization
 3 I => the first 3 months are discrete; afterwards no discretization
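A minimal sketch of the unit arithmetic behind the week examples, assuming D = days and Z = weeks (the month case depends on the calendar, so it is left out; the names are mine):

```python
# Convert a discretization horizon to calendar days (days and weeks only).
UNIT_DAYS = {"D": 1, "Z": 7}

def horizon_in_days(value, unit):
    return value * UNIT_DAYS[unit]

print(horizon_in_days(3, "Z"))  # 21 days discrete, matching "3 Z" above
print(horizon_in_days(7, "Z"))  # 49 days discrete, matching "7 Z" above
```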

With strict prioritization you can assign the priority of safety stock to one of the three demand classes. This is only possible for LPs.

Calculation of profit simply means multiplying the fulfilled demands of all products by the corresponding non-delivery costs and subtracting the costs from this value. Example: one product with non-delivery costs of 10,000 APO-$ and a demand of 100. The cost of producing, transporting and storing this product is 50,000 APO-$ (this is the result you get if calculation of profit is not activated). The profit is then 1,000,000 - 50,000 = 950,000 APO-$.
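The profit arithmetic above as a sketch (names illustrative):

```python
# Fulfilled demand valued at its non-delivery cost, minus the plan's costs.

def profit(fulfilled_demands, total_costs):
    """fulfilled_demands: list of (quantity, non_delivery_cost) pairs."""
    revenue = sum(qty * ndc for qty, ndc in fulfilled_demands)
    return revenue - total_costs

# 100 units at non-delivery cost 10,000 APO-$, plan costs 50,000 APO-$:
print(profit([(100, 10_000)], 50_000))  # 950000
```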

Average Stock on Hand: the optimizer calculates the storage costs during planning by multiplying the following values together:
 stock on hand at the end of the bucket (period)
 storage costs defined for the location product in the product master (Procurement tab page)
 number of days in the bucket
Stock on Hand at End of Period: the optimizer calculates the storage costs during planning by multiplying the stock on hand at the end of the bucket (period) by the storage costs defined for the location product on the Procurement tab page in the product master.
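Both modes as a sketch (names illustrative; per the text, the first mode additionally multiplies by the bucket length in days):

```python
# The two storage-cost modes described above.

def storage_cost_average(end_stock, cost_per_unit, days_in_bucket):
    """'Average Stock on Hand': also multiplied by the bucket length."""
    return end_stock * cost_per_unit * days_in_bucket

def storage_cost_end_of_period(end_stock, cost_per_unit):
    """'Stock on Hand at End of Period': bucket length ignored."""
    return end_stock * cost_per_unit

# Weekly bucket, 40 units on hand, storage cost 0.5 per unit:
print(storage_cost_average(40, 0.5, 7))    # 140.0
print(storage_cost_end_of_period(40, 0.5)) # 20.0
```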

SAP TUTOR: OPT_INCREMENTAL_DEPENDENT_DEMAND.SIM

Result:  planned distribution receipt/demand at DC-03/Plant-03  planned production at Plant-03

Dependent Demand cannot be fulfilled due to production/stock transfer horizon for input components (8 weeks).

If 'pseudo-hard' is activated, the optimizer does consider shortages of the non-selected input product. However, the resulting negative stock on hand in a bucket is set to zero; there is no further consideration of this backlog in subsequent periods.

By creating a purchase requisition, the optimizer does not recognize that there is a source available in the APO model. Additionally, (potential) resource constraints are not considered if a purchase requisition is created.

Consideration of Priorities
 You can define priorities from 1 to 4 for the three priority classes of the demand and the safety stock. 1 is the highest, 4 the lowest priority. You can also assign the same priority to two or more priority classes. The standard setting is that all priority classes and the safety stock have the same priority.
Consideration of Priority of Location Products
 If you set this indicator, the system considers not only the priority of the priority classes of the demand, but also the priority of the location products. You define this priority in the master data of the location product on the SNP 2 tab page. To simplify this combination of both priority types, you must also subdivide the product priorities into three product classes, A, B and C. The system considers the product priority together with the demand priority. You can define which priority is more important and should be considered by the system first by setting the 'Prio. of Loc. Prod. more important than Prio. of Prio. Classes' indicator.
Consideration of Procurement Priority of PPMs/PDS
 If you set this indicator, the system considers the procurement priorities of production process models (PPMs) or production data structures (PDS). You define this priority in the master data for PPMs or PDS.
Consideration of Procurement Priority of Transportation Lanes
 If you set this indicator, the system considers the procurement priorities of transportation lanes. You define this priority in the transportation lane master data in the Product-Specific Transport section.
Consideration of Costs of Products without Input Products
 To calculate the product value, the system automatically uses the value 1 as the value of raw products (products without input products). If you set this indicator, the system bases the calculation on the actual storage costs instead. You define the storage costs in the master data of the location product on the Procurement tab page.

Optimizer should not transport or produce without demand  No transport should be activated due to cheaper target location cost  No production should be activated due to cheaper output location-product cost  No transport should start to save storage cost on the truck  No production should start to save storage cost by production in progress

Consistency Check Mode
 No Checks: no checks are made. We do not recommend that you choose this option. Switching off the check function does not improve performance and can even lead to termination of the optimizer if there are data errors.
Maximum Number of Messages per Consistency Check
 Restricts the number of messages that are issued per consistency check. Once this maximum number is reached, the system displays an additional message specifying how many more messages from this consistency check were suppressed (not shown). Example: you have restricted the messages per consistency check to 25. Your model contains 30 production process models for which you have not maintained costs. You receive 25 messages "Production model PA at location LA: are costs ok?" (number 149) plus another message "5 messages with number 149 have been suppressed".
Write All Log Data
 If you activate this indicator, the log data for an optimization run is stored. You can view this data in the Optimizer Log Data transaction (Supply Network Planning -> Reporting -> Optimizer Log Data). If you do not activate this indicator, you will notice an improvement in performance. However, it is then only possible to view the SNP optimizer messages directly after the SNP optimization run; upon leaving the transaction, these messages are lost permanently.

Note: In Customizing for Supply Network Planning, in the Maintain Global Settings for the SNP Optimizer IMG activity, you must enter a positive value in the Number of Log Entries field to make it possible for log entries to be stored. If you deactivate the indicator, the SNP optimizer is not able to take into account any SNP optimization bound profiles. Additionally, you cannot release the corrected demand forecast from Demand Planning to Supply Network Planning.

During optimization, various constraints are taken into consideration, e.g., the capacity constraints as defined in resources. However, sometimes more flexible definition of constraints is required. For example, the quantity of parts that can be supplied by a vendor may vary from period to period. In such cases, we need to use time series key figures to define restrictions. In APO 4.0, there are 4 additional time series key figures for production, procurement, storage, and transportation upper bound. These key figures (9APRODUB, 9APROCUB, 9ASTORAGEUB, and 9ATRANUB) can be used to define time bucket dependent constraints for production, procurement, transportation, and storage. You can specify the time-based key figures in the standard SNP planning book 9ATSOPT with data view OPT_TSB (standard SNP planning area: 9ASNP04) or in a planning book that you created yourself. The main grid of the data view is the same as that in the SNP94(1) view of SNP standard planning book 9ASNP94. The second grid of this data view include key figures 9APRODUB, 9APROCUB, 9ASTORAGEUB, and 9ATRANUB. Zero-Indicator indicates whether an entry in the upper bound KF should be interpreted as “0” (enter then an e.g. “1” in the corresponding column) or whether it is empty (ignored). Only relevant upper bounds are displayed during interactive planning. For example, if location product is selected, only procurement and storage upper bounds are displayed; if PPM is selected, only production upper bound is displayed; if transportation lane is selected, only transportation upper bound is displayed. The SNP optimizer then takes these constraints into consideration when calculating the optimal solutions. The “Ignore Time-Based Constraints” Indicator in the Optimizer Profile should be set if you want to use an SNP optimization bound profile for the SNP optimization run, for example. 
If you use the time-based constraints alongside the bounds from the optimization bound profile, the two types of constraints might contradict each other. If this occurs, the SNP optimizer cannot find a feasible solution.

Requirement: For each location-product combination, a time-dependent maximum and minimum stock level can be used. If these levels are violated, penalty costs should be taken into account within the SNP optimization. For the time-based minimum stock level (which is effectively a safety stock requirement), the existing key figure SAFTY can be used; additionally, you can define a key figure to model the costs of violating the minimum stock level. Moreover, you can specify the stock upper bound and any penalty costs for violating this upper bound as time-based key figures. This can be done in the standard SNP planning book 9ATSOPT (standard SNP planning area 9ASNP04) or in a planning book that you created yourself. In the optimizer profile (Extended Settings tab), you can specify whether the time-based maximum stock level is a soft or a (pseudo-)hard constraint, that is, how the SNP optimizer takes into account a time-based stock upper bound that you may have set in interactive Supply Network Planning. You have the following options:  If you set the indicator, the optimizer regards the stock upper bound as a soft constraint: it can be violated, but doing so incurs penalty costs. You define these penalty costs in a time-based key figure in interactive Supply Network Planning.  If you do not set the indicator, the optimizer regards the stock upper bound as a pseudo-hard constraint: it can be violated, but only at infinitely high penalty costs.
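The soft versus pseudo-hard distinction can be sketched as follows (a minimal illustration with assumed cost semantics, not SAP code):

```python
# Soft bound: finite penalty per unit of violation.
# Pseudo-hard bound: violation possible, but at infinite penalty cost.
def violation_cost(stock, upper_bound, penalty_per_unit, soft):
    excess = max(0.0, stock - upper_bound)  # units above the bound
    if excess == 0.0:
        return 0.0
    return excess * penalty_per_unit if soft else float("inf")

print(violation_cost(120.0, 100.0, 5.0, soft=True))   # 100.0
print(violation_cost(120.0, 100.0, 5.0, soft=False))  # inf
print(violation_cost(90.0, 100.0, 5.0, soft=True))    # 0.0
```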

Mass maintenance of key figures: transaction /SAPAPO/TSKEYFMAIN.
Optimizer profile -> Extended Settings: the "Ignore Time-Based Constraints" checkbox must NOT be activated if receipt bounds are to be considered.

The optimization bound profile is an optional profile that you can use for subsequent optimization runs. Using the resulting values of all decision variables (production quantity, transportation quantity, procurement quantity, storage quantity, or delivery quantity) from an initial optimization run as your basis, you can set maximum and minimum limits per bucket within which particular decision variables are recalculated in subsequent optimization runs.

Period - Choose the type of period (day, week, or month) for which the bounds should be valid.
Number - Enter the number of periods for which the bounds should be valid.
Upper Limit - Set this flag if the maximum percentage deviation of the decision variable should be taken into account.
Deviation (+%) - Enter a percentage used to calculate the maximum upward deviation of all decision variables.
Lower Limit - Set this flag if the minimum percentage deviation of the decision variable should be taken into account.

Deviation (-%) - Enter a percentage used to calculate the maximum downward deviation of all decision variables.
Basic value - Choose a basic value (average, maximum, or minimum) to be used if the resulting value of the decision variable in a particular bucket from the initial optimization run was zero. The basic value is derived from the previous optimization run: depending on your settings, it is taken from the bucket with the largest or smallest quantity of the decision variable (MAX/MIN), or from the average of all buckets that contain non-zero values for the relevant decision variable (AVG).
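A hypothetical sketch of how the profile fields above could yield per-bucket limits. The field names follow the text, but the exact formula the optimizer applies is not given there, so this is an assumption:

```python
# Per-bucket limits from a bound profile (assumed calculation):
# the previous run's quantity (or the AVG/MAX/MIN basic value, if that
# quantity was zero) is scaled by Deviation (-%) and Deviation (+%).
def bucket_bounds(prev_qty, dev_plus, dev_minus, basic_value):
    """Return (lower, upper) limits for one bucket of a subsequent run."""
    base = prev_qty if prev_qty > 0.0 else basic_value
    lower = base * (1.0 - dev_minus / 100.0)
    upper = base * (1.0 + dev_plus / 100.0)
    return lower, upper

print(bucket_bounds(200.0, 50.0, 25.0, basic_value=100.0))  # (150.0, 300.0)
print(bucket_bounds(0.0, 50.0, 25.0, basic_value=100.0))    # (75.0, 150.0)
```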

Prerequisite: Executing the SNP optimizer in the background: /sapapo/snpop. Quota arrangements can be checked with transaction /sapapo/scc_tq1. Validity date of quotas = end time of planned transports.

Inbound quotas are used for SNP Heuristics, outbound quotas are used for Deployment Heuristic.

Execution of the optimizer in the background  usage of a data view  definition of planning start and end dates. Start dates before the start date of the data view are shifted to the start date of the data view; end dates in the "Planning end date" field after the end date of the data view are shifted to the end date of the data view. If the planning start date is not at the start of a bucket, it is shifted to the start of the bucket in which it falls, meaning that the SNP optimizer takes the complete bucket into account. The same is true for the planning end date: it is always shifted to the end of the bucket in which it falls. The "Stock on hand" quantity is taken from the initial column or, if the planning start date is greater than the start date of the data view, from the column before; this is the only value that is considered. If you need to consider orders that fall outside of this data view horizon, you must choose or create a data view whose horizon suits your optimization needs. Negative quantities (backlogs) are not transferred to the SNP optimizer, because the origin of these quantities is not known: it is not known whether the quantity stems from dependent demand, a forecast, or a sales order, nor how old the backlog is or whether it would be allowed to fulfill the forecast or the sales order.
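The shifting of the planning start and end dates to bucket boundaries can be sketched as follows, assuming weekly (Monday to Sunday) buckets purely for illustration:

```python
# Expand a planning horizon to complete buckets: the start date moves
# back to the start of its bucket, the end date forward to the end of
# its bucket. Weekly Monday-Sunday buckets are assumed here.
from datetime import date, timedelta

def shift_to_bucket(start, end):
    bucket_start = start - timedelta(days=start.weekday())  # back to Monday
    bucket_end = end + timedelta(days=6 - end.weekday())    # forward to Sunday
    return bucket_start, bucket_end

# A Wednesday-to-Thursday horizon becomes full Monday-to-Sunday weeks:
print(shift_to_bucket(date(2017, 2, 1), date(2017, 2, 16)))
# (datetime.date(2017, 1, 30), datetime.date(2017, 2, 19))
```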

If you are running the optimizer, it is always useful to read the optimizer log files.
Log files:  Supply Network Planning -> Reporting -> Optimizer Log  /sapapo/snpoplog  consulting notes 454194 and 509732 -> general information about log files.
Input log: For the interpretation of the input log, check the documentation for a description of its sections and fields. The input log is very helpful for debugging whether the problem lies in the master data setup or in the interpretation of the master data. A description of the log is included in the APO documentation, e.g., APO 3.0 (Help -> SAP Library): Supply Network Planning -> Supply Network Planning Process -> Supply Network Planning Methods -> Optimization Planning -> Optimizer Log Data; SCM 4.1 (Help -> SAP Library): Supply Network Planning -> Supply Network Planning Run -> Optimization Based Planning -> Application Logs for Optimizer -> Optimization Input Log.
Output log (results in a text file): If you think the optimizer did not give you the expected results, it is also a good idea to review the results log. Sometimes the optimization results are correct and it is the translation to liveCache that is incorrect; that usually requires an OSS note to correct, but checking the log at least saves you the effort of trying to figure out which costs or parameters to change.

Shadow prices are opportunity costs indicating how much the objective function value would change if the corresponding resource capacity were modified by one unit. Reduced costs can be interpreted as follows: if a variable has a solution value of zero, its reduced cost indicates how much this variable is overpriced (in a minimization problem) compared to other variables. Neither piece of information can be interpreted when discretization or decomposition methods are used.
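A tiny worked example of a shadow price, reusing the two-product profit function from the introduction (generic LP reasoning, not SAP code): with a single shared capacity constraint, the optimum puts all capacity into the higher-margin product, so one extra unit of capacity is worth exactly that margin.

```python
# Maximize 100*x1 + 400*x2 subject to x1 + x2 <= cap, x1, x2 >= 0.
# The closed-form optimum assigns all capacity to the $400 product,
# so the shadow price of the capacity constraint is 400.
def optimal_profit(cap):
    margin_a, margin_b = 100.0, 400.0
    return max(margin_a, margin_b) * cap

def shadow_price(cap):
    """Objective change per extra unit of capacity."""
    return optimal_profit(cap + 1) - optimal_profit(cap)

print(shadow_price(50.0))  # 400.0: one extra capacity unit is worth $400
```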

1. Step (cost analysis)  increase of penalty costs that are too low
2. Step (upper limit analysis and capacity analysis)  relaxation of upper limits (bounds) and then relaxation of capacity constraints
3. Step (lead time analysis - horizons)  relaxation of the production and stock transfer horizons as well as the planned delivery time
4. Step (product availability analysis)  for all location products without a source, a dummy source is created
5. Step (lead time analysis - past)  the planning horizon is ignored, i.e., orders can be created in the past. These are then transferred to the first bucket of the planning horizon.

1. Step (upper limit analysis and capacity analysis)  relaxation of upper limits (bounds) and then relaxation of capacity constraints
2. Step (lead time analysis - horizons)  relaxation of the production and stock transfer horizons as well as the planned delivery time
3. Step (cost analysis)  increase of penalty costs that are too low
4. Step (product availability analysis)  for all location products without a source, a dummy source is created
5. Step (lead time analysis - past)  the planning horizon is ignored, i.e., orders can be created in the past. These are then transferred to the first bucket of the planning horizon.

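The staged analysis can be sketched as a loop that tries the relaxations in order and stops at the first one that makes the model feasible (step names taken from the list above; the function is illustrative, not SAP code):

```python
# Illustrative sketch of the staged infeasibility analysis: each step
# relaxes one class of constraints; the process stops at the first
# relaxation that yields a feasible model.
RELAXATIONS = [
    "relax upper limits and capacity constraints",
    "relax production/stock transfer horizons and planned delivery times",
    "increase penalty costs that are too low",
    "create dummy sources for location products without a source",
    "ignore the planning horizon (orders may be created in the past)",
]

def first_feasible_step(becomes_feasible):
    """Return the 1-based step at which the model first becomes feasible."""
    for step, relaxation in enumerate(RELAXATIONS, start=1):
        if becomes_feasible(relaxation):
            return step
    return 0  # still infeasible after all relaxations

# Example: assume relaxing the horizons resolves the infeasibility.
print(first_feasible_step(lambda r: "horizons" in r))  # 2
```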

Step 11 has to be finalized 4 to 6 weeks ahead of the planned go-live. Steps 7-11 typically take about 4 weeks of model tuning to find the best balance between model accuracy and solution quality. If you encounter long data collection and result writing times, check whether all available performance notes concerning data collection and writing have been implemented. For this purpose, see consulting note 485018, where all relevant notes are collected.

Rule-based means the usage of a specific heuristic (root heuristic). This heuristic is available in the APO 3.1 standard.

Optimizer Server:  IBM Netfinity 8500 Server  4 CPU processors (700 MHz each)  4.0 GB main memory APO: SP 19

Rule of thumb for data collection time: 0.3 seconds per location product (as long as the number of PPMs is not considerably larger than the number of location products). Optimizer Server:  Windows NT Enterprise  2 CPU processors (Pentium 4, 1.2 GHz)  3.0 GB main memory
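The rule of thumb translates into a quick back-of-the-envelope estimate (illustrative only):

```python
# Estimate data collection time from the rule of thumb: ~0.3 s per
# location product (valid while the number of PPMs is not much larger
# than the number of location products).
def estimated_collection_seconds(location_products, sec_per_lp=0.3):
    return location_products * sec_per_lp

secs = estimated_collection_seconds(10000)
print(round(secs / 60))  # ~50 minutes for 10,000 location products
```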

APO: SP16; selected OSS notes for optimizer

Optimizer Server:  2 CPU processors (700 MHz)  2.0 GB main memory APO: SP16; selected OSS notes for optimizer

Optimizer Server:  HP NetServer LC 2000  Windows NT Enterprise  2 CPU processors  4.0 GB main memory APO: SP19

In APO 3.0 (SP25 and higher), APO 3.1 (SP14 and higher), and APO 4.0, the system steps of optimization are introduced with messages in the message log (in APO 3.x you can also implement note 594415 to receive the messages).
Step 1 is introduced with the message "Step 1: Data read and model creation started at time on date". All subsequent messages in the message log are then related to step 1. Step 1 consists of reading the data and building the model. The model is stored in several tables; these tables can be modified in a BAdI before step 2 is started. A description of the BAdI can be found in note 542145 "Info on BAdI for modifying the Optimizer Input and Output". After the BAdI call, the input log file is written. All tables in which the model is built up are described in note 509732 "Infos on the SNP optimizer input and result log file". The input log file is the basis for the calculation of a solution and is stored as a text file in the database.
Step 2 is introduced with the message "Step 2: Model consistency check and solution calculation at time on date", and all subsequent messages are then related to this step. First, a model consistency check is made, in addition to the creation of a linear program or a mixed integer program. Besides the tables from the input log, the optimizer profile and other control settings are also taken into account for the solution calculation. The runtime specified in the optimizer profile relates only to the duration of the solution calculation in step 2, and refers to CPU time rather than stopwatch time.
Step 3 is introduced with the message "Step 3: Order creation started at time on date". Here, the results from step 2 are used as the basis for order creation. All subsequent messages in the message log are then related to step 3. The result can be modified in BAdI method ACCESS_RESULT_LOG before the orders are created. The result log is written after the BAdI call and stored in the database as a text file.
The result log is the basis for order creation; for this reason, the BAdI is called before the result log is written. One or more corresponding orders are created in the liveCache database for each procurement quantity, transportation quantity, or production quantity per bucket and location product in the result log. Note 448986 "Info on Optimizer Lot Sizes" describes how many orders are created. For information about improving the performance of steps 1 and 3, see note 485018 "Info on the Performance of the Optimizer"; for step 2, see note 454433 "SNP Optimizer Profile - Discretization End Bucket".

ABAP programs run in a special runtime environment: the SAP Basis or its successor, the SAP Web Application Server (Web AS). Steps 1 and 3 are written in ABAP and run on the Web AS. Programs written in C++ run directly on the operating system; the optimization engine is written in C++ and therefore runs directly on the operating system, not on the Web AS. The same is true for the liveCache database: it is not part of the Web AS and also runs as an application directly on the operating system.
In step 1, the "old" non-fixed orders of the selected location products are deleted in the liveCache database. This is necessary so that the available capacity at the resources can be determined correctly; if the old orders were not deleted, they would consume capacity on their respective resources. The reading of demands in step 1 is done in up to 6 parallel processes, but only if optimization is started in the background. Here, you can set the number of tasks in which it should be executed.
The optimization engine receives the data from step 1 on the Web AS. It performs a model consistency check and calculates a solution. The solution is received on the Web AS, where step 3 uses the result for order creation in the liveCache database. The input and result logs are written in an extra task; this task is performed either in parallel to the optimization or afterwards, a decision made by the Web AS based on the system load.
The different programming languages of the steps also result in different components in OSS: steps 1 and 3 belong to component SCM-APO-SNP-OPT and step 2 belongs to component SCM-APO-OPT-SNP. All messages of step 2 belong to message classes /sapapo/snpopt or /sapapo/opt; if a message does not belong to one of these message classes, it is not a message of step 2.
If there are any network or RFC function call problems, a message appears in a message box or in the job log with the message “Optimizer: …” (message class /sapapo/snp 505).

For more details check note 587407.

Easy = capacity constraints
Difficult = discrete constraints, e.g. (minimum) lot sizes and stock levels
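Why discrete constraints are harder can be seen from a minimum lot size: the feasible set for the lot quantity is no longer an interval but "either 0 or at least the minimum", which cannot be expressed in a pure LP (illustrative sketch):

```python
# A minimum lot size makes the feasible region non-convex: the lot
# quantity must be either exactly 0 or at least the minimum, which
# requires a binary decision (hence mixed-integer programming).
def feasible_lot(qty, min_lot):
    return qty == 0.0 or qty >= min_lot

print(feasible_lot(0.0, 50.0))   # True: no production at all
print(feasible_lot(30.0, 50.0))  # False: between 0 and the minimum lot
print(feasible_lot(80.0, 50.0))  # True: above the minimum lot
```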

SAP TUTOR: SCM41_OPT_FAILSafe.SIM

Search for consulting notes in CSN with components 'APO-SNP-OPT' and 'APO-OPT-SNP'.

Consideration of setup times to create feasible production plans w.r.t. capacity utilization

An LP solution with capacity reduction via a loss factor (in the resource master data) can already be sufficient if:  setup times are similar for each product  setup times are comparably low relative to the bucket capacity => a global loss factor is sufficient for mid-term/aggregated SNP planning
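A sketch with illustrative numbers: the loss factor simply reduces the bucket capacity available for production, instead of modeling discrete setups:

```python
# Capacity reduction via a global loss factor (illustrative numbers):
# a share of each bucket's capacity is reserved for setups up front.
def effective_capacity(bucket_capacity_h, loss_factor):
    """Hours left for production after the setup allowance."""
    return bucket_capacity_h * (1.0 - loss_factor)

print(effective_capacity(160.0, 0.25))  # 120.0 hours per bucket
```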

The figures represent the PP/DS view, i.e., the situation after releasing the SNP planned orders to PP/DS. Actually, the SNP view consists of peaks at the beginning of each period. Good results (figure 1):

if setup costs are small in comparison to storage costs, i.e. lots are small, since setups can occur more frequently (see bucket 1)  if setup consumption is significantly smaller than the entire available capacity in a bucket (see buckets 2 and 3)

bad results (figure 2) 

if setup costs are high in comparison to storage costs, i.e., lots are large (this does not have to be a serious problem: if setup costs really are high, this is what should happen)  significant problem: setup consumption is large compared to the bucket capacity. Actually, the setups in buckets 2 and 3 are not necessary, since a setup has already been planned in bucket 1.

•Restriction: at most one setup per bucket is possible! •Lot size restrictions: • minimum lot sizes: the campaign optimizer respects minimum campaign quantities • the campaign quantity can be an integer multiple of a fixed lot size (discrete campaign quantity). Note: cost functions on campaign quantities can also be used as an additional constraint.
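Rounding to a discrete campaign quantity can be sketched as follows (illustrative helper, not the optimizer's internal algorithm):

```python
# Enforce a minimum campaign quantity and round up to an integer
# multiple of a fixed lot size (discrete campaign quantity).
import math

def discretize_campaign(qty, min_qty, fixed_lot):
    if qty <= 0.0:
        return 0.0                      # no campaign at all
    qty = max(qty, min_qty)             # respect the minimum quantity
    return math.ceil(qty / fixed_lot) * fixed_lot

print(discretize_campaign(130.0, min_qty=100.0, fixed_lot=50.0))  # 150.0
print(discretize_campaign(40.0, min_qty=100.0, fixed_lot=50.0))   # 100.0
```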

No sequence-dependent setups  setup consumption does not depend on the sequence of the products  no setup matrix as in PP/DS  only one fixed resource consumption per setup, independent of the sequence of products!

For each mode of a PP/DS PPM, an SNP PPM is generated -> necessary to receive the setup status from PP/DS orders

Cross-Period Lot Size Planning: The optimizer takes into account setup statuses resulting from already planned PP/DS orders (and fixed SNP planned orders). Here, an SNP PPM can be executed in a bucket (period) without incurring a setup requirement if, within this bucket, there is a PP/DS order that was created with the associated PP/DS PPM, or if the corresponding setup status can be adopted from the previous period.  The optimizer does not take this indicator into account if you have not entered a value in the cross-period lot size field (Discrete Constraints tab page).  The indicator is only valid for resources that have been marked with the cross-period lot size indicator.  The corresponding PP/DS PPM must have been assigned to the SNP PPM in the PPM master data.
Lot Size Planning, Not Cross-Period: The optimizer also takes into account setup statuses resulting from already planned PP/DS orders when cross-period lot size planning is not used. Here, a PPM can be executed in a bucket without incurring a setup requirement if an operation from the same PPM already exists within this bucket, independently of the required resource. The optimizer is able to schedule additional setup operations alongside the ones already existing from PP/DS within one bucket.  The optimizer only takes this indicator into account if you have entered a value in the Fixed Material and Resource Consumption field (Discrete Constraints tab page).  The SNP PPM must be assigned to the corresponding PP/DS PPM in the PPM master data.
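The setup carry-over logic described above can be sketched as follows (hypothetical function and data structures, not SAP code):

```python
# Cross-period setup carry-over (assumed semantics): a setup is required
# unless a matching PP/DS order already exists in this bucket, or the
# setup status for the same product was carried over from the previous
# bucket.
def needs_setup(bucket, product, ppds_orders, carried_setup):
    if (bucket, product) in ppds_orders:
        return False                      # PP/DS order provides the setup
    if carried_setup.get(bucket - 1) == product:
        return False                      # setup status carried over
    return True

print(needs_setup(2, "A", {(2, "A")}, {}))   # False: PP/DS order exists
print(needs_setup(3, "A", set(), {2: "A"}))  # False: carried over from bucket 2
print(needs_setup(4, "B", set(), {3: "A"}))  # True: new setup needed
```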
