Calculation level for IO Forecast error CV


Hi Experts,

We have the following problem when calculating forecast error CV for IO.

  • The forecast is at the monthly level, so we get very high errors if we compare it against actual sales at the weekly level.
  • Locations are interchangeable. We can plan to sell a product from a particular customer-facing location, but this location can easily change to another with no big impact on the business. So if we calculate the forecast error CV at the location level, we can expect very high errors.
  • The products are interchangeable as well. We can plan for product A but easily end up selling product B instead. We could group both products into a family, but there are still some reasons to maintain both products separately.

So I'm thinking of calculating the forecast error CV at the Monthly - Product Family - Customer Group level (without location), and copying that value to the standard level, which is Weekly - Product - Location - Customer Group. I can plan for product A to be sold from location 1, but in the end product B may be sold from location 2; that's perfectly fine for the business, yet the forecast error at the standard level would be huge.
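To make it concrete, here is a rough sketch of what I have in mind (my own illustration in pandas with invented column names and numbers, not the IBP operator; I'm assuming "CV = std of the aggregated error / mean actuals" as the definition, which may differ from the system's formula):

```python
import pandas as pd

# Detailed history at the standard level: we planned product A from LOC1,
# but the business actually sold product B from LOC2 (invented numbers).
detail = pd.DataFrame({
    "week":           ["2024-W01", "2024-W01", "2024-W05", "2024-W05"],
    "month":          ["2024-01",  "2024-01",  "2024-02",  "2024-02"],
    "product":        ["A", "B", "A", "B"],
    "product_family": ["FAM1"] * 4,
    "location":       ["LOC1", "LOC2", "LOC1", "LOC2"],
    "customer_group": ["CG1"] * 4,
    "forecast":       [100.0, 0.0, 100.0, 0.0],
    "actual":         [0.0, 95.0, 0.0, 110.0],
})

# Aggregate forecast and actuals to month / product family / customer group,
# then measure the error at that level (no location).
agg_keys = ["month", "product_family", "customer_group"]
agg = detail.groupby(agg_keys, as_index=False)[["forecast", "actual"]].sum()
agg["error"] = agg["actual"] - agg["forecast"]

# One CV per product family / customer group across the monthly buckets.
cv = (agg.groupby(["product_family", "customer_group"])
         .agg(err_std=("error", "std"), mean_actual=("actual", "mean"))
         .reset_index())
cv["error_cv"] = cv["err_std"] / cv["mean_actual"]

# Copy the aggregate CV back down to the standard weekly / product / location level.
detail = detail.merge(cv[["product_family", "customer_group", "error_cv"]],
                      on=["product_family", "customer_group"], how="left")
print(detail[["week", "product", "location", "error_cv"]])
```

At the aggregate level the error CV comes out small (we sold roughly what we planned, just a different product from a different location), while the same history evaluated at the standard level would show huge errors.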

What do you think, experts? Does this make sense to you? What potential drawbacks could arise?

Thanks,

Héctor

Accepted Solutions (1)

Irmi_Kuntze
Advisor

Hi Héctor,

In regards to aggregation on product and location: if you find proper groupings (e.g. the good old planning products from ECC, where the optimizer may substitute between the single SKUs within one of those grouped products), that seems to be a valid approach. The grouped products should share similar semi-finished products.

In regards to the timing, I am not 100% convinced yet that calculating on the monthly bucket is the best solution; it depends on your business ("depends" is a nice placeholder for "I have no clue"...). You somehow need to break the forecast down to the weekly level for inventory in any case, and you need enough buffer in case the sales come in early, meaning your error CV might be a bit too low if calculated on the monthly level and way too high if calculated on the weekly level...

Honestly, I don't know what might be your "best". Maybe run both in parallel and compare the results, or choose some kind of weighting between the two, such as calculating both and taking 2/3 from the monthly and 1/3 from the weekly calculation (ok, maybe that is just a very stupid idea, just thinking out loud...).
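Just to make that weighting idea concrete (the weights and the two input CVs below are placeholders, nothing more):

```python
# Blend a CV calculated on monthly buckets with one calculated on weekly
# buckets. The 2/3 : 1/3 split and the input values are only placeholders.
def blended_error_cv(cv_monthly: float, cv_weekly: float,
                     w_monthly: float = 2 / 3, w_weekly: float = 1 / 3) -> float:
    return w_monthly * cv_monthly + w_weekly * cv_weekly

print(blended_error_cv(cv_monthly=0.15, cv_weekly=0.60))  # -> 0.30
```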

But it is a very interesting use case. Please, if you find your "best" solution, post it here 🙂


Thanks Irmhild,

About the products and locations: they are all active objects. We are talking about pulp cellulose; each production line tries to produce the same product, but the result differs slightly (some can have better quality). Some customers may prefer one over another, but most of them are totally indifferent, and yes, they use the same ingredients, so that wouldn't be a problem.

About the time bucket problem, I'm thinking of simply disaggregating the forecast based on the sales history; that could be a better approach.
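Something along these lines, as a rough sketch with invented numbers (not the standard IBP disaggregation operator, just to show the idea):

```python
# Disaggregate a monthly forecast to weeks using the historical weekly share
# of sales as the split factor (invented numbers, purely illustrative).
monthly_forecast = 400.0

# Historical sales per week of the month, used to derive weekly weights.
history = {"W1": 20.0, "W2": 40.0, "W3": 60.0, "W4": 80.0}
total_history = sum(history.values())

weekly_forecast = {week: monthly_forecast * qty / total_history
                   for week, qty in history.items()}
print(weekly_forecast)  # {'W1': 40.0, 'W2': 80.0, 'W3': 120.0, 'W4': 160.0}
```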

Regards,

Héctor

Answers (1)

Irmi_Kuntze
Advisor

For me that does make sense - for evaluation purposes.

Just, if you run inventory on the weekly level and you have the forecast, and with it the error CV, on the monthly level, your error CV is too low and the recommended safety stock will end up lower than you may need.
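To show the direction of that effect, here is a textbook single-stage approximation (explicitly not the IBP multi-echelon calculation, and the numbers are invented): safety stock grows roughly in proportion to the error CV, so an understated CV directly understates the recommended stock.

```python
# Textbook single-stage safety stock approximation, only to illustrate how the
# recommendation scales with the error CV (not the IBP inventory optimizer).
from statistics import NormalDist

def approx_safety_stock(service_level: float, error_cv: float,
                        mean_weekly_demand: float, lead_time_weeks: float) -> float:
    z = NormalDist().inv_cdf(service_level)          # service level factor
    sigma_week = error_cv * mean_weekly_demand       # weekly demand std dev implied by the CV
    return z * sigma_week * lead_time_weeks ** 0.5

print(round(approx_safety_stock(0.95, 0.20, 100.0, 4.0), 1))  # CV from monthly buckets -> ~65.8
print(round(approx_safety_stock(0.95, 0.50, 100.0, 4.0), 1))  # CV from weekly buckets  -> ~164.5
```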

And if you calculate the error CV at the aggregate level of product group and without location, that can be even worse. Do you use interchangeability with the SOP optimizer in your scenario? Have you set up ATP with product and location substitution? Leaving out the location completely does not seem like a good idea if you expect a customer in the US to be delivered from Australia because your stock is at the wrong DC; that would make the whole optimization based on service levels obsolete. Having some kind of grouping of locations, on the other hand, could make sense, depending on your overall scenario.

On the other hand, even if you have a high error CV, that is not the only influencing factor. If a customer can be delivered from other DCs and you have set up your data properly, you will see the resulting costs with your inventory run.


Thank you very much for your response!

About the FC error CV on a weekly basis: the forecast will be disaggregated based on WEEKWEIGHT, but the sales can be concentrated in the last week, for example, so at the weekly level the error CV would be high if we have few or no sales in the first three weeks.
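With invented numbers, the effect I mean looks like this (equal WEEKWEIGHT split, all sales landing in the last week):

```python
# Illustration only: the monthly total matches, but the weekly errors are large
# when all the sales arrive in the last week of the month.
forecast_weekly = [100, 100, 100, 100]   # monthly forecast of 400 split evenly
actual_weekly   = [0,   0,   0,   400]   # all sales land in the last week

weekly_errors = [a - f for a, f in zip(actual_weekly, forecast_weekly)]
monthly_error = sum(actual_weekly) - sum(forecast_weekly)
print(weekly_errors)   # [-100, -100, -100, 300]
print(monthly_error)   # 0
```

The monthly error is zero, but every weekly bucket carries a large error, which is exactly what drives the weekly error CV up.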

I agree that location grouping is a good option, but I think there shouldn't be a problem, as the cases in which we can interchange locations are when the locations are very near each other and it almost doesn't matter which one to use (mostly origin ports with direct sales). In most cases, however, we would have just one location for each customer.

You have to consider that we'll be planning at the product and location level, so we can plan demand for product A, run the SOP optimizer, and get the recommended result to supply product A from location X. Then, if during execution they decide to deliver product B from location Y instead, that's fine; it's outside our planning. The problem arises when we compare the history and the forecast at the standard granularity to calculate the IO FC error CV, because we planned "A" and we did "B". So what do you think would be the best course of action?

Thanks,

Héctor