Using Analysis to Maximize Manufacturing Performance Potential


Published January 30, 2018

Every business would love to know whether it is maximizing its performance potential. For a distributor: would adding more product diverts and loading docks increase output, and at what cost? How do changes in staffing levels affect facility output? For a manufacturer: would adding equipment to a bottleneck cell improve downstream performance? Is it even correct to assume that cell is the bottleneck, or could the problem lie in an upstream process that is not being captured? And is it possible that the issue is not physical at all, but a scheduling problem that, once solved, could greatly improve capacity and the resulting service levels?

The increasing popularity of simulation and optimization tools, alongside a growing pool of people with the expertise to use them, is bringing the answers to these questions within reach for companies of all sizes. The hard part is knowing which method, simulation or optimization, is right for the unique challenge at hand. This matters especially when a company is looking to hire outside help or is considering purchasing simulation or optimization software for internal use.

For a business, the return on investment (ROI) of a detailed analytical study must be tangible. If outside help is hired to conduct the study, clearly defining the project objective and the type of analysis being sought will save all parties time, money, and stress. If the objective is not clearly defined and it becomes evident mid-project that the wrong type of analysis was selected, either the study drags on until the results are no longer valid, or the scope changes so significantly that the budget is blown and both parties lose. If the project is internal and new software is required, consider whether the project is a one-off or whether the technology has broader applications. Whether anyone will need to be trained on the new tools, and how cost- and personnel-intensive that training will be, are also necessary criteria for project selection and planning. Perhaps the best place to start in deciding which analytical approach to take is understanding exactly what simulation and optimization models are and how each can be leveraged to suit a wide range of business needs.

What is Simulation?

Under the umbrella of data modeling, simulation is best thought of as descriptive behavioral modeling. A model is built from an existing system's many diverse inputs: rates of movement, arrival distributions of objects entering the system, physical distances between processes, failure frequencies, and so on. These parameters are used to construct a digital, often graphically representative, mathematical model that describes and mimics the behavior of what currently exists. The system itself can be anything from a simple interaction of customers entering and exiting an ice cream store to a highly detailed, multi-stage manufacturing process. The most important requirement is that the simulation reflects reality and can be validated against existing performance.

Validating a simulation model is a very detail-oriented process. The more that can be included in the modeling effort to increase accuracy versus the real system, the more analysts can trust the results when they later change aspects of the system or run sensitivities. All simulation methods can output performance metrics for the system being modeled, such as product or service throughput, equipment utilization, lost time due to equipment failure, and estimated overall operating costs. This is where simulation offers extensive value. Once there is high confidence that the model represents reality, analysts can determine the effect of, say, adding a processing station at step three of an assembly line. The ease of making these adjustments, with no disruption to ongoing operations, makes simulation an excellent tool for sensitivity analysis. Often many sensitivities must be run across the system to discover the best-case, future-state scenario in which the system sees the most improvement. Note that this process does not always produce an optimal result: even after many design changes, additions or upgrades of virtual equipment and personnel, and substitutions of inputs, the resulting solution may be better than the baseline but still not the best possible. The software works only with what goes into it; in other words, the model is only as good as the team designing it. Some simulation languages do offer the option of optimizing portions of the analysis, but these options still function only within the framework the user has created.
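As a rough sketch of the sensitivity runs described above (not from the article, and with all rates, counts, and names hypothetical), a minimal multi-server queue simulation can show how facility performance changes as stations are added:

```python
import heapq
import random

def simulate(num_servers, num_customers=1000, arrival_rate=1.0,
             service_rate=0.4, seed=42):
    """Simulate a first-come-first-served multi-server station;
    return mean time a customer spends in the system."""
    rng = random.Random(seed)
    # Arrivals drawn from an exponential interarrival distribution.
    t, arrivals = 0.0, []
    for _ in range(num_customers):
        t += rng.expovariate(arrival_rate)
        arrivals.append(t)
    # Each server becomes free at the time stored in this heap.
    free_at = [0.0] * num_servers
    heapq.heapify(free_at)
    total_time = 0.0
    for arrive in arrivals:
        start = max(arrive, heapq.heappop(free_at))
        finish = start + rng.expovariate(service_rate)
        heapq.heappush(free_at, finish)
        total_time += finish - arrive
    return total_time / num_customers

# Sensitivity: how does adding stations change time in system?
for servers in (3, 4, 5):
    print(servers, round(simulate(servers), 2))
```

Real simulation packages add graphics, failure modes, and material movement on top of this core logic; the value of the sensitivity loop at the bottom is that each scenario runs in seconds with zero disruption to live operations.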

Simulation is also unique as an analytical tool in a few other ways. The first is its ability to work with uncertainty: an input parameter for a process does not have to be fixed. Instead, a range, often represented as a statistical distribution, can account for unpredictable data and capture the necessary value while adding realism to the system. This makes it possible to analyze a simple system quickly and effectively without significant time and energy spent on drawn-out observational studies. Second, simulation models are often highly visual, representing the modeled environment in full function and to scale. This helps communicate systemic changes and their impacts clearly to both technical and non-technical reviewers, and it plays a valuable role in judging physical feasibility, i.e., what can be added or adjusted in a system while adhering to physical constraints. The added context shows exactly how something could hypothetically work, where problems may exist, and what any proposed solutions would look like as well as how they would perform. Sometimes just being able to see it, as opposed to hearing a description built on numbers in a presentation slide, makes all the difference in greenlighting a project.

What is Optimization?

As with simulation, several defining characteristics indicate whether optimization is needed. Where simulation creates a behavioral model from a defined set of rules and interactions among many pieces of a whole, optimization requires a descriptive model with a clear objective and a defined set of constraints and decision variables, and it is used to capture a bigger-picture, global view of the process under analysis.

In a business context, optimization is exactly what its definition suggests: the act or process of making something as fully functional or effective as possible in order to realize performance potential. "Fully functional" here means finding the maximum or minimum of a mathematical function given a set of well-defined constraints. A popular example taught in undergraduate operations research courses is finding the minimum number of fire stations required so that at least one station is within 15 minutes of each of six towns. The drive time between towns is known, and the number of fire stations must be an integer (it cannot be fractional). With this information it becomes possible to represent the travel permutations between towns mathematically and to place the firehouses at locations that satisfy the outlined restrictions. In reality, a city would perform such a study of its boroughs to understand how much coverage is truly necessary while avoiding wasteful spending.
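The fire-station problem is a set-cover integer program. As an illustrative sketch (the drive-time matrix below is invented, not from the article, and real studies would use a solver rather than brute force), the smallest covering set can be found by checking site combinations in increasing size:

```python
from itertools import combinations

# Hypothetical symmetric drive times (minutes) between six towns; towns
# double as candidate station sites. A station "covers" towns <= 15 min away.
towns = range(6)
drive = [
    [ 0, 10, 20, 30, 30, 20],
    [10,  0, 25, 35, 20, 10],
    [20, 25,  0, 15, 30, 20],
    [30, 35, 15,  0, 15, 25],
    [30, 20, 30, 15,  0, 14],
    [20, 10, 20, 25, 14,  0],
]
covers = {s: {t for t in towns if drive[s][t] <= 15} for s in towns}

def min_stations():
    # Smallest set of sites whose coverage includes every town
    # (the integrality constraint: whole stations only).
    for k in range(1, 7):
        for sites in combinations(towns, k):
            if set().union(*(covers[s] for s in sites)) == set(towns):
                return sites
    return None

print(min_stations())
```

Brute force works for six towns; at city scale the same model would be handed to an integer-programming solver, but the objective and constraints are identical.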

Another practical example came up recently on a project at Tompkins International: maximizing the demand met, given a production schedule. Demand was unevenly distributed across the working weeks of the year, and the challenge was to schedule production early enough, on days with available capacity, to meet as much of the required demand as possible: a classic production planning problem nearly all manufacturers face. The company was constrained by the number of production hours available at different times of the year and by individual product assembly line requirements. Instead of the many individual inputs a simulation would use, those parameters were condensed into single calculated SKU-level production rates. All of this was built into an Excel tool that moves demand values, smallest first, from days identified as over capacity to earlier days that were under capacity, "smoothing" demand across the calendar year.
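The smoothing logic can be sketched in a few lines (a simplified stand-in for the Excel tool, with hypothetical capacity and demand numbers; the real tool worked at SKU level across a full year):

```python
# Hypothetical daily capacity and demand (units) for a short horizon.
capacity = [100, 100, 100, 100, 100]
demand   = [ 60,  80,  90, 150, 130]

def smooth(demand, capacity):
    """Pull excess demand from over-capacity days back to earlier
    under-capacity days, smallest excess first."""
    plan = demand[:]
    over = sorted((plan[d] - capacity[d], d)
                  for d in range(len(plan)) if plan[d] > capacity[d])
    for _, day in over:
        for earlier in range(day):          # only move demand earlier in time
            slack = capacity[earlier] - plan[earlier]
            move = min(slack, plan[day] - capacity[day])
            if move > 0:
                plan[earlier] += move
                plan[day] -= move
            if plan[day] <= capacity[day]:
                break
    return plan

print(smooth(demand, capacity))  # → [100, 100, 100, 110, 100]
```

Day 4's excess of 110 over 100 remains because total demand (510) exceeds total capacity (500); the heuristic meets as much demand as the environment allows while never pulling demand earlier than necessary, which is exactly what keeps inventory buildup minimal.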

The example above actually had two effects. The first was a production schedule that maximized the demand the company could meet in its current production environment. The second was that, by maximizing projected demand met, the schedule also minimized inventory carrying costs: demand values were never moved back in time more than necessary, so product buildup ahead of demand was minimal. One caveat: all input values were taken as given. The tool offered no way to introduce uncertainty into the optimization model except by being conservative in the estimated production rates used to calculate the time consumed by each production batch.

Here the optimization model changes the defined variables, subject to the known constraints or rules, to maximize or minimize a single objective. As a result, optimization outputs one discrete solution to the problem, as opposed to simulation, which provides many solution alternatives depending on what has been created within the model space. This does not mean optimization is better than simulation, simply that they have different uses.

When might simulation and optimization be used together?

When approaching a project, deciding between simulation and optimization can be difficult; hopefully the defining characteristics outlined above make that decision clearer. But when might it be necessary to use both methods together? As touched on briefly in the simulation section, some simulation languages let the user create optimization scenarios for experimentation. A simulation project may also require an optimized input to successfully analyze the system being constructed, for example an optimized schedule like the one described in the optimization section. If the assembly rates at the SKU level are known not to change, the rest of the system can be altered to build out sensitivities that assess the effects of other variables, such as material handling, knowing that the foundation of the analysis is already in its optimal state.
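As a hedged sketch of that hybrid idea (hypothetical numbers and function names, not the project's actual tooling), an optimized schedule becomes a fixed input while the simulation layer samples the uncertain parts, here a material-handling time per unit:

```python
import random

rng = random.Random(1)

# An optimized daily production schedule (units/day) would come from the
# optimization step; here it is a hypothetical, already-smoothed plan.
schedule = [100, 100, 100, 110, 100]

def simulate_day(units, handling_mean=0.5):
    """Total handling minutes for one day; per-unit time is uncertain,
    so it is sampled from an exponential distribution, not computed."""
    return sum(rng.expovariate(1 / handling_mean) for _ in range(units))

for day, units in enumerate(schedule):
    print(day, round(simulate_day(units), 1))
```

The schedule stays optimal while sensitivities vary the handling assumptions around it, which is the division of labor the hybrid approach relies on.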

There are broad applications of this joint-analysis concept. However, the potential for an optimization/simulation hybrid mostly applies to already defined simulation projects, due to the discrete nature of optimization. It makes more sense for an optimization solution to serve as an input to a simulation build, as opposed to constructing an optimization model and then deciding to create a simulation from its output. Ultimately, identifying from the start the aspects of the analysis where this may be necessary will allow for tighter budget control, greater clarity on the expected project timeline, and a better understanding of project staffing and skill requirements.

To summarize, when planning a project and selecting the tools for the job: optimization excels when the system being analyzed is well understood and can be described with mathematical functions to find a discrete solution. Simulation is most useful when the system is very complex, not completely understood, or subject to elements of uncertainty. There certainly exist situations where both methods can and should be used together but, most importantly, knowing what is required before beginning a project will lead to the best outcome for all parties involved.
