Thursday, March 30, 2017

Let’s Test That Idea

The other day, I was talking to a fellow ExtendSim model developer, Aaron Kim from JWA Consulting. Aaron, who is a Lean consultant in the health care industry, uses ExtendSim as part of his Lean toolbox.  

Aaron described one of his recent simulation models, and it got me thinking about how underutilized simulation is. Why aren't more models built simply to compare concepts at a high level?

Many of you who have built models know how easy it is to A) include too much detail or B) include processes around the fringe of the problem. Doing either requires extra effort to model and can cause delays to an entire project. I already suspected these were two root causes of unsuccessful projects, but could they also be the two main reasons simulation is not used as much as it should be?  

When Aaron described his model, I thought it was a perfect example of how valuable a simple simulation model can be. Aaron built a model that compared two scheduling strategies. He stayed out of the weeds, so to speak, and simply looked at the concepts involved.     

Aaron was working with a clinic. The clinic classified their patient visits into two basic categories – Short visits and Long visits. A Short visit would take about 20 minutes, while a Long visit would take about 40 minutes. Generally, the Long visits were new patients and accounted for roughly 25 percent of all visits.

The clinic had been scheduling patients according to what they called a “template” schedule.  The template schedule method works by setting up a template of appointment times for both patient visit types. When a patient requests a specific time, the clinic gives him or her the closest appointment block designated for that type of visit.

For example, if a Long patient called in and requested an 8:10 a.m. appointment, that slot might be open for a Short visit but not for a Long one. In such cases, the clinic would give the patient the closest appointment time slotted for a Long visit, which might be an hour or two later.

An executive at the clinic suggested to Aaron that they switch to an “Open” schedule because it seemed more patient-centric: it would give patients appointments closer to their desired times.

The open schedule method works by giving patients the available time closest to their desired appointment time. For example, if a patient wants a 9:00 a.m. appointment, and that slot is open, then the clinic gives it to the patient, even if it causes gaps in the schedule.

Aaron felt that the open scheduling method would leave gaps too small to fit other patients into and would therefore result in the clinic scheduling fewer patients overall. Because of that, he felt the template method would provide better patient satisfaction, as measured by averaging the difference between the desired appointment times and the given appointment times.

Aaron decided to build a simple model to compare the two scheduling methods. He didn’t want an elaborate model with all the grueling details but rather something simple, just to compare the two methods, to see which one would give the better performance.  

Rather than modeling all the doctors in the clinic, Aaron chose to model just the scheduling of a single room with a single provider. He also did not model how each doctor worked different hours during the week nor how each took his or her lunch break at different times of the day nor how some preferred to come in late on Mondays or golf on Wednesday afternoons or take Friday afternoons off. Those were important details, but Aaron was not trying to model the entire clinic; rather, he simply wanted to see the difference between the two scheduling strategies.  

Aaron’s model had a specified number of patients per day wanting to book appointments for times over the following two weeks. Each patient would be booked on both an open schedule and a template schedule. The key performance measure of the system was the average time difference between the desired appointment times and the given appointment times. The results are shown below.

Patients Per Day was a variable that ranged from 16 to 20. The results showed that the more patients scheduled per day, the more the template schedule outperformed the open schedule.
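Aaron's actual model was built in ExtendSim, and I don't have his details. But if you want to play with the idea yourself, here is a rough Python sketch of the same comparison. The slot grid, the template layout, and the booking rules are all my own invented assumptions, not Aaron's.

```python
import random

SLOT_MIN = 20                    # scheduling grid: 20-minute slots
SLOTS = 24                       # one 8-hour day on that grid
LONG_SHARE = 0.25                # roughly 25% of visits are Long (2 slots)

# Hypothetical template: three 40-minute blocks per day reserved for Long visits.
LONG_STARTS = {0, 8, 16}
LONG_SLOTS = {s + k for s in LONG_STARTS for k in (0, 1)}

def closest_start(free, need, desired, allowed_starts):
    """Feasible start closest to the desired slot, or None if nothing fits."""
    starts = [s for s in allowed_starts
              if all(s + k in free for k in range(need))]
    return min(starts, key=lambda s: abs(s - desired)) if starts else None

def book(free, need, desired, allowed_starts, gaps):
    s = closest_start(free, need, desired, allowed_starts)
    if s is not None:
        free -= {s + k for k in range(need)}          # mark slots as taken
        gaps.append(abs(s - desired) * SLOT_MIN)      # minutes off the wish

def simulate(patients_per_day, days=200, seed=1):
    """Average |desired - given| in minutes for (open, template) schedules."""
    rng = random.Random(seed)
    gaps_open, gaps_tmpl = [], []
    any_start = set(range(SLOTS))
    for _ in range(days):
        free_open, free_tmpl = set(range(SLOTS)), set(range(SLOTS))
        for _ in range(patients_per_day):
            need = 2 if rng.random() < LONG_SHARE else 1
            desired = rng.randrange(SLOTS - need + 1)
            # Open schedule: any feasible start is allowed.
            book(free_open, need, desired, any_start, gaps_open)
            # Template: Long visits in Long blocks, Short visits elsewhere.
            allowed = LONG_STARTS if need == 2 else any_start - LONG_SLOTS
            book(free_tmpl, need, desired, allowed, gaps_tmpl)
    avg = lambda g: sum(g) / len(g)
    return avg(gaps_open), avg(gaps_tmpl)
```

Even a sketch this crude lets you sweep patients per day and watch how the two policies diverge, which is exactly the kind of high-level concept comparison Aaron's model was for.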

Because Aaron was just trying to compare two scheduling policies, this turned out to be a quick modeling project. It took less than eight hours to build the model and analyze the results.  

The time spent building simple models like this one can pay off immensely. But I hear far too many stories in which models take months to get data and build and far too few in which models are built quickly just to answer simple questions like this one. The challenge for us all is to know the correct level of detail needed to answer the primary question. So the next time you have a problem that a simulation model could be used to answer, don’t be afraid to build the model, but please pay attention to the level of detail required. It will take far less time to build if you can leave out the unnecessary details, and it could make simulation a much more useful tool for you.

Monday, October 26, 2015

Integrated Simulation Databases

Since the late 1990s, ExtendSim has had an embedded database as part of its simulation tool. Now in its second generation, it is so incredibly useful that I can’t imagine building a model without it. In this post, I’ll describe my favorite advantages of using the internal ExtendSim database; for a more comprehensive description of the major features, please read ExtendSim Advanced Technology: Integrated Simulation Database (Diamond, Krahl, Nastasi, and Tag 2010).

Here is a list of some of the major features of the ExtendSim database:
  • Rapid data access
  • Connectivity to other data sources
  • Parent/child relationships
  • Support for easy component scaling
  • Multiple databases in one model
  • Database-aware modeling components
  • Database address attributes
  • Embedded distributions
  • Excel Add-in
  • Data linking
  • Link alerts

The first thing you should know about the ExtendSim database is that speed was (and still is) a very high priority in its design. I have seen cases where a modeler used Excel or Access as the primary simulation data repository, with the model interacting with it constantly during the run. That constant interaction with Excel or Access tends to slow a model down dramatically. Interacting with the internal ExtendSim database during the run, by contrast, is comparable in speed to interacting with arrays, which is very fast.

Using the internal database does not prevent you from connecting to other data repositories like Excel, Access, ODBC, or ADO; however, the best practice when using another data repository is to import the input data into the ExtendSim database ONCE at the beginning of the model run, interact with the ExtendSim database during the run, and then export the results to the external data repository ONCE at the end of the run. This lets you use an external repository to store your input and output data while relying on the ExtendSim database for speed during the run.
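The pattern is the same no matter what tool you use. Here is a minimal Python sketch of it, with sqlite3 standing in for the external repository and a plain dictionary standing in for the fast internal store; the `run_model` function and the table names are my own stand-ins for illustration.

```python
import sqlite3

def run_model(proc_times):
    # Stand-in for the simulation run: it touches only in-memory data,
    # never the external repository. (Doubling is just a placeholder.)
    return {name: t * 2 for name, t in proc_times.items()}

# 1) Import ONCE at the beginning of the run.
con = sqlite3.connect(":memory:")          # stand-in for Access/Excel/ODBC
con.execute("CREATE TABLE process (name TEXT, minutes REAL)")
con.executemany("INSERT INTO process VALUES (?, ?)",
                [("mill", 4.0), ("drill", 2.5)])
proc_times = dict(con.execute("SELECT name, minutes FROM process"))

# 2) Interact only with the in-memory copy during the run (fast).
results = run_model(proc_times)

# 3) Export ONCE at the end of the run.
con.execute("CREATE TABLE results (name TEXT, total REAL)")
con.executemany("INSERT INTO results VALUES (?, ?)", results.items())
```

The external repository is touched exactly twice, no matter how many millions of events the run itself generates.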

The next major benefit of using the ExtendSim database is its visibility and the separation of model and data. I once had a discussion with someone who had just completed building a model – and he didn’t use a database! His model had roughly 300 processes in it. He used best guesses for the process times, because he didn’t have the actual data at the time he was building the model. When I spoke to him, he was beginning to get real data, and he wanted to start testing the sensitivity of the model. He was having a difficult time with the task. All of his process data was hidden in the block dialogs, which were spread out across the model. It was difficult for him to see the data he was currently using without going into every single process and looking for it!

Understandably, he was frustrated. He asked me if there was an easier way, and I suggested that he update his model using the ExtendSim database. The database helps make all of your data visible in organized table structures. This enables you to use best guesses as you are building the model, and afterwards, you can easily find and make modifications when you get real data. Using the database, the original best guess data will not get lost and forgotten.
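To make the idea concrete outside of ExtendSim, here is a tiny Python sketch of the same principle: process data lives in one visible table rather than being scattered across 300 dialogs, and each guess is tagged so it can be found later. The process names and values are invented for illustration.

```python
# Hypothetical central parameter table: one visible place for all process
# data, instead of values buried in individual block dialogs.
process_params = {
    # process name: (mean_minutes, source) -- "guess" rows are easy to find
    "receive":  (5.0,  "measured"),
    "inspect":  (12.0, "guess"),
    "assemble": (30.0, "guess"),
}

def processes_still_guessed(params):
    """List the processes still running on best-guess data."""
    return sorted(name for name, (_, src) in params.items() if src == "guess")
```

When the real data arrives, one query tells you exactly which guesses still need replacing, instead of a hunt through every block in the model.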

The embedded ExtendSim database also allows the user to create parent/child relationships between tables. This has a number of advantages. First and foremost, it helps ensure data integrity, but it also makes the data more readable, so the user does not have to maintain separate lists of indexes.
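The integrity benefit is the same one relational databases everywhere provide. As a generic illustration (using Python's sqlite3, not ExtendSim itself, and invented table names), a child row that points at a missing parent is simply rejected, and a join gives you readable names instead of raw index numbers:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")     # enforce parent/child integrity
con.execute("CREATE TABLE station (id INTEGER PRIMARY KEY, name TEXT)")
con.execute("""CREATE TABLE job (
    id INTEGER PRIMARY KEY,
    station_id INTEGER REFERENCES station(id))""")
con.execute("INSERT INTO station VALUES (1, 'mill')")
con.execute("INSERT INTO job VALUES (10, 1)")        # valid child row

# A child row pointing at a nonexistent parent is rejected outright:
try:
    con.execute("INSERT INTO job VALUES (11, 99)")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True

# The join replaces a bare index with a readable name.
jobs = dict(con.execute(
    "SELECT job.id, station.name FROM job "
    "JOIN station ON job.station_id = station.id"))
```

That rejection is exactly the data integrity the parent/child relationship buys you: bad references cannot sneak into the tables in the first place.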

The embedded ExtendSim database can also be used to help scale a model. Often, models have many processes that are similar, if not identical, except for their data. In ExtendSim, constructs like this can be encapsulated in a hierarchical block (h-block). Encapsulating the construct into an h-block makes duplication of these similar processes much easier, and it helps organize the model. In order to make maintenance of the encapsulated construct easier, the h-block can be placed in a library so that if changes are needed within the constructs, modifications may be made in one h-block and those same modifications can be automatically replicated to other identical h-blocks. The difficulty comes when the constructs have slightly different input data. This can be handled easily though by using the database to store the input data instead of storing it in the block dialogs.

Let me show you what I mean. In the illustration below, the station construct is stored in an h-block in a library. Each station has a different process time and capacity, stored in a database table in which each station reads a different record. Because each station construct reads its process time and capacity from the database, those values can be unique even though the h-block constructs for the four stations are identical. The only real difference is the record each one reads.
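Here is the same scaling idea sketched in Python, with invented numbers: one identical piece of station logic reused four times, and all the variation pushed into the record each station reads.

```python
# One station "construct", reused four times; only the record it reads differs.
station_table = [
    # record: (process_time_minutes, capacity) -- illustrative values only
    (4.0, 1),
    (6.5, 2),
    (3.0, 1),
    (8.0, 1),
]

def run_station(record_index, n_parts):
    """Identical logic for every station; the data comes from its record."""
    proc_time, capacity = station_table[record_index]
    # Parts are processed `capacity` at a time, so count the batches needed.
    batches = -(-n_parts // capacity)        # ceiling division
    return batches * proc_time

# Four "stations" from one construct, each with its own data.
times = [run_station(i, n_parts=10) for i in range(len(station_table))]
```

If the station logic ever changes, you change `run_station` once, just as you would change the single library h-block, and every station picks it up.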

In summary, using the internal ExtendSim database can help make your simulation data visible and easy to find. It can speed up the run time, compared to constantly interacting with an external data repository during the simulation run. It is much easier to work with the data when the data repository is native to the simulation tool, and it can be used to help scale your model as it grows. Keep in mind that this is just a short list of the key benefits; there are many others.

If you have not already started using the ExtendSim database, I highly encourage you to check it out. I also teach a week-long class on the ExtendSim database. We spend about two days covering the mechanics of using the database and about three days on techniques for integrating it throughout a model. I cover almost all of the tricks I know. So if you have the time, I encourage you to come and learn how to use the ExtendSim database effectively.

I have also posted a 30-minute overview of the ExtendSim database on YouTube. Check it out when you have a chance.

I have included some references below for further reading. The first one, ExtendSim Advanced Technology: Integrated Simulation Database, is a more in-depth look at the advantages of the ExtendSim database. The next two are good examples of users taking full advantage of the ExtendSim database by creating not just a model but their own application within ExtendSim.

Diamond, B., Krahl, D., Nastasi, A., Tag, P., 2010. ExtendSim Advanced Technology: Integrated Simulation Database. In Proceedings of the 2010 Winter Simulation Conference, eds. B. Johansson, S. Jain, J. Montoya-Torres, J. Hugan, and E. Yucesan, 32-39. Piscataway, New Jersey: Institute of Electrical and Electronics Engineers, Inc.

Saylor, S., Dailey, J., 2010. Advanced Logistics Analysis Capabilities Environment. In Proceedings of the 2010 Winter Simulation Conference, eds. B. Johansson, S. Jain, J. Montoya-Torres, J. Hugan, and E. Yucesan, 2138-2149. Piscataway, New Jersey: Institute of Electrical and Electronics Engineers, Inc.

Akiya, N., Bury, S., 2011. Generic Framework for Simulating Networks using Rule-Based Queue and Resource-Task. In Proceedings of the 2011 Winter Simulation Conference, eds. S. Jain, R. R. Creasey, J. Himmelspach, K. P. White, and M. Fu, 2194-2205. Piscataway, New Jersey: Institute of Electrical and Electronics Engineers, Inc.

Tuesday, July 7, 2015

Benefits of Simulation

I recently had a discussion with a scheduler at a manufacturing plant. I was trying to explain to him the benefits of simulation, and I thought it might be a good discussion to post here as well.

First, here is a brief, although certainly not all-inclusive, list of the key benefits of simulation:
  • Answer Questions
  • Identify Problems
  • Animation & Visualization
  • Communication
  • Understanding
I am sure you are well aware of these benefits and others. But I want to take a moment and focus on the one I think is most important – Understanding. There are two aspects of this I want to elaborate on.

First, the process of modeling gives the model builder an in-depth look at the existing system. Building a simulation model is much more involved than building a flowchart. A simulation model should be a close representation of the system, and its results must be validated against reality, which is something you can’t do with a flowchart. The act of building a representative model drives the modeler to understand the flow, the routing logic, the priorities, the resource constraints, and the day-to-day operational decisions made on the plant floor, an effort that goes far beyond what one would ever put into a flowchart or a value stream map.

The second aspect is more interesting. Many system experts understand their existing system on the surface, but they don’t understand the “physics” of it. Let me explain. The scheduler I mentioned earlier completely understood his current factory. However, if the scheduler were to change the rules he applied, change the job priority scheme at various processes, reduce or increase the batch size, reduce or increase the total Work In Process, or change any other major aspect of the system, then neither the scheduler nor anyone else in the plant would be able to predict the effects!

Now think of what would happen if you gave the plant scheduler a simulation model. He could experiment with the model and fairly accurately predict what would happen with policy changes like those mentioned above. With this toolset, the scheduler could test various policy changes to see which would affect the system in the most positive way and could radically improve it. Using simulation, all of the disruptive testing of these policies would be done in the model and NOT on the plant floor. The plant scheduler could really become the Yoda of the plant.
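To show how unintuitive the "physics" can be, here is a toy Python sketch of one of those policy questions: batch size on a single machine with a per-batch setup. Every number in it is invented for illustration; it is nothing like a real plant model.

```python
import random

def avg_flow_time(batch_size, n_jobs=2000, seed=7):
    """Toy single-machine model: jobs queue up and run in batches.
    Setup is paid once per batch, so batch size trades waiting against setup."""
    rng = random.Random(seed)
    setup, unit_time = 30.0, 2.0         # minutes per batch setup / per job
    arrivals, t = [], 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(1 / 10.0)   # a new job roughly every 10 minutes
        arrivals.append(t)
    clock, total, done = 0.0, 0.0, 0
    for i in range(0, n_jobs - batch_size + 1, batch_size):
        batch = arrivals[i:i + batch_size]
        start = max(clock, batch[-1])    # wait for the machine and the last job
        clock = start + setup + unit_time * batch_size
        total += sum(clock - a for a in batch)
        done += batch_size
    return total / done                  # average flow time per job, minutes
```

With these invented numbers, running jobs one at a time quietly overloads the machine (30 minutes of setup against a 10-minute arrival rate), while batches of ten keep it stable. A scheduler discovers that in the model, in minutes, instead of on the plant floor.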

Wednesday, June 10, 2015

Embracing Simulation

Over the last 10 years, more than a thousand students have attended one or more of my simulation classes. I have seen successful and not-so-successful students. There are ways to help ensure success, and here I provide some guidance on avoiding the most common mistakes.

Simulation modeling can be challenging for a beginner.  There is a steep learning curve, not only on the mechanics and nuances of the software but also on managing a simulation project.  However, there is a magic formula, and here it is in a nutshell. Make sure you have…

  • Training on the software
  • Support from a mentor
  • An appropriate first project
  • Support from the rest of the company
    • Access to data
    • Subject matter experts to educate yourself on the process
  • Time to work on the project
  • Time to verify and validate the project
  • Time to measure its success

Training & Mentoring
Get training on the tool as well as a mentor - both are crucial.  The cost of a one-week training class will save two to three months of learning and struggling on your own.  I would also recommend spending some time with a modeling mentor.  If your company has other modelers, buddy up with one of the experts for a couple of models.  If no one else uses simulation at your company, consider hiring someone to help build the first model.  After the first project, consider hiring that consultant again as a mentor through the next project.  It is difficult for a new modeler to build a good model without knowing what a good model looks like.  I have seen modelers who have used simulation for years while keeping the same bad habits they started out with, because they have never seen it done differently.  So get started on the right track.

Successful Initial Projects
The first project should be selected with care.  It should be used to help you get comfortable with simulation.  The purpose of your FIRST simulation project should NOT be to create a model that automagically generates the optimal schedule for the factory!  Although that is a lofty goal, first pick a simple project, possibly a subset of what will eventually become part of a larger model.  Start out small.  Pick something that will be successful.  Learn how to manage a simulation project.  Don’t be afraid to make mistakes.  Yes, make mistakes!  We actually learn the most when we make mistakes and correct ourselves. 

Get support from the rest of the company.  A simulation model needs data, often lots of it, and it needs process knowledge.  As a new modeler, you often have neither within your immediate grasp; you must gather the information from others within the company.  That takes resources you usually can't commandeer yourself, which is why a certain level of management support is essential.  Without it, you can't get the data or the process knowledge you need, and the project can't succeed.

Give yourself time to work on the project.  If you have other responsibilities, quite often your simulation work will fall to the bottom of the task list.  I have seen this far too often.  When a new modeler has other responsibilities and is constantly starting and stopping, it can be detrimental.  Each time modelers revisit the model, they tend to have forgotten the last few things they did and waste a lot of time getting back into the groove.  This affects experienced modelers too, but not to the same extent.

Verification & Validation
Budget plenty of time to verify and validate the model.  A computer will do what you tell it to do, which is not necessarily what you want it to do. While you are building the model, constantly take time to verify that the model is doing what you intended. As you complete stages of the model, validate that it is a reasonable representation of the real system. For validation, you will need the same subject matter experts that helped you understand the process. This step will not only help to ensure that the model is accurate, it will make others more confident in the model.
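One concrete verification technique is to check a simplified version of your model against a known analytic result before trusting the full model. As a hedged example (my own sketch, not a specific client model), a bare-bones M/M/1 queue simulation can be checked against the textbook formula for time in system:

```python
import random

def mm1_time_in_system(lam, mu, n=200_000, seed=3):
    """Simple FIFO single-server queue via the Lindley recursion;
    returns the average time a customer spends in the system."""
    rng = random.Random(seed)
    clock = depart = total = 0.0
    for _ in range(n):
        clock += rng.expovariate(lam)            # next arrival
        start = max(clock, depart)               # wait if the server is busy
        depart = start + rng.expovariate(mu)     # service completes
        total += depart - clock
    return total / n

# Verification: queueing theory gives W = 1 / (mu - lam) for a stable queue.
lam, mu = 0.5, 1.0
simulated = mm1_time_in_system(lam, mu)
theoretical = 1 / (mu - lam)                     # = 2.0 for these rates
```

If the simplified model can't reproduce a result you can compute by hand, the full model certainly can't be trusted; when it can, you have earned a piece of the confidence that validation is meant to build.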

Measuring Success
This is not easy, but if you can, compare a decision made without the model to one made with the model. Compare the performance of the implemented system with the model. If there are any significant differences, make note of them so that they can be accounted for in the future. If the model proved to be accurate, use this as political capital for future projects.

Here is a link to a paper I co-authored with Dave Krahl (now at Kromite LLC) a few years ago on getting started with simulation.  It is a great read and contains more details.  Dave also helped write this post - thanks, Dave.

Monday, April 27, 2015

Don’t Pay with Pennies

Have you ever been in line behind someone who pays with pennies? It’s not good, is it? There’s time spent counting, a higher chance of error, and not enough space in the cash register to store all of those pennies. Yet I see the same technique in simulation models quite often, because modelers tend to make a literal translation of the system into the simulation model. With a little thought, it is often more efficient to work in terms of a group rather than a single item.

Let’s look at some examples of paying with pennies in simulation models:

  • Processing a batch of parts one at a time. Typically this means breaking the batch into individual units, processing them one at a time, and re-batching them into a single unit for transport to the next operation.
  • In logistics, we WANT to model supply chains where every case represents an item.  We WANT to see every case or pallet get moved around and individually placed on the truck for shipment.  But the model should really only care that loading the truck took X minutes and that X, Y, and Z amounts of inventory were removed from the warehouse.
  • Allocating an array one row at a time. If you are storing results in an output table or an array, it is tempting to add results one row at a time as you generate them. However, this forces the simulation software to allocate memory in small chunks, and memory allocation is a relatively slow operation during a model run. When you allocate one row in an ExtendSim database table, space is reserved for additional rows should you need them, but this is still much less efficient than adding all of the rows you will need once at the start of the simulation.
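The allocation point translates directly to any language. Here is a small Python sketch of the two approaches; note that Python lists over-allocate behind the scenes, so the measured gap here is modest, and the point is the pattern, not the exact timings.

```python
import time

N = 500_000

# Row at a time: the results table grows (and reallocates) during the "run".
t0 = time.perf_counter()
grown = []
for i in range(N):
    grown.append([i, i * 2.0])
grow_seconds = time.perf_counter() - t0

# Allocate the whole table once up front, then fill rows in place.
t0 = time.perf_counter()
table = [None] * N
for i in range(N):
    table[i] = [i, i * 2.0]
fill_seconds = time.perf_counter() - t0
```

Both produce the same table; the second pays the allocation cost once, at the start, exactly as the bullet above recommends.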
When building a model, look for opportunities to group actions together. Consider the work the CPU will have to do to simulate the model you built. Whenever you can:

  • Group items together and process them as a single unit. Use math and attributes to calculate model features such as delays and yield rates.
  • Track information in tables instead of on individual items. For example, inventory can be represented by a series of linked database tables.
  • Perform operations once, at the start of the simulation. This works well for setting up arrays and tables.
  • Use discrete-rate simulation to model high speed processes such as bottling and filling lines.
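The first item on the list, grouping items and replacing per-item events with math, can be sketched in a few lines of Python. The quantities are made up, but the principle is general:

```python
# Literal translation: each part in the batch is delayed as its own event.
def batch_literal(quantity, unit_minutes):
    clock, events = 0.0, 0
    for _ in range(quantity):
        clock += unit_minutes
        events += 1
    return clock, events

# Grouped: one item carries a quantity attribute; arithmetic replaces the loop.
def batch_grouped(quantity, unit_minutes):
    return quantity * unit_minutes, 1

print(batch_literal(500, 0.25))   # (125.0, 500) -- same clock, 500 events
print(batch_grouped(500, 0.25))   # (125.0, 1)   -- same clock, 1 event
```

The answer is identical, but the grouped version gives the simulation engine one event to schedule instead of five hundred, which is the whole "don't pay with pennies" argument in miniature.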

While I use ExtendSim, the above techniques would be useful for any simulation software. Considering methods for reducing the number and type of calculations will yield benefits both in modeling and run time.

I would like to thank Robin Clark for his comments and input on this topic.

Thursday, December 11, 2014

A Peek into the Future of Script Editing in ExtendSim

In my last blog, I listed some of the new features we are incorporating into ExtendSim 10.

In this blog, I would like to list the new features, added to the application since that post, that are specific to the script editing environment within the ExtendSim block editor.

A while ago, we had an internal meeting here in the office where the in-house block developers presented to the application developers a wish list of features they would find useful in the ExtendSim block scripting environment. The open source editor Notepad++ was used to demonstrate some of these proposed features. After the meeting, we did a little research and found that Notepad++ is built on an open source editing component called Scintilla. Well, one thing led to another, and ExtendSim 10 will now include a powerful new code editing environment.

New features for the scripting environment in ExtendSim 10 include:

  • More sophisticated code colorization, including the ability to customize colors
  • Code folding
  • Regular expressions in searching
  • Find in files capability
  • Brace matching
  • Show white space option
  • User customizable code completion
  • Call tips (show arguments during code completion)
  • Zoom/reduce in script window
  • Auto indentation
  • Indentation guides
  • Smart highlighting
  • Word wrap (sometimes known as line wrap)
  • Show end of line characters
  • And many more improvements

As mentioned in my last blog, we still don't have an official release date for ExtendSim 10, but we're looking forward to it as much as you are. We are waiting eagerly for the time when we can say more than it'll be ready 'When it's done.'

Friday, March 21, 2014

Business Intelligence

If you do a web search for the term "business intelligence" or "BI", you'll get two points of view - one is focused on how to gather and transform raw data into information and the other is focused on the process of analyzing and utilizing the information for strategic planning. Two parts of a whole, really, since there is no point gathering a ton of data unless you can somehow put it to use. Plus it's tough to make decisions about what your company/department/process should do unless you have enough information to support those decisions.

That got me thinking again about how important it is that ExtendSim has an internal relational database capability for storing and managing data. An internal database serves as a bridge between the simulation model and external data repositories, both to supply data to the model and to report model outputs. It also helps that ExtendSim has robust tools to facilitate the exchange of data with external sources. For example, its ADO (ActiveX Data Objects) capability allows ExtendSim to perform high speed data exchanges with external databases such as Microsoft Access and SQL Server. And the ExtendSim DB Add-In for Excel allows analysts to fully specify an ExtendSim database in Excel, including parent/child relationships, formatting, and data validation, and then retrieve useful information from the model without having to learn anything about simulation or even how to use ExtendSim.

But the storage, management, and transfer of data is only one part of the story. What is even more important is that the data gets used in a meaningful manner. And that is where simulation comes in. Simulation is a low cost, high reward method that allows you to analyze existing processes and explore the effect of changes. Likewise you can get assurance when designing completely new systems and processes since you've simulated their behavior or performance in advance. An intelligent way to do business.