Who introduced simulation?

3 April 2024

 

The Evolution of Reality: From Ancient Philosophy to Modern Simulation Theory

The Simulation Hypothesis is a concept that suggests our reality is nothing more than a computer-generated simulation. Most people think this is a completely new theory.

However, this idea has been around for centuries in one form or another. While early philosophers knew nothing of computers or technological advancement, their ideas anticipated, at least loosely, the arguments of modern simulation theory.

Then, the concept gained more attention in recent years due to the advancements in technology and virtual reality, making it easier to imagine the possibility of a simulated reality.

And yet, the origin of the Simulation Hypothesis can be traced back to ancient philosophical ideas.

Plato’s famous allegory of the cave suggests that our reality is nothing more than shadows on a wall, and we are living in a world of illusions.

This idea was later expanded upon by philosopher René Descartes, who argued that it is possible that an evil demon is deceiving us, making us believe in a false reality.

The modern version of the Simulation Hypothesis was first proposed by philosopher Nick Bostrom in a 2003 paper, in which he argued that, under certain assumptions, we are more likely to be living in a simulation than in base physical reality.

And so, let’s expand on each of these ideas or theories.

Philosophical Roots of Simulation Theory

Plato’s Allegory of the Cave

Plato’s “Allegory of the Cave” is one of the earliest examples of a philosophical thought experiment that can be seen as a precursor to the Simulation Hypothesis.

In the allegory, prisoners are chained inside a cave, facing a wall. Behind them is a fire, and between the fire and the prisoners, people walk by carrying objects that cast shadows on the wall.

The prisoners believe that the shadows are the only reality and that there is nothing beyond the cave.

Plato used this allegory to illustrate his Theory of Forms, which suggests that the world we see around us is just a shadow or imperfect copy of a perfect and eternal reality that exists beyond our perception.

  • See also: 10 Philosophers Who Went Insane

This idea has been used by proponents of the Simulation Hypothesis to suggest that our reality is just a simulation, and that there is a higher reality beyond our perception.

Descartes’ Evil Demon Hypothesis & Cartesian Skepticism

Descartes’ “Evil Demon Hypothesis” and “Cartesian Skepticism” are another thought experiment and philosophical stance that have been used to support the Simulation Hypothesis.

In this hypothesis, Descartes suggests that an evil demon may be deceiving us, making us believe that what we perceive as reality is actually an illusion.

This hypothesis is similar to the Simulation Hypothesis in that it suggests that our perception of reality may not be accurate.

Descartes’ hypothesis has been used to argue that our reality may be a simulation created by an advanced intelligence.

Zhuangzi’s Butterfly Dream

Zhuangzi’s “Butterfly Dream” is a famous story in Chinese philosophy that has been used to support the idea that reality may not be what it seems. In the story, Zhuangzi dreams that he is a butterfly, flying freely and without care. When he wakes up, he is unsure whether he is a man who dreamed he was a butterfly, or a butterfly dreaming that he is a man.

This story has been used to argue that our perception of reality may be an illusion, and that there may be a higher reality beyond our perception.

It has been suggested that this higher reality may be a simulation, similar to the one proposed by the Simulation Hypothesis.

Science Fiction Influences

Simulacron-3

One of the earliest works of science fiction that explored the idea of a simulated reality was the novel “Simulacron-3” by Daniel F. Galouye, published in 1964.

The book tells the story of a man named Douglas Hall, who discovers that his entire world is actually a virtual world simulation created by a group of scientists.

The novel was adapted into a German television movie in 1973, titled “World on a Wire,” which further popularized the concept of simulated reality.

The Matrix

Our brain simulates reality. So, our everyday experiences are a form of dreaming, which is to say, they are mental models, simulations, not the things they appear to be. – Stephen LaBerge

Perhaps the most well-known and influential work of science fiction that explores the idea of a simulated reality is the film “The Matrix,” released in 1999.

The movie tells the story of a computer programmer named Neo, who discovers that the world he lives in is a simulated reality created by machines that have enslaved humanity.

The film’s iconic visual effects, such as the “bullet time” technique, and its exploration of philosophical concepts such as determinism and free will, have made it a cultural touchstone and a major influence on popular culture.

Both “Simulacron-3” and “The Matrix” helped to popularize the idea of a simulated reality and laid the groundwork for the development of the Simulation Hypothesis.

They demonstrated that the concept of a simulated reality was not just the stuff of science fiction, but a serious idea that could be explored and debated.

This brings us to technological evolution and the first serious thoughts that a simulation could actually be possible if computers and technology advance far enough in the future.

Technological Evolution

Advancements in Computing

The development of the simulation hypothesis has been closely tied to the evolution of computing technology.

Namely, the first computers were as large as an average apartment, yet they had far less computing power than the smartphones that fit in our pockets today.

Moreover, the earliest video games of the 1970s and 80s were ultra-simple, with pixelated graphics, while roughly 40 years later we have ultra-realistic games that closely resemble the natural world.

Thus, if we extrapolate these advancements in technology and video games, it is conceivable that some advanced civilization has already reached such a high level that it has produced an entire simulated world and universe, and that we are already living inside such a computer simulation.

Virtual Reality Development

Another key development that has contributed to the simulation hypothesis is the development of virtual reality technology.

Virtual reality allows users to experience simulated environments in a more immersive way, enabling researchers to study human behavior and perception in a more controlled environment.

The development of virtual reality technology has also enabled the creation of more realistic and detailed simulations, allowing researchers to explore a wide range of scenarios and hypotheses.

As virtual reality technology continues to evolve, it is plausible that people in 100 or perhaps 1,000 years will not be able to distinguish reality from fiction and video games.

Then the next step in the creation of a fully simulated world would be some kind of transition phase.

Transition in a way that people are connected to the virtual world from the day of their birth.

Perhaps we’re getting in over our heads with this, but there are basically two possibilities if simulation theory is real.

One is that we do not exist at all.

The second one is that we do exist but we’re locked up somewhere, not even knowing we are there, and our brain is connected to some kind of computer that is creating our reality.

Basically, both scenarios end up in the same way – either you’re not alive at all, or you’re alive but not really.

You’re just a battery serving the higher power.

Anyhow, those are all just speculations, and it’s impossible to prove them right or wrong.

At least not just yet.

Okay, so with that, let’s see in more detail the actual simulation argument made by philosopher Nick Bostrom.

The Simulation Argument

As mentioned, the Simulation Hypothesis is a philosophical theory that suggests that reality, as we perceive it, may actually be a computer simulation.

Although this idea has been somewhat touched upon throughout history (as we’ve explained in the introduction), one modern philosopher made a real and specific hypothesis.

In his 2003 paper, “Are You Living in a Computer Simulation?“, Nick Bostrom went into detail about the simulation hypothesis. In this paper, Bostrom presents a trilemma that argues that at least one of three propositions must be true:

Nick Bostrom’s Trilemma

  1. The human species is very likely to go extinct before reaching a “posthuman” stage.
  2. Any posthuman civilization is unlikely to run a significant number of simulations of its evolutionary history (or variations thereof).
  3. We are almost certainly living in a computer simulation.

Bostrom’s argument is based on statistical analysis and the assumption that future civilizations will be able to create simulations that are indistinguishable from reality.

He suggests that if the first two propositions are false, then the third proposition must be true.

Posthuman Civilization

The idea of a posthuman civilization is a key component of Bostrom’s argument. He suggests that if a civilization were to reach a point where they could create simulations that are indistinguishable from reality, they would likely create many such simulations.

Furthermore, if these simulations were created, the simulated beings within them would likely create their own simulations, and so on.

This would result in a vast number of simulated realities, making it statistically more likely that we are living in a simulation than in the “real” world.
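To make this counting intuition concrete, here is a minimal back-of-the-envelope sketch. It is not taken from Bostrom's paper; the function name and all numbers are illustrative assumptions. The point is simply that if even a small fraction of civilizations reach the posthuman stage and each runs many ancestor simulations, simulated observers vastly outnumber non-simulated ones.

```python
# Illustrative sketch of the counting argument behind Bostrom's trilemma.
# All names and numbers here are assumptions for illustration, not values
# from the 2003 paper.

def fraction_simulated(f_posthuman, sims_per_civilization):
    """Rough fraction of observers who live inside an ancestor simulation,
    assuming each simulated history contains about as many observers as a
    non-simulated one."""
    simulated = f_posthuman * sims_per_civilization
    return simulated / (simulated + 1)

# If 1% of civilizations become posthuman and each runs 1,000 ancestor
# simulations, most observers are simulated.
print(fraction_simulated(0.01, 1000))   # ~0.91

# If almost no civilization reaches that stage (or almost none run such
# simulations), propositions 1 or 2 of the trilemma dominate instead.
print(fraction_simulated(1e-6, 1000))   # ~0.001
```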

However, not everyone accepted this theory. Namely, there are many open questions.

Implications and Open Questions

The Simulation Hypothesis raises several questions that need to be addressed. If we are living in a simulated reality, then who is responsible for the well-being of the simulated beings?

Should they be treated as mere computer programs or as conscious entities with rights? If the latter is true, then what are the implications for the treatment of simulated beings?

Another concern is the potential for abuse of the simulation technology. If we are able to create simulated realities, then what is stopping us from creating simulations for malicious purposes?

For example, could simulated beings be created solely for the purpose of experimentation or entertainment?

Then there are many philosophical questions.

The hypothesis challenges our understanding of reality and raises questions about the nature of consciousness and free will.

If our reality is simulated, then what does that say about the existence of the physical world? Does it mean that the physical world is just an illusion?

Furthermore, the Simulation Hypothesis challenges traditional notions of causality.

If our reality is a simulation, then events could be manipulated by the creators of the simulation.

This raises questions about determinism and the role of free will in a simulated reality.

With all those questions, there has been a lot of criticism and counterarguments of the whole theory and hypothesis.

Criticism and Counterarguments

Some of the most notable ones are below:

  • Lack of empirical evidence: The Simulation Hypothesis is based on philosophical arguments rather than empirical evidence. Critics argue that without any empirical evidence, it is impossible to prove or disprove the hypothesis.
  • Circular reasoning: The Simulation Hypothesis often relies on circular reasoning, where the simulation is used to explain the existence of the simulation itself.
  • Anthropic principle: The Simulation Hypothesis often relies on the anthropic principle, which states that the universe must be compatible with the existence of intelligent life because we exist. Critics argue that this principle is not a valid scientific principle and is based on subjective reasoning.
  • Occam’s razor: Occam’s razor is a principle that states that the simplest explanation is usually the correct one. In the context of the simulation hypothesis, Occam’s razor suggests that it’s more likely that our reality is not a simulated one, as creating an entire simulated universe would introduce unnecessary complexity compared to the simpler explanation that our reality is as it seems.

Despite these criticisms, proponents of the Simulation Hypothesis have provided counterarguments to defend their hypothesis.

They argue that the lack of empirical evidence is not a valid argument against the hypothesis because it is still in the realm of possibility. They also argue that circular reasoning is not a problem because it is a common feature of many scientific theories.

Finally, they argue that the anthropic principle is a valid scientific principle because it is based on the observation that the universe is finely tuned for the existence of intelligent life.

And so, with that let’s look at answers in physics and science.

Scientific Perspectives

Quantum Mechanics

The simulation hypothesis has also been explored from the perspective of quantum mechanics.

Some physicists have proposed that the universe could be a simulation running on a quantum computer. The idea is that the universe is made up of information and that this information could be processed by a quantum computer in the same way that classical computers process information.

One argument in favor of this idea is the fact that quantum mechanics allows for the existence of superposition and entanglement, which are phenomena that are difficult to explain using classical physics.

Some physicists have suggested that these phenomena could be a result of the universe being a simulation.

  • See also: Top 10 Greatest Physicists Of All Time

However, there is currently no direct evidence to support the idea that the universe is a simulation running on a quantum computer.

It remains a speculative idea that is still being explored by physicists.

Cosmology

Another area where the simulation hypothesis has been considered is cosmology.

Some cosmologists have suggested that the universe could be a simulation created by a more advanced civilization.

This idea is based on the assumption that a civilization that is advanced enough to create a simulation of the universe would likely be able to create multiple simulations.

One argument in favor of this idea is the fact that the universe appears to be finely tuned for life.

The fundamental constants of the universe, such as the speed of light and the strength of the electromagnetic force, are precisely set to allow for the existence of life.

Some cosmologists have suggested that this apparent fine-tuning could be a result of the universe being a simulation.

However, like the quantum mechanics perspective, there is currently no direct evidence to support the idea that the universe is a simulation.

It remains a speculative (but rather interesting) idea that is still being explored by cosmologists.

And so, in the end, let’s summarize the significant events and clues that have contributed to the development of this hypothesis.

Table: Timeline of Simulation Hypothesis

Time Period | Clues for Simulation Hypothesis | Explanation
Ancient Times | Plato’s Allegory of the Cave | Plato’s allegory suggests that reality perceived by humans may be an illusion or shadow of a greater truth.
17th Century | Cartesian Skepticism | René Descartes’ philosophy of doubt raises questions about the reliability of sensory perceptions and the nature of reality.
20th Century | “Brain in a Vat” Thought Experiment | Philosophical thought experiment proposing that an individual’s perceptions could be artificially manipulated, akin to a simulation.
1974 | Robert Nozick’s “Anarchy, State, and Utopia” | Nozick’s experience machine argument raises questions about the importance of authenticity and reality in human experiences.
1999 | “The Matrix” Film | The film popularized the concept of simulated reality, where humans live in a simulated world while their bodies are used as energy.
2003 | Nick Bostrom’s “Are You Living in a Computer Simulation?” | Bostrom’s paper explores the hypothesis that advanced civilizations could create computer simulations indistinguishable from reality.
2016 | Elon Musk’s Remarks | Musk suggested that the odds are “one in billions” that we are not living in a computer simulation, based on advancements in technology.
2020 | Oxford Study | A study from Oxford University concluded that the probability we are living in a simulated reality is around 50%.
Present Day | Ongoing Scientific and Philosophical Discourse | Continual exploration and debate in various fields regarding the plausibility and implications of the simulation hypothesis.

Conclusion

The Simulation Hypothesis is a super interesting hypothesis that has gained significant attention in recent years. While it may at first seem like an impossible idea, the possibility that our reality is a simulation cannot be dismissed entirely.

Especially if you think about it for a long time and if you remove your ego from the thought.

Of course, it is hard for any of us to accept the possibility that we’re not real or that what we experience is not real.

However, that may be reality just as well.

As we’ve written in our article ‘What is the true nature of reality?’, it is almost impossible to define reality. It is different for everyone because we perceive reality through our five senses.

Thus reality defines itself.

Therefore, if someone or something managed to create such advanced technology to alter senses completely or create the whole virtual universe, well then the simulation hypothesis might be real.

In conclusion, the Simulation Hypothesis completely challenges our understanding of reality and raises vital questions about the nature of existence.

Whether or not our reality is a simulation, the concept reminds us that there is much we still don’t know about the universe and our place in it.

Who knows, perhaps in a thousand, billion, or even 100 quintillion years, we’ll know for sure…

IV. Introduction to Modeling and Simulation Systems

A. Historical Perspective [SS]

Today Simulation is arguably one of the most multifaceted topics that can face an Industrial Engineer in the workplace. It can also be one of the most important to a corporation, regardless of the industry. Quality, safety and productivity are all affected by Simulation, whether the issues occur in the office, on the manufacturing floor, or in a warehouse. This article focuses on the development of Industrial Process Simulation from its infancy to its current stage, where it is used as a powerful tool for increasing the competitiveness and profits of a company [5].

Simulation is used extensively as a tool to increase production capacity. Simulation software used by Cymer Inc. (a leading producer of laser illumination sources) helped increase production capacity from 5 units/month at the beginning of 1999 to 45 units/month at the end of 1999, a roughly ninefold increase [5].

Visualization and graphics have undoubtedly made a huge impact on all simulation companies. Easy-to-use modeling has resulted in low-priced packages that would have been unthinkable just a few years ago. Simulation technology has also become increasingly valuable to related industries. The simulation industry is coming of age and is no longer just the domain of academics.

This article provides insight into the working environment and intellectual and managerial attitudes during the formative period of simulation development. It also suggests a basis for comparison with the current practices.

The history of computer simulation dates back to World War II, when two mathematicians, John von Neumann and Stanislaw Ulam, were faced with the puzzling problem of the behavior of neutrons. Trial-and-error experimentation was too costly, and the problem was too complicated for analysis. Hence, the mathematicians suggested the roulette wheel technique. The basic data regarding the occurrence of various events were known; the probabilities of the separate events were then merged in a step-by-step analysis to predict the outcome of the whole sequence of events. With the remarkable success of the technique on the neutron problem, it soon became popular and found many applications in business and industry [1].
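As a rough illustration of this step-by-step "roulette wheel" idea, the sketch below follows each neutron through a chain of random events with known per-event probabilities and estimates the overall outcome by repetition. The probabilities, the shield model, and all names are invented for this example and are not taken from the wartime work.

```python
import random

# Toy Monte Carlo sketch: the fate of each neutron is a chain of random
# events (absorb or scatter) with known per-event probabilities, and the
# overall escape probability is estimated by repeating the chain many times.
# All numbers below are illustrative assumptions.

P_ABSORB = 0.3      # chance a collision absorbs the neutron
SHIELD_LAYERS = 5   # layers the neutron must cross to escape

def neutron_escapes():
    layer = 0
    while 0 <= layer < SHIELD_LAYERS:
        if random.random() < P_ABSORB:
            return False                                 # absorbed in the shield
        layer += 1 if random.random() < 0.5 else -1      # scatter forward or back
    return layer >= SHIELD_LAYERS                        # made it all the way through

trials = 100_000
escaped = sum(neutron_escapes() for _ in range(trials))
print(f"Estimated escape probability: {escaped / trials:.4f}")
```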

This was a time, in the post-war world, when new technologies, developed for military purposes during the war, began to emerge as new problem-solving tools in the world at large. At that time the field of computing was divided into two approaches: analog and digital. Analog computers were particularly suitable for problems requiring the solution of differential equations. Analog computers used electronic DC amplifiers configured as integrators and summers with a variety of non-linear, electronic and electro-mechanical components for multiplication, division, function generation, etc. These units were manually interconnected so as to produce a system that obeyed the differential equations under study. A great deal of ingenuity was often necessary in order to produce accurate, stable solutions. The electronics used vacuum tubes (valves), as did the early digital computers. The transistor was still some years in the future [3].
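For readers more at home with digital code, the mathematical task those interconnected integrators and summers performed can be sketched numerically. The equation, constants, and variable names below are illustrative assumptions, not details of any particular machine from that era.

```python
# A digital sketch of the job an analog computer's wired-up integrators did:
# make the state obey a differential equation. Here, simple Euler integration
# of a damped oscillator x'' + c*x' + k*x = 0 (constants are made up).

c, k = 0.5, 4.0          # damping and stiffness (illustrative)
x, v = 1.0, 0.0          # initial position and velocity
dt, t = 0.001, 0.0

while t < 10.0:
    a = -c * v - k * x   # "summer": combines the fed-back signals
    v += a * dt          # first "integrator": acceleration -> velocity
    x += v * dt          # second "integrator": velocity -> position
    t += dt

print(f"x(10) ~ {x:.4f}")
```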

In the late ‘40s and early ‘50s, commercially designed computers, both analog and digital, started to appear in a number of organizations. Unsuspecting members of the technical staffs of these organizations suddenly found themselves responsible for figuring out how to use these electronic monsters and apply them to the problems of the day. One such engineer, working at the Naval Air Missile Test Center at Point Mugu on the California coast north of Los Angeles, was John McLeod, who took delivery of a new analog computer sometime in 1952. John was not the only engineer in the aerospace community in Southern California facing the same problems, and a few of them decided to get together as an informal user group to exchange ideas and experiences [3].

Computer simulation was not a useful tool in the 1950s. Simulation took too long to get results, needed too many skilled people, and as a result cost a considerable amount in both personnel and computer time. And most disheartening, results were often ambiguous. One example is the attempt to model field data for peak periods in telephone systems, because the system did not conform to the queuing theory used in those days. One technique used was discrete event computer simulation. The tools available for the approach were an IBM 650, assembly language, and a team consisting of a mathematician, a systems engineer, and a programmer. The team accomplished less than half of what it set out to do, took twice as long, and overspent the budget by a factor of two [2].
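To show what a discrete event simulation of such a telephone system actually does, here is a minimal sketch that steps from one scheduled event to the next for a single operator handling calls. The arrival and service rates and all names are invented for illustration; the original 1950s study ran on an IBM 650 in assembly language, not Python.

```python
import heapq, random

# Minimal discrete-event simulation of a single telephone operator, in the
# spirit of the early queueing studies described above. Rates are assumptions.
random.seed(1)
ARRIVAL_RATE, SERVICE_RATE, SIM_TIME = 1.0, 1.2, 10_000.0

events = [(random.expovariate(ARRIVAL_RATE), "arrival")]   # future-event list
waiting = []            # arrival times of callers queued behind the operator
busy = False
served = total_wait = 0

clock = 0.0
while events and clock < SIM_TIME:
    clock, kind = heapq.heappop(events)                    # jump to next event
    if kind == "arrival":
        # schedule the next arrival, then either start service or join the queue
        heapq.heappush(events, (clock + random.expovariate(ARRIVAL_RATE), "arrival"))
        if busy:
            waiting.append(clock)
        else:
            busy = True
            heapq.heappush(events, (clock + random.expovariate(SERVICE_RATE), "departure"))
    else:
        # a call finishes; pull the next waiting caller, if any
        served += 1
        if waiting:
            total_wait += clock - waiting.pop(0)
            heapq.heappush(events, (clock + random.expovariate(SERVICE_RATE), "departure"))
        else:
            busy = False

print(f"calls completed: {served}, average wait per completed call: {total_wait / served:.2f}")
```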

The computer systems of the 60s were predominantly batch systems. Both data and the program were fed to the computer in a batch via punched cards. Source data were taken on forms from which keypunch operators prepared the punched cards. Data Processors developed the programs. The early use of punched cards in manufacturing was predominantly seen in their inclusion in job or order packets for material requisition, labor reporting and job tracking. A mainstay of that period was the classical IBM 1620 [5].

In October 1961 IBM presented the "Gordon Simulator" to Norden (a systems design company). In December 1961 Geoffrey Gordon presented his paper on a General Purpose Systems Simulator (GPSS) at the Fall Joint Computer Conference [1,2]. This new tool was used to design the system for the FAA to distribute weather information to general aviation [2].

IBM provided the software and hardware. The team was able to construct the model, simulate the problem, and obtain answers in only six weeks. A new tool had become available for systems designers. With the success of this tool models began to be produced for outside groups by Norden and a simulation activity was established. Early simulation groups were established at: Boeing, Martin Marietta, Air Force Logistics Command, General Dynamics, Hughes Aircraft, Raytheon, Celanese, Exxon, Southern Railway, and the computer manufacturers were IBM, Control Data, National Cash Register, and UNIVAC [2].

However, the users of GPSS from IBM were concentrating on aspects of computer systems very different from the Norden systems. Geoffrey Gordon’s concept was that the actual designers would use GPSS, but the design engineers preferred to communicate their problems to programmers or a simulation group. The interactions among the GPSS simulation groups occurred through the IBM user’s group conference, SHARE. It was a huge meeting, and those interested in simulation had only one session [2].

Meanwhile, at the Rand Corporation, Harry Markowitz, Bernard Hausner, and Herbert Karr produced a version of SIMSCRIPT in 1962 to simulate their inventory problems. Elsewhere, there were other approaches. In England, J. Buxton and J. Laski developed CSL, the Control and Simulation Language. An early version of SIMULA was developed in Norway by O. Dahl and K. Nygaard, and Don Knuth and J. McNeley produced SOL, a Symbolic Language for General Purpose Systems Simulation. K. D. Tocher wrote a short book, The Art of Simulation [4].

This period was characterized by a large number of simulation language developments and few efforts to coordinate and compare the different approaches. There was also no organized activity to help users get started or provide guidance. The first step to address these limitations was to look at simulation languages, which was done at a workshop on Simulation Languages at Stanford University in March of 1964. Then, at the International Federation for Information Processing (IFIP) Congress in New York in May of 1965, there was a discussion of languages and applications, which in turn led to another workshop at the University of Pennsylvania in March of 1966. One result of this workshop was the realization that a narrower conference on the uses of simulation was needed [4].

In response to these needs, an organizing group was established, composed of members of SHARE, the Joint Users Group of ACM, and the Computer and Systems Science and Cybernetics Groups of IEEE. This group organized the November 1967 Conference on Applications of Simulation using the General Purpose Simulation System (GPSS). Highlights of the conference included a speech by Geoffrey Gordon, who spoke at length on "The Growth of GPSS," and a session on machine interference for GPSS.

Encouraged by this success, the organizing group set out to broaden the conference format, include other languages, and provide a conference digest. In December 1968 a second Conference on the Applications of Simulation was held in New York at the Hotel Roosevelt, with over seven hundred attendees. For that conference, what is today known as SCS became a sponsor, and a 368-page conference digest was published. That conference became the first to address, in great variety, the many aspects of Discrete Event Simulation. A total of 78 papers were presented at twenty-two sessions [4].

The following topics were discussed in the conference [4]:

  1. "Difficulties in convincing Top Management"
  2. Sessions with papers on statistical considerations, random number generation for GPSS/360, languages (SIMSCRIPT II, SIMULA 67, SPURT), a simulation tutorial, and "The Case for FORTRAN: A Minority Viewpoint."
  3. Sessions covering transportation, computer systems, manufacturing applications, reliability and maintainability, graphics and GPSS modifications, simulation and human behavior, distribution systems, communications, urban systems, gaming models, job shops, materials handling, marketing models, languages for modeling computer systems, facility planning models, and simulation and ecology.

In 1969 the third Conference on the Applications of Simulation was held in December in Los Angeles. One sign of becoming established was that both AIIE and TIMS joined as sponsors. Among the new items were GASP and a session on health systems. The fourth and fifth conferences, in 1970 and 1971, were held in New York for the last time. The fourth conference included the first GPSS tutorial, by Tom Schriber. The fifth conference became the first to be titled the WINTER SIMULATION CONFERENCE. The number of tutorials grew, with Alan Pritsker covering GASP II and Yen Chao covering SIMSCRIPT. An education session was added, since many schools were offering courses in both continuous and discrete event simulation. The first SIMSCRIPT tutorial, by Ed Russell, was published in 1976. At the 1977 conference, held in Washington, D.C., two new sessions on agricultural and military systems were added. There was also an increased interest in the internal workings of the languages; one example was an improved events list algorithm presented by Jim Henriksen [3].

Simulation was a topic that was taught to Industrial Engineers in school but rarely applied. Long hours spent at the computer terminal and seemingly endless runs to find an obscure bug in a language were what simulation meant to I.E. graduates in the 70s. When spreadsheet tools were first introduced in the late 1970s they were used only by a "few true believers". The popularity of simulation as a powerful tool increased with the number of conferences and sessions. The number of sessions held on simulation doubled by 1971 and continued to rise to about forty sessions in 1977 and sixty sessions in 1983, as compared to 12 in 1967. A sign of the growing maturity of the field was a panel discussion at Miami in 1978 on the Failures of Simulation, focusing on what can and does go wrong, and a paper on Managing Simulation Projects. In 1979 the conference was held in San Diego, and in 1980 it was held in Orlando. There were more tutorials, and papers were organized into tracks of sessions for beginner, intermediate, and advanced practitioners [3].

Two common fears of simulation in the early 80s were [5]:

  1. Simulation is extremely complicated, so only experts can use it.
  2. Simulation takes forever because of programming and debugging.

However, the number of computerized systems increased from relatively few in the early 1970s to a great many in the late 70s and early 80s. A survey of commercially available production management systems published by CAM-I in 1981 listed 283 different computerized systems, and most of the systems listed in the report cost under $50,000.

The sudden commercial availability of a large number of computerized manufacturing systems was complemented by the emergence of an extensive array of available computer hardware and software, particularly from 1980 on. At the same time, attractive reductions in computer price/performance ratios were fueling a similar explosion of computing applications in engineering design and plant automation [5].

In 1982 most simulation software concentrated on material requirements planning (MRP), which considers only the timing and sizing of orders without regard to capacity limitations. Software had not yet advanced beyond this stage to give true meaning to the automated factory. Hundreds of robots and millions of dollars’ worth of computer-controlled equipment were wasted because they were underutilized and spent their time working on the wrong part due to poor planning. In 1982 personal microcomputers were 16-bit machines with memories on the order of 128K, 256K, or even 512K. Not much software was available to take advantage of the 16-bit microprocessor and the additional memory. In 1983 the number of companies using simulation was small. With the evolution of information systems that could collect and store much of the data necessary to build and maintain models, simulation for production planning became more feasible. The widely used factory management system from CAM-I supported distributed, closed-loop control of shop floor operations and closed-loop communications between planning and operations functions. On installing such a system, many of the problems associated with building and maintaining simulation models were eliminated [5].

With the development of SLAM II by Pritsker and Associates in 1983, simulation software became a powerful tool, popularly used on the IBM PC. SLAM II provided three different modeling approaches [5]:

  1. Network
  2. Discrete event
  3. Continuous

SLAM II also offered the flexibility to use any combination of these approaches in a single simulation model. Its cost was $975.

The late 80s saw the development of SIMAN IV and CINEMA IV, the newest simulation and animation software from Systems Modeling. All code was self-documented, and models of complex systems could be developed entirely within SIMAN using an easy-to-use, menu-driven framework. New interactive capabilities aided in constructing and validating the simulation model. Expanded drawing features, real-time plots, and frequency graphics added to CINEMA’s new capabilities [5].

In 1984 the first simulation language specifically designed for modeling manufacturing systems was developed. In the late 80s, with the development of discrete event simulation models, management was able to assess the cost-benefits of alternatives such as maintenance strategies, equipment conversions, repairs, and capital replacements [5].

In the early 90s, software such as the EMS version of GPSS/PC began to emerge, which allowed users of IBM-compatible personal computers to access additional memory above the 640K limit imposed by the original PC architecture. EXTEND was a Macintosh-based graphical simulation application that supported both discrete event and continuous simulation. MIC-SIM version 3.0 provided modeling capabilities and features that were so easy to learn and use that training and consulting services were no longer needed. GPSS/H was supported by a wide variety of hardware in the industry, from PCs and most Unix workstations to VAX/VMS and IBM mainframe systems. It offered numerous extensions, which prevented users from having to write external code in Fortran or C. MAST provided a single environment for the design, acquisition, and operation of manufacturing systems. It required no programming, no modeling, and not even text editing to study a production system [5].

The power of simulation as a tool became evident in the mid-90s. Challenges were faced by companies like Universal Data Systems (an ultra-modern electronics assembly plant). The hurdle was to convert the entire plant to a hybrid flow-shop, where an individual unit would be sent to the next operation as soon as it was completed at the current operation. One serious reservation about this change was the impact on finished goods inventory. Experiments were carried out using a simulation program written in GPSS/PC (Minuteman) on an IBM PC/AT. The entire program took 30 days to simulate, and the results were positive, with the eventual conversion of the entire plant to a flow-shop environment from the original batch environment [5].

Models were increasingly used to design new plants and to plan the flow of work in these new facilities. The influence of graphics became more marked and a number of vendors used the conference exhibit space to demonstrate the advantages of their system by actually bringing a computer to the conference site. Technology had moved so far that simulation, for those who were skilled in the art, became quicker, cheaper, and much more responsive to the designs of the model constructor [5].

In 1998, software such as Micro Saint version 2.0 for Windows 95 began to stand out. It provided automatic data collection, optimization, and a new Windows interface. In addition, it did not require the ability to write in any programming language. Today, simulation has advanced to such a stage that the software enables the user to model, execute, and animate any manufacturing system at any level of detail. A complex 2000-foot conveyor can be modeled in minutes. Products, equipment, and information are each represented by a single entity associated with four dimensions (X, Y, Z and time) and a definition of its behavior [5].

Advanced versions of simulation software today support the following features [5]:

  • A uniquely structured environment lets the user quickly enter the geometry and production requirements of a model.
  • Expert system technology generates details automatically, while windows and pop-up menus guide the user through the modeling process.
  • Changes can be made quickly and easily, with far less chance of error.
  • Built-in material handling templates make users more productive, so they don’t waste time programming.
  • Users can verify and test designs, answer "what if" questions, explore more alternatives, and catch system glitches with 3-D animation, all before implementation.
  • 3-D graphics are automatically created as the user enters data.
  • Results can be communicated in real-time animation.

This history provides a springboard from which to extrapolate a few predictions for simulation capabilities of the future. The future of simulation may involve integration with other techniques and other software applications. Simulation companies have already merged with other software makers: Pritsker was acquired by Symix, a producer of Enterprise Resource Planning (ERP) software, and Deneb Robotics was acquired by Dassault Systèmes, a maker of 3-D CAD software. Simulation has developed in leaps and bounds since the 90s, and it is predicted that companies not using simulation software in the future may struggle to stay afloat in a competitive world [5].

