A digital computer is a discrete system, so does it follow that a continuous model cannot be simulated exactly on one? It appears that a digital computer can only approximate a continuous model via a discrete simulation. From what I've read this seems to be the case, but I wanted to get some feedback from others on the topic.
I did find this while searching around for further information on this topic:
Continuous simulation is something that can only really be accomplished with an analog computer. Using a digital computer one can approximate a continuous simulation by making the time step of the simulation sufficiently small so there are no transitions within the system between time steps. The premise for a continuous simulation is that there is a continuous time flow and the simulation is stepped in time increments. 1
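To illustrate the quoted point about shrinking the time step, here is a minimal sketch (my own example, not from the cited page): a forward-Euler discrete simulation of the continuous system dx/dt = -x, x(0) = 1. Shrinking dt drives the discrete result toward the exact continuous solution x(t) = e^(-t), but never reaches it exactly, which is the approximation being discussed.

```python
import math

def euler_simulate(dt, t_end=1.0):
    """Forward-Euler approximation of dx/dt = -x with x(0) = 1."""
    steps = round(t_end / dt)
    x = 1.0
    for _ in range(steps):
        x += dt * (-x)  # discrete update standing in for continuous flow
    return x

exact = math.exp(-1.0)  # true continuous value x(1) = e^-1
for dt in (0.1, 0.01, 0.001):
    approx = euler_simulate(dt)
    print(f"dt={dt:<6} x(1) ~ {approx:.6f}  error={abs(approx - exact):.6f}")
```

Each tenfold reduction in dt cuts the error by roughly a factor of ten (Euler's method is first-order), but the error is never zero for any positive dt.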
I also thought this made a good point about approximating via a discrete simulation:
In some systems the state changes all the time, not just at the time of some discrete events. For example, the water level in a reservoir with given in and outflows may change all the time. In such cases "continuous simulation" is more appropriate, although discrete event simulation can serve as an approximation. 2
1 Continuous Simulation - http://www.systems-thinking.org/simulation/contsim.htm
2 Modeling & Simulation - http://home.ubalt.edu/ntsbarsh/simulation/sim.htm
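The reservoir example from the second quote can be sketched the same way: the stored volume changes continuously with the in- and outflows, and a digital computer approximates it by sampling the flow rates at discrete time increments. The flow rates and numbers below are my own illustrative choices, not from the cited page.

```python
import math

def simulate_reservoir(inflow, outflow, volume0, dt, t_end):
    """Step a continuously varying stored volume in discrete time increments.

    inflow/outflow are functions of time returning rates (e.g. m^3/s);
    the state between sample points is simply not represented.
    """
    volume, t = volume0, 0.0
    steps = round(t_end / dt)
    for _ in range(steps):
        t += dt
        # sample the continuously varying net rate once per step
        volume = max(0.0, volume + dt * (inflow(t) - outflow(t)))
    return volume

# time-varying inflow against a constant outflow (illustrative rates)
final = simulate_reservoir(
    inflow=lambda t: 2.0 + math.sin(t),  # m^3/s
    outflow=lambda t: 2.0,               # m^3/s
    volume0=10.0, dt=0.01, t_end=5.0)
print(f"volume after 5 s ~ {final:.3f} m^3")
```

For these particular rates the continuous answer is known in closed form (10 + 1 - cos 5 m^3), so the discrete result can be checked against it; for realistic reservoirs no such closed form exists and the discrete approximation is all you have.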
Thanks for the input.
You may not be able to simulate a continuous system perfectly on a digital computer, but I have two thoughts on the idea of modeling or simulating continuous systems: