Turing machine vs. Von Neumann machine


Background

The Von Neumann architecture describes the stored-program computer, where instructions and data are stored in the same memory and the machine works by changing its internal state, i.e. an instruction operates on some data and modifies it. So there is inherently state maintained in the system.

The Turing machine architecture works by manipulating symbols on a tape, i.e. a tape with an infinite number of slots exists, and at any one point in time the Turing machine's head is over a particular slot. Based on the symbol read at that slot, the machine can change the symbol and move one slot to the left or right. All of this is deterministic.
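To make that concrete, here is a minimal sketch of such a machine in Haskell. This is my own toy encoding: the types, the blank symbol '_', and the example rule table are all invented for illustration.

    -- A toy Turing machine: rules map (state, symbol read) to
    -- (new state, symbol to write, head movement).
    import qualified Data.Map as M

    data Dir = L | R
    type Sym   = Char
    type State = String
    type Rules = M.Map (State, Sym) (State, Sym, Dir)

    -- The tape as a zipper: cells left of the head (nearest first),
    -- the cell under the head, and cells to the right. '_' is the blank.
    data Tape = Tape [Sym] Sym [Sym]

    move :: Dir -> Tape -> Tape
    move L (Tape (l:ls) h rs) = Tape ls l (h:rs)
    move L (Tape []     h rs) = Tape [] '_' (h:rs)
    move R (Tape ls h (r:rs)) = Tape (h:ls) r rs
    move R (Tape ls h [])     = Tape (h:ls) '_' []

    -- Run until no rule applies (the machine halts); like a real
    -- Turing machine, this may also loop forever.
    run :: Rules -> State -> Tape -> Tape
    run rules q (Tape ls h rs) =
      case M.lookup (q, h) rules of
        Nothing         -> Tape ls h rs
        Just (q', w, d) -> run rules q' (move d (Tape ls w rs))

    -- Example: a one-state machine that flips 0s and 1s until a blank.
    flipper :: Rules
    flipper = M.fromList
      [ (("s", '0'), ("s", '1', R))
      , (("s", '1'), ("s", '0', R)) ]

    main :: IO ()
    main = let Tape ls h rs = run flipper "s" (Tape [] '1' "01")
           in putStrLn (reverse ls ++ h : rs)   -- prints "010_"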


Questions

  1. Is there any relation between these two models? Was the Von Neumann model based on or inspired by the Turing model?

  2. Can we say that the Turing model is a superset of the Von Neumann model?

  3. Does functional programming fit into the Turing model? If so, how? I assume functional programming does not lend itself nicely to the Von Neumann model.

7 Answers

9
BEST ANSWER

Turing machines are theoretical concepts invented to explore the domain of computable problems mathematically and to obtain ways of describing these computations.

The Von Neumann architecture is a design for constructing actual computers (which implement what the Turing machine describes theoretically).

Functional programming is based on the lambda calculus, which is another method of describing computations or, more precisely, computable functions. Though it uses a completely different approach, it is just as powerful as the Turing machine (it is said to be Turing-complete).

Every lambda-calculus program (term) T is built from just a combination of the following (a small executable sketch appears after the list):

  • variables like x
  • anonymous functions like λx. T
  • function applications T T
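As an illustration, here is a minimal executable sketch of exactly that grammar with normal-order beta reduction in Haskell. The constructor names are mine, and the naive substitution below ignores variable capture, so it is only safe while bound names are unique:

    -- The full grammar: variables, lambda abstractions, applications.
    data Term
      = Var String        -- variables like x
      | Lam String Term   -- anonymous functions like λx. T
      | App Term Term     -- function applications T T
      deriving Show

    -- Naive substitution: replace free occurrences of x by s.
    -- (Capture-avoiding renaming is omitted for brevity.)
    subst :: String -> Term -> Term -> Term
    subst x s (Var y)   = if x == y then s else Var y
    subst x s (Lam y b) = if x == y then Lam y b else Lam y (subst x s b)
    subst x s (App f a) = App (subst x s f) (subst x s a)

    -- One normal-order (leftmost-outermost) reduction step, if any.
    step :: Term -> Maybe Term
    step (App (Lam x b) a) = Just (subst x a b)   -- beta reduction
    step (App f a) = case step f of
      Just f' -> Just (App f' a)
      Nothing -> App f <$> step a
    step (Lam x b) = Lam x <$> step b
    step (Var _)   = Nothing

    -- Reduce to normal form (may diverge, as computation in general may).
    normalize :: Term -> Term
    normalize t = maybe t normalize (step t)

    main :: IO ()
    main = print (normalize (App (Lam "x" (Var "x")) (Var "y")))
    -- prints: Var "y"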

Despite being stateless, this is sufficient for every computation a computer can do. Turing machines and lambda terms can emulate each other, and a Von Neumann computer can execute both (apart from technical restrictions like providing the unbounded storage a Turing machine could require).

But due to their stateless and more abstract nature, functional programs might be less efficient and less "intuitive" on Von Neumann computers compared to imperative programs, which follow the machine's own style of binary representation, memory, and in-place update.

1

I do not know what historical relationship there is between Turing machines and von Neumann architectures. I am sure, however, that von Neumann was aware of Turing machines when he developed the von Neumann architecture.

As far as computational capability goes, however, Turing machines and von Neumann machines are equivalent. Either one can emulate the other (IIRC, emulating a von Neumann program on a Turing machine is an O(n^6) operation). Functional programming, in the form of the lambda calculus, is also equivalent. In fact, all known computational frameworks at least as powerful as Turing machines are equivalent:

  • Turing machines
  • Lambda calculus (functional programming)
  • von Neumann machines
  • Partial recursive functions

There is no difference in the set of functions that can be computed with any of these models.
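To give a flavor of the last item in the list, here is a small sketch of the mu (minimization) operator, the construct that makes partial recursive functions "partial". The name mu is conventional; this encoding over Haskell's Integer is my own:

    -- mu f returns the least n with f n == 0; if no such n exists,
    -- the search runs forever, which is exactly what "partial" means.
    mu :: (Integer -> Integer) -> Integer
    mu f = head [ n | n <- [0 ..], f n == 0 ]

    main :: IO ()
    main = print (mu (\n -> n * n - 9))   -- least root of n^2 - 9: prints 3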

Functional programming is derived from the lambda calculus, so it doesn't map directly onto either Turing or von Neumann machines. Either of them can run functional programs, however, via emulation. I think the mapping for Turing machines is likely more tedious than the mapping for von Neumann machines, so my answer to the 3rd question would be "no; in fact, it's worse."

2

Generally one refers to the Von Neumann architecture as contrasted with the Harvard architecture. The former has code and data stored in the same way, whereas the latter has separate memory and bus pathways for code and data. All modern desktop PCs are Von Neumann; most microcontrollers are Harvard. Both are examples of real-world designs that attempt to approximate a theoretical Turing machine (exact equivalence is impossible because a true Turing machine has an unbounded tape).
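To illustrate the Von Neumann side of that contrast, here is a toy stored-program machine in Haskell, where the program and its data live in one shared memory map. The opcode encoding is invented for this sketch:

    import Data.Map (Map, (!), insert, fromList)

    -- One shared memory holds both the program and its data.
    type Mem = Map Int Int

    -- Opcodes: 0 HALT, 1 LOAD addr, 2 ADD addr, 3 STORE addr.
    run :: Int -> Int -> Mem -> Mem   -- program counter, accumulator, memory
    run pc acc mem = case mem ! pc of
      0 -> mem
      1 -> run (pc + 2) (mem ! (mem ! (pc + 1))) mem
      2 -> run (pc + 2) (acc + mem ! (mem ! (pc + 1))) mem
      3 -> run (pc + 2) acc (insert (mem ! (pc + 1)) acc mem)
      _ -> error "bad opcode"

    -- Addresses 0-7 hold the code, 8-9 the data: mem[9] := mem[8] + mem[9].
    image :: Mem
    image = fromList (zip [0 ..] [1,8, 2,9, 3,9, 0, 0, 5, 7])

    main :: IO ()
    main = print (run 0 0 image ! 9)   -- prints 12 (5 + 7)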

0

The Turing model defines computational capabilities without going deep into implementation; no one will ever build a computer that literally looks like a Turing machine (except enthusiasts: http://www.youtube.com/watch?v=E3keLeMwfHY).

The Turing model is not an architecture.

Von Neumann is guidance on how to build computers. It says nothing about computational capabilities. Depending on the instruction set, the resulting computer may or may not be Turing-complete (meaning it can solve the same tasks as a Turing machine).

Functional programming (lambda calculus) is another computational model that is Turing-complete but does not fit natively onto the Von Neumann architecture.
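As a small example of such stateless computation, here is a sketch of Church numerals, the classic lambda-calculus encoding of numbers as pure functions (a standard construction; the Haskell names are just conventions):

    {-# LANGUAGE RankNTypes #-}

    -- A Church numeral applies a "successor" function s to a "zero" z
    -- some number of times; numbers are pure functions, no state anywhere.
    type Church = forall a. (a -> a) -> a -> a

    zero, one :: Church
    zero _ z = z
    one  s z = s z

    -- Addition: apply s n times, then m more times.
    plus :: Church -> Church -> Church
    plus m n = \s z -> m s (n s z)

    -- Convert back to an ordinary Integer to inspect the result.
    toInt :: Church -> Integer
    toInt n = n (+ 1) 0

    main :: IO ()
    main = print (toInt (plus one one))   -- prints 2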

0

The Turing "model" is not an architectural model at all. It was just a non-existent machine that Turing hypothesized to serve as the vehicle for his proof of the decision problem.

0

The easy way to understand the difference is this: von Neumann stretched Turing's alpha-machine concept to support more than one algorithm in shared, centralized, unprotected memory. This led away from Alonzo Church's Lambda Calculus and functional programming to RISC instructions, dangerously shared static addressing, the dictatorial superuser, central operating systems, virtual memory, virtual machines, and endless cybercrime.

Instead, the Turing Machine was intended as the Lambda engine of the Lambda Calculus, creating virtual functions (instead of virtual machines). Remember, Alan Turing was Alonzo Church's doctoral student in 1936 and 1937. They intended Alonzo Church's symbolic, functional modularity to encapsulate and protect the single algorithm as a simple binary computer, implementing their Church-Turing Thesis.

Lambda machine code enforces functional programming as a better and more powerful computer using immutable names, object-oriented programs, and Capability-Based Addressing. The binary computer (either Turing's or von Neumann's), when encapsulated this way, becomes a Church-Turing Machine with six additional Church Instructions to programmatically control an application namespace, a thread of execution, secure call and return to subroutine abstractions, programmed functions, and binary objects, as explained in Civilizing Cyberspace: The Fight For Digital Democracy.

2

Having safe, protective hardware that guards the programming environment in the Church-Turing style is really attractive, but what I ask is: how can the industry be transitioned to this idea? Scare tactics won't do it, no matter how hard the red flag is waved.

There is just too much invested in the current designs. There needs to be a commercially attractive way to introduce it while replacing the unsafe architectures, yet still allowing use of current capabilities and the majority of the current software investment. That includes investment in products and in staff training. The replacement has to be able to happen at an almost invisible level in the hardware, if that is possible, and protect the current software with minimal impact.

If that can't be done, I don't think it will ever be implemented widely.