How can one define an absolute notion of computation in a fully strict and rigorous way?
Blass and Gurevich did some work in this direction in "Algorithms: A Quest for Absolute Definitions". But they still resorted to a concrete model of computation (like Turing's, though theirs is more flexible and more abstract), so the question remains: how can computation be defined cleanly, in the spirit of the five-axiom Peano definition of the natural numbers, or why is that impossible?
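For concreteness, here is the kind of definition I mean, in its standard formulation (the five Peano axioms, stated with a successor function $S$):

```latex
\begin{enumerate}
  \item $0 \in \mathbb{N}$.
  \item If $n \in \mathbb{N}$, then $S(n) \in \mathbb{N}$.
  \item For all $n \in \mathbb{N}$, $S(n) \neq 0$.
  \item For all $m, n \in \mathbb{N}$, if $S(m) = S(n)$, then $m = n$.
  \item (Induction) If $K \subseteq \mathbb{N}$ with $0 \in K$ and
        $n \in K \implies S(n) \in K$, then $K = \mathbb{N}$.
\end{enumerate}
```

The appeal of this style is that it characterizes the naturals purely by properties, without committing to any concrete encoding; the question is whether computation admits an analogous property-only characterization.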
Some preliminary considerations:
- the bounded-exploration requirement seems to be the root problem;
- it is worth separating the notion of PPL computation (the one PL designers have in mind) from an absolute notion of computation;
- the fact that our universe (and the multiverse too) is finite, so all physical machines are essentially FSMs, matters (I guess); but this does not imply that building every computation model and theory on an FSM basis is the best way to solve problems, because abstraction is useful.
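To make the FSM point concrete, here is a minimal sketch (my own illustration, not from the paper): any machine with finitely many configurations can be captured as a deterministic finite-state machine. The hypothetical example below builds a DFA that recognizes binary strings whose value is divisible by 3, tracking only the value-so-far modulo 3.

```python
def make_dfa(start, accept, delta):
    """Return a function running the DFA given by (start, accept, delta)."""
    def run(s):
        state = start
        for ch in s:
            state = delta[(state, ch)]  # total transition function
        return state in accept
    return run

# States 0..2 track the value of the bits read so far, modulo 3:
# reading bit b takes remainder r to (2*r + b) mod 3.
delta = {(r, b): (2 * r + int(b)) % 3 for r in range(3) for b in "01"}
div3 = make_dfa(start=0, accept={0}, delta=delta)

print(div3("110"))  # 6 in binary -> True
print(div3("101"))  # 5 in binary -> False
```

The sketch also shows why the FSM view, while physically accurate, is awkward as a foundation: even this trivial property needs a hand-crafted state space, whereas an abstract model expresses it directly.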
--
PL - Programming Language
PPL - Popular Programming Language := a PL with 1000+ users who write pragmatic software.