# Thread: Four-dimensional understanding of quantum computers

1. In the modern view of quantum mechanics, wavefunction collapse is no longer a 'mystical, outside-of-physics' phenomenon, but is seen as a thermodynamical result of interaction with the environment ('einselection') – there is still some concrete unitary evolution behind it. So there should exist a 'Hamiltonian of the Universe' describing the evolution of everything.
We have a similar situation in (classical!) field theories: for Euler–Lagrange equations (like Klein–Gordon: d_tt psi = laplacian psi – m^2 psi) the evolution operator is self-adjoint, so it can be diagonalized (spectral theorem). The evolution along the coordinate of eigenvalue lambda is: d_tt x = lambda x.
So this operator should be non-positive, because otherwise some coordinates would explode.
For negative eigenvalues we get unitary evolution – as in quantum mechanics, we can imagine it as a superposition of different eigenfunctions, 'rotating' with different speeds. This is why such hyperbolic PDEs are called wave-like.
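As a quick numerical check of this spectral picture (a minimal sketch using a discrete 1D Laplacian with Dirichlet boundaries; the grid size is an arbitrary choice):

```python
import numpy as np

# Discrete 1D Laplacian with Dirichlet boundaries on n grid points.
n = 50
L = (np.diag(-2 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1))

lam = np.linalg.eigvalsh(L)  # self-adjoint -> real spectrum

# All eigenvalues are negative, so each eigenmode obeys
# d_tt x = lambda * x with lambda < 0 and just oscillates
# with angular frequency sqrt(-lambda): bounded, wave-like.
print(lam.max() < 0)
omega = np.sqrt(-lam)
```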
We have limited knowledge: we cannot fully trace these unitary evolutions – from our perspective they 'lose their coherence':
- we don't/can't know precise parameters, like initial conditions,
- we cannot fully trace complicated motion (chaos),
- thermodynamically stable states usually have their own dynamics, like atomic orbitals or quantum phases.

If we model such lack of knowledge with a proper statistical ensemble among possible scenarios – maximizing uncertainty not locally as in Brownian motion, but globally – we get thermodynamical convergence to the quantum-mechanical ground-state probability density. These new models also show why, to translate from the amplitude we are working with to a probability, we should take 'the square' ( http://arxiv.org/abs/0910.2724 ).

To understand the strength of quantum computers, it should be enough to focus on models with a constant (fixed) number of particles, for which classical field theory is enough.
We can observe an analogue of the Young double-slit experiment on the surface of water. It can be described in the eigenbasis of the evolution operator, as in the standard view of quantum mechanics – but simulation in this picture requires considering all scenarios/trajectories, so time complexity grows exponentially with the number of slits/qubits ... but we can also simulate it as a differential equation / action optimization, for which time complexity should rather grow polynomially.
What is nonintuitive about such Lagrangian mechanics is that its natural picture is 'static 4D' – particles are not just 'moving points', but rather their whole trajectories in spacetime ... let's look at what this gives us in terms of computational capabilities.

A quantum algorithm usually looks like:
- initialize qubits,
- use Hadamard gates to get a superposition of all possible inputs,
- calculate a classical function of the input,
- extract some information from the superposition of results.
Looking at the classical function calculation: it has to use reversible gates, like
(x,y,z) -> (x, y, z XOR f(x,y))
These are also reversible classically, so we can easily reverse the whole function calculation on a standard computer.
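The self-inverse property of such a gate can be checked in a few lines (a minimal sketch; taking f as AND is an arbitrary choice that makes this the Toffoli gate):

```python
# Reversible "oracle" gate (x, y, z) -> (x, y, z XOR f(x, y)).
# f can be any classical Boolean function; AND gives the Toffoli gate.
def f(x, y):
    return x & y

def gate(x, y, z):
    return x, y, z ^ f(x, y)

# XOR-ing the same value twice cancels out, so the gate is its own
# inverse -- applying it twice restores any input:
for x in (0, 1):
    for y in (0, 1):
        for z in (0, 1):
            assert gate(*gate(x, y, z)) == (x, y, z)
```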
Unfortunately it's not so simple – there is a problem: such reversible calculations usually require quite a large number of auxiliary (qu)bits, which had to be initialized to zero.
When taking the classical reverse of such a function, we rather cannot ensure that these auxiliary (qu)bits end up as zeros – they would usually just be random – so we haven't really calculated what we wanted.
If we could, for example, calculate the square of a number modulo N, or the product of two numbers, using a 'small' number of auxiliary bits, we could guess their final value (e.g. randomly) and in a small number of trials we would be able to reverse such a function (getting all zeros), which would allow us to factorize N – so probably even simple multiplication requires a linear number of auxiliary bits.
The strength of quantum computers is that they can 'mount qubit trajectories' in both past and future – simultaneously initializing the auxiliary qubits and, using measurement, focusing only on scenarios having the same final value (the measured one).
In the case of Shor's algorithm, we wouldn't even need to know all the scenarios to perform the Fourier transform – knowing two would already be enough: if two such powers give the same value modulo N, raising y to their difference gives 1 modulo N.
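The classical core of this period-finding step can be sketched as follows (toy numbers N=15, y=7; only the period search itself is what the quantum subroutine speeds up):

```python
from math import gcd

# Classical illustration of the period-finding step of Shor's algorithm.
N, y = 15, 7

# Find the period r of f(a) = y^a mod N by brute force:
r = 1
while pow(y, r, N) != 1:
    r += 1
print(r)  # 4

# Two exponents with the same value mod N differ by a multiple of r;
# an even period then gives nontrivial factors of N via gcd:
assert r % 2 == 0
p = gcd(pow(y, r // 2) - 1, N)
q = gcd(pow(y, r // 2) + 1, N)
print(p, q)  # 3 5
```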
On page 18 of my presentation there is a diagram for Shor's algorithm: https://docs.google.com/fileview?id=...MWVkOTJk&hl=en

For physics it's natural to find the global minimum of action, but simulating such a computer in classical field theory, even after simplifications, would probably still be difficult. Anyway, it suggests that to attack algorithmically difficult problems, it could be useful to translate them into continuous ones.
For example, in the 3SAT problem we have to assign values to variables to fulfill all alternatives (ORs) of triples of these variables or their negations – note that x OR y can be turned into minimizing
((x-1)^2+y^2)((x-1)^2+(y-1)^2)(x^2+(y-1)^2)
and analogously a product of seven terms for an alternative of three variables. Finding the global minimum (zero) of the sum of such polynomials over all clauses would solve our problem.
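A minimal sketch checking that this polynomial vanishes exactly on the satisfying assignments of x OR y:

```python
# The clause "x OR y" as a polynomial: one factor per satisfying
# assignment, each vanishing exactly at that assignment.
def or_poly(x, y):
    return (((x - 1)**2 + y**2)          # zero at (1, 0)
            * ((x - 1)**2 + (y - 1)**2)  # zero at (1, 1)
            * (x**2 + (y - 1)**2))       # zero at (0, 1)

for x in (0, 1):
    for y in (0, 1):
        print(x, y, or_poly(x, y))
# Only the unsatisfying assignment (0, 0) gives a positive value (2);
# minimizing a sum of such polynomials over all clauses to zero
# therefore solves the instance.
```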
I've just found information suggesting this has been done successfully for a few years – enforcing that there is only one minimum, so that the local gradient shows the way to the solution: http://en.wikipedia.org/wiki/Cooperative_optimization

What do you think about it?

2.

3. QM is a mess, that is all I will comment on.

Also, people tend not to read super-long posts like that. Try to shorten it and slowly expand it through conversation.

4. Quantum mechanics is a mess because it mixes, in an uncontrolled way, the unavoidable imperfect human statistical description with some 'fundamental quantum phase' representing the real process behind it ...

We have 'wave equations' also on the surface of water – like the Young experiment analogue.
We could describe it by working in a kind of eigenbasis – as in standard simulations of quantum computers – but that requires 'considering all paths/possibilities': time complexity grows exponentially with the number of slits/qubits, and so it doesn't have a chance of being practical.

... but we can also describe field theories / hyperbolic / wave-like PDEs as just differential equations / action optimization ... and in this way time complexity grows rather polynomially ... so it could lead to practical algorithms ... (?)

What we need, to practically find reverses of functions, is to simulate trajectories fixed on both ends, for which we have some continuous processes corresponding to reversible gates.
If we had that, we would just fix zeros on the auxiliary (qu)bits on one side and, on the other side, the value whose preimage we need ...

5. After 7 years I have finally written it down - page 9 of https://arxiv.org/pdf/0910.2724v2.pdf

The main part of this paper is MERW ( https://en.wikipedia.org/wiki/Maxima...py_Random_Walk ), showing why standard diffusion has failed (e.g. predicting that a semiconductor is a conductor): it has used only an approximation of the (Jaynes) principle of maximum entropy (required by statistical physics). Using the real entropy maximum (MERW), the discrepancy disappears – e.g. its stationary probability distribution is exactly that of the quantum ground state.
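This stationary distribution can be checked numerically on a small graph: it equals the squared dominant eigenvector of the adjacency matrix, the same 'square of amplitude' as in the quantum ground state (a minimal sketch; the 4-node path graph is an arbitrary toy choice):

```python
import numpy as np

# Adjacency matrix of a 4-node path graph: 0-1-2-3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

w, V = np.linalg.eigh(A)
lam = w[-1]                      # dominant eigenvalue
psi = np.abs(V[:, -1])           # dominant eigenvector (Perron–Frobenius)

# MERW transition matrix S_ij = (A_ij / lam) * (psi_j / psi_i);
# its stationary distribution is psi_i**2:
S = (A / lam) * (psi[None, :] / psi[:, None])
stationary = psi**2
print(stationary.sum())          # psi is unit-norm, so this sums to 1
```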

Besides showing that Shor uses future-past causality (below), it e.g. shows where the Born rule comes from and that it is sufficient to violate Bell inequalities.

Schematic diagram of the quantum subroutine of Shor's algorithm for finding prime factors of a natural number N. For a random natural number y<N, it searches for the period r of f(a) = y^a mod N; this period can be concluded from the measurement of a value c after the Quantum Fourier Transform (QFT), and with some large probability (O(1)) it allows finding a nontrivial factor of N. The Hadamard gates produce a state that is a superposition of all possible values of a. Then the classical function f(a) is applied, giving a superposition of |a>|f(a)>. Due to the necessary reversibility of the applied operations, this calculation of f(a) requires auxiliary qubits, initially prepared as |0>. Now measuring the value of f(a) returns some random value m, and restricts the original superposition to only those a fulfilling f(a)=m. Mathematics ensures that the set {a: f(a)=m} has to be periodic here (y^r ≡ 1 mod N); this period r is concluded from the value after the Fourier transform (QFT). Seeing the above process as a situation in 4D spacetime, qubits become trajectories: state preparation mounts their values (chosen) in the past direction, measurement mounts their values (random) in the future direction. The superiority of this quantum subroutine comes from the future-past propagation of information (tension) caused by restricting the original ensemble in the first measurement.

6. Holy f*ck....

How does it work?

A regular computer is 1D – it is simply a line that bounces between yes and no, 0 and 1 .. so what makes this 4D? The 2D I understand: 00/01/10/11 – now how does it go from there? And what are those trajectory mounts? I have hypothesized something that turns a crystal transparent when electrons bounce on both sides at the same time, but not when they don't arrive at exactly the same time (attosecond timeframe).

But besides physical and dielectrical interaction, I don't understand how you deal with the directional information? You would need sensors that would make a quantum computer slower.

7. I wanted to express that Shor exploits the fact that we live in a 4D spacetime: by influencing this computation (a "circuit" of reversible operations) from both the past (initialization) and future (measurement) directions.
The first measurement restricts the original ensemble of all possibilities (after the Hadamard gates) to only those having the same outcome in that first measurement (a random value).
Then the second measurement extracts information from this restricted ensemble.

What is surprising is the causality of this restriction – this "tension" first kind of goes backward in time, as emphasized in the diagram.
