From an epistemological point of view these two statements seem to contradict each other: for if microphysics can be described consistently by mathematics, why not by an equally logical framework of fundamental notions and concepts? To this objection conventional reasoning usually replies that the notions we employ are classical, while the notions of microphysics are far from classical indeed. So there is no way to describe our experiences in microphysics by a logically consistent framework of classical concepts.

This result is quite acceptable since, after all, the classical concepts were derived from our experiences in a macrophysical environment. And that these experiences cannot be applied to the microphysical regime seems quite in order. On this basis it is commonly argued that all the detected inconsistencies, or even contradictions, arise from an unreflective application of classical concepts.

But even if this were the case, and we are forced to accept the peculiarities of non-locality, of individual particles being in fact ensembles of particles, of physical processes occurring acausally, waves without energy, measurements without any interference with the measured objects, infinities not being infinite, or spreading wave packets not really spreading, the question seems legitimate whether all these effects occur in reality, or rather only in some logical representation by a theory which at once employs and contradicts classical physics.

One could suspect, in such a case, that some fundamental misunderstanding prevails in current concepts, and seemingly not without reason. The real problem may therefore not be to provide yet another ingenious interpretation of quantum mechanics, but to check whether the axioms employed do not in fact follow from a physical basis. So the question is not whether these axioms could tell us more about microphysics, but rather where these fundamental axioms come from: whether they are really axioms, or just results and expressions of some original basis in physics. There are quite a few historical examples showing that such an analysis may lead to unsuspected scientific progress.

In case such a basis cannot be found, the attempt will still tell us something new about current frameworks: the axioms are then not axioms merely because everybody believes they are, but because they cannot be referred to a deeper layer of understanding.

It is the essence of such an undertaking that it does not take the current beliefs or facts for granted: it is rather based on the philosophical outlook that some part of what is currently accepted is always due to interpretation and not to the experimental data at hand. How significant this part is can only be determined a posteriori, and both extremes in this case are dangerous: neither does everything we believe we know depend exclusively on our interpretations, nor does everything depend on pure facts alone. It is probably one of the most difficult questions in any given situation to determine the exact composition of these two elements, and even epistemology is undecided in this matter.

The basic situation, today, is again a little different: we do have a theory which describes what we measure, it just does not make sense. One could call that a prerevolutionary situation, since it is hardly conceivable that humans will stick ad infinitum to a concept which they do not understand: they will either become bored, since thinking is only fun when it gets you somewhere; or they will simply ignore what they do not understand and make up their own private version of how it *really* is. In both cases innovation will cease.

So it might be interesting to look a little closer at the attempted modifications, or at theories which proposed a different angle of attack. There are generally three different categories of approaches to microphysics: a formal approach, concerned with the interpretation of the existing axioms; a physical approach, which seeks to base the theory on physical concepts; and the attempt to develop a complete set of rules for microphysics from physical assumptions alone.

Basically all the existing theories use the formal approach, which is, as mentioned at the beginning, a rather peculiar fact. These theories are mainly concerned with the interpretation of axioms. That the third possibility seems to be reserved for lunatics is not surprising: it well surpasses even the best theoretical skills to develop a complete set of rules for microphysics which at once includes all our measurements and is based on physical assumptions.

The problem of the second approach seems to be that it must be based on physical concepts. And since the only concepts of matter in classical physics are particles or waves, the few existing approaches generally use one or the other, or a combination of both. De Broglie saw quantum waves as real waves, guiding a singularity in random motion within this wave, which was identified as the particle. He never formulated his theory consistently, so it remains rather unclear, although Selleri and others have tried to render the ideas more precise. So far, these approaches have always met the same objection: they are not compatible with quantum theory.

The objects we seem to know best, since that is what we became acquainted with in our first physics courses, are *particles*: objects without extension, point-like, having properties like mass, charge or velocity, and moving in a world where they interact with other objects of the same type.

As long as the distance between these objects is vast, the simplification is not really a problem. For this reason the *Newtonian* approach to physics works comparatively well in astrophysics and also in gas kinetics. But as soon as the structure of these objects has a size comparable to the distance between them, the simplification is no longer a reasonable one, and that is where the real problems begin. In the history of physics, this point is marked by the experimental ability to manipulate the structure of matter itself. It is exactly with the beginning of atomic physics that our classical picture of particles and interactions became obsolete.

Quite frequently the fact that all fundamental concepts of quantum mechanics are *mechanical* concepts is regarded as just a natural choice since, after all, mechanics is the foundation of physics. But considering that we also know about the wave properties of matter, theoretically expressed by de Broglie's relations and experimentally verified by diffraction experiments, this choice seems far from natural and rather arbitrary.
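For reference, de Broglie's relations connect the mechanical qualities of a particle, energy and momentum, to the wave qualities frequency and wavelength (stated here in their standard textbook form):

```latex
% de Broglie relations: mechanical vs. wave qualities
E = h\nu = \hbar\omega, \qquad p = \frac{h}{\lambda} = \hbar k
```

It is the second relation that is tested directly in diffraction experiments, where the measured interference pattern yields the wavelength associated with particles of known momentum.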

The reason for it is basically historical: the outline of a theory of quantum mechanics (the old quantum mechanics) was already formulated in terms of Sommerfeld's phase integrals, Bohr's model of the hydrogen atom, and Heisenberg's matrix mechanics, when the wave properties of matter were first proposed and later detected. The formulation of quantum theory in terms of wave properties by Schrödinger was not, as frequently believed, an extension of the existing theory, but aimed at providing a local and physical basis to the numerical scheme already existing. That this basis is equally not physical has been proved by the spreading of a wave packet (so the wave function could not be a physical entity) and, more than fifty years later, by Bell's inequalities and Aspect's measurements (so there cannot even be a question of local and realistic theories). These results only justified a statistical interpretation of the wave function and excluded the consideration of fundamental processes. As a consequence the physical notions of particles and waves were both retained, but not in any physical, only in a symbolical sense.

Now this is also not hard to accept. That in microphysics we are not in a very good position to apply our Newtonian concepts is evident if dimensions are considered. And that the classical concept of infinite waves might also not work too well, since atomic structures are of limited extension, is also quite easy to understand. But what about waves of limited extension, what about *wavelets*?

The uncertainty relations, if interpreted as a physical axiom, prohibit any local and realistic interpretation, because their immediate consequence is a spreading of wave packets: these waves therefore cannot be physical. But since the relations are a cornerstone of quantum theory, they must still, in a sense, be valid.
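The spreading referred to here can be made explicit for the standard textbook case of a free, non-relativistic particle of mass m described by a Gaussian wave packet of initial width \sigma_0:

```latex
% Minimum-uncertainty Gaussian packet: \Delta x\,\Delta p = \hbar/2 at t = 0,
% but the width grows without bound as the packet evolves freely:
\sigma(t) = \sigma_0\,\sqrt{1 + \left(\frac{\hbar t}{2 m \sigma_0^{2}}\right)^{2}}
```

The sharper the initial localization (small \sigma_0), the faster the packet disperses, so a wave packet taken as a physical object would spread over all space, which is the contradiction at issue.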

The dispersion relations of massive particles present the same problem, since they lead to the following logical circle: if the phase velocity does not equal the mechanical velocity, a free particle cannot consist of a single wave of specific frequency. If it does not consist of a single wave, then the wave features of a particle must be formalized as a Fourier integral over infinitely many partial waves. In this case a partial wave cannot be interpreted as a physical wave. If the interpretation as a physical wave is not justified, then internal wave features cannot be related to physical qualities. If they cannot be related to physical qualities, then internal processes must remain unconsidered. And in this case one is back at the original problem.
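The mismatch at the start of this circle follows directly from the non-relativistic dispersion relation of a free massive particle, stated here for reference:

```latex
\omega(k) = \frac{\hbar k^{2}}{2m}
\;\Rightarrow\;
v_{\mathrm{phase}} = \frac{\omega}{k} = \frac{\hbar k}{2m} = \frac{v}{2},
\qquad
v_{\mathrm{group}} = \frac{d\omega}{dk} = \frac{\hbar k}{m} = v
```

Only the group velocity of a superposition reproduces the mechanical velocity v = p/m; a single monochromatic wave travels at half that speed, which forces the Fourier integral over partial waves.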

If electrons are interpreted as wavelets, the problem of electrostatic interactions must be accounted for. In case no solution is found, the electron in motion cannot be interpreted as a wave-type structure, because the inevitable self-interactions must lead to acceleration effects bound to destroy the original wave structure. In this case the interpretation is not even self-consistent.

There exist several experimental loopholes in these experiments, which restrict their validity. But assuming that even more precise experiments yield the same result, i.e. a violation of Bell's inequalities and thus a contradiction with any local and realistic framework, the problem must be solved from a theoretical rather than an experimental angle. If it cannot be established that these measurements and their interpretation contain a fundamental contradiction, they must be seen as a valid proof against any realistic modification of microphysics.

There is some risk to that business, and so I'd like to add a warning sign, which was again set up by Richard Feynman:

There was a time when the newspapers said that only twelve men understood the theory of relativity. I do not believe that there ever was such a time. ... On the other hand, I think it is safe to say that no one understands quantum mechanics. ... Do not keep saying to yourself, if you can possibly avoid it, `But how can it be like that?', because you will get `down the drain' into a blind alley from which nobody has yet escaped. Nobody knows how it can be like that.

Maybe that is not the end of the story, though, and there still is hope. At least if the feeling of Isidor Rabi is not without reason:

The problem is that the theory is too strong, too compelling. I feel we are missing a basic point. The next generation, as soon as they will have found that point, will knock on their heads and say: How could they have missed that?

If all the objects in your theory are mathematical objects - and this applies to wavefunctions, spins, matrices, or operators - then there is no guarantee that what you axiomatise in your mathematical axioms actually has a bearing on what goes on in real space among real physical objects. I have written a paper about exactly this problem, and it shows that such a theory cannot make any predictions, because it cannot explain what actually happens. The paper is here. It also explains why nobody understands quantum mechanics. A theory which creates physical objects out of mathematical ones does not allow you to keep track of events in real space and is therefore in principle unintelligible. Today, I would close this chapter with a single sentence.

The only way to avoid this problem is to construct a theoretical framework which contains physical objects throughout, and to show how atomic physics can be described by a theoretical model on this basis. Which is, of course, what I have tried to do in the framework of Microdynamics.

Nature consists of physical objects in real, three-dimensional space, and one cannot create physical objects in this space from mathematical objects: quantum mechanics, therefore, is nonsense.