
Testing time for modern physics
PHYSICS IS BACK in the news, with daily press
stories of astronomical discoveries and the completion of the $8 billion
Large Hadron Collider (LHC) at the CERN laboratory, a collaboration between 20
European nations. New Scientist magazine sells 100,000 copies in Britain
each week, and the numbers applying for physics courses in British
universities have increased for the first time in many years. So, what
is the state of physics today? GEOFF JONES reports.
MARXISTS HAVE ALWAYS welcomed scientific progress.
Karl Marx and Friedrich Engels had the keenest interest in the
development of science in their day, and aimed to use the scientific
method to analyse society. ‘Truth is concrete’ was their slogan. Marx
spent years of his life in the British Museum poring over government
statistics to test his models of capitalist development. Engels kept up
with the latest scientific discoveries and interpreted them in terms of
Marxist theory.
Today, Marxists have an easier task in some ways.
There are shelves full of popular books and articles on physics. But
there is often a problem sorting out what is correct from what is plain
wrong and, after that, what is readable from what is not. Of all the
books on the foundations of modern physics, two recent publications,
Brian Greene’s The Fabric of the Cosmos, and Lee Smolin’s The Trouble
with Physics, stand out as brilliant and readable (though difficult)
complementary reports on the state of physics at the beginning of the
21st century.
Before we can talk about physics, we have to be sure
we understand how physics works. Or, more generally, how do we look at
things scientifically? First and obviously, we start from the evidence
of our senses, using instruments as extensions of our senses, things we
can measure. Second, we try to make a pattern of that evidence – the
first patterns that people recognised were most likely the rising and
setting of the sun and the progress of seasons. Out of that pattern
comes an explanation, a theory, although this may not be a scientific
theory in our sense (eg the idea that the sun is a flaming chariot
driven across the heavens once a day). Science, as 16th and 17th century
scientists realised, involved using that theory to expand our knowledge
– to predict the consequences of some new, untried experiment. A good
example is the well-known tale of Galileo dropping two spheres of
different mass from the top of the tower of Pisa to see if they hit the
ground at the same time. If the result is as predicted, the theory is
strengthened. If not, it has to be re-examined. At bottom, the progress
of science is a continual debate between theory and experiment.
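Galileo's prediction can be captured in a line of school algebra (a sketch in modern notation, using Newton's later laws rather than anything Galileo himself wrote). For a sphere of mass m falling from height h:

\[ F = mg = ma \;\Rightarrow\; a = g, \qquad t = \sqrt{\frac{2h}{g}} \]

The mass cancels, so the fall time t depends only on the height and the gravitational acceleration g: both spheres should land together.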
No theory is perfect. There will always be ‘rogue’
observations which do not seem to fit (and ‘rogue’ scientists who do not
accept it). But, in general, if the vast majority of existing
experimental evidence can be explained and, most important, if the
theory can be used to expand our knowledge, that theory is generally
taken as the best approximation available at the time. At its strongest,
a theory can crystallise into a generally accepted model of reality which
scientists ‘take as read’. At any particular time most scientists work
within the bounds of that model and apply it. For example, physicists
devising new lasers base their work on theories of solid state
physics developed in the 1950s and 1960s.
Scientific revolution
SOME ‘SOCIOLOGISTS’ OF science see such a consensus
as a reactionary structure where the ‘priests’ at the top impart the
‘right’ message to the faithful below. But this is incorrect. In
principle, any theory, no matter how well rooted, is always open to
challenge. Odd results that do not square with theory almost always
occur. Mostly these can be ‘put aside’ for study at a later time, or can
be used to ‘tweak’ the theory to make it fit.
But after a time the tweaks pile up. By analogy, you
could build a simple rectangular garden shed. Over the years, as garden
jobs multiply, bits are added on for tool stores, a hole for
a stovepipe is knocked in the roof, felt is tacked over one corner to
keep the rain out, and a compost bin is fastened to one side. The shed has
become a mess. In the same way, a theory with too many odd bits added to
accommodate strange results looks a mess.
At some point an alternative is proposed – in our
analogy, knocking down the old shed and building a new one. The proposer
has to justify the new picture by showing how it simplifies and expands
the explanations given by the old theory.
That process is not simple or easy, and may well
involve conflict with established groups in society (as Galileo
discovered when he challenged the prevailing orthodoxy of a universe
centred around the earth). It may require courage and soul searching for
one or many scientists. It is not going too far to describe the
establishment of a new theory as a scientific revolution. And like a
political revolution it may be long drawn out, untidy and end in a way
unexpected by the people who started it.
We saw such a revolution in the first third of the
20th century. At the end of the 19th century, physicists had good reason
to be pleased with themselves. As they saw it, most of the basic rules
of what we now call ‘classical physics’ had been well established, apart
from a few anomalies. But within a few years those anomalies produced a
revolution in the physicist’s view of the world. That revolution looked
in two separate directions: to the world of the very large and to the
world of the very small.
Relativity & quantum mechanics
FOR TWO CENTURIES the mechanical motion of matter
had been described by the dynamical equations first formulated by Isaac
Newton. These equations describe bodies – snooker balls, planets, stars,
etc – moving in well defined paths against a background of infinite
space which formed an unchanging framework to which all measurements of
position and time could be referred.
In 1905, Albert Einstein, studying the propagation
of light, put forward the revolutionary idea that such a unique
unchanging background could not exist – that measurements of time and
distance were dependent on the motion of the person or machine making
the measurements. Time and space each have an objective existence but
only together can they be considered as an absolute entity – unitary
space-time (at least within our universe). This came to be known as the
principle of relativity, although Einstein did not like the phrase,
believing (correctly) that it would lead to much sloppy thinking. Later,
Einstein extended his theory to solve one of the problems Newton left
unanswered: what were gravitational forces? In the general theory of
relativity, mass has the property of being able to distort space-time,
hence producing what we experience as gravitational attraction.
Einstein’s theories were almost as violently contested as Galileo’s but
the weight of experimental evidence accumulated over the last half
century has made the Einstein picture of the universe generally
accepted.
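The dependence Einstein found is quantitative (the standard special relativity formula, quoted here for illustration): a clock moving at speed v relative to an observer runs slow by the factor

\[ \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} \]

where c is the speed of light. At everyday speeds, γ is indistinguishable from 1, which is why Newton's picture works so well; at speeds approaching c, the corrections become enormous.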
At the same time as Einstein was developing his
theories, new experimental techniques, such as the ability to produce
high vacuums, made it possible to study the structure of atoms,
previously thought to be the indivisible building blocks of nature. In
1897, JJ Thomson demonstrated the existence of electrons – tiny charged
particles ejected from atoms. In 1911, using experimental apparatus
which was a distant (and very much cheaper) ancestor of the Large Hadron
Collider, Ernest Rutherford and his group in Manchester found the
astounding result that an atom, rather than being a solid lump of matter
with electrons scattered inside, consisted mostly of empty space with
nearly all its mass concentrated in a tiny nucleus. There was no way
this result could be squared with theories of classical mechanics and
electromagnetism. It formed the starting point for a whole new view of
the world: quantum mechanics.
The theory of quantum mechanics was developed in the
mid-1920s, primarily by Werner Heisenberg (a German), Erwin Schrödinger (an
Austrian) and Paul Dirac (an Englishman). It replaced the classical
picture of well-defined particles moving in well-defined orbits with a
much fuzzier one. Heisenberg’s well-known uncertainty principle stated
that, at any one moment, it was impossible to measure precisely both the
position and speed of a particle. More than that, in an experiment where
one of a number of possible outcomes could occur, it seemed that quantum
mechanics could only predict the probability of each one, rather than
saying definitely that one rather than another would be seen. Even simple
experiments produced results that made no sense in classical terms but
could be predicted perfectly using quantum mechanics. For example, a
school experiment uses a simple TV cathode ray tube. A stream of
electrons gives a pattern of light on the CRT screen after passing
through a screen with two parallel slits pierced in it. The result – a
pattern of light and dark bands – is inexplicable in classical terms.
However, quantum mechanics gives a perfect prediction of the result, but
implies that either each electron somehow travelled through both slits
at the same time or, if, as common sense suggests, the electron
travelled through only one slit, it must somehow ‘know’ that the other
slit exists! So there was no way in which the classical picture of the
electron as a tiny snooker ball could correspond to reality. And what
applied to electrons must surely apply to atoms and to bodies made out
of atoms!
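The difference between the two pictures can be made concrete in a few lines of numerical code. The sketch below is ours, not from either book, and the slit dimensions are made up for illustration: quantum mechanics adds the two waves’ amplitudes and then squares, while the ‘snooker ball’ picture adds the two probabilities. Only the first produces bands.

```python
import numpy as np

# A minimal two-slit sketch with illustrative (made-up) dimensions.
wavelength = 1e-10   # de Broglie wavelength of the electrons, metres
slit_sep = 1e-6      # separation of the two slits, metres
screen_dist = 1.0    # distance from the slits to the screen, metres

x = np.linspace(-5e-3, 5e-3, 2001)  # positions across the screen, metres
k = 2 * np.pi / wavelength          # wavenumber of the electron wave

# Path length from each slit to each point on the screen.
r1 = np.sqrt(screen_dist**2 + (x - slit_sep / 2) ** 2)
r2 = np.sqrt(screen_dist**2 + (x + slit_sep / 2) ** 2)

# Quantum mechanics: add the two amplitudes, THEN square.
# The cross-term between the waves produces light and dark bands.
quantum = np.abs(np.exp(1j * k * r1) + np.exp(1j * k * r2)) ** 2

# 'Snooker ball' picture: each electron passes through one slit,
# so the two probabilities add directly - a smooth patch, no bands.
classical = np.abs(np.exp(1j * k * r1)) ** 2 + np.abs(np.exp(1j * k * r2)) ** 2

print(quantum.min(), quantum.max())      # close to 0 and 4: banded
print(classical.min(), classical.max())  # both exactly 2: uniform
```

Experiments agree with the first calculation, not the second – which is exactly the puzzle described above.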
Challenges ahead
FROM THE 1930s, the problem of what this implied for
the meaning of ‘reality’ worried leading physicists (especially
Einstein). But when it came to analysing and predicting the behaviour of
atoms and molecules, quantum mechanics proved astoundingly successful.
The attitude of most physicists was (and is): ‘What the hell, it works’.
Quantum mechanics, despite its bizarre implications,
is arguably the most successful theory in the history of physics. The
whole of today’s electronics, communications, IT and computing
technology has been a result of its application. What is more, in the
last 70 years, no experimental evidence has clashed with its
predictions. In fact, quite the reverse. In the 1930s, Einstein and two
co-workers, Nathan Rosen and Boris Podolsky, thought up an experiment
which would demonstrate that either quantum mechanics was incorrect or
that two particles separated by kilometres, or even light years, would
have to be instantaneously connected together in some inexplicable way.
In the 1980s, technological developments made it possible actually to
carry out that experiment. The quantum mechanical prediction was
vindicated, throwing into even higher relief the question of what
‘reality’ means.
But quantum mechanics made it possible for
physicists to dig deeper into the structure of matter. Forty years of
work has established what we know as the standard model of fundamental
particle physics: the existence of two families of particles (quarks and
leptons) interacting through three of the four known forces
(electromagnetic, strong and weak – gravitation remains outside the
model) which form the basis of all matter as we know it.
This is the situation at the beginning of the 21st
century. Relativity theory and quantum mechanics have separately
revolutionised the way we see the universe. But it is not possible to
mesh the two theories. In fact, applying one to the other produces
meaningless results. Unifying the two theories, producing one
overarching theory which accommodates both, remains an unachieved goal.
The two books, The Fabric of the Cosmos, and The Trouble with Physics,
encapsulate the triumphs of physics in the 20th century and the
challenge lying before it.
Explaining everything?
THE FABRIC OF the Cosmos gives a brilliant sketch of
the present state of fundamental physical theory. Brian Greene starts by
discussing relativity theory and its implications for our view of space
and time. This analysis cannot be bettered for the general reader. He
goes on to outline the strange picture of reality given by quantum
mechanics and describes what has appeared over the past 20 years to be
the best candidate for a theory to unify quantum mechanics and
relativity: superstring theory.
In the standard model, fundamental particles are
seen as ‘point particles’ with no spatial dimensions. Superstring theory
postulates that these ‘particles’ are in fact made up of tiny
one-dimensional ‘strings’. The various ways in which these strings can
vibrate (like the different vibrations of the strings on a guitar)
account for the properties of the particles that we observe and the
forces between them. Unlike earlier theories, this picture includes
gravitational forces and meshes together general relativity and quantum
theory.
But string theory faced – and still faces – major
problems. First was the complexity of the mathematics involved and the
fact that the strings had to vibrate not merely in our four space-time
dimensions but in a ten or eleven dimensional space-time (depending on
the version of the theory) where the extra spatial dimensions are
‘curled up’ on themselves so that we do not
observe them. Second was the fact that there appear to be a very large
number of alternative solutions to the equations of string theory. Third
and most important, at our present stage of technological development,
it is impossible to test the theory.
To explain the last point, remember that the only
way we can investigate the constituents of matter is by probing it in
some way. Rutherford investigated the atom by shooting helium nuclei at
a foil of gold atoms and seeing how they were deflected. To probe the
atomic nucleus, particles of much higher energies are needed. In the
1940s and 1950s, cyclotrons or synchrotrons were built which accelerated
electrons or protons to high energies before they smashed into atomic
nuclei. The fragments resulting from such collisions enabled the
identification and measurement of what we refer to as fundamental
particles. At higher energies still, the structures of protons and
neutrons could be investigated, giving rise to our present standard
model. (The LHC is the latest generation of such particle accelerators.)
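The rule of thumb behind ever-bigger machines can be put in one formula (a standard textbook relation, not a quotation from either book): a probe of momentum p behaves as a wave of de Broglie wavelength

\[ \lambda = \frac{h}{p} \]

so resolving smaller structures demands higher momenta, and hence higher energies. Strings are expected to be near the Planck length, around \(10^{-35}\) m, which would require probe energies of roughly \(10^{19}\) GeV – some fifteen orders of magnitude beyond the \(10^{4}\) GeV scale of the LHC.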
But to investigate superstring theory, accelerators
are needed which produce particles with energies far, far in excess of
what is technologically imaginable. Nevertheless, Greene expresses well
the string theorists’ enthusiastic conviction that superstring theory was
indeed the true overarching solution for all questions of the
fundamental nature of matter.
Five great problems
COMPARED WITH GREENE’S book, Lee Smolin’s The
Trouble with Physics gives a cool (and much less mathematically
challenging) look at the state of physics at this moment. Although he
worked on string theory, Smolin has come to the conclusion that an
overwhelming emphasis on that approach has led to an impasse, which has
meant that, over the last 20 years, there has been no significant
development in theoretical physics comparable to that which took place
in the first 30 years of the 20th century. Where Greene is happy to
accept, and almost glories in, the bizarre way in which the
development of quantum theory has brought into question the whole
definition of reality as we understand it, Smolin sees this as a major
problem.
Smolin identifies ‘five great problems’ facing
physics at this time which he believes require a completely different
theory, or even a different sort of theory, to solve.
Two of these problems – the problem of combining
general relativity and quantum theory, and the need for a theory which
provides a unified explanation of the existence of the various particles
and forces – are problems which superstring theory claims to answer. In
reply, Smolin emphasises that until some experimental test of the theory
can be devised, superstring theory cannot participate in the debate
between experiment and theory which forms the basis of scientific
progress.
A third, related problem is the fact that the values of the
fundamental constants which make up the standard model of fundamental
particle physics seem arbitrary and unrelated.
Fourthly, Smolin points out the fundamental
scientific and philosophical problem in the foundation of quantum
mechanics: what is ‘reality’? The view most commonly taught today is the
so-called ‘Copenhagen interpretation’, named after the institute where
Niels Bohr developed it, in opposition to Einstein’s views. Essentially,
it states that, until some measurement is made on it, a particle in the
sense we mean it does not exist. Quite obviously, this picture of some
sort of ‘Matrix-style’ shadow universe has deep philosophical
consequences directly contradictory to materialist ideology. But there
is as yet no acceptable alternative – and most scientists adopt a
pragmatic approach: ‘Despite the fact that quantum mechanics seems to
have no basis in reality as we understand it, it works’.
Finally, Smolin discusses two purely experimental
problems deriving from astronomical measurements: ‘dark matter’ and
‘dark energy’. Observation of the rotation of galaxies shows that the
gravitational force holding them together is much larger than can be
accounted for by the mass of the objects we can see (stars, dust clouds,
etc). So it is necessary to assume that invisible dark matter makes up
perhaps 70% of our galaxy. No-one knows what this consists of.
Alternatively, our theory of gravitation needs modification. Second,
measurements of the speed of recession of galaxies have shown that our
universe is not merely expanding but that the expansion is accelerating
and that there must be some unknown dark energy driving this process.
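The dark matter inference rests on simple Newtonian reasoning (sketched here with standard formulas; the percentage above is the article’s, not derived below). For a star in a roughly circular orbit of radius r around a galaxy whose mass within that radius is M(r):

\[ \frac{v^2}{r} = \frac{G\,M(r)}{r^2} \quad\Rightarrow\quad v = \sqrt{\frac{G\,M(r)}{r}} \]

If nearly all the mass sat in the visible central regions, orbital speeds in the outskirts should fall off as \(1/\sqrt{r}\). Measured rotation curves instead stay roughly flat, which forces M(r) to keep growing with r – mass that does not shine.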
In Smolin’s view, these problems add up to a
‘crisis’ in physics comparable to that which faced classical physics at
the end of the 19th century – a crisis which may require a whole new
approach, possibly as different from present day quantum physics as
quantum physics was from classical physics. Superstring theory may well
form a part of such a new approach, but it may not.
Read together, the two books (which cover much of
the same ground) form a fascinating double act. Greene shows an
infectious enthusiasm and excitement with the forward progress of
fundamental theory. Smolin, on the other hand, notes that physics
progresses in fits and starts. A period of ‘calm’, when a theory is
generally accepted and when the theory enables major developments to be
made, is followed by a ‘revolutionary’ period, when a new, more
comprehensive theory replaces it. He believes that such a
revolution is overdue. Obviously, such a revolution cannot be ‘forced’
like rhubarb but Smolin is concerned that the social structure of
physics as organised today makes it less likely that such a theory will
emerge.
Physics in capitalist society
THE FINAL PART of Smolin’s book discusses what he
sees as the reason why the physical theory of the fundamental nature of
the universe has got stuck in a rut over the last quarter century. There
has been a huge concentration of work along the path of superstring
theory without reference to any possible experimental verification. To
Smolin this shows a failure to complete the revolution started at the
beginning of the 20th century or to engage seriously with his ‘five
great problems’. He recognises that physics research does not take place
in a vacuum but he tends to see the problem from the inside. He makes a clear
case that the way the ‘physics community’ is structured makes it almost
inevitable that talented researchers who want a permanent job and career
prospects will be forced to concentrate on ‘safe’ areas of research.
Funding will only flow to those areas which senior professors think are
most fruitful and which will enhance their prestige. Smolin’s arguments
are based on his experience in the USA, but could probably be
generalised. Smolin does not enlarge on the reasons for this and why, as
he acknowledges, the situation is getting worse. For this, physics must
be seen in the context of capitalist society as a whole.
The ‘physics community’ has always had its place and
role in capitalist society. It is no accident that the huge leap forward
in the funding of physics came on the back of World War II and the ‘cold
war’. To an extent, the funding of the LHC was a product of competition
between the ruling classes – and hence, the scientific establishments –
of the EU and USA for leadership in this particular branch of physics.
It is also true that the ‘scientific establishment’
has always tended to be resistant to new theories. Einstein himself was
refused a university post and developed his theories while working as a
clerk in the patent office. But the situation has got worse in the past
quarter century. University funding has become tied ever closer to the
needs of big business sponsors or, at one remove, to what government and
its advisors believe is ‘worth’ studying. Avenues of enquiry which do
not seem to offer a quick payoff or which are regarded as likely to lead
to a ‘dead end’ have been increasingly strangled. In 2003, the British
physicist Tony Leggett won a Nobel Prize for work carried out in the
1980s. He says that today he would not have the freedom to follow that
particular line of work.
A sclerotic system
IT IS THIS creeping sclerosis in the channels of
communication and research that concerns Smolin. But this sclerosis is
only symptomatic of an ideological attitude common to the whole ruling
class. The belief by string theorists that their theory provides all the
answers, and that any other line of enquiry is beneath contempt, mirrors
the claim of the pundit Francis Fukuyama that ‘history has ended’, and that
of the US neo-conservatives in their Project for the New American
Century, that US capitalism represents the final, highest and finest
state of society.
But as the 18th century US statesman John Adams put
it: "Facts are stubborn things; and whatever may be our wishes… they
cannot alter the state of facts and evidence". Just as the facts of the
worsening situation in Iraq and the Middle East has reduced the
confident perspectives of the neo-cons to a heap of smoking ash, so may
the predictions of superstring theory disappear into a cloud of
mathematical abstraction uncoupled from any relation to the real world.
Physics may be ripe for a revolution. It is no
accident that revolutionary periods in society are mirrored in science
and in the arts. The foundations of modern physical theory were laid in
the first decade of the 20th century, a period of political and social
upheaval. As we move into a new period of crisis for world capitalism,
the crisis in physics that Smolin identifies may well start to be
resolved. However that resolution takes place, it will change profoundly
our picture of the universe we live in.
At the same time, as the constraints of capitalism
are strangling the freedom of scientific research, the scientific method
itself is being contested, not just by well-funded establishments
dedicated to pushing anti-scientific beliefs such as ‘intelligent
design’, but by a general mistrust of ‘science’ seen as a
weapon of exploitation. Marxists have a duty to understand not just the
latest developments in modern physics – at least in a basic form – but
also the use of scientific method to produce such developments, and to
be able to pick out what is science and what is mumbo-jumbo. Greene and
Smolin approach from different angles our most basic picture of the
universe. Taken together they give an insight into what the ‘scientific
method’ really means.
The Fabric of the Cosmos, Brian Greene, Penguin Books, 2005, £9.99
The Trouble with Physics, Lee Smolin, Houghton Mifflin, 2006, £8.99