Unfortunately,
Internet Explorer 11 will no longer play the videos I have posted to
this page. As far as I can tell, they play as intended in other browsers.
However, if
you have Privacy Badger [PB] installed, they won't play in Google Chrome unless you
disable PB for this site.
[Having said that,
I have just discovered that they will play in IE11 if you have
upgraded to Windows 10! It looks like the problem is with Windows 7 and earlier
versions of that operating system.]
If you are using Internet Explorer 10 (or later), you might find some of the
links I have used won't work properly unless you switch to 'Compatibility View'
(in the Tools Menu); for IE11 select 'Compatibility View Settings' and add this
site (anti-dialectics.co.uk). Microsoft's browser,
Edge, automatically
renders these links compatible; Windows 10 also automatically makes IE11
compatible with this site.
However, if you are using Windows 10,
IE11 and Edge unfortunately appear to colour these links somewhat erratically.
They are meant to be mid-blue, but those two browsers render them intermittently light blue, yellow, purple and red!
Firefox and Chrome reproduce them correctly.
I am in the
process of reformatting this page, which might take several weeks to complete. I
am also trying to correct a few serious formatting errors; for example, a
passage I posted from The Guardian newspaper refuses to load properly in the
online version of this page!
This Appendix used to belong to
Essay Eleven
Part One; it has been moved here since that Essay was becoming
unwieldy.
In this section,
I will be posting examples (in
addition to those already listed in the main body of the above Essay) of
scientists who have radically changed their minds, or who have overthrown,
questioned or rejected established dogma -- or, indeed, who look as if they are about to do so.
As is the case with all my
work, nothing here should be read as an attack
either on HM -- a scientific theory I fully accept -- or, indeed, on
revolutionary socialism. I remain as committed to the self-emancipation of the
working class and the dictatorship of the proletariat as I was when I first
became a revolutionary thirty-five years ago.
The
difference between
DM and HM, as I see it, is explained
here.
Anyone using these links must remember that
they will be skipping past supporting argument and evidence set out in earlier
sections.
If your firewall/browser has a pop-up blocker, you will need to hold down the
"Ctrl" key while clicking, or these and the other links here won't work!
I have adjusted the
font size used at this site to ensure that even those with impaired
vision can read what I have to say. However, if the text is still either too
big or too small for you, please adjust your browser settings!
"Conditions in the uterus can give rise to life-long changes in genetic
material. People in their sixties who were conceived during the
Hunger Winter of 1944-45 in the Netherlands have been found to have a
different molecular setting for a gene which influences growth. Researchers from
the
LUMC are the first to
demonstrate this effect. They published their findings this week in
PNAS Online
Early Edition, together with colleagues from Columbia University.
"During
the Hunger Winter (the Dutch famine of 1944-1945) the west of the Netherlands
suffered from an extreme lack of food. It now appears that the limited food
intake of mothers who were pregnant during this period altered the genetic
material of embryos in the early stages of development. The effects of this
can still be observed some sixty years later. These alterations are not changes
in the genetic code, but a different setting for the code which indicates
whether a gene is on or off. This is known as
epigenetics. One of the main
processes in epigenetics is connecting the small molecule methyl to DNA.
"The
researchers compared the degree of methylation of a piece of DNA, the IGF2 gene,
of people who were conceived in the Hunger Winter with that of their brothers
and sisters. They chose this particular gene because it plays an important role
during gestation. People in their sixties who were conceived during the Hunger
Winter have less methyl groups on the IGF2 gene than their siblings. This did
not apply to children of the Hunger Winter who were in later stages of gestation
when the famine occurred. They did have a lower birth weight than their
siblings, but the IGF2 gene was not 'packaged' differently. This indicates that
epigenetic information is particularly vulnerable in the early stages of
pregnancy.
"'The next
question is whether the epigenetic change which has been identified is a "scar"
on the DNA because of lack of food, or a specific adaptation to the shortage of
food,' comments Prof Eline Slagboom. Researcher Dr Bas Heijmans: 'Epigenetics
could be a mechanism which allows an individual to adapt rapidly to changed
circumstances. Changes in the DNA sequence occur by chance and it takes
generations before a favourable mutation spreads throughout the population. By
then, a temporary famine is long past. It could be that the metabolism of
children of the Hunger Winter has been set at a more economical level, driven by
epigenetic changes.' This could explain why children of the Hunger Winter suffer
more frequently from obesity and cardio-vascular diseases. The research was
partly financed by the Netherlands Heart Foundation and the EU network
LifeSpan." [Leiden
University Report, October 2011. Quotation marks
altered to conform with the conventions adopted at this site.]
"Epigenetics:
Genome, Meet Your Environment
"As the evidence accumulates for epigenetics, researchers reacquire a taste
for
Lamarckism
"Toward
the end of World War II, a German-imposed food embargo in western Holland -- a
densely populated area already suffering from scarce food supplies, ruined
agricultural lands, and the onset of an unusually harsh winter -- led to the
death by starvation of some 30,000 people. Detailed birth records collected
during that so-called Dutch Hunger Winter have provided scientists with useful
data for analyzing the long-term health effects of prenatal exposure to famine.
Not only have researchers linked such exposure to a range of developmental and
adult disorders, including low birth weight, diabetes, obesity, coronary heart
disease, breast and other cancers, but at least one group has also associated
exposure with the birth of smaller-than-normal grandchildren. The finding is
remarkable because it suggests that a pregnant mother's diet can affect her
health in such a way that not only her children but her grandchildren (and
possibly great-grandchildren, etc.) inherit the same health problems.
"In
another study, unrelated to the Hunger Winter, researchers correlated
grandparents' prepubertal access to food with diabetes and heart disease. In
other words, you are what your grandmother ate. But, wait, wouldn't that
imply what every good biologist knows is practically scientific heresy: the
Lamarckian inheritance of acquired characteristics?
"If
agouti mice are any indication, the answer could be yes. The multicoloured
rodents make for a fascinating epigenetics story, which Randy Jirtle and Robert
Waterland of Duke University told last summer in a Molecular and Cellular Biology
paper; many of the scientists interviewed for this article still laud and
refer to that paper as one of the most exciting recent findings in the field.
The Duke researchers showed that diet can dramatically alter heritable
phenotypic change in agouti mice, not by changing DNA sequence but by changing
the DNA methylation pattern of the mouse genome. 'This is going to be just
massive,' Jirtle says, 'because this is where environment interfaces with
genomics.'
"Epigenetics
Explained
"This type
of inheritance, the transmission of non-DNA sequence information through either
meiosis or
mitosis, is
known as epigenetic inheritance. From the Greek prefix epi, which
means 'on' or 'over', epigenetic information modulates gene expression without
modifying actual
DNA
sequence.
DNA methylation patterns are the longest-studied and best-understood
epigenetic markers, although
methyl,
acetyl,
phosphoryl, and other modifications of
histones, the
protein spools around which DNA winds, are another important source of
epigenetic regulation. The latter presumably influence gene expression by
changing
chromatin structure, making it either easier or more difficult for genes to
be activated.
"Because a
genome can pick up or shed a methyl group much more readily than it can change
its DNA sequence, Jirtle says epigenetic inheritance provides a 'rapid mechanism
by which [an organism] can respond to the environment without having to change
its hardware.' Epigenetic patterns are so sensitive to environmental change
that, in the case of the agouti mice, they can dramatically and heritably alter
a phenotype in a single generation. If you liken the genome to the hardware of a
computer, Jirtle explains, then 'epigenetics is the software. It's the grey
area. It's just so darn beautiful if you think about it.'
"The
environmental
lability
of epigenetic inheritance may not necessarily bring to mind Lamarckian images of
giraffes stretching their necks to reach the treetops (and then giving birth to
progeny with similarly stretched necks), but it does give researchers reason
to reconsider long-refuted notions about the inheritance of acquired
characteristics. Eighteenth-century French naturalist
Jean Baptiste de Lamarck proposed that environmental cues could cause
phenotypic changes transmittable to offspring. 'He had a basically good idea but
a bad example,' says Rolf Ohlsson, Uppsala University, Sweden.
"Although
the field of epigenetics as it is known today (that is, the study of heritable
changes in gene expression and regulation that have little to do with DNA
sequence) has been around for only 20 years or so, the term epigenetics has been
in use since at least the early 1940s. Developmental biologist
Conrad Waddington used it back then to refer to the study of processes by
which
genotypes
give rise to
phenotypes (in contrast to genetics, the study of genotypes). Some reports
indicate that the term is even older than Waddington, dating back to the late
1800s. Either way, early use of the term was in reference to developmental
phenomena.
"In 2001,
Joshua Lederberg proposed the use of more semantically, or historically, correct
language. But it appears that today's use of the term is here to stay, at least
for now, as are its derivatives: epiallele (genes with different degrees of
methylation), epigenome (the genome-wide pattern of methyl and other epigenetic
markers), epigenetic therapy (drugs that target epigenetic markers), and even
epigender (the sexual identity of a genome based on its imprinting pattern).
"Terminology aside, biologists have long entertained the notion that certain
types of cellular information can be transmitted from one generation to the
next, even as DNA sequences stay the same. Bruce Stillman, director of Cold
Spring Harbor Laboratory (CSHL), NY, traces much of today's research in
epigenetics back to Barbara McClintock's discovery of
transposons
in maize. Methyl-rich transposable elements, which constitute over 35% of the
human genome, are considered a classical model for epigenetic inheritance.
Indeed, the epigenetic lability of Jirtle's agouti mice is due to the presence
of a transposon at the 5' end of the agouti gene. But only over the past two
decades has the evidence become strong enough to convince and attract large
numbers of epigenetics researchers. '[Epigenetics] has very deep roots in
biology,' says Stillman, 'but the last few years have been just an explosion in
understanding.'
"Methylation
And More
"One of
the prominent features of DNA methylation is the faithful propagation of its
genomic pattern from one cellular or organismal generation to the next. When a
methylated DNA sequence replicates, only one strand of the next-generation
double helix has all its methyl markers intact; the other strand needs to be
remethylated. According to Massachusetts Institute of Technology biologist Rudy
Jaenisch, the field of epigenetics took its first major step forward more than
two decades ago when, upon discovering DNA
methyltransferases (DMTs, the enzymes that bind methyl groups to
cytosine nucleotides), researchers finally had a genetic handle on how
epigenetic information was passed along. Now, it is generally believed that DMTs
bind methyl groups to the naked cytosines based on the methylation template
provided by the other strand. This is known as the maintenance methylase theory.
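[To illustrate the 'maintenance' mechanism just described, here is a minimal
sketch of my own -- it is not part of the quoted article, and the toy encoding
(1 for a methylated CpG site, 0 for an unmethylated one) is simply an
illustrative assumption:

def replicate(duplex_pattern):
    # Semi-conservative replication: the daughter duplex pairs the old,
    # methylated parental strand with a newly made, unmethylated strand.
    return list(duplex_pattern), [0] * len(duplex_pattern)

def maintain(template_strand):
    # Maintenance methyltransferase: methylate the new strand at every site
    # where the parental (template) strand already carries a methyl mark.
    return [1 if mark else 0 for mark in template_strand]

pattern = [1, 0, 1, 1, 0, 0, 1]        # 1 = methylated site, 0 = unmethylated

copy = pattern
for generation in range(5):            # with maintenance the pattern persists
    template, fresh = replicate(copy)
    copy = maintain(template)
print("after five divisions with maintenance:", copy)

template, fresh = replicate(pattern)   # without it, the new strand stays blank
print("new strand without maintenance:       ", fresh)

The point of the sketch is only that copying marks from the template strand is
what keeps the genomic pattern stable from one cell generation to the next.]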
"But even
a decade ago, says Wolf Reik of the Babraham Institute, Cambridge, UK, 'a lot of
epigenetics was phenomenology, and so people looked at it and said, well, this
is all very interesting, but what's the molecular mechanism?' Reik points to
recent evidence suggesting a critical link between the two main types of
epigenetic regulation, DNA methylation and histone modification, as one of the
most interesting recent developments in the field. Because of that link,
researcher Eric Selker and colleagues at the University of Oregon, Portland,
have proposed that there may be more to methylation propagation than
maintenance, despite 25 years of evidence. In 2001, Selker and coauthor Hisashi
Tamaru showed that dim-5, a gene that encodes a
histone H3 Lys-9 methyltransferase, is required for DNA methylation in the
filamentous fungus,
Neurospora crassa. The histone enzyme is, in turn, influenced
by modifications of
histone H3.
So even though DNA methylation is guided by a DNA methyltransferase encoded by
dim-2, it still takes orders from the chromatin.
"In a
study by CSHL researchers Robert Martienssen, Shiv Grewal, and colleagues,
evidence suggests that histone modifications are, in turn, guided by
RNA
interference (RNAi). Using the fission yeast
Schizosaccharomyces pombe, the researchers deleted genes that encode
RNAi molecular machinery and observed a loss of histone H3 lys-9 methylation and
impaired
centromere function. 'This new understanding has created a lot of
excitement,' says Stillman....
"Lamarckism Revisited
"Normally,
the fur of agouti mice is yellow, brown, or a calico-like mixture of the two,
depending on the number of attached methyl groups. But when Duke University
researchers Jirtle and Waterland fed
folic acid
and other methyl-rich supplements to pregnant mothers, despite the fact that all
offspring inherited exactly the same agouti gene (i.e., with no
nucleotide
differences), mice who received supplements had offspring with mostly brown fur,
whereas mice without supplements gave birth to mostly yellow pups with a higher
susceptibility to obesity, diabetes, and cancer. The methyl groups bound to a
transposon at the 5' end of the agouti
locus,
thereby shutting off expression of the agouti gene, not just in the
murine
recipient but in its offspring as well.
"Although
the study demonstrates that, at least in mice, folic acid supplementation in
pregnant mothers reduces the risk of their babies having certain health
problems, Jirtle warns that the results cannot be extrapolated to humans. 'Mice
are not men,' he emphasizes. But he doesn't downplay the proof of principle. The
take-home message is not that folic acid supplements are a good thing. Rather,
environmental factors such as nutritional supplementation can have a dramatic
impact on inheritance, not by changing the DNA sequence of a gene or via
single-nucleotide polymorphism, but by changing the methylation pattern of
that gene. 'It's a proof of concept,' says Donata Vercelli, University of
Arizona, Tucson. 'That's why it's so important.'
"According
to Vercelli, the environmental susceptibility of epigenetics probably
explains why genetically identical organisms such as twins can have dramatically
different phenotypes in different environments. She points to the agouti
mice, as well as another recent cluster of studies on a heat shock protein,
Hsp90, in
Drosophila melanogaster, as 'model systems that have very eloquently
demonstrated' the critically important role that epigenetic inheritance plays in
this kind of gene-by-environment interaction.
"Hsp90
regulates developmental genes during times of stress by releasing previously
hidden or buffered phenotypic variation. Douglas Ruden of the University of
Alabama, Tuscaloosa, says he noticed some weird fruit fly phenotypes -- things
like appendage-like organs sticking out of their eyes -- at about the same time
that a paper appeared in Nature connecting Hsp90 activity in
Drosophila to genetic variation. In that paper, Suzanne Rutherford and Susan
Lindquist, then at the University of Chicago, presented compelling evidence that
Hsp90 serves as an 'evolutionary capacitor,' a genetic factor that regulates
phenotypic expression by unleashing 'hidden' variation in stressful conditions.
Even after restoring normal Hsp90 activity, the new phenotypes responded to ten
or more generations of selection. The scientists concluded that, once released,
even after normal Hsp90 activity was restored, the previously buffered variation
persisted in a heritable manner, generation after generation.
"When the
Lindquist paper came out, Ruden says he thought, 'Ah, I'm probably seeing the
same thing.' He was doing some crosses, 'and I started to see this weird
phenotype.' But Ruden and collaborators concluded that their strange eye
phenotype was due to something other than, or in addition to, the sudden
unleashing of hidden genetic variation. Indeed, the researchers used a strain of
flies that had little genetic variation, and yet was still capable of responding
to 13 generations of selection even after normal Hsp90 activity was restored.
Because of the genomic homogeneity of their flies, combined with observations
that mutations encoding chromatin-remodeling proteins induced the same abnormal
eye phenotype, the investigators concluded that reduced levels of Hsp90 affected
the phenotype by epigenetically altering the chromatin.
"Although
it is hard to imagine that an appendage-like structure sticking out of the eye
would be adaptive in times of stress, Vercelli says that epigenetic change
clearly can be environmentally induced in a heritable manner, in this case
by alterations to Hsp90. The morphological variations in the eye were probably
only the most obvious of many phenotypic differences caused by the chromatin
changes.
"In a
written commentary, evolutionary biologist Massimo Pigliucci said that Ruden's
experiment was 'one of the most convincing pieces of evidence that epigenetic
variation is far from being a curious nuisance to evolutionary biologists.'
Pigliucci doesn't go so far as to say that the heritable changes caused by
Hsp90 alterations are Lamarckian, but Ruden does. 'Epigenetics has always
been Lamarckian. I really don't think there's any controversy,' he says.
"Not that
Mendelian genetics is wrong; far from it. The increased understanding of
epigenetic change and the recent evidence indicating its role in inheritance and
development doesn't give epigenetics greater importance than DNA. Genetics and
epigenetics go 'hand in hand,' says Ohlsson. In the case of disease, says Reik,
'there are clearly genetic factors involved, but there are also other factors
involved. My suspicion is that it will be a combination of genetic and
epigenetic factors, as well as environmental factors, that determine all these
diseases.'" [Pray
(2004). Quotation marks
altered to conform with the conventions adopted at
this site. Spelling changed to UK English. Bold emphases and some links added. References
included in
the original article have been omitted.]
On this, see Carey (2011) and Francis (2012).
[Details concerning these two books can be accessed
here.]
"We used to think
evolution had to start with random mutations -- now walking fish and bipedal
rats are turning our ideas on their head
"'To be honest, I was
intrigued to see if they'd even survive on land,' says Emily Standen. Her plan
was to drain an aquarium of nearly all the water and see how the fish coped. The
fish in question were
bichir fish
that can breathe air and haul themselves over land when they have to, so it's
not as far-fetched as it sounds.
"What was perhaps more
questionable was Standen's rationale. Two years earlier, in 2006,
Tiktaalik had become a global sensation. This
360-million-year-old fossil provides a snapshot of the moment our fishy
ancestors hauled themselves out of the water and began trading fins for limbs.
Standen thought forcing bichir fish to live almost entirely on land could reveal
more about this crucial step in our evolution. Even if you were being kind, you
might have described this notion as a little bit fanciful.
"Today, it seems
positively inspired. The bichirs did far more than just survive. They
became better at 'walking'. They planted their fins
closer to their bodies, lifted their heads higher off the ground and slipped
less than fish raised in water. Even more remarkably, their skeletons changed
too. Their 'shoulder' bones lengthened and developed stronger contacts with the
fin bones, making the fish better at press-ups. The bone attachments to the
skull also weakened, allowing the head to move more. These features are
uncannily reminiscent of those that occurred as our four-legged ancestors
evolved from Tiktaalik-like forebears.
"What is really amazing about
this experiment is that these changes did not come about after raising
generations of fish on land and allowing only the best walkers to breed.
Instead, it happened within the lifetime of individual fish. Simply forcing
young fish to live on land for eight months was all it took to produce these
quite dramatic changes. We have long known that our muscles, sinews and bones
adapt to cope with whatever we make them do. A growing number of biologists
think this kind of plasticity may also play a key role in evolution. Instead of
mutating first and adapting later, they argue, animals often adapt first and
mutate later. Experiments like Standen's suggest this process could even play a
role in major evolutionary transitions such as fish taking to land and apes
starting to walk upright.
"The idea that
plasticity plays a role in evolution goes back more than a century. Some early
biologists thought that
characteristics acquired during an animal's lifetime
could be inherited by their offspring: giraffes got their long necks by
stretching to eat leaves, and so on. The French naturalist
Jean-Baptiste Lamarck
is the best-known advocate of this idea, but Darwin
believed something similar. He
even proposed an elaborate mechanism to explain how
information about changes in the body could reach eggs and sperm, and therefore
be passed on to offspring. In this way, Darwin suggested, plasticity produces
the heritable variations on which natural selection can work its magic.
"With the rise of
modern genetics, such notions were dismissed. It became clear that there is no
way for information about what animals do during their lifetime to be passed on
to their offspring (although
a few exceptions have
emerged since). And it was thought this meant
plasticity has no role in evolution. Instead, the focus shifted to mutations. By
the 1940s, the standard thinking was that animals mutate first and adapt later.
A mutation in a sperm cell, say, might produce a physical change in the bodies
of some offspring. If the change is beneficial, the mutation will spread through
the population. In other words, random genetic mutations generate the variation
on which natural selection acts. This remains the dominant view of evolution
today.
"The dramatic effects of
plasticity were not entirely ignored. In the 1940s, for instance, the Dutch
zoologist Everhard Johannes Slijper studied a goat that had been born without
forelegs and learned to hop around, kangaroo-like, on its rear legs. [It seems
that
a
dog in the UK can do likewise -- RL.] When Slijper examined the goat after
its death, he discovered that the shape of its muscles and skeleton looked more
like those of a biped than a quadruped. Few biologists considered such findings
relevant to the evolutionary process. The fact that changes acquired during an
animal's lifetime are transient seemed to rule out that possibility. If
Standen's better-at-walking fish were bred and the offspring raised in a normal
aquarium, for instance, they should look and behave like perfectly ordinary
bichirs.
"Transient response
"But what if the
environmental conditions that induce the plastic response are themselves
permanent? In the wild, this could happen as a result of alterations in prey
animals, or in the climate, for instance. Then all the members of a population
would develop in the same, consistent way down the generations. It would look as
if the population had evolved in response to an altered environment, but
technically it's not evolution because there is no heritable change. The thing
is, the only way to tell would be to 'test' individuals by raising them in
different circumstances.
"In this way at least,
plasticity can allow animals to 'evolve' without evolving. The crucial question,
of course, is whether it can lead to actual evolution, in the sense of heritable
changes. 'You can plastically induce generation after generation,' says Standen,
who is now at the University of Ottawa in Ontario, Canada. 'At some point, can
you remove the environmental conditions that induced the change and have the
organisms remain changed?' The answer, surprisingly, seems to be yes. In the
1950s, British biologist
Conrad
Hal Waddington
showed that it is feasible in an experiment involving fruit
flies. Waddington found that when pupae are
briefly heated, some offspring develop without
crossveins in their wings. He then selected and bred those flies. By the 14th
generation, some lacked crossveins even when their pupae were not heated. A
physical feature that began as a plastic response to an environmental trigger
had become a hereditary feature.
"How is this possible?
Plastic changes occur because an environmental trigger affects a developmental
pathway in some way. More of a certain hormone may be produced, or produced at a
different time, or genes are switched on that normally remain inactive, and so
on. The thing is, random mutations can also have similar effects. So in an
environment in which a particular plastic response is crucial for survival, only
mutations that reinforce this response, or at least do not impede it, can spread
through a population. Eventually, the altered developmental pathway will become
so firmly stabilised by a genetic scaffolding that it will occur even without
the environmental trigger, making it a permanent hereditary feature.
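[The process described in that paragraph can be turned into a toy simulation --
a sketch of my own, not drawn from the quoted article, with every number below
chosen arbitrarily. Each individual inherits a 'threshold'; the trait develops
whenever the environmental stimulus plus random developmental noise exceeds
that threshold; and only expressers breed. Selection in the triggering
environment then favours mutations that lower the threshold, until some
individuals develop the trait with no trigger at all:

import random

POPULATION, GENERATIONS = 500, 100
TRIGGER, NO_TRIGGER = 1.0, 0.0       # strength of the environmental stimulus
NOISE_SD, MUTATION_SD = 0.3, 0.08    # developmental noise; per-generation mutation

def expresses(threshold, stimulus):
    # Plastic development: the trait appears whenever the stimulus plus
    # developmental noise exceeds the individual's inherited threshold.
    return stimulus + random.gauss(0.0, NOISE_SD) > threshold

# Start with thresholds so high that the trait appears only under the trigger.
thresholds = [random.gauss(1.2, 0.1) for _ in range(POPULATION)]

for generation in range(GENERATIONS):
    # Selection in the triggering environment: only expressers reproduce.
    parents = [t for t in thresholds if expresses(t, TRIGGER)] or thresholds
    # Offspring inherit a parent's threshold plus a small random mutation.
    thresholds = [random.choice(parents) + random.gauss(0.0, MUTATION_SD)
                  for _ in range(POPULATION)]

mean_threshold = sum(thresholds) / POPULATION
assimilated = sum(expresses(t, NO_TRIGGER) for t in thresholds) / POPULATION
print(f"mean threshold after selection: {mean_threshold:.2f}")
print(f"trait expressed without the trigger in {assimilated:.0%} of the population")

On a typical run the mean threshold falls well below its starting value of
roughly 1.2, and a small but non-zero fraction of individuals now develop the
trait even when the trigger is absent -- the analogue of Waddington's flies
that lacked crossveins without any heat shock.]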
"Waddington called this
process genetic assimilation. It may sound like Lamarckism, but it is not. The
acquired characteristics don't shape the genetic changes directly as Darwin
proposed, they merely allow animals to thrive in environments that favour
certain mutations when they occur by chance
(see diagram).
Waddington's findings have been regarded as a curiosity rather than a crucial
insight. But in the past decade or two,
attitudes have begun to change. One reason for this is
a growing appreciation of the flexibility of genes. Rather than being rigidly
pre-programmed, we now know that the environment influences many aspects of
animals' bodies and behaviour.
"Such discoveries have
led some biologists to claim that developmental plasticity plays a major role in
evolution. A few, such as Kevin Laland at the University of St Andrews, UK, even
argue that the conventional 'mutate first, adapt later' picture of evolution
needs a rethink (Nature,
vol 514, p.161). Most biologists have yet to be
convinced. The sceptics point out that genetic assimilation does not overturn
any fundamental principles of evolution -- in the long run, evolution is all
about
the spread of mutations, whether or not plasticity is
involved. Yes, say the proponents of plasticity, but the key point is that
plasticity can determine which mutations spread (New
Scientist, 12 October 2013, p 33), so its role
should be given the prominence it deserves. 'Several major recent evolutionary
textbooks do not even mention plasticity,' says Laland.
"It may play a role
occasionally, respond the sceptics, but it's a minor one at best. 'There is
little debate that genetic assimilation can happen,' says Gregory Wray of Duke
University in Durham, North Carolina. 'But there is unfortunately very little
support for its role in nature.' This is what makes Standen's work on the bichir
so significant. It implicates plasticity in a major evolutionary transition:
fish turning into four-legged land animals (Nature,
vol 513, p.54). Plasticity will soon be implicated in
another major transition too -- the one our ancestors made from four legs to two
about 7 million years ago. As part of his PhD dissertation, Adam Foster, now at
the Northeast Ohio Medical University in Rootstown, has been making rats walk on
a treadmill. 'I had a custom harness system built so I could modify the load
experienced by the hind limbs,' he says. Some rats had to walk on their hind
limbs, while others walked on all fours. Each rat exercised on the treadmill for
an hour a day for three months, and then Foster examined their skeletons.
"He found that the 'bipedal'
rats had developed longer legs than standard quadrupedal rats, and that their
thigh bones had larger femoral heads -- the ball in the hip joint. Both features
are associated with the transition to bipedalism in our hominin ancestors.
Foster hopes to publish the results later this year. 'I think Adam's research is
really compelling,' says Jesse Young, an anatomist at Northeast Ohio Medical
University. 'As he was getting it going, I was a bit sceptical. You couldn't
predict it would reveal anything useful.' While the work of Standen and Foster
suggests that developmental plasticity could play a role in major evolutionary
transitions, it is only suggestive. Indeed, these studies do not even show that
the plastic changes seen in the bichir fish and rats can be fixed by mutations.
Demonstrating this kind of genetic assimilation would certainly be tricky, says
Standen. It would not be practical with the bichir fish she studied. 'As
wonderful as they are, they're frustrating fish,' says Standen. 'They take the
better part of a decade to mature, and even then they're really difficult to
breed in captivity.'
"The fossil record is usually
no help either. It is possible that some of the changes seen as fish colonised
the land were a result of plasticity rather than genetics, says Per Ahlberg of
the University of Uppsala in Sweden who studies the transition to land. For
Ahlberg, the trouble is that there is no way to prove it. 'There's no evidence
that will allow us to choose between the two,' he says.
"More evolvable
"Other biologists are more
enthusiastic. It has long been suggested that different parts of the skeleton
are more plastic and 'evolvable' than others, says William Harcourt-Smith of the
American Museum of Natural History. 'So a foot bone or a hand bone might give
you more useful info than a hip bone, for instance.' Work like Foster's could
reveal if this is indeed the case and help us interpret the fossil record of
human evolution. 'These experiments do have validity,' Harcourt-Smith says.
'They can help us understand whether traits are plastic or not.'
"Take the honeycomb
structure in the heads of our long bones. It is lighter and weaker than it was
in our extinct cousins such as the Neanderthals. A study out last month compared
the bones of hunter-gatherers and early farmers in North America. It concluded
that our bones became weak only when our ancestors' lifestyles changed (PNAS,
doi.org/xwq). 'We could have a skeleton as strong as
our prehistoric ancestors,' says team member Colin Shaw of the University of
Cambridge, UK. 'We just don't because we're not as active.' It's possible that
similar kinds of skeletal structural change seen in prehistory have been
misinterpreted as signs of speciation when they really just reflect
developmental plasticity, says Shaw -- perhaps especially so in hominin
evolution. Humans are unique, he points out. 'Our first line of defence against
environmental insult is culture. When that's not adequate -- for instance if the
clothing you can make is not good enough to keep you warm -- then arguably the
second line of defence is plasticity. Only after that fails might you actually
get genetic selection.'
"All this still leaves
open the question of whether genetic assimilation can 'fix' traits that first
appear as a result of plasticity. A decade ago, Richard Palmer at the University
of Alberta in Edmonton, Canada, found a way to search for evidence in the fossil
record. Most animals have some asymmetric traits. In our case,
it's the position of the heart and other organs, which
is encoded in our genes. But in other species, asymmetries are plastic. For
instance, the enlarged claw of male fiddler crabs...is as likely to be on the
left as on the right. What Palmer showed by examining the fossil record of
asymmetry in 68 plant and animal species is that on 28 occasions, asymmetries
that are now hereditary and appear only on one side started out as
non-hereditary asymmetries that appeared on either side (Science,
vol 306, p.828). 'I think it's one of the clearest
demonstrations that genetic assimilation has happened and that it is more common
than expected,' says Palmer.
"There is a caveat here,
though. The ancestral non-hereditary asymmetries may have been a result of
random genetic noise, says Palmer. So while his work does show genetic
assimilation in action, it was not necessarily fixing traits due to
developmental plasticity. There is no simple way to prove the evolutionary
importance of developmental plasticity, says Mary Jane West-Eberhard of the
Smithsonian Tropical Research Institute in Costa Rica, whose work has been
particularly influential. 'Evolutionary biology that is concerned with evolution
and speciation in nature necessarily depends on indirect proof -- an
accumulation of facts that support or deny a hypothesis,' she says. At the
moment, the facts that are accumulating seem to support the hypothesis. Expect
lots more results soon: Standen's success is inspiring others. 'I've already had
people ask me what other critters we could try this on,' says Standen.
'Everybody is friendly and excited and interested. It's fun -- it's the way
science should be.'" [Barras
(2015), pp.26-30. (This links to a PDF.) Quotation marks altered
to conform with the conventions adopted at this site. Some links added, and
several paragraphs merged to save
space.]
August 2015: We now read this
from The Guardian:
"Study of Holocaust survivors finds trauma passed on to
children's genes
"Helen Thomson, The Guardian, 21/08/2015
"New finding is first example in humans of the theory of epigenetic inheritance:
the idea that environmental factors can affect the genes of your children
"Genetic changes stemming from the trauma suffered by Holocaust survivors are
capable of being passed on to their children, the clearest sign yet that one
person’s life experience can affect subsequent generations. The conclusion from
a
research team at New York's Mount Sinai hospital
led by Rachel Yehuda stems from the genetic study of 32 Jewish men and women who
had either been interned in a Nazi concentration camp, witnessed or experienced
torture or who had had to hide during the second world war. They also analysed
the genes of their children, who are known to have increased likelihood of
stress disorders, and compared the results with Jewish families who were living
outside of Europe during the war. 'The gene changes in the children could only
be attributed to Holocaust exposure in the parents,' said Yehuda.
"Her team's work is the clearest example in humans of the transmission of trauma
to a child via what is called 'epigenetic inheritance' -- the idea that
environmental influences such as smoking, diet and stress can affect the genes
of your children and possibly even grandchildren. The idea is controversial, as
scientific convention states that genes contained in DNA are the only way to
transmit biological information between generations. However, our genes are
modified by the environment all the time, through chemical tags that attach
themselves to our DNA, switching genes on and off. Recent studies suggest that
some of these tags might somehow be passed through generations, meaning our
environment could have an impact on our children's health.
"Other studies have proposed a more tentative connection between one
generation's experience and the next. For example,
girls born to Dutch women who were
pregnant during a severe famine at the end of the second world war had an
above-average risk of developing schizophrenia. Likewise,
another study has shown that men who
smoked before puberty fathered heavier sons than those who smoked after. The
team were specifically interested in one region of a gene associated with the
regulation of stress hormones, which is known to be affected by trauma. 'It
makes sense to look at this gene,' said Yehuda. 'If there's a transmitted effect
of trauma, it would be in a stress-related gene that shapes the way we cope with
our environment.'
"They found epigenetic tags on the very same part of this gene in both the
Holocaust survivors and their offspring; the same correlation was not found in
any of the control group and their children. Through further genetic analysis,
the team ruled out the possibility that the epigenetic changes were a result of
trauma that the children had experienced themselves.
"'To our knowledge, this provides the first demonstration of transmission of
pre-conception stress effects resulting in epigenetic changes in both the
exposed parents and their offspring in humans,' said Yehuda, whose work was
published in Biological Psychiatry.
"It's still not clear how these tags might be passed from parent to child.
Genetic information in sperm and eggs is not supposed to be affected by the
environment -- any epigenetic tags on DNA had been thought to be wiped clean soon
after fertilisation occurs. However, research by
Azim Surani at Cambridge University and
colleagues has recently shown that some epigenetic tags escape the cleaning
process at fertilisation, slipping through the net. It's not clear whether the
gene changes found in the study would permanently affect the children's health,
nor do the results upend any of our theories of evolution.
"Whether the gene in question is switched on or off could have a tremendous
impact on how much stress hormone is made and how we cope with stress, said
Yehuda. 'It's a lot to wrap our heads around. It's certainly an opportunity to
learn a lot of important things about how we adapt to our environment and how we
might pass on environmental resilience.' The impact of Holocaust survival on the
next generation has been investigated for years -- the challenge has been to
show intergenerational effects are not just transmitted by social influences
from the parents or regular genetic inheritance, said Marcus Pembrey, emeritus
professor of paediatric genetics at University College London.
"'Yehuda's paper makes some useful progress. What we're getting here is the very
beginnings of an understanding of how one generation responds to the experiences
of the previous generation. It's fine-tuning the way your genes respond to the
world.'
"Can you inherit a memory of trauma?
"Researchers have already shown that certain fears might be inherited through
generations, at least in animals. Scientists at Emory University in Atlanta
trained male mice
to fear the smell of cherry blossom by pairing the smell with a small electric
shock. Eventually the mice shuddered at the smell even when it was delivered on
its own. Despite never having encountered the smell of cherry blossom, the
offspring of these mice had the same fearful response to the smell -- shuddering
when they came in contact with it. So too did some of their own offspring.
"On the other hand, offspring of mice that had been conditioned to fear another
smell, or mice who'd had no such conditioning, had no fear of cherry blossom. The
fearful mice produced sperm which had fewer epigenetic tags on the gene
responsible for producing receptors that sense cherry blossom. The pups
themselves had an increased number of cherry blossom smell receptors in their
brain, although how this led to them associating the smell with fear is still a
mystery." [Quoted from
here; accessed 22/08/2015. Several
paragraphs merged to save space. Quotation marks altered to conform with the
conventions adopted at this site. Links in the original. Bold added.]
However, it should be noted that there are serious problems with the above
research; on that, see
here.
"There's
something very odd going on in space -- something that shouldn't be possible. It
is as though vast swathes of the universe are being hoovered up by a vast and
unseen celestial vacuum cleaner.
"Sasha
Kashlinsky, the scientist who discovered the phenomenon, is understandably
nervous: 'It left us quite unsettled and jittery,' he says, 'because this is not
something we planned to find'. The accidental discovery of what is ominously
being called 'dark flow' not only has implications for the destinies of large
numbers of galaxies -- it also means that large numbers of scientists might
have to find a new way of understanding the universe.
"Dark flow is
the latest in a long line of phenomena that have threatened to rewrite the
textbooks. Does it herald a new era of understanding, or does it simply
mean that everything we know about the universe is wrong?" [Quoted from
here. Bold emphases
added.]
"14 billion years ago there
was nothing; then everything exploded into existence
and the universe was born, but a new generation
of cosmologists are questioning this theory.
Cosmologists have created a replica of the universe
by using equations; it's called the standard model
of cosmology and it's the reason behind the Big Bang
theory; however, this model is now doubted.
Professor
Alan Guth's theory challenges the Big Bang
by stating that the universe started out small,
allowing the temperature to even out everywhere,
before expanding on a massive scale.
"Stars nearer the edge of a
galaxy move just as fast as those in the centre.
This made cosmologists think that galaxies needed
more gravity, but the only way to get more gravity
was to create it. Astrophysicist Dan Bauer is
hunting for dark matter half a mile under the dark
plains of Minnesota in order to trace and record it
more effectively. The discovery that the universe is
speeding up suggests that a new force is powering
the universe. This force is known as dark energy,
and cosmologists have no idea what it is.
"The combination of the
standard model, inflation and dark matter has given
way to a new theory called dark flow. The nature of
this theory could show that our universe isn't the
only one. The standard model of cosmology has
withstood much criticism, therefore making the
theory stronger; however it could still be totally
wrong." [Quoted from
here, where the BBC programme
in question can be accessed.
Bold emphases and links added.]
August 2012: The New Scientist
has yet more disconcerting news to offer its readers:
"Plate
tectonics can't explain all the earthquakes, volcanoes and landscapes of
Earth, so what else is shaping its surface?
"'A lot of people (sic) thinks that the devil has come here.
Some thinks (sic) that this is the beginning of the world coming to a end.'
"To George Heinrich Crist,
who
wrote this on 23 January 1812, the series of
earthquakes that had just ripped through the Mississippi river valley were as
inexplicable as they were deadly. Two centuries on and we are no closer to an
understanding. According to our established theory of Earth's tectonic activity,
the US Midwest is just not the sort of place such tremors should occur.
"That's not the only thing we are struggling to explain.
Submerged fossil landscapes off the west coast of Scotland, undersea volcanoes
in the south Pacific, the bulging dome of land that is the interior of southern
Africa: all over the world we see features that plate tectonics alone is hard
pressed to describe.
"So what can? If a new body of research is to be believed,
the full answer lies far deeper in our planet. If so, it could shake up
geology as fundamentally as the acceptance of plate tectonics did half a century
ago.
"The central idea of plate tectonics is that Earth's
uppermost layers -- a band of rock between 60 and 250 kilometres thick known as
the
lithosphere -- is divided into a mosaic of rigid pieces that float and move
atop the viscous
mantle
immediately beneath. The theory surfaced in 1912, when German geophysicist
Alfred
Wegener argued on the basis of fossil distributions that today's continents
formed from a single supercontinent, which came to be called
Pangaea, that
broke up and began drifting apart 200 million years ago.
"Wegener lacked a mechanism to make his plates move, and
the idea was at first ridiculed. But evidence slowly mounted that Earth's
surface was indeed in flux. In the 1960s people finally came to accept that
plate tectonics could not only explain many features of Earth's topography, but
also why most of the planet's seismic and volcanic activity is concentrated
along particular strips of its surface: the boundaries between plates. At some
of these margins plates move apart, creating rift valleys on land or ridges on
ocean floors where hotter material wells up from the mantle, cools and forms new
crust. Elsewhere, they press up against each other, forcing up mountain chains
such as the Himalayas, or dive down beneath each other at seismically vicious
subduction
zones such as the
Sunda trench,
the site of the
Sumatra-Andaman earthquake in December 2004.
"And so plate tectonics became the new orthodoxy. But
is it the whole truth? 'Because it was so hugely successful as a theory,
everybody became a bit obsessed with horizontal motions and took their eye off
an interesting ball,' says geologist
Nicky White at the University of Cambridge.
"That ball is what is happening deep within Earth, in
regions far beyond the reach of standard plate-tectonic theory. The US
geophysicist
Jason
Morgan was a pioneer of plate tectonics, but in the
1970s he was also one of the first to find fault with the theory's explanation
for one particular surface feature, the volcanism of the Hawaiian islands. These
islands lie thousands of kilometres away from the boundaries of the Pacific
plate on which they sit. The plate-tectonic line is that their volcanism is
caused by a weakness in the plate that allows hotter material to well up
passively from the mantle. Reviving an earlier idea of the Canadian geophysicist
John
Tuzo Wilson, Morgan suggested instead that a plume of hot mantle material is
actively pushing its way up from many thousands of kilometres below and breaking
through to the surface.
"That went against the flow, and it wasn't until the
mid-1980s that others began to think Morgan might have a point. The turnaround
came when
seismic waves unleashed by earthquakes began to reveal some of our
underworld's structure as they travelled through Earth's interior. Seismic waves
travel at different velocities through materials of different densities and
temperatures. By timing their arrival at sensors positioned on the surface we
could begin to construct a 3D view of what sort of material is where....
"If we can do that, will history repeat itself, the
doubters be won over, and another hotly disputed model become the new orthodoxy?
[Professor Dietmar]
Müller certainly thinks so: 'Geology is on the cusp of another revolution
like plate tectonics.'" [Ananthaswamy
(2012),
pp.38-41. Italic emphasis in the original; bold emphases added. Quotation
marks altered to conform with the conventions adopted at this site. Several links
also added.]
"Pioneering
experiments have cast doubt on a founding
idea of the branch of physics called quantum
mechanics. The
Heisenberg uncertainty principle is in
part an embodiment of the idea that in the
quantum world, the mere act of observing an
event changes it. But the idea had never
been put to the test, and a team
writing in Physical
Review Letters
says 'weak measurements' prove the rule
was never quite right. That could play
havoc with 'uncrackable codes' of
quantum cryptography.
"Quantum mechanics has
since its very inception raised a great many
philosophical and metaphysical debates about
the nature of nature itself. Heisenberg's
uncertainty principle, as it came to be
known later, started as an assertion that
when trying to measure one aspect of a
particle precisely, say its position,
experimenters would necessarily 'blur out'
the precision in its speed. That raised the
spectre of a physical world whose nature
was, beyond some fundamental level,
unknowable. This problem with the act of
measuring is not confined to the quantum
world, explained senior author of the new
study,
Aephraim Steinberg of the University of
Toronto.
"'You find a similar
thing with all sorts of waves,' he told BBC
News. 'A more familiar example is sound: if
you've listened to short clips of audio
recordings you realise if they get too short
you can't figure out what sound someone is
making, say between a "p" and a "b". If I
really wanted to say as precisely as
possible, "when did you make that sound?", I
wouldn't also be able to ask what sound it
was, I'd need to listen to the whole
recording.'
"The problem with
Heisenberg's theory was that it vastly
predated any experimental equipment or
approaches that could test it at the quantum
level: it had never been proven in the lab. 'Heisenberg had this
intuition about the way things ought to be,
but he never really proved anything very
strict about the value,' said Prof
Steinberg. 'Later on, people came up with
the mathematical proof of the exact
value.'...
"Prof Steinberg and
his team are no stranger to bending quantum
mechanics' rules; in 2011, they
carried out a version
of a classic experiment on photons -- the
smallest indivisible packets of light energy
-- that plotted out the ways in which they
are both wave and particle, something the
rules strictly preclude. This time, they
aimed to use so-called weak measurements on
pairs of photons, putting into practice an
idea first put forward in a 2010
paper in the New
Journal of Physics.
"Photons can be
prepared in pairs which are inextricably
tied to one another, in a delicate quantum
state called
entanglement, and the weak measurement
idea is to infer information about them as
they pass, before and after carrying out a
formal measurement. What the team found was
that the act of measuring did not
appreciably 'blur out' what could be known
about the pairs.
"It remains true that
there is a fundamental limit of knowability,
but it appears that, in this case, just
trying to look at nature does not add to
that unavoidably hidden world. Or, as the
authors put it: 'The quantum world is still
full of uncertainty, but at least our
attempts to look at it don't have to add as
much uncertainty as we used to think!' Whether the finding
made much practical difference was an open
question, said Prof Steinberg.
"'The jury is still
out on that. It's certainly more than a
footnote in the textbooks; it will certainly
change the way I teach quantum mechanics and
I think a lot of textbooks. But there's
actually a lot of technology that relies on
quantum uncertainty now, and the main one is
quantum cryptography -- using quantum
systems to convey our information securely
-- and that mostly boils down to the
uncertainty principle.'" [Quoted from
here. Minor typos corrected; paragraphs merged to save space. Several
links added. Quotation marks altered to
conform with the conventions adopted at this
site. Italic and bold emphases
added.]
September 2012: We encountered this
disturbing news in the New Scientist:
"What if you constantly change the ingredients in your raw
batter, but the baked cake is always lemon? It sounds like something from a
surrealist film, but equivalent scenarios seem to play out all the time in the
mathematics of the quantum world. Nobel prize-winner
Frank Wilczek and colleague Alfred Shapere say we
can't ignore the absurdity of the situation any longer. It's time to get to the
bottom of what is really going on, and in the process cement our understanding
of the fundamental nature of the universe.
"They are part of a broader call to arms against those who
are content to use the maths behind quantum mechanics
without having physical explanations for their more baffling results,
a school of thought often dubbed 'shut up and calculate'. 'I don't see why we should take quantum mechanics as
sacrosanct,' says
Roger Penrose of the University of Oxford. 'I think
there's going to be something else which replaces it.'
"Einstein's
widely accepted theory of special relativity states
that nothing can travel faster than the speed of light. But the phenomenon of
quantum entanglement seems to flout that speed limit by allowing a
measurement of one particle to instantaneously change another, even when the two
are widely separated. Einstein famously called this 'spooky action at a
distance'. 'It's very disturbing,' says Wilczek of the Massachusetts
Institute of Technology. 'It bothered Einstein. It should bother everybody.'
"To underline what they mean Wilczek and Shapere of the
University of Kentucky in Lexington, examined a quantum system affected by a key
aspect of
special relativity: simultaneous events might not look simultaneous to all
observers. If two fireworks go off at exactly the same time in
neighbouring towns, a spectator will be able to see both simultaneously. To an
observer moving from one town to the other, one firework will seem to explode
first. What holds true for you depends on your frame of reference -- that is,
it's relative.
"Now add a third firework. If there's a reference frame in
which it goes off at the same time as only one of the other two, you'd think
there should be another reference frame in which all three happen
simultaneously. Surprisingly, that is not how it works mathematically. Instead
the calculations only work for 4 of the 6 possible orderings (arxiv.org/abs/1208.3841).
The team then applied this test to the quantum world.
When
particles are entangled, they share a 'wave
function'. Physically measuring one of the particles 'collapses' all the
possibilities encoded in the wave function into a single value, instantly
affecting any entangled partners.
"Based on the new maths, if you have three entangled
photons and you measure photon A, you can have reference frames where measuring
A affects C, or measuring A impacts B, but never where measuring A happens
before the effects on both B and C. This implies that measuring photon A can
influence events that happened in the past, creating a mathematical paradox.
'That's the tension: how can you have such large effects on the mathematical
objects without physical consequences?' Wilczek says...." [New
Scientist 22/09/2012, p.13.
The on-line article is slightly different from the printed version. Quotation
marks altered to conform with the conventions adopted at this site. Several links added.
Several paragraphs merged to save space.]
October 2015: Physicist
David Deutsch
had this to say in a recent issue of the New Scientist:
"Probability is as useful to
physics as flat-Earth theory
"You can't
explain how the world really works with probability. It's time for a
different approach
"Probability theory
is a quaint little piece of mathematics. It is about sets of
non-negative numbers that are attached to actual and possible physical
events, that sum to 1 and that obey certain rules. It has numerous
practical applications. So does the flat-Earth theory: for instance,
it's an excellent approximation when laying out your garden.
"Science abandoned
the misconception that Earth extends over an infinite plane, or has
edges, millennia ago. Probability insinuated itself into physics
relatively recently, yet the idea that the world actually follows
probabilistic rules is even more misleading than saying Earth is flat.
Terms such as 'likely', 'probable', 'typical' and 'random', and
statements assigning probabilities to physical events are incapable of
saying anything about what actually will happen.
"We are so familiar
with probability statements that we rarely wonder what 'x has a
probability of ½' actually asserts about the world. Most physicists
think that it means something like: 'If the experiment is repeated
infinitely often, half of the time the outcome will be x.' Yet no
one repeats an experiment infinitely often. And from that statement
about an infinite number of outcomes, nothing follows about any finite
number of outcomes. You cannot even define probability statements as
being about what will happen in the long run. They only say what will
probably happen in the long run.
"The awful secret at
the heart of probability theory is that physical events either happen or
they don't: there's no such thing in nature as probably happening.
Probability statements aren't factual assertions at all. The theory of
probability as a whole is irretrievably 'normative': it says what ought
to happen in certain circumstances and then presents us with a set of
instructions. It is normative because it commands that very high
probabilities, such as 'the probability of x is near 1', should
be treated almost as if they were 'x will happen'. But such a
normative rule has no place in a scientific theory, especially not in
physics. 'There was a 99 per cent chance of sunny weather yesterday'
does not mean 'It was sunny'.
"It all began quite
innocently. Probability and associated ideas such as randomness didn't
originally have any deep scientific purpose. They were invented in the
16th and 17th centuries by people who wanted to win money at games of
chance.
"Gaming the system
"To discover the best
strategies for playing such games, they modelled them mathematically.
True games of chance are driven by chancy physical processes such as
throwing dice or shuffling cards. These have to be unpredictable (having
no known pattern) yet equitable (not favouring any player over another).
The three-card trick, for example, does not qualify: the conjurer deals
the cards unpredictably (to the onlooker) but not equitably. A roulette
wheel that indicates each of its numbers in turn, meanwhile, behaves
equitably but predictably, so equally cannot be used to play a real game
of roulette.
"Earth was known to
be spherical long before physics could explain how that was possible.
Similarly, before game theory, mathematics could not yet accommodate an
unpredictable, equitable sequence of numbers, so game theorists had to
invent mathematical randomness and probability. They analysed games as
if the chancy elements were generated by 'randomisers': abstract devices
generating random sequences, with uniform probability. Such sequences
are indeed unpredictable and equitable -- but also have other, quite
counter-intuitive properties.
"For a start, no
finite sequence can be truly random. To expect fairly tossed dice to be
less likely to come up with a double after a long sequence of doubles is
a falsehood known as the gambler's fallacy. But if you know that a
finite sequence is equitable -- it has an equal number of 1s and 0s, say
-- then towards the end, knowing what came before does make it easier to
predict what must come next.
"A second objection
is that because classical physics is deterministic, no classical
mechanism can generate a truly random sequence. So why did game theory
work? Why was it able to distinguish useful maxims, such as 'never draw
to an inside straight' in poker, from dangerous ones such as the
gambler's fallacy? And why, later, did it enable true predictions in
countless applications, such as
Brownian motion, statistical mechanics
and evolutionary theory? We would be surprised if the four of spades
appeared in the laws of physics. Yet probability, which has the same
provenance as the four of spades but is nonsensical physically, seems to
have done just that.
"The key is that in
all of these applications, randomness is a very large sledgehammer used
to crack the egg of modelling fair dice, or Brownian jiggling with no
particular pattern, or mutations with no intentional design. The
conditions that are required to model these situations are awkward to
express mathematically, whereas the condition of randomness is easy,
given probability theory. It is unphysical and far too strong, but no
matter. One can argue that replacing the dice with a mathematical
randomiser would not change the strategy of an ideally rational dice
player -- but only if the player assumes that pesky normative rule that
a very high probability of something happening should be treated as a
statement that it will happen.
"So the early game
theorists never did quite succeed at finding ways of winning at games of
chance: they only found ways of probably winning. They connected those
with reality by supposing the normative rule that 'very probably
winning' almost equates to 'winning'. But every gambler knows that
probably winning alone will not pay the rent. Physically, it can be very
unlike actually winning. We must therefore ask what it is about the
physical world that nevertheless makes obeying that normative rule
rational.
"You may have
wondered when I mentioned the determinism of classical physics whether
quantum theory solves the problem. It does, but not in the way one might
expect. Because quantum physics is deterministic too. Indeterminism --
what Einstein called 'God playing dice' -- is an absurdity introduced to
deny the implication that quantum theory describes many parallel
universes. But it turns out that under deterministic, multi-universe
quantum theory, the normative rule follows from ordinary,
non-probabilistic normative assumptions such as 'if x is
preferable to y, and y to z, then x is
preferable to z'.
"You could conceive
of Earth as being literally flat, as people once did, and that falsehood
might never adversely affect you. But it would also be quite capable of
destroying our entire species, because it is incompatible with
developing technology to avert, say, asteroid strikes. Similarly,
conceiving of the world as being literally probabilistic may not prevent
you from developing quantum technology. But because the world isn't
probabilistic, it could well prevent you from developing a successor to
quantum theory. In particular, constructor theory -- the framework that
I have advocated for fundamental physics, within which I expect
successors to quantum theory to be developed -- is deeply incompatible
with physical randomness.
"It is easy to accept
that probability is part of the world, just as it's easy to imagine
Earth as flat when in your garden. But this is no guide to what the
world is really like, and what the laws of nature actually are." [Deutsch
(2015); quotation marks altered to conform with the
conventions adopted at this site. Several paragraphs merged to save space.
Emphases in the original.]
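Deutsch's remark about finite 'equitable' sequences is easy to make concrete. The following minimal sketch is my own, not Deutsch's: if a sequence is known in advance to contain exactly n ones and n zeros, arranged in a uniformly random order, then the chance that the next digit is a one depends on what has already been seen, and towards the end it is forced:

from fractions import Fraction

def prob_next_is_one(n, ones_seen, zeros_seen):
    # Probability that the next digit is a 1, given that the whole sequence
    # holds exactly n ones and n zeros in a uniformly random arrangement.
    remaining_ones = n - ones_seen
    remaining_digits = 2 * n - ones_seen - zeros_seen
    return Fraction(remaining_ones, remaining_digits)

n = 10
print(prob_next_is_one(n, 0, 0))     # 1/2  -- nothing seen yet
print(prob_next_is_one(n, 9, 5))     # 1/6  -- nearly all the ones have already appeared
print(prob_next_is_one(n, 10, 9))    # 0    -- the final digit can only be a zero

Contrast that with fairly tossed dice or coins, where -- as the passage says -- past outcomes tell you nothing about the next one; expecting otherwise is the gambler's fallacy.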
November 2012: The New Scientist
back-tracked yet again:
"So we've finally found it. Or have we? Four months on, the identity of the
particle snared at the Large Hadron Collider [LHC -- RL] remains unclear. It may indeed be the much-vaunted Higgs boson. Or it might not. Finding out
will require a welter of tests hard to do in the messy environment of the LHC's
proton collisions (see 'Particle
headache: why the Higgs could spell disaster' --
reproduced below, RL).
"What's needed is...wait for it...a successor to the LHC. Physicists have
already started dreaming of another huge particle smasher, this time based on
electrons, to finally pin down the Higgs. In these straitened times that won't be an easy sell, especially as the LHC
still feels so shiny and new. But a successor was always part of the long-term
plan and will eventually be needed to make more progress. Whatever the LHC
found, the public was captivated. Now is a good time for physicists to start --
subtly -- making their case." [Editorial,
New Scientist, 10/11/12, p.3.
Several paragraphs merged to save space.]
"Particle
headache: Why the Higgs could spell disaster
"Matthew
Chalmers
"If the
particle discovered at CERN this July is all we think it is, there are good
reasons to want it to be something else.
"So Peter Higgs didn't get
this year's Nobel for physics after all. It would have
been the Hollywood ending to a story that began half a century ago with a few
squiggles in his notebook, and climaxed on 4 July this year
with a tear in his eye as physicists armed with a $6
billion particle collider announced they had found the particle that bears his
name. Or something very like it anyway.
Higgs wasn't the only one feeling a little emotional. This was the big one,
after all. The
Higgs
boson completes the grand edifice that is the 'standard
model' of matter and its fundamental interactions. Job done.
"If only things were that simple. As particle physicists gather in Kyoto, Japan,
next week for their
first big
conference since July's announcement, they are still
asking whether that particle truly is the pièce de résistance of the
standard model. And meanwhile, even more subversive thoughts are doing the
rounds: if it is, do we even want it?
"Higgs's squiggles aimed to solve a rather abstruse problem. Back in the early
1960s, physicists were flushed with their ability to describe electromagnetic
fields and forces through the exchange of massless
photons. They
desperately wanted a similar quantum theory for the
weak
nuclear force, but rapidly hit a problem: the calculations demanded that the
particles that transmit this force, now known as the
W and Z
bosons, should be massless too. In reality, they weigh in at around 80 and
90 gigaelectronvolts (GeV), almost 100 times meatier than a
proton. The solution hit upon by Higgs and others was a new field that filled space,
giving the vacuum a positive energy that in turn could imbue particles with
different amounts of mass, according to how much they interacted with it. The
quantum particle of this field was the Higgs boson.
"As the standard model gradually took shape, it became clear how vital it was to
find this particle. The model demanded that in the very early hot universe the
electromagnetic and weak nuclear forces were one. It was only when the Higgs
field emerged a billionth of a second or less after the big bang that the pair
split, in a cataclysmic transition known as
electroweak symmetry breaking. The W and Z bosons grew fat and retreated to
subatomic confines; the photon, meanwhile, raced away mass-free and the
electromagnetic force gained its current infinite range. At the same time, the
fundamental particles that make up matter -- things such as
electrons
and
quarks,
collectively known as
fermions --
interacted with the Higgs field and acquired their mass too. An ordered universe
with a set hierarchy of masses emerged from a madhouse of masslessness.
"It's a nice story, but one that some find a little contrived. 'The minimal
standard model Higgs is like a fairy tale,' says
Guido Altarelli of CERN near Geneva,
Switzerland. 'It is a toy model to make the theory match the data, a crutch to
allow the standard model to walk a bit further until something better comes
along.' His problem is that the standard model is manifestly incomplete. It
predicts the outcome of experiments involving normal particles to accuracies of
several decimal places, but is frustratingly mute on gravity, dark matter and
other components of the cosmos we know or suspect to exist. What we need, say
Altarelli and others, is not a standard Higgs at all, but something subtly or
radically different -- a key to a deeper theory.
"Questions of identity
"Yet so far, the Higgs boson seems frustratingly plain and simple. The particle
born on 4 July was discovered by sifting through the debris of trillions of
collisions between protons within the mighty ATLAS and CMS detectors at CERN's
Large Hadron Collider. For a start, it was spotted
decaying into W and Z bosons, exactly what you would expect from a particle
bestowing them with mass. Even so, a definitive ID depends on fiddly measurements of the particle's
quantum properties (see 'Reflections
on spin'). 'The task facing us now is ten times harder
than making the discovery was,' says
Dave Newbold of the University of Bristol, UK, a
member of the CMS collaboration.
"Beyond that, a standard-model Higgs has to decay not just into
force-transmitting bosons, but also to matter-making fermions. Here the waters
are little muddier. The particle was also seen decaying into two photons, which
is indirect proof that it interacts with the heaviest sort of quark, the
top quark:
according to the theory, the Higgs cannot interact directly with photons because
it has no electric charge, so it first splits into a pair of top quarks and
antiquarks that in turn radiate photons. Further tentative evidence for
fermion interactions comes from the US, where researchers on the now-defunct
Tevatron collider at Fermilab in Batavia, Illinois, have seen a hint of the
particle decaying into
bottom
quarks.
"But equally, the CMS detector has measured a shortfall of decays into
tau
leptons, a heavier cousin of the electron. If substantiated, that could
begin to conflict with standard model predictions; ATLAS is expected to present
its first tau-decay measurements in Kyoto next week. Both ATLAS and CMS see more
decays into photons than expected, perhaps signalling the influence of new
processes and particles beyond the standard model.
"It is too early to draw any firm conclusions. Because we know the new
particle's mass fairly well -- it is about 125 GeV, or 223 billionths of a
billionth of a microgram -- we can pin down the rates at which it should decay
into various particles to a precision of about 1 per cent, if it is the standard
Higgs. Because of the limited number of decays seen so far, however, the
measurement uncertainty on the new particle's decay rates is more like 20 or
even 30 per cent. By the end of the year, ATLAS and CMS will have around two and
a half times the data used for the July announcement, but that still won't
reduce the uncertainty enough. Then the LHC will be shut down for up to two
years to be refitted to collide protons at higher energies. 'We're probably not
going to learn significantly more about the new particle in the immediate
future,' says Newbold.
"What physicists would like to fill this vacuum is a new collider altogether.
The LHC is not exactly ideal anyway: it smashes protons together, and protons
are sacks of quarks and other innards that make measurements a messy business.
Researchers are lobbying for a cleaner electron-positron
collider, possibly in Japan, to close the Higgs file, but that too is a distant
prospect. So we are left with a particle that looks like the standard Higgs, but we can't
quite prove it. And that leaves us facing an elephant in the accelerator tunnel:
if it is the standard Higgs, how can it even be there in the first place?
"The problem lies in the prediction of quantum theory, confirmed by experiments
at CERN's previous mega-accelerator, the
Large Electron Positron collider, that particles
spontaneously absorb and emit
'virtual' particles by borrowing energy from the vacuum. Because the Higgs
boson itself gathers mass from everything it touches, these processes should
make its mass balloon from the region of 100 GeV to 10^19
GeV. At this point, dubbed the
Planck scale,
the fundamental forces go berserk and gravity -- the comparative weakling of
them all -- becomes as strong as all the others. The consequence is a
high-stress universe filled with black holes and oddly warped
space-time.
"Conspirators sought
"One way to avert this disaster is to set the strength of virtual-particle
fluctuations that cause the problem so they all cancel out, reining in the Higgs
mass and making a universe more like the one we see. The only way to do that
while retaining a semblance of theoretical dignity, says Altarelli, is to invoke
a conspiracy brought about by a suitable new symmetry of nature. 'But where you
have a conspiracy you must have conspirators.'
"At the moment, most physicists see those conspirators in the hypothetical
superpartners, or 'sparticles',
predicted by
the theory of supersymmetry. One of these sparticles
would partner each standard model particle, with the fluctuations of the
partners neatly cancelling each other out. These sparticles must be very heavy:
the LHC has joined the ranks of earlier particle smashers in ruling them out
below a certain mass, currently around 10 times that of the putative Higgs.
"That
has already put severe pressure on even the simplest
supersymmetric models. But all is not lost, according to
James
Wells of CERN's theory group. If you don't find
sparticles with low masses, you can twiddle the theory, to an extent, and 'dial
them up' to appear at higher masses. 'We expected that the Higgs would be found
and that a supporting cast would be found with it, but not necessarily at the
same energy scale,' he says. [Tweaking the 'epicycles'? -- RL.]
"Even so, the goalposts cannot be shifted too far: if the sparticles get too
heavy, they won't stabilise the Higgs mass in a convincingly 'natural' way.
Sparticles are also hotly sought after as candidates to make up the universe's
missing dark matter. Updates will be presented in Kyoto next week, and there is
also hope for indirect leads to supersymmetry from measurements of anomalies in
the decay rates of other standard-model particles. If nothing stirs there, all
eyes are on what happens when the LHC roars back early in 2015 at near double
its current collision energy. A revamped LHC should be able to conjure more
massive sparticles from thin air, or perhaps even more radical particles such as
those associated with extra dimensions of space. These particles amount to
another attempt to fill the gap between where the Higgs 'should be' -- at the
Planck scale -- and where it actually is.
"The weirdest scenario of them all, though, is if there is nothing but
tumbleweed between the energies in which the standard model holds firm and those
of the Planck scale, where
quantum field theories and Einstein's gravity break down. How then would we explain the vast discrepancy between the Higgs's actual mass and that
predicted by quantum theory?
"Teetering on the brink
"One solution is to just accept it: if things were not that way, the masses of
all the particles and their interaction strengths would be very different,
matter as we know it would not exist, and we would not be here to worry about
such questions. Such anthropic reasoning, which uses our existence to exclude
certain properties of the universe that might have been possible, is often
linked with
the concept of a multiverse -- the idea that there are
innumerable universes out there where all the other possible physics goes on. To
many physicists, it is a cop-out. 'It looks as if it's an excuse to give up on
deeper explanations of the world, and we don't want to give up,' says
Jon Butterworth
of University College London, who works on the ATLAS experiment.
"But a second fact about the new particle gives renewed pause for thought.
Not
only is its 125 GeV mass vastly less than it should be, it is also about as
small as it can possibly be without dragging the universe into another
catastrophic transition. If it were just a few GeV lighter, the strength of the
Higgs interactions would change in such a way that the lowest energy state of
the vacuum would dip below zero. The universe could then at some surprise moment
'tunnel' into this bizarre state, again instantly changing the entire
configuration of the particles and forces and obliterating structures such as
atoms.
"As things stand, the universe is seemingly teetering on the cusp of eternal
stability and total ruin. 'It's an interesting coincidence that we are right on
the border between these two phases,' says CERN theorist
Gian Giudice, who set about calculating the
implications of a 125 GeV Higgs as soon as the first strong hints came out of
the LHC in December last year.
"He doesn't know what the answer is. In any case, finding any new particles will
change the game once more. 'There are many questions in the history of science
whose answers have turned out to be environmental rather than fundamental,' says
Giudice. 'The slightest hint of new physics and my calculation will be
forgotten.' So that is what all eyes will really be on in Kyoto. Higgs's
squiggles seem to have become reality -- but for a more satisfying twist to the
tale, we must hope some other squiggles show similar signs of life soon." [New
Scientist, 10/11/12, pp.34-37.
Several links added. Quotations marks
altered to conform with the conventions adopted at this site. The on-line version
has a different title to the published article. Bold emphases added. Typo
corrected. Several paragraphs merged to save space.]
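As an aside, the mass figure quoted above -- 125 GeV, or about 223 billionths of a billionth of a microgram -- is straightforward to check. Here is a rough sketch of the arithmetic (mine, not the New Scientist's), using E = mc² and the standard value 1 GeV = 1.602 × 10^-10 joules:

GEV_IN_JOULES = 1.602176634e-10     # 1 GeV expressed in joules
SPEED_OF_LIGHT = 2.99792458e8       # metres per second

mass_kg = 125 * GEV_IN_JOULES / SPEED_OF_LIGHT**2   # m = E / c**2
mass_micrograms = mass_kg * 1e9                     # 1 kg = 10^9 micrograms

print(mass_micrograms)          # roughly 2.2e-16 micrograms
print(mass_micrograms / 1e-18)  # roughly 223 'billionths of a billionth' of a microgram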
Update November 2014: It looks like
the Higgs Boson might not have been discovered, after all; here is what Science News
had to say:
"Techni-Higgs: European Physicists Cast Doubt on Discovery
of Higgs Boson
"'The CERN data
are generally taken as evidence that the
particle is the
Higgs particle.
It is true that the Higgs particle can explain
the data but there can be other explanations, we
would also get these data from other particles,'
said Dr Mads Toudal Frandsen of the University
of Southern Denmark, the senior author of the
study published in the
journal
Physical Review D
(arXiv.org
preprint). The study does not debunk the possibility that
CERN physicists have discovered the Higgs boson.
That is still possible -- but it is equally
possible that it is a different kind of
particle.
"'The current data is not precise enough to
determine exactly what the particle is. It could
be a number of other known particles,' Dr
Frandsen said. 'But if it wasn't the Higgs
particle, that was found in CERN's particle
accelerator, then what was it? We believe that
it may be a so-called
techni-Higgs particle.
This particle is in some ways similar to the
Higgs boson -- hence half of the name.'
"Although the techni-Higgs particle and Higgs
boson can easily be confused in experiments,
they are two very different particles belonging
to two very different theories of how the
Universe was created. The Higgs
particle is the missing piece in the theory
called the
Standard Model. This theory describes three of the four forces
of nature.
"'But it does not explain what dark matter is --
the substance that makes up most of the
universe. A techni-Higgs particle, if it exists,
is a completely different thing,' the scientists
said. 'A techni-Higgs particle is not an elementary
particle. Instead, it consists of so-called
techni-quarks, which we believe are elementary.
Techni-quarks may bind together in various ways
to form for instance techni-Higgs particles,
while other combinations may form dark matter,'
Dr Frandsen said.
"'We therefore expect to find several different
particles at CERN's Large Hadron Collider, all
built by techni-quarks.' If techni-quarks exist, there must be a force
to bind them together so that they can form
particles. None of the four known forces of nature
(gravity, the electromagnetic force, the weak
nuclear force and the strong nuclear force) are
any good at binding techni-quarks together.
"'There must therefore be a yet undiscovered
force of nature. This force is called the
technicolor force. What was found last year in CERN's accelerator
could thus be either the Higgs particle of the
Standard Model or a light techni-Higgs particle,
composed of two techni-quarks.' More data from CERN will probably be able to
determine if it was a Higgs or a techni-Higgs
particle.
"'If CERN gets an
even more powerful accelerator, it will in
principle be able to observe techni-quarks
directly.'" [Taken from
here;
accessed 11/08/2015. Quotation marks altered to
conform with the conventions adopted at this site.
Bold emphasis alone added. Several paragraphs
merged to save space.]
And we also read the following:
"God particle
may not be God particle: Scientists in shock claim
"'Data not
precise enough to determine exactly what it is'
"A new
scholarly paper has raised suspicions in boffinry circles as to
whether last year's breakthrough discovery by CERN was indeed
the fabled, applecart-busting Higgs boson. The
report
from the University of Southern Denmark suggests that while
physicists working with data from the Large Hadron Collider
(LHC) did discover a new particle, the data might not point to
the fabled Higgs boson, but rather to a different particle that
behaves similarly.
"'The
CERN data is generally taken as evidence that the particle is
the Higgs particle. It is true that the Higgs particle can
explain the data but there can be other explanations, we would
also get this data from other particles,'
said associate professor Mads Toudal
Frandsen. 'The current data is not precise enough to determine
exactly what the particle is. It could be a number of other
known particles.'
"The
Southern Denmark researchers suggest that the particle
discovered by the LHC may not have been the Higgs boson, but
rather a 'techni-higgs' particle that's composed of
'techni-quarks.' Such a particle might behave similarly to the
Higgs particle but in fact is very different from the genuine
Higgs boson. If the researchers are right, their report would
discredit the claims of
discovery of the Higgs boson, which
has been sought because its existence would fill vital holes in
the Standard Model of physics.
"The
researchers claim that although their findings may disprove the
Higgs boson discovery, they also pave the way for the discovery
of another force -- one not yet uncovered -- that would be
responsible for binding the techni-quarks into particles,
including those that form dark matter. The group says that more
data is needed to establish whether the particle observed by
CERN was indeed the Higgs boson or otherwise. One way, they say,
would be for CERN to build an even larger collider to better
observe the particles and provide more evidence as to the
existence of the theorized techni-quarks." [Quoted from
here.
Accessed 11/08/2015. Links and capitals in the original. Some
paragraphs merged to save space; quotation marks altered to
conform with the conventions adopted at this site. Bold emphasis
added. See also
here.]
"CERN May Not Have Found The
Higgs Boson
"There's been some
buzz recently about doubts regarding the existence
of the Higgs boson. The popular press has picked up
on this a bit, leading to claims that it's similar
to the
BICEP2 debacle.
In this case the comparison is unfounded. There's
been some interesting work, but nothing that merits
tracking down Peter Higgs and taking back his Nobel
prize.
"The buzz is based
upon a new article in Physical Review D claiming
there
isn't enough evidence to
support the Higgs over other alternatives.
That might sound similar to the BICEP2 case for
cosmic inflation, but it isn't. The 'alternatives'
presented in the paper involve an extension of the
standard model known as technicolor. The name
derives from the
standard model of particle
physics,
where the quarks that comprise protons and neutrons
(among other particles) interact through strong
nuclear 'colour' charges. Technicolor models extend
the colour model, hence the name.
"Technicolor models were developed to address
certain difficulties in the standard model, such as
symmetry breaking
in the
weak nuclear force,
through which particles can gain mass. In the
standard model the Higgs field is used as the mass
mechanism, but in technicolor models it is done by
technicolor gauge particles.
It all gets pretty complicated, but the upshot is
that in the technicolor model there isn't a Higgs
boson, and particle mass is generated by other
means.
"But given that we've discovered the Higgs boson,
and even awarded Nobel prizes for it, doesn't that
mean the technicolor models must be wrong? Not
necessarily, which is what this paper is about.
Although the Higgs boson doesn't exist as a
fundamental particle in technicolor models, there
are ways that technicolor can produce a composite
particle that looks similar to the Higgs, which the
authors call a techni-Higgs. Just as a proton is
made of quarks, a techni-Higgs is made of
techni-quarks.
"What the authors show is that the data we have so
far isn't sufficient to distinguish between the
Higgs and the techni-Higgs. While that's true, it
isn't enough to cast doubt on the Higgs at this
point. While there are some theoretical advantages
to technicolor, there is currently no experimental
evidence to support it. This kind of work is
worthwhile because it helps prevent us from assuming
too much about the data we have, but just because
what we've discovered could
be a techni-Higgs, doesn't mean it is. So there's no reason to doubt the Higgs yet, and
supporters of this alternative model can continue to
have technicolor dreams." [Quoted from
here.
Accessed 11/08/2015. Bold emphases and several links
added; spelling altered to UK English. Typo
corrected. Several paragraphs merged to save space.]
November 2015:
The 'discovery' of the Higgs boson might in fact
have created more problems than it 'solved':
"The Higgs mass mystery: Why
is everything so light?
"Our best models can't explain
why the mass-giving Higgs boson is so small....
"To most of us the mass of an eyelash seems like
just about nothing. But to a Higgs boson -- the
particle believed to endow all others with their
mass -- it might as well weigh a tonne. The mass of
the Higgs has a bearing on all the other particles
that make up reality, and if it were as large as an
eyelash the world would look very different. The
electrons buzzing inside your computer's circuits
would be as weighty as the dust coating the top of
it. If the dust bulked up on the same scale, each
speck would have roughly the mass of a well-fed
elephant.
"A strange world indeed.
Yet believe the standard
model, our best theory of particle physics, and this
is just the sort of situation we should expect to
find ourselves in. According to this idea, the Higgs
should be roughly the mass of an eyelash. This is so
big that it would produce fundamental particles
almost so massive and dense that they would create a
microscopic black hole every time they collided. Yet
none of the fundamental particles -- electrons,
quarks, neutrinos and so on -- are anywhere near the
mass they ought to be. They're all much
smaller: 100 quadrillion times smaller.
"Why the clustering
at the bottom? This 'hierarchy problem' has haunted
physicists for decades, and there's never been an
easy answer. The front runner, a
theory known as
supersymmetry,
has fallen from grace now that the Large Hadron
Collider (LHC), located near Geneva, Switzerland,
has searched the most obvious avenues for evidence
and failed to find it...." [Quoted from
here;
accessed 18/11/2015. Quotation marks altered to
conform with the conventions adopted at this site.
Bold emphasis added. Link in the original.]
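The two figures in that passage -- a Higgs 'roughly the mass of an eyelash' and fundamental particles '100 quadrillion times smaller' than they 'ought' to be -- can be checked in the same rough-and-ready way. The sketch below is my own; it simply compares the Planck mass (about 2.18 × 10^-8 kg, which is indeed in eyelash territory) with the measured 125 GeV Higgs:

PLANCK_MASS_KG = 2.176434e-8        # CODATA value of the Planck mass
PLANCK_SCALE_GEV = 1.22e19          # the corresponding (Planck) energy, in GeV
HIGGS_MASS_GEV = 125.0              # the measured Higgs mass

print(PLANCK_MASS_KG * 1e9)                 # about 22 micrograms
print(PLANCK_SCALE_GEV / HIGGS_MASS_GEV)    # about 10^17, i.e. ~100 quadrillion

So, on the standard-model reasoning sketched in the article, the 'natural' Higgs mass sits some seventeen orders of magnitude above the one actually measured -- which is the hierarchy problem in a nutshell.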
"Popular
physics theory running out of hiding places
"By Pallab Ghosh,Science correspondent, BBC News
"Researchers at the
Large Hadron Collider have detected one
of the rarest particle decays seen in
Nature. The finding deals a significant blow
to the theory of physics known as
supersymmetry. Many researchers had
hoped the LHC would have confirmed this by
now. Supersymmetry, or SUSY, has gained
popularity as a way to explain some of the
inconsistencies in the traditional theory of
subatomic physics known as the
Standard Model. The new observation,
reported at the Hadron Collider Physics
conference in Kyoto, is not consistent with
many of the most likely models of SUSY.
"Prof Chris Parke, who
is the spokesperson for the UK Participation
in the LHCb experiment, told BBC News: 'Supersymmetry
may not be dead but these latest results
have certainly put it into hospital.'
Supersymmetry theorises the existence of
more massive versions of particles that have
already been detected. Their existence would
help explain why galaxies appear to rotate
faster than the Standard Model would
suggest. Physicists have speculated that as
well as the particles we know about,
galaxies contain invisible, undetected
dark matter made up of super particles.
The galaxies therefore contain more mass
than we can detect and so spin faster.
"Researchers at the
LHCb detector have dealt a serious blow to
this idea. They have measured the decay
between a particle known as a
Bs Meson into two particles known as
muons. It is the first time that this
decay has been observed and the team has
calculated that for every billion times that
the Bs Meson decays it only decays in this
way three times.
"If
superparticles were to exist the decay
would happen far more often. This test is
one of the 'golden' tests for
supersymmetry and it is one that on the
face of it this hugely popular theory among
physicists has failed. Prof Val Gibson,
leader of the Cambridge LHCb team, said that
the new result was 'putting our
supersymmetry theory colleagues in a spin'.
The results are in fact completely in line
with what one would expect from the Standard
Model. There is already concern that the
LHCb's sister detectors might have expected
to have detected superparticles by now, yet
none have been found so far.
"If
supersymmetry
is not
an
explanation
for dark
matter,
then
theorists
will
have to
find
alternative
ideas to
explain
those
inconsistencies
in the
Standard
Model.
So far
researchers
who are
racing
to find
evidence
of so
called
'new
physics'
have run
into a
series
of dead
ends.
'If new
physics
exists,
then it
is
hiding
very
well
behind
the
Standard
Model,'
commented
Cambridge
physicist
Dr
Marc-Olivier
Bettler,
a member
of the
analysis
team.
The
result
does not
rule out
the
possibility
that
super
particles
(sic)
exist.
But
according
to Prof
Parkes,
'they
are
running
out of
places
to
hide'."
[Quoted
from
here.
Accessed
20/11/2012. Quotation
marks
altered
to
conform
with the
conventions
adopted
at this
site.
Paragraphs
merged
to save
space.
Bold
emphases
and
links
added.
October 2013: An editorial in the
New Scientist had this to say:
"The idea of putting a dead salmon in a brain
scanner would be funny if it were not so serious. When Craig Bennett of the
University of California, Santa Barbara, tried it in 2009, he wasn't expecting
to find anything -- he was just doing test runs on the machine. But when he
looked at the data he got a shock. The fish's brain and spinal column were
showing
signs of neural activity.
"There was no such activity, of course. The
salmon was dead. But the signal was there, and it confirmed what many had been
quietly muttering for years: there's something fishy about neuroscience.
"When fMRI brain scanners were invented in the
early 1990s, scientists and the general public were seduced by the idea of
watching the brain at work. It seems we got carried away. The field is plagued
by false positives and other problems. It is now clear that the majority --
perhaps the vast majority -- of neuroscience findings are as spurious as brain
waves in a dead fish (see
Hidden
depths: Brain science is drowning in uncertainty).
"That seems shocking, and not just because
neuroscience has appeared to be one of the most productive research areas of
recent years. Some of those dodgy findings are starting to make their way into
the real world, such as in ongoing debates about the use of
fMRI evidence in court.
"Some historical perspective is helpful here,
however. The problems are not exclusive to neuroscience. In 2005, epidemiologist
John Ioannidis published a bombshell of a paper called 'Why most published
research findings are false'. In it he catalogued a litany of failures that
undermine the reliability of science in general. His analysis concluded that at
least half, and possibly a large majority, of published research is wrong.
"Ioannidis might have expected anger and denial,
but
his paper was well received.
Scientists welcomed the chance to debate the flaws in their practices and work
to put them right.
"Things are by no means perfect now. Scientists
are under immense pressure to make discoveries, so negative findings often go
unreported, experiments are rarely replicated and data is often 'tortured until
it confesses'. But -- thanks in no small part to Ioannidis's brutal honesty --
all of those issues are now out in the open and science is working to address
them. The kerfuffle over neuroscience is just the latest chapter in a
long-running saga." [New
Scientist 220, 2939, 18/10/2013, p.3. Accessed 25/10/2013. Links in the original.
Quotation marks altered to conform with the conventions adopted at this site.]
Add the above to the comments posted in
Essay Thirteen
Part Three.
October 2013: As the above copy of the
New Scientist
also noted:
"Some historical perspective is helpful here,
however. The problems are not exclusive to neuroscience. In 2005, epidemiologist
John Ioannidis published a bombshell of a paper called 'Why most published
research findings are false'. In it he catalogued a litany of failures that
undermine the reliability of science in general. His analysis concluded that at
least half, and possibly a large majority, of published research is wrong.
"Ioannidis might have expected anger and denial,
but
his paper was well received.
Scientists welcomed the chance to debate the flaws in their practices and work
to put them right.
"Things are by no means perfect now. Scientists
are under immense pressure to make discoveries, so negative findings often go
unreported, experiments are rarely replicated and data is often 'tortured until
it confesses'. But -- thanks in no small part to Ioannidis's brutal honesty --
all of those issues are now out in the open and science is working to address
them. The kerfuffle over neuroscience is just the latest chapter in a
long-running saga." [New
Scientist220, 2939, 18/10/2013, p.3. Links in the original.
Quotation marks altered to conform with the conventions adopted at this site.]
And here is an article from a few years earlier:
"Interview: The man who
would prove all studies wrong
"When the clinical
epidemiologist John Ioannidis published a paper entitled 'Why most
published research findings are false' in 2005, he made a lot of scientists very
uncomfortable. The study was the result of 15 years' work cataloguing the
factors that plague the interpretation of scientific results, such as the misuse
of statistics or poor experimental design. Ioannidis tells Jim Giles why
his conclusion is not as depressing as it appeared, and what he is doing to
improve matters.
"You've been described as
the 'man who would prove all studies wrong'. What was it like to find yourself
in this role?
"Overall, the reaction I got
was very positive. People should not feel threatened by me: science is an
evolutionary process, and contradiction and falsification are part of the game.
We have tons of literature and a lot of it will eventually be refuted, but that
is not bad news. If a small proportion is correct and survives then we will have
progress.
"How did you end up taking
on the whole of science?
"Some of the early work I did
looked at whether small studies give the same results as larger ones. After
looking at hundreds of studies I started to ask: how often do the results of
different studies agree with each other? My conclusion was that sometimes small
studies disagreed with large ones beyond the level of chance. Early small
studies generally tended to claim more dramatic results than subsequent larger
studies. It's not just because scientists often oversell their results; it's
also because small studies with negative results are often filed away and never
published.
"How
did you end up taking on the whole of science?
"My parents were physicians
and I trained in internal medicine. I wanted to deal with people and feel that I
could improve their health, but I liked mathematics too. It was difficult for me
to choose between the two. Then in 1993, I met Tom Chalmers and Joseph Lau. Tom
was one of the first people to run a clinical trial and probably the first
physician to combine the results of several studies in a meta-analysis -- he and
Joseph described cumulative meta-analysis in 1992. They introduced me to the
idea of evidence-based medicine. That meeting had a great influence on me. It
showed me how to inject robust quantitative thinking into clinical work.
"A lot of your work relies
on complex statistical arguments. Can you explain them in simple terms?
"In my 2005 paper 'Why
most published research findings are false' (PLoS
Medicine, vol 2, p.e124), I tried to model the probability of a research
finding being true. By research finding I mean any association that is tested
empirically. You can make some inferences based on how big the study is, since
bigger studies tend to be more reliable. You also need to know what level of
statistical significance a researcher is using when they claim a result. There
are other things to try and compensate for, such as the fact that researchers
seek out positive results to get further funding. These are the layers of
complexity I tried to model.
"What did your modelling
reveal?
"For some areas, if you get a
positive result then it is 85 per cent or even 90 per cent likely to be true.
That is the case for very large-scale randomised clinical trials with very clear
protocols, full reporting of results and a long chain of research supporting the
finding beforehand. Then there is the other extreme, where an experiment is so
poorly designed or so many analyses are performed that a statistically
significant finding wouldn't have better than a 1-in-1000 chance of being true.
"If most studies are
wrong, how can science progress?
"Things change as a field
matures. In fields that generate data very quickly, you can get one study with
an extreme result and then in less than a year you get another with the opposite
result. The subsequent research falls somewhere in the middle. I think it works
this way because an extreme finding sells in the literature. Once it's
published, you get lots of competition in the field. Another group may happen to
find something that is extreme in the opposite direction. I don't think it's
fraud. We are talking about a sea of analyses. With one click on my computer I
can find thousands of results. It's not that difficult for one team to
contradict another very quickly.
"With so much refutation
going on, how can we know what to believe?
"By keeping an open mind and
trying to be cautious and critical. What's missing from a lot of papers is a
sense of whether the result is tentative and unlikely to be true, or whether it
has high credibility. This can be really important. For example, we need a lot
of certainty about medical care before writing guidelines recommending a certain
drug. But in some other fields, research with low credibility is extremely
interesting. Most molecular science is highly exploratory and complex, with
occasional hints of interesting associations.
"How often have you come
across high-profile oft-cited papers that later turn out to be wrong?
"This is actually a common
scenario. Some colleagues and I have looked at high-profile papers, with over
1000 citations each, that were later completely contradicted by large,
well-conducted studies. One example is the finding that beta-carotene protects
against cancer. It doesn't, but we found a sizeable component of literature
where these original beliefs were still supported. It's hard to believe the
researchers had never heard they had been refuted.
"People aren't willing to
abandon their hypothesis. If you spend 20 years on a specific line of thought
and suddenly your universe collapses, it is very difficult to change jobs.
"How are you trying to
improve matters?
"I'd like researchers to
include credibility estimates in their papers. Many fields rely on a measure of
statistical significance called a p-value. The problem is that the same p-value
may have very different credibility depending on how the analysis was done and
what results preceded it. If you take these into account, you can measure the
credibility for a particular finding -- the percentage chance that the largest,
most perfect study possible would produce the same outcome. There is uncertainty
also in the credibility, but I think we can say what ballpark we are in.
"How should we promote the
studies that produce more credible results, rather than those that are simply
statistically significant?
"There are several ways to do
this. One: do larger, well-designed studies. Two: instead of having 10 teams of
researchers, each working behind closed doors, investigators should collaborate
and study the same questions. All the data should be made publicly available. If
one team comes up with an interesting result then the whole consortium should
try to replicate it. Much of the work I've been doing for the past 10 years has
been about creating consortia to carry out research. The experience has been
very positive." [New
Scientist, 2643, 16/02/2008, pp.44-45. Link and bold emphases in the
original; italic emphases added. Quotation marks altered to conform with the conventions adopted at this
site.]
[Those confident with the mathematics can consult
the original article (from PLoS Medicine), posted
here.]
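For readers who want the gist without the full paper, the core of Ioannidis's model is a simple Bayesian calculation: the chance that a 'statistically significant' finding is true (the positive predictive value) depends on the prior odds R that the tested relationship is real, the study's power (1 - β) and its significance threshold α. Ignoring bias and multiple competing teams, the paper's basic formula is PPV = (1 - β)R / ((1 - β)R + α). Here is a minimal sketch; the specific input numbers below are my own illustrative choices, not Ioannidis's:

def positive_predictive_value(prior_odds, power, alpha):
    # PPV = (1 - beta) * R / ((1 - beta) * R + alpha): the chance that a
    # statistically significant result reflects a genuine relationship.
    true_positive_rate = power * prior_odds
    return true_positive_rate / (true_positive_rate + alpha)

# A large, well-powered confirmatory trial of a well-supported hypothesis:
print(positive_predictive_value(prior_odds=1.0, power=0.8, alpha=0.05))    # about 0.94

# An exploratory search in which perhaps 1 in 1,000 tested associations is real:
print(positive_predictive_value(prior_odds=0.001, power=0.2, alpha=0.05))  # about 0.004

Add in publication bias, flexible analyses and many competing teams -- which the full paper does model -- and figures of the second kind fall further still, which is how Ioannidis arrives at the bleak headline conclusion quoted above.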
Of course, the above interview makes no
distinction between a scientific theory and the facts
discovered by scientists. Moreover, it also ignores the massive amount of
fraud in Medical Science and Pharmacology (and, indeed, the rest of
science). [On this, see Angell (2005)
and Goldacre (2012), as well as some of the books listed
earlier. (Details
about these two books can be accessed
here).]
[The last point is also connected with
the content of the next sub-section, where much of the relevant data is heavily controlled by Big Pharma.]
"Swapping
butter for a sunflower spread may not lower
heart risk, say British Heart Foundation
researchers. Contrary to guidance, there is
no evidence that changing the type of fat
you eat from 'bad' saturated to 'healthier'
polyunsaturated cuts heart risk. They
looked at data from 72 studies with more
than 600,000 participants. Heart experts
stressed the findings did not mean it was
fine to eat lots of cheese, pies and cakes.
"Too much
saturated fat can increase the amount of
cholesterol in the blood, which can
increase the risk of developing
coronary heart disease. Saturated fat is
the kind of fat found in butter, biscuits,
fatty cuts of meat, sausages and bacon, and
cheese and cream. Most of us
eat too much of it -- men should eat no more
than 30g a day and women no more than 20g a
day. There has been a big health drive to
get more people eating unsaturated fats -- such
as olive and sunflower oils and other
non-animal fats -- instead.
"But
research published in Annals of Internal Medicine,
led by investigators at the University of
Cambridge, found no evidence to support
this. Total saturated fat, whether measured
in the diet or in the bloodstream as a
biomarker, was not associated with coronary
disease risk in the 72 observational
studies. And polyunsaturated fat intake did
not offer any heart protection.
"Trans
fats
were
strongly
and
positively
associated
with
risk of
heart
diseases.
These
artificial
fats,
found in
many
processed
food
items
and
margarine
spreads,
should
continue
to be
regulated
and
avoided,
say the
study
authors. Lead
researcher
Dr Rajiv
Chowdhury
said: 'These
are
interesting
results
that
potentially
stimulate
new
lines of
scientific
inquiry
and
encourage
careful
reappraisal
of our
current
nutritional
guidelines.'
"He
added
that the
common
practice
of
replacing
saturated
fats in
our diet
with
excess
carbohydrates
(such as
white
bread,
white
rice,
potatoes
etc.), or
with
refined
sugar
and
salts in
processed
foods
should
be
discouraged.
'Refined
carbohydrates,
sugar
and salt
are all
potentially
harmful
for
vascular
health,'
he said.
The
British
Heart
Foundation
said the
findings
did not
change
the
advice
that
eating
too much
fat is
harmful
for the
heart.
"Prof
Jeremy
Pearson,
the
charity's
associate
medical
director,
said:
'This
research
is not
saying
that you
can eat
as much
fat as
you
like.
Too much
fat is
bad for
you. But,
sadly,
this
analysis
suggests
there
isn't
enough
evidence
to say
that a
diet
rich in
polyunsaturated
fats but
low in
saturated
fats
reduces
the risk
of
cardiovascular
disease.
Alongside
taking
any
necessary
medication,
the best
way to
stay
heart
healthy
is to
stop
smoking,
stay
active,
and
ensure
our
whole
diet is
healthy
-- and
this
means
considering
not only
the fats
in our
diet but
also our
intake
of salt,
sugar
and
fruit
and
vegetables.'"
[Quoted
from
here;
accessed
19/03/2014.
Quotation
marks
altered
to
conform
with the
conventions
adopted
at this
site;
several
paragraphs
merged
to save
space.
All but
one link
and
emphases
added.]
Of course,
'maverick'
health scientists have been arguing along these lines for years; see, for example,
Campbell-McBride
(2007). [However, also see
here for a
different view on
Campbell-McBride's work. Details concerning Campbell-McBride's book can be found
here.]
And then there is this:
Video One:
How Bad Science And Big Business Created The
Obesity Epidemic
August 2014: The New Scientist
poses the following question: "Did we really get 40 years of dietary advice
wrong?" The answer, as it turns out, is rather complex:
"Heart attack on a plate? The
truth about saturated fat
"After decades
of health warnings, the idea that steak, cheese and lard are bad for your heart
is melting away. The truth is more complex -- and delicious
"There's a famous
scene in Woody Allen's film
Sleeper in which two scientists in the year
2173 are discussing the dietary advice of the late 20th century.
'You mean there
was no deep fat, no steak or cream pies or hot fudge?' asks one, incredulous.
'Those were thought to be unhealthy,' replies the other. 'Precisely the opposite
of what we now know to be true.'
"We're not quite
in Woody Allen territory yet, but steak and cream pies are starting to look a
lot less unhealthy than they once did. After 35 years as dietary gospel, the
idea that saturated fat is bad for your heart appears to be melting away like a
lump of butter in a hot pan. So is it OK to eat more red meat and cheese? Will
the current advice to limit saturated fat be overturned? If it is, how did we
get it so wrong for so long? The answers matter. According to the World Health
Organization,
cardiovascular disease is the world's leading cause of death, killing more
than 17 million people annually, about a third of all deaths. It predicts that
by 2030, 23 million will succumb each year. In the US, an estimated 81 million
people are living with cardiovascular disease. The healthcare bill is a small
fortune.
"The idea that
eating saturated fat -- found in high levels in animal products such as meat and
dairy -- directly raises the risk of a heart attack has been a mainstay of
nutrition science since the 1970s. Instead, we are urged to favour the 'healthy'
fats found in vegetable oils and foods such as fish, nuts and seeds. In the US
the official guidance for adults is that no more than 30 per cent of total
calories should come from fat, and no more than 10 per cent from saturated
fat.... UK advice is roughly the same. That is by no means an unattainable
target: an average man could eat a whole 12-inch pepperoni pizza and still have
room for an ice cream before busting the limit. Nonetheless, adults in the UK
and US manage to eat more saturated fat than recommended.
"We
used to eat even more. From the 1950s to the late 1970s,
fat accounted for more than 40 per cent of dietary calories in the
UK. It was a similar story in the US. But as warnings began to
circulate, people trimmed back on foods such as butter and beef. The
food industry responded, filling the shelves with low-fat cookies,
cakes and spreads. So the message got through, at least partially.
Deaths from heart disease have gone down in Western nations. In the
UK in 1961 more than half of all deaths were from coronary heart
disease; in 2009 less than a third were. But medical treatment and
prevention have improved so dramatically it's impossible to tell
what role, if any, changes in diet played. And even though fat
consumption has gone down, obesity and its associated diseases have
not.
"To
appreciate how saturated fat in food affects our health we need to
understand how it is handled by the body, and how it differs from
other types of fat. When you eat fat, it travels to the small
intestine where it is broken down into its constituent parts --
fatty acids and glycerol -- and absorbed into cells lining the gut.
There they are packaged up with cholesterol and proteins and posted
into the bloodstream. These small, spherical packages are called
lipoproteins, and they are what allow water-insoluble fats and
cholesterol (together known as lipids) to get to where they are
needed.
"The
more fat you eat, the higher the levels of lipoprotein in your
blood. And that, according to conventional wisdom, is where the
health problems begin. Lipoproteins come in two main types, high
density and low density. Low-density lipoproteins (LDLs) are often
simply known as 'bad cholesterol' despite the fact that they contain
more than just cholesterol. LDLs are bad because they can stick to
the insides of artery walls, resulting in deposits called plaques
that narrow and harden the vessels, raising the risk that a blood
clot could cause a blockage. Of all types of fat in the diet,
saturated fats have been shown to raise bad cholesterol levels the
most. (Consuming cholesterol has surprisingly little influence: the
reason it has a bad name is that it is found in animal foods that
also tend to be high in saturated fat.)
"High-density lipoproteins (HDLs), or 'good cholesterol', on the
other hand, help guard against arterial plaques. Conventional wisdom
has it that HDL is raised by eating foods rich in unsaturated fats
or soluble fibre such as whole grains, fruits and vegetables. This,
in a nutshell, is the lipid hypothesis, possibly the most
influential idea in the history of human nutrition.
"The hypothesis traces its origins back to the 1940s
when a rising tide of heart attacks among middle-aged
men was spreading alarm in the US. At the time this was
explained as a consequence of ageing. But Ancel Keys, a
physiologist at the University of Minnesota, had other
ideas. Keys noted that heart attacks were rare in some
Mediterranean countries and in Japan, where people ate a
diet lower in fat. Convinced that there was a causal
link, he launched the pioneering
Seven Countries Study in 1958. In all, he recruited
12,763 men aged 40 to 59 in the US, Finland, The
Netherlands, Italy, Yugoslavia, Greece and Japan. The
participants' diet and heart health were checked five
and 10 years after enrolling. Keys concluded that there
was a correlation between saturated fat in food, raised
levels of blood lipids and the risk of heart attacks and
strokes.
The lipid hypothesis was born.
"The finding was supported by other research, notably
the
Framingham Heart Study, which tracked diet and heart
health in a town in Massachusetts. In light of this
research and the rising toll -- by the 1980s nearly a
million Americans a year were dying from heart attacks
--
health authorities decided to officially push for a
reduction in fat, and saturated fat in particular.
Official guidelines first appeared in 1980 in the US and
1991 in the UK, and have stood firm ever since. Yet the
voices of doubt have been growing for some time. In
2010, scientists pooled the results of 21 studies that
had followed 348,000 people for many years. This
meta-analysis found 'no significant evidence' in support
of the idea that saturated fat raises the risk of heart
disease (American
Journal of Clinical Nutrition, vol 91, p.535).
"The doubters were given a further boost by another
meta-analysis published in March (Annals
of Internal Medicine, vol 160, p.398). It
revisited the results of 72 studies involving 640,000
people in 18 countries. To the surprise of many, it did
not find backing for the existing dietary advice.
'Current evidence does not clearly support guidelines
that encourage high consumption of polyunsaturated fatty
acids and low consumption of total saturated fats,' it
concluded. 'Nutritional guidelines...may require
reappraisal.'
"In essence, the study found that people at the extreme
ends of the spectrum -- that is, those who ate the most
or least saturated fat -- had the same chance of
developing heart disease. High consumption of
unsaturated fat seemed to offer no protection. The
analysis has been
strongly criticised for containing methodological
errors and omitting studies that should have been
included. But the authors stand by their general
conclusions and say the paper has already had the
intended effect of breaking the taboo around saturated
fat.
"Outside of academia, its conclusion was greeted with
gusto. Many commentators interpreted it as a green light
to resume eating saturated fat. But is it? Did Keys
really get it wrong? Or is there some other explanation
for the conflict between his work and the many studies
that supported it, and the two recent meta-analyses?
Even as Keys's research was starting to influence health
advice, critics were pointing out flaws in it. One
common complaint was that he cherry-picked data to
support his hypothesis, ignoring countries such as
France which had high-fat diets but low rates of heart
disease. The strongest evidence in favour of a low-fat
diet came from Crete, but it transpired that Keys had
recorded some food intake data there during Lent, a time
when Greek people traditionally avoid meat and cheese,
so he may have underestimated their normal fat intake.
"The Framingham research, too, has its detractors.
Critics say that it followed an unrepresentative group
of predominantly white men and women who were at high
risk for heart disease for non-dietary reasons such as
smoking. More recently, it has also become clear that
the impact of saturated fat is more complex than was
understood back then.
"Ronald Krauss of the University of California, San
Francisco, has long researched the links between
lipoprotein and heart disease. He was involved in the
2010 meta-analysis and is convinced there is room for at
least a partial rethink of the lipid hypothesis. He
points to studies suggesting that not all LDL is the
same, and that casting it all as bad was wrong. It is
now widely accepted that LDL comes in two types -- big,
fluffy particles and smaller, compact ones. It is the
latter, Krauss says, that are strongly linked to
heart-disease risk, while the fluffy ones appear a lot
less risky. Crucially, Krauss says, eating saturated fat
boosts fluffy LDL. What's more, there is some research
suggesting small LDL gets a boost from a low-fat,
high-carbohydrate diet, especially one rich in sugars.
"Why might smaller LDL particles be riskier? In their
journey around the bloodstream, LDL particles bind to
cells and are pulled out of circulation. Krauss says
smaller LDLs don't bind as easily, so remain in the
blood for longer -- and the longer they are there, the
greater their chance of causing damage. They are also
more easily converted into an oxidised form that is
considered more damaging. Finally, there are simply more
of them for the same overall cholesterol level. And more
LDLs equate to greater risk of arterial damage, Krauss
says. He thinks that the evidence is strong enough for
the health advice to change.
"But Susan Jebb, professor of diet and
population health at the University of
Oxford, says it is too early to buy into
this alternative model of LDLs and health.
'The jury has to be out because relatively
few of the studies have subdivided LDL. It
may well be worth exploring, but right now I
am not persuaded.' Jeremy Pearson, a
vascular biologist and associate medical
director at the British Heart Foundation,
which part-funded the 2014 meta-analysis,
agrees. He says the original idea that a
diet high in saturated fat raises the risk
of heart disease remains persuasive, and
that there are other meta-analyses that
support this. He also points to hard
evidence from studies in animals, where
dietary control is possible to a degree that
it is not in people. They repeatedly show
high saturated fat leads to high LDL and
hardened arteries, he says.
"So how does he explain the meta-analyses
that cast doubt on the orthodoxy? 'I guess
what that means is that in free living
humans there are other things that are
usually more important regarding whether you
have a heart attack or not than the balance
of saturated and unsaturated fat in your
diet,' Pearson says. Factors such as lack of
exercise, alcohol intake and body weight may
simply overshadow the impact of fat.
"Certainly, the debate cannot be divorced
from the issue of overall calorie intake,
which rose in the three decades from the
1970s in the US and many other countries.
The result was rising numbers of overweight
people. And being overweight or obese raises
the risk of heart disease. Another key
factor might be what people now eat instead
of saturated fat. 'The effect of reducing
saturated fat depends on what replaces it,'
says Walter Willett of the Harvard School of
Public Health. 'We consciously or
unconsciously replace a large reduction in
calories with something else.' The problem,
as some see it, is that the something else
is usually refined carbohydrates, especially
sugars, added to foods to take the place of
fat. A review in 2009 showed that if
carbohydrates were raised while saturated
fat was cut, the outcome was a raised
heart-disease risk. This plays to the
emerging idea that
sugar is the real villain.
"Then there are trans fats. Created by food
chemists to replace animal fats such as
lard, they are made by chemically modifying
vegetable oils to make them solid. Because
they are unsaturated, and so 'healthy', the
food industry piled them into products such
as cakes and spreads. But it later turned
out that trans fats cause heart disease. All
told, it is possible that the meta-analyses
simply show that the benefits of switching
away from saturated fat were cancelled out
by replacing them with sugar and trans fats.
Meanwhile, science continues to unravel some
intricacies of fat metabolism which could
also help to account for the confusing
results. One promising avenue is that not
all types of saturated fat are the same. The
2014 meta-analysis, for example, found clear
indications that different saturated fatty
acids in blood are associated with different
coronary risk. Some saturated fats appear to
lower the risk; some unsaturated ones
increase it.
"Although further big studies
are needed to confirm these
findings, lead author Rajiv
Chowdhury, an epidemiologist at
the University of Cambridge,
says this is an avenue that
might be worth exploring. There
is other evidence that not all
saturated fats are the same. A
study from 2012 found that
while eating lots of saturated
fat from meat increased the risk
of heart disease, equivalent
amounts from dairy actually
reduced it. The researchers
calculated that cutting calories
from meaty saturated fat by just
2 per cent and replacing them
with saturated fat from dairy
reduces the risk of a heart
attack or stroke by 25 per cent.
"Krauss also cites studies
showing that eating cheese does
not raise bad cholesterol as
much as eating butter, even when
both have identical levels of
saturated fat. So could future advice say that
saturated fat from dairy sources
is less risky than that from
meat, for example? Or urge us to
favour cheese over butter? It's
too early to say. Jebb is aware
that the idea that some
saturated fatty acids may be
worse than others is gaining
credence, but says it is far
from being ready to guide eating
habits.
"Nonetheless,
there is a growing
feeling that we need to
reappraise our thinking on fat.
Marion Nestle, professor of
nutrition at New York
University, says that studies of
single nutrients have a
fundamental flaw. 'People do not
eat saturated fat,' she says.
'They eat foods containing fats
and oils that are mixtures of
saturated, unsaturated and
polyunsaturated fats, and many
other nutrients that affect
health and also vary in
calories. So teasing saturated
fat out of all that is not
simple.'
"The only way to
rigorously test the
various hypotheses
would be to put some
people on one kind
of diet and others
on another for 20
years or more.
'Doable? Fundable? I
don't think so,'
says Nestle. So
where does that
leave us? Is it time
to reverse 35 years
of dietary advice
and stop worrying
about fuzzing up our
arteries? Some nutritionists
say yes. Krauss
advocates a rethink
of guidelines on
saturated fat when a
new version of the
Dietary
Guidelines for
Americans is
put together next
year. He certainly
believes that the
even stricter limit
on saturated fat
recommended by the
American Heart
Association -- that
it constitute no
more than 7 per cent
of daily calorie
intake -- should be
relaxed.
"Others, though,
strike a note of
caution. Nestle says
that the answer
depends on context.
'If calories are
balanced and diets
contain plenty of
vegetables, foods
richer in saturated
fat should not be a
problem. But that's
not how most people
eat,' she says. Jebb
and Pearson see no
reason to shift the
guidance just yet,
although Jebb says
it may be time for a
review of fat by the
UK's Scientific
Advisory Committee
on Nutrition, which
last visited the
issue in 1991.
"So while dietary
libertarians may be
gleefully slapping a
big fat steak on the
griddle and lining
up a cream pie with
hot fudge for
dessert, the dietary
advice of the 1970s
still stands -- for
now. In other words,
steak and butter can
be part of a healthy
diet. Just don't
overdo them." [New
Scientist223,
2980,
02/08/2014,
pp.32-37. Quotation
marks altered to
conform with the
conventions adopted
at this site.
Some links and bold emphases added. Several paragraphs
merged to save
space. The online
article has a
different title to
the published
version.]
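Krauss's point above -- that, for the same overall cholesterol reading, small LDL particles mean there are simply more of them in circulation -- is easy to see with some rough arithmetic. The sketch below is illustrative only: the particle diameters are round numbers invented for the example, and it assumes that each particle's cholesterol cargo scales roughly with its volume, which is a simplification rather than anything claimed in the article.

```python
# Illustrative only: why a fixed cholesterol level can mean many more particles
# when LDL particles are small. The diameters and the volume-scaling assumption
# are hypothetical round numbers, not figures from the article.

def relative_particle_count(diameter_nm: float, reference_diameter_nm: float = 26.0) -> float:
    """Particles needed to carry the same cholesterol, relative to large 'fluffy' LDL.

    Assumes each particle's cholesterol cargo scales roughly with its volume (d**3).
    """
    return (reference_diameter_nm / diameter_nm) ** 3

for d in (26.0, 22.0, 19.0):  # large, intermediate, small LDL (assumed diameters)
    print(f"{d:4.1f} nm particles: {relative_particle_count(d):.1f}x as many particles")
```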
So, in the DM-"Totality", are such
fats
harmful or not?
October 2015: Hold the press! Here is
a letter published in a recent issue of the New Scientist:
"Saturated fat is not off the hook
"From Neal Barnard, Physicians Committee For
Responsible Medicine
"You report a new
Canadian study on the risks of trans fats and
saturated fat (15
August, p.6).
This meta-analysis of 41 previous reports looked at
the data in two ways. One way revealed the dangers
of saturated fat, while the other did not. Neither
showed that saturated fat is safe. The first analysis used more or less raw data,
finding that people whose diets were heaviest in
saturated fat had a 12 per cent higher risk of
developing heart disease and a 20 per cent higher
risk of dying from it, compared with those whose
diets were lowest in saturated fat. Trans fats,
found in many snack foods, were also linked to heart
disease.
"The second looked at data adjusted for cholesterol
levels, body weight, and so on. This is
statistically risky. For example, saturated fat
increases cholesterol levels, which, in turn,
increase cardiovascular risk. Adjusting the data for
cholesterol levels as if they were an independent
variable can make the link between saturated fat and
cardiovascular risk disappear. Meta-analyses are like metal detectors. If they
find a landmine, you can be confident that it is
there. But if they don't find one, that does not
mean that it's time to go skipping through the
field. It may be that your method is simply not
sensitive enough." [Quoted from
here.
Paragraphs merged to save space.]
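Barnard's statistical point -- that adjusting for a variable which sits on the causal pathway (saturated fat raises cholesterol, and cholesterol raises risk) can make the link to the original exposure vanish -- can be illustrated with a toy simulation. The coefficients and the simple linear model below are invented purely for illustration; this is a sketch of the statistical trap he describes, not a model of any actual study.

```python
# A toy simulation of Barnard's point: 'adjusting' for a mediator (cholesterol)
# can hide the effect of the exposure (saturated fat). All coefficients are
# made up for illustration; this is not a model of any real study.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
sat_fat = rng.normal(size=n)                         # standardised saturated-fat intake
cholesterol = 0.8 * sat_fat + rng.normal(size=n)     # fat raises cholesterol
risk = 0.5 * cholesterol + rng.normal(size=n)        # cholesterol raises risk

def ols_coef(columns, y):
    """Least-squares coefficients (intercept first)."""
    X = np.column_stack([np.ones(len(y))] + list(columns))
    return np.linalg.lstsq(X, y, rcond=None)[0]

crude = ols_coef([sat_fat], risk)[1]                  # unadjusted effect of fat on risk
adjusted = ols_coef([sat_fat, cholesterol], risk)[1]  # effect after 'controlling' for cholesterol
print(f"unadjusted coefficient: {crude:.2f}")         # ~0.4: the pathway is visible
print(f"adjusted coefficient:   {adjusted:.2f}")      # ~0.0: the effect 'disappears'
```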
Until scientists finally make up their minds
(if ever!), perhaps we should consign this topic to the 'revolving
door' department of the "Totality"...
March 2014: A story heralding the 'discovery' of gravitational waves rapidly spread
across the
entire media -- in fact, it was nicely timed to coincide with
the release of Gravity on DVD. According to a BBC Horizon programme
(Aftershock, broadcast in March 2015), this 'news' "made headlines right
around the world". As a result, we witnessed countless mega-hyped reports,
rather like the following breathless and uncritical article (which included
irresponsible speculation about a Nobel Prize!) in The Guardian:
"Gravitational
waves: have US scientists heard echoes of the big bang?
"There is intense speculation
among cosmologists that a US team is on the verge of
confirming they have detected 'primordial
gravitational waves'
-- an echo of the big bang in which the universe came into
existence 14bn years ago. Rumours have been rife in the
physics community about an announcement due on Monday from
the Harvard-Smithsonian Center for Astrophysics. If there is
evidence for gravitational waves, it would be a landmark
discovery that would change the face of cosmology and
particle physics. Gravitational waves are the last untested
prediction of Albert Einstein's General Theory of
Relativity. They are minuscule ripples in the fabric of the
universe that carry energy across space, somewhat similar to
waves crossing an ocean. Convincing evidence of their
discovery would almost certainly lead to a Nobel prize.
"'If they do announce
primordial gravitational waves on Monday, I will take a huge
amount of convincing,' said
Hiranya Peiris, a cosmologist from
University College London.
'But if they do have a robust detection…Jesus, wow! I'll be
taking next week off.'
"The discovery of
gravitational waves from the big bang would offer scientists
their first glimpse of how the universe was born. The signal
is rumoured to have been found by a specialised telescope
called
Bicep (Background Imaging of Cosmic
Extragalactic Polarization)
at the south pole. It scans the sky at microwave
frequencies, where it picks up the fossil energy from the
big bang. For decades, cosmologists
have thought that the signature of primordial gravitational
waves could be imprinted on this radiation. 'It's been
called the Holy Grail of cosmology,' says Peiris, 'It would
be a real major, major, major discovery.' Martin Hendry at
the University of Glasgow works on several projects designed
to directly detect gravitational waves. 'If Bicep have made
a detection,' he says, 'it's clear that this new window on
the universe is really opening up.'
"According to theory,
the primordial gravitational waves will tell us about the
first, infinitesimal moment of the universe's history.
Cosmologists believe that 10⁻³⁴ seconds after the big bang (a decimal point followed by 33
zeros and a one) the universe was driven to expand hugely.
Known as
inflation,
the theory was dreamed up to explain why the universe is so
remarkably uniform from place to place. But it has always
lacked some credibility because no one can find a convincing
physical explanation for why it happened.
Now researchers may be
forced to redouble their efforts. 'The primordial
gravitational waves have long been thought to be the smoking
gun of inflation. It's as close to a proof of that theory as
you are going to get,' says Peiris. This is because
cosmologists believe only inflation can amplify the
primordial gravitational waves into a detectable signal.
"'If a detection has
been made, it is extraordinarily exciting. This is the real
big tick-box that we have been waiting for. It will tell us
something incredibly fundamental about what was happening
when the universe was 10⁻³⁴
seconds old,' said
Prof Andrew Jaffe, a cosmologist from
Imperial College, London,
who works on another telescope involved in the search called
Polarbear.
"But extracting that signal
is fearsomely tricky. The microwaves that carry it must
cross the whole universe before arriving at Earth. During
the journey, they are distorted by intervening clusters of
galaxies. 'It's like looking at the universe through bubbled
glass,' said Duncan Hanson of McGill University in Montreal,
Canada, who works on the South Pole Telescope, a rival that
sits next to Bicep. He said the distortion must
be removed in a convincing way before anyone can claim to
have made the detection. The prize for doing that, however,
would be the pinnacle of a scientific career. 'The Nobel
Prize would be for the detection of the primordial
gravitational waves.' 'Yeah, I would give them a
prize,' said Jaffe." [Stuart Clarke,
The
Guardian,
14/03/2014. Quotation marks altered to
conform with the conventions adopted at this site; several
paragraphs merged to save space. Some links added. Of
course, the results
were duly announced.]
Alarm bells should have been ringing when the
scientists involved announced this 'breakthrough' directly to the media, by-passing the usual peer review system. This was reminiscent of the 'discovery' of
'cold fusion' back in 1989, which was also announced directly to the media, having been
subjected only to an "abbreviated peer review system". On that, see
here. (This links to a PDF.)
So, and not unexpectedly, we were faced with this
major qualification a month
or so later:
"Cosmic inflation seen?
Don't get hopes up too quickly
"'The wolves are circling the campfire.' That's
how one cosmologist describes the reaction of some of his colleagues to last
month's headline-grabbing discoveries about the early universe. One stunning
claim was that the BICEP2 telescope at the South Pole had seen evidence that the
universe underwent a period of rapid inflation. Another was that it had detected
the imprint of long-sought gravitational waves. Those claims have since come under intense
scrutiny. Some physicists now say that the team didn't adequately exclude other
processes that could have given rise to the data (see 'Star
dust casts doubt on recent big bang wave result'
and 'Big
bang breakthrough: The dark side of inflation').
"So is what some commentators described as the 'discovery
of the century' about to be
brought crashing down? There is real room for doubt about the results.
Confirmation bias is as much a danger in the physical sciences as elsewhere: by
setting out with a clear theoretical prediction of what they should see, the
BICEP2 team may have ended up seeing exactly what they wanted to." [New
Scientist 222, 2966, 26/04/2014, p.5. Quotation marks
altered to conform with the conventions adopted at this site.
Two paragraphs merged to save space.]
By June 2014, even the researchers
involved had begun to question their own results:
"Cosmic
Inflation: Confidence Lowered For Big Bang Signal
"By Jonathan Amos Science correspondent, BBC News 20/06/2014
"Scientists
who claimed to have found a pattern in the
sky left by the super-rapid expansion of
space just fractions of a second after the
Big Bang say they are now less confident of
their result. The
BICEP2 Collaboration
used a telescope at the South Pole to detect
the signal in the oldest light it is
possible to observe. At the time of the
group's announcement in March, the discovery
was hailed as a near-certain Nobel Prize.
But the criticism since has been sharp.
"Rival groups
have picked holes in the team's methods and
analysis. On Thursday, the BICEP2
collaboration formally
published its research in a peer-reviewed
journal -- Physical Review Letters (PRL).
In the paper, the US-led group stands by its
work but accepts some big questions remain
outstanding. And addressing a public lecture
in London, one of BICEP2's principal
investigators acknowledged that
circumstances had changed. 'Has my
confidence gone down? Yes,' Prof Clem Pryke,
from the University of Minnesota, told his
audience.
"Light twist
"What the
team announced at its 17 March press
conference was the long sought evidence for
'cosmic inflation'. Developed in the 1980s,
this is the idea that the Universe
experienced an exponential growth spurt in
its first trillionth of a trillionth of a
trillionth of a second. It helps explain why
deep space looks the same on all sides of
the sky -- the contention being that a very
rapid expansion early on could have smoothed
out any unevenness. Inflation
theory makes a very specific prediction --
that it would have been accompanied by waves
of gravitational energy, and that these
ripples in the fabric of space-time would
leave an indelible mark on the oldest light
in the sky --
the famous Cosmic Microwave Background.
"The BICEP
team claimed to have detected this signal.
It is called B-mode polarisation and takes
the form of a characteristic swirl in the
directional properties of the CMB light. It
is, though, an extremely delicate pattern
and must not be confused with the same
polarisation effects that can be generated
by nearby dust in our galaxy. The critiques
that have appeared since March have largely
focused on this issue. And they intensified
significantly when some new information
describing dust polarisation in the Milky
Way was released by scientists working on
the
European Space Agency's orbiting Planck
telescope.
"Planck,
which everyone agrees is extremely powerful
at characterising dust, will release further
data before the end of the year. And, very
significantly, this will include
observations made in the same part of the
sky as BICEP2's telescope. Until then, or
until new data emerges from other sources,
the BICEP2 collaboration recognises that its
inflation detection has greater uncertainty
attached to it. '[Our]
models are not sufficiently constrained by
external public data to exclude the
possibility of dust emission bright enough
to explain the entire excess signal,' it
writes in the PRL paper.
"'Data trumps
models'
"At his
lecture at University College London, Prof
Pryke explained his team's lowered
confidence: 'Real data from Planck are
indicating that our dust models are
underestimates. So the prior knowledge on
the level of dust at these latitudes, in our
field, has gone up; and so the confidence
that there is a gravitational wave component
has gone down. Quantifying that is a very
hard thing to do. But data trumps models.'
"Prof Pryke
spoke of the pressure he and his colleagues
had been under since March. He said he never
expected there would be such interest in
their work, especially from mainstream
media. 'I'm feeling like I'm at the eye of
the storm,' he told me. 'Look, the
scientific debate has come down to this --
we need more data. With the existing data
that's out there, you can generate a lot of
farce, a lot of blog posts, a lot of
excitement and controversy, but you can't
really answer the question scientifically.
So, what you need is more data, and that's
coming from Planck and it's coming from us.' Prof Marc
Kamionkowski, from Johns Hopkins University,
commented that what we were witnessing
currently was 'science in action'.
"'If it was
not such an exciting result, you would not
be hearing so much about it,' he said in a
phone conversation last week. 'We're going
to need confirmation by independent groups.
That's the way things work in science. We
don't believe things because somebody says
they're true; we believe them because
different people make the measurements
independently and find the same results.'"
[Quoted from
here. Accessed 20/06/2014. Links in the
original. Several paragraphs merged to save
space.]
September 2014:
The BBC now reports the following (however, it is worth recalling that earlier
that year the
author of the following article, Jonathan Amos, had heralded these results
as "spectacular",
just as the rest of the media had gushed over this 'discovery'):
"Jonathan
AmosScience
correspondent,
BBC
News,
22/09/2014
"One of the
biggest scientific claims of the year has
received another set-back. In March, the US
BICEP team said it
had found a pattern on the sky left by the
rapid expansion of space just fractions of a
second after the Big Bang.
The astonishing assertion
was countered quickly by others who thought
the group may have underestimated the
confounding effects of dust in our own
galaxy. That explanation has now been
boosted by a new analysis from the
European Space Agency's (Esa) Planck
satellite. In a paper
published on the arXiv pre-print server,
Planck's researchers find that the part of
the sky being observed by the BICEP team
contained significantly more dust than it
had assumed. This new information does not
mean the original claim is now dead. Not
immediately, anyway.
"Cosmic 'ripples'
"The BICEP
and Planck groups are
currently working on a joint assessment of
the implications,
and this will probably be released towards
the end of the year. However, if the
contention is eventually shown to be
unsupportable with the available data, it
will prove to be a major disappointment,
especially after all the initial excitement
and talk of Nobel Prizes. What BICEP
(also known as BICEP2) claimed to have done
was find the long-sought evidence for 'cosmic
inflation'.
This is the idea that the Universe
experienced an exponential growth spurt in
its first trillionth of a trillionth of a
trillionth of a second. The theory was
developed to help explain why deep space
looks the same on all sides of the sky --
the notion being that a very rapid expansion
in the earliest moments could have smoothed
out any unevenness.
"Inflation
makes a very specific prediction -- that it
would have been accompanied by waves of
gravitational energy, and that these ripples
in the fabric of space-time would leave an
indelible mark on the 'oldest light' in the
sky --
the famous Cosmic Microwave Background (CMB). The BICEP
team said its telescope at the South Pole
had detected just such a signal.... It is
called B-mode polarisation and takes the
form of a characteristic swirl in the
directional properties of the CMB light. But
it is a fiendishly difficult observation to
make, in part because the extremely faint
B-mode polarisation from nearly 14 billion
years ago has to be disentangled from the
polarisation being generated today in our
Milky Way Galaxy. The main source of this
inconvenient 'noise' is spinning dust
grains. These countless particles become
trapped and aligned in the magnetic fields
that thread through our galaxy. As a
consequence, these grains also emit their
light with a directional quality, and this
is capable of swamping any primordial
background signal.
"The BICEP
team's strategy was to target the cleanest
part of the sky, over Antarctica, and it
used every piece of dust information it
could source to do the disentanglement. What
it lacked, however, was access to the dust
data being compiled by Europe's Planck
satellite, which has mapped the microwave
sky at many more frequencies than the
American team. This allows it to more easily
characterise the dust and discern its
confounding effects. The Planck
report in the arXiv paper details dust
polarisation properties across a great
swathe of sky at intermediate and high
galactic latitudes. Only a very small
portion of those fields is relevant to
BICEP, but the results are not encouraging.
There is significantly more dust in BICEP's
'southern hole' than anticipated. Indeed,
most of the American signal -- perhaps all
of it -- could have been attributed to dust.
"'It's
possible, but the error in our measurement
is quite high,' Planck scientist Dr Cécile
Renault told BBC News. 'The conclusion
really is that we need to analyse the data
together -- BICEP and Planck -- to get the
right cosmological [versus] galactic signal.
It's really too early to say.' The American
group had already
downgraded confidence
in its own result when it finally published
a paper on the inflation claim in
Physical Review Letters in June.
In the eyes of many commentators, the new
Planck analysis will shake that confidence
still further.
"'Premature'
announcement
"The Planck
researchers themselves promise to report
back on their own search for a primordial
polarisation signal very soon (probably at
the same time as the joint paper with
BICEP). Their approach is different to the
American one.
"'Planck has
wider spectral coverage, and has mapped the
entire sky; BICEP2 is more sensitive, but
works at only one frequency and covers only
a relatively small field of view,' explained
Prof Peter Coles from Sussex University, who
has been tracking the developing story
on his blog, In The Dark.
'Between them they may be able to identify
an excess source of polarisation over and
above the foreground, so it is not
impossible that a gravitational wave
component may be isolated. That will be a
tough job, however, and there's by no means
any guarantee that it will work. We will
just have to wait and see.'
"But what can
be said now, adds Prof Coles, is that
BICEP's March claim 'was premature, to say
the least'. Even if the American and
European approaches turn out to be
unsuccessful this time, these groups will
have pointed the way for future observations
that are planned with superior technology.
Planck has actually now identified parts of
the sky that have less dust than the area
probed by BICEP. 'If you look at the maps
we've produced, the more blue parts are
where you should go look for the primordial
signal,' explained Dr Renault." [Quoted from
here. Accessed 22/09/2014. Several
paragraphs merged to save space. Quotation
marks altered to conform with the conventions
adopted at this site. Links in the
original.]
Should DM-fans now issue 'gravitational
waves' with their very own temporary residents' permit granting them only
provisional membership status in the
"Totality"? Or, maybe, tuck these dubious 'waves' in a 'dialectical broom closet'
somewhere while scientists 'finally' make up their minds -- and then change them again a few
years later?
January 2015: Surprise, surprise, we
are now told that the above "spectacular" results were "mistaken":
"Cosmic inflation: New
study says BICEP claim was wrong
"By Jonathan Amos,Science
correspondent, BBC News 30/01/2015
"Scientists who claimed last year to have
found a pattern in the sky left by the
super-rapid expansion of space just
fractions of a second after the Big Bang
were mistaken. The signal had been
confounded by light emission from dust in
our own galaxy. This is the conclusion of a
new study involving the US-led BICEP2 team
itself. A paper describing the findings has
been submitted to the peer-reviewed journal
Physical Review Letters. A summary
was briefly posted on an official French
website on Friday before being pulled. A
press release was then issued later in the
day, although the paper itself is still not
in the public domain at the time of writing.
A determination that BICEP2 was mistaken in
its observations is not a major surprise.
"The team itself had already made known its
reduced confidence in the detection. But the
new paper is significant because it is
co-authored by 'rival' scientists. These are
the Planck Consortium of researchers, who
were operating a European Space Agency (Esa)
satellite that had also been seeking the
same expansion pattern. It was on the
website of one of this satellite's
instrument teams -- its High Frequency
Instrument (HFI) -- that the outcome of the
joint assessment was briefly leaked on
Friday. All the unified effort can do,
according to the Esa press release, is put
an upper limit on the likely size of the
real signal. This will be important for
those future experiments that endeavour to
make what would still be one of the great
discoveries in modern science.
"Issues of
confusion
"BICEP2 used extremely sensitive detectors
in an Antarctic telescope to study light
coming to Earth from the very edge of the
observable Universe -- the famous
Cosmic Microwave Background (CMB)
radiation. It was looking for swirls in the
polarisation of the light. This pattern in
the CMB's directional quality is a
fundamental prediction of 'inflation'
-- the idea that there was an ultra-rapid
expansion of space just fractions of a
second after the Big Bang. The twists, known
as B-modes, are an imprint of the waves of
gravitational energy that would have
accompanied this violent growth spurt almost
14 billion years ago. But the primordial
signal -- if it exists -- is expected to be
extremely delicate, and a number of
independent scientists expressed doubts
about the American team's findings as soon
as they were announced at a press conference
in March 2014.
"At issue are a couple of complications. One
is an effect where a 'false' B-mode signal
can be produced on the sky by the CMB
passing through massive objects, such as
huge galaxies. This so-called lensing effect
must be subtracted. But the second and most
significant issue is the confusing role
played by foreground dust in our galaxy.
Nearby spinning grains can produce an
identical polarisation pattern, and this
effect must also be removed to get an
unambiguous view of the primordial
background signal.
"Bright dust
"The BICEP2 team used every piece of dust
information it could source on the part of
the sky it was observing above Antarctica.
What it lacked, however, was access to the
dust data being compiled by the
Planck space telescope, which had mapped
the microwave sky at many more frequencies
than BICEP2. This allowed Planck to more
easily characterise the dust and discern its
confounding effects. The Planck Consortium
agreed to start working with BICEP2 back in
the summer. The European group incorporated
its high frequency information -- where dust
shines most brightly -- and the US team
added additional data collected by its
next-generation instrument in Antarctica
called the Keck Array.
"However, the results of the joint
assessment would suggest that whatever
signal BICEP2 detected, it cannot be
separated at any significant level from the
spoiling effects. In other words, the
original observations are equally compatible
with there being no primordial gravitational
waves. 'This joint work has shown that the
detection of primordial B-modes is no longer
robust once the emission from galactic dust
is removed,' Jean-Loup Puget, principal
investigator of Planck's HFI instrument,
said in the Esa statement. 'So,
unfortunately, we have not been able to
confirm that the signal is an imprint of
cosmic inflation.'
"Ongoing quest
"This is not the end of the matter. Other
experiments are still chasing the B-mode
signal using a variety of detector
technologies and telescopes. These groups
will have learnt from the BICEP experience
and they will all devour Planck's latest
batch of relevant data products when they
start to be published next week. Like any
field of scientific endeavour, advances are
continually being made.
"One of the ironies of this story is that
BICEP itself may now actually be best placed
to make the ultimate detection of the cosmic
inflation pattern. Although the past year,
with its quashed claim, will have been
painful, the team will have new insights.
What is more, it can still boast
world-leading scientists among its members,
and detectors that are universally
acknowledged to be top-notch." [Quoted from
here; accessed 02/02/2015. Quotation
marks altered to conform with the conventions
adopted at this site. Several paragraphs
merged to save space. Minor typo corrected.]
February 2015: This from Physics World:

"Galactic dust sounds death knell for Bicep2 gravitational wave claim

"Astronomers working on the Background Imaging of Cosmic Extragalactic Polarization (BICEP2) telescope at the South Pole have withdrawn their claim to have found the first evidence for the primordial 'B-mode' polarization of the cosmic microwave background (CMB). The claim was first made in March 2014 and this update comes after analysis of data from the Keck Array telescope at the South Pole and the most up-to-date maps showing polarized dust emission in our galaxy from the European Space Agency's Planck collaboration. It now seems clear that the signal initially claimed by BICEP2 as an imprint of the rapid 'inflation' of the early universe is in fact a foreground effect caused by dust within the Milky Way.

"Cosmologists believe that when the universe was very young -- a mere 10⁻³⁵ s after the Big Bang -- it underwent a period of extremely rapid expansion known as 'inflation', when its volume increased by a factor of up to 10⁸⁰ in a tiny fraction of a second. About 380,000 years after the Big Bang, the CMB -- the thermal remnant of the Big Bang -- came into being. BICEP2, Planck and the Keck Array all study the CMB and BICEP2's main aim was to hunt down the primordial B-mode polarization. This 'curl' of polarized CMB light is considered to be the smoking gun for inflation.
"In March last year
BICEP2 scientists
claimed success,
saying that they had
measured primordial
B-modes with a
statistical
certainty of 7σ --
well above the 5σ
'gold standard' for
a discovery in
physics. However,
doubts soon crept
in, especially about
how the team had
handled the effect
of galactic dust on
the result. Also,
the BICEP2
measurements used in
the 2014 analysis
were made at just
one frequency of
150 kHz -- for a
signal to be truly
cosmological in
nature, it must be
crosschecked at
multiple
frequencies.
"Dusty data
"When the most
recent dust maps
from Planck were
released in
September last year,
it became apparent
that the polarized
emission from dust
is much more
significant over the
entire sky than
BICEP2 had allowed
for. While the dust
signal is comparable
to the signal
detected by BICEP2
even in the cleanest
regions, this did
not completely rule
out BICEP2's
original claim. To
put the issue to
rest, the three
groups of scientists
analysed their
combined data --
adding into the mix
the latest data from
the Keck Array,
which also measures
CMB polarization.
"This analysis was
based on CMB
polarization
observations on a
400 square-degree
patch of the sky.
The Planck data
cover frequencies of
30-353 GHz, while
the BICEP2 and Keck
Array data were
taken at a frequency
of 150 GHz. 'This
joint work has shown
that the detection
of primordial
B-modes is no longer
robust once the
emission from
galactic dust is
removed,' says
Jean-Loup Puget,
principal
investigator of the
HFI instrument on
Planck at the
Institut
d'Astrophysique
Spatiale in Orsay,
France. 'So,
unfortunately, we
have not been able
to confirm that the
signal is an imprint
of cosmic
inflation.' While
they did find a
signal from
B-modes
that arose due to
gravitational
lensing from
galaxies, which had
been spotted before
in the CMB, it is
not the primordial
signal the groups
were looking for.
"Since its public
announcement in
March 2014, the
BICEP2 team has been
criticized by some
physicists for
prematurely claiming
to have found the
first 'smoking gun'
evidence for
inflation.
Neil Turok,
director of the
Perimeter Institute
of Theoretical
Physics in Canada,
who had been
an early critic of
the BICEP2 results,
now points out that
the latest joint
analysis has shied
away from making a
clear comparison of
the data against the
most basic models of
inflation. 'These
data imply that the
simplest inflation
models are now ruled
out with 95%
confidence,' he
says, explaining
that, while this is
not yet conclusive,
it may just be just
a matter of time
thanks to a host of
experiments that are
currently gathering
new and better data
on the B-modes.
Indeed, Turok
believes that within
a year we may have
the data that begins
to winnow away many
inflationary
theories. One such
basic theory --
dubbed the Φ2
theory -- predicts a
15% contribution in
the CMB fluctuations
to come from
primordial
gravitational waves,
but the data from
the joint analysis
show that the
maximum contribution
is 12%. Turok told
physicsworld.com
he would not be
surprised if this
contribution were
shown to be a mere
5% in the next year.
"Bandwagon jumping
"Peter Coles, an astrophysicist at the University of Sussex in the UK, told physicsworld.com that, with data at only one frequency, there was no way BICEP2 could have ruled out dust emission. 'I think they were right to publish their result, but should have been far more moderate in their claims and more open about the uncertainty, which we now know was huge,' he says. 'I don't think BICEP2 comes out of this very well, but neither do the many theorists who accepted it unquestioningly as a primordial signal and generated a huge PR bandwagon.'
"Coles adds that, despite what has happened, he still believes strongly in open science. 'The BICEP2 debacle has really just demonstrated how science actually works, warts and all, rather than how it tends to be presented in the media. But I do feel that it has exposed a worrying disregard for the scientific method in some very senior scientists who really should know better. It can be dangerous to want your theory to be true so much that it clouds your judgement. In the end it's the evidence that counts.'
"Caught in a loop?
"Following BICEP2's announcement last March, Subir Sarkar -- a particle theorist at the University of Oxford and the Niels Bohr Institute in Copenhagen -- claimed to have found evidence that emissions from local "radio loop" structures of dust in our galaxy could generate a previously unknown polarized signal. This new foreground -- which should be seen in the radio and microwave frequencies and is present at high galactic latitudes -- could easily mimic a B-mode polarization signal, according to Sarkar.
"Planck's data from last year convinced Sarkar and colleagues that the loop structures crossing the BICEP2 observation region may be the main cause of the polarization signal. Sarkar told physicsworld.com that he is surprised that the latest paper does not offer a physical explanation for why there should be so much dust at such high galactic latitudes. 'Unless we understand this it will be hard to model the foreground emission to the level of accuracy required to make progress in the continuing search for gravitational waves from inflation,' he says. Currently, a variety of satellite and ground-based experiments -- such as LiteBIRD, COrE, Atacama Cosmology Telescope and the recently launched SPIDER telescope -- are busy taking measurements of the CMB polarization to settle the debate on gravitational waves, and hence inflation. Indeed, the BICEP experiment itself is now taking data at two frequencies and will soon up that number to three.
"Theories run amok
"'For the past 35 years, theoretical physics has been an extravaganza of model-building,' says Turok, adding that theories have 'sort of run amok.' He alludes to the fact that data from experiments as different in scale as the Large Hadron Collider and Planck have shown that the universe 'is much simpler than we expected'. The data in the coming years will show whether or not relics of gravitational waves indeed abound in the universe -- or if inflationary theories should be consigned to the dusty corners of history.
"The paper 'A joint analysis of BICEP2/Keck Array and Planck data' by BICEP2/Keck and the Planck collaboration has been submitted to the journal Physical Review Letters. A pre-print is available here." [Quoted from here; accessed 13/03/2015. Quotation marks altered to conform with the conventions adopted at this site. Links in the original; bold emphasis added. Several paragraphs merged to save space.]
Fortunately, this means that the DM-committee
that decides on such matters (if there is one!) can now strike these 'waves' from
the "Totality's" roll call -- or, at least it can do so until some other premature and
over-hyped 'results' send the world's media into another hysterical spin.
However, the original result in February met with the same acclaim that
greeted the now debunked BICEP results above. Clearly, it is far too
early to tell if this discovery will also be 'corrected' at
some point, and then quietly abandoned.
May 2014: Materialists -- let alone scientists -- should have displayed a healthy
scepticism toward the idea that 80-90% of the matter in the universe is 'invisible'
(i.e., undetectable), despite the fact that we have seen researchers time and again
accept the 'existence' of any number of weird and wonderful objects and processes --
many of them no less invisible and undetectable (indeed, some still are) -- only to
reject them or quietly forget about them later.
"It
supposedly makes up 80 per cent of the matter in the universe, but we still have
no direct evidence that dark matter exists. Physicist
Mordehai Milgrom tells Marcus Chown that it's time to abandon the idea --
because he has a better one
"Why
is now a good time to take an alternative to dark matter seriously?
"A host of experiments searching for dark matter, including the
Large Hadron Collider, many underground experiments and several space
missions, have failed to see anything convincing. This comes on top of
increasing realisation that the leading dark matter model has its failings.
Among other things, it predicts that we should see many more dwarf galaxies
orbiting our Milky Way than we actually do. Set against this is the fact that in
the past two years, numerous observations have added support to my proposed
alternative,
modified Newtonian dynamics (MOND).
"What
does dark matter explain?
"To give one example, according to Newton's laws of motion and gravitation,
stars at ever greater distances from the centre of a spiral galaxy should orbit
the centre ever more slowly because of the rapid drop-off in gravity. But this
doesn't happen. The mainstream explanation is that every spiral galaxy,
including our Milky Way, is embedded in a 'halo' of dark matter that enhances
gravity at the galaxy's outer regions, preventing stars from simply flying off
into intergalactic space. But for that to work, you have to give each galaxy its
own arbitrary amount and distribution of dark matter.
"So
what is MOND, your alternative?
"I believe that galaxies are governed by laws that differ from Newton's.
According to MOND, the faster-than-expected motion of stars in the outer regions
of spiral galaxies could be due to either a new law of gravity that yields a
stronger-than-expected force or a new law that reduces the inertia of stars.
This departure from Newtonian dynamics occurs when the acceleration of stars
drops below a hundred billionth of a
g, which
happens at different distances from the centre in different galaxies. So, with a
single parameter, MOND can predict the rotation curves of all galaxies with no
need for dark matter.
"What
new evidence is there for MOND?
"I will mention just two recent findings. In what's known as galaxy-galaxy
lensing, light from distant galaxies is distorted as it passes by nearer
galaxies on its way to Earth. This enables us to probe the gravitational field
of millions of galaxies of all types -- not just spiral galaxies, where it is
easy to see MOND at work. Predictions made using MOND agree well with recent
observations (Physical
Review Letters, vol 111, p.041105).
"Other evidence comes from our neighbouring galaxy
Andromeda.
Because MOND assumes that there is no dark matter, it must predict the velocity
of stars orbiting in a galaxy from the distribution of visible matter alone.
Last year, with Stacy McGaugh at Case Western Reserve University in Cleveland,
Ohio, we predicted the velocities of stars in about 30 dwarf satellite galaxies
of Andromeda.
When these velocities were actually measured, our predictions proved correct
(The
Astrophysical Journal, vol 775, p.139).
The main dark matter paradigm has no such predictive power; it can only explain
after the event.
"When
did you first come up with this alternative to dark matter?
"More than 30 years ago, I began to wonder whether the gravitational dynamics
changed at a particular distance from the centre of a galaxy. That didn't appear
to be the case. I tried a few other things and, finally, in April 1981, I hit on
acceleration. The meagre data we had then could be explained if at a critical
acceleration -- a mere hundred billionth of a g -- gravity switched from a type
that weakens in line with the familiar Newtonian law to a type that falls off
more slowly, following a different law. That alternative law is MOND. At first,
I didn't tell anyone. Only after working on the idea for six more months did I
announce
it in
three papers.
By and large, they were met by silence.
"Why
didn't most people take it seriously?
"Dark matter was the hypothesis of least daring -- just add some gravitating
stuff that gives out no light. Modifying dynamics, on the other hand, meant
tampering with fundamental things such as Newton's and Einstein's theories of
gravity. That appalled people. Also, initially, the theory applied only to
nonrelativistic systems -- with constituents that move slowly compared with
light. To be taken seriously, MOND had to be made compatible with Einstein's
principles of relativity.
"So
how did you solve that problem?
"It took a while, but in 2004 Jacob Bekenstein at the Hebrew University of
Jerusalem put forth a theory known as TeVeS (tensor-vector-scalar). It built on
his earlier work with Bob Sanders at the University of Groningen in the
Netherlands. TeVeS describes gravity with three fields and made MOND compatible
with Einstein's relativity. After it was introduced, people started to take MOND
more seriously.
"Is
MOND more elegant than dark matter?
"It is certainly far more economical. For every galaxy, dark matter theorists
must fit a made-to-measure amount and distribution of dark matter. So, if we
understand 10 galaxies, we still don't understand an 11th. Dark matter explains
only after the fact. MOND predicts things ahead of time. This is key.
"What
do the dark matter theorists say to this?
"They believe the problems with dark matter will one day be solved. A single
MOND formula perfectly describes every spiral galaxy, even though the birth of
each one is chaotic, complex and unique. It is hard to see how the dark matter
model can explain this. Still, they cling to the hope that it will one day be
possible. To my mind there is no hope of that happening.
"Does
it bother you that most physicists remain dismissive of your idea?
"Fifteen years ago, I found it somewhat dismaying. Now my spirits are up. There
has been an explosion in interest. In recent years, around 700 papers dealing
with MOND have been published. It's very encouraging.
"What
killer observation could support MOND?
"Well, if dark matter is discovered, that would kill MOND. But I don't think
there is one killer observation that will clinch the idea in the minds of dark
matter advocates. Hundreds of MOND predictions have been vindicated already;
what more can one ask for?
"How
long should we keep looking before giving up on dark matter?
"The ongoing, failed attempts to find it actually benefit MOND, so I would like
to see the search continue. To my mind it is already high time to give up on
dark matter. So much time, money and effort can be saved. Human nature being
what it is, that might take 10 years or longer. I envision a gradual
disillusionment as dark matter continues not to turn up in experiments. Even
Einstein's theory of gravity was accepted only slowly. So I'm not despairing.
Far from it.
"Mordehai Milgrom is professor of physics at
the Weizmann Institute of Science in Rehovot, Israel. He proposed the theory
of modified Newtonian dynamics (MOND) as an alternative to dark matter." [New
Scientist 222, 2967, 03/05/2014. Quotation marks
altered to conform with the conventions adopted at this site; several paragraphs merged
to save space. Some links added. Emphases in the original.]
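Milgrom's central claim -- that once accelerations fall below roughly a hundred billionth of g (about 10⁻¹⁰ m/s²) the familiar Newtonian fall-off gives way to a different law, flattening galactic rotation curves without any dark matter -- can be sketched numerically. The toy model below treats a galaxy as a single point mass and uses one commonly used interpolating function, μ(x) = x/(1+x); the mass, the radii and that particular choice of function are assumptions made for illustration, not part of Milgrom's own presentation above.

```python
# A minimal sketch of the contrast Milgrom describes: Newtonian gravity around a
# point mass gives orbital speeds that keep falling with radius, whereas MOND
# flattens them off once accelerations drop below a0 (~a hundred billionth of g).
# The point-mass 'galaxy', the mass used and the 'simple' interpolating function
# mu(x) = x/(1+x) are illustrative assumptions, not Milgrom's full theory.
import math

G = 6.674e-11          # m^3 kg^-1 s^-2
A0 = 1.2e-10           # m s^-2, a commonly quoted value for the critical acceleration
M = 1e11 * 1.989e30    # kg, a round 10^11 solar masses treated as a point mass

def newtonian_speed(r_m: float) -> float:
    return math.sqrt(G * M / r_m)                  # v^2 = GM/r

def mond_speed(r_m: float) -> float:
    a_newton = G * M / r_m**2
    # Solve mu(a/a0) * a = a_newton with mu(x) = x/(1+x):
    a = 0.5 * (a_newton + math.sqrt(a_newton**2 + 4 * a_newton * A0))
    return math.sqrt(a * r_m)

KPC = 3.086e19  # metres per kiloparsec
for r_kpc in (2, 5, 10, 20, 40):
    r = r_kpc * KPC
    print(f"r = {r_kpc:2d} kpc:  Newton {newtonian_speed(r) / 1e3:6.1f} km/s,"
          f"  MOND {mond_speed(r) / 1e3:6.1f} km/s")
```

Run with these (assumed) numbers, the Newtonian speed keeps dropping with radius while the MOND speed settles towards a roughly constant value -- the flat rotation curve Milgrom refers to.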
March 2015: The BBC has just aired a
programme entitled "Dancing In The Dark -- The End Of Physics". As the search
for 'Dark matter' seems to be stalling, if not running into the
sand, the astrophysicists who appeared in the programme were
perhaps being a little too honest. Among the choice comments they
came out with were the following:
"Maybe
we're just going to have to scratch our heads and start all over
again." (Professor
John Ellis) "How does any theorist sleep at night knowing that
The Standard Model of particle physics is off by so many orders of
magnitude?" (Professor
Leslie Rosenberg) "We have no idea what 95% of the universe is.
It hardly seems we understand everything!" (Professor
David Charlton) "What is it they say about Cosmologists? They are
always wrong but never in doubt." (Professor
Juan Collar) "There are more theories than there are theoreticians." (Professor
Bob Nichol) "I'm not the hugest fan of SUSY [Supersymmetry -- RL]; it seems
slightly messy the way you just add in one extra particle for every other
particle that we know about." (Dr
Katherine Leney) "People have been looking for SUSY for decades,
and we have been building bigger and bigger machines, and its always
been just always out of reach.... It's getting to the point where
now with the LHC [Large Hadron Collider -- RL] it's going up in
energy and that's such a huge reach now, if we still don't find it
then...it starts to look like it's not the right idea." (Dr
Gavin Hesketh)
This is from the BBC's Science News page:

"Dancing in the dark: The search for the 'missing Universe'

"By Peter Leonard, BBC Horizon
"They say the hardest pieces of music to
perform are often the simplest ones. And so
it is with science -- straightforward
questions like 'What is the Universe made from?' have so far defeated the brightest
minds in physics. Until -- perhaps -- now.
Next week, the Large Hadron Collider at Cern
will be fired up again after a two-year
programme of maintenance and upgrading. When
it is, the energy with which it smashes
particles will be twice what it was during
the LHC's Higgs boson-discovering glory
days. It is
anticipated -- hoped, even -- that this
increased capability might finally reveal
the identity of 'dark matter' -- an
invisible but critical entity that makes up
about a quarter of the Universe. This is the
topic of
this week's Horizon programme on BBC Two.
"Dark matter arrived on most scientists'
radar in 1974 thanks to the observations of
American astronomer Vera Rubin, who noticed
that stars orbiting the gravity-providing
black holes at the centre of spiral galaxies
like ours did so at the same speed
regardless of their distance from the
centre. This shouldn't happen -- and doesn't
happen in apparently comparable systems like
our Solar System, where planets trapped by
the gravity of the sun orbit increasingly
slowly the further away they find
themselves. Neptune takes 165 Earth years to
plod around the Sun just once. This is what our understanding of gravity
tells us should happen. Vera's stars racing
around at the same speed were a surprise:
there had to be more stuff there --
providing more gravity -- than we could see.
Dark matter.
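[As a quick, illustrative check on the contrast being drawn here: around a single dominant mass, orbital speed falls off as the inverse square root of distance, which is why Neptune ambles along at a few kilometres per second and takes the 165 Earth years mentioned above to complete one orbit. The sketch below uses standard textbook values and the simple circular-orbit approximation -- RL.]

```python
# A quick check of the Keplerian fall-off the article contrasts with Rubin's
# galaxies: around a single dominant mass, orbital speed drops as 1/sqrt(r),
# and Neptune's year comes out at roughly the 165 Earth years quoted above.
# Standard textbook values; the two-body circular-orbit approximation is assumed.
import math

G = 6.674e-11        # m^3 kg^-1 s^-2
M_SUN = 1.989e30     # kg
AU = 1.496e11        # metres
YEAR = 3.156e7       # seconds

for name, r_au in (("Earth", 1.0), ("Jupiter", 5.2), ("Neptune", 30.1)):
    r = r_au * AU
    v = math.sqrt(G * M_SUN / r)            # circular orbital speed
    period = 2 * math.pi * r / v / YEAR     # orbital period in Earth years
    print(f"{name:8s} r = {r_au:5.1f} AU  v = {v / 1e3:5.1f} km/s  period = {period:6.1f} yr")
```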
"Dark
matter,
then, is
a
generic
term for
the
stuff
(matter)
that
must be
there
but
which we
can't
see
(dark).
But as
to what
this
dark
matter
might
actually
be, so far
science
has
drawn a
blank.
That's
not to
say that
there's
been no
progress
at all.
It's now
thought
that
dark
matter
isn't
just
ordinary
stuff in
the form
of gas
and dust
and dead
stars
that are
dark
simply
because
they
don't
shine.
It's now
generally
agreed
that the
dark
matter
is a
miasma
of (as
yet
unidentified)
fundamental
particles
like
(but
not) the
quarks
and
gluons,
and so
on, that
make up
the
atoms
with
which
we're
more
familiar.
"These
'dark'
fundamental
particles
are
known as
Wimps:
Weakly
Interacting
Massive
Particles.
This
acronym,
like the
term
'dark
matter'
itself,
is a
description
of how
these
theoretical
dark
matter
creatures
behave,
rather
than a
definition
of what
they
are: The
'weakly
interacting'
bit
means
that
they
don't
have
much to
do with
ordinary
matter.
They fly
straight
though
it. This
makes
them
very
tricky
to
detect,
given
that
ordinary
matter
is all
we have
to
detect
them
with.
The
'massive'
part
means
simply
that
they
have
mass. It
has
nothing
to do
with
their
size. All
that's
left is
'particle',
which
means,
for want
of a
better
description,
that
it's a
thing.
[That
certainly clears
things
up! --
RL.]
"So dark
matter
is some
form of
fundamental
particle
that has
Wimp
characteristics.
In
theory,
these
Wimps
could be
a huge
range of
different
things,
but work
done by
Prof
Carlos
Frenk of
Durham
University
has
narrowed
the
search
somewhat.
It was
Frenk
and his
colleagues
who, at
the
start of
their
scientific
careers
in the
1980s,
announced
that
dark
matter
had to
be of
the Wimp
type
and,
additionally,
it had
to be 'cold'.
At the
time, it
was a
controversial
claim,
but in
the
years
since,
Frenk
has
added
computerised
flesh to
the
bones of
the
theory
-- by
making
universes.
"'It's
quite a
simple
process,'
says
Frenk.
'All you
need is
gravity
and a
few
basic
assumptions.'
Key
among
these
basic
assumptions
is
Frenk's
claim
that
dark
matter
is of
the Wimp
variety,
and
cold.
The
universes
that
emerge
from his
computer
are
indistinguishable
from our
own,
providing
a lot of
support
to the
idea of
cold
dark
matter.
And
because
dark
matter
is part
of the
simulation,
it can
be made
visible.
The
un-seeable
revealed.
'You can
almost
touch
it!'
enthuses
Frenk. But so
far,
'almost'
is the
issue.
The fact
is that
you
can't
touch it
-- which
is why
tracking
it down
'in the
wild'
has, to
date,
ended in
disappointment.
And yet
it must
be
there,
and it
must be
a
fundamental
particle
-- which
is where
Cern's
Large
Hadron
Collider
comes
in.
"What
happens
in the
LHC is
that
protons
are
fired
around
its
27km-long
tube in
opposite
directions.
Once
they've
been
accelerated
to
almost
the
speed of
light,
they're
collided
--
smashed
together.
This
does two
things.
Firstly,
it makes
the
protons
disintegrate,
revealing
the
quarks,
gluons
and
gauge
bosons
and the
other
fundamental
particles
of
atomic
matter.
There
are 17
particles
in the
standard
model of
particle
physics
-- and
all of
them
have
been
seen at
the LHC.
"Secondly,
the
collisions
might
produce
other,
heavier
particles.
When
they do,
the
LHC's
detectors
will
record
them. In
charge
of one
of those
detectors
is Prof
Dave
Charlton
from the
University
of
Birmingham.
'Sometimes
you
produce
much
more
massive
particles.
These
are the
guys
we're
looking
for.' Dave --
and
everyone
else at
Cern --
is
looking
for them
because
they
could be
the
particles
that
could be
the dark
matter.
It
all
sounds
highly
unlikely
-- the
idea
that
ordinary
matter
produces
matter
you
can't
see or
detect
with the
matter
that
made it
-- but
it makes
sense in
terms of
the
uncontroversial
concept
of the
Big
Bang.
"If dark matter exists, it would have been produced at the Big Bang like everything else. And to see what actually was produced at the big bang, you need to create the conditions of the Big Bang -- and the only place you're likely to get anywhere near those conditions is at the point of collision in the LHC. The faster the collision, the closer you get to the Big Bang temperature. So there's every reason to think that dark matter might well be produced in particle accelerators like the LHC.
"What's more, there's a mathematical theory that predicts that the 17 constituents of the standard model are matched by 17 more particles. This is based on a principle called 'super symmetry'. Prof John Ellis, a theoretical physicist from Kings College, London, who also works at Cern, is a fan of super symmetry. He's hopeful that some of these as yet theoretical super symmetrical particles will show up soon.
"'We were kind of hoping that they'd show up in the first run of the LHC. But they didn't,' he confesses, ruefully. Ellis explains that what that means is that the super symmetric particles must be heavier than they thought, and they will only appear at higher energies than have been available -- until now. In the LHC's second run, its collisions will occur with twice as much energy, giving Prof Ellis hope that the super symmetric particles might finally appear. 'When we increase the energy of the LHC, we'll be able to look further -- produce heavier super symmetric particles, if they exist. Let's see what happens!'
"It's crunch time for super symmetry. If it shows itself in the LHC, then all will be well. The dark matter problem would finally be solved, along with some other anomalies in the standard model of physics. But if, like last time, super symmetry fails to turn up, physicists and astrophysicists will have to come up with some other ideas for what our Universe is made from.
"'It might be," concedes Prof Ellis, 'that we'll have to scratch our heads and start again.'
"Dancing in the Dark -- The End of Physics? is broadcast on BBC Two on Tuesday 17 March at 21:00." [Quoted from here; accessed 19/03/2015. Quotation marks altered to conform with the conventions adopted at this site. Paragraphs merged to save space; bold emphases and one link added.]
Doubts are also beginning to be raised about 'Dark Energy':
Under construction...
I'll refrain from making the same points or
asking the obvious
questions -- again -- but by now I'm sure readers will know what they are...
"If imitation is the sincerest form of flattery,
the periodic table has many true admirers.
Typefaces,
types of meat and even
the Muppets have been ordered in its image. For
chemists, knowing an element's position in the periodic table, and the company
it keeps, is still the most reliable indicator of its properties -- and a
precious guide in the search for new substances. 'It rivals Darwin's Origin
of Species in terms of the impact of bringing order out of chaos,' says
Peter Edwards of the University of
Oxford. The origins of the periodic table lie in the 19th century, when chemists
noticed that patterns began to emerge among the known chemical elements when
they were organised by increasing atomic weight. In the 1860s, Dmitri Mendeleev
and others began to group the elements in rows and columns to reflect those
patterns -- and realised gaps in the resulting grids allowed them to predict the
existence of elements then unknown.
"It was only with the advent of
quantum theory in the 20th century that we began to
grasp what lies behind these patterns. The periodic table's rows and blocks
roughly correspond to how an atom's electrons are arranged, in a series of
'shells'
around the proton-rich nucleus. Electrons fill shells
and subdivisions of shells starting with the one closest to the nucleus, which
has the lowest energy. The number of electrons in the outermost shell, and its
distance from the nucleus and the other shells, are the main factors that
determine an element's chemical behaviour. 'Chemical periodicity is a natural
property,' says
Eric Scerri, a philosopher of
chemistry at the University of California, Los Angeles.
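[The 'filling' described above follows, to a first approximation, the Madelung or (n + l) rule: subshells are occupied in order of increasing n + l, with ties broken by smaller n. The following is only a toy Python sketch of that rule -- it ignores the well-known exceptions such as chromium, copper and gold, so treat it as a cartoon of the principle rather than real chemistry -- RL:]

    # Toy illustration of the Madelung (n + l) filling rule -- not a chemistry package.
    CAPACITY = {0: 2, 1: 6, 2: 10, 3: 14}     # electrons held by s, p, d, f subshells
    LETTER = {0: "s", 1: "p", 2: "d", 3: "f"}

    def electron_configuration(atomic_number):
        """Fill subshells in order of (n + l), breaking ties with smaller n."""
        subshells = [(n, l) for n in range(1, 8) for l in range(min(n, 4))]
        subshells.sort(key=lambda nl: (nl[0] + nl[1], nl[0]))
        config, remaining = [], atomic_number
        for n, l in subshells:
            if remaining <= 0:
                break
            filled = min(CAPACITY[l], remaining)
            config.append(f"{n}{LETTER[l]}{filled}")
            remaining -= filled
        return " ".join(config)

    print(electron_configuration(11))   # sodium: 1s2 2s2 2p6 3s1 -- one outer electron, hence group 1
    print(electron_configuration(79))   # gold: the idealised order; real gold is an exception (5d10 6s1)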
"But that perhaps leads us to some hasty
conclusions about the table. 'People assume surely it's been sorted out. It's
not settled -- many, many aspects are still up for grabs,' says Scerri. Electron
configurations do not always mesh neatly with chemical properties. Properties
and patterns we take as given on Earth are very different when we venture into
the extreme environment of space. And quite what happens towards the end of the
periodic table -- and indeed, where this end lies -- are questions that remain
unanswered. As the following examples show, the periodic table is still very
much a work in progress…
"Weighty affair
"The earliest periodic tables arranged elements
by ascending atomic weight -- basically, the number of protons and neutrons in
an atom's nucleus. But most atoms come in various isotopes containing different
numbers of neutrons. Today's tables order the elements by atomic number -- the
unambiguous number of protons. The atomic weights are still there -- but the
question is, which is the 'correct' one? They used to be displayed as a single
number for each element, based on averaging the weights of its natural isotopes
according to their relative abundances. But this perpetuates the misconception
that this number is some kind of fundamental constant, says
Tyler Coplen of the Reston Stable
Isotope Laboratory in Virginia. In reality, the atomic weight of an element such
as carbon, say, varies slightly from sample to sample depending on the exact
quantities of each isotope.
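[By way of illustration (figures approximate, quoted from memory): carbon's standard atomic weight is just the abundance-weighted average of its stable isotopes, roughly 0.989 × 12.000 + 0.011 × 13.003 ≈ 12.011 -- and because those abundances vary slightly from sample to sample, so does the supposed 'constant' -- RL.]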
"Ordering the periodic table by atomic number
makes the position of elements indisputable -- except when it doesn't. Take the
case of the two rows of elements floating rather like an afterthought below the
main body of the table: the
lanthanides and
actinides. Two gaps in the main
table, below
scandium and
yttrium in group 3, mark where these series slot in.
The question is, how do they slot in? There are two schools of thought. One goes
by electron configurations: scandium and yttrium both have three outer
electrons, as do
lanthanum
and
actinium, the elements at the left-hand end of
the series, so they are the rightful placeholders. But others point out that
chemical properties such as atomic radius and melting point make
lutetium and
lawrencium at the right end of the rows a better fit. In 2008,
simmering tensions between the two sides boiled over
in the pages of the Journal of Chemical Education.
"Resolving
the dispute matters, says Scerri, and
not just for pedagogical clarity. Yttrium can be used to make superconductors,
compounds that conduct electricity without resistance, that work at relatively
high temperatures. The hunt is on for materials with similar abilities and
Scerri thinks lutetium and lawrencium compounds may have been overlooked because
they are seen as belonging to a completely unrelated group. Any resolution will
be years away. IUPAC has given Scerri the go-ahead to set up a committee -- but
only to make the case for why a decision might be needed.
"Half empty -- or half full?
"In the early universe, the two simplest
elements, hydrogen and helium, were pretty much all there was. But the advent of
more complex stuff has made it difficult to work out where they fit in. 'It's a
bit like asking how would you classify dinosaurs along with other animals,' says
Scerri.
"Hydrogen has one proton surrounded by one
electron rattling around in a shell that might hold two. But is that shell
half-empty or half-full? Most elements tend to either gain or lose electrons
during chemical reactions. Hydrogen swings both ways, sometimes picking up an
electron to fill its shell and form compounds like sodium hydride (NaH), and
sometimes losing its one electron to form compounds like hydrogen fluoride (HF).
Most periodic tables,
including IUPAC's, put hydrogen in group 1 with
electron-losing metallic elements such as lithium and sodium. But even IUPAC
allows that
hydrogen might sit equally well with electron-scarfing halogens
such as fluorine right over in group 17. Most chemists shrug at this ambiguity.
'It doesn't bother me to include hydrogen both in group 1 and in group 17,' says
Pekka Pyykkö of the University of
Helsinki in Finland.
"The problem with helium, meanwhile, is that it
hardly reacts at all, thanks to its full outer electron shell. In a standard
periodic table it sits atop neon with the
noble gases that share this
characteristic, in group 18. But the fact its outer shell contains only two
electrons makes some suspect it would be better off with elements such as
beryllium, over in group 2. That suspicion is increased by calculations
indicating that both helium and neon might under certain circumstances react
with other elements, but that helium is the more likely to. This goes against
the trend of increasing reactivity as you go down that group -- a wrinkle that
could also be smoothed,
some suggest, by spiriting helium away
to group 2.
"When is a metal not a metal?
"It was a trend that made the pioneers of the
periodic table confident they were on to something: if you sweep diagonally
across the periodic table from bottom left to top right, the elements gradually
become less metallic. Commonly, the boundary between the two is depicted as a
thick line staircasing its way down the table's right side. Sadly, it isn't that
simple. 'Metal-non-metal status isn't sacrosanct,' says Edwards. Take hydrogen.
We Earthlings encounter it as a distinctly non-metallic transparent gas. But in
the cores of hydrogen-rich planets such as Jupiter and Saturn, high pressures
and temperatures are thought to make hydrogen a shiny, metallic fluid. Its usual
position in the periodic table, above metallic
lithium, hints at this. But
avowed non-metals such as
helium or
oxygen are also expected to loosen up under pressure,
so their outermost electrons roam free to conduct as they see fit. 'The periodic
table as you learned it is only the periodic table at ambient conditions,' says
Friedrich Hensel of the University of
Marburg in Germany.
"Hydrogen under pressure might even make a solid
metal, a substance with possibly exciting applications as a fuel or
room-temperature superconductor -- although recent claims to have made it in the
lab
are disputed. It goes the other way, too. In 2009, a
team led by
Artem Oganov, now at the State University of New York
at Stony Brook, used high pressures to turn the shiny group 1 metal sodium into
a
translucent, reddish-orange non-metal.
In this case, it seems the pressure brings electrons so close that they are
forced to occupy spots that minimise subsequent repulsions, instead of roaming
free. Such questionable behaviours highlight that there is little settled in the
chemical world, but we shouldn't panic, says Oganov. 'I don't think we need to
revise the periodic table. What we are doing now is making very important
comments and corrections to it.'
"Einstein's influence
"Einstein's relativity bends space, time, minds
-- and the periodic table. By the time you reach gold, with an atomic number 79,
the pull of the highly charged nucleus is such that the innermost electrons
whizz round at a zippy 80 per cent of the speed of light. This increases their
mass, causing them to orbit the nucleus more closely and shield electrons
further out from its pull. The outer shells then expand -- and the neat
connection between how electrons fill up shells and an element's chemical
properties begins to break down. The knock-on effects on the wavelengths of
light that gold absorbs are why it looks so very different from the precious
metal directly above it in the periodic table. 'You need relativity to actually
make gold different from silver,' says Pyykkö. That's not the only thing. Just
last year,
Peter Schwerdtfeger at Massey University in Auckland,
New Zealand, finally proved something suspected for decades: that mercury's
anomalously low melting point, causing it uniquely among metals to be a liquid
at room temperature,
is also down to relativistic effects.
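[The 'increase in mass' here is just the Lorentz factor. At v = 0.8c, γ = 1/√(1 − v²/c²) = 1/√(1 − 0.64) ≈ 1.67, so gold's innermost electrons behave as if they were roughly two-thirds heavier than at rest, and their orbitals contract accordingly (orbital radius scales inversely with the electron's mass). A rough check of the article's claim, not a figure taken from it -- RL.]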
"As ever heavier elements are added to the
periodic table, where does this leave things? We're not quite sure. When the
properties of
rutherfordium (atomic number 104) and
dubnium (105) were found to
be out of keeping with
hafnium and
tantalum immediately above them, questions
were asked. But
seaborgium
(106) seems stolidly conventional. Element 107 has
been dubbed 'boring
bohrium' for the way it toes the group line.
Experiments with two of the table's recent additions,
copernicium
(112) and
flerovium
(114), have so far painted a mixed picture. So has Einstein deprived
the periodic table of its predictive power? Pyykkö is relaxed about the
ructions. 'You don't have a simple mathematical theory underlying the periodic
table,' he says. 'You have a number of nuts and bolts -- one is relativity. When
put together, they explain the workings of the periodic table.'
Matthias Schädel, who studies
superheavy elements at the GSI Helmholtz Centre for Heavy Ion Research in
Darmstadt, Germany, is less sanguine. 'The periodic table is still intact -- but
you can't predict detailed properties any more,' he says.
"Where will it all end?
"Early atomic models indicated that, above an
atomic number of 103, the repulsive force between positively charged protons
would become so great that atoms would just fall apart. Nature generally gives
up at 92, with
uranium. But thanks to experiments to fabricate elements beyond
uranium in the lab, the heaviest atom now officially recognised is
livermorium
(116). Elements 117 and 118, already fabricated, are yet to be legitimised.
Clearly, where the end of the table lies isn't that simple. 'There certainly is
a limit,' says Schädel, 'but we don't know where it is.' The existence of
superheavy elements with atomic numbers above 103 is now explained by theories
that say that, just as orbiting electrons are arranged in shells, so too are
protons and neutrons within the nucleus. In and around specific 'magic' numbers
corresponding to full shells, atoms are more stable. But there must still come a
point where the fields within highly charged, weighty atoms will become
irresistible. One suggestion, from theorist Paul Indelicato of the Pierre and
Marie Curie University in Paris, France, and his colleagues in 2011, is that
quantum instabilities within these fields
will finally do for elements above number 172.
"Stability is only ever relative with the
superheavy elements, anyway: those made so far are radioactive and often only
survive for fractions of a second before decaying, making them difficult to
study. What's more, most of them can only be produced one atom at a time,
meaning chemical properties such as volatility, conductivity, or whether
something is a solid, liquid or gas, cease to make much sense. So do they really
count as elements at all? 'As a chemist, you want to see chemical interactions,'
says Schädel. Several research teams have devised ingenious ways to study the
properties of single atoms. One experiment has probed the volatility of
copernicium and flerovium by comparing what temperature a single atom sticks to
a gold surface with the sticking point of atoms whose volatility is known.
Theory also indicates a higher sticking temperature could be evidence for a
metallic character. 'Chemists and physicists have defined metal for generations
but they never had to think about what happens with one atom,' says Schädel.
'This is a completely new way of defining a metal.'
"Even so, with no realistic prospect of
exploiting these superheavy atoms, isn't this a fool's errand? Schädel thinks
not, and sees such experiments in the 150-year tradition that have made the
periodic table the iconic chart of the elements it is. 'It's experiencing terra
incognita, going to regions that no one has glimpsed yet,' he says." [New
Scientist 223, 2977, 12/07/2014, pp.38-41.
Accessed 05/09/2014. Links in the original, several paragraphs merged to save
space. Quotation marks altered to conform with the conventions adopted at this
site.]
Figure One: The
Periodic Table
September 2015: And then there is this from the
New Scientist:
"Modern-day
alchemy is putting the periodic table under pressure
"It is chemistry's poster child. From
copper's conductivity to mercury's mercurial
liquidity, the periodic table assigns the
chemical elements to neat columns and rows
and so reveals their properties. It is
chemists' first reference point in all their
endeavours, whether building better
catalytic converters, making concrete set
faster or looking for the best materials for
medical implants. Yet this iconic picture of science is
hopelessly parochial. Most of the known
matter in the universe doesn't exist under
the cool, calm conditions of Earth's surface
that the periodic table assumes. By mass,
more than 99.9 per cent of normal matter
resides within planets and stars --
environments of high temperatures, but above
all unimaginable pressures.
"Here, the elements' familiar identities
start to blur. 'We essentially have a new
periodic table at high pressures,' says
materials scientist Paul Loubeyre of the
Alternative Energies and Atomic Energy
Commission (CEA) in Bruyères-le-Châtel,
France. As yet, this high-pressure realm is
one we know little about -- proportionally
as if, in thousands of years of Earth
exploration, geographers had mapped out a
region no larger than Spain.
"Slowly, however, modern-day alchemists are
ratcheting up the pressure. As they do, they
are transforming the familiar physical and
chemical properties of elements from
hydrogen to iron, turning liquids to solids,
non-metals to metals and more besides. The
aim is not just to understand more about the
deep chemistry of our planet and others, but
also to find materials that react more
efficiently, store energy more effectively,
or even perform that most yearningly
sought-after of tricks: conducting
electricity without resistance at room
temperature.
"Electron squash
"It was the Russian chemist Dmitri
Mendeleev who, back in 1869,
produced the first recognisable
periodic table. He showed that
patterns begin to emerge in the
properties of the chemical elements
if you order them by their atomic
weight, and used those patterns to
predict the existence of
undiscovered elements. But it wasn't
until the advent of quantum
mechanics in the 20th century that
those patterns were explained.
Electrons circling an atom's nucleus
can only occupy discrete 'orbitals',
each of which accepts a strict
number of electrons. The
distribution of electrons within
these orbitals -- especially the
outermost ones -- determines an
element's chemical character.
"There are still niggles about the
periodic table's validity under
normal conditions (New
Scientist,
12 July 2014, p.38),
but turning up the pressure changes
things entirely. Atoms get squished,
deforming the 'unit cells' that
define matter's basic scale.
Electrons squeeze into tighter
orbitals, overlapping and forming
more exotic configurations, and
begin making chemical bonds with
electrons in other atoms in entirely
different ways. Carbon provides a familiar example
of the changes that can result. Coal
is carbon formed from plant debris,
compacted and heated for millions of
years a few kilometres below ground.
But go 100 kilometres or so down,
and the high temperature, along with
pressures 30,000 to 50,000 times
those at Earth's surface, transform
carbon's bonding to make an
apparently different substance:
diamond.
"Occasionally, geological processes
fortuitously bring diamonds closer
to the surface, where they can be
mined. Since the 1950s, however,
researchers have been cutting out
the billion-year geological
middlemen. Large hydraulic presses
weighing tens of tonnes now compress
carbon to yield synthetic diamonds
used in coatings, cutting tools and
even jewellery.
But
even these pressures are tiny
compared with those deep in Earth's
interior.... For a more complete
picture of the planet, we'd love to
track what goes on in this
high-pressure realm. 'Stuff is
moving and reacting, causing fluxes
in different elements such as
carbon, and this has relevance for
climate change over geological
history and theories about the
origin of life,' says geochemist
Catherine McCammon
of Bayreuth University in Germany.
"Diamond to the core
"With
no direct access to such depths --
our deepest drilled hole, the
Kola Superdeep
Borehole,
penetrates just 12 kilometres
beneath the north-west tip of Russia
-- simulating the pressures found
there requires some cunning ruses.
As anyone who has stepped on a
polished wooden floor wearing
high-heeled shoes knows, force
concentrated into a small area
produces very high pressures. A
device known as an anvil cell takes
advantage of this fact, producing
the same effect as a monster
hydraulic press by crushing tiny
samples between two high-heel tips
made of extremely hard materials.
Tungsten carbide is one such
material, capable of delivering
pressures equivalent to those in
Earth's upper mantle. Diamond tips
go further, to pressures found in
Earth's core. They are also
transparent, allowing researchers to
observe in real time what happens as
materials are crushed.
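[The 'high-heel' point is simply that pressure = force ÷ area. Two purely illustrative numbers of my own: a 60 kg person (about 600 N) standing on a 1 cm² stiletto tip exerts roughly 600 N ÷ 0.0001 m² = 6 MPa, about sixty atmospheres; the same sort of modest laboratory force -- say 1,000 N -- applied to a diamond culet only 100 micrometres across (an area of about 8 × 10⁻⁹ m²) gives on the order of 10¹¹ Pa, i.e. over a hundred gigapascals -- RL.]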
"Even
so, to really see what is going on
at the atomic level requires special
illumination: the intense beams of
X-ray light produced at synchrotron
accelerators such as the European
Synchrotron Radiation Facility
(ESRF) in Grenoble, France. McCammon
and her colleagues recently placed
their diamond anvil cell in the
ESRF's beams to address the puzzle
of Earth's core consistency. The
core generates Earth's magnetic
field, and is mostly made of iron.
But the way seismic waves pass
through it suggests the core has a
consistency rather like tyre rubber
-- which is not what pure iron looks
like under pressure. When mixed with
carbon graphite powder and
compressed, however, the researchers
found iron forms an entirely
different crystalline phase that
mimics this plasticity, suggesting
Earth has a vast, previously unknown
carbon reservoir (Nature
Geoscience,
vol 8, p.220).
Similar experiments have also
recently yielded new estimates for
iron's melting temperature under
pressure, leading to the conclusion
that the temperature of Earth's
inner core is 6000°C, some 1000
degrees hotter than previously
thought.
"Iron's large number of orbiting
electrons make it particularly
difficult to model under pressure.
Undaunted, researchers such as
Loubeyre are moving on to even more
complex elements, such as the
rare earths.
These materials are essential for
technology from
screens for
smartphones
to
magnets for
wind turbines,
and the hope is that exploring
high-pressure forms will allow us to
better exploit their properties. This same motivation holds for
simpler elements, too -- and it's
here that the surprises have been
greatest. Exert a pressure akin to
those found in Earth's outer core,
and sodium -- a soft, highly
reactive metal at Earth's surface --
becomes transparent and something
between a semiconductor and an
insulator. Oxygen, meanwhile, turns
into a solid metal at similar
pressures, and if then cooled
becomes a superconductor -- in other
words, it conducts electricity with
no resistance.
"The dramatic nature of these
changes has caught many on the hop.
'There have been simple rules
guiding this field, but it turns out
these things were wrong,' says
Loubeyre. 'We are seeing baffling
changes in unit cell volumes and
remarkable electrical properties
that are highly promising for
applications.' Perhaps the most promising is the
simplest atom of all: hydrogen. This
is the universe's most abundant
element, consisting of just a single
electron orbiting a single proton.
It is a bit of a misfit in the
periodic table, generally sitting
uneasily above lithium and sodium at
the top of
metallic group
1.
"Calculations since the 1930s have
indicated that high pressures would
indeed turn hydrogen into a metal,
squeezing the electron out of each
proton's orbit and leaving them free
to conduct. Not only that, but
calculations suggested that the
electrons' skittish interactions
with the lattice of proton cores
might at some point allow them to
team up and pass through the solid
hydrogen unimpaired -- hydrogen
could be made a superconductor. That
might even occur at room temperature
and potentially persist even when
the pressure is taken off, just as
diamonds retain their structure at
lower pressures.
"This would be a trillion-dollar
transition. The highest temperature
so far seen for a superconductor, a
copper-based compound called a
cuprate, is -140°C at ambient
pressure, and -110°C at high
pressures. A room-temperature
superconductor could allow the
transmission of energy over distance
without loss at a much lower price,
transforming the electricity grid
and consigning the internal
combustion engine to history. No one is sure at what pressure hydrogen's
metallic and superconducting transitions
might occur, although it seems likely that a
metallic, although not superconducting,
state makes up the core of Jupiter and other
gas-giant planets. That's tough to
replicate: the pressure at Jupiter's centre
is 4000 gigapascals (GPa), more than 10
times that in Earth's inner core and 40
million times greater than at Earth's
surface.
"In 2011
Mikhail Eremets
and Ivan Troyan of the Max Planck Institute
for Chemistry in Mainz, Germany, reported a
metallic transition in hydrogen compressed
to just 260 GPa in a diamond anvil cell (Nature
Materials,
vol 10, p.927).
The claim was met with scepticism, and later
analysis suggested they had probably found
something subtly different: a poorly
conducting semi-metal state (Physical
Review B,
vol 88, p.045125).
"Still, just
last month Eremets and Troyan pulled off
something close to the desired trick with
hydrogen atoms in a slightly more complex
chemical form. They compressed hydrogen
sulphide -- that smelliest of gases, which
gives off a strong whiff of rotten eggs --
to just 90 GPa and transformed it to a solid
metal. Taken to 150 GPa, it seems to become
a superconductor when cooled to just -70°C,
the
highest temperature
ever recorded
for a superconductor (Nature,
DOI
10.1038/nature14964).
This superconductor also seems to work by a
conventional mechanism that has been
understood for 50 years, in contrast to the
still-mysterious workings of the cuprates.
"Pure hydrogen still remains the ultimate
goal -- after all, says Patrick Bruno of the
ESRF, the discovery of metallic hydrogen
would probably be worth a Nobel prize.
Besides a potential room-temperature
superconductor, metallic hydrogen might form
the basis of a revolutionary fuel
technology: the molecular bonds of pure
hydrogen are broken at high pressure,
leaving a system that can store a huge
amount of chemical energy. There are still
high-pressure depths to plumb to get there.
'Recent work on sophisticated simulations
cast doubts that there is any metallic state
below 400 GPa -- the current edge for
experimental studies,' says Eremets.
Loubeyre and his colleagues are working with
a two-stage anvil -- essentially a diamond
crushing a diamond crushing something else
-- that lets them achieve pressures of 600
GPa, and he is confident of results soon.
'Metallic hydrogen will probably be
discovered in the next two to three years,'
he says.
"Meanwhile
new pressure records continue to be set --
most recently osmium, Earth's least
compressible metal,
crushed at 770 GPa.
As we progress we are beginning to find out
just how much the 99.9 per cent of chemistry
we don't see differs from the 0.1 per cent
we do. Our techniques currently limit us to
observing the properties of small samples of
high-pressure materials, but the hope is
that better synchrotrons, for example, will
allow us to observe and perhaps even mimic
the chemical reactions between different
compounds that must go on deep beneath our
feet.
"That truly
would be an eye-opener. Chemists already
know and routinely exploit the
transformations in chemical reactivity that
even relatively small pressure increases
bring. Molecular nitrogen in the atmosphere,
for example, is a thankfully unreactive gas.
But up the pressure 200-fold, and it readily
reacts to make the fertiliser ammonia -- the
basis of the
Haber-Bosch
reaction
that, by transforming agricultural
productivity, has fed many a hungry mouth
since the early 20th century. With such
successes in mind, the pressure's on to
discover more." [Quoted from
here;
accessed 15/09/2015. Quotation marks altered
to conform with the conventions adopted at
this site. Some paragraphs merged to save
space. Two links added.]
"By Toby MacdonaldProducer and director,
BBC
Horizon
"Rogue alien planets are
forcing astronomers to rethink the birth of our Solar System. What's emerging is
a tale of hellfire, chaos and planetary pinball -- and it's a miracle our Earth
survived. Hunting for alien planets is big business. Since the first exoplanet
was discovered in 1995, astronomers around the world have been searching for
those elusive Earth-like planets that could harbour life. The tally of planets
found is staggering. So far, more than 1,800 confirmed planets have been
discovered orbiting around other stars in our galaxy (the latest figure from
Nasa is 1,816 planets around 1,117 stars). Among them are a few rocky planets
in the magical 'Goldilocks' zone where water can remain in the liquid state, and
life could evolve.
"It's no surprise to astronomers that these planets exist.
But what has come as a shock is that many of these exoplanets seem to break all
the rules. Dr Christopher Watson, from Queen's University, Belfast, is at the
forefront of research into these bizarre new worlds. 'They're very strange,' he
told the
BBC's Horizon
programme. 'These are nothing like our Solar System,
and in some cases, I think really science fact can be a lot weirder than science
fiction.'
"Two hundred light-years away
from Earth is a strange world called
Kepler 16b. It is a planet with two suns,
much like Luke Skywalker's home planet -- Tatooine. Then there is
Kepler 78b,
a rocky planet about the size of Earth, but so close to its star that it
completes an orbit once every eight-and-a-half hours. As a result, it is just a
seething ball of lava. Most baffling of all have been the 'Hot
Jupiters' -- giant gas planets in impossible orbits close to their star.
Watson, along with most of the astronomical community, was initially baffled:
'They were right up against their host star and it's amazing. They really
shouldn't be there.'
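[To see how tight an 8.5-hour orbit is, Kepler's third law gives the orbital radius: a = (GMP²/4π²)^(1/3). Assuming a roughly Sun-like host star (my assumption, purely for scale), with GM ≈ 1.3 × 10²⁰ m³ s⁻² and P ≈ 3.1 × 10⁴ s, a comes out at about 1.5 × 10⁹ m -- roughly 0.01 astronomical units, only about two stellar radii from the star's centre, which is why the planet is molten -- RL.]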
"The trouble with these rogue
planets is that they seem to break the laws of physics. It's simply not possible
for them to form that close to a star. But, if you eliminate the impossible,
then whatever remains, however improbable, must be the truth. The startling
conclusion reached by
Dr Watson and others is that the planets formed further
away, and then migrated in to where we see them today. They changed orbit. This
idea that planets could change orbit, moving closer or further away from their
star, turned astronomy on its head and led to one of the biggest rethinks since
the time of Copernicus and Galileo. It led astronomers like Dr Kevin Walsh, from
the Southwest Research Institute in Boulder, Colorado, to question everything
about the birth and evolution of our own Solar System.
"'When we started finding
planets in places we thought you could never possibly form a planet, we had to
go back to the drawing board and say, "wow, planets can move, planets can really
move. Maybe that happened here",' he says.
"Since the time of Galileo, we have assumed
that the planets in our Solar System follow stable, fixed orbits. We assumed
that the planets were formed where we find them now -- from the left-over dust
and gas when the Sun burst into life. We assumed that our Solar System has, for
four-and-a-half billion years, been a peaceful haven, stable for sufficient time
to enable life to evolve on Earth. And yet there are some nagging mysteries
about our own system of planets that have never been explained. Mars, for
example, is much smaller than we would expect; the asteroid belt is divided into
two neat bands -- an inner band of rocky material, and an outer band of icy
lumps.
"New narrative
"No-one ever found a
satisfactory explanation for these mysteries, and yet they are clues to a
turbulent history that astronomers are beginning to uncover. Since the discovery
of migrating exoplanets, planetary scientists have proposed new models for the
formation of our Solar System. They suggest that, far from being a peaceful and
stable system with fixed orbits, our Solar System underwent periods of chaos,
with Jupiter, the big bully of the Solar System, calling the shots.
"'The Solar System is not this nice, safe,
quiescent place, but can go through periods of intense violence,' says Dr Hal
Levison of the Southwest Research Institute in Boulder. He believes it's time to
re-examine the evidence with fresh eyes: 'Sometimes the blood spattered on the
wall can tell you more about what happened than the body lying on the floor.'
According to Dr Walsh, Jupiter may have undertaken a wild journey through the
Solar System, spreading havoc, stunting the growth of Mars and scattering
everything in its path.
"'Dice roll'
"With a planet the size of Jupiter travelling freely through the system of
planets, the outcome would have been far from certain. 'The Solar System could
have done a lot of different things. It could have evolved in a lot of different
ways. We could have ended up with our Jupiter right next to the Sun,' he
explains. The latest evidence suggests our Solar System underwent a period of
chaotic upheaval, before settling down to the stable system we see today. Our
own home, planet Earth -- in the perfect place for life to evolve -- was lucky to
survive.
"Dr Walsh comments: 'Getting an Earth where we have our Earth today was not a
given when this whole Solar System started.' So, astronomers are left with the
question -- how common is a solar system like ours? There may be plenty of
planets out there. But getting a stable, ordered arrangement like our own Solar
System, could just be a lucky roll of the dice.
"You can watch
Horizon: Secrets of the Solar
System on BBC2 on Tuesday 3 March at 21:00." [Quoted
from
here; accessed 08/03/2015.
Some links added; several paragraphs merged to save space. Quotation marks
altered to conform with the conventions adopted at this site. The entire programme
can be accessed
here.]
The programme itself featured quotes from
scientists (and the editors) like the following: "If we want to understand the
birth of our Solar System, we're going to need a brand new model", "There's some
mysteries when we look around the Solar System, where the theories really don't
match what we see", "We started finding planets in places we'd never thought you
could possibly form a planet. We had to go back to the drawing board. It's
changed the way we look at almost every process in the Solar System", and
"Strange new worlds that break all the rules have made astronomers question
everything."
Add to the above the following article about
"Wandering Jupiter":
"Wandering
Jupiter swept away super-Earths, creating our unusual Solar
System
"Jupiter may have swept through the early Solar System like
a wrecking ball, destroying a first generation of inner
planets before retreating into its current orbit, according
to a new study published March 23rd in
Proceedings of the National Academy of Sciences.
The findings help explain why our Solar System is so
different from the hundreds of other planetary systems that
astronomers have discovered in recent years.
"'Now
that we can look at our own Solar System in the context of
all these other planetary systems, one of the most
interesting features is the absence of planets inside the
orbit of Mercury,' said Gregory Laughlin, professor and
chair of astronomy and astrophysics at UC Santa Cruz and
coauthor of the paper. 'The standard issue planetary system
in our galaxy seems to be a set of super-Earths with
alarmingly short orbital periods. Our Solar System is
looking increasingly like an oddball.' The new
paper explains not only the 'gaping hole' in our inner Solar
System, he said, but also certain characteristics of Earth
and the other inner rocky planets, which would have formed
later than the outer planets from a depleted supply of
planet-forming material.
"Laughlin and coauthor
Konstantin Batygin explored the implications of a leading scenario for the
formation of Jupiter and Saturn. In that scenario, proposed by another team of
astronomers in 2011 and known as the 'Grand Tack,' Jupiter first migrated inward
toward the Sun until the formation of Saturn caused it to reverse course and
migrate outward to its current position. Batygin, who first worked with Laughlin
as an undergraduate at UC Santa Cruz and is now an assistant professor of
planetary science at the California Institute of Technology, performed numerical
calculations to see what would happen if a set of rocky planets with close-in
orbits had formed prior to Jupiter's inward migration.
"At that time, it's plausible
that rocky planets with deep atmospheres would have been forming close to the
Sun from a dense disc of gas and dust, on their way to becoming typical
'super-Earths' like so many of the exoplanets astronomers have found around
other stars. As Jupiter moved inward, however, gravitational perturbations from
the giant planet would have swept the inner planets (and smaller planetesimals
and asteroids) into close-knit, overlapping orbits, setting off a series of
collisions that smashed all the nascent planets into pieces. 'It's the same thing we
worry about if satellites were to be destroyed in low-Earth orbit. Their
fragments would start smashing into other satellites and you'd risk a chain
reaction of collisions. Our work indicates that Jupiter would have created just
such a collisional cascade in the inner Solar System,' Laughlin said.
"The resulting debris would
then have spiralled into the Sun under the influence of a strong 'headwind' from
the dense gas still swirling around the Sun. The ingoing avalanche would have
destroyed any newly-formed super-Earths by driving them into the Sun. A second
generation of inner planets would have formed later from the depleted material
that was left behind, consistent with evidence that our Solar System's inner
planets are younger than the outer planets. The resulting inner planets --
Mercury, Venus, Earth, and Mars -- are also less massive and have much thinner
atmospheres than would otherwise be expected, Laughlin said. 'One of the predictions of
our theory is that truly Earth-like planets, with solid surfaces and modest
atmospheric pressures, are rare,' he said.
"Planet
hunters have detected well over a thousand exoplanets
orbiting stars in our galaxy, including nearly 500 systems
with multiple planets. What has emerged from these
observations as the 'typical' planetary system is one
consisting of a few planets with masses several times larger
than the Earth's (called super-Earths) orbiting much closer
to their host star than Mercury is to the Sun. In systems
with giant planets similar to Jupiter, they also tend to be
much closer to their host stars than the giant planets in
our Solar System. The rocky inner planets of our Solar
System, with relatively low masses and thin atmospheres, may
turn out to be fairly anomalous.
"According to Laughlin, the formation of giant planets like
Jupiter is somewhat rare, but when it occurs the giant
planet usually migrates inward and ends up at an orbital
distance similar to Earth's. Only the formation of Saturn in
our own Solar System pulled Jupiter back out and allowed
Mercury, Venus, Earth, and Mars to form. Therefore, another
prediction of the paper is that systems with giant planets
at orbital periods of more than about 100 days would be
unlikely to host multiple close-in planets, Laughlin said.
"'This kind of theory, where first this happened and then
that happened, is almost always wrong, so I was initially
sceptical,' he said. 'But it actually involves generic
processes that have been extensively studied by other
researchers. There is a lot of evidence that supports the
idea of Jupiter's inward and then outward migration. Our
work looks at the consequences of that. Jupiter's 'Grand
Tack' may well have been a 'Grand Attack' on the original
inner Solar System.'" [Quoted from
here; accessed 06/08/2015;
quotation marks altered to conform with the conventions
adopted at this site. Several paragraphs merged to save
space.]
November 2015: The following article appeared in the New Scientist; it shows that the centuries-old Copernican Cosmological Principle is not only threatened by recent observations but has also served as a "form of representation". Notice how physicists will consider certifiably harebrained ideas in order to protect this Principle, allowing them to continue using it as an inferential or explanatory device (i.e., as a "paradigm") -- and how this also puts serious pressure on Einstein's Theory of Relativity:
"A giant hole in the web of galaxies that fills the
cosmos. A colossal string of
quasars
billions of light years across. A ring made out of
hugely energetic bursts of radiation that spans 6
per cent of the visible universe. As our
observations of the cosmos come into ever sharper
focus, astronomers are beginning to identify
structures bigger than any seen before. There's
only one problem: none of them should be there.
"Ever since
Copernicus proposed his revolutionary idea that
Earth's place among the stars is nothing special,
astronomers have regarded it as fundamental. The
cosmological
principle
it has evolved into goes a step further, stating
that nowhere in the universe is special. You're
allowed to have patches of individuality on the
level of solar systems, galaxies and galaxy
clusters, of course, but zoom out far enough and the
universe should exhibit a drab homogeneity. No vast
galactic walls or bald spots, and no huge
structures.
Small wonder that the spate of recent findings has
got cosmologists hot under the collar. But the
solution could prove equally controversial. One
researcher claims these massive structures are
illusions projected from another dimension, the
first tantalising evidence of realities beyond our
own. If he is right, and these behemoths don't exist
as physical objects within our universe, then the
cosmological principle might still be safe.
"The
concept of favoured regions in the universe is
anathema to modern cosmology. 'All our thinking
since the Renaissance has been working against that
idea,' says Seshadri Nadathur, a cosmologist at
the University of Portsmouth in the UK. It also
makes using Einstein's general theory of relativity
to understand gravity's role in the evolution of our
universe an even more fiendish task than it already
is. 'Einstein's equations are much easier to solve
if you assume a universe that's almost homogeneous,'
says Nadathur. But, at the moment, the cosmological
principle is just that -- an assumption. There is no
concrete evidence that it is true, and the evidence
we do have seems increasingly against it.
"Take that giant hole
in the universe -- a void almost 2 billion light
years wide, according to its co-discoverer, András
Kovács of the Institute for High Energy Physics in
Barcelona, Spain. 'There are 10,000 fewer galaxies
in that part of the sky compared with the universal
average,' says Kovács. Based on the latest data,
astronomers believe that the cosmological principle
must apply on scales of roughly a billion light
years, with the average amount of material in any
given volume more or less the same. A big empty
patch almost double the size of the cut-off stands
out like a sore thumb. Kovács and his team call
this vast expanse a
supervoid,
and believe it might explain away the
giant cold spot in the
cosmic microwave background, an
observation that has been puzzling astronomers for
over a decade....
"And the supervoid
isn't the half of it. As far back as 2012, a team
led by Roger Clowes at the University of Central
Lancashire, UK, claimed to have found an enormous
structure strung out over 4 billion light years --
more than twice the size of the supervoid. 'We
thought "what is that!?" It was obviously something
very unusual,' says Clowes. Yet this time it wasn't
an empty patch of space, but a particularly crowded
one. Known as the
Huge Large Quasar
Group,
it contains 73 quasars -- the bright, active central
regions of very distant galaxies. Astronomers
have known since the early 1980s that quasars tend
to huddle together, but never before had a grouping
been found on such a large scale.
"Then earlier this year a team of
Hungarian astronomers uncovered a
colossal group of
gamma-ray
bursts (GRBs) --
highly energetic, short-lived
flashes of energy erupting from
distant galaxies. The galaxies
emitting these GRBs appear to
form a ring a whopping 5.6 billion
light years across -- 6 per cent of
the size of the entire visible
universe. 'We really didn't expect
to find something this big,' says
Lajos Balázs from the Konkoly
Observatory in Budapest, Hungary,
who led the study. Its size makes it
five times larger than the typical
scale at which the cosmological
principle tells us that homogeneity
should kick in.
"So fundamental is the cosmological
principle to our understanding of
the universe that such apparent
violations make astronomers and
cosmologists deeply uncomfortable,
even those who discovered them in
the first place. When it comes to
the intense flashes of light that
make up the GRB ring, for instance,
there's a possibility they might be
surrounded by other galaxies,
currently shining less brightly
because of an absence of GRBs. It's
like being in a darkened room in
which light bulbs are evenly
distributed: if only a few are
illuminated when you look into the
room, you're likely to draw the
wrong conclusions about how they are
arranged. 'It doesn’t necessarily
contradict the cosmological
principle,' says Balázs.
"Rise of the
giant-killers
"The huge large
quasar group is also
the subject of
intense debate. 'I
don't think it's
really a structure
at all,' says
Nadathur. In 2013,
he published a paper
studying the
algorithm Clowes and
his team used to
analyse their data,
calculating the
probability that a
random distribution
of quasars would
also yield an
apparent structure.
'The chances of
seeing a pattern
like the one they
see, even if there
is nothing there, is
quite high,' he
says. But the giant
might not be dead
just yet. Clowes's
PhD student, Gabriel
Marinello, is
working on a paper
countering
Nadathur's claims,
which he describes
as 'conservative and
unrealistic'. He
argues that instead
of modelling a
random distribution,
Nadathur should have
included the fact
that quasars -- just
like other galaxies
-- are known to
huddle together on
scales of around 300
million light years.
"As well as the
quasar group,
Nadathur thinks the supervoid could
also be reconciled
with the
cosmological
principle. 'The
principle is not
saying that any one
place cannot
fluctuate from the
norm, just that on
average the
large-scale universe
must be
homogeneous,' says
Nadathur. In short,
the probability of
finding objects like
the supervoid is not
zero. There just
can't be too many of
them. But Rainer Dick, a
theoretical
physicist at the
University of
Saskatchewan,
Canada, believes
such attempts to
brush these cosmic
megastructures aside
are misguided. In
fact, he says they
should be embraced
as our best bet of
keeping the
cosmological
principle alive. All we have to do is
accept that they
don't actually
exist. Instead, they
represent the first
evidence of other
dimensions intruding
into our own,
leaving dirty
footprints behind on
our otherwise smooth
and homogeneous
cosmic background.
"It seems a
breathtakingly
audacious proposal
-- but it builds on
a solid foundation
of theoretical work.
For one thing,
conjuring up other
dimensions beyond
our own is nothing
new. For decades,
many theorists have
regarded the
existence of extra
dimensions as our
best hope of
reconciling
Einstein's general
relativity with that
other bastion of
20th century
physics: quantum
theory. A marriage
between these two
seemingly disparate
concepts, one
dealing with the
very large and the
other with the very
small, would yield
what is often called
a theory of
everything, a
one-size-fits-all
framework capable of
describing the
universe in its
entirety.
"One popular
candidate is
M-theory,
an extension of
string theory that
famously suggests we
live in an 11
dimensional
universe, with the
other seven
dimensions curled up
so tightly as to
drop out of sight.
It's an elegant and
mathematically
appealing framework
with a number of
influential
supporters. But it
has one major
failing: the lack of
solid predictions
offering
opportunities to
verify it. Dick's
work on a
generalisation of
string theory known
as
brane theory
might provide just
such a prediction,
and resolve the
cosmological
principle dilemma to
boot.
"At the heart of
brane theory is the
idea that what we
perceive as our
universe is a single
four dimensional
membrane floating in
a sea of similar
branes spanning
multiple extra
dimensions. Such an
idea is not
inconsistent with
our established
theory of gravity,
says Dick, as 'you
can add infinitely
large extra
dimensions and still
get back general
relativity'. Although the other
branes occupy extra
dimensions, and so
would be impossible
to observe directly,
the theory suggests
we might just be
able to spot the
effects of a
neighbouring brane
overlapping with
ours.
"So how does this
help with the
problem of the
cosmological
principle? Well, in
order to measure our
distance to far-off
objects, astronomers
exploit an effect
known as
redshift.
They break down the
light from the
object using a
spectrometer -- a
fancy version of a
prism -- to reveal
bands known as
spectral lines. Any
object moving away
from us because of
the universe's
ongoing expansion
will have its light
stretched out to
longer, redder
wavelengths and the
lines will appear
shifted towards the
red end of the
spectrum. The
further away the
object, the faster
it will appear to
recede and the more
the lines will
shift. If
astronomers see many
objects all
exhibiting the same
redshift, they will
interpret that as
some form of
structure, just like
the GRB ring or the
huge quasar group.
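[For readers unfamiliar with the method: the redshift is defined as z = (λ_observed − λ_emitted)/λ_emitted; for nearby objects the recession velocity is roughly v ≈ cz, and Hubble's law, v = H₀d, then converts that velocity into a distance. So a set of objects sharing the same measured z is read as a structure sitting at the same distance -- which is precisely the inference the 'brane crosstalk' discussed below would corrupt. These are the standard textbook relations, not anything specific to this article -- RL.]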
"Except, looking
into a region where
another brane is
overlapping with our
own might skew our
redshift
measurements. Under
these conditions,
photons in one brane
would exert a force
on charged particles
in another -- a
phenomenon Dick
calls brane
crosstalk. 'This
would change the
distance between the
energy levels within
hydrogen atoms in
the overlap region,'
he says. Electrons
moving between these
energy levels either
emit or absorb
photons, producing
the spectral lines
we rely on for
working out their
distance from Earth.
"But if brane
crosstalk were to
narrow the
energy-level gap,
this would produce
photons of a
slightly longer
wavelength -- a
redshift that has
nothing to do with
the expansion of the
universe. If you
fail to take this
into account, and
assume the overall
redshift you measure
is solely the result
of distance, then
you will
systematically
overestimate how far
away an object in
the overlap region
actually is, with
large swathes of
empty space visible
in its true
location....
"If such a model
held true, areas of
brane overlap would
produce an apparent
pile-up of objects
at one redshift and
a distinct lack of
objects at another
-- an optical
illusion that would
make a homogeneous
universe appear to
contain massive
structures and
enormous voids. In a
stroke, this would
explain the origins
of the quasar group
and the GRB ring as
well as the
supervoid, says
Dick. 'These
structures match the
potential signal of
brane crosstalk.'
"Of course, it's
hardly an
open-and-shut case.
'There are many
assumptions that one
must accept in order
for this to happen,
and some of them may
just be taking
things a bit too
far,' says Moataz
Emam from the State
University of New
York College at
Cortland. Emam also
warns that some of
the assumptions
about gravity that
Dick's theory relies
on have been
severely criticised
in the past, not
least by string
theorists who have
had difficulty
reconciling them
with their
calculations. 'But
his model is
certainly testable,'
he says.
"Emam suggests that
the necessary
evidence could be
found by observing
parts of the sky
where high density
regions coexist next
to apparent barren
patches. Provided
the discrepancy in
redshift
measurements is
identical in all
cases, it might well
suggest that our
brane is overlapping
with another.
With the help of
the
Sloan Digital Sky
Survey
(SDSS) -- the most
detailed
three-dimensional
map of the universe
ever made -- Dick is
now planning to
scour the databases
for redshift data
that could support
his theory. 'That
really would be
compelling evidence
that our universe is
not alone,' he says.
Such a discovery
would not only
explain away some of
the most perplexing
observations in
astronomy, but give
the abstract field
of string theory a
tantalising
experimental
foundation.
"But his quest to
cut the universe's
largest objects down
to size might lead
to new monsters
arising in their
place. The
discovery of branes
beyond our own, for
instance, would pose
a serious challenge
to humanity's
fragile sense of its
place in the cosmos,
and make a nonsense
of our concept of
cosmic homogeneity.
In a vast multiverse
of interacting
membranes, the
cosmological
principle might not
be worth saving
after all."
[Quoted from
here;
accessed 13/11/2015.
Quotation marks
altered to conform
with the conventions
adopted at this
site. Bold emphases
and some links
added. Several
paragraphs merged to save space.]
"We must be missing something.
The universe is expanding 9 per cent
faster than it should be. Either our best measurements are
wrong, or a glimmer of new physics
is peeking through the cracks of
modern cosmology.
"If that's the case, some
lightweight, near-light-speed
particles may be missing from our
picture of the universe shortly
after the big bang. But we might be
in luck. Particle physicists have
already spent over a decade chasing
something that fits the bill:
ghostly
neutrinos
unlike the three already known. For
a cosmological quandary, the issue
isn't that complicated: two ways
of measuring how quickly the
universe is flying apart are coming
up with increasingly different
numbers. The first looks at
dimples in the cosmic microwave
background, a glow left behind by
the hot, soupy universe just a few
hundred thousand years after the big
bang. The size of these fluctuations
let us calculate how quickly the
universe was expanding when it began
some 13.7 billion years ago.
"The
other method measures how distant
galaxies appear to recede from us as
the universe expands -- which led to
the discovery of dark energy (sic --
RL!), a mysterious outward pressure
pushing the universe apart. The
trouble comes when we compare the
two estimates. 'They don't agree,'
says Adam Riess of the Space
Telescope Science Institute in
Baltimore, Maryland, one of the
recipients of the 2011 Nobel prize
in physics for dark energy's
discovery and an author of a new
paper pointing out the tension (arxiv.org/abs/1604.01424).
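[The '9 per cent' is simply the gap between the two headline values of the Hubble constant at the time: roughly 67 km s⁻¹ Mpc⁻¹ from the cosmic microwave background and roughly 73 km s⁻¹ Mpc⁻¹ from the supernova-calibrated distance ladder, and (73 − 67)/67 ≈ 0.09. Those round figures are mine, quoted from memory of the 2016 measurements, not taken from the article -- RL.]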
"So what are we missing? Our picture
of what the universe is made of
can't change much, since it agrees
so well with observations. These
show that the history of the
universe has been a balancing act
between just a few ingredients,
which competed for dominance as the
universe stretched and changed. This
model of the cosmos has been the
mainstream idea for years, but it's
showing signs of strain. 'We've
given these really smart kids, the
young cosmologists, what we thought
was a pretty good toy, and now
they're trying to break it,' says
Michael Turner at the University of
Chicago. 'Maybe they have.' Would tweaking the ingredients
themselves help make sense of the
difference?
"One possibility is that dark energy
is a little stronger than we
thought. Or it could have ramped up
over time, giving expansion a bigger
push. That's not a very appealing
theory, though, says Avi Loeb of
Harvard University. The measured
strength of dark energy is already a
'big headache', he says. Letting it
vary in time would add another,
perhaps unjustifiable, wrinkle.
'That would be twice as much pain,'
Loeb says.
"But the deeper problem with
darkening dark energy is that it
doesn't do enough to bridge the gap
between the ancient and modern
measurements. Fiddling with
dark energy enough to help would put
it into disagreement with other
observations. 'You can only do
this so much,' Riess says. The
easiest solution, says Riess, is
dark radiation: small, unknown
particles similar to neutrinos,
moving close to the speed of light
around the beginning of time. This
is the period when effects from
undiscovered particles would have
been felt most strongly.... In our
current understanding, as the
universe expanded, dark energy
filled the space formed, with matter
becoming more dilute. Through a war
of attrition, the outward-pushing
dark energy came to dominate matter.
"Weaker
brakes
"But if some mass was trapped in
light, fast-moving particles, dark
energy would have won even more
quickly. That's because as the
universe expanded, stretching space
would have shifted the particles to
lower energies, weakening their
pull. Adding this ingredient into
the standard account of the early
universe could bring the modern and
primitive expansion rates back in
line -- not because the foot on the
accelerator was heavier than
expected, but because back then the
brakes were a little weaker.
"There may be a chance that we have
already glimpsed a dark radiation
particle. For years, we have seen
hints of so-called 'sterile'
neutrinos, which would interact with
gravity and the three known
neutrinos, but little else.
Vexingly, measurements rule out the
simplest version of sterile
neutrinos as our missing particle.
But there may be room for something
stranger still.
"'Let's say these neutrinos are not
truly sterile,' says Alexander
Friedland at the Stanford Linear
Accelerator in California. 'They
have their own interactions, and
they are part of some hidden sector
-- some world which exists right
under our noses but interacts with
our world extremely weakly.' If so, such neutrinos could be the
missing ingredient. And through
neutrino experiments and ever-better
studies of the early universe, we
might know within the next decade if
a hidden sector of particles offers
a way out.
"'This is where we are,' Friedland
says. 'There are hints, and they
will be tested.' [New Scientist,
11/06/2016, pp.8-9. Quoted from
here;
accessed 12/06/2016. Quotation marks
altered to conform with the
conventions adopted at this site.
Several paragraphs merged to save
space. The on-line article and the
published version have different
titles. Bold emphases and one link
added. Several paragraphs merged to
save space.]
It seems that when cosmologists encounter a problem they just invent another particle -- a bit like astronomers inventing a new planet whenever they observe an unexpected wobble in another planet's orbit (which does not always turn out to be a wise move), and a bit like theoretical physicists helping themselves to a new 'dimension' whenever they encounter problems in M-theory.