The evolutionary secrets of garden flowers described at Birkbeck’s Science Week

This post was contributed by Tony Boniface, a member of the University of the Third Age.

Science Week logo

On 3 July, Dr Martin Ingrouille, of Birkbeck’s Department of Biological Sciences, began his talk by pointing out that Darwin had studied plants for 40 years and had published books on pollination. However, Darwin knew nothing of genes and chromosomes and could not explain the rapid origin of flowering plants in the Cretaceous period.

Dr Ingrouille continued by emphasising that garden plants are often sterile, exotic plants growing without their natural pollinators. They have been selected for showiness, and many are artificial hybrids. He referred to Goethe, who stressed the essential unity of floral parts, which have all evolved from leaves.

Dr Ingrouille explained how genetic control, in its simplest form, consists of three classes of genes: A, B and C. Class A genes control sepals and petals, class B genes control petals and stamens, and class C genes control stamens and carpels. Mutations in these genes result in one floral part being converted into another.
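The combinatorial logic here is simple enough to sketch in code. The snippet below is an illustrative simplification, not something presented in the talk: each whorl of the flower expresses a subset of the A, B and C classes, the active combination determines the organ, and knocking a class out converts parts into others.

```python
# A minimal sketch of the ABC model's combinatorial logic (an illustrative
# simplification, not code from the lecture). Whorl numbering runs from the
# outside of the flower (1, sepals) to the centre (4, carpels).

WHORL_EXPRESSION = {
    1: {"A"},
    2: {"A", "B"},
    3: {"B", "C"},
    4: {"C"},
}

ORGAN_IDENTITY = {
    frozenset({"A"}): "sepal",
    frozenset({"A", "B"}): "petal",
    frozenset({"B", "C"}): "stamen",
    frozenset({"C"}): "carpel",
}

def flower(knocked_out=frozenset()):
    """Return the organ formed in each whorl, with optional gene-class mutations.

    One standard refinement of the model is assumed here: classes A and C
    repress each other, so losing one lets the other spread into its whorls.
    """
    organs = {}
    for whorl, classes in WHORL_EXPRESSION.items():
        active = classes - set(knocked_out)
        if "A" in knocked_out and "C" not in knocked_out:
            active = active | {"C"}
        if "C" in knocked_out and "A" not in knocked_out:
            active = active | {"A"}
        organs[whorl] = ORGAN_IDENTITY.get(frozenset(active), "leaf-like")
    return organs

print(flower())                   # wild type: sepal, petal, stamen, carpel
print(flower(knocked_out={"B"}))  # class B lost: sepal, sepal, carpel, carpel
```

Running the mutant case reproduces the classic homeotic conversions: a flower whose petals have become sepals and whose stamens have become carpels.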

Floral evolution could have resulted from the duplication of basic genes, allowing one copy to perform its normal function while the other gave rise to a novel structure or function. New plant species have often arisen by chromosome doubling in a sterile hybrid, as seen in the formation of Primula kewensis.

Dr Ingrouille then explained how much insight into plant evolution arose from the work of John Gerard (gardener to William Cecil), John Ray (author of the first modern textbook of botany) and the de Jussieu family (three generations of gardeners to the king of France). Their groupings of plants formed the first natural classification of the angiosperms.

DNA sequencing has now produced a detailed understanding of the phylogeny, or evolutionary history, of these plants. Many of the traditional families, such as the umbellifers and legumes, have survived this reassessment, but some, such as the figwort family, have been split. The result is an arrangement of flowering plants into two main groups: the Eudicots, with three grooves on their pollen grains, and the Basal Angiosperms, with only one. Within the Eudicots are the Core Eudicots, which include the Rosids and the Asterids, while the Monocots fall within the Basal Angiosperms. The earliest-diverging living lineage is represented by Amborella trichopoda, a weedy shrub from New Caledonia in the Pacific – a place Dr Ingrouille hopes to visit on his retirement.
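These groupings form a nested hierarchy, which a few lines of code can make explicit. The sketch below is a heavily pruned, hypothetical illustration; real angiosperm phylogenies contain hundreds of families.

```python
# A heavily pruned sketch of the angiosperm groupings described above
# (illustrative only; real phylogenies hold far more detail).
ANGIOSPERMS = {
    "Basal Angiosperms": {            # pollen grains with one groove
        "Amborella trichopoda": {},   # earliest-diverging living lineage
        "Monocots": {},
    },
    "Eudicots": {                     # pollen grains with three grooves
        "Core Eudicots": {
            "Rosids": {},
            "Asterids": {},
        },
    },
}

def lineage(name, tree=ANGIOSPERMS, path=()):
    """Depth-first search for a group, returning the path from the root."""
    for group, children in tree.items():
        if group == name:
            return path + (group,)
        found = lineage(name, children, path + (group,))
        if found:
            return found
    return None

print(" > ".join(lineage("Asterids")))  # Eudicots > Core Eudicots > Asterids
```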

Dr Ingrouille finished by urging his audience – all members of the University of the Third Age (a movement for retired and semi-retired people to come together and learn together) – to examine their garden plants in detail and look for the variations that suggest their origins.

Exploring the hidden complexities of routine behaviour at Birkbeck’s Science Week

This post was contributed by Guy Collender, Communications Manager, Birkbeck’s Department of External Relations.

Dr Richard Cooper at Birkbeck’s Science Week

How often do you forget to attach the relevant document when you are sending emails? When was the last time you accidentally put the coffee in the fridge instead of the milk? Or, more alarmingly, when did you last drive away from the petrol station with the pump nozzle still in your car? (Yes, believe it or not, there is ample photographic evidence that this happens.)

Such errors, made during routine tasks, were the centre of attention at a fascinating lecture, entitled The hidden complexities of routine behaviour, during Birkbeck’s Science Week. Dr Richard Cooper explained why it is important to understand routine behaviour, why mistakes are made during everyday tasks, and the implications for the rehabilitation of brain-damaged patients.

Benefits of routine behaviour
The presentation on 3 July began with a description of routine behaviour and its advantages. Dr Cooper, of Birkbeck’s Department of Psychological Sciences, defined routine tasks, such as dressing, grooming, preparing meals, and cleaning, as frequently performed tasks carried out in a stable and predictable environment. By automatically performing various stages in a routine task, people do not have to plan every action on a moment-by-moment basis. This, as Dr Cooper showed, saves the mental exertion associated with constant planning, and enables the brain to think about other things when performing routine tasks.

Difficulties associated with routine tasks
However, routine tasks are prone to error, especially following an interruption, and these mistakes may have “catastrophic consequences”, including vehicle collisions and industrial accidents. Dr Cooper said: “Routine behaviour is not something we can take for granted.”

The lecture continued with a list of different types of errors made while performing routine tasks. These include omission errors (leaving out a vital task), perseverative errors (repeating an action even though the goal has been achieved), and substitution errors (mixing up objects).

Dr Cooper showed how people with brain injuries are much more prone to making these mistakes. He said: “Neurological patients can have a much more difficult time.” They can suffer from a range of problems, including anarchic hand syndrome (where one hand performs involuntary movements), frontal apraxia (which leads to patients making sequential errors and substitution errors on a minute-by-minute basis), and ideational apraxia (which leads to the right action in the wrong place – such as trying to light the wrong end of a candle).

Devising solutions
Dr Cooper also referred to studies of brain-damaged patients in rehabilitation clinics and their performance of routine tasks in a controlled environment. He said: “Re-learning must focus on rote learning of the precise procedure, with no variation. Home environments should be designed to minimise distractions.”

Dr Cooper also hinted at future developments in this field: smart devices might one day monitor the performance of routine tasks and flag certain errors. With luck, such technology will help reduce these everyday slips in the years ahead.

Mindfulness Meditation Training

This post was contributed by Lucia Magis-Weinberg, who is doing her PhD under the supervision of Dr Dumontheil and Dr Custers, investigating how motivation affects adolescents’ executive functions (which include self-regulation and attention). Learn more about Dr Dumontheil’s research. Follow us on Twitter @idumontheil and @luciamawe

Inhale. Exhale. Focus your attention on the present moment. Mindfulness meditation (MM) is a type of awareness that involves focusing on moment-to-moment experiences in a non-judgmental and non-reactive way. MM was adopted from the Buddhist tradition, and was originally introduced into Western medicine for the treatment of chronic intractable pain. In adults, it has been shown to improve attention management, emotion regulation and well-being, and to ameliorate anxiety and depression. It has even been reported to boost immune function. But can these benefits be extended to other age groups?

This was discussed as part of Birkbeck’s Science Week by Dr Iroise Dumontheil, from the Department of Psychological Sciences, who talked about her ongoing research on the effects of mindfulness meditation training (MMT) in adolescence. The teenage years are characterised by continued improvements in self-regulation, the ability to exert voluntary control over thought, emotion and action. A deficit in self-regulation results in impaired impulse control and increased sensation-seeking and risk-taking. Furthermore, adolescents can struggle with the regulation of emotions. Failures in self-regulation can have a bigger impact on decisions and behaviour in the teenage years than later in life, as is evident from the alarmingly high rates of death from accidents and violence – two preventable causes – in the second decade of life. Around 75% of mental disorders have their onset before the age of 24. All of these issues could benefit from enhancing adolescents’ ability to self-regulate. Can MMT be one of the ways? Dr Dumontheil’s ongoing research, conducted in collaboration with UCL and the University of Minnesota, is starting to address this question and is motivated by the impact that such interventions could have on adolescent well-being.

There is some initial promise from early research in the adolescent population. It has been shown that MMT is feasible in adolescence, particularly because it could be done in schools. MMT seems to benefit performance in tasks that involve a negative emotional component (being less distracted by angry faces, for example). Additionally, there is evidence that anxiety is decreased. The positive effects of MMT on increased attentional control are similar to those seen previously in adult studies.

Currently, Dr Dumontheil is looking at changes in brain function in response to an eight-week MMT programme, using fMRI, a neuroimaging technique. In adults, it has previously been shown that MMT reinforces self-regulation by strengthening the ability to control thought and action (associated with increased activity in prefrontal regions of the brain) and lessening the influence of anxiety, stress and immediate reactivity (associated with decreased activity in the amygdala). Preliminary data from Dr Dumontheil’s research show changes in the brain regions that control attention.

As was noted by attendees of the lecture, there are very interesting questions yet to be explored, such as gender differences in the response to MMT and whether variants of the standard MMT may be more successful in adolescents and children. While research in psychology and neuroscience sheds light on this interesting phenomenon, let’s reorient our attention to our present moment for now. Inhale. Exhale.

The Many Uses of Bioinformatics

This post was contributed by Dr Clare Sansom, Senior Associate Lecturer in Birkbeck’s Department of Biological Sciences

Dame Janet Thornton

Every year, Birkbeck hosts a lecture by a distinguished scientist to honour the memory of the founder of its Crystallography Department, J.D. Bernal. “Sage”, as he was called by all who worked with him, had an enormous range of research interests spanning both science and society; he is widely considered one of the most brilliant scientists never to have won a Nobel Prize. The 2014 Bernal Lecture, held on March 27, was given by Professor Janet Thornton, the director of the European Bioinformatics Institute (EBI) at Hinxton, near Cambridge.

Introducing the lecture, Professor David Latchman, Master of Birkbeck, described it as a unique occasion: the only time he has introduced as a guest lecturer someone whom he had interviewed for a job. Thornton includes both Birkbeck and UCL on her CV: appropriately, her last post in London was that of Bernal Professor, held jointly at both colleges. She moved on to “even greater heights” as director of one of Europe’s top bioinformatics institutions in 2003.

Thornton began her lecture with a quote from Bernal: “We [academics] can go on being useless up to a point, with confidence that sooner or later some use will be found for our studies”. That quote is of particular relevance to the subject that she has made her own: bioinformatics. She had already begun her research career in 1977, when Fred Sanger invented the sequencing method that was later used to obtain the DNA sequence of the human genome. That endeavour, completed in 2003, took over ten years and cost billions of dollars. Sequencing a human-sized genome, which has about 3 billion base pairs of DNA, now takes maybe 10 minutes and costs about a thousand dollars. While a decade ago we had one “Human Genome”, we now have many. Mega-sequencing projects planned or in progress include one to sequence about 8,000 Finns and the entire 50,000-strong population of the Faeroe Islands; one to sequence paired tumour and normal genomes from 20,000 cancer patients; and the UK10K project, which is investigating the genetic causes of rare diseases.

It is now extraordinarily simple and cheap to obtain genomic data, but real challenges remain in interpreting and understanding it so that it can be used in medicine. This is the province of bioinformatics, and Thornton devoted much of her presentation to explaining five ways in which gene (and protein) sequence information is being applied to both basic and clinical medical research:

1) Understanding the molecular basis of disease

2) Investigating differences in disease risk caused by human genetic variation

3) Understanding the genomics of cancer

4) Developing drugs for infectious diseases, including neglected diseases

5) Investigating susceptibility to infectious disease

There are rather more than 20,000 genes in the human genome, far fewer than were originally predicted. Tiny differences between individuals in many of these either directly cause a genetic disorder or confer an increased – or in some cases decreased – risk of developing a disease. The genetic causes of some diseases, such as the bleeding disorder haemophilia, were known many years before the “genome era”; others have been discovered more recently. Mapping known mutations onto the structure of the enzyme copper-zinc superoxide dismutase has helped reveal how an inherited form of amyotrophic lateral sclerosis, a form of motor neurone disease, arises. And knowing the genome sequence has already made an enormous contribution to our understanding of the mechanisms of disease development, contributing to improvements in diagnosis and the design of novel drugs.

We now understand that cancer is a genetic disease: it arises when mutations in a group of cells cause them to grow and divide excessively. A cancer is no longer classified just by its location (for example, a breast or lung cancer) but by the particular spectrum of genetic variations in its cells. About 500 different genes are known to be mutated in cancer, some much more often than others. For example, about 60% of cases of melanoma, a type of skin cancer, contain one specific mutation in the gene BRAF. This gene codes for a protein that can direct cells to grow and divide, and the cancer-causing mutation locks this protein in the ON position, so the signal is always sent. Scientists at a company called Plexxikon used their knowledge of this mutation and the structure of the protein to design a drug, vemurafenib, which prevents the mutant BRAF protein from signalling. This can produce a dramatic, if short-term, improvement in melanoma patients, but, crucially, it only works in patients whose cancers carry the mutation. It is one of the first examples of a “personalised medicine” that is used only alongside a diagnostic test for a genetic variation. There will soon be many more.
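The “test before treat” logic of such a companion diagnostic is trivial to express; the hard part is the sequencing and biology behind it. Below is a toy sketch using hypothetical patient data; the specific BRAF mutation, widely reported as V600E, is named here as background knowledge rather than taken from the lecture summary above.

```python
# Toy sketch of companion-diagnostic logic (hypothetical patients and
# variant calls). V600E is the activating BRAF substitution that
# vemurafenib was designed against; without it, the drug is not indicated.
PATIENT_TUMOUR_VARIANTS = {
    "patient-001": {"BRAF V600E", "TP53 R175H"},
    "patient-002": {"NRAS Q61K"},
}

def vemurafenib_candidate(patient_id):
    """True only if the tumour carries the mutation the drug targets."""
    return "BRAF V600E" in PATIENT_TUMOUR_VARIANTS.get(patient_id, set())

for patient in sorted(PATIENT_TUMOUR_VARIANTS):
    status = "eligible" if vemurafenib_candidate(patient) else "not eligible"
    print(patient, "->", status)
```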

Genomics is also proving very useful in the fight against infectious disease. Antibiotic resistance is one of the greatest emerging threats to human health, and scientists have to use all the tools at their disposal, including genomics and bioinformatics, as they try to stay one step ahead of rapidly mutating pathogens. Sequencing is widely used to track the sources of outbreaks of infection and of resistant bacteria such as methicillin-resistant Staphylococcus aureus (MRSA) in hospitals, and it is often the only way of determining the exact nature of an infection. One of the most dramatic examples of the use of genomics in infectious disease control occurred in 2011, when a novel strain of E. coli O104 caused about 4,000 cases of serious food-borne illness and 50 deaths in Germany. The outbreak was originally linked to cucumbers imported from Spain, but a global effort to trace its specific sequence variants proved that the source of the infection was beansprouts grown on a farm near Hamburg.

There was much more to Thornton’s wide-ranging lecture than simply bioinformatics and medicine: more, indeed, than it is possible to do justice to in a single blog post. She went on to describe some of the benefits of genomics for agriculture and food security. These included designing new strategies for controlling pests and diseases, maximising the efficiency of biomass processing, and even managing biodiversity. It is necessary to measure biodiversity in order to manage it properly; it is now possible to define a short stretch of DNA sequence that fully identifies a species or sub-species (a so-called “DNA barcode”) and these are beginning to be used to track some very diverse organisms, including the 400,000 known species of beetle.
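The idea behind DNA barcoding can be illustrated in a few lines of code. The sketch below uses invented species and unrealistically short sequences, with a naive nearest-match rule; real barcoding relies on much longer marker regions and proper sequence-alignment tools.

```python
# Toy DNA barcoding (invented species and sequences; real barcodes are
# hundreds of bases long and matched with alignment software, not this).
REFERENCE_BARCODES = {
    "ACGTACGTGGCA": "beetle species A",
    "ACGTTCGTGGTA": "beetle species B",
    "TTGTACCTGGCA": "beetle species C",
}

def mismatches(a, b):
    """Count positions at which two equal-length sequences differ."""
    return sum(x != y for x, y in zip(a, b))

def identify(query):
    """Assign the query to the reference species with the closest barcode."""
    best = min(REFERENCE_BARCODES, key=lambda ref: mismatches(query, ref))
    return REFERENCE_BARCODES[best], mismatches(query, best)

species, distance = identify("ACGTACGTGGCT")
print(species, distance)  # beetle species A, 1 mismatch
```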

The lecture ended with a short discussion of some of the challenges facing bioinformatics and genomics in the second decade of this century, largely relating to the difficulties of storing, manipulating and understanding the enormous quantity of data being generated. Mining this data mountain for the benefit of mankind is a task beyond either the academic community or the biotech industry alone. It will require novel ways of doing science that involve governments and charities as well as academia and industry. The new Centre for Therapeutic Target Validation, launched at Hinxton on the same day as Thornton’s Bernal Lecture, is a pioneering example of such a partnership. It has been set up by the EBI, the Sanger Institute (where a third of the original human genome sequence was obtained) and the pharmaceutical giant GSK, and its scientists aim to use the whole range of available genomic data to select and evaluate new targets for novel drugs.

A podcast of the 2014 Bernal Lecture is available now.