Reading Without Seeing

By Melissa Wright

When the seeing brain goes blind

In the late 1990s, a blind 63-year-old woman was admitted to a university hospital emergency room. After complaining to co-workers of light-headedness that morning, she had collapsed and become unresponsive. She was found to have suffered a bilateral occipital stroke, yet within 48 hours of admission she had recovered with no apparent motor problems or other obvious neurological deficits. It was only when she tried to read that an extraordinary impairment became apparent: despite the damage being confined to the occipital lobe, which is typically devoted to vision, she had completely and specifically lost the ability to read Braille. Braille is a tactile substitute for written letters, consisting of raised dots that can be felt with the fingertips. Before her stroke, she had been a proficient Braille reader with both hands, a skill she had used extensively during her university degree and career in radio (Hamilton, Keenan, Catala, & Pascual-Leone, 2000). So what happened?

The Visual Brain

It is estimated that around 50% of the primate cortex is devoted to visual functions (Van Essen, Anderson, & Felleman, 1992), with the primary visual areas located right at the back of the brain within the occipital lobe (also known as the visual cortex). Visual information from the retina first enters the cortex here, in an area named V1. Within V1, this information is organised to reflect the outside world, with neighbouring neurons responding to neighbouring parts of the visual field. This map (called a retinotopic map) is biased towards the central visual field (the most important part!) and is so accurate that researchers have even managed to work out which letter a participant is looking at, simply from their brain activity (Polimeni, Fischl, Greve, & Wald, 2010). These retinotopic maps are found in most visual areas in some form. As information is passed forward in the brain, the role of these visual areas becomes more complex, from motion processing, to face recognition, to visual attention. Even a seemingly basic visual task, like finding a friend in a crowd, requires a hugely complex chain of processes. With so much of the cortex devoted to processing visual information, what happens when visual input from the retina never occurs? Cases such as the one above, where a person is blind, suggest that the visual cortex is put to use in a whole new way.
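To get a feel for why a retinotopic map makes this kind of 'mind reading' possible, here is a deliberately toy sketch in Python. It is not the method used by Polimeni and colleagues; it simply illustrates the core idea that, because V1 activity preserves the spatial layout of the stimulus, a viewed letter can be recovered by matching the activity pattern against letter templates. The 5x5 grid, the templates, and the noise level are all invented for illustration.

```python
import numpy as np

# Toy 5x5 "visual field" templates for two letters (1 = part of the letter).
TEMPLATES = {
    "T": np.array([[1, 1, 1, 1, 1],
                   [0, 0, 1, 0, 0],
                   [0, 0, 1, 0, 0],
                   [0, 0, 1, 0, 0],
                   [0, 0, 1, 0, 0]], dtype=float),
    "L": np.array([[1, 0, 0, 0, 0],
                   [1, 0, 0, 0, 0],
                   [1, 0, 0, 0, 0],
                   [1, 0, 0, 0, 0],
                   [1, 1, 1, 1, 1]], dtype=float),
}

def decode_letter(activity_map):
    """Pick the template whose spatial pattern best matches the activity map."""
    scores = {letter: np.corrcoef(activity_map.ravel(), tpl.ravel())[0, 1]
              for letter, tpl in TEMPLATES.items()}
    return max(scores, key=scores.get), scores

# Simulated noisy "V1 activity" while a participant views a T:
rng = np.random.default_rng(0)
observed = TEMPLATES["T"] + 0.4 * rng.standard_normal((5, 5))
letter, scores = decode_letter(observed)
print(letter, {k: round(v, 2) for k, v in scores.items()})
```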

Cortical Changes

In sighted individuals, lexical and phonological reading processes activate frontal and parieto-temporal areas (e.g. Rumsey et al., 1997), while touch involves the somatosensory cortex. It was long thought that Braille reading activated these same areas, causing some reorganisation of the somatosensory cortex. However, as the case above suggests, this does not seem to be the whole story (Burton et al., 2002). Remember that in this instance the unfortunate lady had damage to the occipital lobe, which is normally involved in vision; because she had been blind from birth, this region had never received any visual information. You might expect that damage to this area would not be a problem for someone who is blind, yet it turned out to impair abilities associated with language and touch. This went squarely against what scientists had understood about the brain and its specialised areas, and it had to be investigated.

Neuroimaging techniques such as functional Magnetic Resonance Imaging (fMRI) allow us to look inside the brain and see which areas are activated when a person performs a certain task. Using this technique, researchers have found that in early blind individuals, large portions of the visual cortex are recruited when reading Braille (Burton et al., 2002). This activity was weaker, though still present, in those who became blind later in life, and it was absent in sighted subjects. That late-blind individuals show less activity in this region suggests that as we get older, and as brain regions become more experienced, they become less adaptable to change. A point to note, however: fMRI works by correlating increases in blood oxygenation (which suggest an increase in energy demand and therefore neural activity) with a task, such as Braille reading. As any good scientist will tell you, correlation doesn't equal causation! Perhaps those who cannot see are still somehow 'visualising' the characters?
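For readers who like to see the logic spelled out, here is a minimal simulated sketch of the correlational approach described above. It is not a real fMRI analysis (real analyses convolve the task design with a haemodynamic response function and fit a general linear model); the voxel labels, block timings, and signals are all made up for illustration.

```python
import numpy as np

# A voxel is deemed "task-responsive" if its BOLD signal rises and falls
# with the task blocks. Everything below is simulated.
n_scans = 120
task = np.zeros(n_scans)
task[20:40] = task[60:80] = task[100:120] = 1.0   # pretend Braille-reading blocks

rng = np.random.default_rng(1)
responsive = 0.8 * task + rng.standard_normal(n_scans)   # signal that tracks the task
unresponsive = rng.standard_normal(n_scans)              # pure noise

for name, signal in [("simulated 'occipital' voxel", responsive),
                     ("simulated control voxel", unresponsive)]:
    r = np.corrcoef(signal, task)[0, 1]
    print(f"{name}: correlation with task = {r:.2f}")
```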

So is there any other evidence that the visual areas can change their primary function? Researchers have found that temporarily disrupting the neural activity at the back of the brain (using a nifty technique called Transcranial Magnetic Stimulation) can impair Braille reading, or even induce tactile sensations on the reading fingertips (e.g. Kupers et al., 2007; Ptito et al., 2008)!

Other fMRI studies have investigated the recruitment of the occipital lobe in non-visual tasks and found that it also occurs in a variety of other domains, such as hearing (e.g. Burton, 2003) and working memory (Burton, Sinclair, & Dixit, 2010). This reorganisation seems to have a functional benefit: the amount of occipital recruitment during a verbal working memory task correlates with performance (Amedi, Raz, Pianka, Malach, & Zohary, 2003). It has also been reported that blind individuals can perform better on tasks such as sound localisation (though not quite as well as Marvel's Daredevil!) (Nilsson & Schenkman, 2016).

But Is It Reorganisation?

This is a striking example of the brain's ability to change and adapt, and it seems to hold even for areas that appear so devoted to a single modality. How exactly this happens is still unknown, and could fill several reviews on its own! One possibility is that neuronal inputs from other areas grow and invade the occipital lobe, although this is difficult to test non-invasively in humans because we can't look at individual neurons with an MRI scan. The fact that much more occipital lobe activity is seen in early-blind than late-blind individuals (e.g. Burton et al., 2002) suggests that whatever is changing is much more accessible to a developing brain. However, some reorganisation can still occur in late-blind individuals, and even in sighted individuals who undergo prolonged blindfolding or sensory training (Merabet et al., 2008). This rapid adaptation suggests that the mechanism involved may make use of pre-existing multi-sensory connections that multiply and strengthen following sensory deprivation.

Cases of vision restoration in later life are rare, but one such example came from a humanitarian project in India, which found and helped a person called SK (Mandavilli, 2006). SK was born with aphakia, a rare condition in which the eye develops without a lens. He grew up nearly blind, until at the age of 29 project workers gave him corrective lenses. Twenty-nine years with almost no vision! Conventional wisdom said there was no way his visual cortex could have developed properly, having missed the often-cited critical period that occurs during early development. Indeed, his acuity (the ability to see detail, tested with those letter charts at the optometrist's) improved initially after correction but did not improve further over time, suggesting his visual cortex was not adapting to the new input. However, the researchers also looked at other forms of vision, and there they found exciting improvements. For example, when shown a cow, he was unable to integrate the patches of black and white into a whole until it moved. After 18 months, he was able to recognise such objects even without movement. While SK had not been completely without visual input (he had still been able to detect light and movement), this suggests that some parts of the visual cortex may be more susceptible to vision restoration than others. Or perhaps multi-sensory areas, which seem able to reorganise following visual deprivation, are more flexible in regaining vision?

So Much Left to Find Out!

From this whistle-stop tour, the most obvious conclusion is that the brain is amazing and can show huge amounts of plasticity in the face of input deprivation (see the recent report of a boy missing the majority of his visual cortex who can still see well enough to play football and video games: https://tinyurl.com/yboqjzlx). The question of what exactly happens in the brain when it is deprived of visual input is incredibly broad. Why do some people who become blind in later life have visual hallucinations (see Charles Bonnet Syndrome)? Can we influence this plasticity? What of deaf or deaf-blind individuals? As part of my PhD, I am currently investigating how the cortex reacts to another eye-related disease, glaucoma. If you want to read more on this fascinating and broad topic, check out the reviews by Merabet and Pascual-Leone (2010), Ricciardi et al. (2014) or Proulx (2013).

Edited by Chiara & Sam

References:

Amedi, A., Raz, N., Pianka, P., Malach, R., & Zohary, E. (2003). Early ‘visual’ cortex activation correlates with superior verbal memory performance in the blind. Nature Neuroscience, 6(7), 758–766. https://doi.org/10.1038/nn1072

Burton, H. (2003). Visual cortex activity in early and late blind people. The Journal of Neuroscience: The Official Journal of the Society for Neuroscience, 23(10), 4005–4011.

Burton, H., Sinclair, R. J., & Dixit, S. (2010). Working memory for vibrotactile frequencies: Comparison of cortical activity in blind and sighted individuals. Human Brain Mapping. https://doi.org/10.1002/hbm.20966

Burton, H., Snyder, A. Z., Conturo, T. E., Akbudak, E., Ollinger, J. M., & Raichle, M. E. (2002). Adaptive Changes in Early and Late Blind: A fMRI Study of Braille Reading. Journal of Neurophysiology, 87(1), 589–607. https://doi.org/10.1152/jn.00285.2001

Fine, I., Wade, A. R., Brewer, A. A., May, M. G., Goodman, D. F., Boynton, G. M., … MacLeod, D. I. A. (2003). Long-term deprivation affects visual perception and cortex. Nature Neuroscience, 6(9), 915–916. https://doi.org/10.1038/nn1102

Hamilton, R., Keenan, J. P., Catala, M., & Pascual-Leone, A. (2000). Alexia for Braille following bilateral occipital stroke in an early blind woman. Neuroreport, 11(2), 237–240.

Kupers, R., Pappens, M., de Noordhout, A. M., Schoenen, J., Ptito, M., & Fumal, A. (2007). rTMS of the occipital cortex abolishes Braille reading and repetition priming in blind subjects. Neurology, 68(9), 691–693. https://doi.org/10.1212/01.wnl.0000255958.60530.11

Mandavilli, A. (2006). Look and learn: Visual neuroscience. Nature, 441(7091), 271–272. https://doi.org/10.1038/441271a

Merabet, L. B., Hamilton, R., Schlaug, G., Swisher, J. D., Kiriakopoulos, E. T., Pitskel, N. B., … Pascual-Leone, A. (2008). Rapid and Reversible Recruitment of Early Visual Cortex for Touch. PLoS ONE, 3(8), e3046. https://doi.org/10.1371/journal.pone.0003046

Merabet, L. B., & Pascual-Leone, A. (2010). Neural reorganization following sensory loss: the opportunity of change. Nature Reviews Neuroscience, 11(1), 44–52. https://doi.org/10.1038/nrn2758

Nilsson, M. E., & Schenkman, B. N. (2016). Blind people are more sensitive than sighted people to binaural sound-location cues, particularly inter-aural level differences. Hearing Research, 332, 223–232. https://doi.org/10.1016/j.heares.2015.09.012

Park, H.-J., Lee, J. D., Kim, E. Y., Park, B., Oh, M.-K., Lee, S., & Kim, J.-J. (2009). Morphological alterations in the congenital blind based on the analysis of cortical thickness and surface area. NeuroImage, 47(1), 98–106. https://doi.org/10.1016/j.neuroimage.2009.03.076

Polimeni, J. R., Fischl, B., Greve, D. N., & Wald, L. L. (2010). Laminar analysis of 7T BOLD using an imposed spatial activation pattern in human V1. NeuroImage, 52(4), 1334–1346. https://doi.org/10.1016/j.neuroimage.2010.05.005

Proulx, M. (2013, February). Blindness: remapping the brain and the restoration of vision. Retrieved 28 March 2018, from http://www.apa.org/science/about/psa/2013/02/blindness.aspx

Ptito, M., Fumal, A., de Noordhout, A. M., Schoenen, J., Gjedde, A., & Kupers, R. (2008). TMS of the occipital cortex induces tactile sensations in the fingers of blind Braille readers. Experimental Brain Research, 184(2), 193–200. https://doi.org/10.1007/s00221-007-1091-0

Ricciardi, E., Bonino, D., Pellegrini, S., & Pietrini, P. (2014). Mind the blind brain to understand the sighted one! Is there a supramodal cortical functional architecture? Neuroscience & Biobehavioral Reviews, 41, 64–77. https://doi.org/10.1016/j.neubiorev.2013.10.006

Rumsey, J. M., Horwitz, B., Donohue, B. C., Nace, K., Maisog, J. M., & Andreason, P. (1997). Phonological and orthographic components of word recognition. A PET-rCBF study. Brain: A Journal of Neurology, 120 ( Pt 5), 739–759.

Van Essen, D. C., Anderson, C. H., & Felleman, D. J. (1992). Information processing in the primate visual system: an integrated systems perspective. Science (New York, N.Y.), 255(5043), 419–423.

It’s Fright Night!

By Lucy Lewis

Halloween is the time for all things scary, and the best thing for scaring people is to put on a good horror movie, preferably something with creepy dark figures appearing suddenly after a long, tense build-up. But what is it about these films that gets us so freaked out?

Fear is an integral part of our survival mechanisms: it is part of the 'fight or flight' response that prepares us to combat or escape from a potential threat. Deep down, you know the demons from Insidious aren't real, and the clown from It isn't going to appear behind the sofa, and yet we jump at their appearance on screen, hide behind pillows when the music turns tense, and even continue thinking about them once the film is finished.


Scene from ‘A Nightmare on Elm Street’ (1984).

This is because horror movies tap into this innate survival mechanism by presenting us with a situation or character that looks like a threat. When we see something potentially threatening, it activates a structure in the brain known as the amygdala, which flags the threat as something to be fearful of (Adolphs, 2013). The amygdala acts as a messenger, informing several other areas of the brain that a threat is present. This results in multiple physiological changes, including the release of adrenaline (Steimer, 2002), which causes the typical bodily changes we experience during fear, such as an increased heart rate.

Even though we know the movie isn't real, the amygdala overrides our logical reasoning to produce these responses, a phenomenon known as the "amygdala hijack" (Goleman, 1995). This is when the amygdala triggers these physiological changes to prepare us against the fearful threat before the prefrontal cortex, responsible for executive functions, can assess the situation and regulate our reactions accordingly (Steimer, 2002).

In fact, studies of fear conditioning, in which an association is formed between a neutral stimulus (like a clicking sound) and an aversive stimulus (like an electric shock) (Gilmartin et al., 2015), have shown that inactivating the medial prefrontal cortex impairs the ability to break this association. This suggests that the prefrontal cortex is necessary for the extinction of a fear memory (Marek et al., 2013).
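To make the idea of acquiring and then extinguishing a fear association more concrete, here is a tiny sketch using the classic Rescorla-Wagner learning rule. This model is not taken from the studies cited above; it is a standard textbook way of illustrating how a tone-shock association could strengthen during conditioning and fade during extinction, with all parameter values chosen arbitrarily.

```python
# A minimal Rescorla-Wagner sketch of fear acquisition and extinction.
# The association strengthens while the shock follows the tone, and
# gradually fades once the shock is omitted (extinction).

def rescorla_wagner(n_acquisition=10, n_extinction=10, learning_rate=0.3):
    v = 0.0                      # associative strength of the tone (CS)
    history = []
    for trial in range(n_acquisition + n_extinction):
        shock = 1.0 if trial < n_acquisition else 0.0   # shock present, then omitted
        v += learning_rate * (shock - v)                # update via prediction error
        history.append(v)
    return history

for trial, v in enumerate(rescorla_wagner(), start=1):
    phase = "conditioning" if trial <= 10 else "extinction"
    print(f"trial {trial:2d} ({phase}): fear association = {v:.2f}")
```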


A schematic diagram of the fear pathway, adapted from a VideoBlocks image.

You've probably also noticed that you can remember something that scared you far better than you can remember what you had for breakfast. Forming memories of the events that cause us to feel fear is important for our survival, because it helps us avoid threats in the future. Evidence suggests that the stress hormones released following exposure to fearful events contribute to the consolidation of memories. For example, Okuda et al (2004) gave rats an injection of corticosterone (the major stress hormone in rodents) immediately after a single training session of a task requiring the animals to remember objects presented to them in an arena. They showed that, 24 hours later, the animals that received this injection recognised these objects significantly better than controls, suggesting that this stress hormone increased their ability to remember the objects around them. Therefore, the scarier the film, the better we remember it.

So, don't worry if you are a big scaredy-cat when it comes to horror films; it's completely natural! It's just your brain's response to a threatening situation, and you can't control it no matter how hard you try. But if you are planning a film night full of ghosts and ghouls this Halloween, go and get yourself a big pillow to hide behind and prepare for that good old amygdala to get into action!

Edited By Lauren & Sophie

References

Adolphs, R. 2013. The Biology of Fear. Current Biology, 23:R79-R93.

Gilmartin, M. R., Balderston, N. L., Helmstetter, F. J. 2015. Prefrontal cortical regulation of fear learning. Trends in Neuroscience, 37:455-464.

Goleman, D. 1995. Emotional Intelligence: Why It Can Matter More Than IQ. New York: Bantam Books.

Marek, R., Strobel, C., Bredy, T. W., Sah, P. 2013. The amygdala and medial prefrontal cortex: partners in the fear circuit. Journal of Physiology, 591: 2381-2391.

McIntyre, C. K. and Roozendaal, B. 2007. Adrenal Stress Hormones and Enhanced Memory for Emotionally Arousing Experiences. In: Bermudez-Rattoni, F. Neural Plasticity and Memory: From Genes to Brain Imaging. Boca Raton (FL):CRC Press/Taylor & Francis, Chapter 13.

Okuda, S., Roozendaal, B., McGaugh, J. L. 2004. Glucocorticoid effects on object recognition memory require training-associated emotional arousal. Proceedings of the National Academy of Science, USA, 101:853-858.

Steimer, T. 2002. The biology of fear- and anxiety-related behaviors. Dialogues in Clinical Neuroscience, 4:231-249.

VideoBlocks, AeMaster still photo from: 3D Brain Rotation with alpha. https://www.videoblocks.com/video/3d-brain-rotation-with-alpha-4u23dfufgik3slexf

Square eyes and a fried brain, or a secret cognitive enhancer: how do video games affect our brain?

By Shireene Kalbassi

If, like me, you spent your childhood surrounded by Gameboys and computer games, you have probably heard warnings from your parents that your eyes will turn square, and that your brain will turn to mush. While we can safely say that we are not suffering from an epidemic of square-eyed youths, it is less clear what gaming is doing to our brain.

In support of worried parents all around the world, there is a disorder associated with gaming. Internet gaming disorder is defined as an addictive behaviour, characterised by an uncontrollable urge to play video games. In 2013, internet gaming disorder was added to the Diagnostic and Statistical Manual of Mental Disorders (DSM), with a footnote saying that more research on the matter is needed[i]. Similarly, in 2018, the World Health Organization (WHO) included internet gaming disorder in the section 'disorders due to addictive behaviours'[ii].

There is evidence to suggest that internet gaming does lead to changes in brain regions associated with addiction. Structurally, individuals diagnosed with internet gaming disorder show an increase in the size of a brain region known as the striatum, a region associated with pleasure, motivation, and drug addiction (Cai et al 2016[iii], Robbins et al 2002[iv]). The brains of those with internet gaming disorder also show altered responses to stimuli related to gaming. In one study, two groups of participants were assessed: one with internet gaming addiction and one without. All the participants with internet gaming disorder were addicted to the popular multiplayer online role-playing game World of Warcraft. The participants were shown a mixture of visual cues, some associated with World of Warcraft and others neutral. While the participants viewed the cues, their brain activity was measured using fMRI. When shown cues relating to gaming, the participants with internet gaming disorder showed increased activation of brain regions associated with drug addiction, including the striatum and the prefrontal cortex. The activation of these brain regions was positively correlated with self-reported 'craving' for these games; the higher the craving for the game, the higher the level of activation (Ko et al 2009[v]). These studies, among others, suggest that gaming may deserve a place on the list of non-substance-related addictive disorders.

But don't uninstall your games yet; it is important to note that not everyone who plays computer games will become addicted. And what if there is a brighter side to gaming? What if all those hours of grinding away on World of Warcraft, thrashing your friends on Mario Kart, or chilling on Minecraft might actually benefit you in some way? There is a small but growing body of research suggesting that gaming might be good for your brain.

What we have learnt about how the brain responds to the real world is now being applied to how the brain responds to the virtual world. In the famous work of Maguire et al (2000[vi]), it was demonstrated that London taxi drivers show an increased volume of the hippocampus, a region associated with spatial navigation and awareness. This increased volume was attributed to the acquisition of a spatial representation of London. Following on from this, some researchers asked how navigating a virtual world might affect the hippocampus.

In one of these studies, the researchers investigated how playing Super Mario 64, a game in which you spend a large amount of time running and jumping around a virtual world (sometimes on a giant lizard), affects the hippocampus. Compared to a group that did not train on Super Mario 64, the group that trained on the game for two months showed increased volumes of the hippocampus and the prefrontal cortex. As reduced volumes of the hippocampus and the prefrontal cortex are associated with disorders such as post-traumatic stress disorder, schizophrenia, and neurodegenerative diseases, the researchers speculate that video game training may have a future in their treatment (Kühn et al 2014[vii]). In another study, the impact of training on Super Mario 64 was assessed in older adults, who are particularly at risk of hippocampus-related pathology. The group that trained on Super Mario 64 for six months showed increased hippocampal volume and improved memory performance compared to participants who did not train on the game (West et al 2017[viii]). So it appears that navigating virtual worlds, as well as the real world, may increase hippocampal volume and may have positive effects on cognition.


A screenshot of Super Mario 64. This game involves exploration of a virtual world. Image taken from Kühn et al 2014[vii].

Maybe it makes sense that the world being explored doesn't have to be real to have an effect on the hippocampus, and games like Super Mario 64 have plenty to offer in terms of world exploration and navigation. But what about the most notorious of games, the first-person shooter action games? It has been suggested that first-person shooter games can lead to increased aggressive behaviour in those who play them, although researchers do not agree on whether this effect exists (Markey et al 2015[ix]; Greitemeyer et al 2014[x]). Nevertheless, can these action games also have more positive effects on the brain's cognitive abilities? Unlike Super Mario 64, these games require the player to respond quickly to stimuli and rapidly switch between different weapons and devices depending on the scenario. Some researchers have investigated how playing action games, such as Call of Duty, Red Dead Redemption, or Counterstrike, affects short-term memory. Participants who either did not play action games, casually played action games, or were experienced action gamers were tested on numerous visual attention tasks, involving recall and identification of cues flashed briefly on a screen. The researchers observed that those who played action games showed significantly better encoding of visual information into short-term memory, dependent on their gaming experience, compared to those who did not (Wilms et al 2013[xi]).

In another study, the impact of playing action games on working memory was assessed. Working memory is a cognitive system involved in the active processing of information, unlike short-term memory, which involves the recall of information following a short delay (Baddeley et al 2003[xii]). In this study, the researchers tested groups of participants who either did or did not play action games. They tested the participants' working memory using a cognitive test known as the "n-back test". This test involves watching a sequence of squares displayed on a screen in changing positions. As the test progresses, the participants have to remember the positions of the squares from the previous trials whilst memorising the squares being shown to them at that moment (see the sketch below). The researchers observed that people who played action games outperformed those who did not on this test; they were better able to remember the previous trials whilst simultaneously memorising the current ones (Colzato et al 2013[xiii]). From these studies, it appears that action games may benefit players' cognitive abilities, improving the short-term processing of information in those who play them.
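As a rough illustration of what an n-back task involves, the sketch below generates a 2-back sequence of square positions and scores a set of 'match' responses. The number of positions, trial counts, and response rule are assumptions for illustration, not the exact design used by Colzato and colleagues.

```python
import random

def make_nback_sequence(n_trials=20, n_positions=8, n=2, p_match=0.3, seed=0):
    """Generate square positions, occasionally repeating the position from n trials back."""
    rng = random.Random(seed)
    seq = []
    for i in range(n_trials):
        if i >= n and rng.random() < p_match:
            seq.append(seq[i - n])                  # deliberate n-back match
        else:
            seq.append(rng.randrange(n_positions))  # otherwise any position
    return seq

def score_responses(seq, responses, n=2):
    """responses[i] is True if the participant pressed 'match' on trial i."""
    hits = false_alarms = 0
    for i, pressed in enumerate(responses):
        is_match = i >= n and seq[i] == seq[i - n]
        if pressed and is_match:
            hits += 1
        elif pressed and not is_match:
            false_alarms += 1
    return hits, false_alarms

positions = make_nback_sequence()
perfect = [i >= 2 and positions[i] == positions[i - 2] for i in range(len(positions))]
print("hits and false alarms for a perfect player:", score_responses(positions, perfect))
```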

Screenshots from the first-person shooter games Call of Duty: WW2 (left) and Halo (right). These fast-paced games involve quickly reacting to stimuli and making rapid decisions to bypass enemies and progress through the game.

So, for the worried parents and for those who enjoy indulging in video games, maybe it's not all bad. As long as you are not suffering from a form of gaming addiction (and if you think you might be, please see a health professional), all those hours of gaming may not be as bad for your brain as they might seem. But ultimately, much more research is needed to understand how a broader range of games, played across childhood development and over periods of years and decades, affects our brains and mental health.

If you think you may be suffering from a gaming addiction, see the NHS page for more information.

Edited by Lauren & Monika

References:

[i] American Psychiatric Association, 2013. Diagnostic and statistical manual of mental disorders (DSM-5®). American Psychiatric Pub

[ii] World Health Organization [WHO]. 2018a. ICD-11 beta draft – Mortality and morbidity statistics. Mental, behavioural or neurodevelopmental disorders.

[iii] Cai, C., Yuan, K., Yin, J., Feng, D., Bi, Y., Li, Y., Yu, D., Jin, C., Qin, W. and Tian, J., 2016. Striatum morphometry is associated with cognitive control deficits and symptom severity in internet gaming disorder. Brain Imaging and Behavior, 10(1), pp.12-20.

[iv] Robbins, T.W. and Everitt, B.J., 2002. Limbic-striatal memory systems and drug addiction. Neurobiology of Learning and Memory, 78(3), pp.625-636.

[v] Ko, C.H., Liu, G.C., Hsiao, S., Yen, J.Y., Yang, M.J., Lin, W.C., Yen, C.F. and Chen, C.S., 2009. Brain activities associated with gaming urge of online gaming addiction. Journal of Psychiatric Research, 43(7), pp.739-747.

[vi] Maguire, E.A., Gadian, D.G., Johnsrude, I.S., Good, C.D., Ashburner, J., Frackowiak, R.S. and Frith, C.D., 2000. Navigation-related structural change in the hippocampi of taxi drivers. Proceedings of the National Academy of Sciences, 97(8), pp.4398-4403.

[vii] Kühn, S., Gleich, T., Lorenz, R.C., Lindenberger, U. and Gallinat, J., 2014. Playing Super Mario induces structural brain plasticity: gray matter changes resulting from training with a commercial video game. Molecular Psychiatry, 19(2), p.265.

[viii] West, G.L., Zendel, B.R., Konishi, K., Benady-Chorney, J., Bohbot, V.D., Peretz, I. and Belleville, S., 2017. Playing Super Mario 64 increases hippocampal grey matter in older adults. PloS One, 12(12), p.e0187779.

[ix] Markey, P.M., Markey, C.N. and French, J.E., 2015. Violent video games and real-world violence: Rhetoric versus data. Psychology of Popular Media Culture, 4(4), p.277.

[x] Greitemeyer, T. and Mügge, D.O., 2014. Video games do affect social outcomes: A meta-analytic review of the effects of violent and prosocial video game play. Personality and Social Psychology Bulletin, 40(5), pp.578-589.

[xi] Wilms, I.L., Petersen, A. and Vangkilde, S., 2013. Intensive video gaming improves encoding speed to visual short-term memory in young male adults. Acta Psychologica, 142(1), pp.108-118.

[xii] Baddeley, A., 2003. Working memory: looking back and looking forward. Nature Reviews Neuroscience, 4(10), p.829.

[xiii] Colzato, L.S., van den Wildenberg, W.P., Zmigrod, S. and Hommel, B., 2013. Action video gaming and cognitive control: playing first person shooter games is associated with improvement in working memory but not action inhibition. Psychological Research, 77(2), pp.234-239.

Can’t hear someone at a loud party? Just turn your head!

By Josh Stevenson-Hoare

At one point or another, we have all been in this uncomfortable situation. You are talking to someone at a party, and then you have the realisation: you can't understand a word they are saying. You smile and nod along until they get to what sounds like a question. You ask them to repeat it, and you still can't hear what they say. So you ask again. We all know how it goes from there, and how awkward it gets by the fifth time of asking.

Fortunately, there is a way to avoid this happening. Researchers at Cardiff University have developed a way to maximise your chances of hearing someone at a party on the first try. You don’t need fancy equipment, or to start learning sign language. All you have to do is turn your head. 

If you are trying to listen to something difficult to hear, it makes sense to turn your head 90 degrees from it. This is the same angle as when someone is whispering in your ear, or when you only have one working earbud. If you do this, then one ear will get most of the useful sounds, and the other ear will only get un-useful sounds. You can then ignore any sounds from the un-useful ear, and focus on the sounds you want – the words.

However, for speech, the sounds made by the person speaking are not the only useful source of information. Lip-reading is something we can all do a little, without training. You may not be able to follow a whole conversation, but you can tell the difference between small sounds. These are called phonemes.

A phoneme is the smallest bit of sound that has meaning. It's the difference between bog and bag, or tend and mend. If you miss part of a word, you often use the shape of the speaker's mouth to help you figure out what the word was. This can lead to some interesting illusions, such as the McGurk effect, where the same sound paired with different mouth shapes causes you to hear different syllables.

If you want to read someone’s lips while they talk, you have to be able to see their face. Unless you have a very wide field of vision, the best way to lip-read is to face someone.

So, to improve hearing you should face away from someone, and for best lip-reading you should face towards them. Not the easiest things to do at the same time.

To find out which is more useful, the researchers played participants video clips of Barack Obama speaking. Over this they played white noise, which sounds like a radio tuned between stations. The white noise was then made louder and quieter until participants could only just hear what Obama was saying.
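The study's exact procedure isn't spelled out here, but a common way to find such a "just audible" level is an adaptive staircase: make the noise louder after a correct report and quieter after a mistake, then average the turning points. The sketch below illustrates that general idea with a made-up listener whose threshold is known in advance; none of the numbers come from the study.

```python
import random

def staircase(respond, start_db=50.0, step_db=2.0, n_trials=40):
    """respond(noise_db) -> True if the listener repeated the speech correctly."""
    level, last_correct, reversals = start_db, None, []
    for _ in range(n_trials):
        correct = respond(level)
        if last_correct is not None and correct != last_correct:
            reversals.append(level)                    # direction changed here
        level += step_db if correct else -step_db      # harder after success
        last_correct = correct
    tail = reversals[-6:] or [level]                   # average the last reversal points
    return sum(tail) / len(tail)

# Toy listener whose true threshold is 62 dB of masking noise:
rng = random.Random(0)
def fake_listener(noise_db):
    p_correct = 1 / (1 + 10 ** ((noise_db - 62) / 6))  # harder as noise rises
    return rng.random() < p_correct

print(f"estimated threshold: about {staircase(fake_listener):.1f} dB of noise")
```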

Participants found it easier to understand the speech when they pointed their heads away from the speaker. The best angle, though, changed depending on where the white noise was coming from.

You might expect that pointing away from the white noise would be the best strategy. But the researchers found that the best strategy was to point around halfway between the speech and the white noise. Not so far that participants couldn't see Obama's face, but far enough to have one ear pointed at him.

The best angle to face when trying to hear someone in a loud environment. Picture Credit: Josh Stevenson-Hoare.  

In real life, noise comes from lots of different angles at once. It would be very difficult to work out the best angle all the time. But if there is one major source of noise, such as a pneumatic drill or a loudspeaker, it can be useful to remember this technique.

It might look a little silly at first, but it could save you from an embarrassing faux pas next time you go to a party. So now you can hear all the details of Uncle Jimmy’s fishing trip with ease. For the third time this year. You’re welcome.

Edited by Lucy & Sophie

Microglia: Guardians of the Brain

By Ruth Jones

Our immune system works hard to protect us from unwanted bugs and to help repair damage to our bodies. In our brains, cells called microglia are the main immune players. Microglia belong to a group of immune cells called macrophages, which translates from Greek as "big eaters". Much like students faced with free food, macrophages will eat just about anything, or at least anything out of the ordinary. Whether it's cell debris, dying cells or an unknown entity, macrophages will eat and digest it. Microglia were first described in the 1920s by W. Ford Robertson and Pio del Rio-Hortega, who saw that these unidentified cells seemed to act as a rubbish disposal system. Research into these mysterious cells was side-lined during WWII, but was eventually picked up again in the early 1990s.

When I think about how microglia exist in the brain, I am always reminded of the blockade scene in Marvel’s Guardians of the Galaxy. Each “ship” or cell body stays in its own territory while its branches extend out to touch other cells and the surrounding environment. The microglia ships can then use their branches to touch other cells or secrete chemicals to communicate, together monitoring the whole brain.

Nova Core Blockade Scene © Guardians of the Galaxy. (2014). [DVD] Directed by J. Gunn. USA: Marvel Studios.

Microglia have evolved into brilliant multi-taskers. They squirt out chemicals called chemokines that affect the brain’s environment by quickly promoting inflammation to help remove damaged cells, before dampening inflammation to protect the brain from further damage. They also encourage the growth of new nerve cells in the brain and can remove old “unused” connections between neurones, just like clearing out your Facebook friend list. Microglia keep the brain functioning smoothly by taking care of and cleaning up after the other cells day in, day out.

Today, microglia are a trending topic, believed to play a part in multiple brain diseases. Genetic studies have linked microglia to both neuropsychiatric disorders like autism and neurodegenerative diseases such as Alzheimer’s disease.

Do microglia help or hurt in Alzheimer's disease? This is a complicated question: scientists have found that microglia can both speed up and slow down Alzheimer's disease. For a while, scientists have thought that in disease these microglia "ships" are often destroyed, or become too aggressive in their efforts to remove sticky clumps of amyloid-β protein (a big problem in Alzheimer's disease).

What is going on in the Alzheimer's disease brain? Recent advances in technology mean scientists are now starting to find more answers. One problem when working with microglia is that they have a whole range of personalities, resulting in a spectrum of protein expression and behaviour. When you look only at the microglial population as a whole, you may therefore miss smaller, subtle differences between cells. It is likely that microglia don't start out as "aggressive" or harmful, so how can we see what causes their behaviour to change?

Luckily, a new approach called "single-cell sequencing" can overcome this. An individual cell is isolated so that its mRNA, the instructions for making proteins, can be extracted and measured. This means you can compare variation across the entire microglial population, which could make it easier to find a "trouble-maker" microglial cell in disease. The same approach could also be used to see how individual cell types change across the course of Alzheimer's disease.
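As a loose illustration of why single-cell data helps, the sketch below simulates a cells-by-genes expression matrix and flags the cells whose expression profiles sit furthest from the population average. Real single-cell pipelines involve careful normalisation, dimensionality reduction, and clustering; the matrix sizes, counts, and outlier rule here are invented purely to show the idea of spotting unusual cells that a population average would hide.

```python
import numpy as np

# Simulate 200 microglia measured on 100 genes, with 5 unusual cells mixed in.
rng = np.random.default_rng(0)
expression = rng.poisson(5, size=(200, 100)).astype(float)
expression[:5] += rng.poisson(20, size=(5, 100))   # 5 "trouble-maker" cells

# z-score each gene across cells, then score each cell by how deviant it is overall.
z = (expression - expression.mean(axis=0)) / expression.std(axis=0)
cell_score = np.abs(z).mean(axis=1)

most_unusual = np.argsort(cell_score)[-5:]
print("most unusual cells:", sorted(most_unusual.tolist()))
```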

In the future, by looking in detail at how these individual microglial "guardians" behave, scientists can hopefully begin to unravel some of the mysteries surrounding these fascinating and hugely important cells, in health and across all stages of disease.

Edited by Sam & Chiara

My reflections on studying Parkinson’s Disease

By Dr. Lucia Fernandez Cardo 

Parkinson’s disease (PD) owes its name to Doctor James Parkinson, who in 1817 described the disorder in his manuscript “An essay on the shaking palsy”. It has been 200 years since we began to study this disease, and despite the advances in understanding, we are still far from finding a cure.

PD is the second most common neurodegenerative disorder, affecting 1-2% of the worldwide population. Its pathological hallmarks are the loss of dopaminergic neurons in a very small part of the brain called the substantia nigra, and the presence of protein deposits called Lewy bodies [1]. The loss of dopamine leads to a number of motor symptoms: bradykinesia (slowness of movement), rigidity, resting tremor, and postural instability. For a clinical diagnosis of PD, the patient must present bradykinesia plus one of the other three signs. Along with these motor signs, non-motor symptoms are often present, among them anxiety, depression, dementia, sleep disturbances, intestinal problems, and hallucinations [2].

PD can be due to both genetic and environmental risk factors. Around 10-15% of cases are described as familial PD with a clear genetic origin (mutations in the SNCA, LRRK2 or Parkin genes are the main cause); the remaining cases are considered 'sporadic' or 'idiopathic' PD and are thought to arise from a combination of multiple genetic risk factors (risk polymorphisms) and environmental ones (toxins, exposure to pesticides, side effects of drugs, brain lesions, etc.) [3-5].

In my experience, researching PD is challenging, but also motivating and rewarding. Every 11th of April (James Parkinson's birthday), the regional and national associations of PD patients and their relatives have a day of celebration. They obviously do not celebrate having the disease; they celebrate being together in this battle and never giving up. On this day they put aside the pain and the struggle, and celebrate that they are alive by gathering together, laughing, eating, and dancing.

When I was doing my PhD, I was lucky to be invited to this celebration as part of a group of scientists working on PD. Our group studies the molecular genetics of PD, amongst other disorders, and my thesis focused on genetic risk variants in sporadic patients. I will never forget how nervous I was, having to deliver a very brief talk explaining the genetic component of PD and our projects at the time. Some years later, though they may not remember me, I still clearly remember the smiling faces in the audience and the cheering and kind words after I finished. Perhaps they did not grasp the most technical concepts, but for them, simply knowing that there were people researching their disease, working to understand its mechanisms, and fighting to find a cure was more than enough to garner many thanks and smiles.

Sadly, there is not yet a cure for PD, but medication, surgery, and physical therapy can provide relief to patients and improve their symptoms. The most common treatments (e.g. levodopa, dopamine agonists, MAO-B inhibitors) all act to restore dopamine levels in the brain [6]. Levodopa is usually the most successful treatment, but side effects, the appearance of dyskinesia (involuntary movements), and fluctuations in effectiveness can be an issue with long-term use. Some patients are candidates for a very successful surgical treatment called deep brain stimulation (DBS), which involves implanting a neurostimulator, usually in the upper chest, and a set of electrodes in specific parts of the brain. The electrical pulses stop the overexcitation in the brain, and the reduction in motor symptoms can be astonishing. Follow the links below to see some videos of the effects.

https://www.youtube.com/watch?v=wZZ4Vf3HinA https://www.youtube.com/watch?v=Uohp7luuwJI

Despite these treatments, patients can struggle daily with finding the right drug combination, with the 'on' and 'off' phases of the medication, or with the practicalities of carrying an internal battery that controls the electrodes in their brain, and this is without even mentioning the possible non-motor symptoms of the disease. After spending a whole day with them, I felt overwhelmed by their energy and good sense of humour, and I definitely came away seeing things from a different perspective.

Finally, I just want to say that getting the chance to meet real people who have the disorder can be so important for us scientists. It reminds us why we dedicate so much of our lives to researching a disease. It is why, during the hardest moments of failed experiments, struggling with new techniques, and so many extra hours of work, we keep going and do not give up. We do it so that we can see patients smile and keep up their hope that we will one day find a cure for a disease as debilitating as PD.

Edited by Sam & Chiara

References:

[1] Spillantini MG, Schmidt ML, Lee VM, Trojanowski JQ, Jakes R and Goedert M. Alpha-synuclein in Lewy bodies. Nature. 1997 Aug 28; 388(6645):839-40.

[2] Postuma RB, Berg D, Stern M, Poewe W, Olanow CW, Oertel W, Obeso J, Marek K, Litvan I, Lang AE, Halliday G, Goetz CG, Gasser T, Dubois B, Chan P, Bloem BR, Adler CH and Deuschl G. MDS clinical diagnostic criteria for Parkinson’s disease. Mov Disord. 2015 Oct; 30(12):1591-601.

[3] Lee A and Gilbert RM. Epidemiology of Parkinson Disease. Neurol Clin. 2016 Nov; 34(4):955-965.

[4] Billingsley KJ, Bandres-Ciga S, Saez-Atienzar S, Singleton AB. Genetic risk factors in Parkinson’s disease. Cell Tissue Res. 2018 Mar 13. [Epub ahead of print].

[5] Fleming SM. Mechanisms of Gene-Environment Interactions in Parkinson’s Disease. Curr Environ Health Rep. 2017 Jun; 4(2):192-199.

[6] Fox SH, Katzenschlager R, Lim SY, Barton B, de Bie RMA, Seppi K, Coelho M and Sampaio C. International Parkinson and movement disorder society evidence-based medicine review: Update on treatments for the motor symptoms of Parkinson’s disease. Mov Disord. 2018 Mar 23. [Epub ahead of print].

If you are interested in learning a little bit more about PD and helping us to demystify this disease, here is the link to some useful websites:

https://www.parkinsons.org.uk/

http://www.parkinson.org/understanding-parkinsons/

https://www.michaeljfox.org/understanding-parkinsons/index.html?navid=understanding-pd

The healing power of companionship

By Shireene Kalbassi

When it comes to the recovery of wounds and other medical conditions, most people probably think of hospital beds, antibiotics, and maybe some stitches. What probably doesn’t come to mind is the role that companionship may play in speeding up the healing process.

And yet, studies in humans have shown a link between increased companionship and enhanced recovery prospects (Bae et al 2001, Boden-Albala et al 2005).

So why should social interaction influence the recovery process? Well, in social species, social interaction leads to the release of the hormone oxytocin, AKA the "love hormone", which is released from the pituitary gland in the brain. Increased levels of oxytocin have been associated with lower levels of stress response hormones, such as cortisol and corticosterone, and high levels of these stress hormones have been shown to impair healing (Padgett et al 1998, DeVries et al 2002, Heinrichs et al 2003, Ebrecht et al 2004).

This link between social interaction, oxytocin, stress hormones, and recovery has been explored in studies such as the work of Detillion et al (2004). Here, the authors investigated how companionship affects wound healing in stressed and non-stressed hamsters. The role of companionship was explored by comparing socially isolated hamsters to 'socially housed' hamsters, which shared a home environment with another hamster. Stressed hamsters were physically restrained to induce a stress response, while non-stressed hamsters did not undergo physical restraint.

In order to understand how these factors relate, the authors therefore compared four different groups: hamsters that were socially isolated and stressed, hamsters that were socially housed and stressed, hamsters that were socially isolated and non-stressed, and hamsters that were socially housed and non-stressed.

The hamsters that were socially isolated and stressed showed decreased wound healing and increased cortisol levels when compared to socially housed hamsters or non-stressed socially isolated hamsters. Furthermore, when an oxytocin blocker was given to socially housed hamsters, decreased wound healing was observed, while supplementing stressed hamsters with oxytocin led to increased wound healing and lower levels of cortisol.

So it seems that when social animals interact, oxytocin is released, which reduces the levels of stress hormones and in turn promotes wound healing.

But what if there is more to the story than this? These studies, and others like them, demonstrate a relationship between companionship and wound healing, but how might the nature of the social interaction itself affect recovery?

Venna et al (2014) explored the recovery of mice that were given a brain occlusion, in which part of the blood supply to the brain is shut off to replicate the damage seen in stroke. In this study, the mice were either socially isolated, paired with another stroke mouse, or paired with a healthy partner. When assessing recovery, the authors looked at multiple measures, including death rates, recovery of movement, and new neuron growth. The authors observed that, as expected, socially isolated stroke mice showed the lowest rates of recovery. Interestingly, stroke mice that were housed with other stroke mice also showed decreased recovery when compared to stroke mice housed with a healthy partner.

So why should the health status of the partner influence the healing process of the mice? The work of Venna et al did not assess whether stroke mice housed with another stroke mouse received as much social contact as stroke mice housed with a healthy partner, which might explain the difference between the two groups. Exploring this would help establish whether the quantity of social interaction drives the reduced recovery in stroke mice housed with other stroke mice, or whether other factors are responsible.

Regardless, it appears that social interaction may not be a simple box to tick when it comes to enhancing the recovery process but is instead dynamic in nature. And while nothing can replace proper medical care and attention, companionship may have a role in speeding up the recovery process.  

If you want to know more about the use of animals in research, please click here.

Edited By Sophie & Monika

References:

Bae, S.C., Hashimoto, H.I.D.E.K.I., Karlson, E.W., Liang, M.H. and Daltroy, L.H., 2001. Variable effects of social support by race, economic status, and disease activity in systemic lupus erythematosus. The Journal of Rheumatology, 28(6), pp.1245-125

Boden-Albala, B., Litwak, E., Elkind, M.S.V., Rundek, T. and Sacco, R.L., 2005. Social isolation and outcomes post stroke. Neurology, 64(11), pp.1888-1892

Padgett, D.A., Marucha, P.T. and Sheridan, J.F., 1998. Restraint stress slows cutaneous wound healing in mice. Brain, behavior, and immunity, 12(1), pp.64-73.

DeVries, A.C., 2002. Interaction among social environment, the hypothalamic–pituitary–adrenal axis, and behavior. Hormones and Behavior, 41(4), pp.405-413.

Heinrichs, M., Baumgartner, T., Kirschbaum, C. and Ehlert, U., 2003. Social support and oxytocin interact to suppress cortisol and subjective responses to psychosocial stress. Biological psychiatry, 54(12), pp.1389-1398.

Ebrecht, M., Hextall, J., Kirtley, L.G., Taylor, A., Dyson, M. and Weinman, J., 2004. Perceived stress and cortisol levels predict speed of wound healing in healthy male adults. Psychoneuroendocrinology, 29(6), pp.798-809.

Detillion, C.E., Craft, T.K., Glasper, E.R., Prendergast, B.J. and DeVries, A.C., 2004. Social facilitation of wound healing. Psychoneuroendocrinology, 29(8), pp.1004-1011.

Glasper, E.R. and DeVries, A.C., 2005. Social structure influences effects of pair-housing on wound healing. Brain, behavior, and immunity, 19(1), pp.61-68

Venna, V.R., Xu, Y., Doran, S.J., Patrizz, A. and McCullough, L.D., 2014. Social interaction plays a critical role in neurogenesis and recovery after stroke. Translational psychiatry, 4(1), p.e351

Can we solve problems in our sleep?

By Sam Berry

Have you heard the song “Scrambled Eggs”? You know:

“Scrambled eggs. Oh my baby how I love your legs.”

No? Perhaps you would recognize the tune.

A young Paul McCartney woke up one morning with an amazing melody in his head. He sat at the piano by his bed and played it out, and he liked it so much he couldn’t quite believe it had come to him in a dream. The tune was there, but he just couldn’t find the right words to fit. For several months he tried, but he couldn’t get past “Scrambled Eggs” as a working title.

So how did the famous Beatle complete his masterpiece? He did some more sleeping. Another fateful day, he woke up and the song was there, fully formed with lyrics and the now famous title “Yesterday.”

“Yesterday, all my troubles seemed so far away.”

Recognise it now? A critically acclaimed worldwide hit had formed itself in his sleep. Boom. A chart-smashing phenomenon.

—— —– —– —– —— ——

It may seem obvious, but not sleeping is extremely bad for you. Symptoms of sleep deprivation include a marked decline in the ability to concentrate, learn, and retain new information. Sleep deprivation can also affect your emotions and self-control, and cause visual and auditory hallucinations.

Whether not sleeping at all would actually kill you has not yet been established. The record for staying awake is 11 days and 25 minutes, set during a science experiment in 1965. The subject was kept awake by two 'friends' as they watched him become a drooling, delusional mess. Yet there are plenty of studies demonstrating serious detrimental health effects of both short- and long-term sleep deprivation.

Being mentally and physically alert will certainly help you to solve problems, but many scientists think something much more interesting is going on during sleep. Your brain is still learning whilst you are snoring.  

You are only coming through in waves…

Okay, so do we know how sleep can help us to learn? We’re getting there. Using brain imaging technology like fMRI scanners (giant magnets that use blood flow changes to see how different parts of the brain react to things) and EEG (funky hats with electrodes that measure how our neurons are firing in real time), we can have a look at what the brain is doing while we’re dozing off.

Our brains remain active while we sleep. Sleep can be split into different stages, and what happens during these stages is important for memory and learning. Broadly speaking, sleep is split into non-REM (Stage 1, Stage 2, and Slow Wave) and REM (Rapid Eye Movement) stages, traditionally separated according to the pattern of electrical activity seen on the EEG. I'll briefly take you through what these different stages are and how our neural activity changes as we go through them:

Stage One sleep is when we start to doze off and have our eyes closed. Have you ever noticed a grandparent falling asleep in their chair, but when you ask them to stop snoring they wake up insisting they were never asleep in the first place? That’s stage one sleep; you can be in it without even knowing.

Stage Two is still a light sleep, but when brain activity is viewed using EEG you can see brief bursts of activity known as sleep spindles.

Slow Wave Sleep is so called because in this stage neurons across the brain activate together in unison, creating a slow, large, coordinated electrical pattern that makes the EEG output look like a wave. Slow wave sleep also contains some of Stage Two's sleep spindles, as well as something called sharp wave ripples. These occur when a brain area called the Hippocampus (involved in memory and navigation) sends bursts of information to the Neocortex (involved in our senses, movement, language, and planning, to name a few).

REM sleep is when our bodies are paralysed but our eyes dart around. Our blood pressure fluctuates and blood flow to the brain increases. While we dream throughout sleep, our dreams during REM become vivid and our brain activity looks similar to when we’re awake.

We cycle through these stages in 90-120 minute intervals throughout the night, with REM periods becoming longer as the night progresses. Disruptions to the sleep cycle are associated with decreases in problem-solving ability, as well as with psychiatric and neurodegenerative disorders such as Alzheimer's disease.

Spikey learning

Problem solving requires memory: you need to use information you already have and apply it to the problem at hand. You also need to remember what you tried before so that you don’t keep making the same mistakes (like singing “Scrambled Eggs” over the same tune forever). The stages of sleep most relevant to helping us keep hold of our memories are the non-REM ones, and in particular Slow Wave Sleep.

Recent research reveals that sleep spindles, slow waves, and sharp wave ripples work together: when a slow wave is at its peak, brain cells are broadly excited, creating the perfect environment for sleep spindles to occur. When the wave is crashing down, sharp wave ripples from the Hippocampus are more likely to send information to the Neocortex. This coupling of spindles and slow waves is associated with how well you retain memories overnight. Interestingly, in older adults the spindles can fire prematurely, before the wave reaches its peak, suggesting one possible reason why memory gets worse with age.
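For the curious, here is a simplified sketch of how this kind of slow-wave/spindle coupling can be quantified: filter the EEG into the slow-oscillation band to get its phase, filter into the spindle band to get its amplitude, and ask at which slow-wave phase spindle activity tends to peak. The synthetic signal, band limits, and sampling rate are illustrative assumptions (and the sketch assumes NumPy and SciPy are available); this is not the analysis pipeline used in the studies mentioned above.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 200                       # sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)   # one minute of synthetic "EEG"

# Fake signal: a 0.8 Hz slow oscillation with 13 Hz spindles riding near its peaks.
slow = np.sin(2 * np.pi * 0.8 * t)
spindles = (slow > 0.7) * np.sin(2 * np.pi * 13 * t)
eeg = slow + 0.5 * spindles + 0.2 * np.random.default_rng(0).standard_normal(t.size)

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase band-pass filter between lo and hi Hz."""
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

so_phase = np.angle(hilbert(bandpass(eeg, 0.5, 1.25, fs)))   # slow-oscillation phase
spindle_env = np.abs(hilbert(bandpass(eeg, 12, 15, fs)))     # spindle amplitude envelope

# Average slow-wave phase at which spindle activity is strongest (0 = wave peak here):
preferred = np.angle(np.sum(spindle_env * np.exp(1j * so_phase)))
print(f"spindle activity peaks near slow-wave phase {np.degrees(preferred):.0f} degrees")
```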

Researchers say this pattern of brain activity is a sign of the brain consolidating, or crystallising, what was learned or experienced whilst awake. This process strengthens the neural connections of the brain. Studies show that the pattern of neurons that becomes excited when we learn something is reactivated during sleep. This could mean that during sleep our brains replay experiences and strengthen newly formed connections.

Getting freaky

So what do our dreams mean? We’ve all had bizarre ones—how about that common dream where all your teeth fall out?

During REM sleep, our brain activity looks similar to when we're awake. The scientist Deirdre Barrett has suggested we think of REM sleep as merely a different kind of thinking, one that uses less input from the outside world and from the frontal parts of our brain in charge of logical thinking. REM is thought to be involved in consolidating our emotional memories, but it is also when we tend to have vivid visual dreams that may defy logic. This combination enables REM "thinking" to be creative or even weird: REM sleep may allow us to form connections between ideas that are only distantly related.

Recently, a team in Germany suggested that non-REM sleep helps put together what we know, while REM breaks it up and puts it back together in new ways.

Thoughts before bed

So "sleeping on it" really can help solve problems. Sleep strengthens the memories you make during the day and helps you learn and see things more clearly when you wake up. REM sleep may also allow thinking to be unconstrained by logic, breaking ideas apart and reshaping them in new ways. If reading this article made you sleepy, go ahead and take a nap. You might learn something.

Edited by Becca Loux. Becca is a guest editor for Brain Domain and an avid fan of science, technology, literature, art and sunshine–something she appreciates more than ever now living in Wales. She is studying data journalism and digital visualisation techniques and building a career in unbiased, direct journalism.

References:

Barrett, D. (2017). Dreams and creative problem-solving: Dreams and creative problem-solving. Annals of the New York Academy of Sciences, 1406(1), 64–67. https://doi.org/10.1111/nyas.13412

Carskadon, M. A., & Dement, W. C. (2005). Normal human sleep: an overview. Principles and Practice of Sleep Medicine, 4, 13–23.

Chambers, A. M. (2017). The role of sleep in cognitive processing: focusing on memory consolidation: The role of sleep in cognitive processing. Wiley Interdisciplinary Reviews: Cognitive Science, 8(3), e1433. https://doi.org/10.1002/wcs.1433

Haus, E. L., & Smolensky, M. H. (2013). Shift work and cancer risk: Potential mechanistic roles of circadian disruption, light at night, and sleep deprivation. Sleep Medicine Reviews, 17(4), 273–284. https://doi.org/10.1016/j.smrv.2012.08.003

Helfrich, R. F., Mander, B. A., Jagust, W. J., Knight, R. T., & Walker, M. P. (2018). Old Brains Come Uncoupled in Sleep: Slow Wave-Spindle Synchrony, Brain Atrophy, and Forgetting. Neuron, 97(1), 221–230.e4. https://doi.org/10.1016/j.neuron.2017.11.020

Klinzing, J. G., Mölle, M., Weber, F., Supp, G., Hipp, J. F., Engel, A. K., & Born, J. (2016). Spindle activity phase-locked to sleep slow oscillations. NeuroImage, 134, 607–616. https://doi.org/10.1016/j.neuroimage.2016.04.031

Landmann, N., Kuhn, M., Maier, J.-G., Spiegelhalder, K., Baglioni, C., Frase, L., … Nissen, C. (2015). REM sleep and memory reorganization: Potential relevance for psychiatry and psychotherapy. Neurobiology of Learning and Memory, 122, 28–40. https://doi.org/10.1016/j.nlm.2015.01.004

Lewis, P. A., & Durrant, S. J. (2011). Overlapping memory replay during sleep builds cognitive schemata. Trends in Cognitive Sciences, 15(8), 343–351. https://doi.org/10.1016/j.tics.2011.06.004

Ólafsdóttir, H. F., Bush, D., & Barry, C. (2018). The Role of Hippocampal Replay in Memory and Planning. Current Biology, 28(1), R37–R50. https://doi.org/10.1016/j.cub.2017.10.073

Sio, U. N., Monaghan, P., & Ormerod, T. (2013). Sleep on it, but only if it is difficult: Effects of sleep on problem solving. Memory & Cognition, 41(2), 159–166. https://doi.org/10.3758/s13421-012-0256-7

Staresina, B. P., Bergmann, T. O., Bonnefond, M., van der Meij, R., Jensen, O., Deuker, L., … Fell, J. (2015). Hierarchical nesting of slow oscillations, spindles and ripples in the human hippocampus during sleep. Nature Neuroscience, 18(11), 1679–1686. https://doi.org/10.1038/nn.4119

Brain-controlled mice face robobugs


By Monika Sledziowska

Have you ever wondered how realistic the mind-control technology presented in Kingsman: The Secret Service really is? I can't claim to go to many Hollywood parties, but I may have an idea.

In a lesser-known series of experiments in the 1960s, a Yale University professor, Jose Delgado, demonstrated that direct electrical stimulation of the brain can elicit various emotional states and reactions from a range of animals, including humans (1). The stimulation and recording of electrical activity were even achieved using radio waves. Doesn't it sound just like a recent film?

However cool (and perhaps disturbing) this line of enquiry was, it was soon abandoned. One reason may be that the technology of the time wasn't advanced enough to establish which specific neural pathways were responsible for which state or reaction. In an interesting turn of events, though, a new technique has come along that has the capacity to do precisely that.

The technique, by the name of optogenetics, doesn’t use radio waves but rather controls cells in the brain using light. Before you get too worried about the effects of light on your brain, I should explain that only cells that have been genetically modified to have light-sensitive proteins will be influenced by light of a specific frequency (2). These light-sensitive proteins can be limited to a particular part of the brain and a particular cell type using genetic manipulation.

So, what can optogenetics tell us about the neural basis of specific behaviours? A recent paper by Han et al. (2017) focused on defining the neural pathways that are responsible for different aspects of hunting behaviour.

Step one: the authors injected a virus carrying the genes needed to make light-sensitive proteins into the amygdala of mice, a part of the brain known to be involved in reward and fear processing (3). The genetic background of the mice and the genes in the virus interacted in such a way that only inhibitory neurons (neurons that restrict or 'inhibit' activity) were affected. The researchers then implanted the mice with small optical fibres that could shine light of a specific frequency onto the amygdala.

Step two: the amygdala was exposed to the light, activating these inhibitory neurons. As a result, the mice showed increased jaw muscle activity (one could argue that jaws are quite important for capturing prey) and were faster to chase and capture crickets (their natural prey).

What’s interesting is that they also bit and ate inedible objects such as wooden sticks and attacked artificial robotic bugs (you can purchase these, like anything else, from Amazon), which rarely happens in the real mouse world. And if you don’t believe me, you can see for yourself in the video below:

Using this technique, the researchers were also able to uncover the projection pathways from the amygdala to a part of the brainstem called the reticular formation (responsible for biting the prey), and to part of the midbrain called the periaqueductal grey (responsible for pursuing prey).

Even though we may still be quite far away from mind-controlling techniques like those used by the villain in Kingsman, current technology offers exciting opportunities for uncovering the workings of mouse and human brains.

If you want to know more about the use of animals in research, please click here.

Edited by Oly & Jonathan 

References:

  1. Delgado JMR. Free Behavior and Brain Stimulation. In: Pfeiffer CC, Smythies JR, editors. International Review of Neurobiology, Vol. 6. Academic Press; 1964. p. 349–449.
  2. Deisseroth K, Feng G, Majewska AK, Miesenbock G, Ting A, Schnitzer MJ. Next-Generation Optical Technologies for Illuminating Genetically Targeted Brain Circuits. J Neurosci [Internet]. 2006;26(41):10380–6.
  3. Han W, Tellez LA, Rangel MJ, Motta SC, Zhang X, Perez IO, et al. Integrated Control of Predatory Hunting by the Central Nucleus of the Amygdala. Cell. 2017;168(1–2):311–324.e18.