Forever Unfulfilled: Attention-Deficit, Addiction and the 21st Century Brain

This was written back in January but is, I hope, still relevant. I welcome all comments or e-mails on this topic – please do get in touch.

Photograph © 2007 James Brooks

The disconcerting background noise rumbling under the crystalline digital audio of the internet age is intensifying. The unsettling idea that our use of modern technology is somehow rewiring our brains has found increasing expression over the last couple of years. And from the titles of the (mostly American) books propounding this hypothesis, it would seem that the neural re-fit is a bit of a botch job. There's Mark Bauerlein's thundering The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future, UCLA Professor of Psychiatry Gary Small's iBrain: Surviving the Technological Alteration of the Modern Mind and Maggie Jackson's Distracted: The Erosion of Attention and the Coming Dark Age.

Attention, and our decreasing ability to pay any, is a key issue. You don't have to look very far to see the harbingers of Jackson's "Dark Age". Just witness the crisis in publishing as we find ourselves unable to give books – those weighty slabs of text with all their snaking, idea-laden prose – our undivided. Newspapers are similarly too much. We substitute with skim-read internet articles and the inane volleys of YouTube. Blogs look overlong and unwelcoming. Twitter, with its 140-character-per-post limit, would seem to be about as much as many of us can handle.

Maybe this is an exaggeration, but when the Leader of the Free World starts worrying, you know something's up. In his budget speech last April, Barack Obama closed by noting

an attention span that has only grown shorter with the 24 hour news cycle, and insists on instant gratification in the form of immediate results […] instead of confronting the major challenges that will shape our future.

His words are reminiscent of asides made with increasing frequency in the press. There's the typical mention of the "24-hour news cycle", hinting at over-stimulation from a multimedia information deluge.

We don't need to invoke neurology to uncover the link here. Sam Anderson's contrarian article "In Defense of Distraction", published in New York Magazine last May, unearthed some choice lines from the economist Herbert A. Simon:

What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.

Anderson remarks that “as beneficiaries of the greatest information boom in the history of the world, we are suffering, by Simon’s logic, a correspondingly serious poverty of attention.”

Quite so, but there's something else Obama touches on: a growing expectation of instant gratification. It's hard to doubt this. Those two words constitute The Great Promise of the technology-driven 21st Century. Every itch can be scratched in the digital age. Troubled by your inability to remember some incidental pop-culture factette? Google it. Query about some trifling matter at work? Whip out the BlackBerry. Your cousin in New Zealand had her baby yet? It's too late to call over there; check her Facebook. And Gentlemen! Home alone, feeling mildly horny? Internet access? Hours of footage beyond your most depraved imaginings, all at the click of a mouse. Lose yourself.

And millions do, not just to internet porn, but to all the other instant-response interfaces and gadgetry at our disposal. Remember when that sleek black lump of plastic and silicon in your pocket was jokingly referred to as a Crackberry? If we transpose to the language of behavioural psychology, talking instead of 'reward' and 'reinforcement', we can see why. The ubiquitous promise of instant gratification leads to a syndrome bearing the hallmarks of addiction. And it is becoming increasingly apparent that addiction is underpinned by subtle but fundamental changes in the brain itself, changes which leave addicts unable to concentrate on anything but the next hit: an attention span that "only grows shorter".

Photograph © 2010 James Brooks

As a postgraduate psychology student at Harvard in the early 1930s, Burrhus Frederic (B.F. on everything he wrote) Skinner was unimpressed by the abstruse theories of 'mind' in vogue on campus. Driven by a stringently scientific vision of psychology, he wanted objective data from reproducible experiments. The unobservable mind didn't lend itself to such study, so Skinner turned instead to behaviour. For his work on animal behaviour Skinner devised an apparatus that would revolutionise psychology: the Operant Conditioning Chamber, popularly known as the Skinner Box.

Rats and, somewhat bizarrely, pigeons were the first inhabitants of Skinner Boxes. Early models were furnished with little more than a lever linked to a mechanism which released a food pellet into the box. Bells, whistles, electrified floors and intracranial cocaine came later (B.F. himself never used the last two). What Skinner set out to observe was how a given behaviour (lever-pressing) of his animal analysands was altered in its frequency by differing regimens of reward (food, for example): in his terminology, differing "reinforcement schedules".

One early discovery was particularly notable. Schedules on which, say, every fifth lever-press precipitated a food pellet elicited considerably lower frequencies of lever-pressing than schedules where two, then nine, then one, then seven presses got the goods. In fact this "variable-ratio schedule" was more reinforcing than any of the other early regimens studied, including those that tinkered with the interval between lever-press and food pellet. Rats on variable-ratio schedules would also continue lever-pressing for longer when the food was withdrawn.
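
The staying power of the variable ratio can even be sketched in a few lines of code. The toy model below is my own illustration, not anything Skinner wrote down: it assumes the classic 'discrimination' account of extinction, in which an animal only gives up once the run of unrewarded presses clearly exceeds anything it sat through during training (the 1-to-9 press range and the 'patience' multiplier are guesses for the sake of illustration, not measured values).

```python
import random

# Reward schedules: each call returns how many presses the next pellet costs.
def fixed_ratio():
    return 5                        # FR-5: every fifth press pays out

def variable_ratio():
    return random.randint(1, 9)     # variable ratio averaging roughly 5 presses

def longest_dry_run(schedule, rewards=100):
    """Training: deliver `rewards` pellets and return the longest run of
    unrewarded presses the animal endured along the way."""
    return max(schedule() - 1 for _ in range(rewards))

def extinction_presses(longest_run, patience=3):
    """Discrimination account of extinction: the animal keeps pressing until
    the unrewarded run clearly exceeds anything it saw in training."""
    return patience * longest_run

random.seed(1)  # reproducible illustration
for name, schedule in [("fixed ratio (FR-5)", fixed_ratio),
                       ("variable ratio", variable_ratio)]:
    dry = longest_dry_run(schedule)
    print(f"{name}: longest unrewarded run in training = {dry}; "
          f"presses before giving up once reward stops ~ {extinction_presses(dry)}")
```

On the fixed-ratio schedule the longest dry spell a rat ever experiences is four presses, so when the pellets stop the change is obvious almost immediately; on the variable ratio, dry spells of eight presses were routine during training, so the rat hammers away far longer before conceding that the game is up.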

The little rodent imprisoned in a sterile cage, hammering incessantly on a lever even after his reward is long gone, is a depressing image to contemplate. Now posit a man in the rodent's place, tapping away until his heart gives out – you're conjuring the final hours of Lee Seung Seop.

Lee was South Korean. He lived in the city of Daegu, the country's manufacturing hub. On the 3rd of August 2005 he went to a local internet café and pressed virtual levers for 50 hours until he collapsed from exhaustion and died of heart failure. Lee had been playing StarCraft, a "real-time strategy" video game. The website GameSpot UK relayed the words of his former office manager: "He was a game addict. We all knew about it. He couldn't stop himself."

Although a couple of similar fatalities have been reported in China, such scenarios remain exceptionally rare. And despite alarming – or alarmist – reports in the press of game 'overuse', to employ a frequent and necessary euphemism, 'video game addiction' is not recognised by the major psychiatric bodies. After deliberation the American Psychiatric Association declined to include it in the next edition of psychiatry's bible, the Diagnostic and Statistical Manual of Mental Disorders, stating that more evidence and research were required.

Yet this should not obscure the fact that video game designers know all about variable-ratio schedules and routinely deploy them in the products we lock and load into the console. This three-card trick on our attention is increasingly obfuscated by graphic design of astounding sophistication. These games are precisely tuned so that we lose ourselves in them for hours on end; an 'addictable' game is one assured of a long life on the market.

But don't think that you can slip out of the Skinner Box simply by binning the PlayStation. You enter a Skinner Box operating on a variable-ratio schedule every time you search the net. Who hasn't entered a state of slack-jawed, otherworldly distraction, listlessly clicking from link to link? Two, then nine, then one, then seven presses sending a little reward pellet rolling down the wire.

The instantaneous – or 'synchronous', to use a common academic epithet – and interactive technology we currently have at our disposal predisposes us to this kind of behaviour. In the chapter entitled "The Internet as a Time Sink" from her book The Psychology of the Internet, psychologist Patricia M. Wallace noted the following:

When you contribute a line of text to [a] synchronous internet environment, you may receive a reply within a few seconds, or you may not. The short time delay seems to combine with that variable-ratio schedule to create a behaviour that is very difficult to extinguish, much like 'lever-pressing' on a slot machine. The fact that you can make endless changes to your [online] persona, […] searching for ways to improve that ratio of reinforcement, may make the synchronous internet environments even more compelling.

Modern technology's interactivity, its challenge to improve the ratio of reinforcement, is what differentiates it from the other great time sink of the modern world, television. With TV there is precious little behaviour to reinforce: manipulating the remote is about as far as it goes (lolling on the sofa and spangling your belly with crisp crumbs doesn't count). Until a few years ago, once you'd done this five times (remote-hitting, not belly-spangling), a law of diminishing returns set in. The advent of Freeview prolonged things, but not by a great deal. It's hard to imagine terrestrial TV surfathons lasting more than a few minutes.

We are now surrounded by an armoury of gadgetry far more reinforcing than TV. And it has slipped into the gaps between us where previously direct human interaction held sway. Adolescents – and it's not just adolescents – feverishly texting on their mobile phones are ensconced in Skinner Boxes, too. They're playing the odds, trying to up the ratio of reinforcement, just like the "you" of Wallace's illustration.

Leaving aside the dehumanising implications of this state of affairs – a ‘friend’ is reduced to a content provider existing solely to generate a little dose of on-screen reward – one could still ask what the big deal is. So what if modern technology furnishes us with ever-more alluring time sinks? Most people won’t die down them. The Lee Seung Seops of this world are few and far between. So what if people spend idle hours lost on Facebook instead of…what? Reading Proust?

The answer to that question lies in what happens to our brains when they are subjected to the instantaneous, reward-supplying environments of modernity. These are so compelling because they short-circuit neural pathways which normally guide us in deciding what is worthy of our attention and what is not. This is the neurological basis of ‘psychological addiction’. In other words, losing yourself to the variable-ratio schedule of modern technology may engender lasting alterations in your brain that reading Proust wouldn’t, alterations which, were they to occur en masse, would resemble a ‘dark age’ for human attention.

Photograph © 2008 James Brooks

Skinner's distaste for theories of 'mind' did not arise from dubiety regarding the existence of mind itself; it was just that in 1930 the workings of the mind were imperceptible. Now, thanks to brain scanning and similar techniques – and assuming one accepts the brain as the mind's physical correlate – we can see what's going on.

The boom in brain imaging featured in Tom Wolfe's zesty 1996 essay on neuroscience "Sorry, But Your Soul Just Died". As the title suggests, the mood at the time was grimly deterministic. Accruing knowledge of neurology and genetics pointed towards a minimal influence of environment on the brain. Wolfe set the tone discussing the theories of sociobiologist Edward O. Wilson:

Every human brain, he says, is not born as a blank tablet but as ‘an exposed negative waiting to be slipped into developer fluid’. You can develop the negative well or you can develop it poorly, but either way you are going to get precious little that is not already imprinted on the film.

In fact something closer to the opposite view has arisen in the thirteen years of brain scanning since those words were penned. The hot topic in neuroscience is our brains' ability to rewire as they learn and adapt to their environment. This is 'neuroplasticity', and reports of its manifold wonders pepper newspapers' science sections: London cabbies' hippocampi (the hippocampus is a part of the brain concerned with spatial memory) grow as they accommodate a neural A-Z; the tonal nature of Mandarin Chinese beefs up the right brains of its speakers (language processing is normally a more left-brain activity); musicians' brains are different from non-musicians', and so on.

But it's not all peppy reports of brains bolstered by life experience. The malleability of our grey matter also means that we can hard-wire our bad habits. Such is the precept underlying the Disease Model of Addiction. The Disease Model proposes that the descent into addiction is driven by alterations in neural machinery so radical that after a while we can legitimately talk of the addict suffering from a 'brain disease'. To appeal to an addict's willpower is to pump the brakes of a car after the cables have been cut: the structures in the brain that modulate self-control are themselves badly afflicted.

The brain cell pathways at play here are some of the most studied in all neurology. One important tract is called the mesolimbic pathway, known by virtue of its function as the 'reward pathway'. Brain cells communicate by firing chemicals called neurotransmitters at each other, receiving these signals via receptors on their surfaces. Different neuronal tracts release different neurotransmitters; the mesolimbic pathway fires dopamine when stimulated.

Reward is a major factor motivating animal and human behaviour, no doubt. Nora Volkow, the director of the US National Institute on Drug Abuse, would probably call it the major factor. Dopamine and the mesolimbic pathway have been focal points of her illustrious career. In a 2004 interview with the journal Molecular Interventions she outlined their primacy:

Transgenic animals that don’t have dopamine die because they lack the motivation to eat regardless of whether it’s tasty food or not. It’s as if dopamine signalling tells the brain, ‘This is important, it’s salient. Pay attention!’ For example, pleasure is salient: it’s the way that nature makes us do things that are important for survival.

Volkow's brain scanning studies on cocaine addicts constitute some of the most compelling evidence for the Disease Model of Addiction. When she started her work in this field it had long been known that cocaine boosted the action of the mesolimbic pathway. The sudden stimulation of the reward circuit provides that drug's euphoric rush. To compensate for the dopamine overload the brain reduces its own production of the substance, resulting in a subsequent 'crash' once the cocaine has gone. Prior to Volkow's work, cocaine addiction, the compulsion to take cocaine, was thought to arise principally as addicts sought to counteract the deepening crashes of prolonged abuse.

But the ardency with which her subjects related their compulsion to take cocaine convinced Volkow that something else was awry. They knew they shouldn't be taking the drug but somehow couldn't control their actions: whither willpower?

Volkow analysed the frontal cortices of her afflicted subjects, the frontal cortex being, in simplified terms, the locus of willpower and self-control. It's the rational, reasoning headmaster of the brain, controlling behaviour by calming the clamouring voices of the mesolimbic pathway and other primitive structures. Volkow found that frontal cortex activity was diminished in cocaine addicts; the headmaster had lost control of a particularly unruly pupil, and that pupil was now leading the class.

Such findings are not specific to cocaine, although that drug, as it acts directly on the reward pathway, induces particularly pronounced effects. The impetuous mesolimbic pathway needs constant supervision from the high-minded cortex but to empower the former is to enfeeble the latter.

Put another way, it would seem that evolution didn't equip our brains with a sufficient brake against too much of a good thing. Such a brake wasn't necessary. The mesolimbic pathways of our ancestors were stimulated infrequently, by food, sex and social interaction, all three essential to the survival of the species. Only recently have we been granted access to non-essential, instantly available stimulants as potent as today's street drugs and variable-ratio schedule technology.

Now we can’t help ourselves. Worse, the very neural component that should be admonishing us to step away from the syringe or the console is weakened the more we inject or click.

Photograph © 2008 James Brooks

In September of last year Volkow's name cropped up in the mainstream press as the leader of a particularly remarkable study. Once again she was probing the reward pathway. This time she deployed a brain scanning technique called positron emission tomography to quantify the dopamine receptors and transporters present at two points along the mesolimbic pathway in 53 volunteers. Receptors (which receive neurotransmitter signals) and transporters (which recycle neurotransmitters back into brain cells) are considered to be 'markers' of dopamine activity: their number is significantly reduced in drug addicts as the brain compensates for the dopamine overload induced by the drug.

This study's subjects also showed a significant reduction in reward pathway dopamine markers relative to controls, but they weren't drug addicts. In fact they had been carefully selected as having no history of substance abuse. The 53 proprietors of apparently malfunctioning reward pathways were adults suffering from attention-deficit/hyperactivity disorder (ADHD), the behavioural disorder more frequently diagnosed in children and often popularly referred to by its previous medical appellation, ADD.

Volkow's paper was immediately heralded as a landmark publication in the field. Much previous work held the central deficit in ADHD to be one of inhibition. In this view ADHD patients were being persistently distracted by peripheral 'noises' and impulses that most of us would block out (inhibit) as we focused on the task in hand. Volkow's paper gave credence to a rival, though perhaps complementary, paradigm.

Here, by their inattentive, hyperactive behaviour, ADHD sufferers were trying to stimulate naturally sluggish reward pathways, thereby compensating for their neural defect. Slower activities like reading, which supply sufficient reward for most of us to engage in them for prolonged periods, are just not enough for those with ADHD.

Volkow elaborated in an interview on US National Public Radio:

This could explain […] the paradoxical perspective of many parents that say: ‘My kid can spend hours playing video, focused and attentive, and yet they cannot focus at all at school.’ Well, the difference is that the video game is inherently rewarding and reinforcing. And by being inherently rewarding, it activates the dopamine system [which is] indispensable to motivate you and engage the attentional network. [With] a deficit in the function of the system, it requires stronger stimuli to engage an individual.

Volkow’s paper was labelled a breakthrough for another reason. ADHD is a field of medicine fraught with controversy. For 57% of the British public, according to a 2007 survey, it’s just “a term for excusing unacceptable or uncontrollable behaviour”, not a legitimate medical diagnosis. Critics pointed to a lack of evidence, despite years of exhaustive and expensive research, of a corresponding neurological defect. Volkow’s paper provided some hard data showing an abnormality of the ADHD brain, consistent with the clinical picture.

Surprisingly, it was not the first study to do so. Previous researchers, though, had not been so careful in the recruitment of their subjects, who, as ADHD patients, were on medication. Sceptics held that any brain scan anomalies were not manifestations of a neurological disorder but rather illustrations of damage wrought by months or years of prescribed drug use.

The principal finding of these earlier studies was that ADHD patients had, to put it bluntly, smaller frontal cortices than their peers. The corresponding reduction in higher function left them unable to control their impulses: the inhibition deficit hypothesis of ADHD.

Over the last couple of years the teams responsible for these studies have acted with greater vigilance when recruiting their subjects. Although none has assembled an abstinent population sample matching the quality of Volkow's study (which took eight years to produce), there is now decent evidence to counter critics' charges. In short, kids with ADHD appear to be doubly afflicted. Their malfunctioning reward pathways spur them on to pursue every distraction while their enfeebled frontal cortices are powerless to intervene. They can't control themselves.

The neurological similarities with drug addiction are flagrant, and they are hardly obscured when one learns that most ADHD medications are 'psychostimulants', classified alongside 'speed' (amphetamine) and cocaine. In the US a mixture of amphetamine and its chemical sibling dextroamphetamine, marketed under the trade name Adderall, is an established ADHD 'therapy'.

Sales of these drugs are booming as the western world's children succumb to an ADHD epidemic. By 2006, 4.5 million of America's children had received the diagnosis. In certain states (Alabama was the leader at the last count) it afflicts more than 1 in 10. The plague took some time to cross the Atlantic but now runs rife in the UK. Diagnoses of ADHD tripled in the ten years leading up to 2003, and currently somewhere between 60,000 and 100,000 British children are prescribed psychostimulants.

For the national medical and psychiatric establishments these figures are simply indicative of more accurate diagnoses made possible by growing medical and public knowledge of the condition. The possibility that more and more children actually suffer from ADHD, the syndrome underpinned by the same neural malfunctions as occur in addiction, goes unaddressed.

Yet we can see how this could be the case. We know now that our brains are not "exposed negatives waiting to be slipped into developer fluid"; to a large degree they are moulded by our environment. This is especially true of young people. Brain development and maturation is one long exercise in neuroplasticity, first laying down the neural circuitry and then 'pruning' the tracts unused in day-to-day life. Pruning is particularly important for the frontal cortex and continues into the mid-twenties.

So wouldn't an environment that overloads the reward pathways of its inhabitants from an early age, damaging their frontal cortices in accordance with the mechanisms shown to occur in addiction, fashion brains resembling those in the skulls of ADHD patients?

Baroness Susan Greenfield, Professor of Synaptic Pharmacology at Oxford and former director of the Royal Institution, thinks so. In February last year she addressed the House of Lords and asked

whether the near total submersion of our culture in screen technologies over the last decade might in some way be linked to the threefold increase over this period in prescriptions for methylphenidate, the drug prescribed for ADHD.

As Greenfield is one of Britain's foremost neuroscientists, you'd think her words would be given credence, but she was roundly ridiculed in the mild media furore that ensued. "Given what we know about the human brain," sneered The Observer's Catherine Bennett, "it is clear that the prolonged exposure to an unnatural environment like the House of Lords must have a damaging effect."

Greenfield's mistake had been to focus on the popularity of social networking sites in her speech and then to fill it as much with psychological conjecture as with the circumstantial scientific evidence that might have helped her cause (and it's worth noting that Volkow's ADHD study appeared seven months after her oration). The Newsnight debate that followed was titled "Social Websites: Are They Bad for Kids' Brains?" Seeing as Facebook had taken off only a couple of years before, it was easy for the Guardian columnist and Bad Science author Ben Goldacre to saunter on and say that no, there was really no evidence of this.

Photograph © 2010 James Brooks

So let's be clear: this isn't specifically about Facebook, or Twitter, or whatever internet fad jumps down the wire at you next. But, yes, this is about "the near total submersion of our culture in screen technologies". A sample of British under-16s in a 2009 survey by the research agency Childwise averaged just under six hours a day of screen time, roughly half of it attributed to the internet or video games. Three hours a day, every day, lever-pressing in variable-ratio schedule Skinner Boxes! It's hard to imagine this scenario not affecting the developing brain.

In Greenfield’s words, “we’re sleepwalking into these technologies and assuming that everything will shake down just fine.” Based on our knowledge of neurology, this outcome looks unlikely. The human brain simply isn’t equipped to withstand the hammering on the reward pathway that a technology-driven culture, vaunting instant gratification above all else, metes out. The frontal cortex – the primary locus of reason, enabling us to guide and control our actions – is the first casualty.

Perhaps a permanent one. Contrary to the pronouncements of certain critics, ADHD is not a recent invention of the psychiatric establishment. Its history stretches back to the dawn of modern paediatric medicine, at least to a 1902 publication in The Lancet by 'the father of British paediatrics', George Frederic Still, who called the condition 'Moral Defect'. In the first 80 years of research, clinicians consistently noted the tendency of symptoms to dissolve after a few years. Recent brain imaging studies bear this out; they point to a lag in cortical development rather than a permanent defect.

Only in the last 20 years or so has there been talk of ‘Adult ADHD’ (the focus for Volkow’s paper) where the attention-deficit continues into maturity. Maybe the alarming rise of this new malady is attributable to better diagnosis. Or maybe not. Maybe we’re witnessing the rise of ‘New Variant ADHD’, the logical result of the mesolimbic pathway overload intrinsic to modern western society.

Isn't this the attention-poor 21st Century Man whose habits we sketched at the top of the piece? With his altered neural architecture, the result of adaptation to a new environment, can we not postulate the dawn of a new species?

And so, Ecce Homo Sollicitus! Behold Agitated Man! Unencumbered by the cortical control that afflicted his predecessor, he is free to pursue his every whim. Moral Defect? Ha! It was sapiens, with all his useless reason and morals, who was defective!

Although, on closer inspection, we could say that Homo sollicitus is not free at all. Rather he’s obligated to pursue his every whim, to give constant stimulation to his obliterated reward pathway. His destiny is that of the sapiens cocaine addicts who prefigured him. His life surrendered in necessary pursuit of the next hit, he is unable to confront “the major challenges that will shape [his] future.” And what challenges they are! Global environmental and economic collapse loom large in the world sollicitus inherits.

Yet he fiddles with virtual levers while the earth burns, biologically obligated to forage for little scraps of reward. Chained as he is to an eternal reinforcement schedule, it’s hard to imagine sollicitus happy. Why, his very name also means ‘troubled’ and ‘anxious’.

No, Homo sollicitus’ psychological fate is sealed. It is that recounted by those diligent chroniclers of his early society, the Arctic Monkeys, in “This House is a Circus”:

We’re forever unfulfilled / but can’t think why / like a search for murder clues / in dead men’s eyes


5 responses to "Forever Unfulfilled: Attention-Deficit, Addiction and the 21st Century Brain"

  1. Excellent read, thank you for the info.

  2. This is an excellent article! I thought something was happening to my concentration and now struggle to read a book, as I'm becoming more and more reliant on the internet to fill my time. Thank you… you should turn your research into a book.

  3. Concrete man

    Incredible research, thank you Jim.

    Yeah, I have lost my family, no, not divorced, but lost them to the TV. When my kids were babies I knew about the detrimental effects to the brain, so did not want my kids to watch TV, or at least too much. But (long story cut short here) my wife (who is not American and I'm living in the Orient) has entirely different values. TV is more important than family time, in fact, TV is family time. On top of that, as you mention Jim, you have all the other gadgets. My kids study a lot and do well in school, but when they are not (and they do do athletics), they are glued to the Tube, or to gadgets, games, cell phones etc. All of which I was against but I was overruled every step of the way (very frustrating). I don't even own a cell phone today. The antisocial nature of what Jim has enumerated above in such rich history and detail is happening right before our eyes. Those of us old enough knew a bit what real life was like (I grew up with color TV though). But we played out of doors a lot and got muddy and bruised up. Children and even adults in the country I am in are afraid of insects. As human and nature relationships break down, social pathology is sure to increase, but the worst result of all is simply that people have become pathetic zombies with no strong emotions or opinions, programmed to go through "life" from birth to death. I feel very sad that I have lost my family to the virtual world. Rarely, when we go on a picnic and roll around and tickle and laugh, it is worth it because the joy comes back, but that is so rare. Part of what happened to me is the bizarre culture I entered, but much of it is universal to the postmodern situation.

  4. Remarkable article, very well researched and articulated =-) Thanks

  5. CosmicDrBii

    Excellent article. It should also be seen in combination with Richard Louv's work on what he calls Nature Deficit Disorder in his very good book "Last Child in the Woods".
