by Kasra Hassani

I, a scientist with a PhD in Microbiology and Immunology, was a climate change denialist. Wait, let me add, I was an effective climate change denialist: I would throw on a cloak of anecdotal evidence and biased, one-sided skepticism and declare myself a skeptic. Good scientists are skeptics, right? I sallied forth and denied every piece of evidence that was presented to me, for a relatively long time.

Changes in Arctic sea ice are an indicator of climate change. By NASA/Goddard Scientific Visualization Studio and adapted for NASA’s Global Climate Change website http://climate.nasa.gov/ (http://photojournal.jpl.nasa.gov/catalog/PIA14385) [Public domain], via Wikimedia Commons

It feels strange when I look back — I inadvertently fell into almost every pitfall of pseudo-science, shutting my eyes and repeating a series of mantras such as “I don’t believe it!”, “Why does it even matter?” and “I don’t care!”.

Thankfully, those days are over, but the memories linger. Although the evolution of my thought, from ignorance, to denial, to skepticism and finally to acceptance, was a continuum, in retrospect I can distinguish certain phases that are worth listing and discussing. I hope my experience encourages others to loosen up some strongly held beliefs and listen to the din of evidence. Here are the prominent phases of my climate change denialism:

The “We have bigger problems” phase:

Being a biology and ecology geek in high school, my mind nurtured environmental concerns, especially in my birth country, Iran, where air and environmental pollution, uncontrolled hunting, deforestation and desert formation are rampant. When I first heard about climate change through the media (nothing had been taught in school), I couldn’t help but see it as a distraction from more immediate issues — poverty, childhood mortality, wars and conflicts, pollution, and so on. It bothered me to think of countries coming together and people marching in the streets over such a hypothetical long-term effect while children die of preventable causes. This phase slowly transformed into…

The “It’s all a conspiracy!” phase:

Courtesy Wikipedia: http://upload.wikimedia.org/wikipedia/en/1/1e/MichaelCrighton_StateOfFear.jpg

The conspiracist in me intensified after I read the novel State of Fear by Michael Crichton, the science-fiction author of Jurassic Park and The Lost World, whom I adored during my teenage years. State of Fear had a very science-y look, with references, graphs, arguments and counter-arguments. Its thesis was that the media exploited global warming to keep us in a state of fear and guilt over the very act of being human. And then, I moved into…

The “OK, it may be happening, but who knows if it’s our fault” phase:

As time went on, I was exposed to more and more evidence in support of climate change that I could no longer deny. I had no choice but to adapt my theory and finally admit to some sort of climate change. “OK, it may be happening, but how can you tell if it’s our fault? We lack a control Earth!” To back myself up, I clung to a variety of fringe arguments: “It’s the sun!”, “We can’t trust the measurements!”, “It has happened before! It’s normal!” and so on. (You can find a long list of common climate change myths debunked here and a shorter version here. Right now the list counts up to 176. New ones are added often.)

Some studies have suggested that people who believe in one conspiracy theory tend to believe others as well, even if they contradict one another. This is usually because the conspiracy theory needs to be strengthened in the face of every new piece of contradictory evidence. Also, once you fall into the trap of believing that a huge sinister organisation can perform an action so perfectly yet covertly, you start to believe other conspiracies are plausible. Thankfully, I avoided the meta-conspiracist delusions. During these years, I actively discussed and argued with other conspiracy theorists and denialists, especially on biology and health-related issues such as evolution, immunization, and genetic engineering. Still, I kept adapting my own denial, eventually leading to…

The “It’s not that important” phase:

This became a recurring thought, supplied by my lack of knowledge and my failure to see the impact that climate change has on the environment. I kept referring to other pressing and more tangible global issues. I was blind to how the pressing environmental concerns of today (energy, water, pollution, sustainability) were actually in harmony with the actions needed to fight climate change. This can be clearly seen in the United Nations’ new Sustainable Development Goals and their special focus on climate change.

Head in Sand

Royalty-free image from Getty Images (http://www.gettyimages.co.uk/search/2/image?family=creative&phrase=head%20in%20sand&license=rf&excludenudity=true)

Finally, I crawled into…

The “Maybe I’m in denial” phase:

No single undeniable bit of evidence unequivocally proved to me that humans were responsible for climate change, and that makes sense: science works on multiple lines of proof. A single experiment or piece of evidence supports a theory; it doesn’t prove anything. Over time, as different researchers gather more evidence, a theory becomes refined and a more acceptable explanation for a natural phenomenon. But my conversion also took time because I was never astonished by a single piece of evidence or a big news story; when you are in denial, evidence is unlikely to change your mind. On the contrary, it might persuade you to cover your ears and pretend you’re not listening. Believe it or not, there exists a “Flat Earth Society,” and no, I won’t link to it.

So what happened to me then? What was the revelation? How did I enter…

The “Tear down the conspiracy wall!” phase:

It was when I began to actively pursue knowledge on how to discuss climate change with conspiracy theorists (the ones who believe in conspiracies in principle and are therefore more likely to be climate change denialists) that I realized my strongly held beliefs and stubbornness matched the same criteria as the people I was trying to convince. I was a denialist myself.

I created a list of every question and doubt I had about the physics, chemistry, biology, economics and politics of climate change, and I started reading. I took online courses. I listened to podcasts. Every myth in my head popped and floated away. I learned that cosmic rays cannot account for the current patterns of climate change; that low- and middle-income countries and their fragile economies are actually more vulnerable to climate change than high-income countries and should care more about it; that climate change could be accelerating desert formation; and finally, that pushing for renewable energies and sustainable development is in harmony with combating climate change. It all made sense without the need for an Evil Monster Corporation hiding a big truth or pushing a secret agenda. I was conspiracy-free!

Bottom line

No human is free of bias. Certain social, political and even personal circumstances can harden a thought or belief in one’s mind. It takes effort to try to identify our biases and rid ourselves of them, or at least be conscious of them. But it’s definitely worth it.

Kasra Hassani is a former researcher in microbiology and immunology who gave up the lab to study public health. He now likes to sit at the intersection of research and society.

 

by Stephen Strauss

So there I am sitting in a meeting room in Ottawa supposedly providing insight on the future path of one of Canada’s federal research agencies and all I can think is: If There Is a Technological God All Canadians Would Worship It Would Be The Babel Fish.

The image was created by Rod Lord for the TV series; see http://www.bbc.co.uk/cult/hitchhikers/gallery/guide/babel2.shtml


Okay, yes, we should back up.

The group in that Ottawa meeting room was tasked with looking at a series of previously identified global trends and a series of previously identified technological changes which — theoretically — would help us figure out which trends and changes would most affect Canada in the next 20 or so years. And as a consequence of that analysis, decide which future arenas the agency should/must get involved with.

Now, if you are unaware of it, the listing of trends and areas of technological change has become a sort of intellectual fast food. Top-ten or top-twelve or even top-56 versions of these rankings seem to appear everywhere and often.

Spend time reflecting on any “trend” and it seems supra-banal. Yes, it’s true: Many countries’ populations are growing older; climate change may change everything a lot; if there are more people, fresh water should get scarcer. Nothing that rises to the level of “oh my god I never thought of that”.

At the same time, anything-but-banal new technologies are coming: the Internet of Everything — that is, all kinds of tiny devices hooking you up to the Internet — and Big Data — think a tsunami of data and data-analyzing approaches — and the Kingdom of Smaller than Small (nanotechnology). Altogether it seems new devices/apps in the next little while will make the recent past look like some technological Stone Age inhabited by data-starved, climate-unimpressed, and Internet-impoverished 20th-century Nerd-erthals.

Our group at the meeting in Ottawa, however, had no consensus on whether Canada, this most middling of countries with its poor to not even middling history of translating discovery into product, could carve a special niche within those changes. What can we do that the Finns and the Americans and the Brazilians and the Chinese and everyone else can’t also do? And given our previous history of hardly ever turning discovery into usable products, how can Canadians do it quicker and better and much, much more profitably than them?

And as I sat there listening to everyone trying to make smart remarks about future world changes without being able to say how this country would change the world, I realized it was a very modern Canadian gathering. That is, we were all speaking English, but for many people in the room English wasn’t their first language. Their native tongue was mostly French, but accents suggested other languages were represented as well.

And that’s when the Babel fish as Canada’s “Gift to the Future” thought hit me. If you are looking for a cross-country Canadian civic religion it is — PQ party loyalists aside — bilingualism, and maybe also multilingualism.

What all Canadians would buy into, and be happy to see research weight thrown behind, is some technological version of Douglas Adams’s Hitchhiker’s Guide to the Galaxy creation: the effortlessly translating, plugged-into-your-ear quasi-fish. For those who haven’t seen or read Hitchhiker’s, the Babel fish is described there as “small, yellow and leech-like, and probably the oddest thing in the universe. It feeds on brainwave energy received not from its own carrier but from those around it. It absorbs all unconscious mental frequencies…the practical upshot of all this is that if you stick a Babel fish in your ear you can instantly understand anything in any form of language.”

The Babel fish was of course just an iconic idea in a funny book, but the idea of an instantaneous translator seems branded into a lot of science fiction. There are Star Trek’s Universal Translator and the telepathic field of Dr. Who’s TARDIS and others.

And I thought, as the group discussed less obvious Canadian contributions (like coming up with technologies to cut down on water use because we had so much water), wouldn’t all Canadians want to be multilingual without having to go through the pain of actually learning another language — particularly if there were some kind of device which instantly translated all written words into your language too?

And then on the way home I wondered how close we were to a Babel fish/Universal Translator. And after some research I found some exciting news to share: Much, much closer than I would have ever thought. Indeed we seem to be on the brink of at least approaching Universal Translator-dom if not Babel-fish-dom.

Simple devices that simultaneously translate a few words from one language to another already exist. Some pleasure cruises offer these devices to help people (who are just going to spend a short time in some foreign place) ask, “Where is the bathroom?” and understand the answer.

But there is also huge, big stuff taking place all around us. Skype is developing technology that could one day allow people to talk in one language and be heard in another. According to the Smithsonian magazine, Google is hard at work at “Kissing Language Barriers Goodbye.” Facebook has acquired something called Jibbigo which does translations in 20-plus languages.

Goddamn it, I then thought. This is happening, and nobody is alerting/warning us Canadians that this potentially huge change in our national lives might be coming. And then I got depressed. Because maybe it is already too late for us to make any real contribution to the technology of simultaneous translation, because how are we, as a middling country, going to compete with huge technological transnationals called Google and Skype and Facebook?

And then I had an epiphany.

Maybe it is too late for Canada to develop these technologies, but we could/should absolutely be the testing ground in which they are tried out. To be a 21st century Canadian is to either speak the other language, or wish you did, or want to make sure your kids are bilingual, or some combination of all the above. We have become a country where bilingualism, indeed multilingualism, is a kind of civic religion. But it is hard because, well, languages are subtle and different and too slippery to easily equate. So lots and lots of us Canadians would put up with the glitches and the technological discomfort and costs of a Babel fish if it meant we really could talk to one another — and still be effectively unilingual.

Do you hear me Google and Skype and Facebook?

Start testing your universal translating technologies on us Canadians because we really, really, really want them.

Do you hear me unnamed federal research agency?

Give money to anyone trying to conduct these tests or develop apps related to them, because we really, really, really, really want something like the Babel fish to be in our national lives.

And we want it: Now/Maintenant/Ahora/Nú/今.

CSWA President Stephen Strauss has written about science for more than 30 years for The Globe and Mail, CBC.ca, Nature Biotechnology, The Walrus and many other places. One of his fondest memories of his time at The Globe was how his fellow journalists took to calling him Dr. Debunko.

 

 

by Meredith Hanel

Peach flower, fruit, seed and leaves as illustrated by Otto Wilhelm Thomé (1885). [Credit: Wikipedia]


Peaches and nectarines are the same fruit minus a small genetic variation that makes nectarines hairless. When I first learned this little trivia tidbit I wondered about the difference in flavour. I prefer nectarines to peaches, but wondered if the taste difference was all in my head. Well, it’s not.

The genetic variation affects flavour, aroma, size, shape and texture. While the rough location of the genetic change has been known for some time, the exact gene and the exact change in the DNA sequence behind “nectarineness” have been a mystery. In March, scientists from Italy finally identified the culprit: a disruption of a “fuzz” gene, found in nectarines but absent in peaches.

Agriculturists in China gifted fruit lovers with the peach about 4000 to 5000 years ago. At least 2000 years ago, again in China, nectarines burst onto the scene. Charles Darwin pondered how nectarines popped up on peach trees and vice versa, and described the odd finding of one fruit that was half and half. Would we call that a “peacharine?”

Darwin, and others, deduced that the nectarine was a peach variety. In 1933, scientists determined a recessive gene variant was responsible for the inheritance pattern of the nectarine’s hairless (glabrous) skin. The glabrous trait was given the designation G, with big G for the normal fuzzy peach character and little g for the glabrous nectarine character. Each fruit has two copies of this gene. Each parent gives one to the offspring fruit, which can be either GG, Gg, or gg, and only the gg fruits are nectarines.
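The GG/Gg/gg arithmetic above can be sketched as a quick Punnett-square calculation (a generic illustration of Mendelian inheritance, not code from the study; the `cross` helper is hypothetical):

```python
from itertools import product
from collections import Counter

def cross(parent1, parent2):
    """Count offspring genotypes for a cross of two two-allele parents."""
    offspring = Counter()
    for a, b in product(parent1, parent2):
        # sort alleles so "gG" and "Gg" count as the same genotype
        offspring["".join(sorted(a + b))] += 1
    return offspring

# Big G = fuzzy peach allele, little g = glabrous nectarine allele.
# Crossing two heterozygous (Gg) trees: only the gg quarter are nectarines.
print(cross("Gg", "Gg"))  # Counter({'Gg': 2, 'GG': 1, 'gg': 1})
```

A Gg × gg backcross likewise yields half nectarines, matching the recessive pattern worked out in 1933.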

Nectarine development [Wikipedia]


 

The chromosomal location of the G trait was already roughly landmarked, but the Italian research team zoomed in on the spot, sort of like how you zoom in to street view with Google Maps. Many DNA sequence differences exist between nectarines and peaches that are not located in genes but are useful as landmarks along the chromosomes. These are called genetic markers. To zoom in on the G trait, the researchers crossed peach and nectarine trees and followed the offspring through two generations. The offspring had a mixture of peach and nectarine markers along their chromosomes, but certain genetic markers, the ones closest to the G trait location, always went along with the nectarineness. These genetic markers landmarked the region to search for genes with mutations that could explain a nectarine’s fuzz-less-ness.

Within the landmarked region, the researchers identified a disrupted gene. The peach to nectarine gene disruption is a genetic modification by the hand of Mother Nature, an insertion of a transposable element. This type of DNA element can move because it contains its own code for the production of an enzyme that can “cut” and “paste” the transposable element to other locations in the genome. Transposable elements can get pasted right in the middle of genes, disrupting the DNA sequence. They are a known cause of genetic variation in plants. If you like chardonnay wine, you can thank a transposable element for disrupting the cabernet grape genome long ago.

In nectarines the transposable element stuck itself right in the middle of a gene called PpeMYB25. Genes with similarities to PpeMYB25 in other plants are important for making plant hairs, called trichomes, which can occur on the stems, leaves, flowers and fruit of plants. The PpeMYB25 gene is the recipe for a protein that is a transcription factor, a type of protein that controls when and how much other genes are turned on. A mutation in this one gene could therefore explain not just baldness in nectarines but other nectarine characteristics as well, depending on which other genes it controls. In this report the researchers focused on the peach fuzz characteristic. When they looked at flower buds during the period when fuzz, or trichome, first develops, they found PpeMYB25 to be active in the peach but not the nectarine buds.

This is the first description of a specific genetic modification that can explain the difference between peaches and nectarines, something that has long been a mystery.

This research makes a strong case that the nectarine’s lack of fuzz is due to the inability of nectarines to produce the PpeMYB25 protein. How the lack of PpeMYB25 might lead to the other nectarine characteristics — flavour, for instance — still needs to be worked out.

References:

Vendramin, E. et al. (2014) A Unique Mutation in a MYB Gene Cosegregates with the Nectarine Phenotype in Peach. PLOS ONE. 9: e90574.

Ien-Chi, W. et al. (1995) Comparing Fruit and Tree Characteristics of Two Peaches and Their Nectarine Mutants. J. Amer. Soc. Hort. Sci. 120(1):101-106.

Darwin, C. (1868) The Variation of Animals and Plants Under Domestication, Volume 1, pg 363.

Meredith Hanel earned her PhD in medical genetics and spent many years in the lab doing research in molecular and developmental biology related to medicine. Meredith works in science outreach with Scientists in School. She enjoys writing about science and loves to find out the biology behind just about anything in nature.

 
by Kristina Campbell

Courtesy of Pacific Northwest National Laboratory


Open up any science magazine today and you’re likely to find at least one story having to do with the microbiome – all the bacteria that live in you and on you – and its impact on health. Although this field of science is in its early stages, researchers are linking disruptions in the microbiome to many big health problems we deal with today, including obesity, type 1 and 2 diabetes, celiac disease, as well as allergies and some forms of cancer.

Every article has its own way of framing the microbiome, whether the writer does it consciously or not. Take this line from a 2012 article in the Economist, which implies that a body and its bacteria live in a sort of mutualistic symbiosis – the two species live in a mutually beneficial relationship, and may in fact need each other to exist: “In exchange for raw materials and shelter the microbes that live in and on people feed and protect their hosts, and are thus integral to that host’s well-being.”

Yet I Contain Multitudes, the title of an upcoming book on microbes and their influence on the lives of animals by science writer Ed Yong, suggests a certain separateness between microbe and man. Similarly, this text from a recent article by Bryn Nelson of Gizmodo connotes a microbiome that is segregated from us, despite its constant ability to change in response to us: “This microscopic jungle is constantly adapting in response to our diet, antibiotic use and other environmental influences.”

As microbiome-related treatments take shape in the years ahead, regulators are struggling with how to conceptualize the microbiome. Should the microbiome be defined as an entity completely separate from us or should it be thought of as an integral part of what it means to be human – like the brain? Or perhaps it would be more appropriate to define it somewhere in between, as an organ that is part of us but which we could theoretically live without – like the spleen.

It’s not just semantic whimsy: it has implications for how we will one day access microbiome-related treatments for our own health.

One example where this concept hangs in the balance is the issue of fecal microbiota transplantation, or FMT. This is a medical treatment in which a fecal sample from a healthy donor is administered to a sick patient in order to ‘re-colonize’ the digestive tract with a better-functioning population of bacteria. The evidence that it works for C. difficile infection is irrefutable; no other treatment comes close.

Regulation of FMT has been a tricky issue, and it’s far from being resolved. Doctors have been quietly carrying out the procedure for years, but as FMT gets more widespread and the scientific literature grows, the United States’ FDA and equivalent agencies all around the world need to figure out how to regulate it. In doing so, these agencies are trying to decide on your relationship status with your stool.

Dr. Alexander Khoruts, an FMT researcher and a leading clinical expert on the procedure, disagrees with the FDA’s current decision to regulate FMT as a drug, which implies the view that the bacterial population is completely separate from the human body.

“I think it is an organ transplant. I’m willing to [call] it a tissue transplant,” says Khoruts. “The reason why the FDA did not accept the notion of a transplant is because it considers this material not human. And the law is written that human transplants are distinct from drugs.”

He argues, however, that the bacteria inside us, which happen to come out in fecal form, are part of what makes the human species. “[There is] good evidence that these microorganisms have co-evolved with their human hosts,” he explains. “It’s true they’re open to the environment and they are changing; however, it may be they’re still part of humans. There are no germ-free people running around. So to me, it is an organ transplant.”

The FMT discussion continues to boil, but other examples are going to emerge. What if we take a bacterial species found in most humans and give it to an obese patient who doesn’t have it? What if we make a ‘functional food’ by adding to cheese three species of bacteria that are commonly found in Western, but not Eastern, populations? Regulators of all kinds will have to consider our relationship with the microbiome in the coming years. We’d best lead the way by watching our language.

Kristina Campbell writes for Gut Microbiota for Health Experts Exchange and blogs about the science of gut bacteria as the “Intestinal Gardener”. She and her microbes go everywhere together.

by Lillianne Cadieux-Shaw

Humans are fascinated by sharks. A whale shark at the Georgia Aquarium [http://upload.wikimedia.org/wikipedia/commons/a/a1/Male_whale_shark_at_Georgia_Aquarium.jpg]


A giant shadow slices through the water, effortlessly, gracefully almost, as if in a ballet’s glissade. An eye glints in the darkness. A row of serrated teeth appears and grows wider, darker, deeper in the depths. Then, panicked bubbles, flailing limbs, a desperate attempt to swim up towards the surface. But it’s too late. The poor floral-swimsuited victim knows it. The ancient poikilothermic beast with jaws the size of a fridge knows it. Everyone at home, white-knuckle gripping their popcorn bowl as they watch, knows it. There will be blood. This is Shark Week after all.

In 1988, a bunch of Discovery Channel executives were out at a bar, shooting the breeze, talking about what kinds of shows would be fun to produce, and one guy said: “You know what would be awesome? A week where we just have shows about sharks.” And they laughed. Because how preposterous that would be! A week of sharks! But they did agree it was a good idea. It would definitely be fun. One of them scribbled it down on a napkin. They brought the idea into the studio with them and gave it fins, airing 10 episodes that July. The first show was called Caged in Fear, about the testing process for motorized shark cages. To their surprise, ratings that week doubled. Discovery had stumbled onto something big, a key formula of entertainment that Discovery Channel founder John Hendricks articulated best when he said, “If an animal can eat you, ratings go through the roof.”

It worked. By 2006, Shark Week had been immortalized as a pop-culture phenomenon when Tracy Morgan’s character on 30 Rock advised a colleague to “Live every week like it’s Shark Week.” Now it is in its 27th season, with around 30 million viewers. Stephen Colbert declared it the second-holiest annual holiday after Christmas. It has its own drinking game (rules can be found here), and the amount of adorable Shark Week-related baked goods on Pinterest is just absurd. There’s no doubt that through its rowdy approach to wildlife, Shark Week got many people excited about marine biology.

There’s just one problem, though — Shark Week is a fraud. Though it has always been about entertainment, Discovery at least used to base it on real science. Now, it’s hard to separate the science from the science-fiction.

Humans and the fearsome Megalodon never shared the planet at the same time. [http://en.wikipedia.org/wiki/Shark#mediaviewer/File:Megalodon_scale.svg]


The highest rated program in Shark Week history, which aired last year, was called Megalodon: The Monster Shark That Lives. It depicted a terrifying prehistoric shark with teeth six inches long and jaws that could crush a Volkswagen. After the show aired, an online poll on Discovery’s website (which was quickly taken down) showed that three quarters of respondents believed the Megalodon was still alive, roaming the seas. This was presumably even after reading the brief disclaimer at the end of the program that vaguely said, “though certain events and characters in this film have been dramatized, sightings of [the giant creature] continue to this day. Megalodon was a real shark. Legends of giant sharks persist all over the world. There is still debate about what they might be.”

There will be blood. [USA Today]


The Megalodon did exist, and was surely terrifying. But, according to all paleontological and biological sources, the Megalodon went extinct about two million years ago. It also turned out that some of the ‘scientists’ on the show were paid actors, images were doctored, and real scientists were misled and their quotes distorted. Discovery received quite the backlash for its scientifically misleading program. So it was a bit of a surprise that one of the high-billed programs this year was… you guessed it: Megalodon: New Evidence, a sequel to The Monster Shark That Lives.

When asked about the scandal, and why they continue to insist that an extinct mega-predator may still be skulking around, the general response has been one of wishy-washy executives waggling their fingers and saying things like, “Well, who really knows?” in mysterious voices.

And unfortunately, it isn’t just the Megalodon episodes that are misleading. Shark of Darkness: Wrath of Submarine had the same flawed conceit — that there’s a big shark out there for which there is no scientific evidence. Lair of the Mega Shark? Also about a big imaginary shark.

Perhaps these problems seem silly — Shark Week has never been anything but entertainment, right? Cooked up by Discovery execs at a bar as a silly idea to boost ratings, Shark Week has always been about getting more viewers. In fact, the descent into parody should have been obvious from the repeated showings of Sharknado, the B-movie satire about, literally, a tornado of sharks, over the course of this year’s Shark Week (or equally apparent from a glance at the titles of their actual shows: Sharkageddon, Zombie Sharks, Lair of the Mega Shark). Shark Week is marketed like any other form of mass entertainment, with the average viewer in mind, who is tuning in for escapism, or even just mild diversion. So isn’t all this brouhaha really a criticism of the decline of public programming in general? Unfortunately, Discovery must be held to a different standard, one with, if not real science, then at least transparent motives.

The first problem is with the channel’s consistent breach of basic journalistic ethics. Shark Week not only hired actors to play scientists, but also got real scientists to contribute and then took their words out of context, making it seem as if they agreed with some preposterous claim. One scientist who appeared on Voodoo Shark last year was interviewed about bull sharks and asked if there might be any in the area. He said it was entirely likely. According to the scientist, in an interview with io9, the show spliced out the question and attached his answer to a question about whether a local-legend ‘voodoo’ shark lived nearby. Kristine Stump, a research associate at Shedd Aquarium, felt equally misled when she saw how her team’s research was portrayed on Monster Hammerhead.

Another, larger journalistic issue is with how their programming exploits shark-as-vicious-predator rather than shark-as-endangered-species. They focus on the thriller appeal of great whites and pay too little attention to the five hundred other species of shark. They spend inordinate amounts of time detailing sharks attacking humans, though, statistically speaking, you are more likely to die from digging holes on the beach than from a shark attack. This emphasis on shark aggression rather than shark conservation is akin to finding one scientist who denies climate change and then claiming that the entire scientific community is divided on the subject. With a quarter of all shark species threatened by extinction, sharks have much more to fear from us than we ever will from them; portraying them as human-eating beasts only deepens a disassociation from them as living creatures deserving of our respect and protection.

The second problem with Shark Week is about the standard of science it is expected to uphold. Misleading viewers into believing there is a shark splashing around that all scientists agree went extinct in the early Stone Age is not even bad science, which can be dismissed as an isolated breach of the scientific method, but pseudoscience, the “promotion of teachings different from those that have scientific legitimacy,” according to the Stanford Encyclopedia of Philosophy. Discovery’s Shark Week used to be a reliable and trustworthy source for popular science. It is this prior reputation that makes their scientific breaches such a disappointment — they are relying on their once-upheld educational values to convince otherwise rational people of fiction. They are mongering fear, exploiting a human need for drama, and peddling penny dreadful pasquinades masquerading as real science, all under the guise of the popular science they used to do so well.

The democratization of science has made it easier for everyone to have access to scientific news, journals and educational material. But because there’s so much information out there, we must somehow separate the words from the static, which can seem like an overwhelming, Sisyphean task. That’s why Discovery was so exciting: they dug for diamonds in field upon field of coal, and made it fun in the process. But when they start turning up coal, painting it in cheap, sparkly glitter and trying to pass it off as diamonds, that’s when people feel betrayed. The one promising streak out of this hullabaloo is that it mattered. People wanted real science.
The comment-section obloquies and the adversative media afterclap, the scientists coming out warning other scientists to think twice before talking to Shark Week producers, the thousands of Facebook vows to never trust Discovery again, it all shows a deep caring, an upset that viewers were being treated as if they didn’t care about scientific and journalistic integrity. It is this optimistic reaction that should make Discovery rethink before airing a Megalodon: Revisited trilogy.

Lillianne Cadieux-Shaw is a freelance journalist and writer passionate about science journalism, wildlife, space exploration and finding photos of pugs on the Internet. You can find her on Twitter: @lilcadieuxshaw

 