Social psychologists Susan Fiske and Cydnee Dupree have identified a big problem scientists have when it comes to how the public perceives their profession. But I think it’s a problem that scientists can easily address with a bit of foresight.
Fiske and Dupree measured people’s impressions of different groups in society based on competence – our smarts, skills and competitiveness – and warmth, which can include our perceived sincerity and good-naturedness.
Take a look. Can you spot scientists here?
In this measure, scientists come across as highly competent, but not very warm. Scientists have high status in society, people reason, so they must be pretty good at what they do – but the public isn't sure why the work matters to them personally. In fact, this sort of cold-but-competent stereotype can provoke feelings of envy, according to Fiske and Dupree's research. Indeed, I would hazard that this is probably one of the core emotional strains of anti-elitism and anti-intellectualism in public life.
Conversely, check out doctors, nurses, teachers and professors. They’re competent and warm, in the public’s estimation. That stereotype provokes feelings of admiration, which is probably a lot closer to where science communicators would want to see “scientists” on this chart, especially if they care about public trust in science. (Their full paper is here.)
So what’s the difference here? I think people intuitively understand why doctors and nurses do what they do. They save lives and doing that sort of work requires some pretty obvious compassion and empathy for people.
Teachers teach. They love their students. Professors…profess and they’re often motivated by passing on knowledge to the next generation, too. But what do scientists do, generally speaking? Well, science. And that doesn’t tell the public much.
In a lot of public communication, I see scientists lead with their science and their methods and their expertise, but they don’t mention why they do what they do. Helping audiences understand that “why” is so important. As NASA’s Gavin Schmidt has pointed out, failing to convey motivations to an audience means that people will fill in the blank with the nearest stereotype they have on hand.
So researchers and science communicators who want to come across as a little warmer, a little more admired and – yes – a little more credible and relatable to audiences should consider the following:
Ten or 20 years into a scientific career – or heck, even at the end of a PhD program – scientists sometimes find it difficult to harken back to why they got started down their career path in the first place.
Maybe it was nature walks they took when they were kids. Maybe they had an amazing teacher. Maybe they were always blown away by big machinery. Or, like a lot of nerds, perhaps it was Star Trek that did it. (::raises hand::)
In every scientist’s life, there is some point or series of points at which they decided to dedicate themselves to difficult, sometimes tedious work to discover new things. Finding that moment – even if it’s something you fell into – is so important for connecting with audiences.
Most scientists got into the field because they were intrinsically fascinated by something. They were the kids who collected bugs in jars, who took the toaster apart and surprised their parents with dangerous birthday wish lists. (“But it’s just a chemistry set, mom!”)
Scott Mandia, an award-winning science communicator, likes to use the example of researchers who get dropped off in frigid polar wastes to study the climate. Are they doing that for the grant money? Or because it makes them the queen or king of the cryosphere? No way. You have to be Pretty. Damn. Curious. to take a trip like that.
Why are you giving a talk, testifying before a committee or releasing a paper to the media? What’s your goal? For scientists, it’s usually to inform. They want people to be able to use research to guide everyday choices they make, whether it’s what kind of medicine they use or what kind of food they buy. And on the policy front, scientists want to make sure that political leaders can make decisions based on the best science. For scientists working on commercial products, it can also be to provide a service, grow a business or attract investors to a new innovation.
Scientists should always be up front with audiences about what they’re doing, why, and who is supporting their work. They also have to self-reflect: sometimes scientists engage in public communication as an extension of competition within the scientific community or because they really think they know better than everybody else. Those are unhealthy premises from which to take on science communication work, and they can easily backfire, not just with the public, but with one’s colleagues, too, especially when scientists leave their disciplinary lane.
All of these points have to be genuine and come across as genuine. Audiences can sense something fake or manipulative a mile away. So, for instance, scientists shouldn’t name-check the local sports team when giving a public talk unless they really love that team. And if a scientist wants to invoke something general like being a parent, they should back that up with specific examples that are about them and their kid. It’s those detailed, specific stories that stick with audiences.
Helping audiences fill in these blanks can be a huge favor. If a scientist can get it down to a minute, that might be the “elevator pitch” for their work. These stories are also good openers for talks and testimony. And they’re great fodder for personal websites and sprucing up dusty faculty pages.
They can also be worth sharing on their own merits, as the growing Story Collider series has proven. And importantly, when scientists think deeply about why they really do what they do, it can also reveal stories about their own work that help them conceptualize new avenues for research. That’s right: all this fuzzy stuff about storytelling, narrative and audience can help researchers do better science, too!
A video version of this post is available here:
(3/14 update: I edited the links above to point to a Phys.org story about the research and added another link to the paper itself.)
Adam Siegel, a security analyst, management consultant, and blogger, made an interesting point on Twitter in response to a Washington Post headline about a sea-level rise study.
— A Siegel (@A_Siegel) February 22, 2016
Indeed, people say all manner of things that may or may not be based on data and expertise. The verb “say” is ubiquitous in news writing and is perhaps uniquely democratic: it can describe statements from the proverbial “man on the street” just as easily as a speech from the president of the United States.
So do scientists and other analysts deserve different verbs when we’re talking about their research? I think there’s a case to be made that verbs like “conclude” or “find” are more precise for describing how scientists conduct analysis and…well, find things out and draw conclusions.
Naturally, if scientists are expressing an opinion – about a peer’s work or a matter of public dispute, for example – they might be “saying” something just like anybody else.
There’s another way to think about this, too. Data and evidence can also do things like “show” and “demonstrate.” Writers might also attribute analytic findings to a paper, a research project or a scientific enterprise rather than to researchers themselves. Of course, that sometimes removes the human element from the equation, something scientists are used to doing in passive-voice academic writing.
These might seem like minor points at first blush, but I think Siegel is onto something. Certainly, in my own writing, I tend to use those more precise verbs, but then again, I’m used to scientists editing my prose!
It also reminds me of some other discussions I’ve had about people “believing” in climate change or evolution, a word that can easily conflate matters of fact with matters of opinion and faith. [Edit: Siegel just pointed me to one of his own posts on this very point.]
I do wonder, in the aggregate, if these small language choices are more powerful than we think. For now, I’ll have to file this under “topics I wish I could test in social science.”
On Twitter, Beth R. also suggests the verbs: measure, examine, analyze, observe, find, determine, conclude.
Scientific American recapped a recent conversation at a major scientific meeting about whether or not there is really a “war on science.”
As a communications practitioner who has helped many scientists deal with political attacks on their work, I don’t think there is a war on science itself. Rather, I think we’re dealing with politicians and advocates who see science as one of many fronts in a series of ongoing battles. Or, to drop the war metaphor for a second, science is simply caught up in a bunch of different democratic disputes.
Science historian Mark Largent told meeting attendees that scientists shouldn’t approach this topic from a position of defensiveness, even when they are being attacked. Instead, he argued, researchers should recognize that science remains an incredibly powerful institution and that with great power comes great responsibility.
— becca harrison (@_beccaharrison) February 13, 2016
It’s a provocative thought and my first instinct was to reject it. I know a lot of scientists who have felt absolutely helpless when politicians, advocates and straight-up trolls have attacked their research and their integrity. But as I thought about Largent’s point a bit more, I asked myself why people attack science in the first place. It’s usually because everybody in a political debate wants to be seen as having science on their side.
So even when individual researchers feel unfairly targeted by politicians or advocates, it’s helpful to understand that that is happening precisely because outsiders rightfully see the scientific community as a powerful arbiter of credibility in many democratic debates.
In response to a series of Tweets I posted about this topic, sociology professor Aaron McCright pointed me to a helpful distinction he and his colleagues have made between various fields of science that have been caught up in public controversies.
“Production science,” they say, creates new things. That includes new medicines, new ways to harness and produce energy, new chemical compounds and new crops and food products. “Impact science,” meanwhile, focuses on what’s happening to our health, our air, our water, our atmosphere, and the world’s built and natural environments.
These distinctions help explain how someone like Rep. Lamar Smith (R-Texas), the chairman of the House science committee, can be bullish on NASA’s production science, such as looking for life on Europa or the James Webb Space Telescope, but eager to cut NASA’s impact science, namely the agency’s Earth monitoring and climate research.
Conversely, anti-vaccine advocates think the pharmaceutical industry – a behemoth in production science – is poisoning children. So when they saw a piece of impact science by Andrew Wakefield, long-since retracted, which linked early-childhood vaccines to autism, they ran with it. And when public health impact scientists tell them they’re wrong – that vaccines save lives – they try to box those public health researchers in with the pharmaceutical industry, too.
Similar dynamics also play out in the scorched-earth world of disputes about agriculture and biotech. It strikes me that many people involved in that debate brand themselves as pro-science while tacitly arguing about whether agricultural production science or environmental and public health impact science should determine public policy outcomes on everything from crop subsidies to GMO labeling.
There are plenty of other useful frameworks for examining why people accept or reject certain types of science. Cultural cognition research offers another lens, based on political ideology, for understanding challenges to everything from nuclear power to gun violence research. Conspiratorial thinking also helps explain the persistence of misinformation on many of these topics, which can stem from a simple distrust of elites, including scientists. And political science offers many lessons in how corporate spending on front groups, scientific consultants, and political speech can warp democratic debates, as well as the practice of science itself.
None of these frameworks offers a holistic explanation for why people reject science on a given topic, but they are all useful.
If we want to resolve any of these “wars” or “battles” or “disputes,” it’s important to understand why people outside the scientific community accept or reject specific scientific findings. We need to be able to accurately and effectively communicate with audiences about these topics on their terms rather than hoping, or insisting, that people outside the scientific community will learn to love and appreciate science for the same reasons we do.
The Northeast Conference on Science and Skepticism has dropped Richard Dawkins as its keynote speaker this year. That’s a good thing.
Dawkins is a gifted writer and speaker and many of us enjoy a greater understanding of biology and evolution thanks to his work. He’s also written beautifully about the positive influence science and reason can have in our lives, arguing passionately that our scientific understanding of light, for instance, does not “unweave the rainbow,” but only makes it more gorgeous to behold.
That’s why it’s so disappointing to see him engage in such negative, boorish behaviors online, from criticizing Rebecca Watson for speaking out about rampant sexism among skeptics, to attacking “clock kid,” to his latest – sharing a terribly stupid, juvenile video titled, I kid you not, “Feminists love Islamists.”
(Caution from personal experience: If you find it and watch it, the tune will be stuck in your head while you’re making eggs in the morning…in a bad way! And you’ll be flummoxed as you try to figure out why a respected scientist would share such a thing online with his 1.3 million followers.)
Naturally, this will become part of the broader conversation about self-censorship, and free speech. Dawkins would certainly like to frame it that way and has been sharing messages like this on Twitter:
@RichardDawkins Stand strong, Prof. They’re good at deliberately misinterpreting arguments, claiming hate speech and demanding apologies
— Lyle Yiannopoulos (@Lyde15) January 28, 2016
But I don’t really buy it. There’s something more basic and important happening here. Dawkins often acts like a jerk on social media. And Twitter is such an open, free-wheeling platform that he winds up alienating audiences his original messages were never intended for. As a result, Dawkins’s online posts feed into bad stereotypes about aloof, arrogant scientists and, at least in my opinion, set him back as a public intellectual.
Additionally, no government is stepping in to censor Dawkins or his words. No one is shutting down his Twitter account. It’s a conference saying, “You know what? This guy’s not for us or our audience.” I’m sure Dawkins has plenty of other conferences at which he can speak and I’m sure there are plenty of speakers who are a better fit for NECSS and its members.
My completely unsolicited advice for Dawkins is that he should ask himself what his goals really are with Twitter and other social media. If it’s to piss people off and distract from science communication and humanism, he’s doing a bang-up job. He might even have some lessons to learn from Alec Baldwin and other celebrities who have given the world a little too much access to their internal brain musings.
He might also follow some tips from other successful scientists on Twitter and try being a little more positive, inspirational and constructive, while saving the vicious takedowns and hyper-critical thought for blog posts, books and the lecture circuit, where those messages can be better understood in context by their intended audiences.
NECSS’s statement is below. And as a longtime fan of the Skeptics’ Guide to the Universe podcast, I have to say that Steve Novella and crew have always approached their work with heart. They have empathy for people who disagree with them as well as people who are duped by charlatans and misinformers. That’s all too rare these days, but it’s critically important for good science communication. I appreciate their stand and the open way they’re communicating with members.
A Statement Concerning Richard Dawkins
The Northeast Conference on Science & Skepticism has withdrawn its invitation to Richard Dawkins to participate at NECSS 2016. We have taken this action in response to Dr. Dawkins’ approving re-tweet of a highly offensive video.
We believe strongly in freedom of speech and freedom to express unpopular, and even offensive, views. However, unnecessarily divisive, counterproductive, and even hateful speech runs contrary to our mission and the environment we wish to foster at NECSS. The sentiments expressed in the video do not represent the values of NECSS or its sponsoring organizations.
We will issue a full refund to any NECSS attendee who wishes to cancel their registration due to this announcement.
The NECSS Team
Feb. 2 update: I just read Steve Novella’s 1/30 explanation for dropping Dawkins. I agree with all of it, including his rebuttal to claims that the decision is somehow anti-free-speech given Dawkins’s already expansive platform. He also addresses what he considers a few valid criticisms of how he and his colleagues handled things. That’s classy and it’s in keeping with good skeptical thinking. The whole post is well worth a read.
Nuclear power was a science issue until it wasn’t. In the 1950s, policymakers agreed that nuclear energy could harness the destructive power the American military and scientific establishment unleashed in Hiroshima and Nagasaki to produce electricity instead. We would tame the atom and use it for peace. Ford even toyed around with the Nucleon, a nuclear-powered concept car.
But nuclear power policy shifted in the 1960s and 70s. Citizens wanted more of a say in how nuclear power plants were sited and operated. The environmental and peace movements questioned the utility of nuclear technology itself, especially as the Soviet Union and United States adopted positions of mutually assured destruction. Eventually, Congress sundered the duties of the Atomic Energy Commission – which promoted and regulated nuclear power and made nuclear weapons, too – and created the Nuclear Regulatory Commission. After the Three Mile Island accident, U.S. nuclear power plant construction ground to a halt.
Two political scientists argue in a seminal work on policy change in the United States that this changeover is one of many examples of the punctuated equilibrium model of evolution playing out in U.S. politics, rather than in the Galapagos Islands. As more people found they had a stake in the nuclear power debate, that debate became more diverse and chaotic, until a new order was established. Along with that shift, the influence of science reporting on the topic diminished as political debates intensified and as more people focused on the running and regulation of nuclear power rather than the promise of nuclear technology.
That history has long informed my skepticism when I talk to advocates and scientists who are enthusiastic about nuclear power. I sympathize with them, though: I grew up in a nuke plant town; the Oyster Creek Nuclear Generating Station was part of the landscape along Highway 9. We paid little mind to the sporadic evacuation drills at school in the event of a nuclear accident. And while a few parents pulled their children out of our high school on September 11th because they had heard false reports that nuclear power plants would be targeted next, I remember shrugging it off with a few of my buddies – why would anyone attack South Jersey when they were that close to Philly and New York?
I have classmates who’ve worked as security guards at the plant – obviously, security stepped up after the attacks. Years after graduating high school, when reviewing an NRC safety incident report from the plant, I recognized the name of another classmate’s father. In nuke plant towns, it’s our neighbors who are in charge of keeping the plants safe.
My home town seal even features an atom:
So my problems with nuclear power plants that exist today are narrow: let’s make sure the safety engineers and inspectors at the plants can do their jobs, by all means. And let’s figure out the waste problem, at some point, please.
But as far as climate change goes, there is something kind of nice about nuclear power plants that are operating today: they’re producing low-carbon electricity and they’re already paid for. But when we talk about the future of nuclear power, it’s not science or even safety that dominates the debate – it’s economics.
To no one’s surprise, it remains stupendously expensive to create a facility that uses controlled fission reactions to boil water and make electricity. And, in an essay for Gullies.org, I argue that that’s where op-eds in favor of nuclear power should focus. It’s too easy to assume that green skepticism about nukes is what’s holding the technology back – that’s the sort of simplistic “but you’re a hypocrite” rhetoric that plagues our politics.
When it comes to curing energy production of its carbon blues, we need cogent arguments on economics and, indeed, economies of scale. Such arguments are becoming easier to make for wind, solar and other renewables. And in a world with many energy options, the renewable success story makes the argument for nuclear power worse.
You can read the essay here. Enjoy!
Sarah Myhre and Tessa Hill, two scientists who study the ocean and climate, published an interesting and, at times, challenging conversation on Medium yesterday that touched on an overlooked aspect of professional development in science communication.
As Hill puts it:
What is missing in many of these discussions and documents is how engaging in science communication will impact the scientists themselves. How will scientists walk the line between relaying scientific information and expressing personal views? How will researchers weigh the impact on their career — both positive & negative — that arise from speaking publicly about their work? How can universities and research institutes provide support to scientists who chose to spend time engaging and communicating?
Myhre agrees, noting:
We have almost no conversation within our community about how science communication and media exposure might impact individual scientists. I think this is where much of the moral quandaries exist.
They go on to discuss the hard work scientists have to do examining their own values and, indeed, their ultimate goals when it comes to communicating with the public, policymakers and media. They also critically examine the practical trade-offs scientists have to make when they prioritize communications work.
These are questions every scientist who does research of public import has had to grapple with, but it’s clear that a new generation of scientists is making a significant argument that Myhre articulates succinctly and powerfully:
Our institutions are responsible for evolving along with us.
Absolutely. The communications landscape has radically shifted since I earned my degree in the field. It will continue to do so under our feet and fingertips. Scientific societies, universities and training programs have to embrace constantly shifting communications best practices and effectively convey them to scientists.
But let’s not let these concerns hold us back, Myhre and Hill argue. All these changes mean we also have room to experiment, to figure out new things and to do so knowing that science has so much to tell us about our world and about ourselves. Myhre and Hill conclude with a hopeful message suitable for framing and desktop backgrounds:
Be brave: there has never been a more important time to be a well-spoken member of the scientific community.
In fact, I found their message so inspiring, I went ahead and made a desktop background out of it. You can download it by clicking on the image below.
(It’s 1600 x 1200 and the base layer image is from NASA — naturally! — and was taken by Apollo astronaut William Anders.)
You can follow Myhre and Hill on Twitter. Their conversation is well worth a full read; it also includes a discussion of routine sexism in media coverage focused on female scientists that will ring true for many readers, too. Myhre also has another excellent Medium piece in which she guides readers through her process for carefully developing main messages around her research.
Tickets are available here and, naturally, you can buy tickets in person at DC9, too.
If you’re wondering what to expect, check out Story Collider’s podcast. Or, as they inspiringly put it:
At the Story Collider, we believe that everyone has a story about science—a story about how science made a difference, affected them, or changed them on a personal and emotional level. We find those stories and share them in live shows and on our podcast. Sometimes, it’s even funny.
What you’re getting into: 1200 words, a 4 to 6 minute read.
Paul Thacker argues in the New York Times that scientists should cough up their emails when politicians, advocacy groups and investigators request them. It’s an interesting thought experiment, but Thacker’s op-ed downplays the value of preventing scientific harassment and fails to make the case that disclosure is actually suffering as scientists defend themselves from various attacks on their work and reputations.
When research is paid for by the public, the public has a right to demand transparency and to have access to documents related to the research.
That’s true, but figuring out what constitutes “documents related to the research” gets to the heart of political and legal disputes on this topic. Unfortunately, Thacker’s piece doesn’t delve into these distinctions, despite a growing body of legal rulings on this topic.
For instance, it’s hard to think of any reason taxpayer-funded data and research shouldn’t be public, except for narrow cases like protecting patient privacy or national security. There’s also broad agreement among scientists, advocates and journalists that correspondence with a funder about the scope and nature of a project should be subject to disclosure.
But a public university scientist’s correspondence with a colleague in which they criticize a peer’s ideas or rate the quality of a grad student’s work, for instance, should not be disclosed, scientists and academic groups have argued. Making such correspondence public, they say, harms researchers’ ability to freely bat around ideas, thus infringing on their ability to do their jobs and their right to free inquiry.
For these reasons, among others, the Virginia Supreme Court blocked a fossil-fuel-funded nonprofit from accessing years of scientific correspondence among climate researchers. The Court said that these exemptions would prevent “harm to university-wide research efforts…and impairment of free thought and expression.”
Exemptions to disclosure laws vary greatly by state, of course – Texas specifically exempts scientific data related to oil exploration – so what gets fairly exempted in one state might not in another, or at the federal level. But it’s clear that courts recognize that there’s significant public interest in preventing harassment and protecting academic freedom at public universities.
Thacker writes that:
the harassment argument should not be used as an excuse to bar access to scientific research that the public is paying for and has a legitimate interest in seeing.
I can’t think of any scientific or academic society or group that has attempted to bar access to taxpayer-funded “scientific research.” We need to be clear here: these arguments are very rarely about access to things like scientific data. They are usually about things like funding and email correspondence.
Of course, we should sympathize with watchdog groups and journalists who already have far too tough a time getting public agencies to comply with FOIA requests. The Society of Professional Journalists, for instance, has a guide to helping reporters rebuff the many silly excuses they get, including from universities, that don’t want to disclose information which should obviously be public. But those problems with FOIA compliance are far broader than the narrow circumstances under which scientific societies and academic groups have asked for exemptions.
Further, it’s not clear that these narrow exemptions are causing the problems Thacker worries about. For instance, Thacker links to a recent freedom of information request that was rejected by a university on harassment grounds. Fair enough, but the rejection is from a British university, where freedom of information (and libel) laws are quite a bit different than they are in the United States. Further, the example involves a dispute among researchers for access to a data set, not an attempt by a politician, watchdog group or media outlet to get access to scientists’ inboxes.
Thacker also cites many examples of disclosure requests revealing corporate interference in science. Again, fair enough, but he doesn’t make the case that the narrow academic freedom exemptions scientists have asked for would have prevented any of those investigations from succeeding. Maybe they could, but there’s at least one high-profile example of an academic who tried to hide suspect financial ties by appealing to such exemptions and lost.
Thacker says that scientists contradict themselves when they embrace transparency on one front, but not another. For instance, scientists have objected to Rep. Lamar Smith (R-Texas) demanding correspondence from NOAA scientists who authored a study that torpedoed a climate contrarian talking point. Thacker’s strongest argument is to point to another notable set of Congressional and FOIA-based inquiries targeting NOAA:
About 10 years ago, the agency released emails showing that officials in the administration of George W. Bush squashed a NOAA statement and that Bush political appointees were selecting which NOAA scientists could speak to the media based on their willingness to deny connections between climate change and hurricane activity.
Is this really a contradiction, though? Those investigations targeted political appointees in the administration who were silencing scientists. Disclosure of that political interference was clearly in the public interest. Rep. Smith’s investigation, by contrast, is much more muddled, especially since it started with questioning the validity of scientific research itself. In each case, scientists have supported efforts that prevent political interference in the scientific process.
These issues are complex and it makes sense that watchdogs like Thacker want to draw a hard line on disclosure laws. In fact, they absolutely should. We benefit when transparency advocates push for more sunlight. But trying to paint scientists as hypocritical on these issues does little to advance transparency. For his part, Thacker concludes his piece with this admonition:
Scientists who profess agreement with transparency only when it is on their terms are really not for transparency at all. The public should be alarmed.
Scientists would argue that the public should be alarmed when politicians and advocates attempt to stymie scientific research they don’t like. The argument scientists and scientific societies have made, repeatedly, is that there is a public interest in disclosure and a public interest in protecting scientists from political interference and harassment. Thacker only acknowledges the former point, arguing that harassment is the price worth paying for fuller transparency.
Transparency advocates could do more to recognize that scientists are right to stand up against political interference in their work. In response to Thacker’s op-ed, the Climate Science Legal Defense Fund detailed some of the costs of dealing with harassing requests.
At the same time, scientists can do more to be transparent. As the public demands greater transparency from legacy institutions – including government agencies and universities – scientists are in a position to push their institutions toward proactive disclosure, including data, methods, funding sources and funding agreements.
Regardless, these disputes over harassment, funding and email disclosure won’t stop any time soon. The best way for scientists – and the public – to enjoy the benefits of transparency and freedom from political interference is to embrace proactive disclosure. If everyone in science were more transparent, the outliers would stick out like sore thumbs, and scientists would be smart to get out ahead of public demands for more transparency.
(I wrote about these issues last year when I was working at – and blogging for – the Union of Concerned Scientists.)
I finally caught up on my reading over the holidays and was pleased to examine a rich presentation of views on science communication from the University of Michigan.
The report is based on a conference the university held called “Academic Engagement in Public and Political Discourse,” which featured many of the leading lights in science communication, such as Dietram Scheufele, who smartly acknowledged that most debates involving science aren’t about facts, but about the “messy space” where science and values intersect.
Similarly, former Rep. Brian Baird (D-Wash.) challenged the participants to consider what flipping the conference’s title might mean and why the idea of “public and political engagement in academic discourse” tends to give us pause. It’s a thought worth contemplating: technology and democracy are making all institutions, including universities, more open to public participation – and public scrutiny. Academics are increasingly embracing that openness, along with greater transparency about their own values.
Andrew Hoffman, who directs the university’s Erb Institute, organized the conference, which also included keynotes from NOAA administrator Jane Lubchenco and the always-positive, always-inspiring glaciologist Richard Alley. Hoffman’s book on climate communication, which I’ve reviewed previously, is an excellent resource for scientists, students and citizens on how people think about climate science and climate policy.
John Abraham has a nice writeup in the Guardian about the American Geophysical Union’s science communication work. The organization’s Sharing Science initiative, in particular, is a growing hub for Earth scientists who are looking to convey their work with everyone from kindergartners to cabinet members.
I’ve worked with AGU staff for several years on member workshops and I was particularly struck this year by how ready scientists were to think through tough communications problems.
Like a lot of people who have run workshops with scientists, I’ve often found that I need to lead off by explaining why science communication is a good thing, why it doesn’t have to involve dumbing down your message and why it’s not up to somebody else (the media, the education system) to do it for you. More than once, I’ve had scientists ask me very critical questions about the very premise of doing science communication in the first place. Not that I minded — critical thinking and openness are among the things I love about the scientific community.
But over the past year…I just haven’t had to do that. Scientists increasingly see and feel the need for better, stronger, faster, cooler science communication. And I think it’s easier than ever – thanks to the Internet – to see what happens when ignorance wins out over reason, and conspiracy theories, misinformation and just plain goofiness on science-related topics proliferate.
Other societies are doing great work, too, of course, but I suspect AGU has been out front on a lot of communications work, in part, because Earth scientists are used to dealing with public controversies on two big hot-button topics: evolution and climate change. Importantly, it’s a society that’s open to lessons from other fields, too, including epidemiology, tobacco research and the history of science.
There’s a lot to learn when PhDs take on science communication. For scientists, societies are often the very first place they turn to for help. AGU is smart to create a “positive feedback” effect of its own when it comes to fostering accurate, effective science communication.