Journalistic Practice and Coverage of the Behavioral and Social Sciences

Sharon Dunwoody. Handbook on Communicating and Disseminating Behavioral Science. Editors: Melissa K. Welch-Ross & Lauren G. Fasig. Sage Publications, 2007.

Early in 2006, a team of psychologists at the University of Amsterdam published a research article describing a novel link between information processing and decision making. Dubbed the “deliberation-without-attention” effect, the finding suggested that what appear to be quick, intuitive judgments about complex problems may actually result in better decisions than those made in the heat of intense information gathering and processing (Dijksterhuis, Bos, Nordgren, & van Baaren, 2006).

Media organizations around the world turned the research into news via a timeless, global process in which journalists encounter information, recognize something worthy in it, and respond by interviewing and writing. What is remarkable about the episode, then, is not the process itself but the fact that it was applied to a piece of peer-reviewed social and behavioral science research. While journalists have always depended heavily on journals for science news, they have historically been indifferent to social and behavioral science research, making stories such as this one all too rare in the mass media.

In this chapter, I will reflect on why this particular piece of research defied the odds and became news, as well as on a set of journalistic practices that govern the selection and structure of science news generally and social and behavioral science news in particular. In the latter half of this chapter, I will dwell on one particular judgment bias that I think has a significant impact on media coverage of social and behavioral science scholarship: a tendency for journalists to fail to see social and behavioral science phenomena as amenable to systematic exploration, a failure that leads them to construct narratives that ignore issues of evidence and context.

I will employ the long, rather awkward term social and behavioral sciences in this chapter as it captures not only the behavioral science focus of this text but also a literature that looks more generally at media coverage of the social sciences.

The “Deliberation-Without-Attention” Effect as News

Journalists do not monitor social and behavioral science journals on a regular basis. But science journalists do attend closely to a small number of major science and medical journals, among them Science magazine. It was in the February 17, 2006, issue of Science that the decision-making article appeared.

Science attends only sparingly to the social and behavioral sciences, so most journalists who received early access to articles in the February 17 issue were not looking specifically for a good social and behavioral science piece. (As is the case with other prestigious journals, Science makes an issue’s articles available to reporters several days in advance to give them time to interview sources for stories. Along with this access comes an embargo rule, requiring media organizations to hold their stories until the day of journal publication.) But the “deliberation-without-attention” story would have caught the attention of journalists for a number of reasons:

  • Although “just” a social and behavioral science story, it was one legitimized by publication in one of the nation’s top peer-reviewed journals. The rarity of such articles in Science strengthens the signal that this work must be important.
  • The journal made the piece even more visible to reporters both by briefly summarizing its findings in its “This Week in Science” section in the front of the journal and by generating a short news article itself, written by one of the magazine’s news reporters (Miller, 2006).
  • The topic of decision making and judgment had become culturally salient with the 2005 publication of Malcolm Gladwell’s Blink, a book about the psychology of speedy decisions (Gladwell, 2005). Although trade books never enter the information environments of many Americans, the Gladwell book produced some buzz among journalists and would have been known to individuals of higher socioeconomic status, who can afford to build rich information environments for themselves.
  • The scientists’ results violated our commonsense expectations, and such violations function as compelling news pegs for journalists. We typically regard decisions buttressed by effortful information processing as better than intuitive judgments, but this work found the latter to trump the former. In a series of studies summarized in the article, participants received information in the service of both simple and complex decisions. The investigators then varied the length of time between pondering and making a decision by introducing the laboratory equivalent of “sleeping on it.” Some participants were asked to make decisions immediately after ingesting information (“conscious thinkers”), while others were distracted for a period of time before making a decision (“unconscious thinkers”). The conscious thinkers made better decisions when the choices were simple, but—and here’s the counterintuitive part—the unconscious thinkers were the better decision makers in complex situations.

The fact that the psychologists were at the University of Amsterdam presented interviewing hurdles that may have deterred some journalists. But the research was picked up by a number of prestigious media outlets, including The New York Times, Discover magazine, and The Washington Post. The widespread publicity thus stemmed from a confluence of sophisticated signaling by the scientific culture in one of its top journals and topic characteristics that convinced journalists that the report would be both interesting and easily accessible to general readers. The popularized accounts not only communicated with lay audiences but probably put this study on the map among social and behavioral scientists as well, a point I will return to at the end of this chapter.

So how does a piece of social and behavioral science research find its way into the mass media? More specifically, what prompts a journalist to act, and what influences decisions regarding sources and story structure?

A relatively robust literature in the United States and elsewhere has explored dimensions of journalistic behavior (see, e.g., Berkowitz, 1997; Clayman & Heritage, 2002; Cohen & Young, 1981; Fishman, 1980; Gans, 1980; Tuchman, 1978; Tunstall, 1971), with much of modern scholarship in the United States now being captured by newer niche journals such as Journalism Studies and Journalism. The literature focusing specifically on media coverage of the social and behavioral sciences is much more modest (see, e.g., Goldstein, 1986; Haslam & Bryman, 1994; Rubinstein & Brown, 1985; Stocking & Dunwoody, 1982; Weiss & Singer, 1988). While referring the reader to these more expansive reflections on what journalists do and why they do it, I will opt here for a focus on a subset of news judgment factors that I think are particularly relevant to coverage of the social and behavioral sciences. The article about the deliberation-without-attention effect offers a good illustration of some of those factors, and I’ll point out those parallels along the way. At the end of the chapter, I will offer some suggestions for enhancing the volume and quality of behavioral science stories in the media through productive interactions with journalists.

The Reactive Nature of Story Selection

Journalists feel strongly that journalism should avoid creating news and, instead, should work to reflect real-world occurrences. That “mirror” norm encourages the journalism business to follow on the heels of a process rather than try to foreshadow its twists and turns.

This reactive posture becomes apparent when researchers track back through the unfolding of major news stories. For example, after the explosion of the space shuttle Challenger on January 28, 1986, many journalists and scholars rummaged through years of space shuttle stories to look for accounts that may have presaged the catastrophe. Although a few reporters had indeed explored the relatively high risks of shuttle flight, most reportage about NASA and the program remained determinedly optimistic and apparently oblivious to the risks. Journalist Gregg Easterbrook, then a national correspondent for the Atlantic Monthly, reflected in a panel discussion soon after the accident that “the press never covered the fundamental flaw of the shuttle program…. In the days when it was possible to make a change, there wasn’t enough coverage at all of the fact that the shuttle was flawed as a concept from the beginning. It was too great a risk” (Media Coverage of the Shuttle Disaster, 1986, p. 3).

Once the explosion had occurred, coverage of the aftermath, including enterprise reporting that uncovered the likely cause of the explosion (O rings made brittle by cold weather) and laid bare a tragic trail of human decisions, won numerous awards for journalists, among them the 1987 Pulitzer Prize for National Reporting by New York Times reporters. But that exquisite work could only come after the explosion, not prior to it.

This reactive posture makes journalism susceptible to influence by sources. Indeed, one could argue that reactivity enables skilled sources to construct the environment within which a journalist must then operate. And, although journalists express an acute understanding of and distaste for this possibility, they also acknowledge its inevitability. When press officials at the University of Utah called a press conference in Salt Lake City in March 1989 to announce the “discovery” of cold fusion, even journalists skeptical of the claim hastened there. Regardless of its potential validity, the fusion claim was going to be big news, and journalists understood that covering the announcement was part of their responsibilities.

Source influence can be much more subtle, of course, such that it goes unnoticed by reporters. For example, numerous studies have found that, while journalists are inundated with story ideas and reject most of the suggestions that come their way, they still behave as if those ideas constitute the universe of story possibilities. Put another way, while journalists may use proportionately little of the information provided to them by sources, their selections will still be bounded by, and will proportionately resemble, the available pool. In one study of the information coming across the desks of science and health writers, for instance, Dunwoody and Kalter (1991) found that the science content categories reflected in the materials retained by the journalists strongly reflected the proportions of that content in the original materials, results consonant with a number of other studies (see, e.g., Dunwoody & Shields, 1986; Gold & Simmons, 1965; Whitney & Becker, 1982). Power to construct the universe of story possibilities is great power, indeed.

Sources also play a role in determining what about a story possibility is salient. Fishman (1980) was among the first to suggest that journalists readily adopt the frames of reference of their sources, using source reactions as guides to specific story features. For example, Fishman found that journalists covering local governmental meetings tended to privilege the items that captured the attention of the officials involved and ignored those parts of the agenda that were given short shrift by these individuals.

The assimilation of source interpretations is greatly enhanced when the status of sources is high, a characteristic of many science and medical sources. Reporters will accept and employ the frames of reference of these high-status sources even when doing so delays their work. For example, science journalists readily accept scientists’ arguments that new research should be subjected to peer review before being publicly disseminated, even when it means that the journalists must delay writing about a research project for months or even years.

It is this acceptance of scientists’ word for quality that makes signaling via journal publication so powerful a predictor of story choice. Thus, when Science magazine highlighted the publication of the “deliberation-without-attention” studies in its pages, journalists responded with enthusiasm, not skepticism. Through such mechanisms, the scientific culture has a pronounced impact on media representations of science. Although many scientists profess feelings of powerlessness in the grip of a runaway journalist, the more likely victim, frankly, is the journalist. There are always exceptions, of course, but the version of reality promulgated by the media is more closely aligned with scientists’ reality than with the interpretations of others (Dunwoody, 1999).

How do social and behavioral scientists fare in the story selection and structuring process relative to other scientists? Admittedly, many journalists place social and behavioral scientists lower on the status rungs than they do other types of scientists. This is changing, I think, as the quality of good social and behavioral science scholarship becomes more obvious to the culture at large. But it was not that long ago, in a study of science journalists’ news choices, that I found these reporters to routinely disparage social and behavioral science work. While the journalists I studied readily wrote stories about social and behavioral science topics, acknowledging the relevance of the topics to their audiences, they openly worried about the science’s quality. One journalist exiting a press conference featuring new research into the link between television content and aggressive behavior in children indicated that, although he would write the story for his newspaper, he considered the research “garbage science” (Dunwoody, 1986b, p. 72). He was offering a critique not of the specific research project being discussed in the press conference but of the “soft sciences” generally. He was not alone in his feelings at that time.

But even in today’s world, where the social and behavioral science disciplines are accorded more respect than in the past, journalists cannot cover what they don’t see. And they do not monitor social and behavioral science research journals. Thus, to become a part of the universe of story possibilities, journals and scientists focused on social and behavioral research must actively disseminate information, a process that I will return to at the end of the chapter.

What to Do about Truth Claims …

A major part of the normative toolkit for a working journalist is directed at managing one significant problem: Journalists cannot determine what is true.

In the course of gathering information for stories, journalists are confronted with countless truth claims, many of them contradictory. One scientist declares that his data suggest the presence of a genetic tendency toward violent behavior while others pooh-pooh that possibility. Respected scholars populate both sides of the debate over whether the death penalty deters would-be killers. Policy specialists duel over the effects of charter schools on learning among disadvantaged children.

Despite the fact that journalism’s societal job is to provide the public with the information needed to make informed decisions about these and other issues, journalists do not declare a truth denouement, for a number of reasons. For one thing, the typical journalist has neither the training nor the time to sift through truth claims to arrive at a judgment. Many scientists use this problem as a basis for calling on journalists to be formally trained in the science they cover. But such a call is unrealistic, as journalists cover many sciences, not just physics or the social and behavioral sciences.

A second problem with the journalist who performs triage on truth claims is that readers and viewers are not prepared to accept the assertion that one person’s truth claim may be more valid than another’s. The official public role of a journalist in the United States is that of neutral transmitter, and any reporter who steps outside that role as a news reporter is open to charges of bias.

Illustrative of this is the beating New York Times reporter Gina Kolata has taken over the years for, among other things, her attempts to privilege the best available evidence in her stories (Marshall, 1998). For example, in the mid-1990s, Kolata reported that “the most definitive study yet of the health effects of silicone breast implants has failed to find any association between the implants and connective tissue diseases” (Kolata, 1995, p. A18), a finding that flew in the face of much anecdotal evidence at the time (for a useful summary of this controversy, see Vanderford & Smith, 1996). Kolata argued that the new data constituted the best available evidence to date in the search for a breast implant/disease link. In an article in The Nation, however, journalist Mark Dowie praised media coverage of the controversy that attempted to offer a balanced view and interpreted this and other Kolata stories as evidence that the journalist (and her newspaper) too readily sided with corporate views (Dowie, 1998). Experts empaneled by the Institute of Medicine some years later to look at all the breast implant/health evidence validated Kolata’s weight-of-evidence judgment: Although implant ruptures and subsequent infections present a “primary safety issue,” noted the panel, it could find no evidentiary basis for the claim that implants cause disease (Bondurant, Ernster, & Herdman, 1999, p. 5).

The bottom line, then, is that journalists do not possess the time, skills, or cultural permission to sift through contested claims in order to declare one “more true” than the rest. What, then, is a reporter to do? Over the decades, journalism has evolved two surrogates for validity judgments, each carrying advantages and drawbacks for the journalist and his or her sources. One is objectivity; the other, balance, calls on the reporter to offer a variety of points of view. I’ll look briefly at each.

Objectivity

If you cannot know which message is true, argues the objectivity norm, then at least accurately transmit the messages that you gather. An objective account, then, is one that faithfully captures the factual claims and opinions of a source, regardless of the validity of those claims and opinions.

On the positive side, objectivity privileges accuracy and focuses both journalist and source on that characteristic of messages. Accuracy matters very much to scientists (see, e.g., Carsten & Illman, 2002; Matas, el-Guebaly, Peterkin, Green, & Harper, 1985; Singer, 1990), and objectivity makes it a priority for journalists as well.

The disadvantages of objectivity are substantial, however. Most fundamentally, objectivity sanctions a journalist’s decision to ignore validity altogether. A story thus can communicate something that is just plain wrong and still meet the criteria for objectivity. The norm nurtures conundrums such as the one highlighted in the 2005 film Good Night, and Good Luck, the story of famous broadcaster Edward R. Murrow’s decision to abandon objectivity and unmask Sen. Joe McCarthy’s Communist-baiting behavior in Congress. McCarthy’s attempts to charge a variety of Americans with treason and “communist subversion” in the 1950s received copious press coverage despite journalists’ strong reservations about the validity of the claims. For most journalists at the time, though, objectivity required that they transmit without judgment.

Balance

If you cannot know which message is true, argues the balance norm, then at least provide as many points of view as possible in your story. That means the truth will be in there somewhere.

The balance norm insists that reporters remain keenly aware that there can be different versions of reality, a good thing if media accounts are to have a chance of representing the complexity of belief systems. But balance in a world where objectivity prevents the journalist from making validity judgments is fraught with peril. When a reporter does seek a variety of viewpoints, the easiest way to play a neutral transmitter role is to give each of those viewpoints equal space.

The result is the “she said/he said” story, with sources often contradicting one another and leaving the reader/viewer with the feeling that “nobody knows what’s true!” Journalists often make the situation worse by attempting to display the variance in expert beliefs not by sampling across the spectrum of beliefs but by capturing beliefs at the extreme tails.

The objectivity and balance norms are so strong that journalists will often obey them despite their own best judgments. An iconic study illustrating this point was conducted some years ago by communication researcher James Dearing (1995), who explored media coverage of scientific outliers such as biologist Peter Duesberg, who contends that HIV is not the cause of AIDS, and cold fusion “discoverers” B. Stanley Pons and Martin Fleischmann. Consistent with the expectations of objectivity and balance norms, Dearing found newspaper coverage of these issues to be more supportive than critical of outlier claims. But tellingly, when questioned about their own beliefs about the likely validity of the claims, the journalists indicated that the outliers were probably wrong! This meant, noted Dearing, “that most of these journalists did not, at the time they were writing their articles … consider these mavericks to be credible” (p. 355). But the pressure to withhold judgment about what is true overrode those opinions.

Objectivity and balance are coming under increasing criticism within journalism (see, e.g., Cunningham, 2003; Mooney, 2004). But the norms are exceptionally robust and will not decay easily. In recent years, both journalists and scholars of journalism have called on reporters to adopt more of a “weight-of-evidence” approach to coverage of contested claims (see, e.g., Dunwoody, 2005; Mooney & Nisbet, 2005). Such an approach would not require journalists to assess the validity of claims themselves but, rather, to share with their audiences what the bulk of experts believes to be true at that moment.

The closest that most journalists have been willing to come to this weight-of-evidence practice is to seek the counsel of relevant experts either to provide additional voices in a story or to serve as confidential advisers to the journalists themselves as they evaluate the newsworthiness of a scientific claim. Reporters recognize this norm as advice to avoid the “one-source story,” and the strategy is nicely captured in many of the stories written about the “deliberation-without-attention effect.” For example, reporter Gareth Cook of The Boston Globe included in his account reactions to the research from two other experts, a psychologist who reflected positively on the work and a business school professor who registered skepticism (Cook, 2006).

While still far afield from a validity judgment or even from a weight-of-evidence assertion, the practice of seeking feedback from multiple experts is an important journalistic tool. Some years ago, while I was observing science writers at a large meeting, I witnessed an effort not only to seek the advice of “unbiased others” but also to share that advice among journalists who work for competing organizations. At a press conference, a social scientist argued that his analyses of a number of large survey data sets had isolated a possible genetic precursor to violent behavior. Clearly uncomfortable with the finding, science journalists streaming out of the press conference gathered in the hallway and decided to contact a variety of other social scientists by phone in an effort to vet both the principal investigator and his findings. An hour or so later, the journalists regrouped to share what they had learned. They then made individual decisions about whether to write the story.

Are there things that social and behavioral scientists can do to enhance the likelihood of weight-of-evidence treatment of research in their own areas? To a limited extent, the answer is yes. I will return to this at the end of the chapter.

Journalists Often Frame Social and Behavioral Science as … Something Else

Back in the 1980s, in what remains an important piece of baseline science communication scholarship, researchers Carol Weiss and Eleanor Singer employed funds from the Russell Sage Foundation to study coverage of the social and behavioral sciences in major U.S. media. After carefully defining what they meant by social and behavioral science—sociology, political science, economics, psychology, anthropology, criminology, demography, epidemiology, and behavioral medicine—the scholars looked for stories published in major newspapers, news magazines, and network news over the course of 5 months in 1982. They also interviewed journalists who wrote some of the stories and social scientists who were quoted in them.

One of their many findings that stood out for me and will drive this section of the chapter is the following: Journalists rarely define a social and behavioral science story as being about “science,” and that failure, in turn, means that they do not apply the evidentiary standards of science to the topic. This became apparent to Weiss and Singer (1988) early in their study:

In the course of pretesting our interviews, we talked to a number of reporters who had just written a story which we classified as social science, and we asked them if this was the first time that they had written stories about social science. Uniformly they were taken aback; some seemed to think that we were talking gibberish. In their minds the current story was not about social science at all. They were writing about crime or business or politics or education. That they were reporting the results of research on the topic or citing the remarks of a social scientist was of little consequence. It was the topic of the story that provided the frame of reference for their work. (pp. 55-56)

Put another way, journalists write about social and behavioral science topics all the time but rarely identify the issue or event in their sights as “scientific” and amenable to systematic exploration. The absence of such an interpretive frame has large consequences for the nature of the story, as the frame will determine the types of sources sought and the nature of the questions asked.

Let’s say a journalist pitches a story to her editor about two-career couples whose work requires them to live apart. For most reporters, the topic sounds like a great lifestyle story about an increasingly familiar situation of general interest. If our journalist frames the story idea that way, the frame will lead her to seek out a few such couples through word of mouth, interview them, and write a story grounded in vivid, anecdotal materials derived from the experiences of those families.

But what if she were to define the story, initially, as one that has been the focus of research? That frame would lead her to the peer-reviewed literature to see what social scientists now “know” about the experiences of these couples. She would likely contact a couple of scientists for interviews, and there is even a chance that she would then locate couples whose situations illustrate some of the main patterns in the data with which she has become familiar. What a difference a frame makes!

Journalists’ blindness to the scientific, evidentiary dimensions of social and behavioral science topics has caught the attention of an increasing number of journalists and scholars. Veteran journalist and analyst Michael Massing bemoaned the problem in an article that took issue with the press’s coverage of popular culture. Major news outlets such as The New York Times seem suffused with stories about the new, the dramatic, the off-the-wall aspects of our culture, complained Massing, but “in the process, the Times has neglected a critical aspect of pop culture—its effects on society” (Massing, 2006, p. 28). Survey after survey indicates that Americans indeed worry about the effects of a large range of cultural changes—entertainment, clothing choices, new electronic gadgets—on children and society at large. Notes Massing, “The journalistic questions such fare provokes seem endless, and they extend far beyond the usual ones about sex and violence into the realms of sociology, politics, and religion” (p. 30).

Similarly, mass communication researcher Michael Slater, in an analysis of media coverage of alcohol-related risks, found that journalists rarely dipped below the surface of a topic to explore the causal waters underneath. Among the health catastrophes for which alcohol serves as a major precursor, he found that only motor vehicle accidents were covered in ways that gave readers some sense of alcohol’s role; other alcohol-related risks, such as injuries stemming from violent crime, were covered as specific, vivid episodes that seemed to occur in a causal vacuum (Slater, Long, & Ford, 2006).

Some years ago, a California newspaper reporter realized that media coverage of violence was dominated by discrete, episodic stories that informed readers about the tragic events themselves but that failed to place the violent acts in any type of interpretive framework. Journalist Jane Stevens came to a conclusion similar to that of Weiss and Singer years earlier: A major reason for this lack of context was definitional. “I discovered that the answer lies in the way journalists report crime,” noted Stevens. “We do so only from a law enforcement and criminal justice standpoint” (Stevens, 1998, p. 38). In other words, Stevens felt that the initial interpretive framework adopted by journalists covering violent crime rendered the reporters unable to “see” the epidemiology of violence and thus made them oblivious to social and behavioral science research that could help explain the occurrence of such dramatic and occasionally horrific acts.

In response to that problem, Stevens joined forces with the Berkeley Media Studies Group and communication researchers at the University of Missouri-Columbia to examine patterns of newspaper crime reporting and then, through a variety of strategies, to change the reporting of violence in ways that would introduce evidentiary issues and context that could help readers develop an understanding of why such acts occur. Stevens wrote a book, Reporting on Violence, to provide journalists with data and with ways to gather such information as the economic and psychological consequences of different types of violence, methods being developed to prevent violence, and information about how communities can implement prevention approaches (Stevens, 1997). The collaboration also designed and implemented onsite workshops at five metropolitan newspapers to introduce reporters to ways of thinking beyond the episodic, violent event. The trainers hoped to lead journalists to more evidentiary thinking about the causes and byproducts of violent acts, which in turn could lead to stories that reflected more thematically on the roles of violence in society (Dorfman, Thorson, & Stevens, 2001; Thorson, Dorfman, & Stevens, 2003).

Did these efforts to transform the interpretive framework for violent press coverage from an episodic crime focus to a more thematic public health focus bear fruit? It is hard to tell. The project directors summarized journalists’ reception to their workshop ideas as one of “cautious interest” (Dorfman et al., 2001, p. 417) and acknowledged that both normative and technical obstacles loomed for reporters and editors considering a more thematic, evidentiary approach to crime news. In her revised second edition of Reporting on Violence (available at http://www.bmsg.org), Stevens reflects on a small number of journalists who embraced the project material and went on to write a steady stream of contextual stories for their newspapers.

The Legitimizing Effect of Media Visibility

Before embarking on a final section reflecting on strategies that social and behavioral scientists can employ to enhance their media visibility and improve coverage of their research, it is important to ask the following question: Why would a scientist want to go out of her or his way to interact with journalists? Historically, scientists have worried that public visibility would create more problems than perks. And for the social and behavioral scientist of 25 years ago, that worry would have been justified; scientists were often punished by their peers for “popularization” activities (Dunwoody, 1986a). But today, scientists of all kinds are becoming increasingly sophisticated popularizers, for a number of reasons:

  • Quid pro quo. For many social and behavioral scientists, federal support of their research confers a responsibility to share results with various publics.
  • Education. Increasing numbers of scientists are embracing the need to participate in informal science education in an effort to increase the general level of science literacy in the United States. The National Science Foundation’s renewed emphasis on the “broader impacts” component of its grant proposals is illustrative.
  • Dollars. There is plenty of anecdotal evidence on behalf of a relationship between level of public visibility and level of research resources. Systematic evidence is less obvious, but—like all humans—scientists find the vivid anecdotes compelling.
  • Legitimization. Perhaps most important, social and behavioral scientists are beginning to understand that public visibility for their work confers a kind of cultural legitimacy that can be valuable.

As a society’s primary signaling devices, the mass media make some things loom as more important than other things. Put another way, stories affect public judgments of salience even in the absence of a deeper understanding of issues. Today, for example, I encountered a major story in the “Science Times” section of The New York Times about recent research on the neurological costs of teenage drinking (Butler, 2006). Binge drinking is a salient issue for any university professor, but this piece stopped me in my tracks with its reflection on research detailing the extraordinary physiological cost of heavy drinking on nerve cells. Since reading it, I have been pondering how best to share the information with undergraduate students that I will encounter in my fall classes. Such is the power of media signaling.

Even more remarkable, though, is the media’s ability to serve as a legitimizer within science. Scientists, it turns out, react to media accounts just like the rest of us. A piece of research covered by the media seems more worthy to other scientists, even those within the discipline.

A couple of studies bear this out. Both find an effect of media visibility on the number of citations to a study in the peer-reviewed literature. Phillips, Kanter, Bednarczyk, and Tastad (1991) compared citations to two groups of studies published by the New England Journal of Medicine (NEJM); one group had been covered by The New York Times, and the other group—matched on all other variables—had not. In the first year after publication, the typical research study covered by the newspaper received 73% more citations than did the NEJM articles not so covered.

More recently, Kiernan (2003) looked for the same pattern in a larger group of journals and media outlets and found that peer-reviewed articles that garnered media visibility earned 22% more citations in the literature than did articles that remained culturally invisible. Kiernan also found that the legitimizing effect of media visibility was not limited to coverage in The New York Times but extended to a wide range of media outlets.

Strategies for More Effective Media Interactions

The mass media environment is as complex as any social structure, so social and behavioral scientists pondering a foray there need to think carefully about what they want to accomplish and how best to do so. Below, I will return to three journalistic “realities” emphasized in this chapter and will offer a few tips for coping with them.

Improving the Social and Behavioral Science “Signal” in the Mass Media

The reactive nature of journalistic work and the strong likelihood that journalists will not go looking for information about social and behavioral science mean that much good research will remain invisible unless scientists and their organizations take a more active role in dissemination. Two important components of this process are (1) putting information where good journalists will see it and (2) maintaining credibility in the process. I bring the second point into the discussion because, while journalists acknowledge the usefulness of sources coming to them rather than vice versa, they also react negatively to marketing efforts perceived to be “over the top.” One senior science writer who got increasing numbers of calls from scientists seeking coverage of their research confided that her first question to each caller had become, “When is your grant up for renewal?” Here are some tips for improving the signal sent to journalists:

  • Make publication in a high-quality, peer-reviewed journal the catalyst for the story you have to tell. For knowledgeable journalists, this publication point remains the default signal that good science is ready to be transformed into a story. Disseminating at an earlier stage is a credibility risk. This is not to say that other signals about one’s research are illegitimate. Scholars present at conferences, and attending journalists may follow up. Reports from respected bodies (The National Academies comes to mind) are also viewed as highly credible by reporters. But for skilled journalists, publication in a peer-reviewed setting is still the gold standard. Remember, journalists cannot judge the validity of what you have to say. In that situation, savvy journalists rely on the same standard as do scientists: peer review.
  • Aim for publication in top journals outside your field. To land in Science, Nature, or the Journal of the American Medical Association will send a much greater signal to journalists than will publication in even the most prestigious journal in most of our disciplines.
  • Call a journalist to tip her or him off to your good work only if you have established a productive relationship with that individual. Remember, cold calls from a potential source are a bit unnerving for a reporter. He or she will immediately begin to wonder about your motive, and that does not help your cause. On the other hand, if you have worked with a reporter in the past and are selective about when and why you employ this strategy, the payoff can be substantial. A scientist on my own campus whose work was scheduled to be the cover article in an issue of Nature called a reporter at a major newspaper—someone he had met in a previous venue—to let her know. The resulting story unleashed a virtual avalanche of attention to his work.
  • Enlist the help of your university or company’s public information officers (PIOs) to prepare and disseminate a plain-English version of your research. Most of us work in organizations equipped with skilled writers who can cater to journalists’ needs much better than we scientists can. You may need to signal to these individuals that you have something useful to disseminate, so don’t wait for them to contact you. But once they are energized, their skills will be invaluable. Fellow chapter author Earle Holland, himself a distinguished public information officer, devotes his chapter in this volume to working with university PIOs.
  • Encourage your disciplinary society to become more active as an information disseminator. Remember that most of our social and behavioral science journals are virtually invisible to journalists. We can mitigate that problem to some extent by developing communication strategies at an aggregate (in this case, organizational) level. Many societies now issue press releases when significant and interesting research is on the cusp of publication in their journals.
  • Encourage both employers and disciplinary organizations to consider placing their press releases on such omnibus Internet sites as EurekAlert (http://www.eurekalert.org), an online global news service operated by the American Association for the Advancement of Science that has become a major foraging venue for journalists, who receive privileged access to the new material there.

Enhancing the Likelihood That a Journalist Will Apply a Weight-of-Evidence Approach to Your Research

Although the strategies below will not interest the scientist who believes she or he has uncovered “truth” in her or his work, I continue to argue on behalf of encouraging reporters to think more contextually, more thematically about our research and about the uncertainties so central to exploring complex processes. Here are some ways to accomplish that in an interview:

  • Regardless of whether or not the reporter asks (and she or he probably will not), share with the individual your assessment of the extent to which others in your field agree or disagree with your findings. Make it clear that opinions vary and that such variance is a natural outgrowth of scientific investigation.
  • Provide the reporter with names of other credible scientists who agree with you, as well as with names of those who disagree with you. Urge the reporter to contact some of these individuals. (By the way, this tactic will do wonders for your credibility, as it emphasizes your interest in truth over advocacy of your point of view.)
  • Identify outliers and recommend that the journalist avoid them. The reporter will appreciate your efforts at full disclosure and might even take your advice.
  • Offer to serve as a sounding board for future questions and issues. It is important to communicate your interest in helping the reporter grapple with the larger validity issues of your field. You may be more credible in this regard if you indicate a preference for acting as a confidential source of information.

Helping Journalists to Define Behavioral and Social Science Topics as Ones Amenable to Systematic Exploration

This is perhaps the most intractable problem of all, as an initial judgment of what a story is about will drive whether a journalist contacts you in the first place! In addition, Stevens and colleagues’ attempts to shift the interpretive frameworks of journalists who cover violence demonstrate how difficult it is to modify existing mental maps.

  • If you are contacted in the course of a developing story, speak from an evidentiary base and make that base obvious to the reporter. At the least, this will alert the journalist to the possibility of other interpretive frameworks for her or his story.

This will be only intermittently successful, of course. I recall an instance some years ago when an editor at The New York Times asked me to write a column for the Sunday newspaper reflecting on the impacts of science television programming on viewers. I naively assumed that he was open to an evidentiary piece and generated a column reflecting on how little we communications scholars knew about those message effects. At that time, for example, the prevailing operational definition of “high impact” for a NOVA program was a large audience. No one had a clue about whether anyone learned from the programs or about the possible effects of such learning on opinions or behaviors.

The editor was flummoxed by the column. He had anticipated that I would call people in the entertainment industry, the documentary production world, and perhaps even a few superstars such as the celebrated British broadcaster Sir David Attenborough to gather their reflections on the power of their programming. “So,” he said slowly and carefully, “we don’t really know the effects of these programs on audiences?” “Correct!” I responded, happy to have communicated my main message so clearly. In response, he killed the piece.

  • Put energy into helping future journalists and media audiences build early interpretive frames for social and behavioral issues as topics amenable to evidence. Many of us spend a significant amount of time in the classroom, and that is an ideal location for this construction effort. For example, if your university has a journalism or communication department, consider building a few bridges between your discipline and budding reporters. As a result of being invited to speak to faculty and students in a variety of science units, I have become a part of a number of fruitful collaborations that provide communication training both for my journalism students and for students in an array of science disciplines, including the social and behavioral sciences.
  • Infiltrate “public” venues such as the annual meeting of the American Association for the Advancement of Science (AAAS). The AAAS meeting attracts thousands of scientists and hundreds of journalists. Staging a symposium on a timely public issue and mustering speakers who can provide a significant evidentiary base will go far. Again, our narrower disciplinary meetings will be virtually invisible to the individuals we are trying to reach.

Do these strategies actually work? The answer is probably yes, although we have little empirical proof for most of them. Although research dedicated to exploring science communication questions has proliferated in both quantity and quality in recent years (see the overview by Weigold, Treise, and Rausch), the community of researchers remains too small to establish a thorough and empirically well-grounded picture. Still, the available scholarship on message attributes and effects is well worth scientists’ time and attention. Like many sources, scientists currently waste too much time rediscovering patterns and reinventing strategies. Readers will find some of this information in this very volume by going to the Stocking and Sparks discussion of communicating uncertainty, reflections on the Internet as a new information channel by Martland and Rothbaum, and the concluding chapter by Welch-Ross and Fasig that explores research on audiences for behavioral science information.

Although I have nothing better than anecdotal data to bring to bear, I have a strong sense that media organizations are increasingly willing to see social and behavioral research as high-quality, legitimate news pegs. Society is increasingly willing to acknowledge that most—if not all—important problems that we face are overwhelmingly behavioral or social in nature. As a result, journalists are increasingly looking to social and behavioral scientists for answers. The more actively and skillfully scientists share their good work with journalists, the more quickly the legitimization of behavioral and social science scholarship will gather a head of steam.