The following is adapted from a series of lectures delivered in Lithuania in May 2017 at the invitation of the U.S. embassy in Vilnius. The lectures were delivered seven times at various institutions and universities, slightly modified each time to fit the audience. The version below is based on the original text with minor adjustments and was initially published by the Foreign Policy Research Institute.
The Fake News Crisis
It seems we are in the midst of a crisis of information—or, rather, a crisis of misinformation. The epidemic has been dubbed “fake news,” a term initially coined by the news media to describe stories posted on the internet by websites of questionable integrity. The term has since been turned back on the media, in America at least, by the conservative political establishment—including current President Donald Trump, who levies accusations of “fake news mainstream media” against the likes of CNN, the Washington Post, the New York Times, and others. We now have a constant back and forth in the public sphere: the media accuses online outlets of being fake, and politicians, in turn, accuse the media of being fake.
To the media, the fake news moniker is a convenient straw man for an industry grappling with its own future. From January 2001 to September 2016, the newspaper publishing industry in the U.S. lost over half of its employment, falling from 412,000 jobs to 174,000. In contrast, employment in the internet publishing and web search portals industry increased from 67,000 jobs in January 2007 to 206,000 in September 2016. How do you argue for your existence when technology may be making you obsolete? You assert that your competition is “fake”—lacking credibility, untrustworthy, inferior—and that you are “real”: authentic, venerable, defending institutional norms and values. Indeed, Mark Thompson, president and CEO of The New York Times Company, made this argument at the Detroit Economic Club in December 2016. The topic of his address was fake news. His solution? Subscribe to the Times, Washington Post, Wall Street Journal, and others. “If you as a citizen are worried about fake news, put your money where your mouth is and pay for the real thing,” he concluded.
The American media’s preoccupation with fake news is, in part, a manifestation of an anxiety held by traditional news publishers as they grapple with the influence of Facebook, Google, Twitter, and other online media outlets. There was fake news long before modern newspapers came into being, and there will continue to be fake news long after newspapers are gone. Subscribing to the New York Times will not, in itself, resolve the problem. In 1807, U.S. President Thomas Jefferson complained about the unreliability of newspapers in a letter to John Norvell, claiming that “Nothing can now be believed which is seen in a newspaper” and that “the man who never looks into a newspaper is better informed than he who reads them.” It bears remembering that in Jefferson’s day, American newspapers had no pretense of being objective: on the contrary, they were mouthpieces for political interests, and, in fact, partisan newspapers helped Jefferson win the presidency over John Adams in 1800—often using tactics Jefferson later condemned. In the 1830s, the New York Sun perpetrated its famous “moon hoax,” claiming to have evidence of life on the moon. That was, perhaps, the epitome of fake news. It helped sell newspapers, though. And those familiar with the Spanish-American War will know that U.S. newspapers of that era were filled with falsified accounts of Spanish cruelty in Cuba, not to mention misleading reports about the explosion of the USS Maine. Those reports helped to propel the U.S. into war with Spain. Even before newspapers, in societies without widespread literacy, misinformation, false information, and “fake news” were circulated via rumors, oral storytelling, songs, letters, pamphlets, imagery, and reports.
In America, fake news is not confined solely to the printed word; for a generation, talk radio and television have spewed conspiracy theories and information of questionable integrity, so much so that they helped to spawn a satirical genre of “fake news” in the form of The Onion, The Daily Show, and The Colbert Report. False information, disseminated to achieve a political, ideological, financial, or comedic effect, is not new and will continue regardless of the medium.
Our generation’s crisis around fake news is really a technology story—similar to the ways that the printing press, penny press, radio, and television altered the information landscapes before us. Our technology is the internet, and it is the pervasiveness, virality, and speed of the World Wide Web that has engendered our urgency around today’s fake news. The web and social media are powerful tools that can spread fake news around the world in an instant. They can also spread the facts that refute it. As with any tool, the challenge is to educate the people who use it even as we improve the tool itself. In 2016, the Stanford History Education Group released an 18-month study of more than 7,000 American students that found the ability of students to reason about information on the internet was “bleak.” Students could not differentiate a story by a journalist from a story by an advertiser, and, on the whole, students did not take the time to investigate whether websites or social media accounts had biases or hidden agendas. The question, it seems, is not what to do about fake news, but rather what to do about citizens who lack the tools or skills to recognize it—or worse, prefer it because it aligns with, or reinforces, their viewpoints. To quote philosopher Michael Lynch, the internet is “both the world’s best fact-checker and the world’s best bias confirmer—often at the same time.”
Fake History: The Lesser-Known Crisis
The consequences of fake news are very real. Inaccurate or purposefully misleading information in the news media has the power to influence our politics, our governments, and the policies we make for the future. Over the long term, however, it is information about the past that has the power to shape our identities, our nations, our institutions, and our opinions of others. Any zealousness to combat “fake news” must be matched with an equal—if not greater—zealousness to fight “fake history.” “Fake history” is a long-term phenomenon that has been emerging before our eyes. And unlike the “real news” that counteracts “fake news” every day in the public sphere, the “real history” that counteracts “fake history” has gradually disappeared from public view.
To foreground this discussion, I would like to recall an op-ed by New York Times columnist Nicholas Kristof from February 2014. In the column, titled “Professors, We Need You!”, Kristof lamented that academics were largely invisible in the public sphere. Their invisibility was not, according to Kristof, a problem for academics; it was a problem for America. America—the policymaking establishment and the nation at large—needed the minds of academics to enact sensible policy and help solve pressing social problems, Kristof wrote. By sequestering their wisdom within academic journals and burying their ideas beneath inaccessible prose, scholars did a disservice to themselves, to policymakers, and to the American people. Society could only be enriched by the contributions of the country’s sharpest minds resolving the nation’s most pressing issues.
Kristof’s column publicized a sentiment that had been growing for the better part of a generation: namely, that academia had grown so specialized and jargonized that it ceased to have real bearing on the world. Academic history was no exception. Over the course of a generation, historians had veered toward hyper-specialization, researching topics of limited (or no) public appeal and becoming consumed with peer review and tenure evaluation at the expense of the public interest. Not coincidentally, historical scholarship all but disappeared from the public sphere. Historians published prolifically, but fewer and fewer non-experts were reading. The average sale of a scholarly monograph is now estimated to be a few hundred copies—at best. A 2014 report found that the average academic journal article was read in its entirety by about 10 people. Up to 1.5 million peer-reviewed articles are published annually; 82 percent of those in the humanities are never cited, nor are 32 percent of those in the social sciences. Brilliant as academic research may be, its visibility and influence on the world at large is, sadly, often minimal.
One outcome of this retreat has been a society-wide dearth of “historical literacy.” I avoid saying “loss” of historical literacy because I am not convinced we in America ever much had it, nor am I prepared to say that today’s levels of historical literacy in America are below those of previous generations. But much as print journalism is in a state of anxiety over the standards it is fighting to uphold, so too is history. Americans simply do not know—or do not place much effort into recognizing—“good” history from “bad,” soundly researched scholarship from popularized myth. As a result, what is meant by the word “History,” what the word signifies, what it encompasses, what gets presented as history, and, most critically, how it gets communicated are all shifting quickly beneath our feet. What history has come to mean since it coalesced into a profession in the second half of the 19th century—a discipline concerned with the careful use of evidence to make interpretive arguments about the past—is evolving right before our eyes.
To understand this necessitates that we distinguish between the past and history. The past comprises the infinite number of events that have occurred right before this very moment. As you read this sentence, you are in the present. Now? The reading of that previous sentence is in the past. The past—what human beings have done up to this very moment—remains a topic of infinite interest to human beings. The trouble is that the past and history, though not synonymous, are used interchangeably in today’s America. History deals with the interpretation of things that have happened in the past. More precisely, history deals with the interpretation of things that have actually happened. Historians call these things facts. Historians substantiate facts with evidence: documents or otherwise that prove with some certainty that events actually occurred. History is concerned with evidence: presenting it and explaining it. And therein lies the rub. For not everyone will look at facts and evidence and come up with the same explanation about how and why something happened as it did. Future historians may find copies of my writings and use them as evidence to suggest that indeed, during my lifetime, I was a writer who wrote about history. But those future historians may disagree about the reason that I wrote. Some may hypothesize that it was my career at the Library of Congress that motivated me. Others may say that, in fact, it was my family’s experience during the Holocaust. Each of these arguments would be based on evidence and offer an interpretation of the fact of my writing within the larger context of my life and the world around me. Thus, as John Arnold writes, “History is above all else an argument. Arguments are important; they create the possibility of changing things.” Since history deals with interpretations of past events grounded in evidence—and since new events are always occurring and new evidence is always being found—history is an ongoing, ever-changing argument that seeks to make some sense of the infinity of human actions that have occurred prior to this moment and have had a determinative effect on where we are today. Though the past and history are both valuable and interesting, history is much more than simply a recitation of what has happened in the past.
I would suggest that many Americans cannot differentiate between the past and history. This is evident on television, where in the U.S. we have a History channel whose programming now includes “Pawn Stars,” “Swamp People,” “Counting Cars,” “Ax Men,” and “The Marijuana Revolution.” A significant portion of the History Channel’s programming is, in fact, stories about people living in the present, with very little interpretation based on evidence. On the web, there are social media accounts such as @HistoryInPics that boast more than 4 million followers. @HistoryInPics posts photographs taken in the past without interpretation, with captions based less on evidence than on snark—and, in some instances, images that are actually fictitious and thus not history at all, because they are not interpretations of events that actually happened. Our web browsers have a history tab, which is simply a list of websites we have visited in the past (this may resemble a chronicle, defined by Webster’s Dictionary as a description of events in the order that they happened, but in fact mostly resembles what we call a list). Our Gmail accounts have an “archive” function—but a true archive operates according to deeply reasoned practices of archival integrity and is arranged and described according to strict archival standards. The U.S. National Archives disposes of more than 90 percent of the material it receives, whereas Gmail, contradictorily, asserts that users never have to delete anything. The “archive” function of Gmail is, in many ways, the opposite of archiving. It would more aptly be named “move”—i.e., you move your email from one virtual holding place to another.
There are reasons why web browsers use the term “history,” why Gmail uses the term “archive,” and why @HistoryinPics did not choose the handle @ThePastinPics. There is a reason why famous American writers David McCullough (a journalist by training), Michael Beschloss (who has a business degree from Harvard), and Doris Kearns Goodwin (a former U.S. government official) refer to themselves as historians and not, simply, writers. It is because the terminology of history carries connotations of authority, credibility, and reverence. Yet, ironically, much of what involves the past in mainstream American culture would not be considered historical—at least not as the history profession defines it. Not coincidentally, little of what involves the past in mainstream culture involves historians. Even when a historian’s work finds the public eye, the historian is usually not the person celebrated for it. It is usually a journalist or a playwright who steers the conversation—someone who has a public platform, access to an audience, and communicative credibility with that audience. Media personalities such as Glenn Beck and Bill O’Reilly each have best-selling history books that have far outsold those written by academic historians. Fortune 500 companies use old photographs—some in the public domain, some not—to drive web traffic, increase clicks, and maximize advertising dollars. Governments and elected officials frame their policies as “historic” as a means of pre-certifying their lasting significance. History is seemingly everywhere yet nowhere. This is the “history” of today in America. Consequently, it should be no surprise that many Americans lack the ability to distinguish good history from bad.
History—the analytical interpretation of the past based on critical assessments of evidence—is being transformed by our communications revolution. In a world where we receive the majority of our information from visual media such as the web, television, and mobile phones—and where we lend the creators and users of these platforms the authority to dictate what we should or should not pay attention to—what passes for history is too often bits of information about the past circulated on the web, stripped of context, devoid of analysis, and intended to advance a political, ideological, financial, or personal agenda. Much as this environment has transformed what gets called journalism and news, so too has it transformed what gets to be called history. It creates the conditions for “fake history” to thrive. If historians are to reclaim the definition of history and instill historical literacy in our populations, they will have to wade bravely and confidently into this complex communications environment. Many are already doing so, attempting to reclaim real history from fake history in the same way that journalists are attempting to reclaim real news from fake news.
Combating this will not be easy and will require a multi-pronged strategy. Many of the strategies being employed to identify and flag fake news stories can also be effective in identifying fake history. Nearly all of these strategies are directed by technologists: companies such as Google and Facebook, or vigilantes such as the Baltic elves, identify disinformation online and refute it—a reactive approach. But historians and scholars have a role to play, too: a proactive approach. We must saturate television, the web, and social media with real history and honest scholarship. It is not enough for historians to write a 400-page monograph and hope that elected officials or citizens will read it. Historians must adapt to the seismic shifts in communication and the new ways that people communicate via TV, the web, and mobile devices. Historians must win back share in the marketplace of ideas and the marketplace of people’s attention—and doing so will require changing how and where we communicate our scholarship, as well as replicating our scholarship across multiple media and freeing it from behind paywalls. In an era of constant demands on people’s time, minds, and eyeballs, the rules of the market unfortunately apply to history. And academic historians hold a very small percentage of the information market share in the U.S. Wikipedia, Google, Glenn Beck, Michael Beschloss, and @HistoryinPics have more—and taken together they have sizably more. To earn back market share, historians must communicate content in a manner appropriate to the medium and appealing to audiences beyond academia. More significantly, an entire generation of digital natives needs to be shown what comprises good history. They must be shown that good, honest scholarship that critically interprets past events is worth their time and attention—and that engaging with this material will make them smarter, more knowledgeable, and better able to shape the world for good.
Such is the daunting task historians are now facing.
How history gets communicated will have a determinative effect on the profession’s future. As much as we historians value monographs and journal articles, we must also embrace web videos, listicles, crowd-sourced knowledge, GIFs, memes, podcasts, blogs, social media, imagery, emojis, and more—and incentivize and reward historians for using these media. For this reason, I and others have proposed that history needs History Communicators and have created the field of History Communication. Just as science has Science Communicators, History Communicators are historians who step beyond the walls of universities and institutions and participate in public debates; engage in conversation with policymakers and the public; communicate history in a populist tone that has mass appeal across print, video, and audio; and advocate for policy decisions informed by historical research. History Communicators teach historical literacy: the critical analysis of sources, argument and counter-argument, how to evaluate evidence, and how to prove a hypothesis. Most importantly, History Communicators stand up for history against simplification, disinformation, and propaganda, and explain basic historical concepts that those of us in the profession take for granted. History Communicators are historians and skilled communicators. They are engaging. They are dynamic. They tell stories, wield metaphors and analogies effectively, are succinct, and are able to distill complex ideas into accessible language. They are skilled rhetoricians and scintillating orators, able to connect emotionally as well as intellectually with audiences. They are skilled at modes of persuasion, appealing not solely to logic and reasoning but also to the character, values, and emotions of audiences. They are able to communicate across multiple media and have a strong presence on social media. They are diverse in age, ethnicity, and appearance.
They are a different type of historian than the profession has ever produced.
A Call to Action
We have seen what occurs when there is a widespread lack of historical literacy in a population. It is not solely that government officials and political candidates speak carelessly or dangerously about the past, or that state actors use falsehoods to deliberately sow instability into a region. It is that citizens lack the ability, motivation, and intellectual self-confidence to disentangle myth from fact, ideology from honest inquiry. That is what must compel us to act. Much like fake news will never go away, fake history will not disappear either. Strongmen and demagogues will continue to use the past as a method to incite and divide society. We need real history that confronts it, communicated widely, popularly, and effectively—and the wisdom to know the difference between the two.
See, e.g., Krisna Ruette-Orihuela and Cristina Soriano, “Remembering the Slave Rebellion of Coro: Historical Memory and Politics in Venezuela,” Ethnohistory 63, no. 2 (April 2016), doi:10.1215/00141801-3455331.
Harnum stated that sales per monograph were 300–400 copies in 2007; estimates suggest the figure is even lower today.
Statistics taken from “Prof, no one is reading you,” The Straits Times, April 11, 2015, accessed June 2, 2015, http://www.straitstimes.com/news/opinion/more-opinion-stories/story/prof-no-one-reading-you-20150411.
John H. Arnold, History: A Very Short Introduction (Oxford: Oxford University Press, 2000), 13.
 Follower count as of June 19, 2017. Source: http://www.Twitter.com/HistoryInPics
One example: @HistoryinPics posted a photograph purporting to show John Lennon playing guitar with Che Guevara. The two are not known to have ever met, and the photograph was confirmed to be doctored.
This observation was noted at the 2015 American Historical Association annual meeting in New York and repeated at the 2016 AHA meeting. For sources, see Bookscan’s list of top history bestsellers for 2014; the list also includes Howard Schultz, CEO of Starbucks.