The Future is Watching
We’re tweeting and posting and statusing ourselves into the history books
In an episode of the science fiction cartoon “Futurama,” the main character, Fry, and his 30th century friends visit the Moon. The delivery boy from the 20th century finds that the moon landing sites he cherished as a kid have now been “Fungineered” into cheesy Disney-esque rides that tell fractured fairy tale versions of the sites’ historical significance.
On the ride, narrators tell the gang that “no one really knows when, where or how man landed on the moon” (“I do!” yells Fry), but “fungineers” imagine it involved whalers with harpoons.
Though not a serious prediction of the future, the scene touches on an important issue: Will future historians see our times through a fractured lens based on lost or erroneous knowledge? Or are they more likely to find themselves confused and adrift in the endless “cloud” of digital information stored on the Internet?
In modern times, scholars write their historical tomes based on notes taken “back in the stacks” of libraries, archives and museums. They use photographs, books, letters, papers, court documents, deeds, diaries, autobiographical sketches, genealogical documents and old family Bibles, war records and old newspaper clippings. They sit for hours spinning through reels of microfilm or scanning microfiche plates. They methodically jot down each source for footnotes and bibliography. They compare and contrast accounts and search for corroboration. Some literally go the extra mile by following in the footsteps of their subjects, to experience for themselves what it must have been like and to see how time has changed the landscape. Others conduct lengthy interviews with primary sources who reach back in time with fading, imperfect memories.
But what kind of history will future researchers pen about our time? How will they get their information? Will today’s culture of social networking and “over-sharing” help or hinder future historians? How will Facebook postings, blogs, YouTube videos and MySpace pages be interpreted in the future? Will future writers even have the critical thinking skills necessary for debunking hoaxes and discerning what’s been plagiarized?
A Vast Digital Scope
The Internet is a modern marvel, and even though it’s still a fledgling, it’s growing up fast. Nearly 20 years after it first burst onto the popular scene in all its dial-up glory, it has linked us all together into a planet-wide, Borg-like collective of thought. You can “look stuff up” online, and thanks to social networking you can chat with friends and family all over the world. You can upload this and download that, and speak your mind in 140 characters or fewer. You can post embarrassing photos of your friends from last night’s kegger, watch dozens of YouTube videos of cute dancing cats, write endless cryptic “status updates” on Facebook and text your friends in the fashionable, semi-illiterate style of the day.
Also, the egalitarian nature of the Web means everybody with a keyboard is a writer. Before social networking was in vogue, there was blogging, which covers a wide spectrum of human experience — idle teenage musings, political commentaries, postings from storm-ravaged cities and reports from revolutionaries struggling against oppressive dictators. People talked, gossiped and spread rumors before the Internet, but today there’s a much wider audience listening — including future historians.
But in an age when mountains of information are uploaded every day, hoaxes and plagiarized stories fill the Web with false, misleading information and conspiracy theories galore.
“We still have people who think the landing on the moon was staged,” observed Tallahassee author, historian and educator Joe Knetsch.
A skeptical Knetsch said future historians might not get the most accurate information if they base all their research on what can be found on the Web in general — and social networking in particular.
“I think it’ll hinder historians. There’s no editing for quality or validity,” he said. “How are you going to establish (something) was true or not? Or is it just somebody’s opinion about it?”
Claude Kenneson, president of the Tallahassee Historical Society and a longtime volunteer at the State Library and Archives of Florida, said he doesn’t trust the technology’s potential lack of permanence.
“If we relied on just the Internet there’d be problems,” Kenneson said. “Sometimes it says go to a link and it’s no longer there. That might be a problem unless they find a way where it can be preserved some kind of way, (because) things can disappear … at least now.”
Too Much of a Good Thing
Younger historians are, naturally, excited about the changing technology, and say it’s generally a good thing that so much more information is available online than ever before.
“This is an exciting time for historians,” said Laura Justice, a Mission San Luis living history interpreter in her early 30s with a background in archaeology and historic preservation. “For so long, researchers have pined over a dearth of information on a given topic. You worked with what you had and, quite often, had many gaping holes left in your answer,” she said. “Now, we have more information than we know what to do with.”

Jake Harper, a National Park Service ranger and interpreter at Castillo de San Marcos National Monument in St. Augustine who’s in his 20s, admitted too much information could be “a bit daunting.”
“Imagine the president 30 years from now who grew up making blogs, viral videos, Facebook posts, keeping a diary, writing letters and essays for school, etc., and we say ‘get me everything that person ever wrote.’ We would get more information out of one person than we ever potentially got from any one 200 years ago,” he said. “So it will offer interesting insight to peoples’ lives that we’ve never had, but will also be more information than we have ever had about individuals before.”
Making Sense of it All
Contemporary historians live in a transitional period, according to Justice, and new ways of thinking may be needed to make the best use of the new media.
“We historians are in the liminal place between ‘hardcopy information’ and ‘cloud storage.’ We are discovering and utilizing the new while still reliant on the old methods of communication and sharing,” she said.
For a long-time scholar like Knetsch, though, it’s not so much a matter of the information available; rather, it’s a matter of how future historians will process it and sort out fact from fiction.
Historical subjects expressed their opinions in letters and diaries, he said, “But at the same time, you know the source, because you’ve done the research to find out who that person is, what is their slant — you know that. With the new technology, and the over-sharing, we don’t have (that oversight).”
In other words, if there aren’t any footnotes, or a bibliography, or some other critical piece of source documentation, the information could have come from anywhere.
“When I read another historian’s work, the first thing I look at is the bibliography and footnotes,” he said. “I want to see what research (has been) done. Are there corroborating sources, or is he sticking to one source? As a historian, you can’t accept things uncritically. You have to be critical of your sources, you have to do as much as you can to back up each statement you make or at least evaluate.”
Justice said the validity of sources and facts has always been an issue when it comes to gathering and writing history.
“To look at an historical account and then compare it to an archaeological site can sometimes tell two entirely different stories,” she said. “What makes things different with the Internet is that it is no longer the writing of ‘dead white guys’ who were most often quite wealthy … . Today, more ethnicities, races and socio-economic classes are recognized in a primary source sort of way.”
“Every historical era is flawed and debatable — every one of them,” said Ross Lamoreaux, a historical interpreter at the Tampa Bay History Center. “Historians, like normal citizens, all have opinions. ‘Historical truth’ is often just what the winner writes, which makes a good historian one that wades through info, like a noodler looking for catfish. You gotta get dirty if you want results.”
But the art of critical thinking, a skill necessary for testing validity (and detecting baloney), is something that’s not being taught in schools these days, according to Knetsch. Teaching philosophies that pin academic success on rote memorization and measurable abilities, as well as “teaching to the test,” ignore analytical thinking skills, he said.
Knetsch said students today must learn how to ask the right questions and not take everything at face value. Disputes over history will be worse in the future, he surmised, because of this lack of critical thinking. That, and they’ll miss the similarities between past and current events. For example, “What is similar between today’s recession and the one of 1837? Land speculation. Easing of credit,” he said.
Justice, on the other hand, said that in time, thinking skills may change to adapt to the new technology.
“I’m not sure if there is going to be a ‘lack’ of critical thinking skills, per se, but a change in consciousness and the way in which we look at things critically. There are several experts that discuss how people’s thought processes are changing because of the Internet,” she said. “Suddenly, we are inundated with massive amounts of data. We can approach it mathematically and look for patterns in, for example, what kinds of viral videos were popular and what the general responses are to those videos. With tweets, we can observe trending topics and evaluate the use of hashtags.”

Lloyd Wheeler, a living history storyteller who “channels” Benjamin Franklin and other historical characters, said YouTube videos could indeed become a resource. “Future historians looking at a YouTube video will have the ability, if it’s not staged, to see what’s happening in the background, what’s going on around the people being filmed … but how they interpret that is up to them,” he said.
Ultimately, the sword cuts both ways, according to Lamoreaux. He himself shares knowledge and research with a legion of living history friends through the Internet.
“That’s great when it is the right info, but also leads to quicker ‘myths’ and untruths,” he said. Still, he admits his research is made easier because much of the information he’d once have had to search for in books or archives is now available online.
Lamoreaux added that the Internet may be a godsend, but that doesn’t excuse the historian from doing his or her footwork.
“I firmly believe that computer-based research is acceptable, as long as it is notated in the same manner as any book, article or monograph,” he said. “I also firmly believe that all information shared as research on the forums should be footnoted or notated with source(s) to be believable. I’ve often found excellent info I wouldn’t have found anywhere but the Internet, but I always verify or double check from other sources.”
Hard Copies to Hard Drives
Will traditional hard-copy media still be around generations from now? That’s anyone’s guess.
“Technology keeps changing,” Knetsch said. “(Things) are designed not to last too long, but you have to maintain some kind of continuum to keep the information going.”
“The shelf life of microfilm is 50 years and then it has to be renewed or redone,” he continued. “(The) newest newspapers are so acidic they last about a decade. We do not know how long a (compact disc) lasts.”
Even if information is preserved, will the technology exist to play it back? “The Smithsonian has a whole section now (for obsolete technology). They started recording sound on wires before plastic, and there are only three operating machines in the entire United States that play that stuff. The Smithsonian has one or two of them,” said Knetsch. “Is that stuff going to be lost to us?”
Lamoreaux said he doesn’t have a crystal ball, but believes historians 150 years from now will have a harder time “knowing” our world of today because so much of what we do is done instantly on cell phones, tablets and computers.
“Unless hard copies are made, they won’t have the same ability like us researching 150 years ago,” he said. “We still have books, images, newspapers, etc. to use. But what are they going to have? Who really knows?”