
COMM 421/621, History of Journalism

Instructor: Ross F. Collins, Professor of Communication, North Dakota State University, Fargo.

Instructor's Readings

Introduction to mass media history

Interest in history in America seems to offer a peculiar dichotomy. On the one hand, we almost all remember the famous quote by Henry Ford, "history is bunk." His remark reflects the opinion of many businesspeople, apparently, that history is not very relevant to daily life.

Students in history classes, too, are often there because they have to be, and feel bored and stifled by the material. "It's just not relevant," they say. "What does this old stuff have to do with me?"

Why does history get a bad rap, on the one hand? It seems to me that to begin with, we have a society that worships youth and beautiful images--on television, in the movies, youth is usually portrayed as something better than age, something to keep for as long as possible. Young better than old! That worship of a youthful culture goes back to the beginning of the twentieth century, when advertisers began to promote things like fashion for young people, the fun of the young who drink Coke, and so on.

But our discontent with history goes back further than that, I think. Back even to the founding of the country. As a democracy, the idea that the people could govern themselves represented a big break from the usual monarchy, a king governing the people. In America, democracy was a huge experiment--a break from the past. A change. A break from history. So the country turned away, purposely and even violently, from the long history of monarchy. Monarchies in general base their legitimacy in history: kings are able to rule because they look back to a long line of ancestors from the same family, authority based on genealogy--and studying genealogy is history.

So now we're looking back rather ironically to a history of why Americans don't like history! But wait--if we said Americans don't seem to appreciate history, we're missing the other side of this strange contrast. Because while people say they hate history, they still seem to love history.

To find this out, we need go no further than recent movies and television programs. The PBS series on the Civil War was a huge success, widely viewed, and the historian who presented the series, Shelby Foote, became a household name--a celebrity historian! Other series, and biographies of historical figures such as Benjamin Franklin, are as widely watched.

Lest you think history is popular only when dramatized for TV, you can find a wide popularity too by just looking at the local bookstores' history sections, bursting with new books on historical topics. Sales seem strong. And it's not only the contemporary period, either, that is, our recent centuries. Take a look at the campus popularity of the Society for Creative Anachronism, students who dress in medieval costumes and reenact feats and fights of the medieval and Renaissance periods. Similarly, the Renaissance festivals, held around the country, form a popular way to live a little history of Henry VIII's time, and if you're interested in living American frontier history, you can join a popular group, a black powder club, people who dress up like frontierspeople.

On an individual level, people's interest in writing their own personal histories is booming, too. People are raiding the archives for genealogy material, trying to trace a family tree, and genealogy is often the basis for people's curiosity in more general history.

So it's a confusing thing--on the one hand, people seem to disdain formal study of history as irrelevant, but on the other hand, they really are interested in what people thought and did in the past. The thing that people don't realize, perhaps, is that what they hate about history, and what they love, is really the same thing.

I think history teachers like me have to take a share of the blame for people's lack of enthusiasm for formal history. In addition to perhaps a youth-oriented culture not built around historical traditions, historians have tended to portray past times as a dry litany of peculiar names and difficult dates. Too often, in written and spoken history, the presenter relies on a colorless listing of fact after fact, date after date, sort of like a long wall chart but with no connection to the real humanity behind the dates. This is similar to some English teachers, I think, who drain the fascination out of literature study by forcing it into a rule-oriented analysis of grammar and style.

But history is not just names and dates. It's about people who did things. It's literally a story, the basis of the word. It's a story about people who did interesting and important things--what they did, how they did it, why they did it, how it affected other people.

In this sense, history isn't very much different from a story you might tell a friend, such as, "Boy, you'll never believe what happened in speech class last week. The professor was talking about communication and American family values, when one of the students got up and accused him of blasphemy and punched him in the nose! Somebody called the cops, and they arrested the student. Now there's a meeting being called, I guess, on classroom security."

Now this is a rather extraordinary story you might tell lots of people, if you could. Yet it has lots of elements of history in it--it talks about people who did something, that is, one event, and consequences of that event. You've told someone a little history.

Of course, studying history isn't as simple as that. The story above is good and bad history. It's good in that it's told by someone who actually saw it. That is, it's what historians call a "primary source." "Primary" means that the person who told the story either was right there, or was very close to the event in time and place. For instance, if you as a historian were researching the history of the university during your grandfather's time, say the 1940s, primary would be, first, to talk to your grandfather. Also primary would be minutes from meetings during that time, memos, newsletters, personal letters, diaries, and so on. Newspapers, too, are considered primary if published at the time of the event.

On the other hand, if you told this story of the professor's bad luck based on an account you read in a book by someone who heard it from someone else, then the source is no longer primary. It's called secondary in history lingo. An encyclopedia, for instance, would be a secondary source, as are most books and articles.

Now your little mini-history, while being primary in nature, does lack a couple other things we consider important to historical story-telling. One is that you didn't provide us with any interpretation. Interpretation is necessary for good history. That is, you must say what happened, but you also must try to explain why, and its significance. It's not enough to just throw out facts. For instance, to include interpretation, you would say, "I think he punched the professor because he is a very conservative religious sort of person, and he's known to be hot-tempered. Not only that, the professor had given him an F in another class. This means that from now on, professors will likely either be more careful about saying controversial things in class, or wear body armor."

Okay, now you've made some assumptions based on the incident, talking about motives, speculating on significance of the incident. But you're still missing something--evidence. In history, wherever you speculate, whenever you interpret, you need to back up your claim with evidence.

So your little story would perhaps add, "I know he's religiously very conservative because he's the editor of a conservative religious newspaper and several brochures on the subject of families. He's also been violent in the past, and was arrested three times for threats and assault, according to police reports."

Now you've got interpretation backed up by evidence. Last, you need to know that history values the presentation of this material. It's not just facts and interpretation thrown out haphazardly--you have to be a good storyteller. You have to make it interesting, with well-chosen words, yet be careful that all you say is the truth, and can be backed up with evidence.

So that's history in a nutshell: facts drawn from good sources, interpretation backed by evidence, and a story well told.

Journalists, which some of us will become, are said to write the "rough draft of history," because they write a kind of "history" every day as accounts of events, even if their evidence and interpretations are different. In any case, the rest of us value history as part of our lives too, in one way or another.

Why is history important to our lives? At its most personal, we look to history to help us understand why we are who we are as a society, why we do the things we do. If you are confused or anxious about your way of dealing with your life, you may wish to see a therapist or psychologist. Chances are that person will begin by asking you questions about your background: what was your family like? How did your parents treat you? What was elementary school like for you? From this the therapist can construct a kind of personal history of you, a window to the past to help understand your present behavior. For instance, if you seem always to choose a boyfriend or girlfriend who abuses you, it would help to understand that your mother was abused by your father. If, on the more positive side, you choose as a career the law because it seems to come naturally to you, it might help you to see how your mother tended to approach decisions like a judge weighing testimony.

In fact, you and your therapist are looking for keys to your present behavior in the past. This is one major foundation for historical study too. Historians could be called the therapists of society, in the sense that to help explain why we do what we do today, why society is as it is, we look back to the past. Often this search through the past for keys to the present is surprising and revealing.

For instance, we know today that our American society is a violent one. Every day, it seems, politicians, parents or pundits decry our crime and search for solutions.

A historian might search for those solutions by asking the question, why is the crime rate as it is today? Unfortunately, unlike the therapist, we can't just sit down with the patient and solicit information. It's a lot harder in a society of 300 million! So instead, the historian has to take off the therapist's hat, and put on the hat of an archeologist. (That is, if people wore hats nowadays!)

I'm speaking metaphorically here, of course. A historian seldom is a true archeologist. But just as an archeologist sifts through sand and digs deep for information about a past civilization, a historian sifts through evidence from society for clues about the violence of the present.

In this case, I might sift back to find that, hm, in 1980, when Reagan was elected president, one of his major premises was to fight crime, then seen as a crisis in America. I would note that during this time, the crime rate in America was statistically not much higher than twenty years later.

Digging back further, I would discover that in 1972 Richard Nixon promised to fight "the rising tide of crime." Different terminology, but the issue seemed to have been the same. So we learn, first, that there's nothing new in America's belief that crime is at a high, and perhaps the belief that it's worse than ever is not so true. We would now widen our search, reading back to find out more about crime and violence and the media, and America's love affair with guns, and we would discover that "crime" was considered an issue generations ago; in fact, America has a violent tradition dating to the founding of the Republic. City streets were much more dangerous in a New York of, say, 1850 than they are today.

How are we to interpret this? I'll leave that to you, because it's a topic you might wish to pursue for a historical research paper, say, the history of crime reporting in your region. The point is that, clearly, in searching back we can put today's topics into perspective, and understand that perhaps they are not so new after all, and that perhaps others have tried solutions we could learn from.

This is not the only reason to study history, to help us understand the present, though it is one good reason. We also take a look at the past for its own sake, for the stories and dramas of another place, another time. We may find that human history has changed, but that human nature has not changed very much at all. Or sometimes, in contrast, we'll find that human nature has changed very much, that the assumptions which underpin our entire existence have changed. For instance, if you believe that ancient philosophers or religion have already given explanations for just about everything that happens in life, and that because of this there's no need for actual experiments and research, what would that mean to you on a daily basis living your own life? Such a belief was indeed the medieval person's view of reality.

History gives you an opportunity to compare your human nature, your assumptions, with those of others, and that too gives you a perspective to help you understand the present. One reason I love studying history is that it's the closest thing I have to a time machine. No one can actually go back in time. We know that because the Cambridge physicist Stephen Hawking has told us: if time travel were possible, we'd be plagued by annoying tourists from the future. But I think that in historical study I can get as close as possible to living in other times, living other lives, thinking other ideas. It's kind of a way to break free from the trap of the now--we're all inevitably the products of a single society, a single point in time, and think the way we're molded to think. But history is a chance to step out of that, what historians call "present-mindedness," into another time far away and different from our own.

And, truly, in the end, studying history does ask us to examine the world carefully, to think rigorously and thoroughly, to make interpretations and conclusions based on evidence--that is, to think critically, the real goal of a university education. I might teach you how to use computer software or recognize the five families of type, but that's not really why you're here. Historical study comes closer to the mark.

In this class, then, we'll approach history from a little different perspective. I assume most of you have had a history class somewhere during the last sixteen-some years, but the approach of this class might be new to you. If you remember the metaphor of the archeologist, we're going to take the same tack. The archeologist starts with the surface, and works down to answer questions of how each older layer affected the newer one. I hope to do the same thing. I'm going to start with very recent historical events, and ask questions about why they unfolded as they did, questions which can only be answered by working back to an earlier historical period. We'll sift layer after layer until by the end of the term, I hope, we'll reach near the bottom, the oldest layer. In doing that, maybe we'll be able, finally, to offer answers to some of the questions asked at the top layer.

This approach to historical study may be a little confusing, at least at first, because we're used to chronological approaches--starting at the beginning, and working to the end. Most biographers, for instance, begin something like, "She was born in 1821." I think, however, that starting at the front and working back will help keep us on our toes, intellectually, and more clearly show the relevance of what happened then to what's happening now. A good mental challenge!

To begin, though, we need to understand that in this class we are studying a particular kind of history: mass media history. What is that? Well, at its most simple, it's a study of a struggle--humanity's long struggle to communicate with others. To communicate: that means to dig out and interpret news, to offer opinion, to promote views. And the story of efforts to erect barriers to that flow of communication, either natural barriers of inadequate technology, or artificial barriers of censorship and control. Most of what we'll talk about is published, or broadcast, "mass media," as opposed to, say, communication through private letters.

The press and war

I think it's interesting to begin a history of the mass media in a democracy like the United States by talking about a free press and communication when it's most awkward--that is, press coverage during wartime. Because for democracies war is not a normal state, and because people are getting killed or the entire country may be threatened, the idea of freely telling everyone what's going on among the fighting men can get pretty controversial.

We'll take a recent historical example of the press during a war, that is, the media and the military during the 1991 Persian Gulf War.

Probably some of you remember the gulf war since it happened fairly recently. But what exactly do you know about that war? Do you know what really happened? In fact, you probably know everything you know about the gulf war from two sources: the media and the U.S. Government. Let's see how that information was gathered.

Most Americans learned in August 1990 that Saddam Hussein of Iraq had invaded the small but rich neighboring country of Kuwait. In the quick process of taking over the country, he killed a few hundred Kuwaiti soldiers, and damaged some property--expected in wartime. It became apparent after the Iraqi invasion that two things were true:

The United States and United Nations overwhelmingly approved a demand that Iraq leave Kuwait, but as we know, Hussein refused to back down. So the United States instituted something dubbed "Desert Shield," and began shipping troops and supplies to Saudi Arabia.

Now when a country looks like it's getting ready for a war, the media like to go along. Wars not only are perhaps the most significant, really critical events for any nation, they are also a great source of heroism, pain, life and death, fear and triumph--all those human emotions which make such good copy for reporters. Reporters have been tagging along with troops since the beginnings of the mass media some 150 years ago.

Of course, readers back home have the right to these stories. In a democracy, we have the right to know what the government is doing in our name. After all, if we live under a monarchy, and have no say over what the government does, we also have no blame for the outcome. But when we have a vote, we also have somewhat of a collective responsibility to know what our representatives are doing.

But this puts the military into a constant quandary, and always has. Because on the one hand, you're fighting for freedom in America, but on the other hand, freedom given to reporters can be really annoying to the troops. The thing is, first, you need to keep secrets from the enemy--the less an enemy knows about your movements, the better. On the other hand, if things go badly in battle, you also want to keep that information from the civilians at home, and perhaps even from the other troops, because if they don't keep their morale up, you won't be able to continue fighting. One writer said, in a nutshell, that the general's goal is to befuddle the enemy and bamboozle the public. It's easiest to fight a war if nobody really knows what you're doing.

Yet people demand to know, especially in a democracy. So what does the military do? Well, in the case of the gulf war, they made a plan. The plan was based on plans made in two other U.S. wars fought in the decade before, both short ones, the Grenada war of 1983 and the Panama invasion of 1989. In 1983 the Reagan Administration was anxious to make sure all the news coming from the battlefield was nice, and not offensive to sensitive people in living rooms at home. So reporters were not allowed to cover the war. They were not part of the invasion, and those reporters who tried to sneak in using chartered boats were turned back and sequestered.

In fact, no one knew what was happening in Grenada until it was a done deal. That is, all we knew at home was what government and military press aides told America. These are intrinsically biased sources, however.

Well, after that short war ended, the press screamed so loudly about censorship that the military promised to re-think. Along with major media organizations, it set up reporters in little knots to gather information. That is, it resurrected an old idea about war coverage, called the pool.

The pool means this: a group of journalists whose credentials have been approved by the military hang out at an area hotel and wait while a few of their number are picked for a tour of the fighting area. These few are then escorted back to share what they find out, and a single report is produced for the folks back home.

Disadvantages of this news gathering method should be pretty obvious. First, you have no diversity of content and outlook by independent reporters who tour the battle lines on their own. Second, reporters only see what they are shown--which may be totally unrepresentative of what's really going on. When reporters can't see and hear and interview for themselves, their reports usually are heavily based on what the public relations people tell them. That's exactly what the military would want. Third, journalists are under close supervision of military public relations people. They are unable to wander off and talk to people on their own, get uncontrolled opinions from soldiers and officers.

The pool system was supposed to be used only for a very short time, the first few hours of battle, until the rest of the reporters could get back to the site, after which the pool was to dissolve. But in the 1989 Panama invasion, the secretary of defense, Dick Cheney, decided not to send the assembled pool over at all until very late, and reporters who tried to cover the war on their own were blocked and led back. In fact, we still really don't know very much about that invasion, including extent of civilian losses and just what happened during battles. No reporter was there to tell us.

As the gulf war loomed, as it became clear by fall 1990 that Hussein was not going to leave Kuwait, the first George Bush administration, leading a coalition of countries, decided to turn Desert Shield into Desert Storm. That fall it was already becoming clear to the American media that the pool system might operate again, and that war coverage might be strictly curtailed. In fact, a second problem was evolving--Saudi Arabia, a closed, conservative Muslim monarchy, was refusing to allow more journalists into the country. The Washington Post and other major news media asked the secretary of defense's press liaison to try to ease the situation, and establish clear ground rules for coverage in the Persian Gulf. The representative, Pete Williams, negotiated between the press and the general command, headed by Colin Powell, Norman Schwarzkopf, and Cheney. Finally by November Williams hammered out a three-phase plan for the press. The third phase was that the pool system would be eliminated.

However, in January 1991, the plan was implemented by the Central Command--Powell, Schwarzkopf and Cheney--without the third phase. Time was running out. In January, bombing of Iraq began. Williams promised that in the event of a land war, press access to battlefields would improve. But this did not occur. In fact, all the bad things about pool reporting came true. With the land invasion in February, only 160 reporters were allowed along the entire front line of 500,000 troops, traveling in teams of a half dozen, all under close control of military press officers. Reporters were not even allowed to talk to soldiers and, as one reporter said, he had more guns pointed at him from the friendly side than from the Iraqis.

So reporters tried to sneak out on their own, but they usually were arrested and dragged back, stripped of press credentials, and not allowed out again. One CBS camera crew got lost in the desert and was captured by Iraqis, not to be set free until after the war.

What's more, the press dispatches were to be submitted to a "security review," that is, censorship by military public relations folk. In one case, a photographer was sent off for taking a picture of a dead Iraqi soldier. Words were changed, or eliminated by military press offices in the field.

It was clear that journalists were being allowed to see and hear exactly what the military wanted them to, and that they were unable to get information from any other authoritative source. Reporters who were desperate for information tended to cover each other, telling about things like how the heat affected them, gas masks, and reporters under SCUD missile attacks--but most publicized was the coverage of Peter Arnett, then of CNN, in Baghdad.

Because Arnett was not with the American army, but in the enemy capital, his reports were not part of the pool, and could not be controlled by U.S. authorities. They were, however, controlled by Iraqi censors. Arnett was criticized by some for offering Iraqi "propaganda," especially in the case of his report of the allies supposedly bombing a plant producing infant formula. But viewers were fascinated by his so-called "real-time reports," broadcast just as events were happening, through satellite link. This offered instantaneous, sometimes unedited, direct and gritty looks at the war in Baghdad--instant coverage seldom seen before this time. Some journalism critics said this was a "new journalism" beginning in the gulf war, although perhaps it really was just a faster and less polished version of older methods.

In the end, despite all the Star Wars-like pictures and special reports, did the control exercised by authorities over journalists lead to a false perception of the war? Some critics, many of them journalists themselves, say we did get it wrong, and the truth didn't come out until later.

Legal experts have concluded that military restrictions during the Gulf War were clearly unconstitutional. Moreover, a number of big stories did not come out until after the war ended, and some people still don't know the truth. For instance, most of the bombs dropped during the war, 70 percent, actually missed their targets, despite the military publicity footage showing accurate bombings. And the much ballyhooed "smart bombs" totaled only one percent of the ordnance used. Second, "friendly fire" caused the great majority of the more than 260 American war deaths. That is, somebody screwed up. As well, American bulldozers buried as many as 100,000 Iraqi troops, many still alive. And the Iraqi army, billed by the U.S. military as one of the great military machines of all time, was only a fraction of the size it was purported to be.

Why were commanders like Schwarzkopf and Powell so ferociously opposed to media coverage of this war, and why did civilian leaders agree? Now we can put our archeologist's hat on again, and note a single fact: all had been active in the Vietnam War a quarter century before, and blamed the media for America's failures there.

Writing a Historical Biography or Autobiography

Most of us are well aware of the approach to history that we call historical biography, though maybe we never considered it much. But it’s become a really popular way to tell history—that is, to tell it through the lives of famous people who have lived it. It may be a sort of case study, a way to understand a society and a time. A lot of the famous people whose biographies interest people today, of course, are celebrities. We have a whole television channel devoted to that. Sometimes it’s called lifestyle, not biography, and we actually have A&E’s lifestyle channel, FYI.

But it’s not only celebrities of today whose lives interest people. We also seem to be interested in biographies of people in the past. We can learn about historical figures as diverse as Shakespeare and Lincoln through biographies in both books and in movies—biopics, as they are sometimes called. We like to learn about other people’s lives.

So if pretty much everybody knows that a “biography” is a story about a person’s life, and of course, an autobiography is about one’s own life, well, how do we go about writing one as a historical work? That has been quite a topic of debate for centuries, millennia, actually. Plutarch’s Lives was written some 2000 years ago, but the “lives” are really biographies. Biographies, or “lives,” as they used to be called, also were popular in the Middle Ages as a way to write about Christian saints. We call those “hagiographies” (pronounced both “hag-” and “haj-”). These were designed to give inspiration and instruction on how to live a good life based on the presumably good example of saints. Today hagiographies of genuine Christian saints are not so common. The word has evolved to suggest an adulatory, uncritical account that makes the subject, as it were, into a saint. Some people will occasionally write such biographies of celebrities, but such hagiographies are not usually treated with much respect.

But throughout the history of biographical writing reaching back to ancient times, we still can identify a few general approaches that people tend to agree upon, and are still usually part of our modern biographical writing. More recently we have expanded those ancient guidelines. Here are five principles from past and present that are general to good biographical writing.

1. A biography has to be based on truth—it’s not a novel. Readers presume a biography is about a real person who did real things. So if we want to write a biography about, say, Frodo Baggins in Lord of the Rings, well, it’s not biography. He didn’t really exist. It’s fantasy. We expect biographers to work hard to find the truth. Even Plutarch tried hard to find truth in his ancient biographies.

Yet we will admit that some biographies stray from the straight and narrow of fact-based history. For example, is it all right to make up dialogue based on conversations that probably took place but were not recorded? A lot of biographers have done this, though it’s hard to say if such a thing is “true.” What about making up whole meetings from scratch, because they ought to have existed, or even other people and places the subject never knew? Most of us would draw the line at this—it’s no longer biography, but fiction, what perhaps the so-called reality TV shows call “dramatization” or “docudrama.” Elements might be true, but others are made up to enhance the drama of a story line. To be a biography the writer must try to stick to the truth. That so many biographers play loose with that rule, really, is one reason some historians dislike biographies.

To find the truth, that is, what actually happened, may also be difficult. If we are writing a biography about, say, Socrates, well, he lived more than 2000 years ago. There’s not much primary evidence available, and what we do find might be incomplete and biased. Even for more recent subjects, where the primary evidence exists, we realize it might be misleading in a variety of ways. Primary evidence must be considered carefully, one of the challenges of biography as it is of writing any sort of history.

2. A biography should be objective. We all know this really isn’t entirely possible—our own biases, sometimes biases we don’t even recognize in ourselves, color our choice of material and approach. We select from the sea of events in a person’s life those which we judge will best illustrate the person’s character and significance. We have to do this to create a narrative at all, that is, to tell a story, but in doing so we inevitably and somewhat arbitrarily choose some aspects while leaving out others. We may easily choose evidence based on what we want the person to be, especially if we are writing about someone the world loves or hates, such as Mother Teresa or Adolf Hitler.

And sometimes when we don’t have all that much primary evidence to go on, objectivity becomes all the more difficult. Sometimes people or their heirs just don’t want others to write biographies of them, and so set about to destroy the evidence that might show a biographer what the subject was really like. Other times, if a family does not agree to what we call an “authorized” biography, evidence might be impossible to access. I like to quote a remark from one of my fellow historians, James Startt, who said, historians should begin with a “belief that a reasonable degree of objectivity is achievable, that a properly designed and controlled narrative element belongs in history and can communicate truth as far as it is known.” So should biographers.

Some biographers will argue that to write a proper biography you need to personally know the subject. It certainly could be an advantage to actually be a friend or family member, or at least to have actually interviewed the subject. Of course, it’s not always possible. Plutarch wrote long after his subjects were dead. And we can easily see the possibility of bias creeping in based on whether we like or dislike a subject whom we actually know.

3. A biography should be frank and revealing. The idea that a biography can’t really be good unless it is completely honest, the so-called “warts and all” approach, seems to be common today in our Western society—although it certainly was not so common in the past. Writers of past centuries may have used biography to illustrate a life well lived, as a way to give the reader lessons on how he or she may also live a virtuous life. If that were the point, talking about the subject’s tendency to get drunk and gamble the rent away would be irrelevant. But today many people would think such virtue-based biographies are more like obituaries. The common convention of obits as a kind of biography is to say nothing ill about the dead, but instead to praise the person when we can, and stay silent about the rest.

Today we are much more likely to find out about people’s private lives than used to be the case. And the standard of privacy seems to be changing, fairly dramatically, I think, as cell phones and the internet have made privacy more difficult to guard. And people today seem less likely to even care about guarding it, for themselves, or for the famous and successful who, it seems many presume, need to be knocked off their pedestals.

So readers today are going to expect a revealing biography. Biographers should omit nothing—which, I suppose, some subjects realize, and so are apt to destroy their papers so as to eliminate compromising evidence a biographer would otherwise be sure to use.

4. A biography ought to be about a whole life. That would include childhood and adolescence, adulthood and senescence. Actually, a lot of historical biographers used to disagree with that, believing that what someone did as a child was pretty inconsequential to the life they led as an adult. What mattered to the traditional historical biography was public achievement. Today, though, we tend more and more to realize that the story of someone’s childhood can reveal a lot about that person as an adult. Private life affects public life, we believe. And so today we consider it important to add all kinds of details regarding a subject’s parents and siblings, her quirks or sexual habits, her spouse and children, her hopes and fears, her dreams and fancies, her habits with money and her battles with illness.

We’re supposed to march chronologically and relentlessly from the proverbial cradle to grave. The thing is, a lot of biographies actually don’t cover a whole life like this. One can decide that the best way to write about a person is to begin elsewhere than with the words “X was born on….” For example:

• We can begin with a key aspect of the person’s life, such as a revealing character trait, a significant professional accomplishment, a key decision or a great downfall. In this case a specific slice of life can help illuminate the person’s general character and significance.

• We can begin with the end, and work back. The way a subject who is no longer living died is sometimes representative of the person’s life. A famous biopic, though it certainly is what we might call today a “docudrama,” is the movie “Citizen Kane,” which begins with the subject’s death.

• We can set up a life by themes instead of chronology. For example, a biography of the famous painter Picasso might be based on artistic themes as they evolved throughout the artist’s life, and not necessarily on a chronological parade of events.

5. A biography must include all sources for the information, and must be based on reliable sources above all.

This suggests footnotes, and it is true that traditional historians who are sticklers for accuracy will include them. Some trade publishers nowadays are moving away from footnotes, because they know the public really doesn’t like them. Others will simply include a source list at the back, or sometimes include the notes as part of a website to keep the clutter away from the text. But in historical biography the credibility is only as good as the sources, and readers who care about accuracy—or perhaps those who are suspicious of the writer’s integrity—would like to see the sources of the information in the biography.

Sometimes in the case of biographies of contemporary people, the sources don’t want to be named—as with those whom journalists use as confidential sources. But we know the temptation to misuse confidential sources can lead to dangers of exaggeration or, it has to be said, just making things up. So historical biographers don’t use confidential sources unless really necessary.

Writing historical biography

Some historians do not like biographies, as noted. It is not only because they feel the biographer can too easily be biased, or can too easily move to make things up based on scant or non-existent evidence. It is also because a biography is necessarily a slice of society, one person’s life. Perhaps it can’t do a very good job, then, of explaining the larger issues of historical importance, the significance of events and trends the historian hopes to find. But other historians think biography well done can actually help explain the larger issues by offering something like a case study. And historical biographies usually are written of people known to be significant to shaping history. While perhaps some historians disagree with the so-called “great man approach” to history, that history is made by a few important people, still, no one will deny that a few important people can turn the course of history. Who would disagree that Sam Adams, Joseph Pulitzer or Walter Lippmann served to shape American journalism history?

The historian writing biography, then, tries to give context by interpreting the age in which the person lived. None of us live in a vacuum—we are shaped by our culture and by our times. Our understanding of the world is based on where we came from, and a biographer needs to understand and reflect that in the writing. For example, you can’t write a biography of the revolutionary journalist Samuel Adams without understanding the history of colonial America at that period, and the spirit of the age in which he lived. Think about your own age. What presumptions do you bring to your own life, and how might they differ from the accepted wisdom of another age? How do you look at things differently from even your parents’ generation, to say nothing of generations centuries past? This is one of the challenging things about writing biography, or history of any kind, what we call “present-mindedness”: our tendency to judge people and events of the past based on our own tastes and attitudes of the present. We take our worldview for granted, often unexamined—until someone from another time or another place forces us to re-examine our presumptions. Making an honest effort to do that is hard, and sometimes painful. No one likes to discover that what they believed to be true may not be, or may not have been for others. But such self-examination is part of historical writing, and writing of historical biography.

Below are specific tips for writing biography from Robert Davies, retired history professor from Minnesota State University Moorhead, who recently published a biography of a New York Times military journalist named Hanson Baldwin. Note Davies emphasizes the idea of looking at the whole life, and not just choosing famous aspects; considering challenges or cultural privilege as part of success; acknowledging mistakes and weaknesses; considering the time in which the subject lived; and possible source material.

Begin with the early years of the subject before they became famous; note his/her family’s social and professional background, the mentors or those who influenced him/her the most, the mental approaches to work, and how he/she saw or was influenced by the world around him/her. Was the subject helped by inherited wealth or social position that would give a boost up in his/her career? Or was life a hard scrabble for many years? What early mistakes in judgment were made, and what did the subject learn from those missteps?

In other words, a life-and-times approach. Did the subject write an autobiography or did another author write a biography about the subject? The student should select a person first to be the subject and then proceed from there.

(References: Hermione Lee, Biography: A Very Short Introduction; Robert B. Davies, email comments; Startt and Sloan, Historical Methods; Robert B. Davies, Baldwin of the Times.)

The Historical Research Paper

In writing mass media history (and all history), a writer needs to consider three aspects: facts, interpretation, and narrative.

Without these three, there's no real history being written. For instance, without facts you might write a novel. Fiction is interesting and readable, but it's not history. That's sometimes confusing to people who really believe in the "docu-drama" format in movies, assuming they can make up their mind about something that happened in the past after seeing a movie, say, "Dances With Wolves," "Amadeus," or "The Da Vinci Code." You may get a flavor of the times, but beyond that, truth and fiction mingle.

As well, without interpretation you have, at the least, a timetable of events. At most you have what much genealogy is today: a description of who did what and when, but no explanation of how it relates to a larger world. Interpretation answers the "so what" question of history: so your grandfather worked as a typesetter in the 1920s. What does it mean to the development of American newspapers in society?

The narrative is the story you tell. It ought to be well-written and compelling, and not clumsy and dusty. In history, perhaps more than in any other discipline except English, how you present your material is as important as what you present. All the more so as we study mass media history here. Communication study emphasizes quality in written and spoken communication.

The evidence
In weaving together your story (that's what hiSTORY is), you need to consider the quality of evidence behind your facts and interpretation. Evidence comes in two varieties: primary and secondary.

Primary evidence is best. It consists of material produced during the time you're studying, and by the organization or person you are studying. Included are periodicals from the period (for instance, a 1925 newspaper on a 1925 subject), brochures, bulletins, newsletters, organizational minutes, directories, diaries, letters, notes, tape recordings, video recordings, drawings, photographs, and oral history. Normally these are the kind of things you find in an archive, such as NDSU's Institute for Regional Studies, housed in an old K-Mart building on 19th avenue north, just north of the FargoDome. (Director: John Bye).

Secondary sources are books and articles published later on, about this time period. Normally the authors consulted primary and secondary sources to write their article or book, so your referring to them means you are getting the information second (or third or fourth) hand.

Secondary sources are not as good for your own research, but they are essential for background on your topic. For instance, if you are doing research on how women were portrayed in regional newspapers between 1920 and 1950, you would begin by seeing what other authors have written on women as portrayed in the media.

Your secondary research helps you refine your own topic and research question, and forms the introduction to your own work. For instance, before going into your own research in the example above, you spend a little time "setting the scene": talking about the topic in general, why you became interested in it, and what others have written about it. That research also helps you think of places for primary sources, how accessible they are, and whether you can reasonably acquire them in the time you have. If there's a problem, the answer usually is to refine your topic based on time you have and available primary sources.

Collecting a bibliography
As you do this background reading, pillage the footnotes. This means look at the footnotes, endnotes and bibliography for more sources specifically pertaining to your topic. It's not particularly useful except at the very beginning to make your list of sources based on what's available in our library. Likely there are a lot more sources, and better ones, available elsewhere, either Tri-College or through interlibrary loan. It only takes a few days to get what might be far more interesting sources. Note: you don't have to consult every source you list! Narrow it down to just those that appear most pertinent.

After you make your personal bibliographical list, you're probably ready to really refine your topic into something you can write about. Let's call this your research question. It should be something you can answer by using primary sources. Check out the link below for ideas.

For example, your original topic idea might be the media and space exploration. You begin by looking in a few encyclopedias and general media textbooks. From there, you could find some helpful bibliographical entries. Add to them from the library's on-line catalog, but don't spend piles of time in front of the computer. You're better off looking at footnotes; why do your own work when you can rely on someone else's?

Note that encyclopedias can give you helpful background, but they are not considered credible sources for historical research. That is, they do not become part of your bibliography. This is particularly true for collaborative sources on line, most famously Wikipedia. These sources are not fact checked, and may be written by people with clear biases, as noted in a report by NPR.

How to read secondary sources
Ration your time: your topic for a term paper ought to be narrow enough that you don't have to wade through hundreds of sources, and read whole books. Skim; read chapters that most directly pertain to your topic. Take notes. Don't photocopy whole articles and microfiches--you won't have time to go back through it all. Some researchers think notecards filed under topic work best, and nowadays if you have a notebook computer all kinds of research organization programs can help make it easier for you.

After looking at about 10 of these secondary sources, you're ready to rely on primary sources. Your topic choice will dictate what they are, but you may have to actually do some foraging in an archive to see what's there. Don't hesitate to ask for help--staff love to show how much they know about their archives. If your topic requires oral history, that is, interviews, you must come to the person prepared with a list of questions directly pertaining to your topic. This helps direct an otherwise sometimes aimless interview, and helps jog memories. You may choose to tape record interviews, and that's not a bad thing for accuracy, but also take notes. If you've ever tried going through an hour and a half of tape looking for that one important statement you can't quite remember, you realize the importance of notes as well as tapes.

As for broadcast primary sources, they are tough to find locally, unless you have a special relationship with a station (and even then--broadcast people usually don't keep stuff for researchers). The state archives in Bismarck has some tapes--for help, inquire at NDSU's Institute for Regional Studies. Nationally, the Vanderbilt University archives have lots of news programs on tape, but I'm not sure how accessible they are here. Check their home page on the Web.

Research question options
When relying on primary sources for media history in our class, normally your question will involve one of these options:

A comparison of two or more media outlets concerning a certain topic, and a certain time period. You need to decide how you're going to compare the topics, why you're going to choose certain media. For instance, if you're going to compare the portrayal of women in local media in 1950 and again in 1970 and again today, what will be your criteria? Stereotypical views in print or photos? Will you look at advertising too? Will placement be a factor? Will you check word choices for stereotypical views? Will you count word or picture choices of a certain type in publications from different time periods? Your background reading will help you to answer these specific questions to guide your research.

An analysis of the media as it was during a certain time. For instance, local radio in the 1950s, or how minorities were treated as professional reporters in 1960s newspapers. Oral history is particularly useful here.

An analysis of a certain event as portrayed in several media outlets, such as the impact of Edward R. Murrow's attacks on Joseph McCarthy in regional media, or the end of World War I as reflected in the local dailies. It's important to choose a broad enough topic here to answer the "so what" question, but narrow enough to be doable in a reasonable amount of time.

An analysis of publicity or published material during a certain time, or how it evolved, whether it be on-campus material, or other material. Collections of brochures and pamphlets can be found in the archives, though they may not be so complete. Private companies also sometimes keep collections.

Biography of a well-known local media person. Oral history, of course, and perhaps that person's papers are available, either from the family or in archives. It's sad that so much of this material is thrown away.

The gist of this is that you can't simply do a generic project for this class, and call it good history. Generic means "Watergate and the American Press," or "Newspaper Coverage of the Vietnam War," or "Yellow Journalism in America." This is the kind of term paper you'd write in high school. For us it's too general, and offers little opportunity to use primary sources. Those topics might form your background reading, but the topic you finally choose could be something like, "How local print media covered Vietnam, 1961-75."


A Short History of Photography

I. Modern photojournalism: 1920-2000.

The beginning of modern photojournalism took place in 1925, in Germany. The event was the invention of the first 35 mm camera, the Leica. It was designed as a way to use surplus movie film, then shot in the 35 mm format. Before this, a photo of professional quality required bulky equipment; after this, photographers could go just about anywhere and take photos unobtrusively, without bulky lights or tripods. The difference was dramatic: from primarily posed photos, with people aware of the photographer's presence, to new, natural photos of people as they really lived.

Added to this was another invention originally from Germany, the photojournalism magazine. From the mid-1920s, Germany experimented with the combination of two old ideas. Old was the direct publication of photos; that was available after about 1890, and by the early 20th century some publications, newspaper-style and magazine, were devoted primarily to illustrations. But the difference with the photo magazines beginning in the 1920s was the collaboration--instead of isolated photos, laid out like in your photo album, editors and photographers began to work together to produce an actual story told by pictures and words, or cutlines. In this concept, photographers would shoot many more photos than they needed, and transfer them to editors. Editors would examine contact sheets, that is, sheets with all the photos on them in miniature form (now done using Photoshop software), and choose those they believed best told the story. As important in the new photojournalism style were the layout and writing. Cutlines, or captions, helped tell the story along with the photos, guiding the reader through the illustrations, and photos were no longer published like a family album, or individually, just to illustrate a story. The written story was kept to a minimum, and the one dominant, theme-setting photo would be published larger, while others would help reinforce this theme.

The combination of photography and journalism, or photojournalism--a term coined by Frank Luther Mott, historian and dean of the University of Missouri School of Journalism--really became familiar after World War II (1939-1945). Germany's photo magazines established the concept, but Hitler's rise to power in 1933 led to suppression and persecution of most of the editors, who generally fled the country. Many came to the United States.

The time was ripe, of course, for the establishment of a similar style of photo reporting in the U.S. Henry Luce, already successful with Time and Fortune magazines, conceived of a new general-interest magazine relying on modern photojournalism. It was called Life, launched Nov. 23, 1936.

The first photojournalism cover story in Life was kind of unlikely, an article about the building of the Fort Peck Dam in Montana. Margaret Bourke-White photographed this, and in particular chronicled the life of the workers in the little shanty towns springing up around the building site. The Life editor in charge of photography, John Shaw Billings, saw the potential of these photos, showing a kind of frontier life of the American West that many Americans thought had long vanished. Life, published weekly, immediately became popular, and was emulated by look-alikes such as Look, See, Photo, Picture, Click, and so on. As we know, only Look and Life lasted. Look went out of business in 1971; Life suspended publication the following year, returned in 1978 as a monthly, and finally folded as a serial in 2000.

But in the World War II era, Life was probably the most influential photojournalism magazine in the world. During that war, the most dramatic pictures of the conflict came not so often from the newspapers as from the weekly photojournalism magazines, photos that still are famous today. The drama of war and violence could be captured on those small, fast 35 mm cameras as never before, although it has to be said that through the 1950s and even 1960s, not all photojournalists used 35s. Many used large hand-held press cameras made by Graflex, and two have become legendary: the Speed Graphic and, later, the Crown Graphic. These are the cameras you think of when you see old movies of photographers crowding around some celebrity, usually showing the photographer smoking a cigar and wearing a "Press" card in the hatband of his fedora. These cameras used sheet film, which meant you had to slide a holder into the back of the camera after every exposure. They also had cumbersome bellows-style focusing, and a pretty crude rangefinder. Their advantage, however, was their superb negative quality, which meant a photographer could be pretty sloppy about exposure and development and still dredge up a reasonable print. (Automatic-exposure and focus cameras did not become common until the 1980s.)

Successor to the Graphic by the 1950s was the 120-format camera, usually a Rolleiflex, which provided greater mobility at the cost of a smaller negative. You looked down into the ground-glass viewfinder. But in newspapers, by the Vietnam War era, the camera of choice was the 35--film got better, making the camera easier to use, and the ability to use telephoto, wide-angle, and later, zoom lenses made the 35 indispensable, as it still is to most photojournalists today.

Some of the great photojournalists of the early picture-story era included "Weegee" (Arthur Fellig), a cigar-chomping cameraman before World War II who chronicled New York crime and society's underside.

During World War II W. Eugene Smith and Robert Capa became well known for their gripping war pictures. Both were to be gravely affected by their profession. In fact, Capa was killed on assignment in Indochina, and Smith was severely injured on assignment in Japan.

Shortly before the war, with the world realizing the power of the camera to tell a story when used in unposed, candid situations, the federal government's Farm Security Administration hired a group of photographers. The FSA was set up in 1935 by Franklin D. Roosevelt to help resettle farmers left destitute by the Depression and the massive drought in the Midwest. Because these resettlements might be controversial, Roy Stryker, who directed the photography program, hired a number of photographers to record the plight of the farmers in the Midwest.

Many of the photographers later became famous--the collection of some 150,000 photos is now housed in the Library of Congress. The power of these often stark, even ugly images showed America the incredible imbalance in its society between urban prosperity and rural poverty, and helped convince people of the importance of Roosevelt's sometimes controversial social welfare programs. You can still buy copies of all these photos from the Library of Congress, including the most famous, such as Arthur Rothstein's dust bowl photo or Dorothea Lange's "Migrant Mother."

I think that the golden age of photojournalism, with its prominent photo-story pages, ranged from about 1935 to 1975. Television clearly had a huge impact--to be able to see things live was even more powerful than a photo on paper. Even so, many of the photos we remember so well, the ones that symbolized a time and a place in our world, were often moments captured by still photography. When I began my career in journalism as a photojournalist, black and white was still the standard, and newspapers and many magazines were still publishing photo pages with minimal copy, stories told through photographs. Beginning in about the mid-1980s, however, photojournalism changed its approach. Photographs standing alone with bare cutlines, carrying the story themselves, have generally been dropped in favor of more artistic solutions to storytelling: using photography as part of an overall design, along with drawings, headlines, graphics and other tools. It seems photography has often fallen into the realm of just another design tool.

Photography is driven by technology; it always has been, because more than any other visual art, photography is built around machines and, at least until recently, chemistry. By the 1990s photojournalists were already shooting mostly color, and seldom making actual prints, instead using computer technology to scan film directly into the design. And by the beginning of the new millennium, photojournalists were no longer using film: digital photography had become universal, both faster and cheaper in an industry preoccupied with both speed and profit. Color became the standard for "legacy media," newspapers and magazines, as well as for web news sites. Because color printing technology requires a higher quality image, photojournalists have had to adapt their methods and accept fewer available-light images. Too, most publications are looking for eye-grabbing color, a concern absent in black and white, and color demands correction to avoid greenish or orangish casts from artificial light. All of this has meant photojournalists, even with the most sophisticated new cameras, are sometimes returning to the methods of their ancestors, carefully setting up lights and posing their subjects. If you compare published photography today to that of 25 years ago, you will often find many fewer candid photos, less spontaneity, fewer feature photos of people grabbed at work or doing something outside. In fact, more and more, subjects are aware of the camera, just as they were before the 1960s, when the quest for naturalism in photojournalism began.

You'll also find that the quality of the image has gone up: better lighting, sharper focus, and lush color, especially primary colors. Is photojournalism better today than it was in the black and white days? I think not, but it depends on what you like. Perhaps still photojournalism is not as important to society today; it does not have the general impact of television, with its sometimes gritty "you are there" images bounced off satellites. Still, even with all our space-age technology, if I ask you to remember an image that for you defined a certain event, chances are you'd remember a still photograph. For instance, think Tiananmen Square in China, and you'd possibly recall the man facing down tanks. Think Gulf War, and you may recall the wounded soldier crying over a comrade. Think Vietnam War, and the execution of a Vietcong, or the girl napalm victim. Think Protest Era, and the woman grieving over students shot at Kent State University. The single image still holds some defining power in our society.

Review a slide show of great photos.

II. The beginnings of photography, 1839-1900.

The invention of photography was received in Europe with a frenzy of enthusiasm, even a surprising amount. Why? Perhaps because it was an idea people were primed and ready for. We have in photography a combination of science and art to produce what was then thought to be a perfect rendition of a scene or person.

We can understand why people of the age were so taken with this idea when we reflect that in the 1840s the machine age was already in full swing. Science was leading to new and better inventions, and the machine was thought to be the great answer to all the world's problems. Western people worshipped science, and photography was a product of scientific experiment or, if you will, chemical and optical experiment.

In the world of art, at this time too, the great goal of most artists was realism. That is, artists were trying their best to paint pictures as close in detail to reality as they could.

Photography offered a solution based in science.

The mechanism of the camera for photography, however, was actually very old. A device called a camera obscura was widely employed by artists and amateur drawers alike. In fact, such devices are still used today. They rely on a lens or, in the case of a large box, a pinhole, to transmit a view of the scene in front of it. This view is reflected off a mirror onto a white surface or ground glass. Artists may place a piece of tracing paper on the surface and rough out the drawing in two-dimensional format. By this method they only have to spend a little time in the field or with a live subject to get the general proportions. Then they can return to the studio to finish. For early nineteenth-century travelers, who wanted to draw things they saw, as was the fashion, a camera obscura could be particularly useful, especially for those who could not draw very well from nature. The machine was able to get the three-dimensional perspective right, because it reduced reality to contours that could be traced. If you've tried to draw from nature, you know how hard it really is to reduce a three-dimensional shape to a two-dimensional line.

The first people who contemplated the possibilities of photography, then, were artists. Or in the case of some, not very good artists, such as Nicéphore Niépce. The idea was, why not try to find some way to save or "fix" that image on a piece of paper? Then it could be brought back to the studio and consulted for copying. The key was, how to make the image stay? Since the 1700s chemists had been aware of various substances that turned black or dark when light hit them. Curious, but no one thought it was worth much. Of course, the darkened image would fade, or wash away with the shaking of the solution.

The first person to successfully make a darkened chemical image permanent was Niépce, the not-so-great French artist. Actually, Niépce was more interested in engravings or etchings than in photography for art purposes. His idea was to record an image on a metal plate, and then etch it for printing. In 1826-27, he took a camera obscura, pointed it at a courtyard, and managed to make a permanent exposure of it. It took eight hours. He called it a heliograph, the first recorded picture made using light-sensitive materials.

Unfortunately, Niépce was a man in his 60s, poor, and in ill health. He heard of experiments that another Frenchman was doing in photography, Louis Jacques-Mandé Daguerre. He wrote a cautious letter to Daguerre, wanting to know about the process, and finally they decided to form a partnership in 1829. Daguerre's process differed from Niépce's. He used vapor of mercury and salt. In 1833 Niépce died, and his son carried on the partnership, although Daguerre was mostly the active participant. After eleven more years of experimenting, Daguerre perfected his process: a sheet of copper was coated with a thin layer of silver. The silver was made sensitive to light with iodine vapor. It was exposed in a camera, then vapor of mercury was used to bring out an image. Finally that image was fixed with a salt solution, common table salt.

The process was dramatically different from the chemically based photo process used until digital techniques arrived in the late 1990s, and its chemicals were highly toxic and dangerous. But it worked, and worked very well, offering exquisite detail matching the best of what we can produce even a century and a half later. In early 1838 Daguerre tried to attract investors to his process, but could find few. However, he did attract the attention of a famous French scientist of the time, François Arago, who persuaded the French government to give a pension to Daguerre and the younger Niépce to work on the process. Daguerre, however, had to promise not to patent the process in France, and he eventually agreed.

In 1839 Arago and Daguerre announced the process to the world. Arago's public relations efforts and Daguerre's energetic promotion helped the daguerreotype, as it was called, take the world by storm. Everyone was talking about it within days. Exposures, at first nearly 20 minutes, were reduced by 1840 to 30 seconds with the use of bromide and of faster lenses able to gather more light. Those first 20-minute exposures were so long that subjects might get sunburned--only direct sunlight was bright enough to expose the plates. And sitting perfectly still that long was a terrible ordeal, sometimes requiring head braces. But it was okay to blink--the exposure was so slow that a blink didn't register. And people didn't mind sitting through it--after all, a photograph was a kind of immortality! And, for the first time, people could really record how they looked at a certain age, giving society a new appreciation for the unsettling differences between our visage at 20 and at 60.

Daguerreotypes immediately became the rage in Paris. Everyone wanted their photo taken. But some people were worried, too--artists. At first, when photography was announced, artists were somewhat optimistic. Finally they had a way to fix the image of the camera obscura and bring it back to the studio for painting. Daguerre himself had been an artist, and most of the original inventors of photography had intended it as an artists' tool--not as an artistic medium in its own right. However, as photography caught on, artists began to realize that it was going to prove a real menace to their livelihood as portrait painters--particularly painters of miniatures, a business that dropped to zero almost overnight once daguerreotypists learned to hand-color their photographs.

More unsettling, artists had lost the centuries-old battle for more and more detail, more and more realism. And lost it to a machine that could produce detail far beyond any artist. Artists realized that photography was not going to stay in the role that they had hoped, merely a copying aid. Everyone who was anyone wanted his portrait on a daguerreotype, and the little plate was much cheaper than a painting. Artists, nevertheless, used photographs as aids to their own painting, often photographing a scene or a face to save time, and returned to the studio to paint it. No one would call photography an "art," however. Many artists declared that the upstart was vulgar and mechanical, and some would not admit to using it at all. Photographers, on the other hand, more and more argued that photography was an art. That debate raged well into the twentieth century and indeed still sometimes greets photographers today. More than once, when I was more actively entering photographs in juried art shows, the rules would state "no photography."

Nevertheless, in the next 30 years, painters either consciously or unconsciously were strongly influenced in their use of lighting, in composition, in depiction of movement, by photography. Photography brought the philosophy of art to crisis, which ended with artists turning away from the centuries-old quest for realism--which photographers had won--toward a new goal, to paint feelings, interpretations, abstractions, and not necessarily what was there. Photography motivated the beginnings of the twentieth century's non-representational and abstract art.

After Daguerre and Arago announced the new process, a man in England became worried. His name was William Henry Fox Talbot, a wealthy gent with much time for experimenting and, like Daguerre, an accomplished artist. Talbot too was looking for a way to make permanent the images in his camera obscura. He was aware that artists before him, in the 1820s, had managed to make an image permanent, not on metal, but on paper. The problem, though, was that the process was not very workable, and anyway, the image produced was a negative. What use was that?

Talbot experimented with the same paper process, trying to find a better way to make the image permanent. His too was a negative image, but he had an idea no one had thought of before, apparently. By putting the negative image against a second sensitized sheet, and shining light through it, he could produce a positive image. Talbot, therefore, invented the first negative/positive photo process, unlike the daguerreotype, in which every image was on metal, and unique.

When Daguerre announced his process, Talbot was concerned that it was the same as his own. So he quickly published an account of his own method. In the succeeding months of 1839 it became obvious that Talbot's process was totally different from Daguerre's. Talbot dipped paper in salt and, when it was dry, in silver nitrate, forming a light-sensitive chemical, silver chloride. He pointed the sensitized paper in a camera obscura at an object, waited until the image turned dark enough to be seen with the naked eye, about 30 minutes, then fixed the image with a strong salt solution or potassium iodide. In 1840 Sir John Herschel suggested that hyposulfite of soda would more effectively fix the image and remove the unused silver particles, so that they wouldn't darken over time. Herschel is credited with inventing the fixing method we basically still use today in our "wet" darkrooms, called "hypo" for short.

Talbot soon realized that he really wouldn't need such a long exposure--he wouldn't have to wait until the image was actually visible. With a shorter exposure, a hidden, or latent, image would be formed, which could then be brought out by developing in gallic acid. So now we have a negative, development, and fix, a process basically unchanged until the invention of digital imaging. Talbot also waxed the paper, making it more transparent, and called his process the "calotype," Greek for "beautiful picture."

Unlike Daguerre, however, Talbot patented his process. He gave licenses to few. For a dozen years the process hardly grew at all under the stranglehold of the patents. It was not, however, patented in Scotland, allowing pioneer photographers David Octavius Hill and Robert Adamson in the 1840s to produce an important collection of calotypes. W.F. Langenheim in the United States received a license, the only calotype producer in the country.

Meanwhile in France, witnessing the early 1839 announcement was an American, Samuel F.B. Morse. Morse himself had been dabbling in photography, and when he heard Daguerre's announcement he wrote about it right away for his audiences in America. Morse returned to New York City and taught the new process to several students, including Mathew Brady. In 1840 the world's first portrait studio was opened in New York City. We can credit Morse with bringing photography to America--along with his famous invention, the telegraph.

Daguerreotypes stayed popular in America and Europe for about a decade. Everyone had to have one. All the famous people had to have their faces daguerreotyped. The calotype was not nearly as popular, partly because of Talbot's controlling patents. But Talbot did contribute to the history of photography the first photo-illustrated book, his Pencil of Nature. In it he described his process and illustrated it with actual photos attached to the book, charming domestic scenes and descriptions. But his calotypes were thought of as inferior to daguerreotypes, because they lacked the fine detail of the metal-plate process. True enough; a daguerreotype is exquisitely detailed, even by modern standards. The calotype involved printing through a paper negative and, inevitably, the grain of the paper fibers was transferred to the image. This produced soft, almost luminous images. Today we think they truly are beautiful, but given the 1840s emphasis on detail and realism, they were fuzzy. So calotypes never caught on like daguerreotypes, which were produced by the millions. In fact, today collectors can obtain daguerreotypes for a fairly reasonable price. Thousands still exist, in their small leather cases and behind glass, to protect the fragile surface.

However, just about the time Talbot finally decided to cede his patents to the calotype method, technology moved to replace both it and the daguerreotype. The big problem with the calotype was its loss of detail through the paper; if only an emulsion could be spread on glass, this problem and the fragile calotype negative could both be eliminated. Many experimenters tried sticky things like raspberry jam or honey to keep the silver nitrate suspended on a glass plate. Nothing worked. Then in 1848, Niépce de St. Victor, a cousin of Nicéphore, tried albumen, or egg white. It worked all right on glass plates, but it was soon set aside for another method that proved more sensitive to light. In 1851 the Briton Frederick Scott Archer combined guncotton, ether and alcohol into a solution called collodion. The collodion was flowed onto a glass plate, dipped in silver nitrate, and exposed in the camera. The beauty of this method was that it required only a two- to three-second exposure, much faster than previous methods. The drawback was that the wet-plate process demanded that photographers make their exposures before the plate dried and lost its sensitivity to light, about one minute. Photographers, therefore, had to carry portable darkrooms everywhere they wanted to take a picture.

Nevertheless the wet-plate process rapidly became the new standard, totally eliminating the daguerreotype by 1858. An era in photography--that of the unique, one-of-a-kind photograph--had ended. Glass negatives could produce as many prints as needed. The albumen method invented by the Niépce cousin, however, was used extensively for some 30 years for the paper on which prints were made. In fact, millions of egg whites were separated, their yolks sold to bakeries or hog farms.

Wet plates made possible extensive photography outside the studio, because of their superior sensitivity, and despite their darkroom drawback. This is not to say that no photography was done outside a studio before 1851. In 1842, Carl Stelzner made a daguerreotype of the Hamburg fire--the first spot news photo. But the wet-plate process was far superior for outdoor photography, and after 1851 we find the first extensive use of photography to chronicle events and scenery. In 1855 Roger Fenton brought his camera to the Crimean War, becoming the first war photographer. A Chicago photographer named Alexander Hesler is especially important to people around here. In the 1850s he photographed Minnesota, including views of St. Anthony Falls, Fort Snelling, and Minnehaha Falls. He was considered one of the great Midwestern American photographers of the period.

Photographers brought wet-plate darkrooms on their backs or pulled by mules to remote places around the world, from the Arctic to the hot, dusty sands of Egypt. Considering the fragile technology in those difficult conditions and climate extremes, it is astounding what photographs they did get. And they were very good. Probably the most famous of these early on-location photographers is Mathew Brady. Brady was trained by Morse in 1840, and soon opened his own studio in New York. Although ironically and tragically troubled by weak eyesight--blind in his later years--Brady built with partner Alexander Gardner an extremely successful portrait studio in New York, and later in Washington, D.C. Most of the famous statesmen for 30 years were photographed by Brady or his staff, including every president from John Quincy Adams to William McKinley.

Most important, however, were the many portraits Brady made of Abraham Lincoln, beginning before Abe became president. Brady became acquainted with Lincoln, and when the Civil War began (1861-65), he conceived of a new idea: to photograph the war as a complete chronicle from beginning to end. Brady secured permission from Lincoln in one letter reading "Pass Brady," but no money. At that point he needed none. He had acquired $100,000 from his portrait business, a fortune at the time. But by the end of the war Brady had spent it all, and owed more. He financed 20 teams of photographers to cover all the major battle sites. The technology of the time was not fast enough to photograph actual battles, but his haunting photos of battle aftermath perhaps forever changed the picture of war for ordinary civilians.

After the war Brady tried to sell some of his war photos, but they didn't sell well. Most people wanted to forget the war. He gave much of his collection to the U.S. War Department which, in turn, paid some of his bills. Unfortunately the department did not take good care of the fragile collection, and much was lost. You can still acquire Brady photos through the Library of Congress web site.

Other well-known and important pioneer photographers include the Paris photographer Nadar and the British portraitist Julia Margaret Cameron. Nadar, whose real name was Gaspard-Félix Tournachon, set up shop in the mid-1850s and photographed the Paris greats and the Paris scene until about 1880. He was well known for his sensitive portraits. He also took the first aerial photos, from a balloon. Indeed, he actually had his portable darkroom in the balloon's gondola, and developed his plates as the balloon swayed back and forth. Can you imagine!

Cameron is also known for her portraits, especially those of famous people. She was an extremely pushy lady--an early paparazzo?--who would usually not take no for an answer, but her portraits show an unusual sensitivity to the character of the person portrayed. Also significant at this time was the development of the so-called carte-de-visite, around 1854. These were small photos of about three and one-fourth by two and one-eighth inches which were collected and traded somewhat like sports cards are today. It was the rage to have your family and a variety of famous people in your carte-de-visite album.

In 1859 the stereoscope was invented to view photographs. The idea was a bit like what we might call the Viewmaster toy today--two photographs taken at slightly different angles were mounted on a card. The card was placed in the viewer, and like binoculars the two images would blend together to make what appeared to be a three-dimensional image.

Stereo cards and viewers waxed and waned in popularity throughout the Victorian age, and into the twentieth century. Millions were made, some funny, some risqué, and in the latter part of the century almost every home had its stereo viewer and cards--almost like the slide programs of today. You can still find the cards and viewers at a flea market for pretty cheap.

The 1870s marked the big years in the United States for landscape photography in the West. Falls, geysers, canyons, buffalo, Indians, all came under the eye of the western photographers. Many of the best known had been part of Mathew Brady's team, just as many of the cowboys had been in service for the Rebel cause. Tim O'Sullivan is one of the best remembered of these photographers. But perhaps the photographer most significant for changing the way people viewed the world was Eadweard Muybridge.

Muybridge was British by nationality, but spent many years in the U.S. In 1872 he tried to settle, once and for all, the famous old debate among artists and horse riders: Is there a moment when all four legs of a galloping horse are off the ground? No one really knew, because no one could see that fast. Muybridge tried to take action photos of horses, but the technology was not advanced enough to stop the action. In 1878 he tried again. He painted or covered everything, including the track, so it would be white, reflecting as much light as possible on a sunny day. He rigged up twelve cameras, each tripping its shutter by means of a black thread broken by the horse.

The series was successful. The photos showed that, yes, a horse does at moments have all four legs off the ground. They also showed that the way artists had drawn running horses, with legs outstretched, hobby-horse style, was inaccurate. Horses didn't run that way. In doing these "locomotion studies" of animals and people, Muybridge changed the way artists viewed motion. It was found that the camera could see things that people could not, and it changed the way people viewed reality.

The late 1870s saw a third revolution in photography technology. After the daguerreotype and calotype came the wet plate; now chemists experimented with ways to avoid the cumbersome wet procedures by finding a way to make dry plates. In 1871 gelatin was substituted for collodion, and the first dry glass plate was made. It was slower than a wet plate, however. But by 1880, dry plates had become as fast as wet plates, and the cumbersome wet plate died out.

As wet-plate technology was being superseded by dry plates, other portrait styles gained in popular taste. Among people with limited funds, a photograph printed on an emulsion coated on metal--called a tintype--sold extensively. Tintypes were extremely cheap, almost like the photo machines of today, and were made from the 1870s all the way into the 1930s. Also popular were cabinet cards, photos about four inches by six inches in size. These are the photos we all probably have in our shoeboxes, inherited from our grandparents--and probably a few tintypes as well, taken by itinerant photographers.

Among the manufacturers of dry plates was one by the name of George Eastman. Eastman did a fair business selling them, but for him it wasn't enough. If only one could coat the dry gelatin emulsion on a flexible backing. Eastman finally patented his solution and introduced, in 1888, the first roll film. He marketed the film in his own camera, called a Kodak, with the slogan, "You push the button and we do the rest!" The camera came with 100 exposures, and after you'd shot them, you returned the entire thing to Eastman for processing. The pictures that came back were circular. And the technology was so good that, for the first time, you actually could make a decent photo without a tripod.

Of course, this meant that now anyone who could push a button and wind a crank could be a photographer. It revolutionized the industry. For the first time any old amateur could take a photo of any old thing, and cheaply too. The democratization of the image was complete, and what happened to Eastman's company everybody knows.

Mass media history: conclusions.

I believe that whenever we start something, we ought not to leave things up in the air, but to come to some sort of end--and that includes classes. We need closure. That's why I want to say just a little in general about this class. As you know, I set up this class in reverse chronology, starting at the present and working backwards. I used the archaeology metaphor to help us understand the idea of going backwards, which is hard otherwise for most of us to grasp. You've likely never had a course taught that way, or a book designed that way.

Throughout the course I tried to continually bring us back to the present, finding ties and links between the past and what we believe and do today. As we moved back further into history, it became more difficult to find the links. But the links were there. It's just that they have changed so many times since that they were hard to recognize. For instance, if I take a T-bone steak and grill it, the link to the cow is fairly obvious. You see bones, fat, and grain. But if I grind the steak into hamburger, slice it into thin patties, fry it, put it on a bun with pickles and sauces and cheese and stuff, the link to the cow has gone several more stages away, and it's hard to make the connection. I've known "vegetarians" who won't eat steak--but will eat a McDonald's hamburger.

Still, we realize the link is there. Is it possible to find a link, or theme, that helps to explain all American media history, from colonial days to the first Gulf War? This is controversial among historians--some say there is, some say there is not, and we discussed this a little when we talked of "historical schools." I would suggest, however, that if there is a common thread in the general history of the country, then there is a common thread to the history of the country's media, because media history is closely connected to political and cultural history.

Many historians would observe that the theme behind the founding of this country is individual rights and freedoms--the right to vote, the right to criticize, the right to own property. As for the press, we noted that the right to freedom was not at all common during the colonial years of this country, but by the founding of the United States, it was a well-accepted part of our constitution. It did undergo an early attack, in the Alien and Sedition Acts, but generally was a right safeguarded and expanded throughout the nineteenth century.

The 20th century, ironically perhaps, has seen many attempts to limit the right to free expression and criticism. The worst came during World War I, but it reappeared during recent wars such as that of the Persian Gulf. Military and civilian authorities decided to carefully control media coverage--and the majority of Americans supported that control.

We've seen other recent attacks on the right to free expression--a proposed anti-flag-burning amendment, a proposal to censor the Internet for indecent materials. George W. Bush, as the front-running Republican presidential candidate, declared that a satirical web site lampooning his candidacy ought to be banned. Polls have shown that very many Americans would enthusiastically support such legal limits to individual freedom of expression--because they are so offended by indecency, rude attacks, sensationalism and anti-patriotic behavior. Students in this class showed this in their almost unanimous agreement in favor of heavy censorship of Civil War dispatches. In fact, a good share of Americans would like to see a lot more limit to freedom of expression generally in America, including more control over movies and television. It's not at all certain that a First Amendment, if proposed today, would win majority support in this country. It's easy to believe in a "free press" principle--until someone publishes something that offends us. Even women's rights lawyers have battled to control the depiction of women as sex objects in advertising, and lots of us--I hope most of us--are offended by those neo-Nazi broadcasts on the Public Access Cable channel. Few of us don't think that at least some expression, the most obnoxious, should be controlled. After the 9/11 terrorist attacks, the federal government has moved to block access to all sorts of information citizens have the right to see under the Freedom of Information Act. I'd guess the majority of Americans would support this increased secrecy, although it contradicts the foundation of freedom the country was built on, it seems.

We generally believe, it seems, that what shocks us today is different, and certainly more significant, than what shocked colonial Americans in 1750. The topics are different, it's true--but the outrage is the same. Who's to say our outrage over today's "indecency on TV" is more rightly placed than a 1750 British colonist's shock at seeing the king called an incompetent buffoon? In fact, it might be argued that outrage over that kind of publicity is more legitimate, because it may destabilize society and lead to war. In fact it did--certainly the press contributed to fervor for that war, for the War of 1812, the U.S. Civil War, the Spanish-American War, and World War I. It's less certain that it contributed to World War II and more recent wars, but the point is, criticism of political matters seems much more dangerous than, for instance, TV shows or Internet pages depicting naked people.

So if we could find a thread running through American media history, perhaps it's our continued reinterpretation as a society of the phrase "Congress shall make no law...abridging the freedom of the press." Many of us--perhaps the majority--just can't believe that the politicians of that era really meant it. And from the time of the old political press, through the penny press era, yellow journalism, jazz journalism, early radio and later television, to today's Internet, someone, somewhere, is always testing the old concept so central to the American experiment with democracy.

And democracy is an experiment, one in which the media play such a central role. No one, throughout all of recorded history--whether it be the ancient Sumerians, whose empire lasted a thousand years, the Romans, whose empire lasted another thousand, the Venetians, whose city-state lasted another thousand, or the Americans, whose republic has lasted a mere 225 years so far--no one has allowed the kind of free expression that America does. Who knows if this will last a thousand years. The way some groups attack the ideas of free expression and equality that this country was built on, who knows, sometimes, if it will last even through our lifetimes. Democracy is fragile. It's never persisted very long without toppling into empire, dictatorship, kingdom, or some other kind of authoritarian system. And the end of democracy sometimes begins--as it did in Venice--with fearful citizens giving more power to the government as protection against a threat. Terrorism is today's threat--and even today some Americans seem happy to give away more of our democratic rights to government officials in hopes they will protect us from risk. I think terrorism can never destroy a democracy directly, but fear of terrorism could bring its destruction from within. And at the center of the democratic rights threatened in America are freedom of speech and freedom of the press. Perhaps in studying the history of American media you are studying the real firing line of debate on what Americans think this country should be, and what it might become. That's a bit daunting, but certainly something we all need to participate in--if we plan to work and live here.