Convincing Students… of What? Teaching STS in a Changing World

Keynote address delivered by Pamela Mack, STS, Clemson University, at the Southern History of Science and Technology Conference at Auburn University, 15-16 April 2016.

[The following text has been edited for the Newsletter.]

When I started teaching in the early 1980s, what I most hoped to do was convince students to question the authority of science. Today, in lower-level courses, I worry that doing so may encourage their rejection of science, and I don’t want to give ammunition to my undergraduates who don’t believe in global warming or evolution. I don’t think I have become more conservative as I get older, but I do believe that the views of the average undergraduate have changed, as has the culture in which they live. I want to reflect on how my teaching has changed over time to give you some ways of thinking about how we respond to such change.

I want to say first that my focus here will not be on what undergraduates remember. Professors do keep adjusting to what cultural symbols freshmen no longer remember, such as 9/11, and I particularly struggle to explain to students the way we saw the world during the Cold War. But my concern is deeper than that; it is about what assumptions undergraduates start with and how we get them to think about and then question those assumptions. Doing so isn’t easy; in fact, it feels to me like students are increasingly unwilling to question what they consider to be common sense. That may not be true; it may be only an illusion because what seems like common sense to them has become increasingly at odds with the world of my youth. In any case, the assumptions that students come into the classroom with are different from what they were 30 years ago, and what I want to talk about is how that has changed my teaching.

I’m talking here about content and goals more than about methods and technology. I have been, and continue to be, an early adopter of new classroom technology, but I don’t see that as keeping up with the students—I see it as something I do to keep from getting bored. I’m not convinced that new technology is crucial to reaching today’s students; what I do believe is very important is that students be exposed to a variety of teaching methods. To those of you who will be starting out on a teaching career: find what works for you to keep from getting bored.

The central question that I want to emphasize today operates on a deeper level: what larger good do we believe we are doing in the world by our teaching? I came of age in the 1970s, so I mostly use political terms to describe this greater good, though I have been known to describe it as doing God’s work. You may have different ideas and use different words, but I hope that we all have similar thoughts about how our work helps make the world a better place, even if those thoughts change over time. For me, at least, a big part of that betterment is convincing my students to think more carefully about science and the world around them.

Let me tell you a story of how I understood that goal when I started teaching. When I was an undergraduate in the mid-1970s, I was particularly impressed by a local controversy about building a laboratory to do recombinant DNA research in Cambridge, Massachusetts. In July 1976, the summer before my senior year, the Cambridge City Council issued a moratorium on some kinds of recombinant DNA research. This was in the early days of learning to modify the genetic code. Researchers were just beginning to figure out how to insert genes into bacteria, and some of the research had been done using common bacteria and genes that might cause tumors. Some people feared that scientists might create a new disease. My boyfriend was studying astronomy, and some of the astronomers said, “We have to go to distant mountaintops to do our research; why can’t the geneticists at least go to a lab in a more rural area, not in the middle of the city?” The City of Cambridge decided to resolve the question by appointing a committee composed of residents who would educate themselves on the science and make a decision. All these years later I can still remember hearing two members of the committee speak about the experience: a nun and a Tufts professor of urban studies. The committee also included a physician, an engineer, a nurse, and two former city councilors. The Cambridge Experimentation Review Board adopted a jury model and held 75 hours of hearings, listening to testimony from scientists and concerned citizens before coming to a decision to allow recombinant DNA research within the city limits of Cambridge but to require some extra safety precautions. Their final report is available at http://emerald.tufts.edu/~skrimsky/PDF/CERB%20Report%201977. One member, Sheldon Krimsky, wrote a book as a result of the experience: Genetic Alchemy: The Social History of the Recombinant DNA Controversy (MIT Press, 1982).

Those local events made a huge impression on me and became a model for my belief in the importance of citizen participation in decision-making for science and technology. I had grown up in a Massachusetts town that at the time still made decisions by direct town meeting, and observing those town meetings as a high school student gave me some awareness of the games played with public opinion but also made democracy very real to me. The anti-nuclear power movement was another model for citizen participation, showing in a particularly clear way that policy decisions about science and technology were too important to be left to the experts, because the experts’ careers depended on believing that nuclear power was safe. I went off to graduate school at the University of Pennsylvania with a strong belief that research in the history of technology would help advance the cause of citizen participation in science and technology policy. Graduate school didn’t take away my political understanding or the larger purpose it gave my work, though I spoke of it more indirectly back then. I think graduate students today face a different set of patterns, but I hope that all students think about some such deeper motivation.

Since I left graduate school, much of my teaching has involved courses that students take to meet a general education requirement. My goal clearly hasn’t been to prepare students to be historians of science and technology. I don’t remember my understanding of general education when I started, but I would say now that the purpose of general education, at least in the courses I teach, is to prepare students to become good citizens.

What do students need to become citizens with useful opinions about policy questions relating to science and technology? They need some scientific literacy, some knowledge to allow them to feel they have a rough grasp of the technical issues. But more important, they need to understand that we have choices about what kind of world we want to live in, that the direction of technological change is not inevitable. Citizenship in a democracy is about understanding that we collectively are the ones who get to make the choices. History provides rich examples of the impact of technology on society and of how social choices, just as much as technical choices, shape science and technology. There are great stories to tell of technologies that became dominant or failed for reasons that blatantly had more to do with the preferences of society or the obsessions of business leaders than technological superiority or even good business strategy.

Let me give an example. I particularly enjoy teaching about Henry Ford and the assembly line because it is such a rich example of different factors at play. In a freshman-level course I talk about some very basic principles of how capitalism works—how much reward goes to the people who put in the capital for a project, how much goes to the people who put in the labor, and how much goes to consumers in the form of lower prices. The assembly line is in part an example of an increase in efficiency so great that the owners and investors could get rich, the workers could be paid twice the going rate, and the price to consumers could drop dramatically, all at the same time. But the story of the Model T is also a story of Ford’s vision of a car for everyman (if not every woman) and of his old resentment of the Dodge brothers, which led him to minimize the dividends he paid. I want undergraduates to see that this is a human story, to recognize that technology can be shaped by different goals, and also to understand basic economics, which can help them think more clearly about what is fair.

Some of the lessons that I use to convince undergraduates of what it takes to be a good citizen have gotten easier in the last 30 years. Many students have seen examples of family members who became educated patients or caregivers and knew more about some aspects of a disease than the doctors did. It seems more possible now to make the point that you don’t have to be an expert to have a useful opinion. I get somewhat fewer students who feel strongly that all technological progress is good, and I have better examples now to challenge such views. Some will reject the idea that we should use genetic engineering to pick the traits of our children or use robots to care for old people. But I don’t simply feel that my task has gotten easier; our common culture and the views of undergraduates have changed in more challenging ways as well.

I have been blessed with the opportunity to come back in a more public way to what originally motivated me to go into history of science and technology. Since 2004 Clemson has had a requirement that all students take a course dealing with Science and Technology in Society, and I have coordinated an STS program that offers some of the courses and headed the committee that approves any course that meets the requirement. As I think about citizenship, science, and technology today, the two changes that I notice most are that science has lost much of its cultural authority and that undergraduates no longer see technology as something large and distant and out of control.

What has changed my teaching most is how much of this cultural authority science has lost. Students repeat the idea that evolution is only a theory, but, more than that, they see examples of the social construction of science as additional evidence for rejecting whatever science they don’t like. Even though most of those who reject evolution accept the results of modern medicine, they still feel they can pick and choose what they want from science, to the frustration of many scientists. When I started out as a scholar using a social construction approach, I did want to reduce the cultural authority of science. But not this way! I didn’t imagine a challenge to the authority of science based on a rejection of critical thinking and critical evaluation of information. I believed that the authority of science needed to make more room for cultural relativism, for the understanding that things look different when we see them from different perspectives. Instead, at least in the South, the authority of science has been challenged by those who believe that science threatens the certainties they see as essential. I would argue that the world is less fixed than science tells us; the critique of science that has affected many of my undergraduates argues that the world is more fixed than science tells us.

The loss of authority of science has been good for my teaching because it has put me in a position of having to argue both sides much more than I used to. Those of you who are graduate students probably started out in a more balanced place than I did and know this already. This semester I am teaching a junior-level history of science survey, but instead of trying to survey the whole history of science in one semester, I teach it as a case-study course with four books.

We read Principe’s The Scientific Revolution: A Very Short Introduction and Larson’s Evolution, both of which I highly recommend. And then we read Paul Farber’s Mixing Races: From Scientific
Racism to Modern Evolutionary Ideas. I chose the latter book because I wanted to contribute to
the conversation about race at Clemson, where our iconic building is named after Pitchfork Ben Tillman, a 19th-century governor of South Carolina who embraced a virulent form of racism. The choice worked beautifully. The students aren’t very comfortable with studying racism, but the book has such interesting stories of college life and pertains so centrally to questions of the impact of society on science and the impact of science on society that they can’t complain. However, I took a very different approach to the book than I would have 30 years ago. I certainly use the book as a set of examples of how bias creeps into science and how science is used to reinforce prejudice. But I also use it to point out that science is self-correcting. I hadn’t planned to argue that what makes science work is falsifiability, but when a student who had studied some philosophy of science brought up the idea, I ran with it.

When I studied philosophy of science as a graduate student, I was taught that the idea that science is progressive because scientific ideas are falsifiable had been disproved. And yet I wanted to tell my students that science is self-correcting… eventually. I didn’t want them to conclude that science should be rejected because it is full of bias; I wanted to give them a way of understanding how science works so that they would be able to argue with their friends. So I fell back on a strategy I used when I occasionally taught American Women’s History. I came to understand there that my task was not to turn the students into feminists, because trying to do that would have turned them off, but to move them a little way in the direction of more egalitarian ideas about men and women than wherever they started. So I’m satisfied with teaching my undergraduates—mostly non-history majors—that scientists are human but the social organization of science leads it towards self-correction.

The other change I want to talk about is more positive, but it feels equally strange. I notice that undergraduates no longer feel that technology is out of control the way we did in the 1970s. I half-seriously attribute this to the spread of the television remote control in the 1980s, when people started having frequent experience with technology as something that responds to your every whim. Perhaps, instead, it is because they are digital natives who have naturalized the technology with which they live. The downside is that this reduction in anxiety takes some of the energy out of the argument for citizen participation in decision-making for science and technology. Students don’t imagine technological progress as something whose direction they should think about, because technology doesn’t feel like a hostile force.

I have several times co-taught (with a robot engineer) a course on robots and society, and the students we get don’t have many fears of robots. The problem is that they have certain beliefs about how humans will always be superior to robots, such as the belief that robots can only do what they are programmed to do. The robot engineer and I talk about how robots can be programmed to learn and then can base their behavior on what they have learned, but the students still tell us robots can’t learn. Denial seems to be the danger today.

I am increasingly aware of how students don’t think the world they grew up in is going to change, despite all evidence to the contrary. For example, in the fall of 2015 the majority of my freshman students told me that they believed we would never have driverless cars. Students are perhaps particularly resistant to thinking about change at present because they feel threatened by the cultural changes going on today. Most of my students are middle class and white, and come from a culture that perceives that its advantages are threatened. If you have had an unfair advantage and it is taken away, that tends to feel like discrimination. Therefore, the easiest way to resist making our society more fair is to say that change is impossible. It is at least unimaginable for many students.

So it is a simple but powerful thing when we try to convince our students—from our perspective as historians—that the world changes. Last week I asked the students in my environmental history course to talk to their grandparents about growing up in the South without air conditioning. I was struck by how many of the students said they set the thermostat to the same temperature year round, a very different experience of life from living in the South without air conditioning and perhaps without central heating. One of the things I enjoy about getting older is having personal stories to tell about how the world has changed, though I don’t think my students believe me when I talk about programming computers with punch cards in college. Or maybe they do—being older than the internet must seem unimaginably old.

I have argued that a reluctance to face change underlies both the declining authority of science and the reduced fear of technology that is out of control. Since history is the study of change over time, we are in a strong position to give our students opportunities to expand their thinking about what is possible. It seems like a simple thing, but I do believe that we historians contribute to making the world a better place.