Sunday, January 27, 2013

Does History Matter?

As an exercise for a class, I wrote this "speech" on whether history matters as if I were delivering it to a chapter meeting of the Association for Computing Machinery. Slightly odd premise for a blog post? Yes, indeed. But it was fun to write and I hope it will be fun for some of you to read.

To be clear, I never gave such a speech, and I'm not a member of the ACM. This was simply the set-up for a writing assignment and a way to demonstrate that I knew some history associated with U.S. higher education.

--+--+--

Good evening fellow ACM members, representatives of the University of Pennsylvania department of computer science, and guests. I’m honored to be here tonight with all of you on this occasion to recognize the 65th anniversary of the Association for Computing Machinery.
Tonight I’d like to consider the question “Does History Matter?” Given the nature of our organization, I’ll approach that question in terms of the 20th-century history of science and engineering and the importance of the government and academic institutions whose history is intertwined with the ACM’s.
For many people, history brings to mind lists of key dates, events, and the people whose names are associated with them, but my appreciation for history comes from the opportunity to think about the context for those events, and what motivated the actions of the people involved. It is with that perspective that I reflect on the birth of the ACM 65 years ago, and the times in which the founders lived.
The ACM is known as an educational and scientific society dedicated to the computing profession, but it is also a society rich with history. The ACM has an active and prominent History Committee that organizes workshops on the history of computing and the ACM, and that supports fellowships for the study of our history. The ACM also names its awards after some of the great contributors to computing through history, such as Grace Hopper, Gordon Bell and Alan Turing.
I’d like to take you back to a time just before the birth of the ACM. The defining events of the early 1940s were surely the sprawling world war engulfing Europe, Asia, and the Pacific, drawing in the United States and affecting people everywhere. Some of the earliest computers were being put to work to support that war, helping to calculate ballistic trajectories and to break the encoded messages of enemy forces. Among these was ENIAC, the computer “born” here in Philadelphia at the University of Pennsylvania. With so much at stake, and with the interest and support of the U.S. government, early computer development advanced at a rapid pace. A British mathematician named Alan Turing, well known to this society and one of the founders of modern computing, helped lead the way. Turing and his team at Bletchley Park, the British military codebreaking center, are credited with cracking German ciphers, and their success with early computing technology was key to the eventual defeat of German forces.
By 1945, the face of the war was changing. Hitler had been defeated, but Japan remained committed to war with the U.S. Here again, the science and engineering culture of academia, in partnership with government, would play an enormous role in international events. Vannevar Bush, an engineer and academic from MIT, served as science advisor to Presidents Franklin Roosevelt and Harry Truman during and immediately after World War II, and as head of the Office of Scientific Research and Development he helped initiate and oversee the Manhattan Project, which led to the successful development of the atomic bomb.
In order to try to bring about the end of World War II, the United States elected to drop atomic bombs on two Japanese cities, Hiroshima and Nagasaki, killing an estimated 200,000 people and forcing the surrender of Japan. It was the first and only use of atomic bombs as weapons of war, and I think we’d all agree that their use was a profound moment in history. Whatever your opinion on war and the use of massive force, man has since that day lived with the understanding that we possess the knowledge and the technology to bring about destruction on an incredible scale, perhaps so large a scale that mankind’s very survival is threatened.
Since the end of World War II, nuclear weapons have spread to several other countries, and delivery systems have evolved from bombs dropped by aircraft over a single city to warheads on intercontinental ballistic missiles that can be launched from a country's own soil. In the decades since, the world has become a much more dangerous place, one in which a few powerful men can trigger massive destruction and loss of life.
Scientists and engineers, most of whom previously had no involvement with large-scale warfare other than as individual soldiers, were forced to confront the fact that the knowledge and the technology they developed could be turned into weapons of mass destruction. For perhaps the first time, theoretical physicists saw how practical and deadly the application of their work could be, and mathematicians’ proofs were now painting bullseyes on military targets. Nevertheless, it was impossible to ignore the value of science and engineering to U.S. military success.
In the years immediately after World War II, Vannevar Bush was still science advisor to the president and still a prominent academic with ties to universities around the United States. There was great interest on the part of the U.S. government in technology, and Bush was able to work with the federal government to establish large-scale funding for academic science and engineering. In his influential 1945 report Science, The Endless Frontier, Bush had argued for sustained federal support of basic research, and this work ultimately led to the creation of the National Science Foundation in 1950, which today is an enormous source of funding for U.S. universities involved in science and engineering research.
By the late 1940s, with the return of war veterans to the U.S. and a G.I. Bill that helped send large numbers of them to college, the U.S. government took stock of higher education in the country. In 1947, the President’s Commission on Higher Education produced a report that considered the promise of an educated citizenry and the role of the United States government in making education accessible. The commission spoke strongly in favor of improved access to education for U.S. citizens, regardless of prevailing impediments such as race or ability to pay, and drew a direct connection among democracy, freedom, and education. The report also recognized that in the aftermath of Hiroshima and Nagasaki we stood at a crossroads, at the beginning of an atomic age with potential “as great for human betterment as for human annihilation,” and that “the future of our civilization depends on the direction education takes.” These are quotes from the commission’s report, as is this: “Education is the foundation of democratic liberties. Without an educated citizenry alert to preserve and extend freedom, it would not long endure.” (President’s Commission on Higher Education, 1947).
This was the historical context into which the ACM was born, in 1947, the same year in which the President’s Commission on Higher Education produced its report. With computers serving prominent roles for the military and research expanding at American universities, computing as a profession was receiving a great deal of attention. It was natural that the foremost researchers and practitioners would want a professional society to support each other and their growing profession.
In the years since, we have seen the development of the minicomputer in the 1960s, and with it a terrific set of partnerships that included government-funded research labs at universities and new players in a computer industry populated with recent graduates of our universities. The PDP and later VAX minicomputers from Digital Equipment Corporation, for example, benefited greatly from the operating systems developed at UC Berkeley’s federally funded computer science laboratories. As the 1960s ended, a similar set of government, industry, and higher education partnerships led to the development of the Arpanet, which, in turn, led directly to the Internet we all use today.

As the 20th century ended and the 21st century began, world-class engineering continued to move from academic research labs directly into American technology companies, resulting in exponentially faster, smaller and more efficient computing. Even desktop microcomputers, once the epitome of personal-sized computing, were giving way to laptop computers and smartphones.

Federal funding of science and engineering research labs in our universities, and the partnerships forged among government, corporations, and universities, have been crucial to the explosive growth in our industry. This growth, in turn, keeps our profession and our professional societies vibrant.

Let me return to the question posed at the start of my remarks. Attention to history allows us to see the real stories behind the evolution of knowledge and technology. The first computers that inspired the founders of the ACM were room-sized, ploddingly slow, and power-hungry. The computer in my phone, a descendant of those first computers, is by comparison lightning-quick, runs for hours on slim batteries, and slips easily into my pocket. Knowing something about the events of the last 65 years allows us to understand the many things that helped to bring that evolution about.

My closing thoughts are these: our history matters a great deal. History puts the facts and the events we think we know into a broader context and helps us to understand not only the “who” and “what” but also the “why” and “how” of important events. The ACM has shown in numerous ways that it cherishes the history of computing. I’m proud to share my appreciation for computing history and the ACM with all of you tonight.

Thank you very much.
References
President’s Commission on Higher Education. (1947). Higher Education for American Democracy. New York, NY: Harper & Brothers Publishers.

3 comments:

  1. "I will not subscribe to the depressing conclusion of Voltaire and Gibbon that history is "the record of the crimes and follies of mankind." Of course it is partly that, and contains a hundred million tragedies. But it is also the saving sanity of the average family, the labor and love of men and women bearing the stream of life over a thousand obstacles; it is the wisdom and courage of statesmen like Winston Churchill and Franklin Roosevelt, dying exhausted but fulfilled; it is the undiscourageable effort of scientists and philosophers to understand the universe that enveloped them; it is the patience and skill of artists and poets giving lasting form to transient beauty, or an illuminating clarity to subtle significance; it is the vision of prophets and saints challenging us to nobility.
    "On the turbulent and sullied river, hidden amid absurdity and suffering, there is a veritable City of God, in which the creative spirits of the past, by the miracles of memory and tradition, still live and work, carve and build and sing. Plato is there, playing philosophy with Socrates; Shakespeare is there, bringing new treasures every day; Keats is still listening to his nightingale, and Shelley is borne on the west wind; Nietzsche is there, raving and revealing; Christ is there, calling to us to come and share his bread. These and a thousand more, and the gifts they gave, are the Incredible Legacy of the race, the golden strain in the web of history. We shall not close our eyes to the evils that challenge us; we shall work undiscourageably to lessen them; but we shall take strength from the achievements of the past, the splendor of our inheritance.
    "Let us, varying Shakespeare's unhappy king, sit down and tell brave stories of noble women and great men."
    WILL DURANT

  2. Great writing, and great and exciting retelling of history. I never heard of the ACM, though much of what you wrote is familiar to me.

    My wife has the pleasure to work with you, and I believe you'd be a great guy, especially based on reading this, and one who would be fascinating to sit back and talk with over coffee.
