In this brief essay we recount some of the history of computing and computing machines in the Department of Mathematics and, to some extent, at WSU in general. By today's standards the capabilities of these early machines seem quaint, but in their time they were wonderful beyond compare. Readers who have ever used a pencil and a slide rule to do a numerical integration or to solve polynomial or differential equations will understand the difficulties these machines relieved.
Electronic stored-program computers were invented in the 1940s, but it was not until the early 1950s that their presence began to be felt outside the world of pure research labs. In 1956 Otis Rechard arrived at what was then Washington State College with the expectation that he would begin the development of electronic digital computing for the whole WSC campus. In 1957 WSC acquired its first computer, an IBM 650, which was housed in a room on the southwest corner of Todd Hall. It had 2000 words of main memory, as well as a large magnetic drum for other storage, and it was about the size of several large refrigerators. As just one example of the difficulties that early programmers faced, consider the following contortion. When a 650 program executed, each instruction the computer read contained both the operation to perform and the address of the next instruction to be executed. The idea was that the programmer would arrange the instructions on the revolving drum in such a way that, just as one instruction was finishing, the next instruction would appear under the read head. To optimize results, programmers needed to know approximately how much time each instruction would take and then set the next address accordingly.
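To see what this "optimum programming" involved, consider a toy model. The 650's drum really did hold 50 words per band, but the timing figures and helper functions below are our own illustrative inventions, not IBM's:

```python
# Toy model of IBM 650-style drum scheduling (illustrative only).
# One band of 50 words revolves past a single read head; while an
# instruction executes for `exec_words` word-times, the drum keeps
# turning, so the cheapest place for the next instruction is the word
# arriving under the head just as execution finishes.

BAND_SIZE = 50  # words per drum band on the 650

def best_next_address(current: int, exec_words: int) -> int:
    """Drum address rotating under the head as `current` finishes."""
    # +1 because the drum has already moved past the word just read.
    return (current + 1 + exec_words) % BAND_SIZE

def wait_time(current: int, exec_words: int, next_addr: int) -> int:
    """Word-times wasted waiting for `next_addr` to come around."""
    ready = (current + 1 + exec_words) % BAND_SIZE
    return (next_addr - ready) % BAND_SIZE

print(best_next_address(10, 4))   # 15 -- no waiting at all
print(wait_time(10, 4, 15))       # 0
print(wait_time(10, 4, 14))       # 49 -- nearly a whole revolution
```

A programmer (or, later, IBM's aptly named SOAP assembler, the Symbolic Optimal Assembly Program) did exactly this bookkeeping for every instruction; a badly placed successor could cost nearly a full drum revolution.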
However, these new machines did not sit well with some of the more traditional members of the Mathematics Department who thought that computers were simply a distraction. So it was not long before a Program in Computer Sciences was formed, followed shortly by a fully independent Department of Computer Sciences with Otis Rechard as chair.
In 1961 the IBM 650 was replaced with an IBM 709. Only 10 of those machines were ever built, as they were soon replaced with the IBM 7090. The two machines were the same except that the 709 used vacuum tubes and the 7090 used solid state transistors. The University Computing Center, a separate entity from the Department of Computer Sciences, was also formed in this period. Johnson Hall was built about 1961, and all of the computing facilities were moved there. In 1962 the Math Department produced its second Ph.D. student, Glenn Ingram, with James Jordan as his major professor. His Ph.D. thesis involved extensive numerical computing with complex twin prime numbers. Glenn went on to a long and successful career in mathematical computing.
By 1966 Washington State College had become Washington State University and the IBM 709 was replaced with an IBM 360. This machine was, perhaps, the first one that could be called a "mainframe" computer, and it came to be used by large numbers of faculty from all over campus. Up until the time of the 360, computers were classified as either scientific or business machines. The difference was, mostly, that a scientific machine had a floating-point processor implemented in hardware, whereas a business machine made do with only integer arithmetic in hardware and performed floating-point computations in software. IBM chose the number 360 for the machine since it was designed to do both types of computing and thus encompassed all points of the compass.
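The distinction can be made concrete with a toy sketch (ours, not IBM's actual routines) of what "floating point in software" meant: a number travels as an integer mantissa and exponent, and the integer hardware does all the work.

```python
# Toy software floating point (decimal and unnormalized for clarity;
# real routines used binary mantissas and normalized their results).
# A number is a pair (mantissa, exponent) meaning mantissa * 10**exponent,
# and only integer arithmetic is ever performed.

def soft_mul(m1: int, e1: int, m2: int, e2: int) -> tuple:
    """Multiply two software floats: multiply mantissas, add exponents."""
    return m1 * m2, e1 + e2

def soft_add(m1: int, e1: int, m2: int, e2: int) -> tuple:
    """Add two software floats by first aligning the exponents."""
    if e1 < e2:
        m1, e1, m2, e2 = m2, e2, m1, e1   # ensure e1 >= e2
    return m1 * 10 ** (e1 - e2) + m2, e2

print(soft_mul(25, -1, 4, 0))   # (100, -1), i.e. 2.5 * 4 = 10.0
print(soft_add(25, -1, 4, 0))   # (65, -1),  i.e. 2.5 + 4 = 6.5
```

Every such operation cost many integer instructions, which is why a hardware floating-point unit mattered so much for scientific work.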
As an example of the capabilities of the 360, it could solve a system of linear equations perhaps as large as 1000 × 1000. However, this accomplishment required several prerequisites. First, it needed a very good program for swapping data in and out of memory at just the right time, as the machine did not have virtual memory. Second, it required enough money to pay the very high user fees. Finally, one needed to wait overnight, at least, while one's program competed with others for use of the machine. This machine cost about $2,000,000, half of which came from an NSF grant and the other half from the WSU budget. These are very large numbers if one considers the difference that inflation makes between then and now.
During this early period computers played essentially no role in the Mathematics Department's undergraduate curriculum. However, in the summer of 1966 Donald Bushaw became Chair and new ideas began to take hold. A committee was appointed to review course offerings, and for the first time the Department felt some pressure to introduce computing to undergraduate students.
In the Beginning, 1966 - 1980
In today's world the term "computing" means digital computing. However, in those early days a different kind of computing was common, called "analog computing." Tyre Newton, of the mathematics faculty (now Emeritus), suffered a fateful encounter with this branch of the subject in 1967. An analog computer uses voltages of continuously varying magnitudes to represent numbers of different sizes. Its main output device is either an oscilloscope or an analog plotter where one can see, for example, a phase-plane plot of a differential equation -- and its main input device is a set of dials on the console for setting parameters such as initial conditions for a differential equation. It was programmed using a set of wires plugged into a bank of holes to control the flow of information. A photograph of such a machine appears on the right; a photo of Tyre appears below left, and his account follows.
In 1967 I spent a summer in Pasadena, California, as a consultant for the Naval Undersea Warfare Center. There, I was introduced to a large analog computer, the EAI 7800, which was interfaced with the Univac 1108 digital computer. One application of this computing power was to simulate the guidance system of a torpedo.
I learned how to make a flow diagram for a given differential system and "wire the board" to simulate it on the analog. As a result of my efforts, I came away captivated by the potential of the analog computer to illustrate ordinary differential equations.
Upon my return to WSU, I convinced our Chair, Don Bushaw, to purchase an EAI TR-20 system consisting of a desktop analog computer, an analog plotter, and an oscilloscope. The front panel of the computer consisted of a patch panel on which the differential system was "wired." Potentiometers, set by hand, controlled the parameters of the system such as coefficients and initial conditions.
I found the basic analog computer to be a series of "black boxes" wired together in order to electronically simulate a system of ordinary differential equations. For this simulation, time is the independent variable while potential differences or voltages represent the dependent variables. These can be read out as traces on an oscilloscope, traces on a plotter, or numbers on a voltage meter. I was then faced with the task of justifying this purchase through its use as a teaching aid. To accommodate this, I mounted the system on an equipment cart so I could roll it into the classroom. From 1967 through the 1980s I continued to use the analog computer as an illustrative device, primarily in calculus and differential equations. This machine is still in working order and is on display in Room 327 of Neill Hall, where the Mathematics Department is situated today.
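For readers who never saw the TR-20 run, the kind of phase-plane trace Newton describes can be mimicked digitally. The sketch below is a minimal illustration, not the analog machine's method: it integrates a damped oscillator (our choice of example equation, step size, and "dial settings") with Euler's method and collects the (x, x') pairs an oscilloscope would have displayed.

```python
# Digital stand-in for the analog computer's phase-plane plot:
# integrate x'' + 0.2 x' + x = 0 (a damped oscillator, chosen here for
# illustration) with Euler's method and record (x, x') pairs -- the
# trace an oscilloscope would have drawn.

def phase_trace(x0=1.0, v0=0.0, dt=0.001, steps=20000):
    x, v = x0, v0            # the "potentiometer" settings: initial conditions
    trace = [(x, v)]
    for _ in range(steps):
        a = -0.2 * v - x     # acceleration from the differential equation
        x, v = x + dt * v, v + dt * a
        trace.append((x, v))
    return trace

trace = phase_trace()
# The damping term shrinks the orbit, so the trace spirals toward (0, 0).
print(abs(trace[-1][0]) < abs(trace[0][0]))  # True
```

On the TR-20 the same picture appeared in real time, with the initial conditions set by turning knobs rather than editing arguments.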
In addition to Tyre's discovery of the analog computer in the summer of 1967, in September 1966 nine new assistant professors had been hired in the WSU Mathematics Department and had begun teaching and research. Three of them remain in Pullman today: David Barnes, Roy Johnson, and Charles Millham -- though Barnes and Johnson retired in the summer of 1998. Many of these new people were interested in the fledgling phenomenon of automatic computing.
However, as much as these factors changed the Department, there were other forces afoot that would, potentially, have an even larger effect on both the Department and the whole University. In 1966 the University acquired the IBM 360 computer mentioned above and made it available to all researchers. For its time it was a great machine, but for numerical calculations it was only about as powerful as the Intel 8080 or 8086 chips found in the first personal computers. But it could do automatic arithmetic and thus, with a proper program, could solve differential equations, invert matrices (if they were not very big), and do other mathematical tasks that were far beyond anything previously possible at WSU.
However, to compare 8080 chips and IBM 360 machines one should realize that the structure of a mainframe or even a mini-computer is much more complex than that of a simple single-user PC, since they are required to handle very different tasks. The mainframe hardware contains, among other structures, something called "channels." These are structures designed to allow communication with many different users at approximately the same time, so comparisons with chips in PC machines may be very unfair to the larger computers. Of course, the operating system for a mainframe is vastly more complex than that of a PC running DOS or even MS-Windows. Mainframes evolved over the years, and in 1982 the largest computer on campus was an Amdahl 470/V8 with 12 megs of main memory, 16 I/O channels, 14.8 gigs of disk storage, and a 64K cache. One of the standard measures of computing speed is the mip, which stands for "million instructions per second." The Amdahl machine ran at 7.4 mips, whereas current PCs run hundreds of times faster and (according to Moore's law) that speed doubles roughly every 18 months.
However, there were all kinds of other difficulties in this era that just do not exist in a modern environment. For example, there was no Internet and in 1966 ordinary modems for telephone communications were not at all common and those that existed ran at 300 or so baud. David Barnes, below right, of the mathematics faculty (now Emeritus) recounts the following example of the joys of early-day computing:
My first experience with "automatic computing" was in 1971 using the IBM 360 and a program that I wrote using the new and wonderful language FORTRAN IV. My program employed Isaac Newton's method to compute the square root of 7. Including some comments, it consisted of perhaps 50 lines of code, and these were punched onto IBM cards, one line to a card. To make the cards we used a keypunch machine in the basement of Sloan Hall and, due to demand, one had to reserve time on it in advance. After writing out my program in great detail on special form paper suited to the FORTRAN card deck format, I went to the machine and carefully punched out my deck of cards that held the FORTRAN commands. It is not possible to erase a typo in such a process, and I often had to use 4 or 5 cards per try to get one good one. Just doing this part of the operation required two or three sessions spread over several days' time.
Finally came the big day. In great fear I slowly walked across campus to Johnson Hall where The Computer was then located and gave my deck of cards to an attendant behind a glass screen which had a small hole cut in it through which the cards could pass (when one communicates with the gods, one must be careful and humble. I wondered if I should bow and walk backward out of the presence of the clerk behind the dividing line between god and man, but I did not). My deck of cards was dumped into a large bin along with many other decks and I was told to come back tomorrow to see the results of feeding my precious program into the bowels of the computing machine. The following day was a Saturday, so I had to wait until Monday. On Monday morning I showed up at the glass window and got my result. In addition to my punched cards the output was about 20 pages of printout paper consisting almost entirely of incomprehensible error messages of one kind or another, and I knew that I must begin my first round of debugging computer code. Somehow I also knew that this was only the first time and that computer code would (as it were) bug me, in one form or another, for a lifetime. I returned to the keypunch and tried again, and again, and again. I walked my deck of cards back and forth like that for two or three weeks and then one day the sun came out, the angels smiled, everything worked, and I found that the square root of 7 is approximately 2.6457. Initially undaunted, I began to write programs to solve differential equations and compute eigenvalues of Sturm-Liouville systems. However, I was soon daunted, as I would never be able to use this setup to get these programs working in the entire time allotted in my life.
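For contrast with the two or three weeks of card-walking Barnes describes, here is his whole computation as a few lines of modern Python (our reconstruction, not his original FORTRAN): Newton's method applied to f(x) = x^2 - 7, i.e. the iteration x_{n+1} = (x_n + 7/x_n) / 2.

```python
# Newton's method for the square root of a: iterate the average of
# x and a/x until x*x is close enough to a.

def newton_sqrt(a: float, x: float = 1.0, tol: float = 1e-12) -> float:
    while abs(x * x - a) > tol:
        x = (x + a / x) / 2
    return x

print(newton_sqrt(7.0))   # ~2.6457513..., matching Barnes's 2.6457
```

The fifty punched cards, the keypunch reservations, and the weekend wait are all replaced by a fraction of a second at an interpreter prompt.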
But soon (about 1972) things began to improve. The Department got its first modem and with it the ability to simply call the central computer on the telephone to transmit and receive programs over wires instead of walking them over to Johnson Hall. Modern modems can operate at speeds up to 56K baud. This early-day modem operated at only 400 baud, with the ability to slow down to 150 baud (or even 50 baud) if telephone line noise began to interfere with the transmission.
Moreover, with the advances of the 70s, it was no longer necessary to punch programs onto cards, as there was a new and wonderful system called WYLBUR that allowed us to compose programs electronically on the machine and to submit them directly, using the modem. WYLBUR contained a primitive line editor that worked on a teletype machine, and results were printed onto a single long roll of paper. The days of a full-screen editor using a CRT tube were still a few years away. About that time a man named Bill Joy, a graduate student at the University of California (Berkeley), was writing a different kind of editor for the CRT, named "vi," which ran on the then new and novel operating system called UNIX. Our current department computers still use a descendant of Berkeley Unix and Joy's vi editor. R.M. Stallman's EMACS editor came still later.
The new modem was an acoustic device; digital signals were represented as audible sound waves of different frequencies. The telephone receiver needed to be inserted into a soundproofing foam rubber cup to make the connection between the central computer and the teletype device. We could remove the receiver and whistle into the cup, and by whistling exactly the right tune we could get output on the teletype machine. But most of us never learned how to whistle well enough to produce anything except random letters.
From the late 1960s until the early 1980s most WSU computing innovations consisted of upgrading the central computer from the IBM 360 to various mixes of Amdahl and/or IBM mainframes. A similar process is still going on today. To support this expansion, a large share of the Department's (and other departments') operating budgets went to support these central computing facilities. There were difficulties with undergraduate computing. The operating systems of those days could handle only a small number of user accounts, so time was allocated in one lump to an entire course rather than to individual students. The result was that time given to a course would usually disappear instantly, leaving many students with no time at all; some students even had to use personal money to do assignments. We know that there were real cases dealing with computer fraud (involving lawyers and police), but the facts were quickly hushed up and are now impossible to resurrect.
In those times computing was not cheap and it soon got to the point that the Department could no longer afford to do all of the computing that it needed to do. As an example of those costs, in 1976 one of our faculty members, Richard Hanson, was allocated $750 for his Math 545 class in numerical analysis. The students used $1,974.11 before being cut off and the record does not show where the rest of the money came from. During that same semester he also used $1,401.83 for a research project on the numerical solution of integral equations. This research money came from his ONR grant, but not everyone had such grants. Hanson was, of course, only one faculty member whereas others, too, incurred such costs. These figures represent 1976 dollars, so they should be multiplied by a factor of 5 or 10 to get equivalent figures for today.
During this period the department was subjected to a sequence of different computing systems. About the time we were comfortable with one system, it would be discarded in favor of a new one, so that many programs required frequent rewriting. As an example, after a great deal of work, Newton was finally able to use the teletype in Sloan Hall to send a system of differential equations to Johnson Hall to get Cal-Comp plotter output. In 1976 he went away for a year on a faculty exchange. Upon his return, he accepted a Master's Degree candidate whose research needed this program. The system had been changed in Newton's absence and the computer interfacing no longer worked. This is just one example that could be repeated many times over by any of the folks who did computing in that era. The advent of mini-computers and especially the PC opened the door for departmental computing. The goal was to escape domination by the University mainframe. We attained this freedom in two ways: 1) establishing our own departmental research computing lab, and 2) establishing a micro-computer lab for undergraduates.
Becoming an Independent Computing Center, 1980 - 1990
The Research Computing Center
In 1980 it was clear that things had to change, and the instruments of that change were just then emerging in the form of the personal computer and the mini-computer. In 1981-1982 the Department had an amazing piece of good luck. As it happened, a "soliton" of money came together from several different sources at the same time to accidentally produce almost $80,000 for computing. One of the major sources was a grant that Edward Pate had to cover the cost of computing for his biological modeling projects. Rather than spend it on the University mainframe, we used his seed money, together with money from the Dean's Office, the standard Department budget, and some other sources to purchase and install a minimal version of a VAX/11-750 mini-computer. It was originally located in room 306 of Sloan Hall, and it ran VAX-VMS version 3.2, VAX FORTRAN, TeX, LaTeX, and some other VMS software. David Barnes was the first manager of the system, serving until 1985.
To use the facility, one had to walk down the hall and sit in room 306. It had 2 megs of main memory, 120 megs of disk space, an LA120 dot matrix printer, and 8 CRT terminals, all located in the same small room (a part of which is pictured in Figure 2). It required a three-phase 240-volt power supply. For backup it used de-mountable disks (called RL02s) having a capacity of 10 megabytes each. They were used the same way that floppy disks are used now, but they were about 18 inches in diameter and weighed about 5 pounds each. Eventually a large amount of other software was added, such as Plot79, Pascal, IMSL, and many others. As a topper there was a 9600-baud modem and primitive email service via BITNET. The machine was connected to the University mainframe and from there to the rest of the world. The Internet was still more than ten years in the future.
These parameters seem antiquated by today's standards, but at the time we were mightily impressed with this wonderful machine. The only need that most of us had for the university computer was a connection to the outside world. At that point (about 1982) we had achieved our independence. As the late Thomas Lutz remarked, we began to "live life in the fast lane." System manager Barnes named the installation TBLCCIW, an acronym for "The Best Little Computing Center In Washington."
The Best Little Computing Center in Washington Circa 1983
In the picture below, the near cabinet housed disk drives, the next one housed more disk drives including the RL02s, the next one over was a TU80 tape drive, and the last one was the VAX 750 CPU unit. The larger box mounted on the far wall was the air-conditioning system that was required to cool off this equipment.
In 1982 John Cannon got a grant from the U.S. Air Force Office of Scientific Research for $125,200 to expand the machine. We used the money to add, among other things, more disk storage, a tape drive, a total of 8 megs of memory, a laser printer, a 600-lpm line printer, and other items. This expansion finally made a complete computing center that was the envy of departments across campus, and indeed across the country. In 1983 the Mathematics Department came of age as a state-of-the-art departmental computing center and has maintained its excellence to this day.
Unfortunately, Digital Equipment Corporation and the VAX-VMS systems simply did not keep pace with the rapid advancement of computing in the late 1980s, and the company was taken over by Compaq in 1997. Our VAX-750 lasted until December of 1997, when the main power supply board failed and we decided that euthanasia was the best course of action. Today the Department uses UNIX running on a variety of hardware platforms. Many of us also use Microsoft Windows and Apple Macintosh machines.
The Undergraduate Computing Center (see photo, below right)
At the same time that the VAX was installed and began to draw faculty and graduate students, a concern was developing to provide computing opportunities to undergraduate students. In the fall of 1980, while on sabbatical leave at the University of California (Santa Barbara), Tyre Newton was introduced to the Apple II+ micro-computer labs in the Mathematics Department there. He spent considerable time observing how those labs were devised for student use, in addition to learning how to program the Apple II+. His first personal programming project was the digital simulation of the Newton-Leipnik strange attractor, a structure that Newton and Roy Leipnik of the UCSB faculty had first discovered on the analog computer and that was then further studied using this simulation on the Apple II+. Newton returned to WSU thoroughly convinced that the micro-computer had a place in the mathematics curriculum. Partially on the basis of this enthusiasm, Michael Kallaher spent a sabbatical at UCSB during the fall of 1980, observing these same labs in addition to conducting his own personal research. He returned to the campus in January of 1981 with a similar enthusiasm for the potential of the micro-computer for students, but money was tight and the best he could get was a single Apple II+ housed in his office. Although a majority of its use was by Thomas Lutz and Tyre Newton, its potential for undergraduate instruction was increasingly clear.
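Newton's Apple II+ program itself does not survive here, but the simulation can be sketched in spirit. The code below integrates the Newton-Leipnik system in the form usually quoted in the dynamics literature, with the commonly cited parameters a = 0.4, b = 0.175; the step size and initial condition are our illustrative choices, not Newton's.

```python
# Newton-Leipnik system, Euler integration -- a digital sketch of the
# kind of simulation Newton ran on the Apple II+. Equations as commonly
# quoted in the literature; initial condition and step size illustrative.
#   x' = -a*x + y + 10*y*z
#   y' = -x - 0.4*y + 5*x*z
#   z' =  b*z - 5*x*y

A, B = 0.4, 0.175

def leipnik_trace(x=0.349, y=0.0, z=-0.16, dt=0.001, steps=20000):
    pts = []
    for _ in range(steps):
        dx = -A * x + y + 10.0 * y * z
        dy = -x - 0.4 * y + 5.0 * x * z
        dz = B * z - 5.0 * x * y
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        pts.append((x, y, z))
    return pts

pts = leipnik_trace()
# The orbit wanders chaotically but remains in a bounded region.
print(pts[-1])
```

On the Apple II+ the (x, y) pairs would have been plotted point by point on the low-resolution graphics screen, building up the attractor over many minutes.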
The years 1982-1983 marked the Mathematics Department's first use of micro-computers in a laboratory setting. This was in the SLIC Apple II laboratory when it was in the Owen Science and Engineering Library. Newton assigned a series of experiments for his Math 315 students. In writing the software for his lab manuals, Newton got considerable help from Leonard Hensheid of the Chemistry Department, and they later jointly published the paper, "An elementary differential equation microcomputer laboratory," Collegiate Microcomputer III, 342-348 Nov. 1984.
Most of the following information (we have only slightly edited it) comes from Michael E. Moody, a former Mathematics Department faculty member who became Chair of the Mathematics Department at Harvey Mudd College in California.
Formalized computing in the mathematics undergraduate curriculum began in about 1986, when Michael Kallaher, then Chair, volunteered me to help set up and manage our first departmental micro-computing lab. We set up this lab in a small seminar room in Sloan Hall using 10 Apple II+ computers. At that time Maple, Mathematica, Derive, and other such programs did not exist, so we used a variety of programs specific to numerical calculus computation and graphing. Several ink-jet printers were shared between computers with a switch box. We were bold and had every section of calculus do a certain number of assignments, and so the lab was very busy, hot, and stuffy. I also began to write lab exercises, which led to the publication of two lab books.
Within two years we clearly were straining the capacity of the room, so we commandeered an adjacent grad student office, knocked a doorway in the wall, and doubled the size of the lab. We now had 15 Apple II computers and several IBM XT computers, but still no computer algebra systems. We continued using that lab until the Department's 1990 move to Neill Hall. It was at that time that we had an opportunity to design a special physical space for the lab, the existing Newton Computer Lab, and also an opportunity to re-evaluate lab computing for the student users. We decided to switch from a PC-based lab to a server/X-terminal lab; I then negotiated with DEC to get in on a 75%-off deal for the equipment that established the first real undergraduate computer lab, with 30 X-terminals and five DECstation servers. We used UNIX, which provided many management advantages, such as one central location for managing student accounts, software licenses, printing paper, and other consumables. The switch to UNIX was a major factor in the success of the lab. David Barnes ran that shop when I was away on sabbatical in 1990-1991, and he had the initial joy of trying to make it all work. Needless to say, it did work, and worked better and better as time went on.
Other faculty made pioneering efforts in undergraduate computing. For example, in 1988 or thereabout, computer algebra systems began to appear, and one of the first attempts at including such facilities in the undergraduate curriculum was by James Jordan. He used the then new program DERIVE in conjunction with his Math 202 class. However, students quickly stole all of the DERIVE lab manuals, and we realized that we had to have much more security in the lab to control abuses.
The lab had resulted from a variety of such experiments. Even earlier, Barnes had conducted another curious experiment in this era of computing, using programmable pocket calculators. At the beginning of the 1980s, personal computers were still quite expensive and not common. However, cheap programmable calculators began to appear for the first time and were well suited to people who had never done mainframe computing before. In 1980 the Department started a course that concentrated on using these little machines to do some of the computing tasks that are included in the Math 448 course, such as solving equations f(x) = 0, numerical integration, and other standard calculus manipulations. However, the window of opportunity for calculators lasted only a few years, and PCs soon dominated this whole area as they became increasingly less expensive and as good numerical software, such as MATLAB and Mathematica, began to appear.
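The calculator course's staple tasks translate directly into a few lines of modern code. The sketch below (our illustration; the example functions, intervals, and tolerances are arbitrary choices) finds a root of f(x) = 0 by bisection and a definite integral by the trapezoid rule.

```python
# Two staples of the programmable-calculator course: root-finding by
# bisection and numerical integration by the trapezoid rule.

def bisect(f, lo, hi, tol=1e-10):
    """Root of f in [lo, hi], assuming f(lo) and f(hi) differ in sign."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid          # sign change in the left half
        else:
            lo = mid          # sign change in the right half
    return (lo + hi) / 2

def trapezoid(f, a, b, n=1000):
    """Integral of f over [a, b] using n equal subintervals."""
    h = (b - a) / n
    total = (f(a) + f(b)) / 2
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

print(bisect(lambda x: x * x - 2, 0, 2))   # ~1.41421356, i.e. sqrt(2)
print(trapezoid(lambda x: x * x, 0, 1))    # ~0.3333, i.e. 1/3
```

On a calculator of the era the same loops were keyed in a few dozen program steps at a time, and a run of either routine could take minutes rather than microseconds.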
Because the undergraduate center evolved ultimately out of Tyre Newton's early efforts to introduce computing into the classroom, the Newton Lab was named in his honor and is now located in Room 101 of Neill Hall. Both the Undergraduate Laboratory and the Research Laboratory have continued to evolve, keeping pace with the rapid evolution of computer hardware and software. Some of the machines use the Microsoft NT operating system in addition to the UNIX systems, and all of them are networked together. This move to NT allows the use of common software packages such as spreadsheets, DOS-based programs, graphics packages, and others that may not run under UNIX.
Predicting the future is always a difficult and/or dangerous task especially in a field such as mathematical computing that is certain to change even more radically in the next 15 years than it has in the last 15 years. Still, a few trends seem to be clear. The Internet continues its exponential growth and will, for good or ill, have a monstrous effect on our methods of teaching at all levels. The speed/cost ratio of hardware will continue the current steep upward spiral well into this new millennium. This will propel the development of new and more useful mathematical software that will have an even more profound effect on the development of mathematics as we know it. It seems that these enticing challenges will occupy the resources of the Mathematics Department for some time to come.
Some older photos from the Mathematics Department