Computer Science As a Career

When I entered MIT in September 1949, the von Neumann concept of the stored-program computer was only a few years old: it was the subject of a summer school program at the University of Pennsylvania's Moore School of Electrical Engineering in 1946. It was around 1949 that realizations of the stored-program concept went into operation at Cambridge and Manchester, at the Princeton Institute for Advanced Study, and with the Whirlwind computer at MIT. My introduction to computers came through two undergraduate friends: Bill Eccles, a fellow EE, who had some experience with the IBM Card Programmed Calculator in Frank Verzuh's shop, and Ken Ralston, who entertained us with stories of his experiences with the Whirlwind computer, then under construction. I prepared programs written in assembly language on punched paper tape using Friden Flexowriters, and stood aside watching the myriad lights blink and flash while operator Mike Solamita fed the tapes. When my program crashed, I could watch while the octal contents of memory were dumped onto a CRT screen and snapped by a camera for my later perusal using a microfilm reader. Fifty years later much has changed: a room full of vacuum tubes has become a tiny chip with millions of transistors.

A phenomenon once limited to research laboratories has become an industry producing commodity products that anyone can own and use beneficially. A computer program is still a sequence of instructions obeyed by a processor as if their effects were accomplished one at a time. The only exception to this is the interruption of the program sequence to allow the processor to pay attention to another program, or to some input/output device calling for service. Even our favorite programming languages deal only with the sequential aspect of computation and provide no means for expressing actions that require the use of program interruptions; that is left to operating system calls or, at best, library packages that encapsulate usage patterns of operating system facilities. This situation leaves programming languages unable to express large applications in a modular style such as that supported by the familiar procedure concept in sequential programming languages.
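
To make this concrete, here is a minimal sketch in present-day C, assuming only the POSIX signal facilities and not drawn from any system mentioned in this essay. The language itself expresses nothing but a sequential flow of statements; to react to an asynchronous event the program must reach outside the language, through operating system calls such as sigaction.

    /* Minimal sketch: the C language proper has no construct for
     * asynchronous events.  Reacting to an interruption -- here a
     * POSIX SIGINT arriving while the loop runs -- is delegated to
     * operating system facilities (sigaction, sig_atomic_t). */
    #include <signal.h>
    #include <stdio.h>
    #include <unistd.h>

    static volatile sig_atomic_t interrupted = 0;

    /* Invoked by the operating system, not by any language construct;
     * C merely lets us name the function to be called. */
    static void on_interrupt(int signo)
    {
        (void)signo;
        interrupted = 1;
    }

    int main(void)
    {
        struct sigaction sa;
        sa.sa_handler = on_interrupt;   /* an OS interface, not a language feature */
        sigemptyset(&sa.sa_mask);
        sa.sa_flags = 0;
        sigaction(SIGINT, &sa, NULL);

        while (!interrupted) {          /* the sequential program proper */
            puts("working...");
            sleep(1);
        }
        puts("stopped by an asynchronous event");
        return 0;
    }

Everything the language contributes here is sequential; the concurrency lives entirely in the operating system interface, which is exactly the gap described above.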

I have seen this history from inside a major research university, as a teacher of computer science and as a researcher in what I like to call computer system architecture. In 1960 Professor John McCarthy, now at Stanford University and known for his contributions to artificial intelligence, led the Long Range Computer Study Group (LRCSG), which proposed objectives for MIT's future computer systems. I had the privilege of participating in the work of the LRCSG, which led to Project MAC and the Multics computer and operating system, under the organizational leadership of Prof. Robert Fano and the technical leadership of Prof. Fernando Corbató.

Fano had a vision of the computer utility: the computer system as a repository for the knowledge of a community, holding data and procedures in a form that could be readily shared, a repository that could be built upon to create ever more powerful procedures, services, and active knowledge from those already in place. Corbató's goal was to provide the kind of central computer installation and operating system that could make this vision a reality. With funding from DARPA, the Defense Advanced Research Projects Agency, the result was Multics.

I am proud of the role I played in the activities that led to the construction of Multics. From the work of the LRCSG we envisioned that the hardware for Multics would be a symmetric multiprocessor: several processors having equal access to several banks of main memory. To support the sharing of processor and memory resources by independent programs running for many users at the same time, I advocated the combination of segmentation, inspired by the Burroughs B5000 system, and paging, inspired by the Manchester Atlas computer. This would allow segments containing program modules and units of structured data to be shared by many users without the need for making copies. Ted Glaser, Robert Graham, and I spent much of the 1963-64 academic year visiting most of the major US computer manufacturers to see which of them might be able to meet the requirements we had formulated. IBM put in a valiant effort toward convincing us that System/360 was the right choice, and we had to explain to their top management how the 360 architecture failed to address the problem of achieving the rapid reallocation of resources demanded by the time sharing environment.
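
For readers who have not met the scheme, the following sketch, written in present-day C with purely hypothetical field widths that are not the actual GE 645 or Multics parameters, shows the essential idea: a virtual address names a segment, a page within that segment, and an offset within that page, so the pages of a shared segment can be mapped into many users' address spaces without copying.

    /* Minimal sketch of segmented, paged addressing.  The field widths
     * below are hypothetical illustrations, NOT the GE 645 or Multics
     * values.  Sharing works because many users' address spaces can
     * refer to the same segment's page table; only the mapping is
     * per-user, never the data. */
    #include <stdint.h>
    #include <stdio.h>

    #define OFFSET_BITS 10u   /* hypothetical: 1024-word pages             */
    #define PAGE_BITS    8u   /* hypothetical: up to 256 pages per segment */

    typedef struct {
        uint32_t segment;     /* selects a segment descriptor (shared module or data) */
        uint32_t page;        /* selects a page table entry within that segment       */
        uint32_t offset;      /* word offset within the page                          */
    } decoded_address;

    static decoded_address decode(uint32_t vaddr)
    {
        decoded_address d;
        d.offset  = vaddr & ((1u << OFFSET_BITS) - 1u);
        d.page    = (vaddr >> OFFSET_BITS) & ((1u << PAGE_BITS) - 1u);
        d.segment = vaddr >> (OFFSET_BITS + PAGE_BITS);
        return d;
    }

    int main(void)
    {
        uint32_t vaddr = 0x00345678u;   /* an arbitrary example address */
        decoded_address d = decode(vaddr);
        printf("segment %u, page %u, offset %u\n", d.segment, d.page, d.offset);
        return 0;
    }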

Nevertheless, we found a willing collaborator in John Couleur of the General Electric Company, and developed the Multics hardware, the first computer system with hardware support for large numbers of paged segments of virtual memory, as the GE 645 system, a major variant of the GE 635 product. Rather than work on the software operating system for Multics, I chose to do independent research following the intellectual ideas and issues that arose from my experience with the Multics effort, and the time sharing system some students and I had built using a DEC PDP-1 computer. During the 1960s Project MAC had very generous support from DARPA, and the MIT computer science faculty and graduate students could choose any topic to study, so long as it had some relation to computing. I formed the Computation Structures Group and focused on architectural concepts that could narrow the acknowledged gap between programming concepts and the organization of computer hardware.

I found myself dismayed that people would consider themselves to be either hardware or software experts, yet pay little heed to how joint advances in programming and architecture could lead to a synergistic outcome that might revolutionize computing practice. The agencies were willing to fund pretty wild ideas, and I was supported to do research on data flow architecture, first by NSF and later by the DOE. This work inspired related projects at several companies and research institutions around the world, and earned me the Eckert-Mauchly Award in 1984. Computer science departments had proliferated throughout the universities to meet the demand, primarily for programmers and software engineers, and the faculty assembled to teach the subjects was expected to do meaningful research.

To manage the burgeoning flood of conference papers, program committees adopted a new strategy for papers in computer architecture: no more wild ideas; papers had to present quantitative results. The effect was to create a style of graduate research in computer architecture that remains the conventional wisdom of the community to the present day: make a small, innovative change to a commercially accepted design and evaluate it using standard benchmark programs. This style has stifled the exploration and publication of interesting architectural ideas that require more than a modicum of change from current practice.

The practice of basing evaluations on standard benchmark codes neglects the potential benefits of architectural concepts that need a change in programming methodology to demonstrate their full benefit. Today, much of the excitement in computer science has shifted to various important application domains: medicine, speech processing, support for advanced human interfaces, communications, graphics, and so on. Less funding is available for academic work in the core areas of programming languages, operating systems, and computer architecture. In fact, there are people who consider that these core areas have reached the limit of their potential for innovation.