Ordinarily technology changes fast. But programming languages are different: programming languages are not just technology, but what programmers think in. They're half technology and half religion. And so the median language, meaning whatever language the median programmer uses, moves as slow as an iceberg.

-- Paul Graham, Beating the Averages

Libraries are more important than the language.

-- Donald Knuth
|
|
A fruitful way to think about language development is to consider it to be a special type of theory building. Peter Naur suggested that programming in general is a theory-building activity in his 1985 paper "Programming as Theory Building". But the idea is especially applicable to compilers and interpreters. What Peter Naur failed to understand was that the design of programming languages has religious overtones and sometimes represents an activity that is pretty close to the process of creating a new, obscure cult ;-). Clueless academics publishing junk papers at obscure conferences are the high priests of the church of programming languages. Some, like Niklaus Wirth and Edsger W. Dijkstra, (temporarily) reached a status close to that of (false) prophets :-).
On a deep conceptual level, building a new language is a human way of solving complex problems. That means that compiler construction is probably the most underappreciated paradigm for programming large systems, much more so than the greatly oversold object-oriented programming, whose benefits are greatly overstated. For users, programming languages distinctly have religious aspects, so decisions about which language to use are often far from rational and are mainly cultural. Indoctrination at the university plays a very important role: universities were recently instrumental in making Java a new Cobol.
The second important observation about programming languages is that the language per se is just a tiny part of what can be called the language programming environment. The latter includes libraries, IDEs, books, the level of adoption at universities, popular and important applications written in the language, the level of support, and the key players that back the language on major platforms such as Windows and Linux. A mediocre language with a good programming environment can give languages of superior design that come "naked" a run for their money. This is the story behind the success of Java. A critical application is also very important, and this is the story of the success of PHP, which is nothing but a bastardized derivative of Perl (with most of the interesting Perl features removed ;-) adapted to the creation of dynamic web sites using the so-called LAMP stack.
Progress in programming languages has been very uneven and has contained several setbacks. Currently this progress is mainly limited to the development of so-called scripting languages. The field of traditional high-level languages has been stagnant for decades.
At the same time there are some mysterious, unanswered questions about the factors that help a language succeed or fail. Those are difficult questions to answer without some way of classifying languages into different categories, and several such classifications exist. First of all, as with natural languages, the number of people who "speak" a given language is a tremendous force that can overcome any real or perceived deficiencies of the language. In programming languages, as in natural languages, nothing succeeds like success.
The history of programming languages raises interesting general questions about the limits of complexity of programming languages. There is strong historical evidence that a language with a simpler, or even simplistic, core (Basic, Pascal) has a better chance of acquiring a high level of popularity. The underlying fact here is probably that most programmers are at best mediocre, and such programmers tend, on an intuitive level, to avoid richer, more complex languages and prefer, say, Pascal to PL/1 and PHP to Perl. Or at least to avoid them at a particular phase of language development (C++ is not a simpler language than PL/1, but it was widely adopted because of the progress of hardware, the availability of compilers and, not least, because it was associated with OO exactly at the time OO became a mainstream fashion). Complex non-orthogonal languages can succeed only as the result of a long period of language development from a smaller core (which usually adds complexity -- just compare Fortran IV with Fortran 90, or PHP 3 with PHP 5). Carrying the banner of some fashionable new trend by extending an existing popular language to the new "paradigm" is also a possibility (OO programming in the case of C++, which is a superset of C).
Historically, few complex languages were successful (PL/1, Ada, Perl, C++), and even when they were, their success typically was temporary rather than permanent (PL/1, Ada, Perl). As Professor Wilkes noted (iee90):
Things move slowly in the computer language field but, over a sufficiently long period of time, it is possible to discern trends. In the 1970s, there was a vogue among system programmers for BCPL, a typeless language. This has now run its course, and system programmers appreciate some typing support. At the same time, they like a language with low level features that enable them to do things their way, rather than the compiler’s way, when they want to.
They continue to have a strong preference for a lean language. At present they tend to favor C in its various versions. For applications in which flexibility is important, Lisp may be said to have gained strength as a popular programming language.
Further progress is necessary in the direction of achieving modularity. No language has so far emerged which exploits objects in a fully satisfactory manner, although C++ goes a long way. ADA was progressive in this respect, but unfortunately it is in the process of collapsing under its own great weight.
ADA is an example of what can happen when an official attempt is made to orchestrate technical advances. After the experience with PL/1 and ALGOL 68, it should have been clear that the future did not lie with massively large languages.
I would direct the reader’s attention to Modula-3, a modest attempt to build on the appeal and success of Pascal and Modula-2 [12].
The complexity of the compiler/interpreter also matters, as it affects portability: this is one thing that probably doomed PL/1 (and later Ada). These days, however, a new language typically comes with an open-source compiler (or, in the case of scripting languages, an interpreter), so this is less of a problem.
Here is an interesting take on language design from the preface to The D Programming Language book:
Programming language design seeks power in simplicity and, when successful, begets beauty.
Choosing the trade-offs among contradictory requirements is a difficult task that requires good taste from the language designer as much as mastery of theoretical principles and of practical implementation matters. Programming language design is software-engineering-complete.
D is a language that attempts to consistently do the right thing within the constraints it chose: system-level access to computing resources, high performance, and syntactic similarity with C-derived languages. In trying to do the right thing, D sometimes stays with tradition and does what other languages do, and other times it breaks tradition with a fresh, innovative solution. On occasion that meant revisiting the very constraints that D ostensibly embraced. For example, large program fragments or indeed entire programs can be written in a well-defined memory-safe subset of D, which entails giving away a small amount of system-level access for a large gain in program debuggability.
You may be interested in D if the following values are important to you:
- Performance. D is a systems programming language. It has a memory model that, although highly structured, is compatible with C’s and can call into and be called from C functions without any intervening translation.
- Expressiveness. D is not a small, minimalistic language, but it does have a high power-to-weight ratio. You can define eloquent, self-explanatory designs in D that model intricate realities accurately.
- “Torque.” Any backyard hot-rodder would tell you that power isn’t everything; its availability is. Some languages are most powerful for small programs, whereas other languages justify their syntactic overhead only past a certain size. D helps you get work done in short scripts and large programs alike, and it isn’t unusual for a large program to grow organically from a simple single-file script.
- Concurrency. D’s approach to concurrency is a definite departure from the languages it resembles, mirroring the departure of modern hardware designs from the architectures of yesteryear. D breaks away from the curse of implicit memory sharing (though it allows statically checked explicit sharing) and fosters mostly independent threads that communicate with one another via messages.
- Generic code. Generic code that manipulates other code has been pioneered by the powerful Lisp macros and continued by C++ templates, Java generics, and similar features in various other languages. D offers extremely powerful generic and generational mechanisms.
- Eclecticism. D recognizes that different programming paradigms are advantageous for different design challenges and fosters a highly integrated federation of styles instead of One True Approach.
- “These are my principles. If you don’t like them, I’ve got others.” D tries to observe solid principles of language design. At times, these run into considerations of implementation difficulty, usability difficulties, and above all human nature that doesn’t always find blind consistency sensible and intuitive. In such cases, all languages must make judgment calls that are ultimately subjective and are about balance, flexibility, and good taste more than anything else. In my opinion, at least, D compares very favorably with other languages that inevitably have had to make similar decisions.
At the initial, most difficult stage of language development, the language should solve an important problem that is inadequately solved by currently popular languages. But at the same time the language has few chances to succeed unless it fits perfectly into the current software fashion. This "fashion factor" is probably as important as several other factors combined, with the exception of the "language sponsor" factor.
As in women's dress, fashion rules in language design, and with time this trend has become more and more pronounced. A new language should represent the currently fashionable trend. For example, OO programming was the calling card into the world of "big, successful languages" since probably the early 1990s (C++, Java, Python). Before that, "structured programming" and "verification" (Pascal, Modula) played a similar role.
PL/1, Java, C#, and Ada are languages that had powerful sponsors. Pascal, Basic, and Forth are examples of languages that had no such sponsor during the initial period of development. C and C++ are somewhere in between.
But any language now needs a "programming environment", which consists of a set of libraries, a debugger, and other tools (make tool, linker, pretty-printer, etc.). The set of "standard" libraries and the debugger are probably the two most important elements. They cost a lot of time (or money) to develop, and here the role of a powerful sponsor is difficult to overestimate.
While this is not a necessary condition for becoming popular, it really helps: other things being equal, the weight of the sponsor of the language does matter. For example Java, being a weak, inconsistent language (C-- with garbage collection and OO), was pushed down programmers' throats on the strength of marketing and the huge amount of money spent on creating the Java programming environment. The same was partially true for C# and Python. That's why Python, despite its "non-Unix" origin, is a more viable scripting language now than, say, Perl (which is better integrated with Unix and has, for a scripting language, pretty innovative support of pointers and regular expressions), or Ruby (which supported coroutines from day one, not as a "bolted-on" feature as in Python). As in political campaigns, negative advertising also matters. For example Perl suffered greatly from the smear comparing programs in it to "line noise", and then from O'Reilly's withdrawal from the role of sponsor of the language (although it continues to milk the Perl book publishing franchise ;-)
People have proved to be pretty gullible, and in this sense language marketing is not that different from the marketing of women's clothing :-)
One very important classification of programming languages is based on the so-called level of the language. Essentially, once there is at least one successful language on a given level, the success of other languages on the same level becomes more problematic. The best chances belong to languages that sit on an even slightly higher level than their successful predecessors.

The level of a language can informally be described as the number of statements (or, more correctly, the number of lexical units (tokens)) needed to write the solution of a particular problem in one language versus another. Summing an array, for example, takes a dozen or more tokens of explicit loop in C, but only a couple of tokens in a language with a built-in sum operation. This way we can distinguish several levels of programming languages:
Lowest level. This level is occupied by assemblers and languages designed for specific instruction sets, like PL/360.
High level with automatic memory allocation for variables and garbage collection. Languages of this category (Java, C#) typically are compiled not to the native instruction set of the computer they run on, but to an abstract instruction set called a virtual machine.
Some people distinguish between "nanny languages" and "sharp razor" languages. The latter do not attempt to protect the user from his errors, while the former usually go too far... The right compromise is extremely difficult to find.
For example, I consider the explicit availability of pointers an important feature of a language that greatly increases its expressive power, and one that far outweighs the risk of errors in the hands of unskilled practitioners. In other words, attempts to make a language "safer" often misfire. The classic illustration of this expressive power is sketched below.
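Here is a minimal sketch of that point in C (the list layout and function names are illustrative, not taken from any particular codebase): deleting matching nodes from a singly linked list with a pointer-to-pointer, a trick that removes the usual special case for the list head and is simply inexpressible in a language without explicit pointers.

    /* Expressive power of explicit pointers: a pointer-to-pointer
       walks the links themselves, so the head and interior nodes
       are handled by the same code path. */
    #include <stdio.h>
    #include <stdlib.h>

    struct node {
        int value;
        struct node *next;
    };

    /* Remove every node whose value equals 'key'. 'pp' always points
       at the link that may need rewriting. */
    void remove_all(struct node **pp, int key)
    {
        while (*pp != NULL) {
            if ((*pp)->value == key) {
                struct node *doomed = *pp;
                *pp = doomed->next;   /* splice the node out */
                free(doomed);
            } else {
                pp = &(*pp)->next;    /* advance to the next link */
            }
        }
    }

    static struct node *push(struct node *head, int value)
    {
        struct node *n = malloc(sizeof *n);
        n->value = value;
        n->next = head;
        return n;
    }

    int main(void)
    {
        struct node *head = NULL;
        int input[] = { 1, 2, 2, 3, 2 };
        for (size_t i = 0; i < sizeof input / sizeof *input; i++)
            head = push(head, input[i]);   /* list is now 2 3 2 2 1 */

        remove_all(&head, 2);              /* works even when head matches */

        for (struct node *p = head; p != NULL; p = p->next)
            printf("%d ", p->value);       /* prints: 3 1 */
        putchar('\n');
        return 0;
    }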
Another useful typology is based on the expressive style of the language.
The popularity of programming languages is not strongly connected to their quality. Some languages that look like a collection of language-designer blunders (PHP, Java) became quite popular. Java especially became a new Cobol, and PHP dominates dynamic Web site construction. The dominant technology for such Web sites is often called LAMP, which stands for Linux - Apache - MySQL - PHP. Being a highly simplified but badly constructed subset of Perl, a kind of new Basic for dynamic Web site construction, PHP provides a most depressing experience. I was unpleasantly surprised when I learned that the Wikipedia engine had been rewritten from Perl to PHP some time ago, but this illustrates the trend well.
So language design quality has little to do with a language's success in the marketplace. Simpler languages have wider appeal, as the success of PHP (which at the beginning came at the expense of Perl) suggests. In addition, much depends on whether the language has a powerful sponsor, as was the case with Java (Sun and IBM) as well as Python (Google).
Progress in programming languages has been very uneven and has contained several setbacks, such as Java. Currently this progress is usually associated with scripting languages. The history of programming languages raises interesting general questions about the "laws" of programming language design.
Please note that it is one thing to read a language manual and appreciate how good the concepts are, and quite another to bet your project on a new, unproven language without good debuggers, manuals and, most importantly, libraries. The debugger is very important, but standard libraries are crucial: they are the factor that makes or breaks a new language.
In this sense languages are much like cars. For many people a car is the thing they use to get to work and to the shopping mall; they are not very interested in whether the engine is inline or V-type, or whether the transmission uses fuzzy logic. What they care about is safety, reliability, mileage, insurance, and the size of the trunk. In this sense "worse is better" is very true. I have already mentioned the importance of the debugger. The other important criterion is the quality and availability of libraries. Actually, libraries account for perhaps 80% of the usability of a language; in a sense, libraries are more important than the language...
The popular belief that scripting is an "unsafe", "second-rate" or "prototype-only" solution is completely wrong. If a project dies, it does not matter what the implementation language was; so for a successful project on a tough schedule, a scripting language (especially in a dual scripting-language-plus-C combination, for example TCL+C, as sketched below) is an optimal blend for a large class of tasks. Such an approach helps to separate architectural decisions from implementation details much better than any OO model does.
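As a minimal sketch of that dual-language approach (the command name "checksum" and the checksum itself are illustrative inventions; a Tcl 8.x development package is assumed to be installed, linked with -ltcl), the performance-critical routine lives in C and is registered as a Tcl command, while the high-level glue is a plain Tcl script:

    /* Dual-language TCL+C sketch: C supplies a fast primitive,
       Tcl scripts the architecture around it. */
    #include <tcl.h>
    #include <stdio.h>

    /* C implementation of a trivial checksum, exposed as a Tcl command. */
    static int ChecksumCmd(ClientData cd, Tcl_Interp *interp,
                           int objc, Tcl_Obj *const objv[])
    {
        if (objc != 2) {
            Tcl_WrongNumArgs(interp, 1, objv, "string");
            return TCL_ERROR;
        }
        int len;
        const char *s = Tcl_GetStringFromObj(objv[1], &len);
        unsigned long sum = 0;
        for (int i = 0; i < len; i++)
            sum = sum * 31 + (unsigned char)s[i];
        Tcl_SetObjResult(interp, Tcl_NewWideIntObj((Tcl_WideInt)sum));
        return TCL_OK;
    }

    int main(void)
    {
        Tcl_Interp *interp = Tcl_CreateInterp();
        Tcl_CreateObjCommand(interp, "checksum", ChecksumCmd, NULL, NULL);

        /* The high-level "glue" is plain Tcl script. */
        const char *script =
            "foreach word {alpha beta gamma} {\n"
            "    puts \"$word -> [checksum $word]\"\n"
            "}";
        if (Tcl_Eval(interp, script) != TCL_OK)
            fprintf(stderr, "Tcl error: %s\n", Tcl_GetStringResult(interp));

        Tcl_DeleteInterp(interp);
        return 0;
    }

The point of the division of labor: the C side never parses user input or manages program flow, and the Tcl side never touches the performance-critical loop.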
Moreover, even for tasks that handle a fair amount of computation and data (computationally intensive tasks), such languages as Python and Perl are often (but not always!) competitive with C++, C# and, especially, Java.
Here is a timeline of programming languages, modified from BYTE (for the original see BYTE.com, September 1995 / 20th Anniversary):
ca. 1946
- Konrad Zuse, a German engineer working alone while hiding out in the Bavarian Alps, develops Plankalkul. He applies the language to, among other things, chess.
1949
- Short Code, the first computer language actually used on an electronic computing device, appears. It is, however, a "hand-compiled" language.
Fifties
1951
- Grace Hopper, working for Remington Rand, begins design work on the first widely known compiler, named A-0. When the language is released by Rand in 1957, it is called MATH-MATIC.
1952
- Alick E. Glennie, in his spare time at the University of Manchester, devises a programming system called AUTOCODE, a rudimentary compiler.
1957
- FORTRAN --mathematical FORmula TRANslating system--appears. Heading the team is John Backus, who goes on to contribute to the development of ALGOL and the well-known syntax-specification system known as BNF.
1958
- FORTRAN II appears, able to handle subroutines and links to assembly language.
- LISP. John McCarthy at M.I.T. begins work on LISP--LISt Processing.
- Algol-58. The original specification for ALGOL appears. The specification does not describe how data will be input or output; that is left to the individual implementations.
1959
- LISP 1.5 appears.
- COBOL is created by the Conference on Data Systems and Languages (CODASYL).
Sixties
1960
- ALGOL 60, the first block-structured language, appears. This is the root of the family tree that will ultimately produce the likes of Pascal. ALGOL goes on to become the most popular language in Europe in the mid- to late 1960s. Compilers for the language were quite difficult to write, and that hampered its widespread use. FORTRAN managed to hold its own in the area of numeric computations, and Cobol in data processing. Only PL/1 (released in 1964) managed to advance the ideas of Algol 60 to a reasonably wide audience.
- APL. Sometime in the early 1960s, Kenneth Iverson begins work on the language that will become APL -- A Programming Language. It uses a specialized character set that, for proper use, requires APL-compatible I/O devices.
- Discovery of the context-free language formalism. The 1960s also saw the rise of automata theory and the theory of formal languages. Noam Chomsky introduced the notion of context-free languages and later became well known for his theory that language is "hard-wired" in human brains, and for his criticism of American foreign policy.
1962
- Snobol is designed in 1962 at Bell Labs by R. E. Griswold and I. Polonsky. Work begins on the sure-fire winner of the "clever acronym" award, SNOBOL -- StriNg-Oriented symBOlic Language. It will spawn other clever acronyms: FASBOL, a SNOBOL compiler (in 1971), and SPITBOL -- SPeedy ImplemenTation of snoBOL -- also in 1971.
- APL is documented in Iverson's book, A Programming Language.
- FORTRAN IV appears.
1963
- ALGOL 60 is revised.
- PL/1. Work begins on PL/1.
1964
- System/360 is announced in April of 1964.
- PL/1 is released with a high-quality compiler (the F-compiler), which beats most compilers of the time in the quality of both compile-time and run-time diagnostics. Later two brilliantly written and in some respects unsurpassed compilers were added: the debugging and the optimizing PL/1 compilers. Both represented the state of the art of compiler writing. Cornell University implemented a subset of PL/1 for teaching, called PL/C, with a compiler that had probably the most advanced error detection and correction capabilities of any batch compiler of all time. PL/1 was also adopted as the system implementation language for Multics.
- APL\360 is implemented.
- BASIC. At Dartmouth College, professors John G. Kemeny and Thomas E. Kurtz invent BASIC. The first implementation is on a timesharing system. The first BASIC program runs at about 4:00 a.m. on May 1, 1964.
1965
- SNOBOL3 appears.
1966
- FORTRAN 66 appears.
- LISP 2 appears.
- Work begins on LOGO at Bolt, Beranek & Newman. The team is headed by Wally Feurzeig and includes Seymour Papert. LOGO is best known for its "turtle graphics."
1967
- SNOBOL4, a much-enhanced SNOBOL, appears.
- The first volume of The Art of Computer Programming is published in 1968 and instantly becomes a classic. Donald Knuth (b. 1938) later published two additional volumes of his world-famous treatise.
- The structured programming movement starts -- the first religious cult in programming language design. It was created by Edsger Dijkstra, who published his infamous "Go To Statement Considered Harmful" (CACM 11(3), March 1968, pp. 147-148). While misguided, this cult somewhat contributed to the design of control structures in programming languages, serving as a stimulus for the creation of a richer set of control structures in new languages (with PL/1 and its derivative C probably the two popular languages that incorporated these new tendencies). Later it degenerated into a completely fundamentalist and mostly counterproductive verification cult.
- ALGOL 68, the successor of ALGOL 60, appears. It was the first extensible language to get some traction, but generally it was a flop. Some members of the specification committee -- including C.A.R. Hoare and Niklaus Wirth -- protested its approval on the grounds of its overcomplexity. They proved to be partially right: ALGOL 68 compilers proved difficult to implement, and that doomed the language. Dissatisfied with the complexity of Algol-68, Niklaus Wirth begins his work on a simple teaching language which later becomes Pascal.
- ALTRAN, a FORTRAN variant, appears.
- COBOL is officially defined by ANSI.
- Niklaus Wirth begins work on the design of Pascal (in part as a reaction to the overcomplexity of Algol 68). Like Basic before it, Pascal was specifically designed for teaching programming at universities, and as such was deliberately designed to allow a one-pass recursive descent compiler. But the language had multiple grave deficiencies. While a talented language designer, Wirth went overboard in simplifying the language (for example, in the initial version loops were allowed an increment of one only, arrays were only static, etc.). It was also used to promote the bizarre idea of program correctness proofs, inspired by the verification movement with its high priest Edsger Dijkstra -- the first (or maybe the second, after structured programming) mass religious cult in programming language history, one that destroyed the careers of several talented computer scientists who joined it, such as David Gries. Some of the blunders in Pascal's design were later corrected in Modula and Modula-2.
1969
- 500 people attend an APL conference at IBM's headquarters in Armonk, New York. The demands for APL's distribution are so great that the event is later referred to as "The March on Armonk."
Seventies
1970
- Forth. Sometime in the early 1970s, Charles Moore writes the first significant programs in his new language, Forth.
- Prolog. Work on Prolog begins about this time. For a while Prolog became fashionable due to Japan's Fifth Generation initiative. Later it returned to relative obscurity, although it did not completely disappear from the language map.
- Also sometime in the early 1970s, work on Smalltalk begins at Xerox PARC, led by Alan Kay. Early versions will include Smalltalk-72, Smalltalk-74, and Smalltalk-76.
- An implementation of Pascal appears on a CDC 6000-series computer.
- Icon, a descendant of SNOBOL4, appears.
1972
- The manuscript for Konrad Zuse's Plankalkul (see 1946) is finally published.
- Dennis Ritchie produces C. The definitive reference manual for it will not appear until 1974.
- PL/M. In 1972 Gary Kildall implemented a subset of PL/1, called "PL/M" for microprocessors. PL/M was used to write the CP/M operating system - and much application software running on CP/M and MP/M. Digital Research also sold a PL/I compiler for the PC written in PL/M. PL/M was used to write much other software at Intel for the 8080, 8085, and Z-80 processors during the 1970s.
- The first implementation of Prolog appears, by Alain Colmerauer and Philippe Roussel.
1974
- Donald E. Knuth publishes the article that delivers a decisive blow to the "structured programming fundamentalists" led by Edsger Dijkstra: Structured Programming with go to Statements, ACM Computing Surveys 6(4): 261-301 (1974).
- Another ANSI specification for COBOL appears.
1975
- Paul Abrahams (Courant Institute of Mathematical Sciences) destroys the credibility of the "structured programming" cult in his article "'Structured programming' considered harmful" (SIGPLAN Notices, April 1975, pp. 13-24).
- Tiny BASIC by Bob Albrecht and Dennis Allison (implementation by Dick Whipple and John Arnold) runs on a microcomputer in 2 KB of RAM. It is usable on a 4-KB machine, which leaves 2 KB available for the program.
- Microsoft is formed on April 4, 1975, to develop and sell BASIC interpreters for the Altair 8800. Bill Gates and Paul Allen write a version of BASIC that they sell to MITS (Micro Instrumentation and Telemetry Systems) on a per-copy royalty basis. MITS is producing the Altair, one of the earliest 8080-based microcomputers to come with an interpreter for a programming language.
- Scheme, a LISP dialect by G.L. Steele and G.J. Sussman, appears.
- Pascal User Manual and Report, by Jensen and Wirth, is published -- still considered by many to be the definitive reference on Pascal. It was a kind of attempt to replicate the success of Basic, riding the growing "structured programming" movement started by Edsger Dijkstra. Pascal acquired a large following in universities as the compiler was made freely available. It was adequate for teaching, had a fast compiler, and was superior to Basic.
- B.W. Kernighan describes RATFOR -- RATional FORTRAN. It is a preprocessor that allows C-like control structures in FORTRAN. RATFOR is used in Kernighan and Plauger's "Software Tools", which appears in 1976.
1976
The backlash against Dijkstra's correctness-proofs pseudo-religious cult starts:
- Andrew Tanenbaum (Vrije Universiteit, Amsterdam) publishes the paper "In Defense of Program Testing, or Correctness Proofs Considered Harmful" (SIGPLAN Notices, May 1976, pp. 64-68). It makes a crucial contribution to the "structured programming without GOTO" debate and is a decisive blow to the structured programming fundamentalists led by E. Dijkstra.
- Maurice Wilkes, the famous computer scientist and the first president of the British Computer Society (1957-1960), attacks the "verification cult" in his article "Software Engineering and Structured Programming", published in IEEE Transactions on Software Engineering (SE-2, No. 4, December 1976, pp. 274-276). The paper was also presented as a keynote address at the Second International Conference on Software Engineering, San Francisco, CA, October 1976.
- Design System Language, considered to be a forerunner of PostScript, appears.
1977
- AWK was probably the second (after Snobol) string-processing language to make extensive use of regular expressions. The first version was created at Bell Labs by Alfred V. Aho, Peter J. Weinberger, and Brian W. Kernighan in 1977. It was also one of the first widely used languages with built-in garbage collection.
- The ANSI standard for MUMPS -- Massachusetts General Hospital Utility Multi-Programming System -- appears. Used originally to handle medical records, MUMPS recognizes only a string data-type. Later renamed M.
- The design competition that will produce Ada begins. Honeywell Bull's team, led by Jean Ichbiah, will win the competition. Ada never lived up to its promises and became an expensive flop.
- Kim Harris and others set up FIG, the FORTH interest group. They develop FIG-FORTH, which they sell for around $20.
- UCSD Pascal. In the late 1970s, Kenneth Bowles produces UCSD Pascal, which makes Pascal available on PDP-11 and Z80-based computers.
- Niklaus Wirth begins work on Modula, forerunner of Modula-2 and successor to Pascal. It was the first widely used language to incorporate the concept of coroutines.
1978
- AWK -- a text-processing language named after the designers, Aho, Weinberger, and Kernighan -- appears.
- FORTRAN 77: The ANSI standard for FORTRAN 77 appears.
1979
- Bourne shell. The Bourne shell is included in Unix Version 7. It was inferior to the C shell, developed in parallel, but gained tremendous popularity on the strength of AT&T's ownership of Unix.
- C shell. The Second Berkeley Software Distribution (2BSD) is released in May 1979. It includes updated versions of the 1BSD software as well as two new programs by Bill Joy that persist on Unix systems to this day: the vi text editor (a visual version of ex) and the C shell.
- REXX was designed and first implemented between 1979 and mid-1982 by Mike Cowlishaw of IBM.
Eighties
1980
- Smalltalk-80 appears.
- Modula-2 appears.
- Franz LISP appears.
- Bjarne Stroustrup develops a set of languages -- collectively referred to as "C With Classes" -- that serve as the breeding ground for C++.
1981
- C-shell was extended into tcsh.
- Effort begins on a common dialect of LISP, referred to as Common LISP.
- Japan begins the Fifth Generation Computer System project. The primary language is Prolog.
1982
- ISO Pascal appears.
- In 1982 REXX, one of the first scripting languages, is released by IBM as a product, four years after AWK was released. Over the years IBM included REXX in almost all of its operating systems (VM/CMS, VM/GCS, MVS TSO/E, AS/400, VSE/ESA, AIX, CICS/ESA, PC DOS, and OS/2), and has made versions available for Novell NetWare, Windows, Java, and Linux.
- PostScript appears. It revolutionized printing on dot matrix and laser printers.
1983
- REXX is included in the third release of IBM's VM/CMS, shipped in 1983.
- The Korn shell (ksh) was released in 1983.
- Smalltalk-80: The Language and Its Implementation, by Goldberg et al., is published -- an influential early book that promoted the ideas of OO programming.
- Ada appears. Its name comes from Lady Augusta Ada Byron, Countess of Lovelace and daughter of the English poet Byron. She has been called the first computer programmer because of her work on Charles Babbage's analytical engine. In 1983, the Department of Defense directs that all new "mission-critical" applications be written in Ada.
- In late 1983 and early 1984, Microsoft and Digital Research both release the first C compilers for microcomputers.
- In July, the first implementation of C++ appears. The name was coined by Rick Mascitti.
- In November, Borland's Turbo Pascal hits the scene like a nuclear blast, thanks to an advertisement in BYTE magazine.
1984
- GCC development starts. In 1984 Stallman begins his work on an open-source C compiler that becomes widely known as GCC. The same year Steven Levy's book "Hackers" is published, with a chapter devoted to RMS that presents him in an extremely favorable light.
- Icon. R. E. Griswold designs the Icon programming language (see overview). Like Perl, Icon is a high-level language with a large repertoire of features for processing data structures and character strings. Icon is an imperative, procedural language with a syntax reminiscent of C and Pascal, but with semantics at a much higher level (see Griswold, Ralph E., and Madge T. Griswold, The Icon Programming Language, Second Edition, Prentice-Hall, Englewood Cliffs, New Jersey, 1990, ISBN 0-13-447889-4).
- APL2. A reference manual for APL2 appears. APL2 is an extension of APL that permits nested arrays.
1985
- REXX. The first PC implementation of REXX was released.
- Forth controls the submersible sled that locates the wreck of the Titanic.
- Vanilla SNOBOL4 for microcomputers is released.
- Methods, a line-oriented Smalltalk for PCs, is introduced.
- The first version of GCC that was able to compile itself appears in late 1985. The same year the GNU Manifesto is published.
1986
- Smalltalk/V appears--the first widely available version of Smalltalk for microcomputers.
- Apple releases Object Pascal for the Mac.
- Borland releases Turbo Prolog.
- Charles Duff releases Actor, an object-oriented language for developing Microsoft Windows applications.
- Eiffel , another object-oriented language, appears.
- C++ appears.
1987
- PERL. The first version of Perl, Perl 1.000, is released by Larry Wall in 1987. See the excellent Perl Timeline for more information.
- Turbo Pascal version 4.0 is released.
1988
- The specification for CLOS -- Common LISP Object System -- is published.
- Oberon. Niklaus Wirth finishes Oberon, his follow-up to Modula-2. The language was stillborn, but some of its ideas found their way into Python.
- PERL 2 was released.
- TCL is created. The Tcl scripting language grew out of John Ousterhout's work on design tools for integrated circuits at the University of California at Berkeley in the early 1980s. In the fall of 1987, while on sabbatical at DEC's Western Research Laboratory, he decided to build an embeddable command language. He started work on Tcl in early 1988, and began using the first version of Tcl in a graphical text editor in the spring of 1988. The idea of TCL is different from, and to a certain extent more interesting than, that of Perl: TCL was designed as an embeddable macro language for applications. In this sense TCL is closer to REXX (which was probably one of the first languages used both as a shell language and as a macro language). Important products that use Tcl are the Tk toolkit and Expect.
1989
- The ANSI C specification is published.
- C++ 2.0 arrives in the form of a draft reference manual. The 2.0 version adds features such as multiple inheritance and pointers to members.
- Perl 3.0, released in 1989, was distributed under the GNU General Public License -- one of the first major open-source projects distributed under the GPL, and probably the first outside the FSF.
Nineties
1990
- zsh. Paul Falstad writes zsh, a superset of ksh88 which also had many csh features.
- C++ 2.1, detailed in The Annotated C++ Reference Manual by B. Stroustrup et al., is published. This adds templates and exception-handling features.
- FORTRAN 90 includes such new elements as case statements and derived types.
- Kenneth Iverson and Roger Hui present J at the APL90 conference.
1991
- Visual Basic wins BYTE's Best of Show award at Spring COMDEX.
- PERL 4 is released. In January 1991 the first edition of Programming Perl, a.k.a. The Pink Camel, by Larry Wall and Randal Schwartz, is published by O'Reilly and Associates. It describes a new, 4.0 version of Perl, which is released in March of the same year. The final version of Perl 4 was released in 1993. Larry Wall is awarded the Dr. Dobb's Journal Excellence in Programming Award (March).
1992
- Dylan -- named for Dylan Thomas -- an object-oriented language resembling Scheme, is released by Apple.
1993
- ksh93 is released by David Korn. It was the last in the line of AT&T-developed shells.
- ANSI releases the X3J4.1 technical report -- the first-draft proposal for (gulp) object-oriented COBOL. The standard is expected to be finalized in 1997.
- PERL 4. Version 4 was the first widely used version of Perl. The timing was simply perfect: it was already widely available before the Web explosion of 1994.
1994
- PERL 5. Version 5 is released at the end of 1994.
- Microsoft incorporates Visual Basic for Applications into Excel.
1995
- In February , ISO accepts the 1995 revision of the Ada language. Called Ada 95, it includes OOP features and support for real-time systems.
- Ruby. December: first release, 0.95.
1996
- The first ANSI C++ standard appears.
- Ruby 1.0 is released. It did not gain much popularity until later.
1997
- Java. In 1997 Java is released. Sun launches a tremendous and widely successful campaign to replace Cobol with Java as the standard language for writing commercial applications for the industry.
2006
- Ralph Griswold, creator of SNOBOL and Icon, dies (see the obituary below).
2007
- John Backus, creator of FORTRAN, dies (see the obituary below).
2011
- Dennis Ritchie, the creator of C, dies. He was only 70 at the time.
October 13, 2011 | NYTimes.com
Dennis M. Ritchie, who helped shape the modern digital era by creating software tools that power things as diverse as search engines like Google and smartphones, was found dead on Wednesday at his home in Berkeley Heights, N.J. He was 70. Mr. Ritchie, who lived alone, was in frail health in recent years after treatment for prostate cancer and heart disease, said his brother Bill.
In the late 1960s and early '70s, working at Bell Labs, Mr. Ritchie made a pair of lasting contributions to computer science. He was the principal designer of the C programming language and co-developer of the Unix operating system, working closely with Ken Thompson, his longtime Bell Labs collaborator.
The C programming language, a shorthand of words, numbers and punctuation, is still widely used today, and successors like C++ and Java build on the ideas, rules and grammar that Mr. Ritchie designed. The Unix operating system has similarly had a rich and enduring impact. Its free, open-source variant, Linux, powers many of the world's data centers, like those at Google and Amazon, and its technology serves as the foundation of operating systems, like Apple's iOS, in consumer computing devices.
"The tools that Dennis built - and their direct descendants - run pretty much everything today," said Brian Kernighan, a computer scientist at Princeton University who worked with Mr. Ritchie at Bell Labs.
Those tools were more than inventive bundles of computer code. The C language and Unix reflected a point of view, a different philosophy of computing than what had come before. In the late '60s and early '70s, minicomputers were moving into companies and universities - smaller and at a fraction of the price of hulking mainframes.
Minicomputers represented a step in the democratization of computing, and Unix and C were designed to open up computing to more people and collaborative working styles. Mr. Ritchie, Mr. Thompson and their Bell Labs colleagues were making not merely software but, as Mr. Ritchie once put it, "a system around which fellowship can form."
C was designed for systems programmers who wanted to get the fastest performance from operating systems, compilers and other programs. "C is not a big language - it's clean, simple, elegant," Mr. Kernighan said. "It lets you get close to the machine, without getting tied up in the machine."
Such higher-level languages had earlier been intended mainly to let people without a lot of programming skill write programs that could run on mainframes. Fortran was for scientists and engineers, while Cobol was for business managers.
C, like Unix, was designed mainly to let the growing ranks of professional programmers work more productively. And it steadily gained popularity. With Mr. Kernighan, Mr. Ritchie wrote a classic text, "The C Programming Language," also known as "K. & R." after the authors' initials, whose two editions, in 1978 and 1988, have sold millions of copies and been translated into 25 languages.
Dennis MacAlistair Ritchie was born on Sept. 9, 1941, in Bronxville, N.Y. His father, Alistair, was an engineer at Bell Labs, and his mother, Jean McGee Ritchie, was a homemaker. When he was a child, the family moved to Summit, N.J., where Mr. Ritchie grew up and attended high school. He then went to Harvard, where he majored in applied mathematics.
While a graduate student at Harvard, Mr. Ritchie worked at the computer center at the Massachusetts Institute of Technology, and became more interested in computing than math. He was recruited by the Sandia National Laboratories, which conducted weapons research and testing. "But it was nearly 1968," Mr. Ritchie recalled in an interview in 2001, "and somehow making A-bombs for the government didn't seem in tune with the times."
Mr. Ritchie joined Bell Labs in 1967, and soon began his fruitful collaboration with Mr. Thompson on both Unix and the C programming language. The pair represented the two different strands of the nascent discipline of computer science. Mr. Ritchie came to computing from math, while Mr. Thompson came from electrical engineering.
"We were very complementary," said Mr. Thompson, who is now an engineer at Google. "Sometimes personalities clash, and sometimes they meld. It was just good with Dennis."
Besides his brother Bill, of Alexandria, Va., Mr. Ritchie is survived by another brother, John, of Newton, Mass., and a sister, Lynn Ritchie of Hexham, England.
Mr. Ritchie traveled widely and read voraciously, but friends and family members say his main passion was his work. He remained at Bell Labs, working on various research projects, until he retired in 2007.
Colleagues who worked with Mr. Ritchie were struck by his code - meticulous, clean and concise. His writing, according to Mr. Kernighan, was similar. "There was a remarkable precision to his writing," Mr. Kernighan said, "no extra words, elegant and spare, much like his code."
March 20, 2007 | MSNBC.com
John Backus, whose development of the Fortran programming language in the 1950s changed how people interacted with computers and paved the way for modern software, has died. He was 82.
Backus died Saturday in Ashland, Ore., according to IBM Corp., where he spent his career.
Prior to Fortran, computers had to be meticulously "hand-coded" - programmed in the raw strings of digits that triggered actions inside the machine. Fortran was a "high-level" programming language because it abstracted that work - it let programmers enter commands in a more intuitive system, which the computer would translate into machine code on its own.
The breakthrough earned Backus the 1977 Turing Award from the Association for Computing Machinery, one of the industry's highest accolades. The citation praised Backus' "profound, influential, and lasting contributions."
Backus also won a National Medal of Science in 1975 and got the 1993 Charles Stark Draper Prize, the top honor from the National Academy of Engineering.
"Much of my work has come from being lazy," Backus told Think, the IBM employee magazine, in 1979. "I didn't like writing programs, and so, when I was working on the IBM 701 (an early computer), writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs."
John Warner Backus was born in Wilmington, Del., in 1924. His father was a chemist who became a stockbroker. Backus had what he would later describe as a "checkered educational career" in prep school and the University of Virginia, which he left after six months. After being drafted into the Army, Backus studied medicine but dropped it when he found radio engineering more compelling.
Backus finally found his calling in math, and he pursued a master's degree at Columbia University in New York. Shortly before graduating, Backus toured the IBM offices in midtown Manhattan and came across the company's Selective Sequence Electronic Calculator, an early computer stuffed with 13,000 vacuum tubes. Backus met one of the machine's inventors, Rex Seeber - who "gave me a little homemade test and hired me on the spot," Backus recalled in 1979.
Backus' early work at IBM included computing lunar positions on the balky, bulky computers that were state of the art in the 1950s. But he tired of hand-coding the hardware, and in 1954 he got his bosses to let him assemble a team that could design an easier system.
The result, Fortran, short for Formula Translation, reduced the number of programming statements necessary to operate a machine by a factor of 20.
It showed skeptics that machines could run just as efficiently without hand-coding. A wide range of programming languages and software approaches proliferated, although Fortran also evolved over the years and remains in use.
Backus remained with IBM until his retirement in 1991. Among his other important contributions was a method for describing the particular grammar of computer languages. The system is known as Backus-Naur Form.
© 2007 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
A number of readers let us know of the passing of John W. Backus, who assembled a team to develop FORTRAN at IBM in the 1950s. It was the first widely used high-level language. Backus later worked on a "function-level" programming language, FP, which was described in his Turing Award lecture "Can Programming Be Liberated from the von Neumann Style?" and is viewed as Backus's apology for creating FORTRAN. He received the 1977 ACM Turing Award "for profound, influential, and lasting contributions to the design of practical high-level programming systems, notably through his work on FORTRAN, and for seminal publication of formal procedures for the specification of programming languages."

Be afraid, be very afraid (Score:5, Insightful)
by vivaoporto (1064484) on Tuesday March 20, @05:48AM (#18411881)
With both the lack of interest and the distortion of the original goal, Computer Science as we know it may be dying with the elders. Computer Science originally had nothing to do with computers (as in personal computer) per se, but with the science of computation, optimal algorithms for pure math problems, etc. Actually, it was nothing but a branch of Math. The way computer science is being dealt with nowadays, with disdain, lack of interest and with people thinking about it as a tool to put another "screw tightener" professional in the market, soon we may run out of real breakthroughs like the ones those geniuses created to pave the yellow brick road we run over nowadays.
by dario_moreno (263767) on Tuesday March 20, @06:15AM (#18411991)
Maybe it's because I was breastfed with BASIC from a very young age, but when I was forced to learn FORTRAN to work on legacy code I discovered, after some initial, computer-science-taught disgust, that it was really the best way to express myself in code, better than with anything else, and I owe my present university position to FORTRAN because it made me so productive. I guess it was because the language was conceived by engineer- and scientist-oriented types, and not by formal logic adepts or grammar nazis.

I still teach FORTRAN to this day, using F90/F95 in all its power, and students exposed to MATLAB-like tools tend to enjoy it because they can develop simple and efficient numerical codes much faster than with anything else; some of them found positions thanks to it. The trick is to use FORTRAN for what it's for (numerical arrays, heavy linear algebra, easily parallelizable scientific computing) and not string or file manipulation, linked lists (LISP), graphics or system work: for that there is C(++), and tons of libraries. If the code grows larger than 10,000 lines, very strong discipline is necessary, and that's where true OO can be pertinent.

In scientific code FORTRAN tends to be 20% faster than the best possible C++ implementation, because the grammar is so simple that compilers tend to understand the code better and can vectorize or optimize it much further than C; and there is much less overhead than with C++ because the objects are simpler to manipulate. Major code used in the industry (Star-CD and Gaussian, for instance) is still written in FORTRAN for those (and legacy) reasons.
Re: rest in peace (Score:4, Insightful)
by Wormholio (729552) on Tuesday March 20, @08:04AM (#18412433)

I too still teach my students (in physics and astronomy) to use Fortran, for many of the reasons listed above. While it may also be useful for them to go on to learn other languages, their primary focus is on the physics problems they need to solve and the numerical algorithms needed to help them do that. Fortran makes it easy for them to get started and then focus on the calculations, not on grammar or philosophy.
Fortran has been criticized because you can write "spaghetti code" or other crap, while other languages supposedly protect you from the mistakes you can make in Fortran. But you can write crappy code in any language (including "spaghetti classes"). I teach my students to write with good style. They know their code has to be clearly understandable not just to the machine but also to someone else who is familiar with the goal of the code but not the details. Trying to enforce good style through grammar is misguided at best, just as it is in writing in general. Developing good style is a personal, ongoing process for writing anything, including good code.
BNF is an acronym for "Backus Naur Form". John Backus and Peter Naur introduced, for the first time, a formal notation to describe the syntax of a given language (this was for the description of the ALGOL 60 programming language; see [Naur 60]). To be precise, most of BNF was introduced by Backus in a report presented at an earlier UNESCO conference on ALGOL 58. Few read the report, but when Peter Naur read it he was surprised at some of the differences he found between his and Backus's interpretation of ALGOL 58. He decided that, for the successor to ALGOL (in which all participants of the first design had come to recognize some weaknesses), the syntax should be given in a similar form, so that all participants would be aware of what they were agreeing to. He made a few modifications that are almost universally used, and drew up on his own the BNF for ALGOL 60 at the meeting where it was designed. Depending on how you attribute presenting it to the world, it was either by Backus in 59 or Naur in 60. (For more details on this period of programming language history, see the introduction to Backus's Turing Award article in Communications of the ACM, Vol. 21, No. 8, August 1978. This note was suggested by William B. Clodius from Los Alamos Natl. Lab.)
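To illustrate the notation itself (a toy grammar for arithmetic expressions, not a fragment of the ALGOL 60 report), each BNF rule names a syntactic category in angle brackets on the left and lists its alternative forms, separated by vertical bars, on the right:

    <expression> ::= <term> | <expression> + <term> | <expression> - <term>
    <term>       ::= <factor> | <term> * <factor> | <term> / <factor>
    <factor>     ::= <number> | ( <expression> )
    <number>     ::= <digit> | <number> <digit>
    <digit>      ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9

Such rules are recursive (an <expression> can contain another <expression>), which is exactly what made the notation a good match for the nested, block-structured syntax of ALGOL 60.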
1. The Report on the language used a formal syntax specification, one of the first, if not the first, to do so. Semantics were specified with prose, however.
2. There was a distinction between the publication language and the implementation language (those probably aren't the right terms). Among other things, it got around differences such as whether to use decimal points or commas in numeric constants.
3. Designed by a committee, rather than a private company or government agency.
4. Archetype of the so-called "Algol-like languages," examples of which are (were?) Pascal, PL/I, Algol 68, Ada, C, and Java. (The term Algol-like languages is hardly used any more, since we have few examples of contemporary non-Algol-like languages.)
However, as someone who actually programmed in it (on a Univac 1108 in 1972 or 1973), I can say that Algol60 was extremely difficult to use for anything real, since it lacked string processing, data structures, adequate control flow constructs, and separate compilation. (Or so I recall... it's been a while since I've read the Report.)
The following exchange comes from a transcript given at the 1978 conference which the book documents:
CHEATHAM: The next question is from Bernie Galler of the University of Michigan, and he asks: "BNF is sometimes pronounced Backus-Naur-Form and sometimes Backus-Normal-Form. What was the original intention?"
NAUR: I don't know where BNF came from in the first place. I don't know -- surely BNF originally meant Backus Normal Form. I don't know who suggested it. Perhaps Ingerman. [This is denied by Peter Z. Ingerman.] I don't know.
CHEATHAM: It was a suggestion that Peter Ingerman proposed then?
NAUR: ... Then the suggestion to change that I think was made by Don Knuth in a letter to the Communications of the ACM, and the justification -- well, he has the justification there. I think I made reference to it, so there you'll find whatever justification was originally made. That's all I would like to say.
Ralph Griswold died two weeks ago. He created several programming languages, most notably Snobol (in the 60s) and Icon (in the 70s) -- both outstandingly innovative, integral, and efficacious in their areas. Despite the abundance of scripting and other languages today, Snobol and Icon are still unsurpassed in many respects, both in elegance of design and in practicality.
See also Ralph Griswold 1934-2006 and Griswold Memorial Endowment
Ralph E. Griswold died in Tucson on October 4, 2006, of complications from pancreatic cancer. He was Regents Professor Emeritus in the Department of Computer Science at the University of Arizona.

Griswold was born in Modesto, California, in 1934. He was an award winner in the 1952 Westinghouse National Science Talent Search and went on to attend Stanford University, culminating in a PhD in Electrical Engineering in 1962.
Griswold joined the staff of Bell Telephone Laboratories in Holmdel, New Jersey, and rose to become head of Programming Research and Development. In 1971, he came to the University of Arizona to found the Department of Computer Science, and he served as department head through 1981. His insistence on high standards brought the department recognition and respect. In recognition of his work the university granted him the title of Regents Professor in 1990.
While at Bell Labs, Griswold led the design and implementation of the groundbreaking SNOBOL4 programming language with its emphasis on string manipulation and high-level data structures. At Arizona, he developed the Icon programming language, a high-level language whose influence can be seen in Python and other recent languages.
Griswold authored numerous books and articles about computer science. After retiring in 1997, his interests turned to weaving. While researching mathematical aspects of weaving design he collected and digitized a large library of weaving documents and maintained a public website. He published technical monographs and weaving designs that inspired the work of others, and he remained active until his final week.
-- Gregg Townsend, Staff Scientist, The University of Arizona
Abstract
This is a personal tour of programming languages which I have encountered since I first learnt to program a computer in 1959.
Assembly codes
Assembly languages have changed over the years, starting very simple (binary or decimal instruction codes and operands), growing to accept mnemonic operation codes and operands in place of numeric ones, and adding macro and conditional assembly features. Nowadays programming in assembly language is rarer than it used to be, and rather simple mnemonic assemblers, sufficient to process the output of a compiler, are used.
The English Electric DEUCE was programmed in binary (one 32-bit word to each row of a punched card). Each instruction had to specify the location of the next instruction to be obeyed, and the way to get a fast program was to place instructions in the mercury delay lines such that there was no unnecessary waiting between instructions. An interesting technical challenge was to write a bootstrap program of twelve instructions on a single punched card. During my time as a pre-university student at the English Electric Company, a staff member was in the process of writing an assembler which would, inter alia, look after instruction placement.
The EDSAC 2 and Titan (ATLAS 2) computers at Cambridge University had decimal assemblers. Both operation codes and operands were specified by decimal numbers. Both computers had built-in firmware to perform common tasks, for example conversion between binary and decimal. The order code of the Titan was designed so that it would be relatively easy to remember the decimal operation codes (the regularities in the order code showed most clearly in decimal).
The most powerful assembler with which I have worked was Control Data's COMPASS macro assembler for the CDC 6000 and 7000 Series central (CP) and peripheral (PP) processors. Operations and register operands were specified by mnemonics. CP instructions were written in a particularly user-friendly form: "SA1 A0+B1" denoted "set address register A1 to the sum of address register A0 and index register B1" (this initiated a read from central memory address A1, as reading was the function of registers A1 to A5). COMPASS was a classical two-pass assembler with macro and conditional assembly features, and generated a full listing showing both the source assembly code and the generated machine code (in octal). CDC's operating systems were written almost entirely in COMPASS assembly language. So too, at CERN, was the software for the Remote Input/Output Stations (RIOS, running on Modular 1 computers) and for SUPERMUX (terminal concentrators running on HP 2100 computers); the technique was to define a set of COMPASS macros for the target computer's order code, and to post-process the output of COMPASS to shorten 60-bit words to the word length of the target.
Cross assemblers were written at CERN:

in BCPL for the ModComp computers of the CERNET packet switching network, and for the I8080, M6800 and TMS9900 microprocessors;
in Pascal for the M6800/6801/6809 and M68000/68020/68030+68881 microprocessors.
This list is not exhaustive. There were certainly others, for example the cross assemblers for the ESOP and XOP bit slice processors.
Structured assembly languages
Structured assembly languages provide a mixture of access to machine specific features such as registers and addressing modes, and high level language constructs such as begin-end blocks, procedures, if and case statements, and for and while loop constructs. The first such language was Niklaus Wirth's PL/360. At CERN, Robert Russell created PL-11 (for PDP-11) and later PL-VAX.
I had brief contact with PL-VAX, and with the PL/M language for Intel microprocessors.
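PL/360 itself is long gone from everyday use, but the flavor of structured assembly survives in GCC's inline assembler, which likewise mixes register-level instructions with high-level control structures. A minimal sketch, assuming an x86-64 target and GCC or Clang (the variable names are invented for the example):

    #include <stdio.h>

    int main(void)
    {
        unsigned a = 6, b = 7, r;

        /* One machine instruction embedded in C: r = a * b, computed in
           registers chosen by the compiler (x86-64 imull). */
        __asm__("imull %2, %0" : "=r"(r) : "0"(a), "r"(b));

        printf("6 * 7 = %u\n", r);
        return 0;
    }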
Autocodes
My first programming language was DEUCE Alphacode. The language provided a set of floating point variables (X1, X2, ..., X2200) and a smaller number of counting (integer) variables (N1, N2, ..., N63). One line of code could perform a single operation, for example "X1 = X2 + X3" or "X4 = ROOT X5". The statements of Alphacode were usually interpreted, not compiled. Writing and using an Alphacode program was an improvement on performing pre-specified calculations on an electro-mechanical calculating machine, my previous activity at English Electric.
EDSAC 2 and Titan Autocode at Cambridge University supported operations on a fixed set of integer variables (I, J, ...) and of floating point arrays (A[5], B[7], ...). Autocode was a compiled language. Since I was interested in algorithms for syntactic analysis of programming languages at that time, the lack of arrays of integers ruled out the use of Autocode during my doctoral studies.
Algol 60
I was introduced to Algol 60 while working as a summer student at Elliott Automation, using the Elliott 803 computer. I remember being amazed when shown a recursive algorithm for the travelling salesman problem, and discovering that Algol did indeed allow a function to call itself.
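For the curious, here is the same kind of self-calling exhaustive search sketched in C rather than Algol 60 (the four-city distance matrix is made up for the example). The procedure tour() calls itself once for every city not yet visited, which is exactly what Algol 60 permitted and contemporary FORTRAN did not:

    #include <stdio.h>
    #include <limits.h>

    #define N 4

    /* Made-up symmetric distance matrix for a four-city example. */
    static const int dist[N][N] = {
        {  0, 10, 15, 20 },
        { 10,  0, 35, 25 },
        { 15, 35,  0, 30 },
        { 20, 25, 30,  0 }
    };

    static int visited[N];

    /* Extend a partial tour ending at 'city' with 'count' cities already
       visited; return the cost of the cheapest way to complete it. */
    static int tour(int city, int count)
    {
        if (count == N)
            return dist[city][0];          /* close the loop back to city 0 */
        int best = INT_MAX;
        for (int next = 0; next < N; next++) {
            if (!visited[next]) {
                visited[next] = 1;
                int cost = dist[city][next] + tour(next, count + 1);
                if (cost < best)
                    best = cost;
                visited[next] = 0;
            }
        }
        return best;
    }

    int main(void)
    {
        visited[0] = 1;                    /* tours start and end at city 0 */
        printf("shortest tour: %d\n", tour(0, 1));
        return 0;
    }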
FORTRAN
I first encountered FORTRAN IV when I arrived at CERN. How could a computer programmer live to the age of 26 years before using FORTRAN? The answer lies in the traditions of Cambridge University's Mathematical Laboratory, now renamed the Computer Laboratory. Shortly after I left Cambridge, the University bought an IBM computer for its computing service, thereby introducing FORTRAN.
FORTRAN was the most important computer language at CERN, so much so that people used to say, only half jokingly, that whatever computer language physicists were using in five, ten or fifteen years' time, it would be called FORTRAN. Only when CERN decided to move to C++, instead of making the transition from FORTRAN 77 to Fortran 90, did the use of FORTRAN decline.
Although Pascal and C were the most frequently used high level languages on microprocessors at CERN, FORTRAN was used too. An extreme example was Hans von der Schmitt's RTF/68K (Real Time Fortran 77 for 68K Processors), whose compiler was written in a compiler description language which was translated into FORTRAN by a compiler generator, also written in FORTRAN.
SYMPL
SYMPL was an Algol-like systems implementation language created by Control Data for its 6000 and 7000 Series computers. I used it to write the CP (central processor) part of the 6000 Series software for CERNET; the PP (peripheral processor) part was written in COMPASS assembly language. Dietrich Wiegandt, writing CERNET software for the IBM System 370, had to write it all in assembly language. I was the more fortunate person.
BCPL
Martin Richards' BCPL is known as an ancestor of the C programming language. BCPL was used at CERN to write CERNET node software for ModComp II computers, and CERNET host software for PDP-11 and Nord 10. The BCPL compiler was written in BCPL. It ran at CERN as a native compiler for CDC 6000, IBM 370, VAX, PDP-11, Nord 10, HP 2100; and as a cross compiler for ModComp II and TMS9900. Cross assemblers, a linker, a librarian and pre-loaders ("pushers") were written in BCPL for ModComp, I8080, M6800 and TMS9900.
Ada
Ada 83 was tried at CERN by an ad hoc evaluation group, but never went much further. It was however used to write the error message module of the MODEL data acquisition system. One Ada critic felt that the inter-task rendezvous mechanism was unsuitable for real-time data acquisition systems. My own experience was that type checking was so strict that even carefully constructed code could be rejected by the compiler for no understandable reason.
Pascal
The second generation of CERN cross software for microprocessors was written in Pascal. This included cross assemblers for M6800/6801/6809 and M68K (M68000/68020/68030+68881), and M68K cross compilers for Pascal, Modula-2, FORTRAN 77 and C (unfinished). The FORTRAN cross compiler was little used as there were well established UNIX-derived f77 and C cross compilers. MoniCa, a debugging monitor for M68K, was written in Pascal and assembly language. MoniCa was used in the ALEPH event builder, in VALET-Plus test systems, and elsewhere. The Pascal cross compiler, which originated in Siemens research laboratories and was extended at CERN to become a multi-language compiler, was sufficiently good to see off competition from some commercial Pascal compilers. A few years later, however, DD divisional management decided that the time had come to stop developing microprocessor cross software.
At the height of its use at CERN, Pascal was considered a safe language in which to write programs, whereas C and BCPL were considered dangerous.
Modula-2
Modula-2, Niklaus Wirth's successor to Pascal, was used on M68K hardware in the LEP control system.
At about the same time, Robert Cailliau in PS division was promoting the use of his enhanced Pascal, P+.
In both cases, necessary Pascal extensions included facilities for separate compilation and for bit operations.
Modula-2 was capable of running on a microprocessor without operating system, since its SYSTEM module provided the minimum necessary for accepting interrupts and switching context.
C
C came to CERN with UNIX (on PDP-11 and then on VAX), and with the OS-9/68K operating system (on M68K) which was used in LEP experiments' data acquisition systems. It is now the dominant language for control and data acquisition systems. Much code which used to be written in assembly language is today written in C.
The introduction into C of function prototypes, and the existence of instrumentation and debugging tools such as Purify and Insure++, have made programming in C less hazardous than it used to be.
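A minimal sketch of the difference prototypes make (the function half() is hypothetical, chosen only for illustration): in old-style C a call such as half(3) would silently pass an integer bit pattern to a function expecting a double, whereas with the prototype in scope the compiler checks the call and converts the argument:

    #include <stdio.h>

    /* Prototype: from here on, every call to half() is checked against
       this signature, and integer arguments are converted to double. */
    static double half(double x);

    int main(void)
    {
        printf("%g\n", half(3));   /* 3 is quietly converted to 3.0 */
        return 0;
    }

    static double half(double x)
    {
        return x / 2.0;
    }

Tools such as Purify and Insure++ catch the run-time half of the problem, for example reads of uninitialized or freed memory, which no prototype can detect.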
The GNU C compiler has set a high standard, below which no commercial C compiler is viable.
C is the programming language with which I work most often today.
C++
C++ has taken over the role of FORTRAN as CERN's scientific data processing language. I have followed the development of C++ with interest, have helped to solve some problems with the use of C++, but have not become a practising C++ programmer. So take my views with a pinch of salt.
The definition of C++ and its run-time library has taken a long time to stabilize. Early users will have discovered how much the language and library have changed over the years. Some of the newer features are still not correctly implemented by current compilers. C++ is so powerful that a programmer can write programs which his colleagues, and perhaps he himself, cannot understand. So select the C++ features which you use with care, look at some of the generated code, for example via the compiler's assembly-output option such as g++ -S (it may be more than you expect), and use a good debugging tool (such as Insure++).
Java
I admire Java and its class libraries, have been to a Java programming course and read the nutshell book, but do not actually write Java programs today. So my views on Java may be worth even less than my views on C++. Let's simply say that I like Java better than C++, and I think its use will spread well beyond its current application area of Web-based human interfaces.
Acknowledgments
In this article I have touched on the work of too many CERN colleagues to mention every one of them by name. Ian Willers was the driving force behind the use of BCPL at CERN. Horst von Eicken drove the second generation of microprocessor support software. Jean Montuelle, David Foster and Jonathan Caves made major contributions to the microprocessor cross software.
About the author(s): Julian Blake was the leader of the DD/SW/LM (languages and microprocessor support) section before it was dissolved.
John Reid, convenor of the ISO Fortran standards committee, has posted the following announcement to some Fortran-related forums: 'I am pleased to tell you that the draft Fortran 2000 standard is now out for comment. ... The J3 (USA Fortran committee) version, which is identical except for the title page and the headers and footers, is available in ps, pdf, text, or source (latex). This is a very significant milestone for Fortran 2000. It is a major extension of Fortran 95 that has required a significant amount of development work by the J3. ... The abstract of the revision, which lists the major enhancements, is appended. I have written an informal description of the new features, which will be published in the next issue of Fortran Forum (about to appear).'
Re: XPL Language
From comp.compilers
From: [email protected] (Steve Meyer)
Newsgroups: comp.compilers
Date: 27 Aug 2000 22:27:57 -0400
Organization: Compilers Central
References: 00-06-118 00-07-016 00-07-075 00-08-018 00-08-028 00-08-055 00-08-083
Keywords: history

I am not sure it makes sense to continue this historical discussion,
but I think there is a lot more to the story. The roots of modern
computing lie in this story. For example, although both PL360 and XPL
were available, why did Professor Knuth use assembler in his Art of
Computer Programming books? Also, these original languages (and the
Bell Labs counterparts) arose in academic (Letters and Science)
computer science departments, but now computing is studied in EE
departments.
On 13 Aug 2000 19:10:55 -0400, Duane Sand <[email protected]> wrote:
>Steve Meyer wrote in message 00-08-055...
>>>>>: Peter Flass <[email protected]> wrote:
>>>>>: > XPL, developed in the 1970's was one of the earliest "compiler
>>>>>: > compilers", was widely ported, and was the basis for a number
>>>>>: > of other languages such as the PL/M family.
>>
>>I think PL/M and XPL came from different worlds that did not
>>communicate. I think people saw XPL as too high level. I think PL/M
>>came from other system level languages such as PL/360 (?). My
>>recollection may not be right.
>
>Niklaus Wirth developed PL360 as an alternative to writing IBM360
>assembly code directly. It was a quick one-person project. The
>parser used "operator precedence' techniques which predated practical
>LR methods. The tables could be worked out by hand in no time but the
>method couldn't handle BNFs of most languages. It was quite low
>level, mapping infix syntactic forms directly to single 360
Parsing may have gotten tenure for lots of professors, but the most
advanced programming language areas such as HDLs (hardware
description languages) now use the "predated" operator precedence
methods. Also, Professor Wirth's languages have remained at the
forefront of academic programming languages.
>instructions without any optimizations. The PL360 paper inspired lots
>of people to develop their own small languages.
I think PL360 was very popular within IBM and among the then
"modernist" movement away from assembly language. I know it was very
popular at SLAC.
>McKeeman etc developed XPL on 360 as a tidy subset of PL/I that could
>be implemented by a few people and be useful in coding biggish things,
>including the compiler and parser generator. The parser was initially
>based on their extensions to operator precedence, which relaxed BNF
>restrictions but required use of a parser generator tool and was still
>limited compared to LR. XPL was "high level" only in having built-in
>a varying-length string data type supported by a garbage collector.
>There were no struct types.
I think it was hard back then to differentiate XPL from McKeeman's
advocacy of the Burroughs B5500-style stack machine research program.
>Univ of Washington ported XPL onto SDS/Xerox systems that were like
>360 but with one instruction format.
>
>UW graduate Gary Kildall developed Intel's first programming tools for
>the 8008 and 8080, in trade for a very early portable computer: an
>8008 without keyboard or monitor, installed in a briefcase. Kildall
>used these (plus a floppy drive adapted by UW grad John Torode) to
>develop CP/M, the precursor to MS DOS. The Intel tools included an
>assembler and PL/M, both coded in Fortran. PL/M was inspired by the
>example of PL360 and the implementation methods of XPL. Kildall left
>before UW's XPL project but was likely very aware of it.
>
>PL/M's level was limited by the 8008's near inability to support proc
>calls. The first micro language to see significant use was Basic,
>implemented by assembler-coded interpreters. Implementing real
>applications in real compiled languages required later chips with
>nicer instruction sets, eg 8088 (gag) and M6800.
As the Z80 showed, the 8088 was only one index register away from
being a real computer.
The historical question, I think, is why there was so little
communication between the currently most popular BCPL, B, C, C++
research program and the Stanford/Silicon Valley research program.
Just my two cents.
/Steve
--
Steve Meyer Phone: (415) 296-7017
Pragmatic C Software Corp. Fax: (415) 296-0946
220 Montgomery St., Suite 925 email: [email protected]
San Francisco, CA 94104
The FORTRANSIT story is covered in the Annals of the History of Computing [4, 5], but an additional and more informal slant doesn't hurt.
Abstract of Peter Naur's "Programming as Theory Building": Some views on programming, taken in a wide sense and regarded as a human activity, are presented. Accepting that programs will not only have to be designed and produced, but also modified so as to cater for changing demands, it is concluded that the proper, primary aim of programming is, not to produce programs, but to have the programmers build theories of the manner in which the problems at hand are solved by program execution. The implications of such a view of programming on matters such as program life and modification, system development methods, and the professional status of programmers, are discussed.