Perl has an interesting historical path: from a language for elite system administrators to mass Web development
language, and back to the tool for elite system administrators.
The creator of Perl, Larry Wall, is a pretty interesting figure in the open
source/free software arena. Designing the language and writing the first Perl interpreter was a really huge project. Unlike Linus Torvalds,
Larry Wall did not have POSIX standards before him to solve architectural problems, although Unix shell languages and AWK provided some
guidelines. The Perl interpreter was not his first freeware/open source effort -- before it he created the popular USENET newsreader rn
and several other programs such as patch. But no matter what your level of qualification and talent is, the development of a language
of this level of complexity is really an exhausting full-time job. And here lies the problem, as you need to earn a living for yourself
and your family, if you have one. Larry Wall married pretty early and had two children when he started Perl.
His personal site is www.wall.org. A little known fact is that he created all
his programs, including Perl, being almost completely blind in one eye. Recently he published a diary relating details of his cornea transplant surgery that improved his vision. Unfortunately his health problems were not limited to this mishap. In 2004 he underwent
a very serious operation that removed a tumor from his stomach. See Portraits of Open Source Pioneers
for additional information.
The language started with the limited distribution of version 2 (the first version with a regex engine; limited documentation) and achieved
significant popularity with version 4, released in 1991 (the first Perl book was published for this version). Actually the final version
of Perl 4 (4.036, released in Feb 1993) was very stable and was used for many years after its development stopped, as a better AWK than AWK.
It had an amazing, simply amazing debugger. For example, it was included in IBM Tivoli products till the early 2000s (till the end
of life of classic TEC).
Both Python and Ruby also carry too heavy OO baggage, and that also diminishes their value as sysadmin tools -- only a few tasks
in the system administration area can benefit from the OO approach.
Currently the mainstream version is version 5, which is the version usually called Perl. The de-facto standard is version 5.10.1.
There is also an experimental Perl 6. But it has not yet achieved significant popularity, nor is its interpreter
included in any major OS by default.
With its questionable emphasis on OO it also displays the type of problems that Fred Brooks called
the second-system effect. I believe that the OO paradigm has outlived its
usefulness and is significantly overhyped, and that the fact that Perl 5 is not an OO language represents a huge advantage, not a drawback.
But at the same time not following the dominant fashion has its obvious drawbacks too. So the jury is still out about this design decision.
Unfortunately one can see that in Perl 6 Larry Wall bought the Simula-67 based OO paradigm used in
Python and Ruby "hook, line and sinker", and that was a questionable decision, making Perl 6 a "Johnny-come-lately"
in this category. There were several much simpler areas where Perl 5 could more profitably be extended, such as exceptions,
coroutines and, especially, introduction of types of variables. He also did not realize that the JavaScript prototype-based OO model is a
much better implementation of OO than the Simula-67 model. And that Perl 5 modules do 80% of what is useful in classes (namely, provide a
separate namespace and the ability to share variables in this namespace between several subroutines). Even a primitive constructor
can often be implemented as a BEGIN block.
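As a minimal sketch of this point (the module name, variables and values below are made up for illustration), a plain package gives you a private namespace shared by several subroutines, with a BEGIN block playing the role of a primitive constructor, and no OO machinery in sight:

use strict;
use warnings;

package Counter;           # separate namespace, no OO involved

our ($count, $started);    # package variables shared by the subroutines below

BEGIN {                    # primitive "constructor": runs once, at compile time
    $count   = 0;
    $started = localtime();
}

sub increment { $count++ }
sub report    { print "Started $started, count so far: $count\n" }

package main;

Counter::increment() for 1 .. 3;
Counter::report();         # prints the shared state kept in the Counter namespace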
Taking into account that Perl 4 is still in use, we can talk about the Perl family of languages: Perl 4, Perl 5 and Perl 6 ;-).
Perl also significantly influenced two other major scripting languages -- Python and Ruby. From 1991 to 2000 Perl was the "king of
the hill" and was the most used scripting language in existence, as it was heavily used for CGI programming. Later it was displaced by the
simpler (and inferior) PHP, as most Web designers needed something like Basic, not something like PL/1. And Perl returned to
its roots -- as the tool for elite system administrators, who can master the complexity of the language and benefit from its
formidable expressive power.
Like for cats, being a more than 25-year-old language (as of 2015) is an achievement in itself ;-) Many languages die earlier than
that. Being installed by default in all major flavors of Linux and Unix is the major, tremendous achievement, the significance of which
is impossible to overestimate. It is very difficult to achieve this status (Ruby still did not; Python achieved it only partially --
for Linux distributions only). But as soon as this status is achieved, it is extremely difficult to change. That means that Perl is
here to stay. Maybe for centuries to come :-). And Perl has a significant advantage over Python because Larry Wall was a super
qualified Unix sysadmin, who really understood the Unix philosophy when creating this language. While Guido van Rossum, while a
talented language designer in his own right, was never a specialist in Unix, and that clearly affected the design of Python.
Perl is older than Python, but not by much (Perl development started in 1987 vs 1989 for Python). Python reached version 1.0 in
January 1994; Perl 5.000 was released on October 17, 1994. Perl was developed to deal with text processing (processing logs)
on UNIX machines, while Python was a successor of the programming language ABC, designed for teaching programming, and due to this got
some funding from DARPA:
During Van Rossum's stay at CNRI, he launched the
Computer Programming for Everybody (CP4E) initiative, intending to make programming more accessible to more people, with a
basic "literacy" in programming languages, similar to the basic English literacy and mathematics skills required by most
employers. Python served a central role in this: because of its focus on clean syntax, it was already suitable, and CP4E's goals
bore similarities to its predecessor, ABC. The project was funded by
DARPA.[14]
As of 2007,
the CP4E project is inactive, and while Python attempts to be easily learnable and not too arcane in its syntax and semantics,
reaching out to non-programmers is not an active concern.[15]
As of 2019 Perl remains one of the major scripting languages and has probably the
second largest amount of production code running of any scripting language, although most of it was written a while ago. Outside
system administration, few large system development projects now use Perl (bioperl.org
was probably the last large project of this type and it is gradually being replaced by
biopython). In the past several large Web sites such as Yahoo and Amazon used Perl as the
programming language.
Perl is no longer used much for Web development, but its level of suitability for sysadmin tasks was and remains unsurpassed. Because Python is used in universities for
teaching programming, it became more popular for sysadmin tasks as well, but in this niche Perl is still superior to any viable alternative,
including Python. So Python's ascendance was due not only to the quality of the language and its implementation, but also
to the so-called "Peter Naur effect": Peter Naur (of Algol 60 report and BNF notation fame) in his
1975 lecture "Programming languages, natural languages, and mathematics", which was later reprinted,
hypothesized that since the late 70s only those future languages that can be taught to beginners have chances to enter the
"mainstream" programming languages space. All others are limited to niche applications. In this sense Perl is a clear violation of the
Peter Naur hypothesis ;-).
Another important factor in Perl's success is that Perl is a very interesting language with a highly unorthodox design, which despite
its warts produced a lot of concepts that are innovative even today. As such it is attractive to elite programmers and system
administrators who can master the language's complexity and benefit from its expressiveness. For example, it is one of the few scripting languages which has the
concept of pointers (references) as a data type, much like C. Also it is unique in the sense that it has explicit directives (package) for managing namespaces. Not to mention excellent access to Unix
internals (Larry Wall was a "superstar" Unix system administrator and it shows).
After peak popularity was reached around the year 2000, Perl slowly faded, giving most of the space in the applications area to Python and
PHP. For example, none of the major configuration management systems in use is written in Perl. It looks like using Perl for large
system development requires too much discipline, and Python and Ruby are better fits for this environment (while in some important areas
being inferior to Perl).
But much of this drop in popularity is connected with the mechanisms of measuring language popularity (number of published books per
year; whether the language is taught at universities or not; number of submissions to CPAN vs similar repositories for other languages,
etc). For older languages (and Perl is as old as bash) those are imperfect methods. There is a huge "silent majority" of Perl
users who do not buy many books, do not submit anything to CPAN and use Perl mostly in simple scripts. I believe that includes probably
90% of all Unix sysadmins and bioinformatics engineers. I doubt that this number diminished much, as Perl is conceptually close
to shell and is a natural choice of system administrators (especially self-taught ones, as people who graduated from a university now tend
to flock to Python, which was their first language in the university), which is still a mass profession.
Also Perl has a first class (really first class) debugger and a stable, well tested interpreter. So it definitely can compete with
other languages on the quality of implementation. While Python managed to capitalize on the OO hype and the fact that, as a simpler
language, it is now used for teaching programming at universities, it has its own set of problems. First of all, OO as a paradigm
is badly suited for programming scripts for Unix/Linux maintenance. It significantly increases the size of such scripts and as
such the number of bugs in them. Also the questionable trend to view everything as an object distracts from working on the actual task, and
most of the effort is wasted on creating some "universal" set of classes, which may or may not be useful. The second is that the regex
engine is not integrated into the language as well as in Perl. And last, but not least, it is slower, which is clearly visible if
you process huge logs in memory (which was how I came to Perl). And only very recently did Python get a decent debugger, which is still
inferior to the Perl debugger. And a language without a decent debugger is a junk language in my book (look at PHP ;-). I think Knuth
once remarked that when he chose a language to work with on a particular machine, he always tried to choose the language with the
best debugger.
For the next 30 years Perl will remain the Swiss army knife of Unix/Linux sysadmins. Perl is still a great replacement for shell scripts,
sed and AWK. Its native regex support was ahead of its time and still remains pretty advanced, although the gap with other
languages narrowed after 30 years of Perl's existence. You can do very complex string manipulation and statistical gathering with
no effort at all. A bunch of web log parsers were written in Perl and it's easy to see why, because the language was designed for that
sort of purpose. In other words, Perl is to server administration and string parsing what PHP is to development of web sites (and please
do not forget that PHP is a derivative of Perl ;-).
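As a small illustration of this kind of task (the file name and log layout are assumptions -- a typical Apache-style access log with the client IP in the first field), counting hits per IP takes just a few lines:

#!/usr/bin/perl
# Count hits per client IP in an Apache-style access log and print the top 10.
use strict;
use warnings;

my %hits;
open( my $log, '<', '/var/log/httpd/access_log' ) or die "Cannot open log: $!";
while (<$log>) {
    my ($ip) = split ' ', $_, 2;    # first whitespace-separated field
    $hits{$ip}++;
}
close $log;

foreach my $ip ( ( sort { $hits{$b} <=> $hits{$a} } keys %hits )[ 0 .. 9 ] ) {
    printf "%-15s %d\n", $ip, $hits{$ip} if defined $ip;
}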
Perl is not a static language. Like all programming languages it evolves with time. Some initial design decisions that proved to be
deeply wrong were later partially corrected (use of strict in modern Perl is one example). For others, programmers learned to
compensate for deficiencies of the language with tools (for example, a
prettyprinter can be used for very efficient finding of unclosed '{' in C-style languages). Runaway string
constants (unclosed ' or ") are now detected pretty well as a part of syntax highlighting in editors (and you should not use
an editor that does not have this feature ;-)
Similarly, reversing a comparison with a constant ( if ( 2 == $stage ) ), which looks strange, but the trick still makes
sense ;-), can help to eliminate some
errors connected with the unfortunate use of = for assignment and == for comparison, a blunder inherited from
C, which is a source of many bugs. In addition to this blunder, Perl introduced another one borrowed from Bourne shell --
different symbols for comparison of strings and numbers ( if ( $line eq 'EOF' ) but if ( $found == -1 )
). Here only God can help you, although if one part of the comparison is a constant the Perl interpreter now produces warnings.
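A minimal illustration of both points (the variable names and values are arbitrary): putting the constant on the left turns an accidental assignment into a compile-time error, while mixing up eq and == silently gives wrong answers:

use strict;
use warnings;

my $stage = 3;

# if ($stage = 2)  { ... }   # classic bug: assigns 2 to $stage and is always true
if ( 2 == $stage ) { print "stage two\n" }   # reversed form: "2 = $stage" would not even compile

my $line = 'EOF';
print "end reached\n" if $line eq 'EOF';     # correct: string comparison
print "bogus\n"       if $line == 0;         # wrong operator: 'EOF' numifies to 0, so this is true
                                             # (and warns under "use warnings")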
Generally a typical programming language gets to a new version in 10 to 12 years. Perl reached mainstream status with Perl 4,
released in 1991. Perl is now in version 5, which was initially released in late 1994 (actually way too quick from the point of view of
language evolution). But for the 20 years since 1994 Perl changed very little, as resources for its development became minimal after
O'Reilly withdrew its support around 2002.
The last major changes were introduced in Perl 5.10, which was released on 18th December 2007, on the 20th birthday of Perl.
This version is now the de-facto standard for Perl 5 and is used in major Unix and Linux distributions. The level of dominance is such
that scripts written for older versions can be viewed as legacy scripts.
Unfortunately Perl does not have an influential corporate sponsor like Python has found in Google and Microsoft. It also suffered
from OO fanatics' attempts to mold the language to their preferred taste, and from the ill-conceived and then botched attempt to develop Perl
6: after Larry Wall's withdrawal from development there was no person with comparable architectural vision and talent to take over.
The best strategy in such circumstances is a conservative strategy, but people with just two cents of talent often have a huge amount
of energy, and such people represent a serious danger of hijacking and destroying an open source project which has lost its principal
developer. Still, despite the lack of funds, the development progresses and the language is evolving.
Despite the slow pace of changes, Perl as we used to know it in, say, 1996 is somewhat different from the typical versions of Perl deployed in
2012. Deployed versions are usually in the range of 5.8 to 5.16. The most important innovations are already present in version 5.8
("our" keyword is probably the most important addition to the
language that simplifies use of strict).
Some features, such as string constants written as identifiers without single or double quotes (as in $dictionary{weather}),
became obsolete. Although the idea that weather is a constant and $weather is a variable has its value, as $
can be considered a dereferencing mechanism. Perl is installed by default on all major versions of Unix. The most popular are
two versions: 5.8.8 and 5.10.1, with 5.8 still representing the lion's share of older Linux installations.
Among important versions still widely deployed on Unix (see
Perl - Wikipedia):
Perl 5.6. Released on March 22, 2000. It introduced several important changes and refinements of the language.
Major changes included 64-bit support, Unicode string
representation, large file support (i.e. files over 2 GB) and the "our"
keyword. Now this version is by and large obsolete. Among notable new language features in this version we can mention the
following:
"our" variable declarations An "our" declaration introduces the ability to create an alias to the global variable (and
creates corresponding variable if it does not exist) within a package namespace. In example below our creates an alias to a global for namespace
Myspace variable Myspace::buffer
package Myspace;
our $bugger; # this is alias to the global variable Myspace::buffer for rest of lexical scope
$bar = 20;
package Other_namespace;
print "$bar\n"; # prints 20, as it refers to $Myspace::buffer, not to $::buffer
By default a package declaration changes which symbol table you are referring to when you use unqualified variables. But you
still can access variables declared with "our" attribute unqualified like in example above print "$bar\n" after
the package declaration (which effectively changed the default symbol table and without our attribute $bar should mean
$Bar::bar). See "our" in perlfunc.
the new $^V magic variable (which contains the Perl version as a string). Also, sprintf and printf
support the Perl-specific format flag %v to print ordinals of characters in arbitrary strings.
If open() is passed three arguments instead of two, the second argument is used as the mode and the third argument
is taken to be the file name (see the short example after this list).
In addition to BEGIN, INIT, END, DESTROY and AUTOLOAD, subroutines named
CHECK are now special. These are queued up during compilation and behave similar to END blocks, except they are called
at the end of compilation rather than at the end of execution. They cannot be called directly.
The qw// operator is now evaluated at compile time into a true list
Binary numbers are now supported as literals, in sprintf/printf formats, and in oct().
Perl now allows the arrow to be omitted in many constructs involving subroutine calls through references.
For example, $foo[10]->('foo') may now be written $foo[10]('foo'). This is rather similar to how the
arrow may be omitted from $foo[10]->{'foo'}. Note however, that the arrow is still required for foo(10)->('bar').
The exists() built-in now works on subroutine names. A subroutine is considered to exist if it has been
declared (even if implicitly). See
"exists" in perlfunc for examples.
The bit operators (& | ^ ~ << >>) now operate on the full native integer width (the exact size of which
is available in $Config{ivsize}).
@- and @+ provide starting/ending offsets of regex matches. The new magic variables @- and @+ provide
the starting and ending offsets, respectively, of $&, $1, $2, etc. See
perlvar for details.
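The item about the three-argument open() above deserves a tiny illustration (the file name below is arbitrary; this is a sketch, not a definitive recipe): separating the mode from the file name avoids surprises when the name itself starts with a character like > or |.

use strict;
use warnings;

my $file = 'report.txt';

# Two-argument form: the mode is parsed out of the string, so an odd or hostile
# file name (say ">foo" or "|cmd") can change the meaning of the call.
# open( my $old_style, ">$file" ) or die "Cannot open $file: $!";

# Three-argument form: the second argument is the mode, the third is just a name.
open( my $fh, '>', $file ) or die "Cannot open $file: $!";
print $fh "three-argument open keeps the mode and the name separate\n";
close $fh;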
Perl 5.8. Released July 18, 2002. It did not add anything substantial to the language, but features introduced in 5.6
are now better debugged. This is the current "minimal" version of Perl to write scripts for, as it is available on all flavors of
Unix. For example, RHEL 5.8 still comes with Perl 5.8.8 installed and you need to manually upgrade to ActiveState Perl 5.10 for Linux
to use version 5.10 (both 32 bit and 64 bit binaries are free). Version 5.8.8 is also still standard on AIX 5.3 and HP-UX 11.31.
For earlier versions of HP-UX like 10.xx, version 5.8 is available as an HP-provided binary package. The changes to the language in this
version were minimal:
5.8.3 A SCALAR method is now available for tied hashes.
5.8.4 Perl formats were enhanced
5.8.8 chdir, chmod and chown can now work on filehandles as well as filenames, if the system supports
respectively fchdir, fchmod and fchown.
Perl 5.10. Released Dec 18, 2007. This was the major upgrade of version 5.8 that introduced one important language
feature: state variables (see Perl state variables for
details). Currently it is installed by default on Suse 11 and RHEL 6. Cygwin has Perl 5.26 by default, not 5.10. This is the version we
will cover, as versions 5.12-5.14 are essentially better debugged variants of this version. There are very few Perl books that
cover this version. Among the good ones that I can recommend is
Programming Perl, fourth edition (covers version 5.14).
While currently it does not run on all Unix flavors, it is just a matter of time. Linux installations that do not have this version installed
by default can be upgraded using a binary package from ActiveState.
This version improves Perl by introducing some important fixes to old problems and some interesting new features like:
state variables. This is an alternative to "my" variables; this type of variable keeps its value between invocations (see the short sketch after this list of features).
This is probably the most important addition, as it rectifies a serious deficiency of the previous version in the implementation
of local variables in subroutines.
Several useful extensions to regex, including:
Named capture groups -- the ability to assign names to the capture buffers that contain matching elements of regular
expressions (traditionally $1, $2, ...).
Possessive quantifiers (?+, *+, ++), which match as much as possible, like greedy quantifiers, but never give back (backtrack over) what they have matched.
For example
DB<1> $a='aabbaaaabbaa';
DB<2> p $a=~/(a++)/;  # matches the first run of a's and never gives any of it back
aa
DB<3>
The \g{N} notation for backreferences, with the ability to refer to the last matched capture buffer with \g{-1}, the previous one with
\g{-2} and so on.
New metasymbol \h for horizontal white space and, similarly, metasymbol \v for vertical white space. Also \H
and \V for their negations.
Case statement (switch statement). The syntax is "given-when". It introduces four new keywords ("given", "when", "default"
and "continue").
Ability to specify the minimal version of Perl for which the script is written (use v5.10;).
New keyword say is identical to the existing keyword print but adds a newline at the end. You are saving
four characters, so how wise this addition is, is difficult to say ;-). It is definitely useful for debugging statements and allows
distinguishing them from "production" print statements.
Lexical $_
Smart match operator ("~~"); there are two unrelated ideas mixed here.
If the left side is a scalar and the right side is an array it will perform a search for the scalar in the array -- a very
common operation.
The second and much more ambitious idea is to get rid of the necessity to use eq for string comparison and
== for numerical comparison (which is one of the common sources of errors in Perl and can be viewed as a design flaw; the idea was
borrowed from Unix shell). If one of the operands is a numeric constant, or clearly numeric (for example prefixed with unary +),
the interpreter will choose numeric comparison, otherwise string comparison. In other words, smart match will check if the
two values are string-equal using "eq", unless it finds some other way to match them... See
Smart Matching in Perl 5.10
so all the following will be true:
$a='Foo';
$b=42;
$a ~~ "Foo"
$b ~~ 42
$b ~~ 42.0
$b ~~ "42.0"
and all these will be false
$a ~~ "Bar"
$b ~~ "42x"
When turning a string to a number Perl looks at the left side of the string and uses as many characters as it can understand
as being a number and warns if there are more - non-number - characters in the string. On the other hand ~~ fits the comparison
method to the values on the two sides. In a smart way.
This means that these are all true:
42 == 42
42 == 42.0
42 == "42.0"
42 == "42\n"
42 == "42x" # true albeit with a warning... if you used use warnings...
"Foo" == "Bar" # this a Perl gotcha because both part of this example are converted to zero.
But this is false:
42 eq "42.0" # left oprand is converted to string and the compared: length does not match to it is false.
42 eq "42\n" # same thing but programmers who use shell should beware of this behavious as it is different from bash
and other shells.
This behavior while consistent is a bit hard to understand.
Feature pragma
use feature 'say'
use feature 'switch'
use feature 'state'
use feature ':5.10'
Defined-or operator (//). Similar to the (condition) ? if-true : if-false conditional expression: it returns the left operand if it is
defined, otherwise the right one, so a default value can be supplied in a single statement. For example
$var = $val // $default_value;
You can also use "compact" version (C-style shortcut)
$var //= $default_value
Stacked file tests
Command line option -E turns on extensions such as the new keyword say in one-liners.
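Below is a small sketch of the state variables mentioned at the top of this list (the counter example is mine, not from the perldelta): unlike a my variable, a state variable is initialized only once and keeps its value between calls.

use strict;
use warnings;
use feature 'state';    # or: use v5.10;

sub next_id {
    state $id = 0;      # initialized only on the first call
    return ++$id;       # keeps its value between invocations
}

print next_id(), "\n";  # 1
print next_id(), "\n";  # 2
print next_id(), "\n";  # 3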
Perl 5.10.1
The handling of complex expressions by the given/when
switch statement has been enhanced. There are two new cases where when now interprets its argument as a Boolean, instead of an
expression to be used in a smart match:
flip-flop operators
The .. and ... flip-flop operators are now evaluated in Boolean context, following their usual
semantics; see Range Operators in perlop.
Note that, as in perl 5.10.0, when (1..10) will
not work to test whether a given value is an integer between 1 and 10; you should use when ([1..10]) instead (note the array reference).
However, contrary to 5.10.0, evaluating the flip-flop operators in Boolean context ensures they can now be useful in a
when(), notably for implementing bistable conditions,
like in the sketch below.
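(A minimal sketch in the spirit of the perldelta illustration; the =begin/=end markers and the data are made up, and it targets Perl 5.10.1 or later.)

use strict;
use warnings;
use feature qw(switch say);
no if $] >= 5.018, warnings => 'experimental::smartmatch';  # silence the warning on newer perls

# Print the lines from a "=begin" marker through the matching "=end" marker:
# the flip-flop operator inside when() acts as a bistable condition.
while (my $line = <DATA>) {
    chomp $line;
    given ($line) {
        when (/^=begin/ .. /^=end/) { say "in block: $line" }
        default                     { say "outside:  $line" }
    }
}

__DATA__
before
=begin
payload line 1
payload line 2
=end
after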
$scalar ~~ @array now always distributes the smart match across the elements of the array. It's true if one
element in @array verifies $scalar ~~ $element . This is a generalization of the old behaviour that tested whether
the array contained the scalar.
A bugfix related to the handling of the /m modifier and qr resulted in a change of behaviour between 5.8.x and 5.10.0:
# matches in 5.8.x, doesn't match in 5.10.0
$re = qr/^bar/; "foo\nbar" =~ /$re/m;
Perl 5.12. Released April 12, 2010. Looks like a better debugged version of 5.10. This version was superseded by the version
5.14 release with an interval of just one year...
You can now use the keys,
values,
each built-in functions on
arrays (previously you could only use them on hashes). See perlfunc for
details.
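A tiny illustration of this addition (the array contents are made up); it requires Perl 5.12 or later:

use strict;
use warnings;

my @versions = ( '5.8.8', '5.10.1', '5.16.3' );

# each on an array returns (index, value) pairs, just like it returns (key, value) pairs for a hash
while ( my ( $index, $value ) = each @versions ) {
    print "$index: $value\n";
}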
Perl 5.14. Released May 14, 2011. Looks like a better debugged version of 5.10. The fourth edition of Programming
Perl (2012) uses Perl 5.14 as the base version.
Non-destructive substitution
The substitution (s///)
and transliteration (y///) operators now support an
/r option that copies the input variable, carries out the substitution on the copy, and returns the result. The original
remains unmodified.
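A minimal illustration of the /r modifier (the variable names and strings are arbitrary):

use strict;
use warnings;
# /r requires Perl 5.14 or later

my $original = "Perl 4 is obsolete";
my $modified = $original =~ s/4/5/r;   # /r: work on a copy, return the result

print "$original\n";   # Perl 4 is obsolete  (unchanged)
print "$modified\n";   # Perl 5 is obsolete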
Perl is an acronym for "Practical Extraction and Report Language." The first version of Perl was developed by
Larry Wall around 1987. As in the case of many other important algorithmic
languages, the development was partially financed by the military.
Like the typical human, Perl was conceived in secret, and existed for roughly nine months before anyone in the world ever saw
it. Its womb was a secret project for the National Security Agency known as the "Blacker" project, which has long since closed
down. The goal of that sexy project was not to produce Perl. However, Perl may well have been the most useful thing to come
from Blacker. Sex can fool you that way.
"Blacker" project needed some C/AWK/SED superset and an attempt was made to add AWK and SED features to the C-shell framework.
The idea was to create a language more convenient and more suitable for processing logs and generating reports for large quantities
of data than combination of shell, AWK and C-shell. Design contained some elegant solutions and the language got some traction. Here
how Larry Wall explains his decision:
At this point, I'm talking about Perl, version 0. Only a few people in my office ever used it. In fact, the early history of
Perl recorded in O'Reilly's Camel Book (Programming Perl) was written by my officemate of the time, Daniel Faigin.
He, along with my brother in law, Mark Biggar, were most influential in the early design of Perl. They were also the only users
at the time. Mark talked me out of using bc as a backend expression processor, and into using normal, built in floating
point operations, since they were just being standardized by the IEEE (Institute of Electrical and Electronics Engineers). Relying
on that standard was one of the better decisions I ever made. Earlier scripting languages such as REXX didn't have that option,
and as a result they tend to run slower.
The very first version already contained a lot of strong points, first of all the principal idea that the language should provide
special constructs optimizing the usage of common, most frequent idioms (although the example Larry Wall cites was a very costly
blunder by Dennis Ritchie; the introduction of ++, -- and += is a much better case to support this argument):
I made one major, incompatible change to Perl just before it was born. From the start, one of my overriding design principles
was to "optimize for the common case." I didn't coin this phase, of course. I learned it from people like Dennis Ritchie,
who realized that computers tend to assign more values than they compare. This is why Dennis made = represent assignment
and == represent comparison in his C programming language.
I'd made many such tradeoffs in designing Perl, but I realized that I'd violated the principle in Perl's regular expression
syntax. It used grep's notion of backslashing ordinary characters to produce metacharacters, rather than egrep's
notion of backslashing metacharacters to produce ordinary characters.
It turns out that you use the metacharacters much more frequently than you do the literal characters, so it made sense to change
Perl so that /(.*)/ defined a substring that could be referenced later, while /\(.*\)/ matched a sequence inside literal parentheses.
Perl 2.0 was released in June 1988 (coincidentally simultaneously with the SPARCstation 1, which along with SunOS became the machine
of choice for Perl enthusiasts who could afford it).
Perl 3.0, released in 1989, was distributed under the GNU public license -- one of the first major open source projects distributed
under the GNU license and probably the first outside FSF. This is the version that first appeared on the PC.
The decision to release it as free open source software was also a brilliant one, and in 1989 not an easy one:
I knew that I didn't dare ask the company lawyers for permission, because they'd have thought about it for something like six
months, and then told me "no." This is despite the fact that they wouldn't be interested in peddling it themselves.
In the
old days, a lot of free software was released under the principle that it's much easier to ask forgiveness than to seek permission.
I'm glad things have changed -- at least to the extent that the counterculture is acknowledged these days, even if it's not quite
accepted. Yet.
In January 1991 the first edition of Programming Perl, a.k.a. The Pink Camel, by Larry Wall and Randal Schwartz was published by
O'Reilly and Associates. It described the new, 4.0 version of Perl. Simultaneously Perl 4.0 was released (in March of the same year).
Larry Wall was awarded the Dr. Dobb's Journal Excellence in Programming Award (March).
This was a very well debugged version and was used as the interpreter in many software systems such as "classic" Tivoli.
The final version of Perl 4 was released in 1993, was very stable, and remained in widespread use for a decade or more. Please note that development lasted only three years, as Perl 5 was released in 1994.
Perl 5.000 was released on October 17, 1994. It was an almost complete rewrite of the interpreter and introduced substantial
changes to the language. Among the new constructs added were references, namespaces, a new type of local variables (my variables), and
modules. New functions include: abs(), chr(), uc(), ucfirst(), lc(),
lcfirst(), chomp(), glob(). There is now an English module that provides human
readable translations for cryptic variable names. Modules can be loaded via the new keyword use. Pattern matches may
now be followed by additional modifiers, including an m modifier for "multiline match" semantics. An s modifier makes . match
newline.
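A small sketch of references, the "pointer-like" data type mentioned above (the data below is made up): a reference is a scalar that points to another variable or to an anonymous data structure, which is what makes nested structures possible in Perl 5.

use strict;
use warnings;

my @versions = ( '5.8.8', '5.10.1', '5.16.3' );
my $aref     = \@versions;               # reference to an existing array

print "first: $$aref[0]\n";              # dereference with $$aref[0] ...
print "last:  $aref->[-1]\n";            # ... or with the arrow notation

# Anonymous structures: a hash of arrays, something Perl 4 could not express
my %release = (
    perl4 => [ '4.036', 'Feb 1993' ],
    perl5 => [ '5.000', 'Oct 1994' ],
);
print "Perl 5 released: $release{perl5}[1]\n";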
In Perl 4 all variables shared one global name space, causing maintainability problems for large scripts.
To solve this problem Perl 5 provides two mechanisms for protecting code from having its variables stomped on by
other code: lexically scoped variables created with my or state and global variables,
which are exposed via the vars pragma, or the our keyword. Any global variable belongs to
some namespace (the default namespace is called main) and can be accessed via a "fully qualified form" like
$main::myvar.
A lexically scoped
variable is considered to be part of that lexical scope, and does not have a "fully qualified form".
In Perl namespaces are called "packages" and the package declaration tells the compiler
which namespace to prefix to our
variables and unqualified dynamic names. This both protects against accidental stomping and provides an interface for
deliberately clobbering global dynamic variables declared and used in other scopes or packages, when that is what you
want to do.
This release happened "just in time" for the World Wide Web revolution. The WWW dramatically raised the visibility of Perl 5 around
1996 -- to a certain extent at the expense of TCL and other Unix-based scripting languages, although the Web also produced several Perl
competitors like JavaScript and PHP. Due to the Web, Perl became one of the major programming languages for the Internet and for
some period dominated Web scripting practically like VB dominates the Windows scripting arena. Later it was displaced in this role by
PHP -- a Perl derivative that is a vastly inferior language in comparison to Perl, but that adheres more strictly to C syntax.
As it was well integrated with Apache and you can write PHP code directly in HTML files, it became the new "duct tape of the Web" instead
of Perl.
Initially version 5 coexisted with Perl 4. It became the primary version of Perl in early 1995 (just before Java). This was a huge
effort on the part of Larry Wall. As he remarked later:
For Perl, the meltdown happened because I decided to follow the rule: "Plan to throw away your prototype, because you will
anyway." Perl 5 was nearly a total reorganization. I have in times past claimed that it was a total rewrite, but that's a bit
of a stretch, since I did, in fact, evolve Perl 4's runtime system into Perl 5's. (Though if you compared them, you'd see almost
nothing in common.) The compiler, though, was a total rewrite.
Although the Perl implementation is an open source implementation, commercial companies actively participated in its development.
The major commercial beneficiary of Perl's success was O'Reilly & Associates, which published the first books on the language (Programming
Perl and Learning Perl in 1993). At the peak of Perl popularity it sold millions of dollars worth of Perl books each year. They also
tried to distribute an extended Perl distribution called the Perl Resource Kit.
For some time they provided financial support for the Perl movement. From 1995 to 2002 they employed Larry Wall. O'Reilly also hosted
a Perl web site and sponsored a Perl conference.
PHP -- an ugly and primitive language -- had chosen the right area of specialization -- web pages -- and soon (by and large due to its
primitivism in the style of Basic) got a substantial following, as it was the easiest language for those who had developed some pages in HTML
to start to add elements of programming to their Web site.
This was an unfair development, but life is often unfair. Also, around 2000-2001, with the burst of the dot-com bubble, the computer
book publishing bubble also burst, and as a result O'Reilly lost a large chunk of its revenue. Like Perl, it never recovered.
Due to the fact that Perl books became much less profitable, around 2002 O'Reilly lost interest in Perl. The same year Larry Wall
left O'Reilly. Here is a pretty telling quote by Tim O'Reilly himself from
Is Perl Still Relevant
- O'Reilly Media (July 2005):
In terms of the competitive landscape among programming languages, in addition to PHP, Python has long been gaining on Perl.
From about 1/6 the size of the Perl market when I first began tracking it, it's now about 2/3 the size of the Perl book market.
The other scripting language (in addition to Perl, Python, and PHP) that we're paying a lot more attention to these days is Ruby.
The Ruby On Rails framework is taking the world by storm, and has gone one up on PHP in terms of making database backed application
programming a piece of cake.
With O'Reilly divesting, Perl lost its only corporate sponsor. It never acquired another. Development had to move to a
by-and-large volunteer basis (although ActiveState did develop the Microsoft Windows version of Perl as a commercial organization).
Maybe because of this, after 2002 (when version 5.8 was released) Perl development dramatically slowed down and the next version (5.10)
was released only in 2007. After that Perl development reacquired some of its former dynamism and we saw versions 5.12, 5.14 and 5.16
released in three consecutive years.
In 2004 Larry Wall underwent the operation that removed a tumor from his stomach. See Portraits
of Open Source Pioneers for additional information. That was the end of Larry Wall's role as the primary developer of the language,
and serious difficulties with the path of the language followed.
That created a crisis in Perl development leadership, the crisis typical for a large open source project
which has lost its original leader, and it is much more dangerous for complex languages (and complex programming systems in general) than for
simpler ones. In such cases people tend to stick to what is available, as nobody has the stature to propose more radical changes
(like deprecating dangerous features such as "forced conversions" based on the type of the operation, not the operand), as well
as introduction of typing of variables, which would allow the notation if ($x==$y) for strings instead of the error prone
solution borrowed from Unix shell, if ($x eq $y) -- one of the most harmful Perl design blunders.
Perl is not the first and not the last with those problems. Eventually viable projects get back on
track, albeit in a more modest way. I hope that this is happening with Perl.
Perl remains included by default in all major Linux distributions. Language popularity metrics like
TIOBE do not reflect Perl usage in system administration, where it is still pretty
prominent due to its synergy with bash programming and Unix in general. The versions used are rather old (Perl 5.16.3 in RHEL 7). So a
more recent version like version 5.32 will not see mainstream deployment for a decade or so. And rightly so. As resources to
develop Perl 5 are very scarce, the quality of recent releases is insufficient for heavy production use, and that probably explains
such a delay.
The fiasco with Perl 6, which was launched despite the fact that there were no available resources for its development, suggests that
there are serious problems in the Perl development community. One such problem is the rise of Perl complexity junkies who advocate
making Perl a fully object oriented language, despite the fact that this approach inflicts heavy computational costs and is not optimal
for the major area of use of Perl -- sysadmin scripts. The key part of this party are people connected with book publishing.
In this sense, some Perl book authors represent the vanguard of this part of the Perl complexity junkies.
The ambitious Perl 6 project was launched in 2000 with no resources and no real new leader of Larry Wall's caliber. Perl 6 has harmed
Perl 5 acceptance and distracted many Perl programmers, as it introduced an element of FUD into the whole Perl ecosystem.
Also, the party of Perl complexity junkies became more prominent in the Perl development community than it should be. It
did not produce much of value, but it scared away a lot of people who otherwise would use Perl.
And it contributed to the false impression
that Perl is an excessively complex language that is better avoided. To me those people produce the impression of people with a
deep inferiority complex, who understand the limitations of their abilities but try to hide them with a kind of "language bravado". Randal
L. Schwartz is a typical example here. His books are very weak, especially Learning Perl.
Beginning Perl by Simon Cozens is a much better book. Still he enjoys the reputation of a guru.
The party of Perl complexity junkies developed in two main stages.
Initially Perl complexity junkies were preoccupied with Perl idioms. They greatly contributed to the impression that Perl is an
incomprehensible language, a wild gibberish of obscure digraphs with special meanings, which nobody understands.
The later generation of Perl complexity junkies can probably be called "OO challenged" individuals who want to "objectify" anything
in sight, just to make a scratch, so to speak. They do not understand that the fact that Perl was on the sideline of the OO revolution is a
strength, not a weakness. And there are now multiple "junk" Perl books that advocate this approach.
I especially hate their misplaced OO fanaticism and its result: the mangling of many Perl standard modules, which are now somewhat
dangerous to use because of bugs introduced due to OO conversion (I do not consider developers who wave the OO banner to be honest --
most of the people of this type whom I know personally are corrupt; that does not mean that all of them are dumb. Like corrupt bankers,
some of them are pretty bright and still do their evil things for the sake of personal advancement at the expense of society at
large ;-).
The stress on enhancing Perl OO capabilities prevented the introduction of much more needed and simpler to implement changes. If you look at the language changes from 2002 it is clear that the only important language feature introduced was state variables
in version 5.10. No attention was paid to the problem, rampant in all C-style languages, of misuse of = instead of ==,
to the introduction of a "soft semicolon" at the end of the line, to named labels for { and } brackets with the possibility of PL/1 style
"multiple closure" (a numeric local label in Pascal style would be OK), and to other features that diminish the chances of making errors
for mere mortals. Of course, high priests of the cult will deny that such problems exist ;-)
Only the misguided dichotomy of == vs eq operators was (partially) addressed by the ~~ operator
(which probably was a wrong fix, and its implementation turned into a fiasco in any case).
I think that the introduction of explicit typing would be a
simpler and probably better approach (it is actually available in Perl 6/Raku) -- variables with an explicit type should not be
automatically converted "down" (from string to numeric value), only up (from numeric value to string). Somebody needs to have the
courage to admit that the arbitrary forced conversion mechanisms in Perl went a little bit too far and can create
difficult to detect bugs. It is probably high time to do the necessary legwork.
Still, if you look at perldelta for releases starting from 5.10 you would be amazed at the amount of useful work done by
maintainers for free.
Around 2002 Perl became popular in a new field -- bioinformatics. That period lasted probably till 2015, when it started to be
gradually replaced by Python and later R.
The importance of programming in biology stretches back to at
least the late 1990s with the genome decoding efforts. And it certainly has a significant future now that it is a recognized part
of research into many areas of medicine and basic biological research. This may not be news to biologists. But Perl programmers were
surprised that their favorite language had become one of the most popular -- if not the most popular -- of the programming languages used
in bioinformatics. See RFC Bioinformatics Tutorial
and L. Stein, How Perl Saved the Human Genome Project,
The Perl Journal.
As of 2017 Perl is losing its position in bioinformatics in favor of R and Python.
Perl remains not only the primary language for senior Unix sysadmins, but also one of the most influential scripting languages in
existence. Being more than 30 years old (version 1.0 was released on December 18, 1987), it is also a historically important
scripting language. That does not mean that it is already dead like PL/1. I would not try to write a book if this were true. On the
contrary, Perl 5 achieved the status of a mature language that is included by default with all flavors of Unix.
But 30 years of development permit viewing the historical importance of the language and its place among similar live and already
dead programming languages.
From the historical point of view Perl is a sibling of C-shell and AWK. The repertoire of built-in functions is reminiscent of AWK, while the
syntax resembles C with some important borrowings from shell (interpolated strings, choosing the type of comparison via the operator (==
vs eq), sigils like $@%).
That means that those who know shell programming feel that they can adapt to Perl without major problems, and use it as just a more
modern version of shell. That's why many UNIX sysadmins find Perl (deceptively) easy to learn.
In reality Perl is a complex language with complex, sometimes even convoluted semantics. The slogan of Perl -- "There's
always more than one way to do it." -- is essentially the same idea that inspired the designers of PL/1, and it would definitely find a
home in the hearts of the designers of MS Office ;-). Different Perl programmers may use different approaches even for a simple problem.
In this sense Perl can be considered a very important development: an anti-Unix (countercultural) development within the Unix culture
;-). And Larry Wall agrees with this:
But Perl was actually much more countercultural than you might think. It was intended to subvert
the Unix philosophy. More specifically, it was intended to subvert that part of Unix philosophy that said that
every tool should do only one thing and do that one thing well.
The problem with that philosophy is that many of the tools available under Unix did not, in fact, do things very well.
They had arbitrary limits. They were slow. They were non-portable. They were difficult to integrate via the shell because they
had different ideas of data formats. They worked okay as long as you did what was expected, but if you wanted to do something
slightly different, you had to write your own tool from scratch.
So that's what I did. Perl is just another tool in the Unix toolbox. Perl does one thing, and it does it well: it gets out
of your face.
But it is very interesting to note that Perl has one very unlikely precursor (I do not know whether Larry Wall ever worked on
mainframes). When I first encountered Perl I was surprised how many of the underlying ideas of Perl are close to PL/1 -- the language
that served as one of the inspirations for C and, despite being a mainframe language, is historically related to the Unix culture
via its Multics roots.
PL/1 was a very innovative language that was too far ahead of its time to survive. It was the first language that contained good
string handling, exception handling, and rudimentary multitasking. It was, and probably still is, one of the most interesting algorithmic
languages in existence, although its popularity (similar to the popularity of many other interesting IBM products, with VM/CMS and OS/2
as examples) suffered blows from IBM itself and, in the 70s, from religious fanatics in the days of structured programming and verification.
What is most interesting is that despite its age PL/1 has probably the best optimizing and debugging compilers of any language of similar
complexity in existence. The IBM optimizing and debugging compilers for PL/1 on System 360/370 remain an unsurpassed masterpiece of software
engineering. They will always be remembered along with the FORTRAN H and PL/C compilers.
Probably the major killing factor was that a compiler for PL/1 was too complex for many organizations to re-implement (it is probably
close to C++ compilers in complexity). No free compiler existed, although Cornell University managed to implement PL/C
-- a pretty full teaching subset of PL/1 -- and successfully used it for a number of years. Even the later simplified version called PL/M
was not able to withstand the competition with free C compilers. I wonder what would have happened to PL/1 if IBM had released the compiler
under some kind of open source license. BTW, currently the quality of the Perl interpreter is much lower than that of the PL/1 debugging compiler.
Paradoxically, PL/1 compilers were used as free, essentially open products in Eastern Europe and the USSR. And it is interesting to
note that it really dominated mainframe programming in the USSR, far outpacing Cobol and Fortran, which still dominated the mainframe
arena at this time in the USA and other Western countries. So here the analogy with Perl holds perfectly. Moreover, PL/1 dominated
despite the fact that the Soviet IBM 360/370 clones (called EC -- the Russian abbreviation of "Uniform System of Computers") were less powerful
(and less reliable) than their Western counterparts.
I would like to stress that PL/1 (as the system programming language for Multics) had a large influence on C -- one of the most widely
used compiled programming languages -- and many of its ideas directly or indirectly found their way into other programming languages
and can be found in Perl (I have no information about Larry Wall's possible exposure to PL/1). IMHO understanding, if not PL/1 programming,
then PL/1 philosophy -- or its close relative, Perl philosophy -- can benefit the programming community much more than playing with languages
based on some kind of religious doctrine, like pure strongly typed languages or OO languages ;-).
There were several versions of Perl, but the most important are version 4 and version 5 (released in 1994). The latter is still
the current version of the language. Version 4 was widely available before the WEB explosion in 1994.
Another thing that helped legitimize Perl was the addition of the Artistic License to stand beside the GPL. Perl 3 used only
the GPL, but I found that this didn't do quite what I wanted. I wanted Perl to be used, and the GPL was preventing people from
using Perl. Not that I dislike the GPL myself -- it provides a set of assurances that many hackers find comforting. But business
people needed a different set of assurances, and so I wrote the Artistic License to reassure them.
The really brilliant part was that I didn't require people to state which license they were distributing under, so nobody had
to publicly commit to one or the other. In sociological terms, nobody had to lose face, or cause anyone else to lose face. Most
everyone chose to read whichever license they preferred, and to ignore the other. That's how Perl used psychology to subvert the
license wars which, as you may or may not be aware, are still going on. Ho hum.
Yet another thing that helped legitimize Perl was that there was a long period of stability for Perl 4, patch level 36. The
primary cause of this was that I abandoned Perl 4 to work on Perl 5.
Another little known innovation of Perl is its high quality debugger. This is a real masterpiece of programming. Before I started using
Perl, I never saw anything even close to the convenience and power of the Perl debugger. AWK did not have a debugger. Shell has
had some debuggers, but they did not ship with the language. Of course, now there are several imitations (and the Python debugger after
version 2.6 is OK, but only OK), but still the Perl 5 debugger is the original masterpiece.
It is very sad that few Perl programmers understand this gem and even fewer use its full capabilities. There are no good books on the debugger,
but the basics can be learned in a couple of hours from Youtube videos.
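For those who never tried it, here is a minimal sketch of a debugger session (the script name is arbitrary; the commands listed are the standard perl5db ones):

perl -d myscript.pl      # start the script under the debugger

Then, at the DB<n> prompt:

l            list the next few lines of source
b 42         set a breakpoint at line 42 (b subname also works)
c            continue until the next breakpoint
n            execute the next statement, stepping over subroutine calls
s            single step, descending into subroutine calls
p $var       print a scalar
x \%hash     dump a data structure (works on references)
q            quit the debugger

You can also run perl -de 0 to get the debugger as an interactive Perl "shell" for quick experiments.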
(nytimes.com) 39died on
Saturday at his home in St. Paul, Minn. He was 80. From a report: His wife, Linda, said
that he died after an episode of ventricular tachycardia, in which the heart beats faster than
normal. Mr. Silver had a heart transplant 27 years ago.
Since their introduction in 1980, Post-it Notes have become a ubiquitous office product,
first in the form of little canary-yellow pads -- billions of which are sold annually -- and
later also in different hues and sizes, some with much stickier adhesives. There are currently
more than 3,000 Post-it Brand products globally. Dr. Silver worked in 3M's central research
laboratory developing adhesives. In 1968, he was trying to create one that was so strong it
could be used in aircraft construction.
He failed in that goal. But during his experimentation, he invented something entirely
different: an adhesive that stuck to surfaces, but that could be easily peeled off and was
reusable. It was a solution to a problem that did not appear to exist, but Dr. Silver was
certain it was a breakthrough. "I felt my adhesive was so obviously unique that I began to give
seminars throughout 3M in the hope I would spark an idea among its product developers," he told
Financial Times in 2010. Dr. Silver promoted his adhesive for several years within 3M, a
company known for its innovative workplace, so assiduously that he became known as "Mr.
Persistent."
Randy Suess, having worked at IBM and Zenith, was bored. "I was looking for something more
to do -- and then this damn thing called a [personal] computer came along," he
said in the 2005 film BBS: The Documentary.
In 1975, he joined the Chicago Area Computer Hobbyists' Exchange, or CACHE, where he met
Ward Christensen. The two lived far enough apart that they had to invent ways
to collaborate remotely: Christensen wrote the MODEM.ASM program and XMODEM file transfer
protocol so they could send each other files.
When a blizzard struck Chicago in 1978, CACHE meetings were canceled, prompting Suess and
Christensen to develop an online alternative that all members could
participate in. With Suess's hardware and Christensen's software, they created the first
dial-up bulletin board, naming it the Computerized Bulletin Board System.
The impact of CBBS was felt far beyond Chicagoland: by 1994, there were 60,000 BBSes
across the United States. Their online communities, message boards, file libraries, and
multiplayer games were a precursor to everything from Reddit to YouTube and World of
Warcraft.
Suess died at 74. When CBBS went offline in the 1980s, it had received over a half-million
calls. A version of CBBS is still available via telnet.
Douglas Engelbart is credited with inventing the computer mouse, but it was Bill English's
design and implementation that made the mouse a reality.
After serving in the United States Navy, English joined the Stanford Research Institute
(SRI) in the early 1960s. There, he partnered with Engelbart on an experimental computer
dubbed the oNLine System (NLS).
As part of that collaboration, Engelbart shared his sketches for a mechanical device that
could move a digital pointer on a screen with English. English proceeded to develop a
prototype, housing it in a pinewood case. He was the author of the first article to refer to
a computer mouse, published in July 1965, three years before Engelbart demonstrated it in the
"Mother of All Demos," which English directed.
After leaving SRI in 1971, English joined Xerox PARC, where he adapted his NLS concepts
into the Alto computer, which inspired Steve Jobs and the Apple Macintosh.
Former high-school teacher Frances Allen was pursuing a master's degree in mathematics
when she had her first encounter with computers, learning how to program an IBM 650. Upon
earning her master's in 1957, she switched careers and joined IBM Research, where she stayed
for 45 years. Her educational background was
immediately put to good use, teaching IBM programmers how to use the two-month-old FORTRAN
language.
Her next project was with the IBM 7030, also known as Stretch, IBM's first transistorized
supercomputer and the world's fastest computer from 1961 to 1963. Allen's team developed a
compiler for Stretch that supported three different programming languages; it would be used
by the National Security Agency, where Allen spent a year overseeing the computer's
installation and testing.
A pioneer in compiler optimization, Allen published several landmark papers on program
optimization, control flow analysis, and parallelizing compilers that paved the way for
modern, efficient programming. She was the first woman to be named an IBM Fellow in 1989, and
the first woman to win the Turing Award in 2006.
Allen was also committed to helping future generations of female programmers, speaking at
conferences around the world and urging women to consider STEM careers. In 2000, IBM
established the Frances E. Allen Women in Technology Mentoring Award in her honor.
"She was a brilliant thinker and innovator and a kind-hearted individual who had an
unabashed commitment to others," said IBM in a tribute video.
For example, Microsoft's success was in large part determined by its alliance with IBM in the creation of the PC, and then by exploiting IBM's ineptness to ride this via shrewd marketing, alliances, and the "natural monopoly" tendencies in IT. MS-DOS was a clone of CP/M that was bought, extended, and skillfully marketed. Zero innovation here.
Both Microsoft and Apple rely on research labs in other companies to produce innovation, which they then productize and market. Even Steve Jobs's smartphone was not an innovation per se: it was just a slick form factor that proved the most successful in the market. All of its functionality already existed in other products.
Facebook was a prelude to, and has given the world a glimpse into, the future.
From a purely technical point of view, Facebook is mostly junk. It is a tremendous database of user information which users supply themselves due to cultivated exhibitionism -- a kind of private intelligence company. The mere fact that the software was written in PHP tells you something about Zuckerberg's real level.
Amazon created a usable interface for shopping via the Internet (building a comments infrastructure and a usable user account database), but this is not innovation in any sense of the word. It prospered by stealing a large part of Walmart's logistics software (and people) and using Walmart's tricks with suppliers. So the Bezos model was a Walmart clone on the Internet.
Unless something is done, Bezos will soon be the most powerful man in the world.
People like Bezos, the Google founders, and, to a certain extent, Zuckerberg are part of the intelligence agencies' infrastructure. Remember PRISM. So implicitly we can assume that they all report to the head of the CIA.
Artificial Intelligence, AI, is another consequence of this era of innovation that
demands our immediate attention.
There is very little intelligence in artificial intelligence :-). Intelligent behavior of robots is mostly an illusion created by Clarke's First Law:
The gun data computer was a series of artillery computers used by the U.S. Army for coastal
artillery, field artillery and anti-aircraft artillery applications. In antiaircraft
applications they were used in conjunction with a director.
Variations
M1: This was used by seacoast artillery for major-caliber seacoast guns. It computed continuous firing data for a battery of two guns that were separated by not more than 1,000 feet (300 m). It utilised the same type of input data furnished by a range section with the then-current (1940) types of position-finding and fire-control equipment.
M18: FADAC (Field Artillery Digital Automatic Computer),[1][2] an all-transistorized general-purpose digital computer[3] manufactured by Amelco (Teledyne Systems, Inc.)[4] and North American -- Autonetics.[5] FADAC was first fielded in 1960,[6][7] and was the first semiconductor-based digital electronics field-artillery computer.
The Intel 4004 is a 4-bit central processing unit (CPU) released by Intel Corporation in
1971. It was the first commercially produced microprocessor,[2] and the first in a long line of
Intel CPUs.
The chip design, implemented with the MOS silicon gate technology, started in April 1970,
and was created by Federico Faggin who led the project from beginning to completion in 1971.
Marcian Hoff formulated and led the architectural proposal in 1969, and Masatoshi Shima
contributed to the architecture and later to the logic design. The first delivery of a fully
operational 4004 occurred in March 1971 to Busicom Corp. of Japan for its 141-PF printing
calculator engineering prototype (now displayed in the Computer History Museum – Mountain
View, Ca) [1]. This calculator, for which the 4004 was originally designed and built as a custom chip,[3] was first commercially available in July 1971.
*************************************************
[Note the distinction between military and commercial application.]
The 4004 was the first random logic circuit integrated in one chip using the MOS
(metal–oxide–semiconductor) silicon gate technology (SGT). It was the most advanced
integrated circuit (IC) design undertaken up until then.
The above is a good capsule summary of the FCS of the major battleships of each nation and
I'm going to refer to it in this post. For purposes of simplicity, I am only going to discuss
the British, German, Japanese and US ships. Since you are only interested in optical FC, I'm
also not going to discuss radar FC except to state, once again, that radar FC beats optical in
almost any situation and so there's little point in comparing the two.
Now, three points to note:
The British battleships lacked RPC almost entirely until late in the war. The Japanese
never implemented it in any meaningful way. The Germans had it only for elevation (more on
that later) while the USN had it for both elevation and bearing. To me, RPC is one of the
single most important advancements in the development of accurate gunnery. No longer did
the director operator shout down a voice tube "target bearing 230 degrees at 20,500 yards"
to which another sailor dialed into the rangekeeper (analog computer) which was then
transmitted to a set of dials in the gun mounts which then the gun captain had his trainers
and elevators match by slewing/elevating the guns to match the dials, with at every point a
potential for operator errors. Instead, the director operator now controlled the laying of
the guns almost directly. Plus, in the US system, there was a feedback system where the
rangekeeper moved the director sights to where it thought the ship should be shooting. If
it wasn't correct, the director operator adjusted the sights back on target, thus setting
up a closed-loop system (for a living, I design process control computers that do this
digitally. So, please trust me on this, I'm amazed at what was achieved in closed-loop
systems using 1930s analog computers). The German system had RPC only for the elevation,
but this is not as bad as it may seem. Since, in the German system of "bracket salvos," the
first half salvo is really to determine bearing, they did not feel that the extra
complication was necessary. Judging from the results at River Plate and Denmark Strait,
it's hard to argue, but, my personal opinion is that it would have been worth the
investment. The Japanese, as in much of their naval technology, used the same methodology
as their British mentors and used a "follow the pointer" system for both elevation and
training.
The main director baselength on the Japanese, German and US ships was roughly as long as on the main turrets and was relatively large. By contrast, the British had a
relatively short baselength both as compared to the turret rangefinders and as compared to
those of other nation's ships. In fact, the main director on British ships was almost an
afterthought, intended to be used more for fire direction than for fire control. The real
FC rangefinders in the British ships were the ones on the main turrets. Again, I was
surprised when I discovered this a couple of years ago, as the British pioneered
centralized FC and the Nelson class with their high-mounted directors greatly influenced
subsequent designs. The British reliance on turret RF's is really a holdover from pre- and
early-dreadnought days, where each individual turret laid the guns for itself.
This reliance on turret RF was shown to be flawed at the Denmark Strait: Since Adm. Holland chose to push his ships directly towards the Germans and thus into the wind, sea spray
coated the optics on all four forward turrets and forced the British to use the less
accurate main directors. I think that the results speak for themselves: The British fired
long-spaced ladder salvos and didn't land a hit until after the POW turned broadside to the
Germans (i.e., the turrets no longer faced into the sea spray) at a relatively short range
of about 16,000 yards. As a result of this engagement, the British belatedly realized their
design/concept flaw and installed a larger (but still relatively short) baselength director
on the last three KGV ships. However, only the forward director was modified, the aft
director was unchanged.
The use of stable vertical elements in the US systems. I call your attention to the
paragraph at the bottom of the above "Baddest" link, describing the performance of the USS
North Carolina (the oldest of the new battleships) during a series of maneuvers where she
still maintained target lock. Since you own "British Battleships," I won't repeat the
problems and successes that the British had with their systems. I have only limited
information on what other nation's ships were capable of, but it appears that the German's
systems were at least equivalent to those of the British (I assume this from descriptions
in Whitley's and Campbell's books plus Baron Mullenheim-Rechburg's comments in
"Survivor").
Bottom line: The Japanese and the Germans had better optical RF's than any other nation. In
a fight where only optical systems are used, they had a clear advantage as shown at Denmark
Strait, the River Plate and First Savo (I assume that you've seen my previous posts regarding
the POW's radar and won't revisit the subject). However, the US had the best FC as a system
(FCS). What this meant is that, when 10cm fire-control radar became available, the US was able
to easily integrate it into their FCS, thus creating the best overall FCS as compared to the
FCS used by any other nation. My (strictly amateur) conclusion is that the US FCS with radar
was the most advanced of any nation in the 1942-1945 timeframe.
One other item, per your last post: I do not take the Bismarck's performance at her last
battle as being truly indicative of her performance. The crew was exhausted by their night-long
skirmishing with Capt. Vian's destroyers and the Bismarck's motion was subject to random
direction changes. By contrast, the British battleships had rested crews firing from stable
platforms. As always, I dislike to speculate upon "what-ifs," especially this one, as, in my
opinion, an undamaged Bismarck with Adm. Lutjens in command wouldn't have fought against the
KGV and Rodney, she would have beat feet in the opposite direction as fast as possible. So,
I'll leave any thought as to what the outcome of an engagement between healthy ships to
others.
"... Among other things, Mr. Engelbart, who died in 2013 at 88 , envisioned a mechanical device that could move a cursor across a screen and perform discrete tasks by selecting particular symbols or images. Mr. English made this a reality, building the first computer mouse and, through a series of tests, showing that it could navigate a screen faster than any other device developed at S.R.I. ..."
"... As Mr. Engelbart demonstrated the machine onstage at the Civic Auditorium, a live video appeared on the wall behind him showing the seamless interaction between his mouse and the computer screen. Mr. English directed this elaborate production from the back of the auditorium, relying on cameras and microphones both there and at the lab that housed the computer in Menlo Park, Calif., more than 30 miles away. ..."
"... After Mr. Engelbart had envisaged the computer mouse and drawn a rough sketch of it on a notepad, Mr. English built it in the mid-1960s. Housed inside a small pinewood case, the device consisted of two electrical mechanisms, called potentiometers, that tracked the movement of two small wheels as they moved across a desktop. They called it a mouse because of the way the computer's on-screen cursor, called a CAT, seemed to chase the device's path. ..."
"... As they were developing the system, both Mr. English and Mr. Engelbart were part of the government-funded L.S.D. tests conducted by a nearby lab called the International Foundation of Advanced Study. Both took the psychedelic as part of a sweeping effort to determine whether it could "open the mind" and foster creativity. ..."
"... Three years after the demonstration, Mr. English left S.R.I. and joined a new Xerox lab called the Palo Alto Research Center, or PARC . There he helped adapt many of the NLS ideas for a new machine called the Alto, which became a template for the Apple Macintosh, the first Microsoft Windows personal computers and other internet-connected devices. ..."
ALGOL 60 at 60: The greatest computer language you've never used and grandaddy of the programming family tree
Back to the time when tape was king
By Richard Speed, 15 May 2020
An Elliott 803 at Loughborough Grammar School in 1976 (pic: Loughborough Schools Foundation / Peter Onion)
2020 marks 60 years since ALGOL 60 laid the groundwork for a multitude of computer languages.
The Register spoke to The National Museum of Computing's Peter Onion
and Andrew Herbert to learn a bit more about the good old days of punch tapes.
ALGOL 60 was the successor to ALGOL 58, which debuted in 1958. ALGOL 58 had introduced the concept of code blocks (replete with
begin and end delimiting pairs), but ALGOL 60 took these starting points of structured programming and
ran with them, giving rise to familiar faces such as Pascal and C, as well as the likes of B and Simula.
"In the 1950s most code was originally written in machine code or assembly code," said Herbert, former director of Microsoft Research
in Cambridge, with every computer having its own particular twist on things. A first generation of languages, called "Autocode",
existed for coding problems like equations which could then be translated into machine code, but lacked the bells and whistles of
today. Worse, some had features that others lacked, making hopping between systems tricky.
"There was an Autocode for the [Elliott] 803," said Onion, "but it only supported expressions like A + B = C, so if you've got
a complex equation, you have to break it down into individual single binary operations. So there was still a lot of hard work to
be done by the programmer."
"Fortran," said Herbert, "emerged as the first real programming language for scientific and numeric work. That convinced people
that having higher-level languages (as they called them then – they were pretty primitive by modern standards) made programmers more
productive."
The overhead of compiling, and inefficiencies in the compilers themselves, meant that machine code remained king of the performance
hill, but for those doing science work, the ability to churn out some code to solve a problem and then simply move on to the next
was appealing.
"Fortran," Herbert continued, "was more like an autocode," before laughing, "It still is in some ways!
"And a bunch of people thought you could do better."
Enter the International Federation for Information Processing (IFIP), which Herbert recalled "had a whole bunch of committees
who looked at standards and problems in computing".
One group started on the design of what was then called an "Algorithmic Language": a language for writing algorithms. The output,
in 1958, described the language "ALGOL 58". However, as engineers began to create compilers for the new system, they found "all kinds
of things hadn't really been thought about or worked through properly," recalled Herbert.
And so there were revisions and changes. A periodical called "
The ALGOL Bulletin " detailed
the travails of those involved as the problems and the weaknesses in the language were dealt with (or at least attempted).
The process was not unlike an open-source mailing list today, but in paper form.
Eventually, Herbert told us, "they published the ALGOL 60 report, which is the baseline that everyone then worked to."
The committees were under pressure and also suffered a little from differing international approaches. The American side had a
lot of experience in Fortran and were seeking something that could quickly be made to work on their computers, while the Europeans
were a little more cerebral and had, Herbert laughed, "terrible notions like beauty and elegance in mind for the language".
"People were sorting out some of the things that we now take for granted like ideas in structured programming, data structures,
data types," he added.
Seeking solutions to the problem of portability of programmers between systems and code between hardware generations as well as
avoiding the pain of having to rewrite programs every time a new iteration of computer arrived, vendors embraced the language with
variants cropping up over many manufacturers.
ALGOL 60 on tape (pic: Peter Onion)
Alas, those seeking a handy-dandy "HELLO WORLD" example will be disappointed. The Achilles' heel of the language that would go
on to inspire so many others was that it lacked standard input/output capabilities.
"The defining committee couldn't agree on how to do input/output," said Herbert. "They decided that would be left to a library,
and that library would be user dependent."
"In this case," added Onion, "the user being the compiler writer."
Oh dear. The omission pretty much did for vendor independence as manufacturers naturally went their own way, leaving large chunks
of code incompatible between systems. There were also elements of ALGOL 60 that were open to interpretation, leaving it a little
compromised from the start.
While ALGOL ploughed its furrow, Fortran continued to be developed in parallel. "People in the Fortran world," explained Herbert,
"saw ideas in ALGOL they quite liked and brought them across." As the decades passed, Fortran remained the centre of gravity for
scientific computing while ALGOL became more of an academic language, used for teaching computer science ideas.
"It was quite heavily used in the scientific community," Herbert said. "Most mainframe manufacturers supported it."
Some of the team behind ALGOL 60 stayed with the project and went on to come up with ALGOL 68, which, as far as Herbert is concerned,
"nailed all the things that ALGOL 60 had left a bit vague".
Indeed, it was hard to avoid in the 1970s for those taking computer science courses. This hack has fond memories of the successor
language, while the grandfather of Reg sub-editor
Richard Currie had a hand in
the development of ALGOL 68-R
and RS.
"It had the world's most exotic input output system," Herbert laughed.
It was also, sadly for its enthusiasts, a bit of a dead end. Despite ALGOL 68-R becoming widely used in (particularly British)
military applications for a time, it would take until the 1970s for a full implementation of ALGOL 68 to become available.
The last edition of
The ALGOL Bulletin was published in 1988, with its editor noting: "ALGOL 68 as a language is very stable. It is used and loved
by those who understand its benefits, and ignored (or misquoted) by the rest."
The story of ALGOL 60 is not so much about the language's eventual fate as about the languages it inspired. ALGOL W, based on a proposal for ALGOL X by Niklaus Wirth and QuickSort creator Tony Hoare, would go on to inspire Wirth's Pascal and Modula-2. Pascal's
influence continues to be felt today.
ALGOL 60 also heavily influenced the Combined Programming Language (CPL), developed in the 1960s but not implemented until the
following decade. CPL in turn led to Basic CPL (BCPL), from which B descended. The B language was further developed to become C.
Tony Hoare was responsible for the implementation of ALGOL 60 on the
Elliott 803 computer , an example of which remains
operational at The National Museum of Computing, although compiling and running a program on that hardware is a little different
to the development environments to which coders are now accustomed.
First, the compiler must be loaded from paper tape. The ALGOL program itself is then fed into the tape reader and "it sort of
chunters away," remarked Onion, "for anything between 30 seconds to perhaps 15 or 20 minutes during the compilation."
https://www.youtube.com/embed/AIxZ1i8pvZI
Once compiled, a program would be free to use the space originally occupied by the compiler. Doing so would, however, not win
the programmer any popularity awards since the next user would have to reload the compiler again. Leaving it in memory meant that
multiple programs could be run.
"That made it very popular for teaching," said Herbert, "because you can have a line of students, each with their paper tape with
their programme in their hand and you basically march up to the machine, the machine's got the ALGOL system loaded, you run your
programme, it produces gibberish, you go away and think about it and the next student runs their programme."
With paper tape being king, Onion observed that the experience of programming taught a bit of focus: "When your edit, compile, edit, compile cycle starts to get above about 10 minutes, you start to pay an awful lot of attention to your source code."
The National Museum of Computing has two Elliott machines in its
collection , a 1962 803B (which was donated after spending
15 years lurking in a barn following its decommissioning) and a 903. Both are fully operational and can be seen running once the
museum is able to open its doors once again.
The 803B, which is maintained by Onion, also features a Calcomp drum plotter as well as some additional input/output features.
The Lorenz attractor plotted by an ALGOL program (pic: Peter Onion)
As for taking the ALGOL 60 itself out for a spin today, there are a few options for those not fortunate enough to have an Elliott
803 or 903 to hand. MARST will translate ALGOL 60 to
C or one can get a feel for the whole 803 experience via a simulator.
Although as ALGOL 60 turns 60, you could just fire up a modern programming language. Lurking within will likely be the ideas of ALGOL's designers. ®
John Thorn
When I was studying Electronic Engineering in the early 1980s, ALGOL was the first language we were formally taught - I remember
the ALGOL-68R language guide was a Ministry of Defence book.
Algol 60 on an ICL 1902 around 1980 here; How many freaking errors did you get because of missing semicolons?;
As for PL/1, IBM also had its own extended versions (confidential for some reason), used for internal mainframe code development,
called PL/AS and PL/DS.
"The more I ponder the principles of language design, and the techniques that put them into practice, the more is my amazement
at and admiration of ALGOL 60. Here is a language so far ahead of its time that it was not only an improvement on its predecessors
but also on nearly all its successors".
- C.A.R. Hoare, "Hints on Programming Language Design", 1973
"When your edit, compile, edit, compile cycle starts to get above about 10 minutes, you start to pay an awful lot of attention
to your source code "
10 minutes? Luxury. Punched card jobs run in batches. 2 hours turn-round, max 3 runs a day with the compiler losing track after
the first error and rejecting every subsequent line. Then you really paid attention to your source code.
The mainframe operators soon learned that when a System 4 (IBM 360) Assembler run produced thousands of errors after the first
few statements it just needed a statement adding and rerunning. IIRC something like a USE or BALR 3,0 establishing the addressing
base register.
The punched card data-prep women also became quite competent at spotting common mistakes. Hence one compiler test source compiled cleanly - when it was supposed to test those error messages.
On an official training course to learn the System 4 Assembler there was a desk of us who had already had some hands-on practice.
The lecturer in Hut K was kept on his toes by our questions. When the set program task was returned from the computer run - he
gleefully gave us our failed run listings. We looked at them - then pointed out he had forgotten to include the macro expansion
pass. Oops! He then remembered that he always left it out on the first submission to save the expense on his machine time budget.
He didn't expect clean compilations from the students.
The "bad for science" machine was the IBM 360, where the floating point exponent represented 16**n rather than 2**n. As a result
the floating point mantissa sometimes lost three bits. The result was that the single precision floating point was good to only
about six decimal digits. Hence the proliferation of double precision floating point on IBM. It was not needed on ICL 190x nor
Elliott 803.
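[A rough sanity check on that six-digit figure -- my own illustrative sketch, not part of the original comment: hex normalization only guarantees that the leading hexadecimal digit of the 24-bit fraction is nonzero, so up to three leading bits may be zero, leaving as few as 21 significant bits.]

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        /* IBM System/360 single precision: 24-bit fraction, exponent is a power of 16.
           Hex normalization only guarantees a nonzero leading hex digit, so up to
           three leading bits of the fraction may be zero -> as few as 21 significant bits. */
        printf("S/360 single, worst case: %.1f decimal digits\n", 21 * log10(2.0)); /* ~6.3 */
        printf("S/360 single, best case:  %.1f decimal digits\n", 24 * log10(2.0)); /* ~7.2 */
        /* An IEEE 754 binary32 float always carries 24 significant bits (~7.2 digits). */
        return 0;
    }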
IBM were mainly interested in commercial arithmetic from COBOL compilers. This used binary coded decimal (BCD) arithmetic,
which could handle billions of dollars to the nearest cent. COBOL type computational defaulted to BCD, I believe. I was once trying
to explain floating point data to a database salesman. I finally got through to him with the phrase computational-type-3.
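[To see why exact decimal or integer arithmetic mattered for money -- again an illustrative sketch, not part of the comment -- compare a 64-bit count of cents with a 32-bit binary float:]

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* "Billions of dollars to the nearest cent": an integer count of cents
           (or packed decimal/BCD, as COBOL COMP-3 used) holds the amount exactly. */
        int64_t cents = 123456789012LL;        /* $1,234,567,890.12, exact */
        float   as_float = 1234567890.12f;     /* binary float: only ~7 significant digits */

        printf("integer cents: %lld.%02lld\n",
               (long long)(cents / 100), (long long)(cents % 100));
        printf("binary float : %.2f\n", as_float);  /* prints 1234567936.00, off by about $46 */
        return 0;
    }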
I do! Back in the early-1980s, working at the in-house consultancy arm of a multinational, I was on-site doing some tedious
task when I was joined by a colleague from our Dutch office. He had even less patience than me, so less than 30 minutes in, he
resorted to giving me a formal lecture on Backus-Naur notation for the rest of the morning (and returned to Rotterdam in the afternoon).
(When the main Board closed us down the following year, he returned to the University of Leiden. Thank you, Joss - I'll never
forget BNF.)
"When your edit, compile, edit, compile cycle starts to get above about 10 minutes, you start to pay an awful lot of attention
to your source code..."
I remember those days, except I was using punch cards instead of paper tape. Those long turn-arounds forced you to desk check
your code and spend time debugging it properly.
A tale of how a sysadmin went from hobbyist and Level 1 tech to the big time at Red Hat.
We've all got a story, right? I don't know if anyone would read mine, but, to the right audience, it
might sound familiar, or at least relatable. My life is sort of fractured between two things I'm
passionate about. One is the off-road industry, and the other is open source software. Some time ago,
I was involved in a YouTube "challenge" where a bunch of us off-road enthusiasts were asked to share
our story, and I told the tale of how I got involved in "Jeeping" and why we do what we do in that
space. Here, I am about to tell the other side of my story, where I'm a computer guy with a bunch of
nerd cred. So hang on while I tell you the story of a broke high-school kid who stumbled into a
career.
Career Advice
I was a kid in the '80s; before computers were everywhere you looked. My dad, though, he was his
generation's version of a geek -- a telephone guy, son of a handyman who worked his way through the
Depression, making whatever he could out of whatever he had. As a kid, my dad involved me in all sorts
of projects involving building things or wiring, and we even built an electrified wooden toy
helicopter together. I learned from him that you could build your own happiness. I always joke with
people that I learned my alphabet on a Texas Instruments computer connected to our living room TV. My
first game system was an Atari computer with a 5.25" floppy drive.
In the early '90s, my dad brought home a state-of-the-art 486 desktop computer. Our first modern
computer! He gave me his old Atari to put in my bedroom and tinker on, and that's what I did. The 486,
though, that thing had a modern operating system on it, and I was nothing short of enthralled. A
friend introduced me to some local dial-up bulletin board systems.
That, I'd have to say, is what started it all. I learned so much from this community of like-minded
folks; all dialed into this little BBS. Eventually, I became very curious about how the BBS itself
worked and started tinkering with BBS software. I found the space to be expensive, though; you needed
hardware I couldn't afford.
Then came the Internet. As I mentioned, my dad was a telephone guy. He was, at the time, an
engineer at a local telephone company. One of the partner companies under the same umbrella as his
telco was starting up an internet service provider! He was able to get in early, and I was one of the
first kids in my grade to get access to the Internet.
The Internet was a very different place at the time. You'd dial in, so it was much slower, and,
because of the speed and the technologies involved, it was very much a text world. It was certainly
not the entertainment source it is today, but to me, it was still just amazing!
I had a friend who was just as intrigued by technology as I was. He and I were both interested in
the world outside of the end-user experience of these neat little computer things. We read everything
we could get our hands on. We read about websites and servers and how all of these things worked
together. We read about some of the darker sides of technology, the Hackers Handbook, and how phone
phreaking worked. I even learned a bit about how to pick locks! Lucky for me, and my parents, I've
always been guided by a pretty fierce sense of morality, or my life could have turned out much
differently.
Our reading and learning eventually led us to Linux. The word "free" associated with Linux caught
our attention. I didn't have a job, I was in high school, so free was good. Little did I know that
picking up that first Linux distro (Red Hat 5.0, I still have the CDs) would steer me into the career
I've worked at for my entire adult life. My friend, by the way, he runs the Engineering team at that
ISP I mentioned now. I guess our curiosity worked out pretty well for him too!
During the summer between my sophomore and junior years in High School, I picked up and started
tinkering with those Red Hat 5.0 install discs. I installed, and reinstalled, and reinstalled, and
reinstalled that OS on my little 486 until I finally got it right. I even got dual-booting working, so
I could keep my Windows environment and play with Linux. After I graduated, my parents bought me a new
PC to use for my school work in college, so I was able to turn my little 486 into a dedicated Linux
machine. By now, we'd moved from dial-up internet service to dedicated cable. 500mbps baby! I ran a
website off of my little 486. I lost track of the number of times I had to wipe and reinstall that
system because some malicious actor broke into my poor little machine and flattened it on me, but I
persisted, learning something else each time.
While I was in college, I worked in level 1 tech support for the ISP I mentioned above. I didn't
love it. I had no control over the services I was supporting, and let's face it, level 1 tech support
is a frustrating IT job. I spent five years doing that, trying and failing to get into the system
administration group at the ISP. Eventually, I moved up into a network support role, which was much
better than level 1, but not where I wanted to be. I was OK at networking, and I certainly could have
made a career out of it, but it wasn't what I wanted to do. I wanted to run servers. I wanted to run
Linux.
So, after seven years at the ISP, I left and started a job as a network administrator at a small
web host. We were a team of about ten people, though that varied during the time I was there. "Network
administrator" was a very loose title there. I was responsible for everything that had a CPU in it. I
even had to replace the filter in the building's AC unit. I was responsible for network gear, some WAN
links, Cisco routers, switches, and of course, Windows, Linux, and BSD servers. This was much more in
line with what I wanted to do. However, I didn't love how they were doing it, not just from a
technology aspect, but from a business perspective. They did some things that I thought were
questionable. Still, though, I was gaining experience, in so many ways. I implemented more and more
Linux systems to replace Windows and BSD systems there, architected improvements, and generally did
the best job I knew how to do.
After about three and a half years there, I left that web host for what I thought would be my last
move. I started as a system administrator at a small liberal arts college near home. By this point,
I'm married, and my wife and I are planning on a family. Higher education has some benefits that many
people might not know about. It's a great atmosphere, and they put a lot of emphasis on bettering
yourself, not just putting in your hours. The only real downside is that the pay is lower than in the
private sector. Still, this was an increase over what I was making, and I didn't know it at the time,
but I was walking into a team that helped set me up for the future in ways I couldn't have imagined.
See, I believe that IT is made up of two types of people: folks who see IT as a lucrative career,
and folks who are passionate about IT and get paid to do it. This place was about 50% passionate
people. I had never worked so closely with people so excited to do what they do. I felt like I was at
home, I was learning new things every day, and talking with some of the most brilliant IT people I'd
ever met. What's more, they all wanted to share what they knew.
Well, over time, that slowly changed. Those brilliant people took other jobs, some changes in
management forced some others out, and eventually, I found that I was one of the few left who was
still passionate about what had been so important to the mission at the college. More cloud adoption
meant less need for a do-it-yourselfer like me. My "I'm going to retire here" plans started to
crumble. I eventually moved into a new role they created for high-performance computing, which had
promise. We started deploying the college's first HPC cluster. Then I got a message one Sunday
afternoon from a contact I'd made within Red Hat.
I'd met Marc (Unclemarc to those who know him) through the Red Hat Accelerators, a customer
advocacy group that Red Hat runs, and of which I'd become a member in late 2018. We hung out at Summit
in Boston in early 2019, and apparently, he liked what he saw in me. He let me know that the team he's
on would likely have an opening soon, and he thought I'd make a great addition. Now, for me, the
prospect of a job at Red Hat sounded almost too good to be true. I'd been a fan of Red Hat since... well, remember when I said I bought that first Linux distro install disc in 1997 or so? It was Red Hat
Linux. I'd based a career on a Linux distro I'd bought out of an interest in a better way of doing
something, when I was a kid in high school, looking for a cheaper alternative. Now here I am, a few
months into Technical Account Management at Red Hat. I guess you could say I'm pleased with where this
path has taken me.
Nathan Lager is a Technical Account Manager with Red Hat and an experienced sysadmin with 20 years in the industry. He first encountered Linux (Red Hat 5.0) as a teenager in the late '90s, after deciding that software licensing was too expensive for a kid with no income.
The America of the moon-landing is not the America of today. Graduates of business schools
have taken over once great engineering companies. The business students are of a lower
intellect and baser motivation -- the worst possible combination.
The desire for science and engineering has weakened in America but greed for money and
wealth is greatly increased. The business types produce mostly intellectual garbage and
compensate for it with volume. No competent intellect can read it (or wants to do so) and so
it remains unchecked, inflates even more and clogs everything.
You can live for a long time on the great inheritance your fathers have bequeathed you, but you cannot live on it forever. Yet this is what we are trying to do, in more ways than one.
The shock in the US was that the Russians were not only competitive, but had embarrassed US
science and engineering by being first. In 1958, President Eisenhower signed into law the
National Defense Education Act, and this enabled talented students to flow into science and
engineering. The shock waves were felt throughout the entire educational system, from top to
bottom. Mathematics was more important than football.
The world is filled with conformism and groupthink. Most people do not wish to think for
themselves. Thinking for oneself is dangerous, requires effort and often leads to rejection by
the herd of one's peers.
The profession of arms, the intelligence business, the civil service bureaucracy, the
wondrous world of groups like the League of Women Voters, Rotary Club as well as the empire of
the thinktanks are all rotten with this sickness, an illness which leads inevitably to
stereotyped and unrealistic thinking, thinking that does not reflect reality.
The worst locus of this mentally crippling phenomenon is the world of the academics. I have
served on a number of boards that awarded Ph.D and post doctoral grants. I was on the Fulbright
Fellowship federal board. I was on the HF Guggenheim program and executive boards for a long
time. Those are two examples of my exposure to the individual and collective academic
minds.
As a class of people I find them unimpressive. The credentialing exercise in acquiring a
doctorate is basically a nepotistic process of sucking up to elders and a crutch for ego
support as well as an entrance ticket for various hierarchies, among them the world of the
academy. The process of degree acquisition itself requires sponsorship by esteemed academics
who recommend candidates who do not stray very far from the corpus of known work in whichever
narrow field is involved. The endorsements from RESPECTED academics are often decisive in the
award of grants.
This process is continued throughout a career in academic research. PEER REVIEW is the
sine qua non for acceptance of a "paper," invitation to career making conferences, or
to the Holy of Holies, TENURE.
This life experience forms and creates CONFORMISTS, people who instinctively boot-lick their
fellows in a search for the "Good Doggy" moments that make up their lives. These people are for
sale. Their price may not be money, but they are still for sale. They want to be accepted as
members of their group. Dissent leads to expulsion or effective rejection from the group.
This mentality renders doubtful any assertion that a large group of academics supports any
stated conclusion. As a species academics will say or do anything to be included in their
caste.
This makes them inherently dangerous. They will support any party or parties, of any
political inclination if that group has the money, and the potential or actual power to
maintain the academics as a tribe. pl
That is the nature of tribes and humans are very tribal. At least most of them.
Fortunately, there are outliers. I was recently reading "Political Tribes", written by a couple who are both law professors, which examines this.
Take global warming (aka the rebranded climate change). Good luck getting grants to do any
skeptical research. This highly complex subject which posits human impact is a perfect
example of tribal bias.
My success in the private sector comes from consistently questioning what I wanted to be true, in order to prevent suboptimal design decisions.
I also instinctively dislike groups that have some idealized view of "What is to be
done?"
As Groucho said: "I refuse to join any club that would have me as a member"
The 'isms' had it, be it Nazism, Fascism, Communism, Totalitarianism, Elitism: all demand conformity and adherence to groupthink. If one does not kowtow to whichever 'ism' is at play, those outside their groupthink are persecuted, ostracized, jailed, and executed, all because they defy their conformity demands and defy allegiance to them.
One world, one religion, one government, one Borg. all lead down the same road to --
Orwell's 1984.
David Halberstam: The Best and the Brightest. (Reminder how the heck we got into Vietnam,
when the best and the brightest were serving as presidential advisors.)
Also good Halberstam re-read: The Powers that Be - when the conservative media controlled
the levers of power; not the uber-liberal one we experience today.
The monumental impact of C
The season finale of Command Line Heroes offers a lesson in how a small community of open source enthusiasts can change the world. 01 Oct 2019, Matthew Broberg (Red Hat)
The Command Line Heroes podcast explores C's origin story in a way that showcases the longevity and power of its design. It's a perfect synthesis of all the languages discussed throughout the podcast's third season and this series of articles.
C is such a fundamental language that many of us forget how much it has changed. Technically
a "high-level language," in the sense that it requires a compiler to be runnable, it's as close
to assembly language as people like to get these days (outside of specialized, low-memory
environments). It's also considered to be the language that made nearly all languages that came
after it possible.
The path to C began with failure
While the myth persists that all great inventions come from highly competitive garage
dwellers, C's story is more fit for the Renaissance period.
In the 1960s, Bell Labs in suburban New Jersey was one of the most innovative places of its
time. Jon Gertner, author of The Idea Factory, describes the
culture of the time marked by optimism and the excitement to solve tough problems. Instead of
monetization pressures with tight timelines, Bell Labs offered seemingly endless funding for
wild ideas. It had a research and development ethos that aligns well with today's open
leadership principles. The results were significant and prove that brilliance can come
without the promise of VC funding or an IPO.
The challenge back then was terminal sharing: finding a way for lots of people to access
the (very limited number of) available computers. Before there was a scalable answer for that,
and long before we had a shell like Bash, there was the
Multics project. It was a hypothetical operating system where hundreds or even thousands of
developers could share time on the same system. This was a dream of John McCarthy, creator of Lisp and the term artificial intelligence (AI), as I recently explored.
Joy Lisi Rankin, author of A People's History of Computing in the United States, describes what happened next. There was a lot of public interest
in driving forward with Multics' vision of more universally available timesharing. Academics,
scientists, educators, and some in the broader public were looking forward to this
computer-powered future. Many advocated for computing as a public utility, akin to electricity,
and the push toward timesharing was a global movement.
Up to that point, high-end mainframes topped out at 40-50 terminals per system. The change
of scale was ambitious and eventually failed, as Warren Toomey writes in IEEE Spectrum:
"Over five years, AT&T invested millions in the Multics project, purchasing a GE-645
mainframe computer and dedicating to the effort many of the top researchers at the company's
renowned Bell Telephone Laboratories -- including Thompson and Ritchie, Joseph F. Ossanna,
Stuart Feldman, M. Douglas McIlroy, and the late Robert Morris. But the new system was too
ambitious, and it fell troublingly behind schedule. In the end, AT&T's corporate leaders
decided to pull the plug."
Bell Labs pulled out of the Multics program in 1969. Multics wasn't going to
happen.
The fellowship of the C
Funding wrapped up, and the powerful GE645 mainframe was assigned to other tasks inside Bell
Labs. But that didn't discourage everyone.
Among the last holdouts from the Multics project were four men who felt passionately tied to
the project: Ken Thompson, Dennis Ritchie, Doug McIlroy, and J.F. Ossanna. These four diehards
continued to muse and scribble ideas on paper. Thompson and Ritchie developed a game called
Space Travel for the PDP-7 minicomputer. While they were working on that, Thompson started
implementing all those crazy hand-written ideas about filesystems they'd developed among the
wreckage of Multics.
A PDP-7 minicomputer was not top-of-the-line technology at the time, but the team implemented foundational technologies that changed the future of programming languages and operating systems alike.
That's worth emphasizing: Some of the original filesystem specifications were written by
hand and then programmed on what was effectively a toy compared to the systems they were using
to build Multics. Wikipedia's Ken Thompson page dives deeper
into what came next:
"While writing Multics, Thompson created the Bon programming language. He also created a
video game called Space Travel . Later, Bell Labs
withdrew from the MULTICS project. In order to go on playing the game, Thompson found an old
PDP-7 machine and rewrote
Space Travel on it. Eventually, the tools developed by Thompson became the Unix operating system : Working on a PDP-7, a
team of Bell Labs researchers led by Thompson and Ritchie, and including Rudd Canaday,
developed a hierarchical file
system , the concepts of computer processes and device files , a command-line
interpreter , pipes for easy inter-process
communication, and some small utility programs. In 1970, Brian Kernighan suggested the name
'Unix,' in a pun on the name 'Multics.' After initial work on Unix, Thompson decided that
Unix needed a system programming language and created B , a precursor to Ritchie's
C ."
As Warren Toomey documented in the IEEE Spectrum article mentioned above, Unix showed promise in a way the Multics project never did. After winning over the team and doing a lot more programming, the pathway to Unix was paved.
Getting from B to C in Unix
Thompson quickly created a Unix language he called B. B inherited much from its predecessor
BCPL, but it wasn't enough of a breakaway from older languages. B didn't know data types, for
starters. It's considered a typeless language, which meant its "Hello World" program looked
like this:
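[The listing itself is missing from this copy of the article. The example being referred to is the well-known one from Kernighan's B tutorial; reproduced from memory, it reads roughly as follows, with each global variable holding at most four characters packed into a single machine word:]

    main( ) {
        extrn a, b, c;
        putchar(a); putchar(b); putchar(c); putchar('!*n');
    }

    a 'hell';
    b 'o, w';
    c 'orld';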
Even if you're not a programmer, it's clear that carving up strings four characters at a
time would be limiting. It's also worth noting that this text is considered the original "Hello
World" from Brian Kernighan's 1972 book, A tutorial introduction to the language
B (although that claim is not definitive).
Typelessness aside, B's assembly-language counterparts were still yielding programs faster
than was possible using the B compiler's threaded-code technique. So, from 1971 to 1973,
Ritchie modified B. He added a "character type" and built a new compiler so that it didn't have
to use threaded code anymore. After two years of work, B had become C.
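[For contrast, here is a minimal hello-world in C -- my own illustrative sketch, not from the original article -- showing what the character type bought: strings become simple arrays of char, with no hand-packing of four characters per word.]

    #include <stdio.h>

    int main(void)
    {
        /* With C's char type a string is just an array of characters;
           there is no need to pack four characters into each word by hand as in B. */
        char greeting[] = "hello, world";
        printf("%s\n", greeting);
        return 0;
    }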
The right abstraction at the right time
C's use of types and ease of compiling down to efficient assembly code made it the perfect language for the rise of minicomputers, which speak in machine code. B was eventually overtaken by
C. Once C became the language of Unix, it became the de facto standard across the budding
computer industry. Unix was the sharing platform of the pre-internet days. The more
people wrote C, the better it got, and the more it was adopted. It eventually became an open
standard itself. According to the Brief History of C Programming Language:
"For many years, the de facto standard for C was the version supplied with the Unix
operating system. In the summer of 1983 a committee was established to create an ANSI
(American National Standards Institute) standard that would define the C language. The
standardization process took six years (much longer than anyone reasonably expected)."
How influential is C today? A quick
review reveals:
Parts of all major operating systems are written in C, including macOS, Windows, Linux,
and Android.
The world's most prolific databases, including DB2, MySQL, MS SQL, and PostgreSQL, are
written in C.
Many programming languages are implemented in C, including Python, Go, Perl's core interpreter, and the R statistical language.
Decades after they started as scrappy outsiders, Thompson and Ritchie are praised as titans
of the programming world. They shared 1983's Turing Award, and in 1998, received the National Medal of Technology for their work on the C language and Unix.
But Doug McIlroy and J.F. Ossanna deserve their share of praise, too. All four of them are
true Command Line Heroes.
Wrapping up the season
Command Line
Heroes has completed an entire season of insights into the programming languages that
affect how we code today. It's been a joy to learn about these languages and share them with
you. I hope you've enjoyed it as well!
(cnet.com) Posted by BeauHD on Monday October 14, 2019 @10:10PM from the nostalgia-blast dept. The latest update from Internet Archive
brings thousands of MS-DOS games from the '90s like 3D Bomber, Zool and Alien Rampage. CNET
reports: On Sunday, Internet Archive released 2,500 MS-DOS games that include action, strategy and adventure titles. Some of the games are Vor Terra, Spooky Kooky
Monster Maker, Princess Maker 2 and I Have No Mouth And I Must Scream. "This will be our
biggest update yet, ranging from tiny recent independent productions to long-forgotten big-name
releases from decades ago," Internet Archive software curator Jason Scott wrote on
the site's blog .
One game that might trigger a few memories is the 1992 action-adventure horror game
Alone in the Dark ,
published by Infogrames. In the game, you can play private investigator Edward Carnby or family
member Emily Hartwood, who's investigating the suspicious death of Jeremy Hartwood in his
Louisiana mansion called Derceto, which is now supposedly haunted. Fighting against rats,
zombies and giant worms, you have to solve a number of puzzles to escape. Another retro game
included by Internet Archive is a 1994 title played on PCs and Amiga computers called
Mr. Blobby (a remake
of the SNES game Super Troll Islands). Players can choose from three different characters --
Mr. Blobby, Mrs. Blobby and Baby Blobby. The goal of the game is to color in the computer
screen by walking over it. Levels include climbing ladders, avoiding spikes and bouncing on
springs.
They write that its early popularity in the mid-1990s came in part because "Microsoft needed
software capable of showing video on their website, MSN.com, then the default homepage of every
Internet Explorer user." But Flash allowed anyone to become an animator. (One Disney
artist tells them that Flash could do in three days what would take a professional animator 7
months -- and cost $10,000.)
Their article opens in 2008, a golden age when Flash was installed on 98% of desktops --
then looks back on its impact: The online world Flash entered was largely static. Blinking
GIFs delivered the majority of online movement. Constructed in early HTML and CSS, websites
lifted clumsily from the metaphors of magazine design: boxy and grid-like, they sported borders
and sidebars and little clickable numbers to flick through their pages (the horror).
Some of these websites were, to put it succinctly, absolute trash. Flash was applied
enthusiastically and inappropriately. The gratuitous animation of restaurant websites was
particularly grievous -- kitsch abominations, these could feature thumping bass music and
teleporting ingredients. Ishkur's 'guide to electronic music' is a
notable example from the era you can still view -- a chaos of pop arty lines and bubbles and
audio samples, it looks like the mind map of a naughty child...
In contrast to the web's modern, business-like aesthetic, there is something bizarre,
almost sentimental, about billion-dollar multinationals producing websites in line with Flash's
worst excess: long loading times, gaudy cartoonish graphics, intrusive sound and
incomprehensible purpose... "Back in 2007, you could be making Flash games and actually be
making a living," remembers Newgrounds founder Tom Fulp, when asked about Flash's golden age.
"That was a really fun time, because that's kind of what everyone's dream is: to make the games
you want and be able to make a living off it."
Wired summarizes Steve Jobs' "brutally candid" diatribe against Flash in 2010. "Flash drained
batteries. It ran slow. It was a security nightmare. He asserted that an era had come to an
end... '[T]he mobile era is about low power devices, touch interfaces and open web standards --
all areas where Flash falls short.'" Wired also argues that "It was economically viable for him
to rubbish Flash -- he wanted to encourage people to create native games for iOS."
But they also write that today, "The post-Flash internet looks different. The software's
downfall precipitated the rise of a new aesthetic...one moulded by the specifications of the
smartphone and the growth of social media," favoring hits of information rather than striving
for more immersive, movie-emulating thrills.
And they add that though Newgrounds long-ago moved away from Flash, the site's founder is
now working on a Flash emulator to keep all that early classic content playable in a
browser.
Eugene Miya, a friend/colleague. Sometimes driver. Other shared experiences. Updated Mar 22, 2017
He mostly writes in C today.
I can assure you he at least knows about Python. Guido's office at Dropbox is 1 -- 2 blocks
by a backdoor gate from Don's house.
I would tend to doubt that he would use R (I've used S before as one of my stat packages).
Don would probably write something for himself.
Don is not big on functional languages, so I would doubt either Haskell (sorry Paul) or LISP
(but McCarthy lived just around the corner from Don; I used to drive him to meetings; actually,
I've driven all 3 of us to meetings, and he got his wife an electric version of my car based on
riding in my car (score one for friend's choices)). He does use emacs and he does write MLISP
macros, but he believes in being closer to the hardware which is why he sticks with MMIX (and
MIX) in his books.
Don't discount him learning the machine language of a given architecture.
I'm having dinner with Don and Jill and a dozen other mutual friends in 3 weeks or so (our
quarterly dinner). I can ask him then, if I remember (either a calendar entry or an at job). I try
not to bother him with things like this. Don is well connected to the hacker community
Don's name was brought up at an undergrad architecture seminar today, but Don was not in the
audience (an amazing audience; I took a photo for the collection of architects and other
computer scientists in the audience (Hennessy and Patterson were talking)). I came close to
biking by his house on my way back home.
We do have a mutual friend (actually, I introduced Don to my biology friend at Don's
request) who arrives next week, and Don is my wine drinking proxy. So there is a chance I may
see him sooner.
Don Knuth would want to use something that’s low level, because details matter. So no Haskell; LISP is borderline. Perhaps if the Lisp machine had ever become a thing.
He’d want something with well-defined and simple semantics, so definitely no R. Python
also contains quite a few strange ad hoc rules, especially in its OO and lambda features. Yes
Python is easy to learn and it looks pretty, but Don doesn’t care about superficialities
like that. He’d want a language whose version number is converging to a mathematical
constant, which is also not in favor of R or Python.
What remains is C. Out of the five languages listed, my guess is Don would pick that one.
But actually, his own old choice of Pascal suits him even better. I don’t think any languages have been invented since TeX was written that score higher on the Knuthometer than Knuth’s own original pick.
And yes, I feel that this is actually a conclusion that bears some thinking about.
Dan Allen, I've been programming for 34 years now. Still not finished. Answered Mar 9, 2017
In The Art of Computer Programming I think he'd do exactly what he did. He'd invent his own
architecture and implement programs in an assembly language targeting that theoretical
machine.
He did that for a reason: he wanted to reveal the detail of algorithms at the lowest level, which is machine level.
He didn't use any of the languages available at the time, and I don't see why they would suit his purpose now. All the languages above are too high-level for his purposes.
The chapter surveys some aspects of the Soviet computer software world and examines how
computers applied in several fields enjoy a high level of official support. The chapter
examines seven major areas of computer applications in the USSR. Various automated systems of
management and control (ASU) are discussed. The state of computing in Soviet research and
development organizations, which found themselves low on the priority list when it came to
allocating computer and communications technology until the mid 1980s is also described.
Computer networking is also developing very slowly in the USSR. The Ministry of
Telecommunications is hostile to data communications and places various impediments in the way
of organizations desiring to use the switched network for this purpose. The chapter reviews
Soviet educational computing. Computer courses with a curriculum stressing the development of
programming skills and "algorithmic thinking" were introduced into Soviet schools. Computer
Aided Design (CAD) is the latest applications area of highest priority. The chapter emphasizes
that without radical change, the Soviet software industry will be unable to satisfy domestic
demand for high-quality software. The consequence is that Western software will be in great and
growing demand, which raises a policy question for the United States and its software
industry.
kwertii
asks:
"Does anyone have any information
on computing in the former Soviet Union? A Google search turned up
this virtual museum
, which has some good historical background on the development of early Soviet computer
technology (a lot only in Russian, unfortunately) but not much on later systems. What sorts of architectures
did Soviet computers use? Were there any radically different computing concepts in use, like a standard 9-bit
byte or something? What kind of operating systems were common? How has the end of the Cold War and the large
scale introduction of Western computer technology affected the course of Russian computer development?"
The reality is that the KGB was stealing American computer designs from the beginning. As
Glasnost was coming into being, and the West was getting a look into how things worked
inside the Soviet system, they discovered that they were running clones of the IBM 360's.
I've seen an interview recently with an ex-KGB big-wig who said he realized how bankrupt the
Soviet system was as he learned how little they developed "in house" rather than copied from
the west. The Soviets were always one or two generations of technology behind simply because
they weren't inventing it.
On a slightly different note, I found this link a while ago that discusses, in great depth,
Sinclair Clones
[nvg.ntnu.no] from the late 1970's to the early 1990's.
Another thing I remember reading a long while ago was an article in "A+/Incider" magazine (an
Apple II magazine) where the cover story was the giant headline "Red Apples"; in it they talked
about a clone of the Apple IIe that looked like a negative of the Apple IIe we know (black
case, white keys), but otherwise was more or less the same -- compatible logic, just made
somewhere else. I may even throw that copy on my flatbed if there is enough interest.
If I had to guess, all but either very high-end or very early machines will be of the same
design as their Western counterparts, probably for engineering reasons: an engineer doesn't
want to reinvent the wheel (or bitwise logic in this case) just to make a machine to do word
processing.
Here's some info on the
Agat
[old-computers.com] - a clone of an Apple II.
If you want to buy an old Russian computer,
try here (has many pictures!)
[seller2001.euro.ru]. I don't know if this guy's stock is
representative of 1980's Russian computing, but it contains a lot (31) of
Sinclair clones
[seller2001.euro.ru], and information on
other computers, including IBM PC-compatibles
[seller2001.euro.ru]. If nothing, the
names listed should help searches.
Sinclair clones are VERY representative of the personal computer market of that time. There
were literally dozens of variants, with various extensions and addons, custom operating
systems, modified OSes, etc. They ranged from self-made (I've had one of those, total cost: $20),
through mass-produced PC boards and cases, to fully factory-made (even with the OS translated
to Russian).
Most of them connected to a TV and used tape recorders for storage. Eventually, I had a
dot-matrix printer and could've gotten a 5" floppy drive if I really wanted. I've seen
mice, modems and light pens. I've seen a cable and broadcast TV system's audio channel used
to broadcast binary data when the station wasn't broadcasting regular programming (would that
be a predecessor to cable modems?). We would record the audio to tapes and then load them back
into the computer.
There were clones of 286 PCs as well (Poisk), although that was just about when I moved
to this side of the ocean.
There were also completely original computers with a BASIC or FORTRAN interpreter as the
"operating system".
I remember trying to set up a system with a friend across town where the spectrums
were wired up to mangled phones and we'd send messages by saving a program across the
phone that the other end would load and then repeat... each message also included the
basic app required to send the next one - or something - I forget now
- Z80 machines running CP/M or custom operating systems like the DIOS
- Sinclair clones
When the opening to the West happened, there was a huge leap in technology because 286 and
386SX PCs were brought in.
I was fortunate enough to have one, and it seemed to me, at that time, that they had
gigantic CPU power and a huge memory.
I was running benchmarks all the time to compare my 386sx with my Sinclair.
My 386sx was about 10-15 times faster, and had 15 times more memory!
How was that for a leap?
Now in Eastern Europe we have very good programmers. Why?
Because, when the outside world is not that interesting and funny, more and more people have
fun (I mean, programming is lots of fun) with their computers!
Thank you for your time reading this, and sorry for posting as AC. I don't have a
./ account and I find logging in each time in order to read ./ pretty hard.
>Thank you for your time reading this, and sorry for posting as AC. I don't have a
./ account and I find logging in each time in order to read ./ pretty hard.
you know, you can cookie your logon and only have to actually log on once a year, when
your cookie expires.
They have (had?) one of the Russian Sinclair clones in a display case by the stairs at the
staff entrance of the National Science Museum in London when I was there (~5 years ago).
First time I had ever seen one. I've often thought how much fun it must have been trying to
deal with the strange 5-functions-per-key system that the Spectrum had PLUS having everything
in Cyrillic!
I have a good friend who is from Bulgaria, and there they mass-produced an Apple IIe
knockoff called the Pravetz. They reverse engineered the Apple and started making their own
version. He said that they ended up being more powerful than any of the Apple II line.
People like the Dark Avenger (ever had a
real
computer virus? he probably wrote it)
grew up hacking these things. Anyway, they are mentioned in a really good
wired article
[wired.com] about the Dark Avenger and the Soviet Bloc's more recent
computing history, and Woz even has a
picture of one
[woz.org] on his website.
Mod parent up -- it's one of the only two informative posts so far (and no, that guy ranting
about how you have to go to the library to do research is not insightful)
In the late 70s or early 80s ACM's "Computing Surveys" ran an article on Soviet computing.
Here's what I remember:
The Soviets said that military computers were generally original designs.
Most of the commercial computers were either IBM 360/370 models diverted through 3rd
countries (direct exports were prohibited) or the Soviet "Ryad" line. Ryads were 360/370
copies. Not having to worry about copyright and patent issues, the East copied IBM
mainframes directly. IBM engineers recognized an I/O problem with one Soviet model, since
the IBM original had the same problem. Just as the 360 model development was split among
groups in Poughkeepsie and Endicott, different Soviet Bloc countries were assigned
development/manufacturing responsibility for the copies.
Software was, of course, pirated OS/360. (Back in those days, software came with source.)
I found acm.org's site search to be unusable on Linux/Mozilla, which is ironic --
however, a Google search on
"soviet site:acm.org"
[google.com] turned up some interesting papers available as pdf
(special tribute to Russian Dmitry Sklyarov ?):
Let's see, so far we've got one offtopic post, one bigoted and ignorant post from the Tom
Clancy crowd, and the usual noise. I don't think you'll get much help here. Sometimes all you
can find online is opinion and rumor.
Now, don't get me wrong. I love the Web in general and Slashdot in particular. Both are
invaluable resources for obscure little questions like the one you're asking. I know I used to
write technical documentation without having the net as a reference source -- but I'm damned if
I remember how.
Still, information you can get through this kind of informal research is limited in scope.
There's a lot of stuff online -- but a lot more that's not. A lot of texts exist only in
proprietary databases, not on the web. Not to mention the much larger document base that
simply doesn't exist in electronic form.
You need to find a good library, probably one at a university or in a major city. They all
have web sites (librarians love the web) and usually have their catalogs online. But searching
a library catalog is not as simple as typing a few content words into Google. You probably need
to interface with one of those old-fashioned access nodes that are only available onsite -- the
ones with comprehensive heuristic and associative search features. I refer, of course, to
reference librarians.
That's very politically correct of you. You show a tendency common to most PC types -- Don't
let the facts get in the way of feel-good politics.
The Soviet Union didn't do very much independent computer design after the early 1960's.
Various Soviet agencies and front organizations obtained IBM, Burroughs and Sperry-Univac
mainframes and set up factories to manufacture spares and even a few backward-engineered
copies.
The Soviet Union did not embrace information technology. It was a society that was
essentially living in the 1930's. Heavy industry was the priority of the USSR, not
semiconductors.
If you looked on the desks of Soviet desk jockeys in the late 80's, you'd find most offices
to be non-computerized (like many western offices). The ones with computers had green
screens, IBM or Apple clones. Engineers had Intergraph or Apollo stuff.
I love the term "Politically Correct". It allows you to dismiss any difference of opinion
as a kneejerk reaction. Which is itself, of course, a kneejerk reaction.
(I once heard
Night of the Living Dead
condemned as "Politically Correct"
because the main character was black. Too typical.)
Look, I never said the Soviets never ripped off American technology. The US leads the
planet in this area. People imitate us. Well, duh. Go to the Sony web site sometime and
read that company's history. Their early attempts to reverse-engineer and manufacture
magnetic recording devices are quite amusing.
I'm no expert on the history of Soviet technology. But I do know enough to know that
saying "They never did anything with computers except rip off American designs" is
simplistic and stupid.
In point of fact, Soviet engineers in all areas were not able to
imitate Western technology as much as they would have liked. There were many reasons for
this, some obvious, some not. If you're really interested in the subject, go do some
actual reading. In any case, spare us the Clancy cliches.
The term "Politically Correct" in this context means that you are more concerned with
your notion of "fairness" towards the former Soviet Union than the facts.
You have further reinforced my assessment of your original post with your reply. You
suggest that i visit the Sony web site to learn about their early reverse-engineering
efforts, then admit that you know virtually nothing about Soviet technology. You then
assert (while posting in "Ask Slashdot") that we would all be better served by reading
printed books (that Tom Clancy didn't write) on the subject rather than asking people
on the web.
Maybe you should have taken a second to read my post. In that post I stated clearly
that the Soviets did have their own computer innovations until sometime in the 1960's. At
that point it was cheaper and easier for them to appropriate and/or copy Western
equipment. Technology as it applied to semiconductors just was not a priority.
Spare this forum your offtopic pseudo-intellectual rants and go away.
It's so paradoxical being PC. On the one hand, people assume you're so thoroughly
brainwashed that you can't think for yourself. On the other hand, they continue to
lecture you as if you were actually capable of rational thought!
Well, I can't sneer. Here I am arguing with a guy who enters the discussion with
the premise that nothing I say
can
make sense. Pretty futile, no?
But I love the way you put "fair" in quotes. In this context "fair" simply means
admitting that you don't know what you don't know. It means being skeptical about
your own prejudices and assumptions.
It might help if you separate out the issue of whether the Soviet system was
morally bankrupt and profoundly inefficient. Actually, that's not even an issue any
more -- almost everybody agrees that it was. But it doesn't follow from this fact
that Soviet technology consisted entirely of pathetic American rip offs. However
screwed up the state was, it had some brilliant citizens, and only a bigot would
dismiss their accomplishments out of hand.
You think I want the whole world to agree with me? What am I doing on
Slashdot then?
It's not the opinion that makes you a bigot. Bigotry can happen to
anybody, of any stripe. God knows I've caught myself in that mode often
enough.
The difference between disagreement and bigotry is the same as the
difference between having an honest difference of opinion and being
prejudiced. If you disagree with me because you find my arguments
uncompelling, then you're just someone with a different POV. That's fair
enough. It's even useful -- even if neither of us can admit he's wrong, at
least we can keep each other honest.
But if you start out assuming that whole groups of people are incapable of
saying or doing anything worth your notice, and sneer at anybody who suggests
otherwise, then you're a bigot.
The Soviet Union did not embrace information technology. It was a society that was
essentially living in the 1930's. Heavy industry was the priority of the USSR, not
semiconductors.
If you looked on the desks of Soviet desk jockeys in the late 80's, you'd find most
offices to be non-computerized (like many western offices). The ones with computers had
green screens, IBM or Apple clones. Engineers had Intergraph or Apollo stuff.
The USSR was indeed behind the West regarding advanced semiconductor technology,
but your anecdotal evidence can be misleading, since the Soviet economy was
sharply divided into a civilian part (which got almost nothing) and a military part which
had first priority.
So even though the standard USSR office was pen-and-paper, the military complex would
have access to much more advanced technology.
IMHO, Soviet military equipment from WWII until the eighties was often on par with, if
not better than, US equipment (especially missiles, tanks, infantry weapons, airplanes,
though perhaps not avionics).
OTOH, civilian USSR equipment was always decades behind what could be found in the West.
The truth isn't bigoted or ignorant.
I believe that a famous USSR newspaper was called "Pravda", meaning "The Truth"
;-).
In 1991 I was actually still using a BESM-6 computer, which was a completely original design (no
IBM copying at all). It was a 48-bit machine. It was faster than an IBM PS/2 at 12 MHz...
I remember a book called
Writing Bug Free Code
(yes, you all scoff, but this is for
real) written by a Russian computer scientist.
The basic premise was that he was using punch cards, and the actual computer on which he was
compiling and testing his programs was in a relatively distant city.
He would punch up a set of cards and mail them to where the computer was, which would take a
week or two. When they got around to it, they would compile his program and print out a test
run using input he gave them. This would take another week. The week or two return trip made
the average round trip take a month.
Now if you had to wait one month to find out that you had missed a semicolon, wouldn't
you
be more careful?
Now if you had to wait one month to find out that you had missed a semicolon,
wouldn't you be more careful?
Actually, that POV is not restricted to the former Proletarian Dictatorship. Most of my
early programming was done by punching FORTRAN and PL/1 code onto punched cards. I used to
stay up all night so I could submit my jobs when the turnaround was down to 15 minutes.
I had a FORTRAN textbook that said this was Very Bad, and not just because of lost sleep.
It urged students to think through their code before trying it. Do hand simulation. Read it
through with a friend. Later on I read books by people who insisted all software should be
"provably correct."
Now I work with Delphi and Kylix, which thoroughly encourages the cut-and-try approach.
Oh well.
Including functional front panels, paper tape and thoughts like "Wow, that 1200bps
cassette tape is fast!"
Used to do punch cards in PL/1 at school at least until I discovered the lab with vt-100s
in it, and made friends with an operator who showed me how to make the machine punch the
cards based on the source file that I had entered at the terminal.
;-)
Hello David, are you still out there?
Yeah, IBM really resisted interactive computing for a long time. Actually a good
thing, since it helped give companies like DEC and DG their shot. One way to do
without keypunches in IBM shops was to write a card-reader emulator!
Are we in nostalgia mode? Elsewhere on
/., somebody is asking for
help porting his RPG code to Linux. I seem to recall that RPG was little more than a
software emulator for an IBM accounting machine, which used plugboard programming to
process data on punched cards. Perhaps I misremember. Silly to invent a language for
something like that!
This quote is from page 15 of the OpenVMS at 20 publication that Digital published in 1997. The
PDF
[compaq.com] is available from Compaq.
During the cold war, VAX systems could not be sold behind the Iron Curtain. Recognizing
superior technology, technical people cloned VAX systems in Russia, Hungary, and China. After
learning that VAX systems were being cloned, DIGITAL had the following words etched on the CVAX
chip, "VAX...when you care enough to steal the very best."
Oh yeah, it's a great site... maybe I should have mentioned it
:)
I've been lucky to work at two places with good optical equipment
... mainly for PCB inspection/rework, so not quite the magnification at that site.
When I mysteriously blew up a FET in a hybrid package, I got to remove the top (a
welded-on metal top; none of the dice were potted inside) and see if it was over
voltage or over current that killed the part. At another facility, we had an X-ray machine
and a scanning electron microscope, neither of which I got to use
:(
Glorious new Soviet People's Dual Potato 3000! With advanced UVR (Ultra Root Vegetable(tm))
technology and many obedient clock cycles working for common good. Running Mikkelzoft Window
KGB. Own the means of production and experience many kilohertz of glorious revolution in the
People's progress today, comrade!
Adski_
/
NB. Before you complain, I must point out that as a Linux user myself, I am of course a fervent
communist.
I believe that the coolest invention the Russians ever made (concerning computers) was the
ternary computer. More appropriately, the balanced ternary computer.
It was a bit like our binary computers, but it had real potential, with the trigits having the
values of up, down and neutral. The computer was called SETUN; it remained experimental and was
never truly realized beyond the 60's.
If anyone has a link concerning SETUN, I'd be interested; so far my only source has been the
meager note in 'An Introduction to Cryptography', Mollin.
A
search on Google
[google.com] gives a number of interesting links, including:
photo
[icfcst.kiev.ua] at the European Museum on CS and Technology
article
[computer-museum.ru] (including bibliography) at the Virtual Computer Museum
discussion of ternary computing
[sigmaxi.org] at American Scientist
One of those indicated it was circa 1958.
There is an
article
[xbitlabs.com] on X-bit labs about Soviet supercomputers
Elbrus-1
,
Elbrus-2
and
Elbrus-3
, and their successor,
Elbrus-2000
:
The history of the world computer science is connected with the name of Elbrus. This
company was founded in Lebedev Institute of Precision Mechanics and Computing Equipment,
which team had been developing supercomputers for the Soviet Union's defense establishments
for over 40 years. E2K processor embodies the developing ideas of the Russian supercomputer
Elbrus-3 built in 1991. Today the Elbrus-3 architecture is referred to as EPIC (Explicitly Parallel
Instruction Computing).
According to Boris A. Babaian, chief architect of Elbrus supercomputers, superscalar
architecture was invented in Russia. To quote him as saying: "In 1978 we developed the
world's first superscalar computer, Elbrus-1. At present all Western superscalar processors
have just the same architecture. First Western superscalar processor appeared in 1992 while
ours - in 1978. Moreover, our variant of superscalar is analogous to Pentium Pro introduced
by Intel in 1995".
The historical priority of Elbrus is confirmed in the States as well. According to the
same article in Microprocessor Report by Keith Diefendorff, the developer of Motorola 88110
- one of the first western superscalar processors: "In 1978 almost 15 years ahead of Western
superscalar processors, Elbrus implemented a two-issue out-of-order processor with register
renaming and speculative execution".
I seem to remember that the only computer system ever built on trinary (base-3) logic was
produced in the Soviet Union. The name escapes me, but I think something like that is enough
to dispel the idea of them not doing any original research (good research, OTOH...).
I just noticed that kwertii lists 9-bit bytes as a "radically different concept", an example of
what Soviet computer architects might have considered. Worth mentioning that the 8-bit byte was
not always something you could take for granted. I can't think of any production machines, but
I seem to recall that Knuth's specification of his famous
MIX
[dannyreviews.com] machine (an imaginary computer he invented for teaching purposes)
doesn't require that bytes be implemented as 8-bit values. In fact, a programmer is not even
supposed to assume that a byte is a string of bits!
Before IBM introduced the byte concept back in the 60s, all computers used "word-level"
addressing. That meant that data path width and the addressable unit of data had to be the same
thing. Made it hard to write portable software. By divorcing the addressing scheme from the
data path width, IBM was able to design computers where differences in word size were a matter
of efficiency, not compatibility.
There was nothing to force manufacturers to use 8-bit bytes. (Unless, of course, they were
trying to rip off IBM's instruction set. A few did, but competing head-to-head with Big Blue
that way usually didn't work out.) On the one hand, the standard data terminal of the time used
a 7-bit character set. On the other hand, you can make a case for a
12-bit byte
[colorado.edu]. But IBM used an 8-bit byte, and in those days, what IBM did
tended to become a standard.
Bull-Honeywell's GCOS machines still use 9-bit bytes. C was designed to run on these
machines (Kernighan's
Programming in C
[lysator.liu.se] begins ``C is a computer language available on the
GCOS and UNIX operating systems...''). The size of various types is intentionally left
flexible to allow for these machines.
A 36-bit word on a machine with limited address space allows pointers to individual bits.
Those who do not know their own history are doomed to assume that it was lived only by
`backward' peoples?
I don't know a lot about these boxes, but information on the web seems to indicate that
"character oriented" means a very small word size. That would make sense -- the 1401 was
a popular business machine with only 16K of RAM. I presume it had a correspondingly small
word size -- like 8 bits?
But an 8-bit word and an 8-bit byte are not the same thing. With an 8-bit word you can
easily manipulate individual characters, but your ability to do numerical work is
extremely limited. If you need to do scientific computing, you have to go find a system
with a bigger word size -- and lose the ability to deal with character-size data easily.
Byte architecture eliminates this problem by divorcing data path width ("word size")
from the addressable data unit ("byte size").
I've heard we used to read the architecture of western silicon chips slice by slice.
Also there were many IBM and other boxes bought in, many of which were copied since there
wasn't enough money to by them for all the needs.
s/\bby\b/buy/;
And of course I'm not saying we didn't do any original research. The engineers were really
good, probably because education had really high standards. That's changed unfortunately, at
least here in Estonia with the adoption of international degrees.
Not sure if anyone can expand on this, but I thought that Bulgaria was the east-European
silicon valley? As mentioned already, the GDR also made some kit. I've read some material
describing Russians buying fairly advanced homegrown systems from Bulgaria; it's no secret that
they have a few virus authors there... so they certainly have some latent expertise. It's
long-suspected that Russian coding techniques were superior to those in the West, motivated by
the presence of less-powerful CPUs. Or was this a myth too?
A colleague of mine is of Slovak descent, and tells me one of the wildest dodges in the Bad Old
Days was CPUs with weird numbers of bits, like 28 bit words. It seems that it was illegal to
export 32 bit CPUs to the Eastern Bloc. But anything smaller was OK.
In
Wireless World
in the late 1980s there was a very good series of articles on
Eastern Bloc computing, including all the PDP-11 and S/360 clones that have been mentioned.
Sorry, I don't have the exact citation. Check your library.
Well, I don't know anything about the history of Russian/Soviet computing. However, I was over
there last summer, and found a computer store which had state-of-the-art peripherals for sale,
right alongside a bootleg copy of Windows 2000. In a bookstore, I found (and bought) a Russian
translation of Olaf Kirch's
Linux Network Administrator's Guide
(aka,
The NAG
[oreilly.com]). The text was Russian but the examples were all in the default language of
Linux, English.
The products in the computer store were selling for about the same as in America given the
exchange rate at the time (except for the Win2K which was ~USD13). When you consider that the
average Russian salary is USD2000-3000/yr, you aren't going to find many Russians online, at
least not at home. Businesses seem to be fairly up-to-date as far as technology goes, aside
from the mom-and-pop shops. Broadband internet access seems to be more myth than reality there.
Some of the posts here said that they were a couple of generations behind because they were just
copying American technology. It appears they're catching up.
Check out the
Robotron
[google.com] site, created in memory of the East German line of computers.
Pictures, manuals, and screenshots. (A PacMan clone!) Z80 clones, 8086 clones, CP/M clones,
etc.
Just after the Baltics broke away, I was visiting the University of
Latvia. I asked to see the computer facilities and was led to a room full of Norsk Data
text-based terminals with cyrillic keyboards. The displays were able to show both cyrillic and
roman characters. I do not, sadly, remember any specifics of the computer they were connected
to other than that it had a lot of wires hanging everywhere.
Norsk Data ("Norwegian Computers") designed fairly advanced 32-bit systems in the middle of
the 80's. I remember using them at my local university in Sweden. (Obviously the VAX 11/785
we had too was more exciting since it could run Hack under VMS Eunice).
Back then there was an export embargo on advanced computers to the Soviet union, which
basically meant that 32-bit computers couldn't be sold there. So they cut off 4 bits and
voila! had an exportable 28-bit computer (ND-505).
Maybe technically not a soviet machine, but still...
I seem to remember hearing something about the ICL almost managing to become the computer
supplier to the Soviet government, but this being blocked in the final stages by the British
government. I can't find anything to support this anywhere, however - does anyone out there
remember more of this than me?
"BESM"/"Elbrus" line -- originally developed.
"ES" Line -- clone of IBM 360 line
"Elektronika"/"SM" line -- clone of PDP-11 line, often with some creative changes
(high-density floppies, graphics controllers on a second PDP-11 CPU), then some VAXen
"DWK"/"UKNC" line -- same as "SM", but made as a desktop. "DWK" models 3 and 4 were
built as a single unit with terminal (keyboard was separate), "UKNC" was a very nice flat
box with builtin keyboard and extension connectors at the top, connected to a separate
monitor.
"BK-0010" -- can be described as a PDP-11 squeezed into Sinclair's case, everything was
in the keyboard, with TV output, tape recorder connector, and on some models a serial port.
"Elektronika-85" -- Dec Pro/350 clone. Was hated just as much as its prototype.
"ES-1840","Iskra-1030" lines -- IBM PC clones, usually with some changes. Appeared in
early 90's and soon were replaced by conventional PC clones.
"Radio-86RK","Specialist" -- hobbyist 8080-based boxes, never were mass-produced but
popular among various computer enthusiasts.
"Sinclair" clones
There were some others, however I have mentioned the most popular ones.
You can see that Larry Wall bought the OO paradigm "hook, line and sinker", and that was a very bad, IMHO disastrous, decision. There
were several areas where Perl 5 could more profitably have been extended, such as exceptions, coroutines and, especially, introducing types
of variables. He also did not realize that the Javascript prototype-based OO model is a much better implementation of OO than the Simula-67
model, and that Perl 5 modules do 80% of what is useful in classes (namely, they provide a separate namespace and the ability to share variables
in this namespace between several subroutines).
Notable quotes:
"... Perl 5 had this problem with "do" loops because they weren't real loops - they were a "do" block followed by a statement modifier, and people kept wanting to use loop control it them. Well, we can fix that. "loop" now is a real loop. And it allows a modifier on it but still behaves as a real loop. And so, do goes off to have other duties, and you can write a loop that tests at the end and it is a real loop. And this is just one of many many many things that confused new Perl 5 programmers. ..."
"... We have properties which you can put on variables and onto values. These are generalizations of things that were special code in Perl 5, but now we have general mechanisms to do the same things, they're actually done using a mix-in mechanism like Ruby. ..."
"... Smart match operators is, like Damian say, equal-tilda ("=~") on steroids. Instead of just allowing a regular expression on the right side it allows basically anything, and it figures out that this wants to do a numeric comparison, this wants to do a string comparison, this wants to compare two arrays, this wants to do a lookup in the hash; this wants to call the closure on the right passing in the left argument, and it will tell if you if $x can quack. Now that looks a little strange because you can just say "$x.can('quack')". Why would you do it this way? Well, you'll see. ..."
"If I wanted it fast, I'd write it in C" - That's almost a direct quote from the original awk page.
"I thought of a way to do it so it must be right" - That's obviously PHP. ( laughter and applause )
"You can build anything with NAND gates" - Any language designed by an electrical engineer. ( laughter )
"This is a very high level language, who cares about bits?" - The entire scope of fourth generation languages fell into this...
problem.
"Users care about elegance" - A lot of languages from Europe tend to fall into this. You know, Eiffel.
"The specification is good enough" - Ada.
"Abstraction equals usability" - Scheme. Things like that.
"The common kernel should be as small as possible" - Forth.
"Let's make this easy for the computer" - Lisp. ( laughter )
"Most programs are designed top-down" - Pascal. ( laughter )
"Everything is a vector" - APL.
"Everything is an object" - Smalltalk and its children. (whispered:) Ruby. ( laughter )
"Everything is a hypothesis" - Prolog. ( laughter )
"Everything is a function" - Haskell. ( laughter )
"Programmers should never have been given free will" - Obviously, Python. ( laughter )
So my psychological conjecture is that normal people, if they perceive that a computer language is forcing them to learn theory,
they won't like it. In other words, hide the fancy stuff. It can be there, just hide it.
Fan Mail (14:42)
Q: "Dear Larry, I love Perl. It has saved my company, my crew, my sanity and my marriage. After Perl I can't imagine going
back to any other language. I dream in Perl, I tell everyone else about Perl. How can you improve on perfection? Signed, Happy
in Haifa."
A: "Dear Happy,
You need to recognize that Perl can be good in some dimensions and not so good in other dimensions. You also need to recognize
that there will be some pain in climbing over or tunneling through the barrier to the true minimum."
Now Perl 5 has a few false minima. Syntax, semantics, pragmatics, ( laughter ), discourse structure, implementation, documentation,
culture... Other than that Perl 5 is not too bad.
Q: "Dear Larry,
You have often talked about the waterbed theory of linguistic complexity, and beauty times brains equals a constant. Isn't
it true that improving Perl in some areas will automatically make it worse in other areas? Signed, Terrified in Tel-Aviv."
A: "Dear Terrified,
...
No." ( laughter )
You see, you can make some things so they aren't any worse. For instance, we changed all the sigils to be more consistent, and
they're just the same length, they're just different. And you can make some things much better. Instead of having to write all this
gobbledygook to dereference references in Perl 5 you can just do it straight left to right in Perl 6. Or there's even more shortcuts,
so multidimensional arrays and constant hash subscripts get their own notation, so it's even clearer, at least once you've learned
it. Again, we're optimizing for expressiveness, not necessarily learnability.
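As a rough Perl 5 illustration of the dereferencing "gobbledygook" being contrasted here (the data structure and variable names are invented for the example; the Perl 6 shorthand itself is not shown):

    #!/usr/bin/perl
    # A minimal Perl 5 sketch: getting at data through a reference the long way
    # and with the arrow shortcut.  Only the Perl 5 side of the comparison is shown.
    use strict;
    use warnings;

    my %config = ( servers => [ 'alpha', 'beta', 'gamma' ] );   # hypothetical data

    my @all   = @{ $config{servers} };       # dereference the whole array
    my $first = ${ $config{servers} }[0];    # one element, the "gobbledygook" form
    my $same  = $config{servers}->[0];       # the arrow form reads left to right
    my $also  = $config{servers}[0];         # arrows between subscripts can be omitted

    print "$first $same $also\n";            # prints: alpha alpha alpha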
Q: "Dear Larry,
I've heard a disturbing rumor that Perl 6 is turning into Java, or Python, or (whispered:) Ruby, or something. What's the
point of using Perl if it's just another object-oriented language? Why are we changing the arrow operator to the dot operator?
Signed, Nervous in Netanya."
A: "Dear Nervous,
First of all, we can do object orientation better without making other things worse. As I said. Now, we're changing from
arrow to dot, because ... because ... Well, just 'cuz I said so!"
You know, actually, we do have some good reasons - it's shorter, it's the industry standard, I wanted the arrow for something
else, and I wanted the dot as a secondary sigil. Now we can have it for attributes that have accessors. I also wanted the unary dot
for topical type calls, with an assumed object on the left and finally, because I said so. Darn it.
... ... ...
No arbitrary limits round two : Perl started off with the idea that strings should grow infinitely, if you have memory.
Just let's get rid of those arbitrary limits that plagued Unix utilities in the early years. Perl 6 is taking this in a number of
different dimensions than just how long your strings are. No arbitrary limits - you ought to be able to program very abstractly,
you ought to be able to program very concretely - that's just one dimension.
... .. ...
Perl 5 is just all full of these strange gobbledygooky variables which we all know and love - and hate. So the error variables
are now unified into a single error variable. These variables have been deprecated forever, they're gone! These weird things that
just drive syntax highlighters nuts ( laughter ) now actually have more regular names. The star there, $*GID, that's what
we call a secondary sigil, what that just says is this is in the global namespace. So we know that that's a global variable for the
entire process. Similarly for uids.
... ... ...
Perl 5 had this problem with "do" loops because they weren't real loops - they were a "do" block followed by a statement modifier,
and people kept wanting to use loop control in them. Well, we can fix that. "loop" now is a real loop. And it allows a modifier on
it but still behaves as a real loop. And so, do goes off to have other duties, and you can write a loop that tests at the end and
it is a real loop. And this is just one of many many many things that confused new Perl 5 programmers.
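A small Perl 5 sketch of the "do" problem described above (the counter and loop body are made up for the example): do BLOCK with a trailing while is not a real loop, so loop-control keywords such as last do not work inside it, and the usual workaround is a real loop.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $n = 0;

    # This looks like a loop, but it is a "do" block plus a statement modifier,
    # so "last" inside it dies with "Can't 'last' outside a loop block":
    # do { last if ++$n > 3; print "$n\n" } while $n < 10;

    # The common Perl 5 workaround is to use a real loop instead:
    while (1) {
        last if ++$n > 3;
        print "$n\n";                        # prints 1, 2, 3
    }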
... ... ...
Perl 5, another place where it was too orthogonal - we defined parameter passing to just come in as an array. You know arrays,
subroutines - they're just orthogonal. You just happen to have one called @_, which your parameters come in, and it was wonderfully
orthogonal, and people built all sorts of stuff on top of it, and it's another place where we are changing.
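For readers who have not seen the Perl 5 convention being referred to, here is a minimal sketch (the subroutine and its arguments are invented for the example): parameters simply arrive in the flat array @_, and unpacking them is left entirely to the subroutine.

    #!/usr/bin/perl
    use strict;
    use warnings;

    sub greet {
        my ($name, $greeting) = @_;          # conventional unpacking of @_
        $greeting //= 'Hello';               # default value (// needs Perl 5.10+)
        return "$greeting, $name!";
    }

    print greet('world'), "\n";              # Hello, world!
    print greet('Larry', 'Howdy'), "\n";     # Howdy, Larry!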
... .. ...
Likewise, if you turn them inside out - the french quotes - you can use the regular angle brackets, and yes, we did change here-docs
so it does not conflict, then that's the equivalent of "qw". This qw interpolates, with single-angles it does not interpolate - that
is the exact "qw".
We have properties which you can put on variables and onto values. These are generalizations of things that were special code
in Perl 5, but now we have general mechanisms to do the same things, they're actually done using a mix-in mechanism like Ruby.
The smart match operator is, like Damian says, equal-tilde ("=~") on steroids. Instead of just allowing a regular expression on the
right side it allows basically anything, and it figures out that this wants to do a numeric comparison, this wants to do a string
comparison, this wants to compare two arrays, this wants to do a lookup in the hash; this wants to call the closure on the right
passing in the left argument, and it will tell you if $x can quack. Now that looks a little strange because you can just say "$x.can('quack')".
Why would you do it this way? Well, you'll see.
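Perl 5.10 later grew a limited version of this operator, so a rough Perl 5 analogue of the idea can be sketched as follows (the data is invented; note that in much later Perl 5 releases smartmatch was marked experimental, so this is a 5.10-era sketch, not the full Perl 6 behavior):

    #!/usr/bin/perl
    # Requires Perl 5.10 or later for the ~~ operator.
    use strict;
    use warnings;

    my @admins = qw(larry damian randal);
    my $user   = 'damian';

    # ~~ picks a comparison based on its operands: here, array membership.
    print "known admin\n" if $user ~~ @admins;

    # Against a compiled regex on the right, it behaves like a pattern match.
    print "looks like a word\n" if $user ~~ qr/^\w+$/;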
... ... ..
There's a lot of cruft that we inherited from the UNIX culture and we added more cruft, and we're cleaning it up. So in Perl 5
we made the mistake of interpreting regular expressions as strings, which means we had to do weird things like back-references are
\1 on the left, but they're $1 on the right, even though it means the same thing. In Perl 6, because it's just a language, (an embedded
language) $1 is the back-reference. It does not automatically interpolate this $1 from what it was before. You can also get it translated
to Euros I guess.
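The \1-versus-$1 wart mentioned above is easy to show in plain Perl 5 (the sample string is made up):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $text = 'hello hello world';

    # Inside the pattern the first capture is referred to as \1 ...
    if ($text =~ /\b(\w+) \1\b/) {
        # ... but outside the pattern the same capture is $1.
        print "doubled word: $1\n";          # doubled word: hello
    }

    # In a substitution the replacement side also uses $1, not \1.
    (my $fixed = $text) =~ s/\b(\w+) \1\b/$1/;
    print "$fixed\n";                        # hello world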
Perl is a uniquely complex, non-orthogonal language, and due to this it has a unique level of
expressiveness.
Also, the complexity of Perl to a large extent reflects the complexity of the Perl environment
(which was the Unix environment at the beginning, but now also includes the Windows environment
with its quirks).
Notable quotes:
"... On a syntactic level, in the particular case of Perl, I placed variable names in a separate namespace from reserved words. That's one of the reasons there are funny characters on the front of variable names -- dollar signs and so forth. That allowed me to add new reserved words without breaking old programs. ..."
"... A script is something that is easy to tweak, and a program is something that is locked in. There are all sorts of metaphorical tie-ins that tend to make programs static and scripts dynamic, but of course, it's a continuum. You can write Perl programs, and you can write C scripts. People do talk more about Perl programs than C scripts. Maybe that just means Perl is more versatile. ..."
"... A good language actually gives you a range, a wide dynamic range, of your level of discipline. We're starting to move in that direction with Perl. The initial Perl was lackadaisical about requiring things to be defined or declared or what have you. Perl 5 has some declarations that you can use if you want to increase your level of discipline. But it's optional. So you can say "use strict," or you can turn on warnings, or you can do various sorts of declarations. ..."
"... But Perl was an experiment in trying to come up with not a large language -- not as large as English -- but a medium-sized language, and to try to see if, by adding certain kinds of complexity from natural language, the expressiveness of the language grew faster than the pain of using it. And, by and large, I think that experiment has been successful. ..."
"... If you used the regular expression in a list context, it will pass back a list of the various subexpressions that it matched. A different computer language may add regular expressions, even have a module that's called Perl 5 regular expressions, but it won't be integrated into the language. You'll have to jump through an extra hoop, take that right angle turn, in order to say, "Okay, well here, now apply the regular expression, now let's pull the things out of the regular expression," rather than being able to use the thing in a particular context and have it do something meaningful. ..."
"... A language is not a set of syntax rules. It is not just a set of semantics. It's the entire culture surrounding the language itself. So part of the cultural context in which you analyze a language includes all the personalities and people involved -- how everybody sees the language, how they propagate the language to other people, how it gets taught, the attitudes of people who are helping each other learn the language -- all of this goes into the pot of context. ..."
"... In the beginning, I just tried to help everybody. Particularly being on USENET. You know, there are even some sneaky things in there -- like looking for people's Perl questions in many different newsgroups. For a long time, I resisted creating a newsgroup for Perl, specifically because I did not want it to be ghettoized. You know, if someone can say, "Oh, this is a discussion about Perl, take it over to the Perl newsgroup," then they shut off the discussion in the shell newsgroup. If there are only the shell newsgroups, and someone says, "Oh, by the way, in Perl, you can solve it like this," that's free advertising. So, it's fuzzy. We had proposed Perl as a newsgroup probably a year or two before we actually created it. It eventually came to the point where the time was right for it, and we did that. ..."
"... For most web applications, Perl is severely underutilized. Your typical CGI script says print, print, print, print, print, print, print. But in a sense, it's the dynamic range of Perl that allows for that. You don't have to say a whole lot to write a simple Perl script, whereas your minimal Java program is, you know, eight or ten lines long anyway. Many of the features that made it competitive in the UNIX space will make it competitive in other spaces. ..."
"... Over the years, much of the work of making Perl work for people has been in designing ways for people to come to Perl. I actually delayed the first version of Perl for a couple of months until I had a sed-to-Perl and an awk-to-Perl translator. One of the benefits of borrowing features from various other languages is that those subsets of Perl that use those features are familiar to people coming from that other culture. What would be best, in my book, is if someone had a way of saying, "Well, I've got this thing in Visual Basic. Now, can I just rewrite some of these things in Perl?" ..."
The creator of Perl talks about language design and Perl. By Eugene Eric
Kim
DDJ : Is Perl 5.005 what you envisioned Perl to be when you set out to do
it?
LW: That assumes that I'm smart enough to envision something as complicated as Perl.
I knew that Perl would be good at some things, and would be good at more things as time went
on. So, in a sense, I'm sort of blessed with natural stupidity -- as opposed to artificial
intelligence -- in the sense that I know what my intellectual limits are.
I'm not one of these people who can sit down and design an entire system from scratch and
figure out how everything relates to everything else, so I knew from the start that I had to
take the bear-of-very-little-brain approach, and design the thing to evolve. But that fit in
with my background in linguistics, because natural languages evolve over time.
You can apply biological metaphors to languages. They move into niches, and as new needs
arise, languages change over time. It's actually a practical way to design a computer language.
Not all computer programs can be designed that way, but I think more can be designed that way
than have been. A lot of the majestic failures that have occurred in computer science have been
because people thought they could design the whole thing in advance.
DDJ : How do you design a language to evolve?
LW: There are several aspects to that, depending on whether you are talking about
syntax or semantics. On a syntactic level, in the particular case of Perl, I placed
variable names in a separate namespace from reserved words. That's one of the reasons there are
funny characters on the front of variable names -- dollar signs and so forth. That allowed me
to add new reserved words without breaking old programs.
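A tiny Perl 5 sketch of this point (the variable names are deliberately chosen to collide with keywords): because every variable carries a sigil, its name lives in a different namespace from reserved words, so later keywords never break it.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $if     = 'still fine';     # $if never collides with the "if" keyword
    my @return = (1, 2, 3);        # nor does @return with "return"
    my %state  = (count => 1);     # "state" became a keyword later (5.10's
                                   # feature set), yet %state keeps working

    print "$if / @return / $state{count}\n" if $if;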
DDJ : What is a scripting language? Does Perl fall into the category of a
scripting language?
LW: Well, being a linguist, I tend to go back to the etymological meanings of
"script" and "program," though, of course, that's fallacious in terms of what they mean
nowadays. A script is what you hand to the actors, and a program is what you hand to the
audience. Now hopefully, the program is already locked in by the time you hand that out,
whereas the script is something you can tinker with. I think of phrases like "following the
script," or "breaking from the script." The notion that you can evolve your script ties into
the notion of rapid prototyping.
A script is something that is easy to tweak, and a program is something that is locked
in. There are all sorts of metaphorical tie-ins that tend to make programs static and scripts
dynamic, but of course, it's a continuum. You can write Perl programs, and you can write C
scripts. People do talk more about Perl programs than C scripts. Maybe that just means Perl is
more versatile.
... ... ...
DDJ : Would that be a better distinction than interpreted versus compiled --
run-time versus compile-time binding?
LW: It's a more useful distinction in many ways because, with late-binding languages
like Perl or Java, you cannot make up your mind about what the real meaning of it is until the
last moment. But there are different definitions of what the last moment is. Computer
scientists would say there are really different "latenesses" of binding.
A good language actually gives you a range, a wide dynamic range, of your level of
discipline. We're starting to move in that direction with Perl. The initial Perl was
lackadaisical about requiring things to be defined or declared or what have you. Perl 5 has
some declarations that you can use if you want to increase your level of discipline. But it's
optional. So you can say "use strict," or you can turn on warnings, or you can do various sorts
of declarations.
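A minimal sketch of the optional discipline Wall describes, using an invented variable: the same typo is silently accepted in the loose style and rejected once the declarations are turned on.

    #!/usr/bin/perl
    # Loose style -- legal Perl, but a typo creates a fresh, empty global:
    #   $total = 10;
    #   print $totla + 5, "\n";    # prints 5; no complaint about $totla

    # Stricter style -- opt in to more discipline:
    use strict;
    use warnings;

    my $total = 10;
    print $total + 5, "\n";        # 15
    # print $totla + 5, "\n";      # would now fail at compile time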
DDJ : Would it be accurate to say that Perl doesn't enforce good design?
LW: No, it does not. It tries to give you some tools to help if you want to do that, but I'm
a firm believer that a language -- whether it's a natural language or a computer language --
ought to be an amoral artistic medium.
You can write pretty poems or you can write ugly poems, but that doesn't say whether English
is pretty or ugly. So, while I kind of like to see beautiful computer programs, I don't think
the chief virtue of a language is beauty. That's like asking an artist whether they use
beautiful paints and a beautiful canvas and a beautiful palette. A language should be a medium
of expression, which does not restrict your feeling unless you ask it to.
DDJ : Where does the beauty of a program lie? In the underlying algorithms, in the
syntax of the description?
LW: Well, there are many different definitions of artistic beauty. It can be argued
that it's symmetry, which in a computer language might be considered orthogonality. It's also
been argued that broken symmetry is what is considered most beautiful and most artistic and
diverse. Symmetry breaking is the root of our whole universe according to physicists, so if God
is an artist, then maybe that's his definition of what beauty is.
This actually ties back in with the built-to-evolve concept on the semantic level. A lot of
computer languages were defined to be naturally orthogonal, or at least the computer scientists
who designed them were giving lip service to orthogonality. And that's all very well if you're
trying to define a position in a space. But that's not how people think. It's not how natural
languages work. Natural languages are not orthogonal, they're diagonal. They give you
hypotenuses.
Suppose you're flying from California to Quebec. You don't fly due east, and take a left
turn over Nashville, and then go due north. You fly straight, more or less, from here to there.
And it's a network. And it's actually sort of a fractal network, where your big link is
straight, and you have little "fractally" things at the end for your taxi and bicycle and
whatever the mode of transport you use. Languages work the same way. And they're designed to
get you most of the way here, and then have ways of refining the additional shades of
meaning.
When they first built the University of California at Irvine campus, they just put the
buildings in. They did not put any sidewalks, they just planted grass. The next year, they came
back and built the sidewalks where the trails were in the grass. Perl is that kind of a
language. It is not designed from first principles. Perl is those sidewalks in the grass. Those
trails that were there before were the previous computer languages that Perl has borrowed ideas
from. And Perl has unashamedly borrowed ideas from many, many different languages. Those paths
can go diagonally. We want shortcuts. Sometimes we want to be able to do the orthogonal thing,
so Perl generally allows the orthogonal approach also. But it also allows a certain number of
shortcuts, and being able to insert those shortcuts is part of that evolutionary thing.
I don't want to claim that this is the only way to design a computer language, or that
everyone is going to actually enjoy a computer language that is designed in this way.
Obviously, some people speak other languages. But Perl was an experiment in trying to come
up with not a large language -- not as large as English -- but a medium-sized language, and to
try to see if, by adding certain kinds of complexity from natural language, the expressiveness
of the language grew faster than the pain of using it. And, by and large, I think that
experiment has been successful.
DDJ : Give an example of one of the things you think is expressive about Perl that
you wouldn't find in other languages.
LW: The fact that regular-expression parsing and the use of regular expressions is
built right into the language. If you used the regular expression in a list context, it
will pass back a list of the various subexpressions that it matched. A different computer
language may add regular expressions, even have a module that's called Perl 5 regular
expressions, but it won't be integrated into the language. You'll have to jump through an extra
hoop, take that right angle turn, in order to say, "Okay, well here, now apply the regular
expression, now let's pull the things out of the regular expression," rather than being able to
use the thing in a particular context and have it do something meaningful.
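A short Perl 5 example of the point above (the address string is invented): in list context the match hands back its captures directly, with no separate extraction step.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $line = 'larry@example.org';

    # List context: the captured subexpressions come straight back as a list.
    my ($user, $host) = $line =~ /^(\w+)\@([\w.]+)$/;
    print "user=$user host=$host\n" if defined $user;

    # The same match in scalar context just reports success, and the captures
    # are then picked up from $1, $2, ...
    if ($line =~ /^(\w+)\@([\w.]+)$/) {
        print "matched; first capture is $1\n";
    }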
The school of linguistics I happened to come up through is called tagmemics, and it makes a
big deal about context. In a real language -- this is a tagmemic idea -- you can distinguish
between what the conventional meaning of the "thing" is and how it's being used. You think of
"dog" primarily as a noun, but you can use it as a verb. That's the prototypical example, but
the "thing" applies at many different levels. You think of a sentence as a sentence.
Transformational grammar was built on the notion of analyzing a sentence. And they had all
their cute rules, and they eventually ended up throwing most of them back out again.
But in the tagmemic view, you can take a sentence as a unit and use it differently. You can
say a sentence like, "I don't like your I-can-use-anything-like-a-sentence attitude." There,
I've used the sentence as an adjective. The sentence isn't an adjective if you analyze it, any
way you want to analyze it. But this is the way people think. If there's a way to make sense of
something in a particular context, they'll do so. And Perl is just trying to make those things
make sense. There's the basic distinction in Perl between singular and plural context -- call
it list context and scalar context, if you will. But you can use a particular construct in a
singular context that has one meaning that sort of makes sense using the list context, and it
may have a different meaning that makes sense in the plural context.
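The singular/plural distinction can be sketched with a couple of stock Perl 5 idioms (the word list is invented): the same expression yields a count in scalar context and elements in list context.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my @words = qw(natural languages are diagonal);

    my $count   = @words;          # scalar (singular) context: element count
    my ($first) = @words;          # list (plural) context: the first element
    print "$count words, starting with '$first'\n";

    # localtime is another classic: a readable string in scalar context,
    # a list of numeric fields in list context.
    my $stamp = localtime;
    my ($sec, $min, $hour) = localtime;
    print "$stamp -> $hour:$min:$sec\n";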
That is where the expressiveness comes from. In English, you read essays by people who say,
"Well, how does this metaphor thing work?" Owen Barfield talks about this. You say one thing
and mean another. That's how metaphors arise. Or you take two things and jam them together. I
think it was Owen Barfield, or maybe it was C.S. Lewis, who talked about "a piercing
sweetness." And we know what "piercing" is, and we know what "sweetness" is, but you put those
two together, and you've created a new meaning. And that's how languages ought to work.
DDJ : Is a more expressive language more difficult to learn?
LW: Yes. It was a conscious tradeoff at the beginning of Perl that it would be more
difficult to master the whole language. However, taking another clue from a natural language,
we do not require 5-year olds to speak with the same diction as 50-year olds. It is okay for
you to use the subset of a language that you are comfortable with, and to learn as you go. This
is not true of so many computer-science languages. If you program C++ in a subset that
corresponds to C, you get laughed out of the office.
There's a whole subject that we haven't touched here. A language is not a set of syntax
rules. It is not just a set of semantics. It's the entire culture surrounding the language
itself. So part of the cultural context in which you analyze a language includes all the
personalities and people involved -- how everybody sees the language, how they propagate the
language to other people, how it gets taught, the attitudes of people who are helping each
other learn the language -- all of this goes into the pot of context.
Because I had already put out other freeware projects (rn and patch), I realized before I
ever wrote Perl that a great deal of the value of those things was from collaboration. Many of
the really good ideas in rn and Perl came from other people.
I think that Perl is in its adolescence right now. There are places where it is grown up,
and places where it's still throwing tantrums. I have a couple of teenagers, and the thing you
notice about teenagers is that they're always plus or minus ten years from their real age. So
if you've got a 15-year old, they're either acting 25 or they're acting 5. Sometimes
simultaneously! And Perl is a little that way, but that's okay.
DDJ : What part of Perl isn't quite grown up?
LW: Well, I think that the part of Perl, which has not been realistic up until now
has been on the order of how you enable people in certain business situations to actually use
it properly. There are a lot of people who cannot use freeware because it is, you know,
schlocky. Their bosses won't let them, their government won't let them, or they think their
government won't let them. There are a lot of people who, unknown to their bosses or their
government, are using Perl.
DDJ : So these aren't technical issues.
LW: I suppose it depends on how you define technology. Some of it is perceptions,
some of it is business models, and things like that. I'm trying to generate a new symbiosis
between the commercial and the freeware interests. I think there's an artificial dividing line
between those groups and that they could be more collaborative.
As a linguist, the generation of a linguistic culture is a technical issue. So, these
adjustments we might make in people's attitudes toward commercial operations or in how Perl is
being supported, distributed, advertised, and marketed -- not in terms of trying to make bucks,
but just how we propagate the culture -- these are technical ideas in the psychological and the
linguistic sense. They are, of course, not technical in the computer-science sense. But I think
that's where Perl has really excelled -- its growth has not been driven solely by technical
merits.
DDJ : What are the things that you do when you set out to create a culture around
the software that you write?
LW: In the beginning, I just tried to help everybody. Particularly being on
USENET. You know, there are even some sneaky things in there -- like looking for people's Perl
questions in many different newsgroups. For a long time, I resisted creating a newsgroup for
Perl, specifically because I did not want it to be ghettoized. You know, if someone can say,
"Oh, this is a discussion about Perl, take it over to the Perl newsgroup," then they shut off
the discussion in the shell newsgroup. If there are only the shell newsgroups, and someone
says, "Oh, by the way, in Perl, you can solve it like this," that's free advertising. So, it's
fuzzy. We had proposed Perl as a newsgroup probably a year or two before we actually created
it. It eventually came to the point where the time was right for it, and we did that.
DDJ : Perl has really been pigeonholed as a language of the Web. One result is
that people mistakenly try to compare Perl to Java. Why do you think people make the comparison
in the first place? Is there anything to compare?
LW: Well, people always compare everything.
DDJ : Do you agree that Perl has been pigeonholed?
LW: Yes, but I'm not sure that it bothers me. Before it was pigeonholed as a web
language, it was pigeonholed as a system-administration language, and I think that -- this goes
counter to what I was saying earlier about marketing Perl -- if the abilities are there to do a
particular job, there will be somebody there to apply it, generally speaking. So I'm not too
worried about Perl moving into new ecological niches, as long as it has the capability of
surviving in there.
Perl is actually a scrappy language for surviving in a particular ecological niche. (Can you
tell I like biological metaphors?) You've got to understand that it first went up against C and
against shell, both of which were much loved in the UNIX community, and it succeeded against
them. So that early competition actually makes it quite a fit competitor in many other realms,
too.
For most web applications, Perl is severely underutilized. Your typical CGI script says
print, print, print, print, print, print, print. But in a sense, it's the dynamic range of Perl
that allows for that. You don't have to say a whole lot to write a simple Perl script, whereas
your minimal Java program is, you know, eight or ten lines long anyway. Many of the features
that made it competitive in the UNIX space will make it competitive in other spaces.
Now, there are things that Perl can't do. One of the things that you can't do with Perl
right now is compile it down to Java bytecode. And if that, in the long run, becomes a large
ecological niche (and this is not yet a sure thing), then that is a capability I want to be
certain that Perl has.
DDJ : There's been a movement to merge the two development paths between the
ActiveWare Perl for Windows and the main distribution of Perl. You were talking about
ecological niches earlier, and how Perl started off as a text-processing language. The
scripting languages that are dominant on the Microsoft platforms -- like VB -- tend to be more
visual than textual. Given Perl's UNIX origins -- awk, sed, and C, for that matter -- do you
think that Perl, as it currently stands, has the tools to fit into a Windows niche?
LW: Yes and no. It depends on your problem domain and who's trying to solve the
problem. There are problems that only need a textual solution or don't need a visual solution.
Automation things of certain sorts don't need to interact with the desktop, so for those sorts
of things -- and for the programmers who aren't really all that interested in visual
programming -- it's already good for that. And people are already using it for that. Certainly,
there is a group of people who would be enabled to use Perl if it had more of a visual
interface, and one of the things we're talking about doing for the O'Reilly NT Perl Resource
Kit is some sort of a visual interface.
A lot of what Windows is designed to do is to get mere mortals from 0 to 60, and there are
some people who want to get from 60 to 100. We are not really interested in being in
Microsoft's crosshairs. We're not actually interested in competing head-to-head with Visual
Basic, and to the extent that we do compete with them, it's going to be kind of subtle. There
has to be some way to get people from the slow lane to the fast lane. It's one thing to give
them a way to get from 60 to 100, but if they have to spin out to get from the slow lane to the
fast lane, then that's not going to work either.
Over the years, much of the work of making Perl work for people has been in designing
ways for people to come to Perl. I actually delayed the first version of Perl for a couple of
months until I had a sed-to-Perl and an awk-to-Perl translator. One of the benefits of
borrowing features from various other languages is that those subsets of Perl that use those
features are familiar to people coming from that other culture. What would be best, in my book,
is if someone had a way of saying, "Well, I've got this thing in Visual Basic. Now, can I just
rewrite some of these things in Perl?"
We're already doing this with Java. On our UNIX Perl Resource Kit, I've got a hybrid
language called "jpl" -- that's partly a pun on my old alma mater, Jet Propulsion Laboratory,
and partly for Java, Perl...Lingo, there we go! That's good. "Java Perl Lingo." You've heard it
first here! jpl lets you take a Java program and magically turn one of the methods into a chunk
of Perl right there inline. It turns Perl code into a native method, and automates the linkage
so that when you pull in the Java code, it also pulls in the Perl code, and the interpreter,
and everything else. It's actually calling out from Java's Virtual Machine into Perl's virtual
machine. And we can call in the other direction, too. You can embed Java in Perl, except that
there's a bug in JDK having to do with threads that prevents us from doing any I/O. But that's
Java's problem.
It's a way of letting somebody evolve from a purely Java solution into, at least partly, a
Perl solution. It's important not only to make Perl evolve, but to make it so that people can
evolve their own programs. It's how I program, and I think a lot of people program that way.
Most of us are too stupid to know what we want at the beginning.
DDJ : Is there hope down the line to present Perl to a standardization
body?
LW: Well, I have said in jest that people will be free to standardize Perl when I'm
dead. There may come a time when that is the right thing to do, but it doesn't seem appropriate
yet.
DDJ : When would that time be?
LW: Oh, maybe when the federal government declares that we can't export Perl unless
it's standardized or something.
DDJ : Only when you're forced to, basically.
LW: Yeah. To me, once things get to a standards body, it's not very interesting
anymore. The most efficient form of government is a benevolent dictatorship. I remember walking
into some BOF that USENIX held six or seven years ago, and John Quarterman was running it, and
he saw me sneak in, sit in the back corner, and he said, "Oh, here comes Larry Wall! He's a
standards committee all of his own!"
A great deal of the success of Perl so far has been based on some of my own idiosyncrasies.
And I recognize that they are idiosyncrasies, and I try to let people argue me out of them
whenever appropriate. But there are still ways of looking at things that I seem to do
differently than anybody else. It may well be that perl5-porters will one day degenerate into a
standards committee. So far, I have not abused my authority to the point that people have
written me off, and so I am still allowed to exercise a certain amount of absolute power over
the Perl core.
I just think headless standards committees tend to reduce everything to mush. There is a
conservatism that committees have that individuals don't, and there are times when you want to
have that conservatism and times you don't. I try to exercise my authority where we don't want
that conservatism. And I try not to exercise it at other times.
DDJ : How did you get involved in computer science? You're a linguist by
background?
LW: Because I talk to computer scientists more than I talk to linguists, I wear the
linguistics mantle more than I wear the computer-science mantle, but they actually came along
in parallel, and I'm probably a 50/50 hybrid. You know, basically, I'm no good at either
linguistics or computer science.
DDJ : So you took computer-science courses in college?
LW: In college, yeah. In college, I had various majors, but what I eventually
graduated in -- I'm one of those people that packed four years into eight -- what I eventually
graduated in was a self-constructed major, and it was Natural and Artificial Languages, which
seems positively prescient considering where I ended up.
DDJ : When did you join O'Reilly as a salaried employee? And how did that come
about?
LW: A year-and-a-half ago. It was partly because my previous job was kind of winding
down.
DDJ : What was your previous job?
LW: I was working for Seagate Software. They were shutting down that branch of
operations there. So, I was just starting to look around a little bit, and Tim noticed me
looking around and said, "Well, you know, I've wanted to hire you for a long time," so we
talked. And Gina Blaber (O'Reilly's software director) and I met. So, they more or less offered
to pay me to mess around with Perl.
So it's sort of my dream job. I get to work from home, and if I feel like taking a nap in
the afternoon, I can take a nap in the afternoon and work all night.
DDJ : Do you have any final comments, or tips for aspiring programmers? Or
aspiring Perl programmers?
LW: Assume that your first idea is wrong, and try to think through the various
options. I think that the biggest mistake people make is latching onto the first idea that
comes to them and trying to do that. It really comes to a thing that my folks taught me about
money. Don't buy something unless you've wanted it three times. Similarly, don't throw in a
feature when you first think of it. Think if there's a way to generalize it, think if it should
be generalized. Sometimes you can generalize things too much. I think the things in Scheme
were generalized too much. There is a level of abstraction beyond which people don't want to
go. Take a good look at what you want to do, and try to come up with the long-term lazy way,
not the short-term lazy way.
Knuth: Yeah. That's absolutely true. I've got to get another thought out of my mind though.
That is, early on in the TeX project I also had to do programming of a completely different
type. I told you last week that this was my first real exercise in structured programming,
which was one of Dijkstra's huge... That's one of the few breakthroughs in the history of
computer science, in a way. He was actually responsible for maybe two of the ten that I
know.
So I'm doing structured programming as I'm writing TeX. I'm trying to do it right, the way I
should've been writing programs in the 60s. Then I also got this typesetting machine, which
had, inside of it, a tiny 8080 chip or something. I'm not sure exactly. It was a Zilog, or some
very early Intel chip. Way before the 386s. A little computer with 8-bit registers and a small
number of things it could do. I had to write my own assembly language for this, because the
existing software for writing programs for this little micro thing was so bad. I had to write
actually thousands of lines of code for this, in order to control the typesetting. Inside the
machine I had to control a stepper motor, and I had to accelerate it.
Every so often I had to give another [command] saying, "Okay, now take a step," and then
continue downloading a font from the mainframe.
I had six levels of interrupts in this program. I remember talking to you at this time,
saying, "Ed, I'm programming in assembly language for an 8-bit computer," and you said "Yeah,
you've been doing the same thing and it's fun again."
You know, you'll remember. We'll undoubtedly talk more about that when I have my turn
interviewing you in a week or so. This is another aspect of programming: that you also feel
that you're in control and that there's not a black box separating you. It's not only the
power, but it's the knowledge of what's going on; that nobody's hiding something. It's also
this aspect of jumping levels of abstraction. In my opinion, the thing that computer scientists
are best at is seeing things at many levels of detail: high level, intermediate levels, and
lowest levels. I know if I'm adding 1 to a certain number, that this is getting me towards some
big goal at the top. People enjoy most the things that they're good at. Here's a case where if
you're working on a machine that has only this 8-bit capability, but in order to do this you
have to go through levels, of not only that machine, but also to the next level up of the
assembler, and then you have a simulator in which you can help debug your programs, and you
have higher level languages that go through, and then you have the typesetting at the top.
There are these six or seven levels all present at the same time. A computer scientist is in
heaven in a situation like this.
Feigenbaum: Don, to get back, I want to ask you about that as part of the next
question. You went back into programming in a really serious way. It took you, as I said
before, ten years, not one year, and you didn't quit. As soon as you mastered one part of it,
you went into Metafont, which is another big deal. To what extent were you doing that because
you needed to, what I might call expose yourself to, or upgrade your skills in, the art that
had emerged over the decade-and-a-half since you had done RUNCIBLE? And to what extent did you
do it just because you were driven to be a programmer? You loved programming.
Knuth: Yeah. I think your hypothesis is good. It didn't occur to me at the time that
I just had to program in order to be a happy man. Certainly I didn't find my other roles
distasteful, except for fundraising. I enjoyed every aspect of being a professor except dealing
with proposals, which I did my share of, but that was a necessary evil sort of in my own
thinking, I guess. But the fact is that now I'm still compelled to: I wake up in the morning with
an idea, and it makes my day to think of adding a couple of lines to my program. Gives me a
real high. It must be the way poets feel, or musicians and so on, and other people, painters,
whatever. Programming does that for me. It's certainly true. But the fact that I had to put so
much time in it was not totally that, I'm sure, because it became a responsibility. It wasn't
just for Phyllis and me, as it turned out. I started working on it at the AI lab, and people
were looking at the output coming out of the machine and they would say, "Hey, Don, how did you
do that?" Guy Steele was visiting from MIT that summer and he said, "Don, I want to port this
to take it to MIT." I didn't have two users.
First I had 10, and then I had 100, and then I had 1000. Every time it went to another order
of magnitude I had to change the system, because it would almost match their needs but then
they would have very good suggestions as to something it wasn't covering. Then when it went to
10,000 and when it went to 100,000, the last stage was 10 years later when I made it friendly
for the other alphabets of the world, where people have accented letters and Russian letters.
I had started out with only 7-bit codes. I had so many international users by that
time, I saw that was a fundamental error. I started out with the idea that nobody would ever
want to use a keyboard that could generate more than about 90 characters. It was going to be
too complicated. But I was wrong. So it [TeX] was a burden as well, in the sense that I wanted
to do a responsible job.
I had actually consciously planned an end-game that would take me four years to finish,
and [then] not continue maintaining it and adding on, so that I could have something where I
could say, "And now it's done and it's never going to change." I believe this is one aspect of
software that, not for every system, but for TeX, it was vital that it became something that
wouldn't be a moving target after a while.
Feigenbaum: The books on TeX were a period. That is, you put a period down and you
said, "This is it."
Programming skills are somewhat similar to the skills of people who play violin or piano. As
soon as you stop playing the violin or piano, the skills start to evaporate. First slowly, then quicker. In
two years you will probably lose 80%.
Notable quotes:
"... I happened to look the other day. I wrote 35 programs in January, and 28 or 29 programs in February. These are small programs, but I have a compulsion. I love to write programs and put things into it. ..."
Dijkstra said he was proud to be a programmer. Unfortunately he changed his attitude
completely, and I think he wrote his last computer program in the 1980s. At this conference I
went to in 1967 about simulation language, Chris Strachey was going around asking everybody at
the conference what was the last computer program you wrote. This was 1967. Some of the people
said, "I've never written a computer program." Others would say, "Oh yeah, here's what I did
last week." I asked Edsger this question when I visited him in Texas in the 90s and he said,
"Don, I write programs now with pencil and paper, and I execute them in my head." He finds that
a good enough discipline.
I think he was mistaken on that. He taught me a lot of things, but I really think that if he
had continued... One of Dijkstra's greatest strengths was that he felt a strong sense of
aesthetics, and he didn't want to compromise his notions of beauty. They were so intense that
when he visited me in the 1960s, I had just come to Stanford. I remember the conversation we
had. It was in the first apartment, our little rented house, before we had electricity in the
house.
We were sitting there in the dark, and he was telling me how he had just learned about the
specifications of the IBM System/360, and it made him so ill that his heart was actually
starting to flutter.
He intensely disliked things that he didn't consider clean to work with. So I can see that
he would have distaste for the languages that he had to work with on real computers. My
reaction to that was to design my own language, and then make Pascal so that it would work well
for me in those days. But his response was to do everything only intellectually.
So, programming.
I happened to look the other day. I wrote 35 programs in January, and 28 or 29 programs
in February. These are small programs, but I have a compulsion. I love to write programs and
put things into it. I think of a question that I want to answer, or I have part of my book
where I want to present something. But I can't just present it by reading about it in a book.
As I code it, it all becomes clear in my head. It's just the discipline. The fact that I have
to translate my knowledge of this method into something that the machine is going to understand
just forces me to make that crystal-clear in my head. Then I can explain it to somebody else
infinitely better. The exposition is always better if I've implemented it, even though it's
going to take me more time.
So I had a programming hat when I was outside of Cal Tech, and at Cal Tech I am a
mathematician taking my grad studies. A startup company, called Green Tree Corporation because
green is the color of money, came to me and said, "Don, name your price. Write compilers for us
and we will take care of finding computers for you to debug them on, and assistance for you to
do your work. Name your price." I said, "Oh, okay. $100,000," assuming that this was an impossible
number. In that era this was not quite at Bill Gates' level today, but it was sort of out there.
The guy didn't blink. He said, "Okay." I didn't really blink either. I said, "Well, I'm not
going to do it. I just thought this was an impossible number."
At that point I made the decision in my life that I wasn't going to optimize my income; I
was really going to do what I thought I could do for... well, I don't know. If you ask me what
makes me most happy, number one would be somebody saying "I learned something from you". Number
two would be somebody saying "I used your software". But number infinity would be... Well, no.
Number infinity minus one would be "I bought your book". It's not as good as "I read your
book", you know. Then there is "I bought your software"; that was not in my own personal value.
So that decision came up. I kept up with the literature about compilers. The Communications of
the ACM was where the action was. I also worked with people on trying to debug the ALGOL
language, which had problems with it. I published a few papers, like "The Remaining Trouble
Spots in ALGOL 60" was one of the papers that I worked on. I chaired a committee called
"Smallgol" which was to find a subset of ALGOL that would work on small computers. I was active
in programming languages.
Frana: You have made the comment several times that maybe 1 in 50 people have the
"computer scientist's mind."
Knuth: Yes.
Frana: I am wondering if a large number of those people are trained professional
librarians? [laughter] There is some strangeness there. But can you pinpoint what it is about
the mind of the computer scientist that is....
Knuth: That is different?
Frana: What are the characteristics?
Knuth: Two things: one is the ability to deal with non-uniform structure, where you
have case one, case two, case three, case four. Or that you have a model of something where the
first component is integer, the next component is a Boolean, and the next component is a real
number, or something like that, you know, non-uniform structure. To deal fluently with those
kinds of entities, which is not typical in other branches of mathematics, is critical. And the
other characteristic ability is to shift levels quickly, from looking at something in the large
to looking at something in the small, and many levels in between, jumping from one level of
abstraction to another. You know that, when you are adding one to some number, that you are
actually getting closer to some overarching goal. These skills, being able to deal with
nonuniform objects and to see through things from the top level to the bottom level, these are
very essential to computer programming, it seems to me. But maybe I am fooling myself because I
am too close to it.
Frana: It is the hardest thing to really understand that which you are existing
within.
Knuth: Well, certainly it seems the way things are going. You take any particular subject
that you are interested in and you try to see if somebody with an American high school
education has learned it, and you will be appalled. You know, Jesse Jackson thinks that
students know nothing about political science, and I am sure the chemists think that students
don't know chemistry, and so on. But somehow they get it when they have to later. But I would
say certainly the students now have been getting more of a superficial idea of mathematics than
they used to. We have to do remedial stuff at Stanford that we didn't have to do thirty years
ago.
Frana: Gio [Wiederhold] said much the same thing to me.
Knuth: The most scandalous thing was that Stanford's course in linear algebra could not get
to eigenvalues because the students didn't know about complex numbers. Now every course at
Stanford that takes linear algebra as a prerequisite does so because they want the students to
know about eigenvalues. But here at Stanford, with one of the highest admission standards of
any university, our students don't know complex numbers. So we have to teach them that when
they get to college. Yes, this is definitely a breakdown.
Frana: Was your mathematics training in high school particularly good, or was it that you
spent a lot of time actually doing problems?
Knuth: No, my mathematics training in high school was not good. My teachers could not answer
my questions and so I decided I'd go into physics. I mean, I had played with mathematics in
high school. I did a lot of work drawing graphs and plotting points and I used pi as the radix
of a number system, and explored what the world would be like if you wanted to do logarithms
and you had a number system based on pi. And I had played with stuff like that. But my teachers
couldn't answer questions that I had.
... ... ... Frana: Do you have an answer? Are American students different today? In one of
your interviews you discuss the problem of creativity versus gross absorption of knowledge.
Knuth: Well, that is part of it. Today we have mostly a sound-bite culture, this lack of
attention span and trying to learn how to pass exams.
Frana: Yes,
Knuth: I can be a writer, who tries to organize other people's ideas into some kind of a
more coherent structure so that it is easier to put things together. I can see that I could be
viewed as a scholar that does his best to check out sources of material, so that people get
credit where it is due. And to check facts over, not just to look at the abstract of something,
but to see what the methods were that did it and to fill in holes if necessary. I look at my
role as being able to understand the motivations and terminology of one group of specialists
and boil it down to a certain extent so that people in other parts of the field can use it. I
try to listen to the theoreticians and select what they have done that is important to the
programmer on the street; to remove technical jargon when possible.
But I have never been good at any kind of a role that would be making policy, or advising
people on strategies, or what to do. I have always been best at refining things that are there
and bringing order out of chaos. I sometimes raise new ideas that might stimulate people, but
not really in a way that would be in any way controlling the flow. The only time I have ever
advocated something strongly was with literate programming; but I do this always with the
caveat that it works for me, not knowing if it would work for anybody else.
When I work with a system that I have created myself, I can always change it if I don't like
it. But everybody who works with my system has to work with what I give them. So I am not able
to judge my own stuff impartially. So anyway, I have always felt bad about if anyone says,
'Don, please forecast the future,'...
"... When you're writing a document for a human being to understand, the human being will look at it and nod his head and say, "Yeah, this makes sense." But then there's all kinds of ambiguities and vagueness that you don't realize until you try to put it into a computer. Then all of a sudden, almost every five minutes as you're writing the code, a question comes up that wasn't addressed in the specification. "What if this combination occurs?" ..."
"... When you're faced with implementation, a person who has been delegated this job of working from a design would have to say, "Well hmm, I don't know what the designer meant by this." ..."
...I showed the second version of this design to two of my graduate students, and I said,
"Okay, implement this, please, this summer. That's your summer job." I thought I had specified
a language. I had to go away. I spent several weeks in China during the summer of 1977, and I
had various other obligations. I assumed that when I got back from my summer trips, I would be
able to play around with TeX and refine it a little bit. To my amazement, the students, who
were outstanding students, had not completed [it]. They had a system that was able to do about
three lines of TeX. I thought, "My goodness, what's going on? I thought these were good
students." Well afterwards I changed my attitude to saying, "Boy, they accomplished a
miracle."
Because going from my specification, which I thought was complete, they really had an
impossible task, and they had succeeded wonderfully with it. These students, by the way, [were]
Michael Plass, who has gone on to be the brains behind almost all of Xerox's Docutech software
and all kinds of things that are inside of typesetting devices now, and Frank Liang, one of the
key people for Microsoft Word.
He did important mathematical things as well as his hyphenation methods, which are widely used
in all languages now. These guys were actually doing great work, but I was amazed that they
couldn't do what I thought was just sort of a routine task. Then I became a programmer in
earnest, where I had to do it. The reason is when you're doing programming, you have to explain
something to a computer, which is dumb.
When you're writing a document for a human being to understand, the human being will
look at it and nod his head and say, "Yeah, this makes sense." But then there's all kinds of
ambiguities and vagueness that you don't realize until you try to put it into a computer. Then
all of a sudden, almost every five minutes as you're writing the code, a question comes up that
wasn't addressed in the specification. "What if this combination occurs?"
It just didn't occur to the person writing the design specification. When you're faced
with implementation, a person who has been delegated this job of working from a design would
have to say, "Well hmm, I don't know what the designer meant by this."
If I hadn't been in China they would've scheduled an appointment with me and stopped their
programming for a day. Then they would come in at the designated hour and we would talk. They
would take 15 minutes to present to me what the problem was, and then I would think about it
for a while, and then I'd say, "Oh yeah, do this. " Then they would go home and they would
write code for another five minutes and they'd have to schedule another appointment.
I'm probably exaggerating, but this is why I think Bob Floyd's Chiron compiler never got
going. Bob worked many years on a beautiful idea for a programming language, where he designed
a language called Chiron, but he never touched the programming himself. I think this was
actually the reason that he had trouble with that project, because it's so hard to do the
design unless you're faced with the low-level aspects of it, explaining it to a machine instead
of to another person.
It was Forsythe, I think, who said, "People have said traditionally that you don't
understand something until you've taught it in a class. The truth is you don't really
understand something until you've taught it to a computer, until you've been able to program
it." At this level, programming was absolutely important
Having just celebrated my 10000th birthday (in base three), I'm operating a little bit in
history mode. Every once in awhile, people have asked me to record some of my memories of past
events --- I guess because I've been fortunate enough to live at some pretty exciting times,
computersciencewise. These after-the-fact recollections aren't really as reliable as
contemporary records; but they do at least show what I think I remember. And the stories are
interesting, because they involve lots of other people.
So, before these instances of oral history themselves begin to fade from my memory, I've
decided to record some links to several that I still know about:
Interview by Philip L Frana at the Charles Babbage Institute, November 2001
Some extended interviews, not available online, have also been published in books, notably
in Chapters 7--17 of Companion to the Papers of Donald
Knuth (conversations with Dikran Karagueuzian in the summer of 1996), and in two
books by Edgar G. Daylight, The Essential Knuth (2013), Algorithmic Barriers
Falling (2014).
Knuth: No, I stopped going to conferences. It was too discouraging. Computer programming
keeps getting harder because more stuff is discovered. I can cope with learning about one new
technique per day, but I can't take ten in a day all at once. So conferences are depressing; it
means I have so much more work to do. If I hide myself from the truth I am much happier.
"... Also, Addison-Wesley was the people who were asking me to do this book; my favorite textbooks had been published by Addison Wesley. They had done the books that I loved the most as a student. For them to come to me and say, "Would you write a book for us?", and here I am just a secondyear gradate student -- this was a thrill. ..."
"... But in those days, The Art of Computer Programming was very important because I'm thinking of the aesthetical: the whole question of writing programs as something that has artistic aspects in all senses of the word. The one idea is "art" which means artificial, and the other "art" means fine art. All these are long stories, but I've got to cover it fairly quickly. ..."
Knuth: This is, of course, really the story of my life, because I hope to live long enough
to finish it. But I may not, because it's turned out to be such a huge project. I got married
in the summer of 1961, after my first year of graduate school. My wife finished college, and I
could use the money I had made -- the $5000 on the compiler -- to finance a trip to Europe for
our honeymoon.
We had four months of wedded bliss in Southern California, and then a man from
Addison-Wesley came to visit me and said "Don, we would like you to write a book about how to
write compilers."
The more I thought about it, I decided "Oh yes, I've got this book inside of me."
I sketched out that day -- I still have the sheet of tablet paper on which I wrote -- I
sketched out 12 chapters that I thought ought to be in such a book. I told Jill, my wife, "I
think I'm going to write a book."
As I say, we had four months of bliss, because the rest of our marriage has all been devoted
to this book. Well, we still have had happiness. But really, I wake up every morning and I
still haven't finished the book. So I try to -- I have to -- organize the rest of my life
around this, as one main unifying theme. The book was supposed to be about how to write a
compiler. They had heard about me from one of their editorial advisors, that I knew something
about how to do this. The idea appealed to me for two main reasons. One is that I did enjoy
writing. In high school I had been editor of the weekly paper. In college I was editor of the
science magazine, and I worked on the campus paper as copy editor. And, as I told you, I wrote
the manual for that compiler that we wrote. I enjoyed writing, number one.
Also, Addison-Wesley was the people who were asking me to do this book; my favorite
textbooks had been published by Addison Wesley. They had done the books that I loved the most
as a student. For them to come to me and say, "Would you write a book for us?", and here I am
just a second-year graduate student -- this was a thrill.
Another very important reason at the time was that I knew that there was a great need for a
book about compilers, because there were a lot of people who even in 1962 -- this was January
of 1962 -- were starting to rediscover the wheel. The knowledge was out there, but it hadn't
been explained. The people who had discovered it, though, were scattered all over the world and
they didn't know of each other's work either, very much. I had been following it. Everybody I
could think of who could write a book about compilers, as far as I could see, they would only
give a piece of the fabric. They would slant it to their own view of it. There might be four
people who could write about it, but they would write four different books. I could present all
four of their viewpoints in what I would think was a balanced way, without any axe to grind,
without slanting it towards something that I thought would be misleading to the compiler writer
for the future. I considered myself as a journalist, essentially. I could be the expositor, the
tech writer, that could do the job that was needed in order to take the work of these brilliant
people and make it accessible to the world. That was my motivation. Now, I didn't have much
time to spend on it then, I just had this page of paper with 12 chapter headings on it. That's
all I could do while I'm a consultant at Burroughs and doing my graduate work. I signed a
contract, but they said "We know it'll take you a while." I didn't really begin to have much
time to work on it until 1963, my third year of graduate school, as I'm already finishing up on
my thesis. In the summer of '62, I guess I should mention, I wrote another compiler. This was
for Univac; it was a FORTRAN compiler. I spent the summer, I sold my soul to the devil, I guess
you say, for three months in the summer of 1962 to write a FORTRAN compiler. I believe that the
salary for that was $15,000, which was much more than an assistant professor. I think assistant
professors were getting eight or nine thousand in those days.
Feigenbaum: Well, when I started in 1960 at [University of California] Berkeley, I was
getting $7,600 for the nine-month year.
Knuth: Yeah, so you see it. I got $15,000 for a summer job in 1962 writing a
FORTRAN compiler. One day during that summer I was writing the part of the compiler that looks
up identifiers in a hash table. The method that we used is called linear probing. Basically you
take the variable name that you want to look up, you scramble it, like you square it or
something like this, and that gives you a number between one and, well in those days it would
have been between 1 and 1000, and then you look there. If you find it, good; if you don't find
it, go to the next place and keep on going until you either get to an empty place, or you find
the number you're looking for. It's called linear probing. There was a rumor that one of
Professor Feller's students at Princeton had tried to figure out how fast linear probing works
and was unable to succeed. This was a new thing for me. It was a case where I was doing
programming, but I also had a mathematical problem that would go into my other [job]. My winter
job was being a math student, my summer job was writing compilers. There was no mix. These
worlds did not intersect at all in my life at that point. So I spent one day during the summer
while writing the compiler looking at the mathematics of how fast does linear probing work. I
got lucky, and I solved the problem. I figured out some math, and I kept two or three sheets of
paper with me and I typed it up. ["Notes on 'Open' Addressing", 7/22/63] I guess that's on the
internet now, because this became really the genesis of my main research work, which developed
not to be working on compilers, but to be working on what they call analysis of algorithms,
which is, have a computer method and find out how good is it quantitatively. I can say, if I
got so many things to look up in the table, how long is linear probing going to take. It dawned
on me that this was just one of many algorithms that would be important, and each one would
lead to a fascinating mathematical problem. This was easily a good lifetime source of rich
problems to work on. Here I am then, in the middle of 1962, writing this FORTRAN compiler, and
I had one day to do the research and mathematics that changed my life for my future research
trends. But now I've gotten off the topic of what your original question was.
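To make the linear probing method described above concrete, here is a small sketch in Perl; the table size of 1000 echoes the example in the interview, while the hash function and names are only illustrative, not Knuth's actual code:

    use strict;
    use warnings;

    my $SIZE = 1000;        # table size, echoing the 1962 example
    my @table;              # each slot holds an identifier name or is empty (undef)

    # Scramble the identifier into a slot number between 0 and $SIZE-1.
    sub hash_index {
        my ($name) = @_;
        my $h = 0;
        $h = ($h * 31 + ord($_)) % $SIZE for split //, $name;
        return $h;
    }

    # Look the identifier up; if it is not there, insert it at the first
    # empty slot found by probing forward (assumes the table never fills up).
    sub lookup_or_insert {
        my ($name) = @_;
        my $i = hash_index($name);
        while (defined $table[$i]) {
            return $i if $table[$i] eq $name;   # found it
            $i = ($i + 1) % $SIZE;              # occupied by something else: try the next slot
        }
        $table[$i] = $name;                     # empty place: insert here
        return $i;
    }

    my $slot = lookup_or_insert('COUNT');       # e.g., the identifier COUNT lands in some slot

The question Knuth answered that day -- how many probes, on average, the while loop needs as the table fills up -- is exactly the kind of quantitative question his analysis of algorithms grew out of.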
Feigenbaum: We were talking about sort of the.. You talked about the embryo of The Art of
Computing. The compiler book morphed into The Art of Computer Programming, which became a
seven-volume plan.
Knuth: Exactly. Anyway, I'm working on a compiler and I'm thinking about this. But now I'm
starting, after I finish this summer job, then I began to do things that were going to be
relating to the book. One of the things I knew I had to have in the book was an artificial
machine, because I'm writing a compiler book but machines are changing faster than I can write
books. I have to have a machine that I'm totally in control of. I invented this machine called
MIX, which was typical of the computers of 1962.
In 1963 I wrote a simulator for MIX so that I could write sample programs for it, and I
taught a class at Caltech on how to write programs in assembly language for this hypothetical
computer. Then I started writing the parts that dealt with sorting problems and searching
problems, like the linear probing idea. I began to write those parts, which are part of a
compiler, of the book. I had several hundred pages of notes gathering for those chapters for
The Art of Computer Programming. Before I graduated, I've already done quite a bit of writing
on The Art of Computer Programming.
I met George Forsythe about this time. George was the man who inspired both of us [Knuth and
Feigenbaum] to come to Stanford during the '60s. George came down to Southern California for a
talk, and he said, "Come up to Stanford. How about joining our faculty?" I said "Oh no, I can't
do that. I just got married, and I've got to finish this book first." I said, "I think I'll
finish the book next year, and then I can come up [and] start thinking about the rest of my
life, but I want to get my book done before my son is born." Well, John is now 40-some years
old and I'm not done with the book. Part of my lack of expertise is any good estimation
procedure as to how long projects are going to take. I way underestimated how much needed to be
written about in this book. Anyway, I started writing the manuscript, and I went merrily along
writing pages of things that I thought really needed to be said. Of course, it didn't take long
before I had started to discover a few things of my own that weren't in any of the existing
literature. I did have an axe to grind. The message that I was presenting was in fact not going
to be unbiased at all. It was going to be based on my own particular slant on stuff, and that
original reason for why I should write the book became impossible to sustain. But the fact that
I had worked on linear probing and solved the problem gave me a new unifying theme for the
book. I was going to base it around this idea of analyzing algorithms, and have some
quantitative ideas about how good methods were. Not just that they worked, but that they worked
well: this method worked 3 times better than this method, or 3.1 times better than this method.
Also, at this time I was learning mathematical techniques that I had never been taught in
school. I found they were out there, but they just hadn't been emphasized openly, about how to
solve problems of this kind.
So my book would also present a different kind of mathematics than was common in the
curriculum at the time, that was very relevant to the analysis of algorithms. I went to the
publishers, I went to Addison Wesley, and said "How about changing the title of the book from
'The Art of Computer Programming' to 'The Analysis of Algorithms'." They said that will never
sell; their focus group couldn't buy that one. I'm glad they stuck to the original title,
although I'm also glad to see that several books have now come out called "The Analysis of
Algorithms", 20 years down the line.
But in those days, The Art of Computer Programming was very important because I'm
thinking of the aesthetical: the whole question of writing programs as something that has
artistic aspects in all senses of the word. The one idea is "art" which means artificial, and
the other "art" means fine art. All these are long stories, but I've got to cover it fairly
quickly.
I've got The Art of Computer Programming started out, and I'm working on my 12 chapters. I
finish a rough draft of all 12 chapters by, I think it was like 1965. I've got 3,000 pages of
notes, including a very good example of what you mentioned about seeing holes in the fabric.
One of the most important chapters in the book is parsing: going from somebody's algebraic
formula and figuring out the structure of the formula. Just the way I had done in seventh grade
finding the structure of English sentences, I had to do this with mathematical sentences.
Chapter ten is all about parsing of context-free language, [which] is what we called it at
the time. I covered what people had published about context-free languages and parsing. I got
to the end of the chapter and I said, well, you can combine these ideas and these ideas, and
all of a sudden you get a unifying thing which goes all the way to the limit. These other ideas
had sort of gone partway there. They would say "Oh, if a grammar satisfies this condition, I
can do it efficiently." "If a grammar satisfies this condition, I can do it efficiently." But
now, all of a sudden, I saw there was a way to say I can find the most general condition that
can be done efficiently without looking ahead to the end of the sentence. That you could make a
decision on the fly, reading from left to right, about the structure of the thing. That was
just a natural outgrowth of seeing the different pieces of the fabric that other people had put
together, and writing it into a chapter for the first time. But I felt that this general
concept, well, I didn't feel that I had surrounded the concept. I knew that I had it, and I
could prove it, and I could check it, but I couldn't really intuit it all in my head. I knew it
was right, but it was too hard for me, really, to explain it well.
So I didn't put it in The Art of Computer Programming. I thought it was beyond the scope of my
book. Textbooks don't have to cover everything when you get to the harder things; then you have
to go to the literature. My idea at that time [is] I'm writing this book and I'm thinking it's
going to be published very soon, so any little things I discover and put in the book I didn't
bother to write a paper and publish in the journal because I figure it'll be in my book pretty
soon anyway. Computer science is changing so fast, my book is bound to be obsolete.
It takes a year for it to go through editing, and people drawing the illustrations, and then
they have to print it and bind it and so on. I have to be a little bit ahead of the
state-of-the-art if my book isn't going to be obsolete when it comes out. So I kept most of the
stuff to myself that I had, these little ideas I had been coming up with. But when I got to
this idea of left-to-right parsing, I said "Well here's something I don't really understand
very well. I'll publish this, let other people figure out what it is, and then they can tell me
what I should have said." I published that paper I believe in 1965, at the end of finishing my
draft of the chapter, which didn't get as far as that story, LR(k). Well now, textbooks of
computer science start with LR(k) and take off from there. But I want to give you an idea
of...
FreeDOS turns 25 years old: An origin story
The operating system's history is a great example of the open source software model: developers working together to create something.
That's a major milestone for
any open source software project, and I'm proud of the work that we've done on it over the past
quarter century. I'm also proud of how we built FreeDOS because it is a great example of how
the open source software model works.
For its time, MS-DOS was a powerful operating system. I'd used DOS for years, ever since my
parents replaced our aging Apple II computer with a newer IBM machine. MS-DOS provided a
flexible command line, which I quite liked and that came in handy to manipulate my files. Over
the years, I learned how to write my own utilities in C to expand its command-line capabilities
even further.
Around 1994, Microsoft announced that its next planned version of Windows would do away with
MS-DOS. But I liked DOS. Even though I had started migrating to Linux, I still booted into
MS-DOS to run applications that Linux didn't have yet.
I figured that if we wanted to keep DOS, we would need to write our own. And that's how
FreeDOS was born.
On June 29, 1994, I made a small announcement about my idea to the comp.os.msdos.apps
newsgroup on Usenet.
ANNOUNCEMENT OF PD-DOS PROJECT:
A few months ago, I posted articles relating to starting a public domain version of DOS. The
general support for this at the time was strong, and many people agreed with the statement,
"start writing!" So, I have
Announcing the first effort to produce a PD-DOS. I have written up a "manifest" describing
the goals of such a project and an outline of the work, as well as a "task list" that shows
exactly what needs to be written. I'll post those here, and let discussion follow.
While I announced the project as PD-DOS (for "public domain," although the abbreviation was
meant to mimic IBM's "PC-DOS"), we soon changed the name to Free-DOS and later FreeDOS.
I started working on it right away. First, I shared the utilities I had written to expand
the DOS command line. Many of them reproduced MS-DOS features, including CLS, DATE, DEL, FIND,
HELP, and MORE. Some added new features to DOS that I borrowed from Unix, such as TEE and TRCH
(a simple implementation of Unix's tr). I contributed over a dozen FreeDOS utilities.
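For readers who have not met a tr-style tool, here is a minimal Perl sketch of the replace-text-on-a-stream idea (classic tr maps whole character sets); FreeDOS's actual TRCH is a small DOS program, so this is only an illustration of the behavior, not its code:

    #!/usr/bin/perl
    # Replace every occurrence of one string with another on standard input,
    # roughly what a simple tr-like filter does.
    # Usage: trch.pl FROM TO < input.txt > output.txt
    use strict;
    use warnings;

    my ($from, $to) = @ARGV;
    die "usage: trch.pl FROM TO\n" unless defined $from && defined $to;

    while (my $line = <STDIN>) {
        $line =~ s/\Q$from\E/$to/g;   # \Q...\E quotes any regex metacharacters in FROM
        print $line;
    }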
By sharing my utilities, I gave other developers a starting point. And by sharing my source
code under the GNU
General Public License (GNU GPL), I implicitly allowed others to add new features and fix
bugs.
Other developers who saw FreeDOS taking shape contacted me and wanted to help. Tim Norman
was one of the first; Tim volunteered to write a command shell (COMMAND.COM, later named
FreeCOM). Others contributed utilities that replicated or expanded the DOS command line.
We released our first alpha version as soon as possible. Less than three months after
announcing FreeDOS, we had an Alpha 1 distribution that collected our utilities. By the time we
released Alpha 5, FreeDOS boasted over 60 utilities. And FreeDOS included features never
imagined in MS-DOS, including internet connectivity via a PPP dial-up driver and dual-monitor
support using a primary VGA monitor and a secondary Hercules Mono monitor.
New developers joined the project, and we welcomed them. By October 1998, FreeDOS had a
working kernel, thanks to Pat Villani. FreeDOS also sported a host of new features that brought
not just parity with MS-DOS but surpassed MS-DOS, including ANSI support and a print spooler
that resembled Unix lpr.
You may be familiar with other milestones. We crept our way towards the 1.0 label, finally
releasing FreeDOS 1.0 in September 2006, FreeDOS 1.1 in January 2012, and FreeDOS 1.2 in
December 2016. MS-DOS stopped being a moving target long ago, so we didn't need to update as
frequently after the 1.0 release.
Today, FreeDOS is a very modern DOS. We've moved beyond "classic DOS," and now FreeDOS
features lots of development tools such as compilers, assemblers, and debuggers. We have lots
of editors beyond the plain DOS Edit editor, including Fed, Pico, TDE, and versions of Emacs
and Vi. FreeDOS supports networking and even provides a simple graphical web browser (Dillo).
And we have tons of new utilities, including many that will make Linux users feel at home.
FreeDOS got where it is because developers worked together to create something. In the
spirit of open source software, we contributed to each other's work by fixing bugs and adding
new features. We treated our users as co-developers; we always found ways to include people,
whether they were writing code or writing documentation. And we made decisions through
consensus based on merit. If that sounds familiar, it's because those are the core values of
open source software: transparency, collaboration, release early and often, meritocracy, and
community. That's the open
source way !
I encourage you to download FreeDOS 1.2 and give it a try.
Actually, smart Northern European men enabled the very Internet you are using:
1. Gottfried Leibniz/German – binary number system.
2. George Boole/English – Boolean logic.
3. Konrad Zuse/German – electronic computer.
4. Donald Davies/Welsh – packet switching.
5. Clifford Cocks/English – public key encryption years before Rivest, Shamir, and
Adleman.
6. Edsger Dijkstra/Dutch – Dijkstra's algorithm and programming.
7. Tim Berners-Lee/English – HTML and http.
8. Håkon Wium Lie/Norwegian – Cascading Style Sheets (CSS).
9. Linus Torvalds/Finn – Linux on which many web servers run. Klaus Knopper/German
– Knoppix Linux variant.
10. Edgar F. Codd/English – relational database model.
11. Michael Widenius/Swede – MySQL on which many web applications run.
12. Kristen Nygaard & Ole-Johan Dahl/Norwegians – object-oriented programming and
Simula programming language.
13. Guido van Rossum/Dutch – Python programming language.
14. Lennart Augustsson/Swede – Haskell programming language.
15. Bjarne Stroustrup/Dane – C++ programming language.
17. Geoffrey Hinton/English – artificial intelligence.
18. Jürgen Dethloff and Helmut Gröttrup/Germans – chip card used in mobile
phones plus credit and debit cards.
19. Karlheinz Brandenburg/German – MP3 format.
It is not that the Watson family is gone; it is that New Deal capitalism was replaced with neoliberalism
Notable quotes:
"... Except when your employer is the one preaching associate loyalty and "we are family" your entire career. Then they decide you've been too loyal and no longer want to pay your salary and start fabricating reasons to get rid of you. ADP is guilty of these same practices and eliminating their tenured associates. Meanwhile, the millennials hired play ping pong and text all day, rather than actually working. ..."
A quick search of the article doesn't find the word "buy backs" but this is a big part of the
story. IBM spent over $110 BILLION on stock buy backs between 2000 and 2016. That's the
number I found, but it hasn't stopped since. If anything it has escalated.
This is very common among large corporations. Rather than spend on their people, they
funnel billions into stock buy backs which raises or at least maintains the stock value so
execs can keep cashing in. It's really pretty disgraceful. This was only legalized in 1982,
which not-so-coincidentally is not long after real wages stalled, and have stalled ever
since.
Thanks for this bit of insanely true reporting. When laid off from Westinghouse after 14
years of stellar performance evaluations I was flummoxed by the execs getting million-dollar
bonuses as we were told the company wasn't profitable enough to maintain its senior
engineering staff. It sold off every division eventually as the execs (many of them newly
hired) reaped even more bonuses.
Thank you ... very insightful of you. As an IBMer and lover of Spreadsheets / Statistics /
Data Specialist ... I like reading Annual Reports. Researching these Top Execs, BOD and
compare them to other Companies across-the-board and industry sectors. You'll find a Large
Umbrella there.
There is a direct tie and inter-changeable pieces of these elites over the past 55 yrs.
Whenever some Corp/ Political/ Government shill (wannabe) needs a payoff, they get placed into
high-ranking top positions for orchestrating a predescribed dark NWO agenda. Some may come
up the ranks like Ginny, but ALL belong to Council for Foreign Relations and other such high
level private clubs or organizations. When IBM sells off their Mainframe Manufacturing
(Poughkeepsie) to an elite Saudi, under an American Co. sounding name of course, ... and the
U.S. Government ... doesn't balk ... that has me worried for our 1984 future.
Yeah, it is amazing how they stated that they don't need help from the government when in
reality they do need government to pass laws that favor them, pack the court system where
judges rule in their favor and use their private police and the public sector police to keep
the workers down.
I wonder how many billions (trillions?) have been funneled from corporate workers pockets
this way? It seems all corporations are doing it these days. Large-scale transfer of wealth
from the middle class to the wealthy.
Not anymore. With most large companies, you've never been able to say they are "family."
Loyalty used to be a thing though. I worked at a company where I saw loyalty vanish over a 10
year period.
Except when your employer is the one preaching associate loyalty and "we are family" your
entire career. Then they decide you've been too loyal and no longer want to pay your salary
and start fabricating reasons to get rid of you. ADP is guilty of these same practices and
eliminating their tenured associates. Meanwhile, the millennials hired play ping pong and
text all day, rather than actually working.
Yeah, and how many CEOs actually work to make their companies great instead of running them
into the ground, thinking about their next job move, and playing golf
I have to disagree with you. I started with IBM on their rise up in those earlier days, and
we WERE valued and shown that we were valued over and over through those glorious years. It
did feel like we were in a family, our families mattered to them, our well-being. They gave
me a month to find a perfect babysitter when they hired me before I had to go to work!
They
helped me find a house in a good school district for my children. They bought my house when I
was moving to a new job/location when it didn't sell within 30 days.
They paid the difference
in the interest rate of my loan for my new house from the old one. I can't even begin to list
all the myriad things that made us love IBM and the people we worked with and for, and
made us feel a part of that big IBM family.
Did they change, yes, but the dedication we gave
was freely given and we mutually respected each other. I was lucky to work for them for
decades before that shift when they changed to be just like every other large corporation.
The Watson family held integrity, equality, and knowledge share as a formidable synthesis of
company ethics moving a Quality based business forward in the 20th to 21st century. They also
promoted a (volunteer) IBM Club to help promote employee and family activities
inside/outside of work, which they by and large paid for. This allowed employees to meet and
see other employees/families as 'Real' & "Common-Interest" human beings. I participated,
created, and organized events and documented how-to-do-events for other volunteers. These
brought IBMers together inside or outside of their 'working' environment to have fun, to
associate, to realize those innate qualities that are in all of us. I believe it allowed for
better communication and cooperation in the work place.
To me it was family. Some old IBMers might remember when Music, Song, Skits were part of IBM
Branch Office meetings. As President of the IBM Clubs Palo Alto branch (7 yrs.), I used our
Volunteer Club Votes to spend ALL that IBM donated money, because they
<administratively> gave it back to IBM if we didn't.
Without a strong IBM Club
presence, it gets whittled down to 2-3 events a year. For a time WE WERE a FAMILY.
Absolutely! Back when white shirts/black suits were a requirement. There was a country club
in Poughkeepsie, softball teams, Sunday brunch, Halloween parties in the fall, Christmas
parties in December where thousands of age appropriate Fisher Price toys were given out to
employees' kids. Today "IBMer" is used by execs as a term of derision. Employees are
overworked and underappreciated, and shortsighted, overpaid executives rule the roost. The
real irony is that talented, vital employees are being retired for "costing too much" while
dysfunctional top level folk are rewarded with bonuses and stock when they are let go. And
it's all legal. It's disgraceful.
Microsoft co-founder Paul Allen died today from complications of non-Hodgkin's lymphoma. He was 65. Allen said
earlier this month
that he was being treated for the disease.
Allen was a childhood friend of Bill Gates, and together, the two started Microsoft in 1975. He left the company in 1983 while
being treated for Hodgkin's lymphoma and remained a board member with the company through 2000. He was first treated for non-Hodgkin's
lymphoma in 2009, before seeing it go into remission.
In a statement given to ABC News, Gates said he was "heartbroken by the passing of one of my oldest and dearest friends."
He went on to commend his fellow co-founder for his life after Microsoft:
From our early days together at Lakeside School, through our partnership in the creation of Microsoft, to some of our joint
philanthropic projects over the years, Paul was a true partner and dear friend. Personal computing would not have existed without
him.
But Paul wasn't content with starting one company. He channelled his intellect and compassion into a second act focused on
improving people's lives and strengthening communities in Seattle and around the world. He was fond of saying, "If it has the
potential to do good, then we should do it." That's the kind of person he was.
Paul loved life and those around him, and we all cherished him in return. He deserved much more time, but his contributions
to the world of technology and philanthropy will live on for generations to come. I will miss him tremendously.
Microsoft CEO Satya Nadella said Allen's contributions to both Microsoft and the industry were "indispensable." His full statement
is quoted below:
Paul Allen's contributions to our company, our industry, and to our community are indispensable. As co-founder of Microsoft,
in his own quiet and persistent way, he created magical products, experiences and institutions, and in doing so, he changed the
world. I have learned so much from him -- his inquisitiveness, curiosity, and push for high standards is something that will continue
to inspire me and all of us as Microsoft. Our hearts are with Paul's family and loved ones. Rest in peace.
In a memoir published in 2011, Allen says
that he was responsible for naming Microsoft and creating the two-button mouse. The book also portrayed Allen as
going under-credited for his
work at Microsoft, and Gates as having taken more ownership of the company than he deserved. It created some drama when it arrived,
but the two men ultimately appeared to remain friends,
posing for a photo together two years later.
After leaving Microsoft, Allen became an investor through his company Vulcan, buying into a diverse set of companies and markets.
Vulcan's current portfolio ranges from the Museum of Pop Culture in Seattle, to a group focused on using machine learning for climate
preservation, to Stratolaunch, which is
creating a spaceplane. Allen's investments and donations made him a major name in Seattle, where much of his work was focused.
He recently
funded a $46 million building in South Seattle that will house homeless and low-income families.
Both Apple CEO Tim Cook and Google CEO Sundar Pichai called Allen a tech "pioneer" while highlighting his philanthropic work in
statements on Twitter. Amazon CEO Jeff Bezos said Allen's work "inspired so many."
Allen has long been the owner of the Portland Trail Blazers and Seattle Seahawks as well. NFL Commissioner Roger Goodell said
Allen "worked tirelessly" to "identify new ways to make the game safer and protect our players from unnecessary risk." NBA Commissioner
Adam Silver said Allen "helped lay the foundation for the league's growth internationally and our embrace of new technologies."
He also launched a number of philanthropic efforts, which were later combined under the name Paul G. Allen Philanthropies. His
"philanthropic contributions exceed $2 billion," according to Allen's own website, and he had committed to giving away the majority
of his fortune.
Allen's sister, Jody Allen, wrote a statement on his family's behalf:
My brother was a remarkable individual on every level. While most knew Paul Allen as a technologist and philanthropist, for
us he was a much loved brother and uncle, and an exceptional friend.
Paul's family and friends were blessed to experience his wit, warmth, his generosity and deep concern. For all the demands
on his schedule, there was always time for family and friends. At this time of loss and grief for us – and so many others – we
are profoundly grateful for the care and concern he demonstrated every day.
Some of Allen's philanthropy has taken a scientific bent: Allen founded the
Allen Institute for Brain Science in 2003, pouring
$500 million into the non-profit that aims to give scientists the tools and data they need to probe how the brain works. One recent project,
the Allen Brain Observatory, provides an open-access "catalogue
of activity in the mouse's brain," Saskia de Vries, senior scientist on the project,
said in a video. That kind of data is key to piecing together
how the brain processes information.
In a statement emailed to The Verge, The Allen Institute's President and CEO Allan Jones said:
Paul's vision and insight have been an inspiration to me and to many others both here at the Institute that bears his name,
and in the myriad of other areas that made up the fantastic universe of his interests. He will be sorely missed. We honor his
legacy today, and every day into the long future of the Allen Institute, by carrying out our mission of tackling the hard problems
in bioscience and making a significant difference in our respective fields.
Man what a shock! I was lucky enough to be working at a Seattle startup that Paul bought
back in the 90s (doing VoIP SOHO phone systems). He liked to swing by the office on a regular
basis as we were just a few blocks from Dicks hamburgers on Mercer St (his favorite). He was
really an engineer's engineer. We'd give him a status report on how things were going and
within a few minutes he was up at the white board spitballing technical solutions to ASIC or
network problems. I especially remember him coming by the day he bought the Seahawks. Paul
was a big physical presence (6'2" 250lbs in those days), but he kept going on about how
after meeting the Seahawks players, he never felt so physically small in his life. Ignore the
internet trolls. Paul was a good guy. He was a humble, modest, down-to-earth guy. There was
always a pick-up basketball game on his court on Thursday nights. Jam sessions over at his
place were legendary (I never got to play with him, but every musician that I know who
played with him was impressed with his guitar playing). He left a huge legacy in the Pacific
Northwest. We'll miss you, Paul!
The book Paul Allen wrote avoids a full report, but gives the impression that Bill Gates
was so angry that Paul Allen left the company because interacting with Bill Gates was bad for his
health.
Quotes from the book, Idea Man
[amazon.com] by Paul Allen.
Page 49:
THREE DECADES AFTER teaching Bill and me at Lakeside, Fred Wright was asked what he'd
thought about our success with Microsoft. His reply: "It was neat that they got along well
enough that the company didn't explode in the first year or two."
Page 96:
When Bill pushed on licensing terms or bad-mouthed the flaky Signetics cards, Ed thought
he was insubordinate. You could hear them yelling throughout the plant, and it was quite a
spectacle-the burly ex-military officer standing toe to toe with the owlish prodigy about
half his weight, neither giving an inch.
Page 177:
Bill was sarcastic, combative, defensive, and contemptuous.
Page 180:
"For Bill, the ground had already begun shifting. At product review meetings, his scathing
critiques became a perverse badge of honor. One game was to count how many times Bill
confronted a given manager; whoever got tagged for the most "stupidest things " won the
contest. "I give my feedback," he grumbled to me, "and it doesn't go anywhere."
He used to have the nickname "Doctor NetVorkian" because many of the things he invested in
promptly tanked in one way or another after his investment. He had a lot of bad luck with his
investments.
For those who don't understand the joke, a certain Dr. Kevorkian became notorious for
helping ill patients commit suicide.
But you can wipe Windows off your hard drive, so I don't get your point. Paul Allen was
a great guy in many, many ways.
Agreed. Even if you could "blame" him for all or part of Windows, he did start the
Museum of Pop
Culture [wikipedia.org]. If you are ever in Seattle, it is a must see. I mean, they have
what is probably the best Star Trek museum display anywhere (which is saying a lot since the
Smithsonian has a very nice one as well), including most of the original series set pieces
and I believe one of the only actual Enterprise models used for filming. In my mind, that
gives him a great deal of geek cred. Plus, as I under
I knew someone would say that. You are right. I won't. But he won't either. He was a
patent troll. Oh but: RIP and thoughts and prayers, right? He was a great guy and will be
missed.
[Editor's note: all links in the story will lead you to Twitter]: In the 1970s
the cost -- and size -- of calculators tumbled. Business tools became toys; as a result
prestige tech companies had to rapidly diversify into other products -- or die! This is the
story of the 1970s great calculator race... Compact electronic calculators had been around
since the mid-1960s, although 'compact' was a relative term. They were serious, expensive tools
for business. So it was quite a breakthrough in 1967 when Texas Instruments presented the
Cal-Tech: a prototype battery-powered
'pocket' calculator using four integrated circuits. It sparked a wave of interest. Canon
was one of the first to launch a pocket calculator in 1970. The Pocketronic used Texas
Instruments integrated circuits, with calculations printed on a roll of thermal paper. Sharp
was also an early producer of pocket calculators. Unlike Canon they used integrated circuits
from Rockwell and showed the calculation on a
vacuum fluorescent display. The carrying handle was a nice touch!
The next year brought another big leap: the Hewlett-Packard HP35.
Not only did it use a microprocessor, it was also the first scientific pocket calculator.
Suddenly the slide rule was no longer king; the 35 buttons of the HP35 had taken its crown. The
most stylish pocket calculator was undoubtedly the Olivetti Divisumma 18,
designed by Mario Bellini. Its smooth look and soft shape have become something of a tech icon
and an inspiration for many designers. It even featured in Space:1999! By 1974 Hewlett Packard
had created another first: the HP-65 programmable pocket
calculator. Programmes were stored on magnetic cards slotted into the unit. It was even
used during the Apollo-Soyuz space mission to make manual course corrections. The biggest
problem for pocket calculators was the power drain: LED displays ate up batteries. As LCD
displays gained popularity in the late 1970s the size of battery needed began to
shrink. The 1972 Sinclair Executive had been the first pocket calculator to use
small circular watch batteries, allowing the case to be very thin. Once LCD displays took
off, watch batteries increasingly became the norm for calculators. Solar power was the next
innovation for the calculator: Teal introduced the Photon in 1977, no batteries required or
supplied!
But the biggest shake-up of the emerging calculator market came in 1975, when Texas
Instruments -- who made the chips for most calculator companies -- decided to produce and sell
their own models. As a vertically integrated company Texas Instruments could make and sell
calculators at a much lower price than its
competitors. Commodore almost went out of business trying to compete: it was paying more
for its TI chips than TI was selling an entire calculator for. With prices falling the pocket
calculator quickly moved from business tool to
gizmo: every pupil, every student, every office worker wanted one, especially when they
discovered the digital fun they could have! Calculator games suddenly became a 'thing',
often combining a calculator with a deck of cards to create new games to play. Another popular
pastime was finding numbers that spelt rude words if the calculator was turned upside down; the
Samsung Secal even gave you a clue to
one!
The calculator was quickly evolving into a lifestyle
accessory. Hewlett Packard launched the first calculator watch in 1977... Casio launched
the first credit-card-sized calculator in
1978, and by 1980 the pocket calculator and pocket computer were starting to merge. Peak
calculator probably came in 1981, with Kraftwerk's Pocket Calculator released as a cassingle in a
calculator-shaped box. Although the heyday of the pocket calculator may be over, they are
still quite collectable. Older models in good condition with the original packaging can command
high prices online. So let's hear it for the pocket calculator: the future in the palm of your
hand!
I have a HP-15C purchased in 1985 and it is still running on the original batteries - 32
years!
That is phenomenal low power design for the technology and knowledge at the time.
I replaced the batteries in my 15c for the first time a couple of years ago. And just to
be clear, it has three small non-rechargable button batteries, like you would find in a
watch.
I have a HP-15C purchased in 1985 and it is still running on the original batteries - 32
years!
That is phenomenal low power design for the technology and knowledge at the time.
That's phenomenal even by today's design standards!
My dad's friend was a gadget hound, and had one of these in the 80's. Not a great machine.
The keys were weird and mushy. It had no electronic display. It only had a thermal printer
that printed shiny dark gray numbers on shiny light gray paper. In other words, visibility
was poor. It looked amazing, though, and you could spill a coke on it and the keys would
still work.
Much more impressive but more utilitarian - he had a completely electro-mechanical rotary
auto-dial telephone. It took small, hard plastic punch cards you'd put the number on. You'd
push the card into a slot on the telephone, and it would feed the card in and out, generating
pulses until it got to the number you punched out. Then it would pull the card back in and do
it again for the next number until the whole number was dialed. No digital anything, just
relays and motors.
In some ways, the electronic calculator market was created by TI and its need to sell the
new IC. There were not many applications, and one marketable application was the electronic
calculator. In some ways it was like Apple leveraging the microdrive for the iPod.
Like the iPod, the TI calculators were not great, but they were very easy to use. The HP
calculators were and are beautiful. But ease of use won out.
Another factor was that until about a decade ago all TI calculators were very
limited. This made them ideal machines for tests. HP calculators could do unit analysis, and
since 1990 they had algebra systems and could even do calculus. This made them the ideal
machine for technical students and professionals, but no high school would waste time
teaching them because all they care about is filling out bubbles on an answer sheet.
The interesting contemporary issue that I see is that schools are still teaching
calculators when smartphones can really do everything and more, especially with apps like
Wolfram Alpha. Unless you are a legacy HP user, asking kids to buy a calculator just to
boost TI profits seems very wasteful to me. This is going to change as more tests move to
an online format, and online resources such as Desmos take over from the physical calculator, but in
the meantime the taxpayer is on the hook for millions of dollars a year per large school
district just for legacy technology.
An interesting NHK World documentary about Japanese calculator culture and the history of
calculators in Japan. I generally watch these at speed = 1.5.
I love my TI-89. I still use it daily. There's a lot to be said for multiple decades of
practice on a calculator. Even the emulator of it on my phone, for when I don't have it
handy, isn't the same.
It doesn't need to be particularly fast or do huge calculations--that's what programming
something else is for. But nothing beats a good calculator for immediate results.
First calculator that did octal and hex math (also binary). Got one when they came out,
cost $50 in 1977. Still have it, still works, although the nicad battery died long ago. In a
remarkable show of foresight, TI made the battery pack with a standard 9V battery connector,
and provided a special battery door that let you replace the rechargeable battery with a
normal 9V. I replaced it with a solar powered Casio that did a bunch more stuff, but the TI
still works.
The National Museum of Computing (tnmoc.org) published a blog post in which it tried to find the person who
has been programming the longest. At the time, it declared Bill Williams, a 70-year-old who
claimed to have started coding for a living in 1969 and was still doing so at the time of
publication, to be one of the world's most durable programmers. The post has been updated several
times over the years, and over the weekend, the museum updated it once again. The newest
contender is Terry Froggatt of Hampshire, who writes: I can beat the claim of your 71-year-old
by a couple of years (although I can't compete with the likes of David Hartley). I wrote my
first program for the Elliott 903 in September 1966. Now at the age of 73 I am still writing
programs for the Elliott 903! I've just written a 903 program to calculate the Fibonacci
numbers. And I've written quite a lot of programs in the years in between, some for the 903 but
also a good many in Ada.
As most Perl fans are no doubt aware, the Perl Foundation
released version 5.10 last month and introduced
a number of significant upgrades for the popular programming language. Perl 5.10 is the first significant feature upgrade since the
5.8 release back in 2002.
First the good news, AKA why you should go ahead and upgrade: the major new language features are turned off by default, which
means you can upgrade without breaking existing scripts, and take advantage of the new features for new scripts. Even cooler is the ability
to progressively upgrade scripts using the "use" syntax.
For instance, add the line use feature 'switch'; prior to a block of code where you'd like to take advantage of the new
switch statement in Perl 5.10 and then turn it off after upgrading that block of code using the statement no feature 'switch';.
New features can be enabled by name or as a collective group using the statement use feature ':5.10';.
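To make that concrete, here is a minimal sketch of how the progressive upgrade might look in practice; the little dispatcher and its action names are made up purely for illustration:

    use strict;
    use warnings;
    use feature 'switch';                   # enable the new given/when construct for this block

    my $action = shift(@ARGV) || 'help';    # hypothetical command-line action

    given ($action) {
        when ('start')  { print "Starting service\n"; }
        when ('stop')   { print "Stopping service\n"; }
        when (/^re/)    { print "Restarting service\n"; }   # when() can also match a regex
        default         { print "Usage: $0 start|stop|restart\n"; }
    }

    no feature 'switch';                    # switch it off again for the rest of the file

Enabling a feature only around the code that needs it is exactly what makes piecemeal migration of older scripts practical.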
In addition to the switch statement, there's a new
say statement, which acts like print() but adds a newline character,
and a state feature, which enables a new class of variables with very explicit scope control.
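A short sketch of both (the counter function is invented just to show how a state variable persists):

    use strict;
    use warnings;
    use feature qw(say state);

    sub next_id {
        state $counter = 0;     # initialized once; the value persists across calls
        return ++$counter;
    }

    say 'first call:  ', next_id();    # prints 1, with the trailing newline added by say
    say 'second call: ', next_id();    # prints 2, because $counter kept its value

Unlike a my variable, the state variable is not reset each time the subroutine is entered, which is what "very explicit scope control" buys you.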
But perhaps the most interesting
of 5.10's new features is the new 'or' operator, //, which is a "defined or" construct. For instance, the following two expressions
are equivalent:
$foo // $bar
defined $foo ? $foo : $bar
Obviously the first form is much more compact and (I would argue) more readable - i.e. is $foo defined? If so, use it; if not, use $bar.
You can also add an equal sign like so:
$bar //= $foo;
Which is the same as writing:
$bar = $foo unless defined $bar;
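Where this really pays off is setting defaults: the old || idiom would also clobber legitimate false values such as 0 or the empty string, while // only falls back when the value is undefined. A small sketch, with hypothetical option names:

    use strict;
    use warnings;
    use feature 'say';

    my %opts = ( verbose => 0 );          # 0 is a deliberate, defined setting

    my $v_old = $opts{verbose} || 1;      # 1 -- the || idiom wrongly discards the 0
    my $v_new = $opts{verbose} // 1;      # 0 -- // keeps it, since it is defined

    $opts{retries} //= 3;                 # assign a default only if currently undefined

    say "$v_old $v_new $opts{retries}";   # prints "1 0 3"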
Another noteworthy new feature is the smart match operator, which the Perl Foundation explains as "a new kind of comparison, the
specifics of which are contextual based on the inputs to the operator." For example, to find if scalar $needle is in array @haystack,
simply use the new ~~ operator:
if ( $needle ~~ @haystack ) ...
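As a slightly fuller sketch of that (the data here is invented), the same operator also does something sensible when one side is a regex:

    use strict;
    use warnings;
    use feature 'say';

    my @haystack = qw(alpha beta gamma);
    my $needle   = 'beta';

    if ( $needle ~~ @haystack ) {          # scalar ~~ array: membership test
        say "found $needle";
    }

    if ( qr/^ga/ ~~ @haystack ) {          # regex ~~ array: true if any element matches
        say "found something starting with 'ga'";
    }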
Perl 5.10 also finally gains support for named regex captures, which means you can avoid the dreaded lines of $1, $2, etc., which
often make Perl regexes hard to decipher. Finally I might be able to understand what's going on in complex regex scripts like
Markdown.
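For the curious, here is roughly what that looks like; Perl 5.10's own release date serves as the sample input, and the capture names are of course arbitrary:

    use strict;
    use warnings;
    use feature 'say';

    my $date = '2007-12-18';    # the Perl 5.10.0 release date, used as test data

    if ( $date =~ /^(?<year>\d{4})-(?<month>\d{2})-(?<day>\d{2})$/ ) {
        say "year:  $+{year}";      # named captures land in the %+ hash
        say "month: $+{month}";
        say "day:   $+{day}";
    }

Reading $+{year} months later is considerably kinder than remembering which of $1, $2, or $3 it was.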
Other improvements include a faster interpreter with a smaller memory footprint, better error messages and more. For full details
on the new release check out the notes.
I'll confess I abandoned Perl for Python some time ago, but after playing with 5.10 I may have to rethink that decision. Perl
5.10's new features are definitely worth the upgrade and a must-have for anyone who uses Perl on a daily basis.