Brooks's law is a principle in software development which says that "grossly simplifying, adding manpower to a late software project makes it later". It was coined by Fred Brooks in his 1975 classic The Mythical Man-Month.
In humorous form, Brooks's Law can be formulated as follows:
"What one programmer can do in one month, two programmers can do in two months."
The corollary of Brooks's Law is that there is a stage of a project at which the incremental addition of manpower makes the situation worse instead of better. And the damage is not limited to the total time needed to complete the project: the architecture is badly affected too. That's why another variant of Brooks's Law is often formulated as "Nine women can't make a baby in one month".
The existence of Brooks's Law is related to important structural properties of large software projects:
Brooks's Law is not really about man-months; it is about the complex relationship between the structure of the project and the structure of the team of developers involved. That's why, grossly simplifying, adding manpower to a late project is not a trivial task, and more often than not it makes the late project even later.
It is simplistic to assume that the modularity of a project determines the number of developers who can work effectively without stepping on each other's toes. The number of developers participating in a given project is affected by many factors and in turn affects the architecture of the product, as each developer tries to carve out his own niche. In other words, there is a feedback loop from the structure of the developer group to the structure of the software system under development. This is probably the most overlooked side of Brooks's Law.
My point here is that Brooks's Law is in no way negated by a fairy tale about a dog-and-pony-show project (Fetchmail) told by a true believer who cannot even understand that such a project should have used a scripting language such as Python instead of C.
The book itself still has not lost its value, because it contains unique observations about large-scale software development, observations that can't be found anywhere else (for example, Steve McConnell is just a consultant who never headed large projects). Brooks touches on things that are typical, but not highly visible, in any complex engineering project. Here is my compilation of key points from the book:
When you stop to think about it, this phenomenon is easy to understand. There is an inescapable overhead to yoking up programmers in parallel. The members of the team must "waste time" attending meetings, drafting project plans, exchanging EMAIL, negotiating interfaces, enduring performance reviews, and so on. In any team of more than a few people, at least one member will be dedicated to "supervising" the others, while another member will be dedicated to housekeeping functions such as managing builds, updating Gantt charts, and coordinating everyone's calendar. At Microsoft, there will be at least one team member that just designs T-shirts for the rest of the team to wear. And as the team grows, there is a combinatorial explosion such that the percentage of effort devoted to communication and administration becomes larger and larger.
~ Dr. Nikolai Bezroukov
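Brooks quantified part of this overhead himself: a team of n developers has n(n-1)/2 potential communication paths, so coordination cost grows roughly quadratically while raw capacity grows only linearly. Here is a minimal back-of-the-envelope sketch of that effect in Python; the overhead charged per communication path is an illustrative assumption, not a figure from the book.

    # Toy model of Brooks's Law: each developer adds one unit of capacity,
    # but every pair of developers adds a fixed slice of coordination overhead.
    # The 5% overhead per communication path is an illustrative assumption.

    def effective_capacity(n, overhead_per_path=0.05):
        """Rough productive capacity of an n-person team, in developer-equivalents."""
        paths = n * (n - 1) // 2          # potential communication channels
        return max(n - overhead_per_path * paths, 0.0)

    if __name__ == "__main__":
        for n in (1, 2, 5, 10, 20, 30, 40):
            print(f"{n:3d} developers -> effective capacity {effective_capacity(n):5.1f}")

With these made-up numbers the effective capacity peaks at around twenty developers and declines after that; the exact figures are irrelevant, the quadratic growth of communication paths is the point.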
Jul 03, 2021 | en.wikipedia.org
Mission creep is the gradual or incremental expansion of an intervention, project or mission, beyond its original scope, focus or goals, a ratchet effect spawned by initial success. [1] Mission creep is usually considered undesirable due to how each success breeds more ambitious interventions until a final failure happens, stopping the intervention entirely.
The term was originally applied exclusively to military operations, but has recently been applied to many different fields. The phrase first appeared in 1993, in articles published in the Washington Post and in the New York Times concerning the United Nations peacekeeping mission during the Somali Civil War.
...
- Feature creep is an analogous phenomenon in software engineering.
- Ratchet effect is the inability of a system to reduce its scope once it expands.
- Scope creep is an analogous phenomenon in project management.
Jul 02, 2020 | zwischenzugs.com
I often feel this tension when discussion of Agile goes beyond a small team. When I'm told in a motivational poster written by someone I've never met and who knows nothing about my job that I should 'obliterate my blockers', and those blockers are both external and non-negotiable, what else can I do but laugh at it?
How can you be agile when there are blockers outside your control at every turn? Infrastructure, audit, security, financial planning, financial structures all militate against the ability to quickly deliver meaningful iterations of products. And who is the customer here, anyway? We're talking about the square of despair:
When I see diagrams like this representing Agile I can only respond with black humor shared with my colleagues, like kids giggling at the back of a church.
Within a smaller and well-functioning team, the totems of Agile often fly out of the window and what you're left with (when it's good) is a team that trusts each other, is open about its trials, and has a clear structure (formal or informal) in which agreement and solutions can be found and co-operation is productive. Google recently articulated this (reported briefly here, and more in-depth here).
Oct 05, 2019 | www.reddit.com
They say, No more IT or system or server admins needed very soon...
Sick and tired of listening to these so-called architects and full-stack developers who watch a bunch of videos on YouTube and Pluralsight and find articles online. They go around the workplace throwing out words like containers, devops, NoOps, azure, infrastructure as code, serverless, etc., and they don't understand half of the stuff. I do some of the devops tasks in our company, so I understand what it takes to implement and manage these technologies. Every meeting is infested with these A holes.
ntengineer 613 points · 4 days ago
Your best defense against these is to come up with non-sarcastic and quality questions to ask these people during the meeting, and watch them not have a clue how to answer them.
For example, a friend of mine worked at a smallish company, some manager really wanted to move more of their stuff into Azure including AD and Exchange environment. But they had common problems with their internet connection due to limited bandwidth and them not wanting to spend more. So during a meeting my friend asked a question something like this:
"You said on this slide that moving the AD environment and Exchange environment to Azure will save us money. Did you take into account that we will need to increase our internet speed by a factor of at least 4 in order to accommodate the increase in traffic going out to the Azure cloud? "
Of course, they hadn't. So the CEO asked my friend if he had the numbers. He had already done his homework: the bandwidth upgrade was a significant increase in cost every month, and once the Azure costs and the extra bandwidth were both taken into account, the manager's savings were wiped away.
I know this won't work for everyone. Sometimes there are real savings in moving things to the cloud. But oftentimes there really aren't. Calling the uneducated people out on what they see as facts can be rewarding.
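The arithmetic behind that kind of pushback fits in a few lines. The sketch below uses made-up placeholder figures, not the numbers from the story; plug in your own quotes before drawing conclusions.

    # Hypothetical monthly cost comparison of the kind described above.
    # Every figure is a placeholder assumption.
    on_prem_monthly = 4000             # running AD/Exchange in-house today
    azure_monthly = 2500               # quoted Azure cost for the same workloads
    current_bandwidth_monthly = 800    # existing internet circuit
    upgraded_bandwidth_monthly = 3200  # roughly 4x the bandwidth, per the ISP quote

    claimed_savings = on_prem_monthly - azure_monthly
    extra_bandwidth = upgraded_bandwidth_monthly - current_bandwidth_monthly
    net_monthly_change = claimed_savings - extra_bandwidth

    print(f"Claimed monthly savings:   ${claimed_savings}")
    print(f"Extra bandwidth per month: ${extra_bandwidth}")
    print(f"Net change per month:      ${net_monthly_change}")  # negative means the move costs more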
PuzzledSwitch 101 points · 4 days ago
my previous boss was that kind of a guy. he waited till other people were done throwing their weight around in a meeting and then calmly and politely dismantled them with facts.
no amount of corporate pressuring or bitching could ever stand up to that.
themastermatt 42 points · 4 days ago
I've been trying to do this. Problem is that everyone keeps talking all the way to the end of the meeting, leaving no room for rational facts.
PuzzledSwitch 35 points · 4 days ago
make a follow-up in email, then.
or, you might have to interject for a moment.
williamfny Jack of All Trades 26 points · 4 days ago
This is my approach. I don't yell or raise my voice, I just wait. Then I start asking questions that they generally cannot answer and slowly take them apart. I don't have to be loud to get my point across.
MaxHedrome 6 points · 4 days ago
CrazyTachikoma 4 days ago
Listen to this guy, OP.
This tactic is called "the box game". Just continuously ask them logical questions that can't be answered with their stupidity. (Box them in), let them be their own argument against themselves.
Most DevOps I've met are devs trying to bypass the sysadmins. This, and the Cloud fad, are burning serious amounts of money at companies managed by stupid people who are easily impressed by PR stunts and shiny conferences. Then when everything goes to shit, they call the infrastructure team to fix it...
Oct 26, 2017 | www.amazon.com
Mohammad B. Abdulfatah on February 10, 2003
Programming Malpractice Explained: Justifying Chaos
wiredweird HALL OF FAME TOP 1000 REVIEWER on May 24, 2004
To fairly review this book, one must distinguish between the methodology it presents and the actual presentation. As to the presentation, the author attempts to win the reader over with emotional persuasion and pep talk rather than with facts and hard evidence. Stories of childhood and comradeship don't classify as convincing facts to me.
A single case study, the C3 project, is often referred to, but with no specific information (do note that the project was cancelled by the client after staying in development for far too long).
As to the method itself, it basically boils down to four core practices:
- Always have a customer available on site.
- Unit test before you code.
- Program in pairs.
- Forfeit detailed design in favor of incremental, daily releases and refactoring.
If you do the above, and you have excellent staff on your hands, then the book promises that you'll reap the benefits of faster development, less overtime, and happier customers. Of course, the book fails to point out that if your staff is all highly qualified people, then the project is likely to succeed no matter what methodology you use. I'm sure that anyone who has worked in the software industry for some time has noticed the sad state that most computer professionals are in nowadays.
However, assuming that you have all the topnotch developers that you desire, the outlined methodology is almost impossible to apply in real world scenarios. Having a customer always available on site would mean that the customer in question is probably a small, expendable fish in his organization and is unlikely to have any useful knowledge of its business practices.
Unit testing code before it is written means that one would have to have a mental picture of what one is going to write before writing it, which is difficult without upfront design. And maintaining such tests as the code changes would be a nightmare.
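For readers who have not seen the practice the reviewer is objecting to, here is a minimal sketch of "test first" using Python's standard unittest module; the function and test names are hypothetical, chosen only for illustration.

    # Test-first in miniature: the test below is written before parse_price()
    # exists, fails until the function is implemented, and then remains as a
    # regression check. All names here are hypothetical.
    import unittest

    def parse_price(text):
        """Convert a string like '$1,234.50' to a float (written after the test)."""
        return float(text.replace("$", "").replace(",", ""))

    class ParsePriceTest(unittest.TestCase):
        def test_strips_currency_symbol_and_commas(self):
            self.assertEqual(parse_price("$1,234.50"), 1234.50)

    if __name__ == "__main__":
        unittest.main()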
Programming in pairs all the time would assume that your topnotch developers are also sociable creatures, which is rarely the case, and even if they were, no one would be able to justify the practice in terms of productivity. I won't discuss why I think that abandoning upfront design is a bad practice; the whole idea is too ridiculous to debate.
Both book and methodology will attract fledgling developers with their promise of hacking as an acceptable software practice and a development universe revolving around the programmer. It's a cult, not a methodology, where the followers shall find salvation and 40-hour working weeks.
Experience is a great teacher, but only a fool would learn from it alone. Listen to what the opponents have to say before embracing change, and don't forget to take the proverbial grain of salt.
Two stars out of five for the presentation for being courageous and attempting to defy the standard practices of the industry. Two stars for the methodology itself, because it underlines several common sense practices that are very useful once practiced without the extremity.
eXtreme buzzwording
A customer on May 2, 2004
Maybe it's an interesting idea, but it's just not ready for prime time.
Parts of Kent's recommended practice - including aggressive testing and short integration cycle - make a lot of sense. I've shared the same beliefs for years, but it was good to see them clarified and codified. I really have changed some of my practice after reading this and books like this.
I have two broad kinds of problem with this dogma, though. First is the near-abolition of documentation. I can't defend 2000 page specs for typical kinds of development. On the other hand, declaring that the test suite is the spec doesn't do it for me either. The test suite is code, written for machine interpretation. Much too often, it is not written for human interpretation. Based on the way I see most code written, it would be a nightmare to reverse engineer the human meaning out of any non-trivial test code. Some systematic way of ensuring human intelligibility in the code, traceable to specific "stories" (because "requirements" are part of the bad old way), would give me a lot more confidence in the approach.
The second is the dictatorial social engineering that eXtremity mandates. I've actually tried the pair programming - what a disaster. The less said the better, except that my experience did not actually destroy any professional relationships. I've also worked with people who felt that their slightest whim was adequate reason to interfere with my work. That's what Beck institutionalizes by saying that any request made of me by anyone on the team must be granted. It puts me completely at the mercy of anyone walking by. The requisite bullpen physical environment doesn't work for me either. I find that the visual and auditory distraction make intense concentration impossible.
I find revival tent spirit of the eXtremists very off-putting. If something works, it works for reasons, not as a matter of faith. I find much too much eXhortation to believe, to go ahead and leap in, so that I will eXperience the wonderfulness for myself. Isn't that what the evangelist on the subway platform keeps saying? Beck does acknowledge unbelievers like me, but requires their exile in order to maintain the group-think of the X-cult.
Beck's last chapters note a number of exceptions and special cases where eXtremism may not work - actually, most of the projects I've ever encountered. There certainly is good in the eXtreme practice. I look to future authors to tease that good out from the positively destructive threads that I see interwoven.
A work of fiction
A customer on November 14, 2002
The book presents extreme programming. It is divided into three parts:
(1) The problem
(2) The solution
(3) Implementing XP.
The problem, as presented by the author, is that requirements change but current methodologies are not agile enough to cope with this. This results in the customer being unhappy. The solution is to embrace change and to allow the requirements to be changed. This is done by choosing the simplest solution, releasing frequently, and refactoring with the security of unit tests.
The basic assumption which underscores the approach is that the cost of change is not exponential but reaches a flat asymptote. If this is not the case, allowing change late in the project would be disastrous. The author does not provide data to back his point of view. On the other hand there is a lot of data against a constant cost of change (see for example discussion of cost in Code Complete). The lack of reasonable argumentation is an irremediable flaw in the book. Without some supportive data it is impossible to believe the basic assumption, nor the rest of the book. This is all the more important since the only project that the author refers to was cancelled before full completion.
Many other parts of the book are unconvincing. The author presents several XP practices. Some of them are very useful. For example unit tests are a good practice. They are however better treated elsewhere (e.g., Code Complete chapter on unit test). On the other hand some practices seem overkill. Pair programming is one of them. I have tried it and found it useful to generate ideas while prototyping. For writing production code, I find that a quiet environment is by far the best (see Peopleware for supportive data). Again the author does not provide any data to support his point.
This book suggests an approach aiming at changing software engineering practices. However the lack of supportive data makes it a work of fiction.
I would suggest reading Code Complete for code-level advice or Rapid Development for management-level advice.
Not Software Engineering.
Philip K. Ronzone on November 24, 2000
Any engineering discipline is based on solid reasoning and logic, not on blind faith. Unfortunately, most of this book attempts to convince you that Extreme Programming is better, based on the author's experiences. A lot of the principles are counterintuitive and the author exhorts you to just try them out and get enlightened. I'm sorry, but these kinds of things belong in infomercials, not in s/w engineering.
The part about "code is the documentation" is the scariest part. It's true that keeping the documentation up to date is tough on any software project, but to do away with documentation is the most ridiculous thing I have heard.
It's like telling people to cut off their noses to avoid colds. Yes, we are always in search of a better software process. Let me tell you that this book won't lead you there.
The "gossip magazine diet plans" style of programming.A customer on August 11, 2003This book reminds me of the "gossip magazine diet plans", you know, the vinegar and honey diet, or the fat-burner 2000 pill diet etc. Occasionally, people actually lose weight on those diets, but, only because they've managed to eat less or exercise more. The diet plans themselves are worthless. XP is the same - it may sometimes help people program better, but only because they are (unintentionally) doing something different. People look at things like XP because, like dieters, they see a need for change. Overall, the book is a decently written "fad diet", with ideas that are just as worthless.
Hackers! Salvation is nigh!!
Ian K. VINE VOICE on February 27, 2015
It's interesting to see the phenomenon of Extreme Programming happening in the dawn of the 21st century. I suppose historians can explain such a reaction as a truly conservative movement. Of course, serious software engineering practice is hard. Heck, documentation is a pain in the neck. And what programmer wouldn't love to have divine inspiration just before starting to write the latest web application and so enlightened by the Almighty, write the whole thing in one go, as if by magic? No design, no documentation, you and me as a pair, and the customer too. Sounds like a hacker's dream with "Imagine" as the soundtrack (sorry, John).
The Software Engineering struggle is over 50 years old and it's only logical to expect some resistance, from time to time. In the XP case, the resistance comes in one of its worst forms: evangelism. A fundamentalist cult, with very little substance, no proof of any kind, but then again if you don't have faith you won't be granted the gift of the mystic revelation. It's Gnosticism for Geeks.
Take it with a pinch of salt... well, maybe a sack of salt. If you can see through the B.S. that sells millions of dollars in books, consultancy fees, lectures, etc., you will recognise some common-sense ideas that are better explained, explored and detailed elsewhere.
Long have I hated this book
Anil Philip on March 24, 2005
Kent is an excellent writer. He does an excellent job of presenting an approach to software development that is misguided for anything but user interface code. The argument that user interface code must be gotten into the hands of users to get feedback is used to suggest that complex system code should not be "designed up front". This is simply wrong. For example, if you are going to deploy an application in the Amazon Cloud that you want to scale, you better have some idea of how this is going to happen. Simply waiting until your application falls over and fails is not an acceptable approach.
One of the things I despise the most about the software development culture is the mindless adoption of fads. Extreme programming has been adopted by some organizations like a religious dogma.
Engineering large software systems is one of the most difficult things that humans do. There are no silver bullets and there are no dogmatic solutions that will make the difficult simple.
not found - the silver bullet
Maybe I'm too cynical because I never got to work for the successful, whiz-kid companies; maybe this book wasn't written for me!
This book reminds me of Jacobsen's "Use Cases" book of the 1990s. Use Cases was all the rage, but after several years we slowly learned the truth: Use Cases does not deal with the architecture, which is a necessary and good foundation for any piece of software.
Similarly, this book seems to be spotlighting Testing and taking it to extremes.
'the test plan is the design doc'
Not true. The design doc encapsulates wisdom and insight; a picture that accurately describes the interactions of the lower-level software components is worth a thousand lines of code-reading.
Also present is an evangelistic fervor that reminds me of the rah-rah eighties' bestseller, "In Search Of Excellence" by Peters and Waterman. (Many people have since noted that most of the spotlighted companies of that book are bankrupt twenty five years later).
- In a room full of people with a bully supervisor (as I experienced in my last job at a major telco), innovation or good work is largely absent.
- Deploy daily? Are you kidding? Running through the hundreds of test cases in a large application takes several hours if not days. Not all testing can be automated.
- I have found the principle of "baby steps", one of the principles in the book, most useful in my career: it is the basis for prototyping iteratively. However, I heard it described in 1997 in a pep talk that the VP of our department gave to us at MCI. So I don't know who stole it from whom!
Lastly, I noted that the term 'XP' was used throughout the book, and the back cover has a blurb from an M$ architect. Was it simply coincidence that Windows shares the same name for its XP release? I wondered if M$ had sponsored part of the book as good advertising for Windows XP! :)
Oct 03, 2017 | discussion.theguardian.com
mlzarathustra, 21 Sep 2017 16:52
I agree with the basic point. We've seen this kind of tactic for some time now. Silicon Valley is turning into a series of micromanaged sweatshops (that's what "agile" is truly all about) with little room for genuine creativity, or even understanding of what that actually means. I've seen how impossible it is to explain to upper-level management how crappy cheap developers actually diminish productivity and value. All they see is that the requisition is filled for less money.
The bigger problem is that nobody cares about the arts, and as expensive as education is, nobody wants to carry around a debt on a skill that won't bring in the bucks. And smartphone-obsessed millennials have too short an attention span to fathom how empty their lives are, devoid of aesthetic depth as they are.
I can't draw a definite link, but I think algorithm fails, which are based on fanatical reliance on programmed routines as the solution to everything, are rooted in the shortage of education and cultivation in the arts.
Economics is a social science, and all this is merely a reflection of shared cultural values. The problem is, people think it's math (it's not) and therefore set in stone.
Jul 25, 2017 | www.paulgraham.com
CACM, December 1974
When Communications of the ACM began publication in 1959, the members of ACM'S Editorial Board made the following remark as they described the purposes of ACM'S periodicals [2]:
"If computer programming is to become an important part of computer research and development, a transition of programming from an art to a disciplined science must be effected."Such a goal has been a continually recurring theme during the ensuing years; for example, we read in 1970 of the "first steps toward transforming the art of programming into a science" [26]. Meanwhile we have actually succeeded in making our discipline a science, and in a remarkably simple way: merely by deciding to call it "computer science."Implicit in these remarks is the notion that there is something undesirable about an area of human activity that is classified as an "art"; it has to be a Science before it has any real stature. On the other hand, I have been working for more than 12 years on a series of books called "The Art of Computer Programming." People frequently ask me why I picked such a title; and in fact some people apparently don't believe that I really did so, since I've seen at least one bibliographic reference to some books called "The Act of Computer Programming."
In this talk I shall try to explain why I think "Art" is the appropriate word. I will discuss what it means for something to be an art, in contrast to being a science; I will try to examine whether arts are good things or bad things; and I will try to show that a proper viewpoint of the subject will help us all to improve the quality of what we are now doing.
One of the first times I was ever asked about the title of my books was in 1966, during the last previous ACM national meeting held in Southern California. This was before any of the books were published, and I recall having lunch with a friend at the convention hotel. He knew how conceited I was, already at that time, so he asked if I was going to call my books "An Introduction to Don Knuth." I replied that, on the contrary, I was naming the books after him . His name: Art Evans. (The Art of Computer Programming, in person.)
From this story we can conclude that the word "art" has more than one meaning. In fact, one of the nicest things about the word is that it is used in many different senses, each of which is quite appropriate in connection with computer programming. While preparing this talk, I went to the library to find out what people have written about the word "art" through the years; and after spending several fascinating days in the stacks, I came to the conclusion that "art" must be one of the most interesting words in the English language.
The Arts of Old
If we go back to Latin roots, we find ars, artis meaning "skill." It is perhaps significant that the corresponding Greek word was τεχνη , the root of both "technology" and "technique."
Nowadays when someone speaks of "art" you probably think first of "fine arts" such as painting and sculpture, but before the twentieth century the word was generally used in quite a different sense. Since this older meaning of "art" still survives in many idioms, especially when we are contrasting art with science, I would like to spend the next few minutes talking about art in its classical sense.
In medieval times, the first universities were established to teach the seven so-called "liberal arts," namely grammar, rhetoric, logic, arithmetic, geometry, music, and astronomy. Note that this is quite different from the curriculum of today's liberal arts colleges, and that at least three of the original seven liberal arts are important components of computer science. At that time, an "art" meant something devised by man's intellect, as opposed to activities derived from nature or instinct; "liberal" arts were liberated or free, in contrast to manual arts such as plowing (cf. [6]). During the middle ages the word "art" by itself usually meant logic [4], which usually meant the study of syllogisms.
Science vs. Art
The word "science" seems to have been used for many years in about the same sense as "art"; for example, people spoke also of the seven liberal sciences, which were the same as the seven liberal arts [1]. Duns Scotus in the thirteenth century called logic "the Science of Sciences, and the Art of Arts" (cf. [12, p. 34f]). As civilization and learning developed, the words took on more and more independent meanings, "science" being used to stand for knowledge, and "art" for the application of knowledge. Thus, the science of astronomy was the basis for the art of navigation. The situation was almost exactly like the way in which we now distinguish between "science" and "engineering."
Many authors wrote about the relationship between art and science in the nineteenth century, and I believe the best discussion was given by John Stuart Mill. He said the following things, among others, in 1843 [28]:
Several sciences are often necessary to form the groundwork of a single art. Such is the complication of human affairs, that to enable one thing to be done, it is often requisite to know the nature and properties of many things... Art in general consists of the truths of Science, arranged in the most convenient order for practice, instead of the order which is the most convenient for thought. Science groups and arranges its truths so as to enable us to take in at one view as much as possible of the general order of the universe. Art... brings together from parts of the field of science most remote from one another, the truths relating to the production of the different and heterogeneous conditions necessary to each effect which the exigencies of practical life require.
As I was looking up these things about the meanings of "art," I found that authors have been calling for a transition from art to science for at least two centuries. For example, the preface to a textbook on mineralogy, written in 1784, said the following [17]: "Previous to the year 1780, mineralogy, though tolerably understood by many as an Art, could scarce be deemed a Science."
According to most dictionaries "science" means knowledge that has been logically arranged and systematized in the form of general "laws." The advantage of science is that it saves us from the need to think things through in each individual case; we can turn our thoughts to higher-level concepts. As John Ruskin wrote in 1853 [32]: "The work of science is to substitute facts for appearances, and demonstrations for impressions."
It seems to me that if the authors I studied were writing today, they would agree with the following characterization: Science is knowledge which we understand so well that we can teach it to a computer; and if we don't fully understand something, it is an art to deal with it. Since the notion of an algorithm or a computer program provides us with an extremely useful test for the depth of our knowledge about any given subject, the process of going from an art to a science means that we learn how to automate something.
Artificial intelligence has been making significant progress, yet there is a huge gap between what computers can do in the foreseeable future and what ordinary people can do. The mysterious insights that people have when speaking, listening, creating, and even when they are programming, are still beyond the reach of science; nearly everything we do is still an art.
From this standpoint it is certainly desirable to make computer programming a science, and we have indeed come a long way in the 15 years since the publication of the remarks I quoted at the beginning of this talk. Fifteen years ago computer programming was so badly understood that hardly anyone even thought about proving programs correct; we just fiddled with a program until we "knew" it worked. At that time we didn't even know how to express the concept that a program was correct, in any rigorous way. It is only in recent years that we have been learning about the processes of abstraction by which programs are written and understood; and this new knowledge about programming is currently producing great payoffs in practice, even though few programs are actually proved correct with complete rigor, since we are beginning to understand the principles of program structure. The point is that when we write programs today, we know that we could in principle construct formal proofs of their correctness if we really wanted to, now that we understand how such proofs are formulated. This scientific basis is resulting in programs that are significantly more reliable than those we wrote in former days when intuition was the only basis of correctness.
The field of "automatic programming" is one of the major areas of artificial intelligence research today. Its proponents would love to be able to give a lecture entitled "Computer Programming as an Artifact" (meaning that programming has become merely a relic of bygone days), because their aim is to create machines that write programs better than we can, given only the problem specification. Personally I don't think such a goal will ever be completely attained, but I do think that their research is extremely important, because everything we learn about programming helps us to improve our own artistry. In this sense we should continually be striving to transform every art into a science: in the process, we advance the art.
Science and Art
Our discussion indicates that computer programming is by now both a science and an art, and that the two aspects nicely complement each other. Apparently most authors who examine such a question come to this same conclusion, that their subject is both a science and an art, whatever their subject is (cf. [25]). I found a book about elementary photography, written in 1893, which stated that "the development of the photographic image is both an art and a science" [13]. In fact, when I first picked up a dictionary in order to study the words "art" and "science," I happened to glance at the editor's preface, which began by saying, "The making of a dictionary is both a science and an art." The editor of Funk & Wagnall's dictionary [27] observed that the painstaking accumulation and classification of data about words has a scientific character, while a well-chosen phrasing of definitions demands the ability to write with economy and precision: "The science without the art is likely to be ineffective; the art without the science is certain to be inaccurate."
When preparing this talk I looked through the card catalog at Stanford library to see how other people have been using the words "art" and "science" in the titles of their books. This turned out to be quite interesting.
For example, I found two books entitled The Art of Playing the Piano [5, 15], and others called The Science of Pianoforte Technique [10], The Science of Pianoforte Practice [30]. There is also a book called The Art of Piano Playing: A Scientific Approach [22].
Then I found a nice little book entitled The Gentle Art of Mathematics [31], which made me somewhat sad that I can't honestly describe computer programming as a "gentle art." I had known for several years about a book called The Art of Computation , published in San Francisco, 1879, by a man named C. Frusher Howard [14]. This was a book on practical business arithmetic that had sold over 400,000 copies in various editions by 1890. I was amused to read the preface, since it shows that Howard's philosophy and the intent of his title were quite different from mine; he wrote: "A knowledge of the Science of Number is of minor importance; skill in the Art of Reckoning is absolutely indispensible."
Several books mention both science and art in their titles, notably The Science of Being and Art of Living by Maharishi Mahesh Yogi [24]. There is also a book called The Art of Scientific Discovery [11], which analyzes how some of the great discoveries of science were made.
So much for the word "art" in its classical meaning. Actually when I chose the title of my books, I wasn't thinking primarily of art in this sense, I was thinking more of its current connotations. Probably the most interesting book which turned up in my search was a fairly recent work by Robert E. Mueller called The Science of Art [29]. Of all the books I've mentioned, Mueller's comes closest to expressing what I want to make the central theme of my talk today, in terms of real artistry as we now understand the term. He observes: "It was once thought that the imaginative outlook of the artist was death for the scientist. And the logic of science seemed to spell doom to all possible artistic flights of fancy." He goes on to explore the advantages which actually do result from a synthesis of science and art.
A scientific approach is generally characterized by the words logical, systematic, impersonal, calm, rational, while an artistic approach is characterized by the words aesthetic, creative, humanitarian, anxious, irrational. It seems to me that both of these apparently contradictory approaches have great value with respect to computer programming.
Emma Lehmer wrote in 1956 that she had found coding to be "an exacting science as well as an intriguing art" [23]. H.S.M. Coxeter remarked in 1957 that he sometimes felt "more like an artist than a scientist" [7]. This was at the time C.P. Snow was beginning to voice his alarm at the growing polarization between "two cultures" of educated people [34, 35]. He pointed out that we need to combine scientific and artistic values if we are to make real progress.
Works of Art
When I'm sitting in an audience listening to a long lecture, my attention usually starts to wane at about this point in the hour. So I wonder, are you getting a little tired of my harangue about "science" and "art"? I really hope that you'll be able to listen carefully to the rest of this, anyway, because now comes the part about which I feel most deeply.
When I speak about computer programming as an art, I am thinking primarily of it as an art form, in an aesthetic sense. The chief goal of my work as educator and author is to help people learn how to write beautiful programs. It is for this reason I was especially pleased to learn recently [33] that my books actually appear in the Fine Arts Library at Cornell University. (However, the three volumes apparently sit there neatly on the shelf, without being used, so I'm afraid the librarians may have made a mistake by interpreting my title literally.)
My feeling is that when we prepare a program, it can be like composing poetry or music; as Andrei Ershov has said [9], programming can give us both intellectual and emotional satisfaction, because it is a real achievement to master complexity and to establish a system of consistent rules.
Furthermore when we read other people's programs, we can recognize some of them as genuine works of art. I can still remember the great thrill it was for me to read the listing of Stan Poley's SOAP II assembly program in 1958; you probably think I'm crazy, and styles have certainly changed greatly since then, but at the time it meant a great deal to me to see how elegant a system program could be, especially by comparison with the heavy-handed coding found in other listings I had been studying at the same time. The possibility of writing beautiful programs, even in assembly language, is what got me hooked on programming in the first place.
Some programs are elegant, some are exquisite, some are sparkling. My claim is that it is possible to write grand programs, noble programs, truly magnificent ones!
Taste and Style
The idea of style in programming is now coming to the forefront at last, and I hope that most of you have seen the excellent little book on Elements of Programming Style by Kernighan and Plauger [16]. In this connection it is most important for us all to remember that there is no one "best" style; everybody has his own preferences, and it is a mistake to try to force people into an unnatural mold. We often hear the saying, "I don't know anything about art, but I know what I like." The important thing is that you really like the style you are using; it should be the best way you prefer to express yourself.
Edsger Dijkstra stressed this point in the preface to his Short Introduction to the Art of Programming [8]:
It is my purpose to transmit the importance of good taste and style in programming, [but] the specific elements of style presented serve only to illustrate what benefits can be derived from "style" in general. In this respect I feel akin to the teacher of composition at a conservatory: He does not teach his pupils how to compose a particular symphony, he must help his pupils to find their own style and must explain to them what is implied by this. (It has been this analogy that made me talk about "The Art of Programming.")
Now we must ask ourselves, What is good style, and what is bad style? We should not be too rigid about this in judging other people's work. The early nineteenth-century philosopher Jeremy Bentham put it this way [3, Bk. 3, Ch. 1]:
Judges of elegance and taste consider themselves as benefactors to the human race, whilst they are really only the interrupters of their pleasure... There is no taste which deserves the epithet good, unless it be the taste for such employments which, to the pleasure actually produced by them, conjoin some contingent or future utility: there is no taste which deserves to be characterized as bad, unless it be a taste for some occupation which has a mischievous tendency.
When we apply our own prejudices to "reform" someone else's taste, we may be unconsciously denying him some entirely legitimate pleasure. That's why I don't condemn a lot of things programmers do, even though I would never enjoy doing them myself. The important thing is that they are creating something they feel is beautiful.
In the passage I just quoted, Bentham does give us some advice about certain principles of aesthetics which are better than others, namely the "utility" of the result. We have some freedom in setting up our personal standards of beauty, but it is especially nice when the things we regard as beautiful are also regarded by other people as useful. I must confess that I really enjoy writing computer programs; and I especially enjoy writing programs which do the greatest good, in some sense.
There are many senses in which a program can be "good," of course. In the first place, it's especially good to have a program that works correctly. Secondly it is often good to have a program that won't be hard to change, when the time for adaptation arises. Both of these goals are achieved when the program is easily readable and understandable to a person who knows the appropriate language.
Another important way for a production program to be good is for it to interact gracefully with its users, especially when recovering from human errors in the input data. It's a real art to compose meaningful error messages or to design flexible input formats which are not error-prone.
Another important aspect of program quality is the efficiency with which the computer's resources are actually being used. I am sorry to say that many people nowadays are condemning program efficiency, telling us that it is in bad taste. The reason for this is that we are now experiencing a reaction from the time when efficiency was the only reputable criterion of goodness, and programmers in the past have tended to be so preoccupied with efficiency that they have produced needlessly complicated code; the result of this unnecessary complexity has been that net efficiency has gone down, due to difficulties of debugging and maintenance.
The real problem is that programmers have spent far too much time worrying about efficiency in the wrong places and at the wrong times; premature optimization is the root of all evil (or at least most of it) in programming.
We shouldn't be penny wise and pound foolish, nor should we always think of efficiency in terms of so many percent gained or lost in total running time or space. When we buy a car, many of us are almost oblivious to a difference of $50 or $100 in its price, while we might make a special trip to a particular store in order to buy a 50 cent item for only 25 cents. My point is that there is a time and place for efficiency; I have discussed its proper role in my paper on structured programming, which appears in the current issue of Computing Surveys [21].
Less Facilities: More Enjoyment
One rather curious thing I've noticed about aesthetic satisfaction is that our pleasure is significantly enhanced when we accomplish something with limited tools. For example, the program of which I personally am most pleased and proud is a compiler I once wrote for a primitive minicomputer which had only 4096 words of memory, 16 bits per word. It makes a person feel like a real virtuoso to achieve something under such severe restrictions.
A similar phenomenon occurs in many other contexts. For example, people often seem to fall in love with their Volkswagens but rarely with their Lincoln Continentals (which presumably run much better). When I learned programming, it was a popular pastime to do as much as possible with programs that fit on only a single punched card. I suppose it's this same phenomenon that makes APL enthusiasts relish their "one-liners." When we teach programming nowadays, it is a curious fact that we rarely capture the heart of a student for computer science until he has taken a course which allows "hands on" experience with a minicomputer. The use of our large-scale machines with their fancy operating systems and languages doesn't really seem to engender any love for programming, at least not at first.
It's not obvious how to apply this principle to increase programmers' enjoyment of their work. Surely programmers would groan if their manager suddenly announced that the new machine will have only half as much memory as the old. And I don't think anybody, even the most dedicated "programming artists," can be expected to welcome such a prospect, since nobody likes to lose facilities unnecessarily. Another example may help to clarify the situation: Film-makers strongly resisted the introduction of talking pictures in the 1920's because they were justly proud of the way they could convey words without sound. Similarly, a true programming artist might well resent the introduction of more powerful equipment; today's mass storage devices tend to spoil much of the beauty of our old tape sorting methods. But today's film makers don't want to go back to silent films, not because they're lazy but because they know it is quite possible to make beautiful movies using the improved technology. The form of their art has changed, but there is still plenty of room for artistry.
How did they develop their skill? The best film makers through the years usually seem to have learned their art in comparatively primitive circumstances, often in other countries with a limited movie industry. And in recent years the most important things we have been learning about programming seem to have originated with people who did not have access to very large computers. The moral of this story, it seems to me, is that we should make use of the idea of limited resources in our own education. We can all benefit by doing occasional "toy" programs, when artificial restrictions are set up, so that we are forced to push our abilities to the limit. We shouldn't live in the lap of luxury all the time, since that tends to make us lethargic. The art of tackling miniproblems with all our energy will sharpen our talents for the real problems, and the experience will help us to get more pleasure from our accomplishments on less restricted equipment.
In a similar vein, we shouldn't shy away from "art for art's sake"; we shouldn't feel guilty about programs that are just for fun. I once got a great kick out of writing a one-statement ALGOL program that invoked an innerproduct procedure in such an unusual way that it calculated the mth prime number, instead of an innerproduct [19]. Some years ago the students at Stanford were excited about finding the shortest FORTRAN program which prints itself out, in the sense that the program's output is identical to its own source text. The same problem was considered for many other languages. I don't think it was a waste of time for them to work on this; nor would Jeremy Bentham, whom I quoted earlier, deny the "utility" of such pastimes [3, Bk. 3, Ch. 1]. "On the contrary," he wrote, "there is nothing, the utility of which is more incontestable. To what shall the character of utility be ascribed, if not to that which is a source of pleasure?"
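The FORTRAN entries themselves are not reproduced here, but the same self-reproducing trick can be sketched in two lines of Python; the example below is an added illustration, not one of the Stanford programs, and comments are omitted because any extra line would stop the output from matching the source.

    s = 's = %r\nprint(s %% s)'
    print(s % s)

Saved in a file by itself, this program prints exactly its own two lines.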
Providing Beautiful Tools
Another characteristic of modern art is its emphasis on creativity. It seems that many artists these days couldn't care less about creating beautiful things; only the novelty of an idea is important. I'm not recommending that computer programming should be like modern art in this sense, but it does lead me to an observation that I think is important. Sometimes we are assigned to a programming task which is almost hopelessly dull, giving us no outlet whatsoever for any creativity; and at such times a person might well come to me and say, "So programming is beautiful? It's all very well for you to declaim that I should take pleasure in creating elegant and charming programs, but how am I supposed to make this mess into a work of art?"
Well, it's true, not all programming tasks are going to be fun. Consider the "trapped housewife," who has to clean off the same table every day: there's not room for creativity or artistry in every situation. But even in such cases, there is a way to make a big improvement: it is still a pleasure to do routine jobs if we have beautiful things to work with. For example, a person will really enjoy wiping off the dining room table, day after day, if it is a beautifully designed table made from some fine quality hardwood.
Therefore I want to address my closing remarks to the system programmers and the machine designers who produce the systems that the rest of us must work with. Please, give us tools that are a pleasure to use, especially for our routine assignments, instead of providing something we have to fight with. Please, give us tools that encourage us to write better programs, by enhancing our pleasure when we do so.
It's very hard for me to convince college freshmen that programming is beautiful, when the first thing I have to tell them is how to punch "slash slash JOB equals so-and-so." Even job control languages can be designed so that they are a pleasure to use, instead of being strictly functional.
Computer hardware designers can make their machines much more pleasant to use, for example by providing floating-point arithmetic which satisfies simple mathematical laws. The facilities presently available on most machines make the job of rigorous error analysis hopelessly difficult, but properly designed operations would encourage numerical analysts to provide better subroutines which have certified accuracy (cf. [20, p. 204]).
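The IEEE 754 standard, adopted years after this talk, removed many of the anomalies Knuth had in mind, but even well-behaved floating-point arithmetic still violates simple mathematical laws such as associativity of addition, as a few lines of Python show.

    # Floating-point addition on IEEE 754 doubles is not associative:
    # regrouping the same three terms gives two different results.
    a, b, c = 0.1, 0.2, 0.3
    print((a + b) + c)                  # 0.6000000000000001
    print(a + (b + c))                  # 0.6
    print((a + b) + c == a + (b + c))   # False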
Let's consider also what software designers can do. One of the best ways to keep up the spirits of a system user is to provide routines that he can interact with. We shouldn't make systems too automatic, so that the action always goes on behind the scenes; we ought to give the programmer-user a chance to direct his creativity into useful channels. One thing all programmers have in common is that they enjoy working with machines; so let's keep them in the loop. Some tasks are best done by machine, while others are best done by human insight; and a properly designed system will find the right balance. (I have been trying to avoid misdirected automation for many years, cf. [18].)
Program measurement tools make a good case in point. For years, programmers have been unaware of how the real costs of computing are distributed in their programs. Experience indicates that nearly everybody has the wrong idea about the real bottlenecks in his programs; it is no wonder that attempts at efficiency go awry so often, when a programmer is never given a breakdown of costs according to the lines of code he has written. His job is something like that of a newly married couple who try to plan a balanced budget without knowing how much the individual items like food, shelter, and clothing will cost. All that we have been giving programmers is an optimizing compiler, which mysteriously does something to the programs it translates but which never explains what it does. Fortunately we are now finally seeing the appearance of systems which give the user credit for some intelligence; they automatically provide instrumentation of programs and appropriate feedback about the real costs. These experimental systems have been a huge success, because they produce measurable improvements, and especially because they are fun to use, so I am confident that it is only a matter of time before the use of such systems is standard operating procedure. My paper in Computing Surveys [21] discusses this further, and presents some ideas for other ways in which an appropriate interactive routine can enhance the satisfaction of user programmers.
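The measurement tools Knuth describes are now commonplace. As one minimal modern illustration (not one of the experimental systems he refers to), Python ships with a deterministic profiler that reports a per-function breakdown of where the time actually went.

    # Run a program under the profiler to get a per-function cost breakdown
    # instead of guessing where the bottleneck is.
    import cProfile

    def slow_sum(n):
        total = 0
        for i in range(n):
            total += i * i
        return total

    def fast_sum(n):
        return sum(i * i for i in range(n))

    def main():
        slow_sum(200_000)
        fast_sum(200_000)

    if __name__ == "__main__":
        cProfile.run("main()")  # prints ncalls, tottime and cumtime per function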
Language designers also have an obligation to provide languages that encourage good style, since we all know that style is strongly influenced by the language in which it is expressed. The present surge of interest in structured programming has revealed that none of our existing languages is really ideal for dealing with program and data structure, nor is it clear what an ideal language should be. Therefore I look forward to many careful experiments in language design during the next few years.
Summary
To summarize: We have seen that computer programming is an art, because it applies accumulated knowledge to the world, because it requires skill and ingenuity, and especially because it produces objects of beauty. A programmer who subconsciously views himself as an artist will enjoy what he does and will do it better. Therefore we can be glad that people who lecture at computer conferences speak about the state of the Art .
References
1. Bailey, Nathan. The Universal Etymological English Dictionary. T. Cox, London, 1727. See "Art," "Liberal," and "Science."
2. Bauer, Walter F., Juncosa, Mario L., and Perlis, Alan J. ACM publication policies and plans. J. ACM 6 (Apr. 1959), 121-122.
3. Bentham, Jeremy. The Rationale of Reward. Trans. from Theorie des peines et des recompenses, 1811, by Richard Smith, J. & H. L. Hunt, London, 1825.
4. The Century Dictionary and Cyclopedia 1. The Century Co., New York, 1889.
5. Clementi, Muzio. The Art of Playing the Piano. Trans. from L'art de jouer le pianoforte by Max Vogrich. Schirmer, New York, 1898.
6. Colvin, Sidney. "Art." Encyclopaedia Britannica, eds 9, 11, 12, 13, 1875-1926.
7. Coxeter, H. S. M. Convocation address, Proc. 4th Canadian Math. Congress, 1957, pp. 8-10.
8. Dijkstra, Edsger W. EWD316: A Short Introduction to the Art of Programming. T. H. Eindhoven, The Netherlands, Aug. 1971.
9. Ershov, A. P. Aesthetics and the human factor in programming. Comm. ACM 15 (July 1972), 501-505.
10. Fielden, Thomas. The Science of Pianoforte Technique. Macmillan, London, 1927.
11. Gore, George. The Art of Scientific Discovery. Longmans, Green, London, 1878.
12. Hamilton, William. Lectures on Logic 1. Wm. Blackwood, Edinburgh, 1874.
13. Hodges, John A. Elementary Photography: The "Amateur Photographer" Library 7. London, 1893. Sixth ed, revised and enlarged, 1907, p. 58.
14. Howard, C. Frusher. Howard's Art of Computation and golden rule for equation of payments for schools, business colleges and self-culture .... C.F. Howard, San Francisco, 1879.
15. Hummel, J.N. The Art of Playing the Piano Forte. Boosey, London, 1827.
16. Kernighan B.W., and Plauger, P.J. The Elements of Programming Style. McGraw-Hill, New York, 1974.
17. Kirwan, Richard. Elements of Mineralogy. Elmsly, London, 1784.
18. Knuth, Donald E. Minimizing drum latency time. J. ACM 8 (Apr. 1961), 119-150.
19. Knuth, Donald E., and Merner, J.N. ALGOL 60 confidential. Comm. ACM 4 (June 1961), 268-272.
20. Knuth, Donald E. Seminumerical Algorithms: The Art of Computer Programming 2. Addison-Wesley, Reading, Mass., 1969.
21. Knuth, Donald E. Structured programming with go to statements. Computing Surveys 6 (Dec. 1974), pages in makeup.
22. Kochevitsky, George. The Art of Piano Playing: A Scientific Approach. Summy-Birchard, Evanston, Ill., 1967.
23. Lehmer, Emma. Number theory on the SWAC. Proc. Symp. Applied Math. 6, Amer. Math. Soc. (1956), 103-108.
24. Mahesh Yogi, Maharishi. The Science of Being and Art of Living. Allen & Unwin, London, 1963.
25. Malevinsky, Moses L. The Science of Playwriting. Brentano's, New York, 1925.
26. Manna, Zohar, and Pnueli, Amir. Formalization of properties of functional programs. J. ACM 17 (July 1970), 555-569.
27. Marckwardt, Albert H. Preface to Funk and Wagnall's Standard College Dictionary. Harcourt, Brace & World, New York, 1963, vii.
28. Mill, John Stuart. A System of Logic, Ratiocinative and Inductive. London, 1843. The quotations are from the introduction, § 2, and from Book 6, Chap. 11 (12 in later editions), § 5.
29. Mueller, Robert E. The Science of Art. John Day, New York, 1967.
30. Parsons, Albert Ross. The Science of Pianoforte Practice. Schirmer, New York, 1886.
31. Pedoe, Daniel. The Gentle Art of Mathematics. English U. Press, London, 1953.
32. Ruskin, John. The Stones of Venice 3. London, 1853.
33. Salton, G.A. Personal communication, June 21, 1974.
34. Snow, C.P. The two cultures. The New Statesman and Nation 52 (Oct. 6, 1956), 413-414.
35. Snow, C.P. The Two Cultures: and a Second Look. Cambridge University Press, 1964.
Copyright 1974, Association for Computing Machinery, Inc. General permission to republish, but not for profit, all or part of this material is granted provided that ACM's copyright notice is given and that reference is made to the publication, to its date of issue, and to the fact that reprinting privileges were granted by permission of the Association for Computing Machinery.
May 17, 2017 | theregister.co.uk
Comment "It doesn't matter whether a cat is white or black, as long as it catches mice," according to Chinese revolutionary Deng Xiaoping.While Deng wasn't referring to anything nearly as banal as IT projects (he was of course talking about the fact it doesn't matter whether a person is a revolutionary or not, as long as he or she is efficient and capable), the same principle could apply.
A fixation on the suppliers, technology or processes ultimately doesn't matter. It's the outcomes, stupid. That might seem like a blindingly obvious point, but it's one worth repeating.
Or as someone else put it to me recently in reference to the huge overspend on a key UK programme behind courts digitisation which we recently revealed: "Who gives a toss if it's agile or not? It just needs to work."
If you're going to do it, do it right
I'm not dismissing the benefits of this particular methodology, but in the case of the Common Platform Programme, it feels like the misapplication of agile was worse than not doing it at all.
Just to recap: the CPP was signed off around 2013, with the intention of creating a unified platform across the criminal justice system to allow the Crown Prosecution Service and courts to more effectively manage cases.
By cutting out duplication of systems, it was hoped to save buckets of cash and make the process of case management across the criminal justice system far more efficient.
Unlike the old projects of the past, this was a great example of the government taking control and doing it themselves. Everything was going to be delivered ahead of time and under budget. Trebles all round!
But as Lucy Liu's O-Ren Ishii told Uma Thurman's character in Kill Bill: "You didn't think it was gonna be that easy, did you?... Silly rabbit."
According to sources, alarm bells were soon raised over the project's self-styled "innovative use of agile development principles". It emerged that the programme was spending an awful lot of money for very little return. Attempts to shut it down were themselves shut down.
The programme carried on at full steam and by 2014 it was ramping up at scale. According to sources, hundreds of developers were employed on the programme at huge day rates, with large groups of so-called agile experts overseeing the various aspects of the programme.
CPP cops a plea
Four years since it was first signed off, and what are the things we can point to from the CPP? An online make-a-plea programme which allows people to plead guilty or not guilty to traffic offences; a digital markup tool for legal advisors to record case results in court, which is being tested by magistrates courts in Essex; and the Magistrates Rota.
Multiple insiders have said the rest that we have to show for hundreds of millions of taxpayers' cash is essentially vapourware. When programme director Loveday Ryder described the project as a "once-in-a-lifetime opportunity" to modernise the criminal justice system, it wasn't clear then that she meant the programme would itself take an actual lifetime.
Of course the definition of agile is that you are able to move quickly and easily. So some might point to the outcomes of this programme as proof that it was never really about that.
One source remarked that it really doesn't matter if you call something agile or not, "If you can replace agile with constantly talking and communicating then fine, call it agile." He also added: "This was one of the most waterfall programmes in government I've seen."
What is most worrying about this programme is it may not be an isolated example. Other organisations and departments may well be doing similar things under the guise of "agile". I'm no expert in project management, but I'm pretty sure it isn't supposed to be making it up as you go along, and constantly changing the specs and architecture.
Ultimately who cares if a programme is run via a system integrator, multiple SMEs, uses a DevOps methodology, is built in-house or deployed using off-the-shelf, as long as it delivers good value. No doubt there are good reasons for using any of those approaches in a number of different circumstances.
Government still spends an outrageous amount of money on IT, upwards of £16bn a year. So as taxpayers it's a simple case of wanting them to "show me the money". Or to misquote Deng, at least show us some more dead mice. ®
Re: 'What's Real and What's for Sale'... (Dogbowl)
So agile means "constantly adapting"? Read: constantly bouncing from one fuckup to the next, paddling like hell to keep up, constantly firefighting whilst going down slowly like the Titanic?
That's how I read it.
Re: 'What's Real and What's for Sale'... (PatientOne)
Ha! About 21 years back, working at Racal in Bracknell on a military radio project, we had a 'round-trip-OMT' CASE tool that did just that. It even generated documentation from the code, so as you added classes and methods the CASE tool generated the design document. Also, if the nightly build failed, it would email the code author.
I have also worked on agile for UK gov projects a few years back, when it was mandated for all new projects, and I was at first dead keen. However, it quickly became obvious that the lack of requirements, specifications etc. made testing a living nightmare. Changes asked for by the customer were grafted onto what became a baroque mass of code. I can't see how Agile is a good idea except for the smallest trivial projects.
Re: 'What's Real and What's for Sale'... (Doctor Syntax)
"Technically 'agile' just means you produce working versions frequently and iterate on that."
It's more to do with priorities: On time, on budget, to specification: Put these in the order of which you will surrender if the project hits problems.
Agile focuses on On time. What is delivered is hopefully to specification, and within budget, but one or both of those could be surrendered in order to get something out On time. It's just project management 101 with a catchy name, and in poorly managed 'agile' developments you find padding to fit the usual 60/30/10 rule. Then the management discard the padding and insist the project can be completed in a reduced time as a result, thereby breaking the rules of 'agile' development (insisting it's on spec, under time and under budget, but it's still 'agile'...).
Re: 'What's Real and What's for Sale'... (Doctor Syntax)
"Usually I check something(s) in every day, for the most major things it may take a week, but the goal is always to get it in and working so it can be tested."
The question is - is that for software that's still in development or software that's deployed in production? If it's the latter and your "something" just changes its data format you're going to be very unpopular with your users. And that's just for ordinary files. If it requires frequent re-orgs of an RDBMS then you'd be advised to not go near any dark alley where your DBA might be lurking.
Software works on data. If you can't get the design of that right early you're going to be carrying a lot of technical debt in terms of backward compatibility or you're going to impose serious costs on your users for repeatedly bringing existing data up to date.
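A minimal sketch of the backward-compatibility cost being described (my own illustration, with a made-up two-version record format): once real data exists in the old shape, every later reader has to carry a migration step for it.

    # Records carry a schema version; old versions are migrated forward on read.
    import json

    def upgrade(record: dict) -> dict:
        version = record.get("schema_version", 1)
        if version == 1:
            # v1 stored a single "name" field; v2 splits it into two fields.
            first, _, last = record.pop("name", "").partition(" ")
            record["first_name"], record["last_name"] = first, last
            version = 2
        record["schema_version"] = version
        return record

    old = json.loads('{"schema_version": 1, "name": "Ada Lovelace"}')
    print(upgrade(old))   # prints the record in its v2 form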
Re: 'What's Real and What's for Sale'... (FozzyBear)
"On time, on budget, to specification: Put these in the order of which you will surrender if the project hits problems."
In the real world it's more likely to be a trade-off of how much of each to surrender.
Re: 'What's Real and What's for Sale'... (Dagg)
I was told in my earlier years by a developer.
For any project you can have it
- Cheap, (On Budget)
- Good, (On spec)
- Quick.( On time)
Pick two of the three and only two. It doesn't matter which way you pick; you're fucked on the third. Doesn't matter about methodology, doesn't matter about requirements or project manglement. You are screwed on the third, and the great news is that the level of the reaming you get scales with the size of the project.
After almost 20 years in the industry this has held true.
Re: 'What's Real and What's for Sale'... (Archtech)
"Technically 'agile' just means you produce working versions frequently and iterate on that."
No, technically agile means having no clue as to what is required and to evolve the requirements as you build. All well and good if you have a dicky little web site but if you are on a very / extremely large project with fixed time frame and fixed budget you are royally screwed trying to use agile as there is no way you can control scope.
Hell under agile no one has any idea what the scope is!
Re: Government still spends an outrageous amount of money on IT (oldtaku)
I hope you were joking. If not, try reading the classic book "The Mythical Man-Month".
'Agile' means nothing at this point. Unless it means terrible software. (Adrian 4)
At this point, courtesy of Exxxxtr3333me Programming and its spawn, 'agile' just means 'we don't want to do any design, we don't want to do any documentation, and we don't want to do any acceptance testing because all that stuff is annoying.' Everything is 'agile', because that's the best case for terrible lazy programmers, even if they're using a completely different methodology.
I firmly believe in the basics of 'iterate working versions as often as possible'. But why sell ourselves short by calling it agile when we actually design it, document it, and use testing beyond unit tests?
Yes, yes, you can tell me what 'agile' technically means, and I know that design and documentation and QA are not excluded, but in practice even the most waterfall of waterfall call themselves agile (like Kat says), and from hard experience people who really push 'agile agile agile' as their thing are the worst of the worst terrible coders who just slam crap together with all the finesse and thoughtfulness of a Bangalore outsourcer.
It's like any exciting new methodology: same shit, different name. In this case, one that allows you to pretend the tiny attention-span of a panicking project manager is a good thing.
(jamie m)
When someone shows me they've learned the lessons of Brooks's tar pit, I'll be interested to see how they did it. Until then, it's all talk.
25% Agile (kmac499)
Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.
Working software is the primary measure of progress.
Re: 25% Agile (Doctor Syntax)
From Jamie M:
Working software is the primary measure of progress.
A brilliant few-word summary which should be scrawled on the wall of every IT project manager's office in Foot High Letters.
I've lived through SSADM, RAD, DSDM, Waterfall, Boehm Spirals, Extreme Programming and probably a few others.
They are ALL variations on a theme. The only thing they have in common is the successful ones left a bunch of 0's and 1's humming away in a lump of silicon doing something useful.
Re: 25% Agile (Charlie Clark)
"Working software is the primary measure of progress."
What about training the existing users, having it properly documented for the users of the future, briefing support staff, having proper software documentation, or at least self documenting code, for those who will have to maintain it and ensuring it doesn't disrupt the data from the previous release? Or do we just throw code over the fence and wave goodbye to it?
Re: Limits of pragmatism (Anonymous Coward)
In France "naviguer à vue" (navigating by sight, i.e. making it up as you go) is pejorative.
Software development in France*, which also gave us "ah yes, it may work in practice, but does it work in theory?"
The story is about the highly unusual cost overrun of a government project. Never happened dans l'hexagone (i.e. in France)? Because it seems to happen pretty much everywhere else with relentless monotony, because politicians are fucking awful project managers.
* FWIW I have a French qualification.
Agile only works if all stakeholders agree on an outcome (Anonymous Coward)
For a project that is a huge change of operating your organisation it is unlikely that you will be able to deliver, at least the initial parts of your project, in an agile way. Once outcomes are known at a high level stakeholders have something to cling onto when they are asked what they need, if indeed they exist yet. (trying to ask for requirements for a stakeholder that doesn't exist yet is tough).
Different methods have their own issues, but in this case I would have expected failure to be reasonably predictable.
You won't have much to show for it, as they shouldn't, at least, have started coding to a business model that itself needs defining. This is predictable, and overall means that no one agrees what the business should look like, let alone how a vendor-delivered software solution should support it.
I have a limited amount of sympathy for the provider for this as it will be beyond their control (limited as they are an expensive government provider after all)
This is a disaster caused by the poor management in UKGOV and the vendor should have dropped and ran well before this.
(Doctor Syntax)
I'm a one-man, self-employed business - I do some very complex sites - but if they don't work I don't get paid. If they spew ugly bugs I get panicked emails from unhappy clients.
So I test after each update, and comment code and add features to make my life easy when it comes to debugging. Woo, it even sends me emails for some bugs.
I'm in agreement with the guy above - a dozen devs, a page layout designer or two, some databases. One manager to co-ordinate and no bloody jargon.
There's a MASSIVE efficiency to small teams but all the members need to be on top of their game.
"I'm in agreement with the guy above - a dozen devs, a page layout designer or two, some databases. One manager to co-ordinate and no bloody jargon." (Munchausen's proxy)
Don't forget a well-defined, soluble problem. That's in your case, where you're paid by results. If you're paid by billable hours it's a positive disadvantage.
Agile Expertise? (Zippy's Sausage Factory)
"I'm no expert in project management, but I'm pretty sure it isn't supposed to be making it up as you go along, and constantly changing the specs and architecture."
I'm no expert either, but I honestly thought that was quite literally the definition of agile. (maybe disguised with bafflegab, but semantically equivalent)
Sounds like what I have said for a while... (a_yank_lurker)
The "strategy boutiques" saw "agile" becoming popular and they now use it as a buzzword.
These days, I put it in my "considered harmful" bucket, along with GOTO, teaching people to program using BASIC, and "upgrading" to Office 2016*.
* Excel, in particular.
Buzzword Bingo (Doctor Syntax)
All too often sound development ideas are perverted. The concepts are sound, but the mistake is to view each as the perfect panacea to produce bug-free, working code. Each has its purpose and scope of effectiveness. What should be understood and applied is not a precise cookbook method but principles - Agile focuses on communication between groups and on ensuring all are on the same page. Others focus more on low-level development (test-driven development, e.g.), but one can lose sight of the goal, which is to use an appropriate tool set to make sure quality code is produced. Again, whether the code is being tested, whether the tests are correct, whether junior developers are being mentored, and whether developers are working together appropriately for the nature of the project are the issues to be addressed, not the precise formalism of the insultants.
Uncle Bob Martin has noted that one of the problems the formalisms try to address is the large number of junior developers who need proper mentoring, training, etc. in real-world situations. He noted that in the old days many IT pros were mid-level professionals who wandered over to IT, and many of the formalisms so beloved by the insultants were concepts they did naturally. Cross-functional team meetings - check, mentor - check, use appropriate tests - check, etc. These are professional procedures common to other fields and were ingrained in their mindset and habits.
It's worth remembering that it's the disasters that make the news. I've worked on a number of public sector projects which were successful. After a few years of operation, however, the contract period was up and the whole service was put out to re-tender.* At that point someone else got the contract, so the original work on which the successful delivery was based got scrapped. (goldcd)
* With some very odd results, it has to be said, but that's a different story.
(Notas Badoff)
My personal niggle is that a team has "velocity" rather than "speed" - and that seems to be a somewhat deliberate and disingenuous selection. The team should have speed; the project ultimately has a measurable velocity, calculated by working out how much of the speed was wasted in the wrong/right direction.
Anyway, off to get my beauty sleep, so I can feed the backlog tomorrow with anything within my reach.
Re: I like agile (bfwebster)
Wanted to give you an up-vote, as "velocity" vs. "speed" is exactly the sleight of hand that infuriates me. We do want eventual progress achieved, as in "distance towards goal in mind", right?
Unfortunately my reading the definitions and checking around leads me to think that you've got the words 'speed' and 'velocity' reversed above. Where's that nit-picker's icon....
I literally less than an hour ago gave my CS 428 ("Software Engineering") class here at Brigham Young University (Provo, Utah, USA) my final lecture for the semester, which included this slide:
(Pete 2)
Process is not a panacea or a crutch or a silver bullet. Methodologies only work as well as the people using them. Any methodology can be distorted to give the answer that upper management wants (instead of reality).
When adopting a new methodology:
- Do one or two pilot projects
- Do one non-critical real-world project
- Then, and only then, consider using it for a critical project
Understand the strengths and weaknesses of a given methodology before starting a project with it. Also, make sure a majority of team members have successfully completed a real-world project using that methodology.
Save some fun for us!
> under the guise of "agile"? I'm no expert in project management, but I'm pretty sure it isn't supposed to be making it up as you go along, and constantly changing the specs and architecture.
So why should the developers have all the fun? Why can't the designers and architects be "agile", too? Isn't constantly changing stuff all part of the "agile" way?
Dec 26, 2016 | it.slashdot.org
(threatpost.com) Posted by EditorDavid on Saturday December 17, 2016 @07:34PM from the does-code-reuse-endanger-secure-software-development dept. msm1267 quotes ThreatPost: The amount of insecure software tied to reused third-party libraries and lingering in applications long after patches have been deployed is staggering. It's a habitual problem perpetuated by developers failing to vet third-party code for vulnerabilities, and some repositories taking a hands-off approach with the code they host. This scenario allows attackers to target one overlooked component flaw used in millions of applications instead of focusing on a single application security vulnerability. The real-world consequences have been demonstrated in the past few years with the Heartbleed vulnerability in OpenSSL, Shellshock in GNU Bash, and a deserialization vulnerability exploited in a recent high-profile attack against the San Francisco Municipal Transportation Agency. These are three instances where developers reuse libraries and frameworks that contain unpatched flaws in production applications... According to security experts, the problem is two-fold. On one hand, developers use reliable code that at a later date is found to have a vulnerability. Second, insecure code is used by a developer who doesn't exercise due diligence on the software libraries used in their project.
That seems like a one-sided take, so I'm curious what Slashdot readers think. Does code reuse endanger secure software development?
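As a concrete illustration of the "due diligence" the summary asks for (a sketch of my own, assuming network access; the OSV.dev query endpoint and response shape are as publicly documented and may change), one can check every installed Python package against a public vulnerability database:

    # List installed distributions and query the OSV.dev advisory database.
    import json
    import urllib.request
    from importlib import metadata

    def known_vulns(name: str, version: str) -> list[str]:
        query = json.dumps({
            "package": {"name": name, "ecosystem": "PyPI"},
            "version": version,
        }).encode()
        req = urllib.request.Request(
            "https://api.osv.dev/v1/query",
            data=query,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req, timeout=10) as resp:
            return [v.get("id", "?") for v in json.load(resp).get("vulns", [])]

    for dist in metadata.distributions():
        name, version = dist.metadata["Name"], dist.version
        ids = known_vulns(name, version)
        if ids:
            print(f"{name}=={version}: {', '.join(ids)}")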
Dec 26, 2016 | ask.slashdot.org
(daftcode.pl) Posted by EditorDavid on Sunday November 27, 2016 @11:30PM from the TDD-vs-HDD dept. marekkirejczyk, the VP of Engineering at development shop Daftcode, shares a warning about hype-driven development: Someone reads a blog post, it's trending on Twitter, and we just came back from a conference where there was a great talk about it. Soon after, the team starts using this new shiny technology (or software architecture design paradigm), but instead of going faster (as promised) and building a better product, they get into trouble. They slow down, get demotivated, have problems delivering the next working version to production.
Describing behind-schedule teams that "just need a few more days to sort it all out," he blames all the hype surrounding React.js, microservices, NoSQL, and that "Test-Driven Development Is Dead" blog post by Ruby on Rails creator David Heinemeier Hansson. ("The list goes on and on... The root of all evil seems to be social media.") Does all this sound familiar to any Slashdot readers? Has your team ever succumbed to hype-driven development?
In Fred Brooks' Mythical Man Month, he states:
Brooks's Law: adding manpower to a late software project makes it later.
In the book he notes that it is an "outrageous oversimplification", but most laws of this kind are. The trouble is that there are important exceptions that many people don't take the time to consider when using Brooks's law to justify something. I hinted at them in chapter 2 of The Art of Project Management, but here's a full explanation of the notable caveats I've seen.
- It depends who the manpower is. The law assumes that all added manpower is equal, which is not true. Given the choice of adding a good programmer, who knows the code base and is friends with half the team, I'd consider it. Knowledge of the code and good relationships with the team offset the two key reasons for Brooks law (ramp-up and communication overhead). And if that individual has prior experience coming in late to a project, all the better. Some people are capable of playing the SWAT team / Ninja role, and that skill is orthogonal to programming talent: some are simply better at getting complex work done on unfamiliar terrain with a minimum of disruption.
- Some teams can absorb more change than others. Some teams are more resilient to change. They can absorb communication overhead and teaching duties better than others can. It's well known that there is a wide range among programmers in their speed, but it's also true there is a wide range for communication and teaching skills. Even if all teams pay a price for adding people, that price will vary widely and is not a fixed cost based on the number of people alone (e.g. n*n).
- There are worse things than being late. The conclusion most people reach is that, given Brooks law, you should never add people. But it doesn't say that: it just says you'll be late. That can be OK if you also get higher quality (not a guarantee of course). Time is not the only success criterion. If the person you're adding fills an important hole in the team, being later might be worth it.
- There are different ways to add manpower. The sad but popular approach is to throw people in without much explanation and let everyone figure it out for themselves. But if the manager clarifies why Sally and Rupert are joining, and defines good roles for them, with input from the team, they'll be set up to make a smooth transition. This can also include assigning them a mentor who is capable and motivated to be their primary point of contact for ramp-up issues. The more experience everyone has with mid-stream personnel changes, the better. Some tasks will always be better suited to assigning new people, and it's the manager's task to figure out which ones those are.
- It depends on why the project was late to begin with. If the project was late because of inexperience or poor team practices, adding a small number of experienced people with those missing skills, and directing others to follow them, can eliminate some of those inefficiencies: not all knowledge or skill transfer requires long lead times. Therefore the added value of certain manpower is not just 1 additional programmer unit, but 1+, as they have a positive effect on the productivity of others. But more importantly, there are many reasons for being late that have nothing to do with the programming team personnel: no amount of programming staff modifications will resolve the psychiatric needs of team leaders or the dysfunctions of executives.
- Adding people can be combined with other management action. Contrary to popular belief, management is not a turn-based strategy game. You can do more than one thing at the same time. For example, you can remove someone at the same time you're adding manpower. This does increase volatility, but if you're removing your worst, and most disruptive, programmer and adding one of your best, it can be a reasonable choice. Managers also have the option to cut or reorganize work across the project, improve tools and equipment, throw a social event to accelerate team relationships, etc. There are always ways for managers to provide cover fire. These all have disruption costs, but depending on the length of the project they might be offset.
Disclaimer: I'm sure there are specific cases where these exceptions do not apply. I am not suggesting the abandonment of Brooks's law, or religious adherence to the above. All additional examinations or applications of Brooks's law, or these exceptions, are welcome.
See also: the social contexts of open source software, from the Cathedral and the Bazaar.
Brooks looked specifically at software development projects, but I believe his conclusion holds true across all projects. With the proliferation of matrix style organizations and the lack of strategically focused planning, we've created a corporate culture that reacts to every crisis by sending lots of email and having lots of meetings in the hope that something positive must happen due to all this communication. The secret is, it won't.
Lots of communication only exacerbates the problem. What we need to do is remember that communication does not solve business problems; strategic planning and productive action solves problems.
MindBy
The Myth of Multitasking by Dave Crenshaw.
In his book he uses a simple example that is very convincing. Simply take a sheet of paper and draw a line across the page. Above the line you will write the alphabet, below it the numbers 1 to 26. The kicker is that you write one letter and then one number, so you'd write 'A' above the line and '1' below the line, then you'd write 'B' above the line and '2' below the line, oscillating back and forth until done. Time this exercise. Next, time yourself writing just the alphabet and then just the numbers 1 to 26 in serial, without switching back and forth. If you're like most people you'll find that it takes about twice as long to do the first exercise as it does the second. So think about that. You did the same amount of work in both cases but it took you twice as long. The only difference was context switching. Unfortunately, context switching has become the "norm" in today's offices.
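The arithmetic behind the anecdote can be sketched with a toy model (my own illustration, not Crenshaw's; the costs are invented and only the shape of the result matters): charge a fixed price per unit of work plus a fixed price for every switch between the two tasks, and the interleaved version comes out roughly twice as expensive.

    # Toy model: each unit of work costs `work_cost`, and every switch between
    # the two tasks adds `switch_cost`. Interleaving 26 letters with 26 numbers
    # forces a switch after almost every unit; doing them in series needs one.
    def total_time(units_per_task: int, work_cost: float, switch_cost: float,
                   interleaved: bool) -> float:
        work = 2 * units_per_task * work_cost
        switches = (2 * units_per_task - 1) if interleaved else 1
        return work + switches * switch_cost

    for mode in (False, True):
        t = total_time(26, work_cost=1.0, switch_cost=1.0, interleaved=mode)
        print("interleaved" if mode else "sequential", t)   # 53.0 vs 103.0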
People forget that multitasking isn't about processing information in parallel, which we just proved was impossible. Multitasking originally meant juggling or managing multiple tasks during a given period of time, not at the same time. Somehow the definition was hijacked to mean something more akin to parallel processing. Now people brag about how many conversations they can carry on at once, but based on what we now know, what they're really saying is "Look at me, I'm getting less work done". Probably something you don't want your boss to know.
The other thing to remember is that every communication channel has its own place and purpose in the world, and changing conventions can be unproductive. Email was born to replace memos and letters. No one expected you to reply to a memo or letter within minutes of it being issued. Your memos would gather in your mailbox and you'd pick them up once a day and work through the replies, just like you still pick up your mail once a day from the Post Office. What's changed is the use paradigm and expectations. Now, in this world of cell phones and computers that has people linked in everywhere, all the time, we expect immediate responses to our emails. It's just not realistic.
Daemon News
In 1950 Alex Bavelas at MIT (1) was interested in the evolution of strategies and the perceptions of people with different skills that participated in group tasks. An experiment was designed where groups of participants were selectively isolated and a participative card game had to be solved:
As the figures suggest, two structures were tested; the circles represent the participants and the links represent the communication channels. The first structure, where all the participants are organized in a circular fashion, was named the "democratic" structure to emphasize the similar status of all participants. The second structure, in the form of a star, was called the "authoritarian" structure, as one of its members had communication, and therefore influence, with all the others.
In the first part of the experiment, where all the symbols on the cards were identifiable and had names, members of the authoritarian structure finished first; however, they did not feel like winners, they felt they had failed and should have done much better; 94% of the team members recognized the vertex as the leader of the group. The democratic structure finished afterwards; however, they were cheerful, they felt like winners, and they also stated they could have done better; the leadership seemed to be equally distributed among them.
In the second part of the experiment a "noise" level was added by making the symbols on the cards more complex; some symbols didn't even have names to identify them. The "democrats" didn't vary their performance much; however, this time the "authoritarians" didn't finish at all. More interesting than the result is the fact that the "democratic" group extended their communication language, inventing new names for some symbols, while the "authoritarian" group increased the noise level by insulting each other.
This experiment is actually very interesting when we observe a similarity both in structure and results between the "democratic" structure and the BSD core teams, and the "authoritarian" structure and that of some other free software efforts.
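A rough back-of-the-envelope comparison of the two structures (my own sketch, not part of the original write-up) shows why the star is so sensitive to noise: the hub personally carries every channel, so any extra overhead per message lands almost entirely on one person.

    # Per-member communication load in a "democratic" ring versus an
    # "authoritarian" star with the same number of participants.
    def ring_links(n: int) -> list[int]:
        # Each member of a ring talks to exactly two neighbours.
        return [2] * n

    def star_links(n: int) -> list[int]:
        # The hub talks to everyone else; each spoke talks only to the hub.
        return [n - 1] + [1] * (n - 1)

    n = 5
    for name, links in (("ring", ring_links(n)), ("star", star_links(n))):
        print(name, "total channels:", sum(links) // 2,
              "busiest member handles:", max(links))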
From the above explanation, one could think that the best way to provide a stable development environment would be forming a democratic core group. This is not always true; the above results were obtained using only a small team. In bigger groups democracy almost always fails in providing the best solution to a problem. This effect was explained in 1785 by Condorcet in his famous paradox (2).
The election process is not transitive. Condorcet proposed a simple election where every voter had to rank first and second place among three options. In his example option A was the winner on first-choice votes; however, from the vote distribution it was clear that if option B (the runner-up) hadn't been available, option C would have won the election over A.
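A small worked example in the same spirit (the ballot counts below are my own invention, not Condorcet's original figures):

    # Invented ballots illustrating the effect described above: A tops the
    # poll, yet with B withdrawn the same voters elect C over A.
    # Each entry is (first choice, second choice, number of voters).
    ballots = [
        ("A", "B", 25),
        ("B", "C", 20),
        ("C", "B", 15),
    ]

    def tally(ballots, withdrawn=()):
        counts = {}
        for first, second, voters in ballots:
            # A vote falls through to the second choice if the first has withdrawn.
            choice = first if first not in withdrawn else second
            if choice not in withdrawn:
                counts[choice] = counts.get(choice, 0) + voters
        return counts

    print(tally(ballots))                     # {'A': 25, 'B': 20, 'C': 15} -- A leads
    print(tally(ballots, withdrawn=("B",)))   # {'A': 25, 'C': 35} -- C now beats A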
In my karate days I also experienced this non-transitivity in a completely different situation: I won my kumite over two different contenders, they both won over a third contender, and I lost to this last guy.
Other than structure, the way resources are obtained and distributed is essential for the survival of any free software organization. The most appealing way to obtain resources is the creation of a non-profit organization, a model followed by the NetBSD Foundation and the Apache Foundation; this makes it easier for companies and individuals to contribute due to the tax-deduction advantages available in some countries.
Many countries, though, have little or no legal provision for this type of organization; in those cases some companies might feel motivated to register their contribution as an operational expense in their financial statement of results, and while this expense will not be deducted from the taxes, no tax will be paid on it (consult your accountant if in doubt). FreeBSD Inc. is a company created to take advantage of these donations in order to provide support to the FreeBSD Project. (And of course, FreeBSD is not really a Project since it does not have a defined deadline for its objectives... confusing, heh?)
The real owners of a free software company should be its developers; this means that, in order to be consistent, companies that make money out of free software should pay at least part of the developers' salaries in shares of the company.
May 2000 | DeveloperWorks
Contents: Brooks' Law | Defying gravity | Surgical teams | Head chef | Resources | About the author
An aphorism from some twenty years ago, Brooks' Law, holds that adding more programmers to a project only delays it. But if this is so, what accounts for Linux? Paul Jones gathers perspectives on the open source development method and whether it defies conventional wisdom.
What a difference open source makes: it appears to break all the old rules. Or does it? With open source, so we are told, all the rules of programming, sales, marketing, and customer relations have to be rewritten. But there is evidence that beneath the rhetoric of open source, the hard school of experience casts a shadow on the most optimistic claims for open source software and open source practices. Do thousands of programmers really stand ready to write quality code to solve common problems or are there restraints inherent in development that prohibit scaling up into world-wide software projects?
Consider this story: It is the summer of 1996 and Bob Young's start up should have been further along on the development of their core product, Red Hat Linux 4.0. It was beginning to look as if the company's cash reserves would be out before the product was ready to ship.
"I started putting pressure on Marc [Ewing] and Erik [Troan] to hire some additional developers to ensure that we shipped 4.0 before the cash ran out. Marc and Erik (who were doing most of the Red Hat Linux development themselves at that time) simply refused, saying that to add developers at that stage would in fact slow down the project -- an argument that made no sense to me," recalls Young.
But Troan insisted that Young go out and buy a copy of an over twenty-year-old book about software engineering: The Mythical Man Month by Frederick Brooks. Not so much a technical book full of charts and formulae, it is instead illustrated with reproductions of woodcuts of giant sloths stuck and sinking in deadly tar pits and photographs of ancient cathedrals.
Brooks' Law
This little book by Frederick Brooks, The Mythical Man Month (or M3 among its most ardent followers), has become a kind of bible for software developers. Brooks received the Turing Award, often called "the Nobel Prize of Computing," from the Association for Computing Machinery this year.
Frederick Brooks headed the team at IBM that created the first large-scale computer operating system in the early 1960s. During his work at IBM, he realized that programmers were not like ditch diggers or house painters. Instead, he wrote, "The programmer, like the poet, works only slightly removed from pure thought-stuff. He builds his castles in the air, from air, creating by exertion of the imagination. Few media of creation are so flexible, so easy to polish and rework, so readily capable of realizing grand conceptual structures..."
Further, he defends software architects against ignorant management with Brooks' Law. "Oversimplifying outrageously, Brooks' Law [can be stated] as 'Adding manpower to a late software project makes it later.'" In other words and with mixed metaphors, art and artists cannot be rushed, and too many cooks will surely spoil the broth.
Importantly and uniquely, The Mythical Man Month and Brooks' Law are now over 25 years old. In a world in which "old" is defined in days or weeks, such longevity is nothing short of a miracle.
But proponents of open source and free software development, including Linux developers, are not completely satisfied with the Law. Most famously (among geeks at any rate), Eric Raymond in his "The Cathedral and the Bazaar," declared Brooks' Law obsolete, if not simply limited, saying "if Brooks' Law were the whole picture, Linux would be impossible."
Although Raymond now says that he has somewhat modified his views or was misunderstood, some still would say he is given to oversimplifying and outrageousness himself. "I don't consider Brooks' Law 'obsolete' any more than Newtonian physics is obsolete; it's just incomplete. Just as you get non-Newtonian effects at high energies and velocities, you get non-Brooksian effects when transaction costs go low enough. Under sufficiently extreme conditions, these secondary effects dominate the system -- you get nuclear explosions, or Linux."
Defying gravity
But how do open source and free software efforts defy the crux of Brooks' Law, the need for small, smart groups and time flexibility for the artist/architect programmers? Perhaps by not having a schedule in the first place?
"I have yet to see an open source project commit to a schedule, and most of them don't ship anywhere near when folks think they will (some ship early, and some ship late -- it's hard to predict). I don't know how you take a rule about 'making a late project later' and apply it to development which has little formal scheduling," says Erik Troan, author of the popular Red Hat Package Manager.
The Free Software Foundation's MacArthur Fellow Richard Stallman echoes Troan's sentiments. "GNU projects rarely have a schedule. When people ask me when something will be finished, I respond, 'It will be ready sooner if you help.'"
Ken Coar, member of the Apache Web Server group, will admit to having goals, but not deadlines. "Goal dates, yes -- but our release process more closely resembles the exchange between Pope Julius II and Michelangelo than a conventional software house's product development."
Yet, Coar gives himself (and us) a warning with his newly coined "Coar's Contention." "Most open-source fixes will not be proposed until a release date [for the project] is announced." In other words, deadlines do influence work and workers even in the artistically liberated world of open source.
"While the 'no deadlines' idea is not true in a general sense, we still can talk about speed of development -- just as critical a factor as deadlines. Just imagine what would happen if Linux kernel were frozen and no new version will appear until 2005!" stresses Nikolai Bezroukov of Fairleigh Dickinson University, a critic of both Stallman and Raymond.
Yet, Stallman sees projects without deadlines as a way to avoid the temptation to place untrained hands on the job at the wrong time and place. "If you don't have a deadline, you won't be likely to try to put more people on the job in that way. You will make use of more helpers only when they can actually help. Thus, I think that Brooks' Law does apply to [free software], but we have no inclination to try to violate it."
For Stallman, free software is ultimately about freedom of all sorts. "In free software projects there is usually a scarcity of resources, and no boss who can order people to do things this way instead of that way. We tend not to have too many people trying to help one project." In fact, free software -- as opposed to open source -- is not about products, per se. "In the free software movement, freedom and community are the focus. We talk about the deeper issues, such as what way of life we want."
Surgical teams
While Raymond tells us that "the more the merrier is the rule in an open source project" and Stallman bemoans the scarcity of programmers, Jamie Zawinski, formerly of Netscape and of the Mozilla project disagrees.
"The dynamic of development inside a company is definitely different from development in an open source way. However, I don't think that saying 'adding programmers makes it later' isn't true for open source projects because of their (alleged) lack of deadlines: I think that it's true (in any sense that it is true) only because the definition of 'programmer' is slippery in that case. If you have a project that has five people who write 80% of the code, and a hundred people who have contributed bug fixes or a few hundred lines of code here and there, is that a '105-programmer project?'"
What Zawinski and others have seen is that in such undertakings as the Apache project, Linux Kernel, and other large software projects, the bulk of the work is done by a few dedicated members or a core team -- what Brooks calls a "surgical team."
"In most open source projects," says Zawinski, "there is a small group who do the majority of the work, and the other contributors are definitely at a secondary level, meaning that they don't behave as bottlenecks."
Zawinski goes on:
"Most of the larger open source projects are also fairly modular, meaning that they are really dozens of different, smaller projects. So when you claim that there are ten zillion people working on the Gnome project, you're lumping together a lot of people who never need to talk to each other, and thus, aren't getting in each others' way."
Brian Behlendorf of Apache and of Collab.net agrees.
"We don't consciously think about it, but I think that the philosophy of keeping things simple and pushing out almost anything extraneous or nonessential to external modules has been followed fairly carefully in Apache. We've also been fairly successful (I think) in 'federalizing' the Apache process to sister projects."
Bezroukov agrees that modularity is essential, but there are limits:
"For example, if a project logically consists of three parts and we have two developers, then adding a third developer can be highly beneficial -- the structure of the project will now correspond to the structure of the team. If the project is decomposable into three major parts and we add a fourth developer, the effect will be much less."
NNB -- See more: Conway's Law - Wikipedia, the free encyclopedia
Although sometimes construed as humorous, Conway's Law was intended as a valid sociological observation. It is based on the reasoning that in order for two separate software modules to interface correctly, the designers and implementers of each module must communicate with each other. Therefore, the interface structure of a software system will reflect the social structure of the organization(s) that produced it.
The human wave attack -- the thousand geeks hacking through the night -- too has limits. Limits are made less painful by modularity, but that goes only so far. At some point, management, whether they are software core team leaders or CEOs, will be tempted to make the wrong decision and send in more busy little hands.
Let us now return to Bob Young and Red Hat 4.0 in 1996 as they face a cash short run, mounting deadline pressure, and an anxious Board of Directors. Should Young add more programmers in an effort to get the product out the door or should he heed what may well have been seen as a bit of techie egotism on the part of Erik Troan and just stand back and watch his software team flourish or flounder?
To try to reduce his own stress level, Bob Young went out and bought Brooks' book. "While I didn't completely buy the arguments (I had no frame of reference)," he says, "it at least did give me a more authoritative source making the same argument that Marc and Erik were making.
"Our small development team did ship 4.0 on the schedule to which they had committed. It was a big success, and Brooks' book about Brooks' Law contributed to reducing the (already small) chance that we were going to make the mistake of adding developers at a late stage of a software project."
And even Eric Raymond has lavish praise for Brooks.
"The Mythical Man Month is still, despite time and tide, one of the most penetrating and humane books on programming ever written; every programmer and everyone who manages programmers should read it."
Frederick Brooks himself may have the final say about how software development should be understood:
"Programming is fun because it gratifies creative longings built deep within us and delights sensibilities we have in common with all men." --NNB see Ershov
That is a happy thought that is sure to please many in the open source community.
But if this strikes too optimistic a note for some of our more skeptical readers, bear in mind that Brooks has other sweeping statements that may not soon be hailed as law by most members of the community. He concludes: "No major improvement in the software engineering area will ever appear."
[Jul 03, 2021] Mission creep Published on Jul 03, 2021 | en.wikipedia.org
Death march (software development) - Wikipedia, the free encyclopedia
Optimism bias - Wikipedia, the free encyclopedia
The Social Context of Open-Source Software
""Brooks' versus Linus' Law- An Empirical" by Charles M. Schweik ...
Open Sources > The Revenge of the Hackers > Beyond Brooks's Law ...
Addendum to Brooks' Law - MindBy
Brooks's law Summary | BookRags.com
The cathedral and the bazaar- musings on Linux and Open Source by ... - Google Books Result
Exceptions to Brooks' Law " Scott Berkun
Time to Vanquish the Mythical Man Month
Open Kosmaczewski - Adding Manpower
Opening minds- Cultural change with the introduction of open ...
Linux Today - IBM developerWorks- Brooks' Law and open source- The ...
19 Eponymous Laws Of Software Development
Open source and the mythical man month | ZDNet
Brooks' Law. It still applies.
Ray Kurzweil - Wikipedia, the free encyclopedia
In 1999, Kurzweil created a hedge fund called "FatKat" (Financial Accelerating Transactions from Kurzweil Adaptive Technologies), which began trading in 2006. He has stated that the ultimate aim is to improve the performance of FatKat's A.I. investment software program, enhancing its ability to recognize patterns in "currency fluctuations and stock-ownership trends."[8] He predicted in his 1999 book, The Age of Spiritual Machines, that computers will one day prove superior to the best human financial minds at making profitable investment decisions. In 2001, Canadian rock band Our Lady Peace released an album, titled Spiritual Machines, based on Kurzweil's book. Kurzweil's voice was featured in the album, reading excerpts from his book. In June 2005, Kurzweil introduced the "Kurzweil-National Federation of the Blind Reader" (K-NFB Reader)-a pocket-sized device consisting of a digital camera and computer unit. Like the Kurzweil Reading Machine of almost 30 years before, the K-NFB Reader is designed to aid blind people by reading written text aloud. The newer machine is portable and scans text through digital camera images, while the older machine is large and scans text through flatbed scanning.
Addendum to Brooks' Law - MindBy
I just read Joel Spolsky's blog entitled "A Little Less Conversation" which discusses something I've blogged about in the past here and here, communication overload.
After reading that post I began to consider my own personal experience in meetings over the last dozen or so years and decided to add an addendum to the communication node problem that was so eloquently detailed in the Mythical Man Month by Brooks.
The problem with Brooks' theory of intercommunication is that it doesn't take into account the "Number of Managers" in any given meeting. He assumes in his calculation that all nodes in a communication network are equal. This is a mistake. All nodes are not equal, as anyone who has sat through a meeting with more than one manager participating can attest to.
Managers have keen insight into every major (and minor) issue at hand and willingly share that information with the team in a seemingly endless discourse that greatly adds to the meeting's productivity and value. In fact I've been in meetings with multiple managers that have lasted two, maybe three, times longer than the scheduled meeting length due to the significant wisdom that each of the managers was imparting to their counterparts and the team.
This imbalance in communication node weighting should be reflected in a revised formula for group intercommunication (especially meetings). Brooks' original formula can be stated as n(n-1)/2 = communication pathways. The revised formula adds the significance of management communication to the pathways problem by accurately describing the impact of management on the original formula. This new formula can be expressed as (n(n-1)/2)^x, where x is the number of managers.
As an example I will restate the original example given by Brooks and then show the difference when true communication weighting has been added…
Example: 50 developers give 50 · (50 – 1) / 2 = 1225 channels of communication.
However, given our new formula and assuming the presence of 3 managers (or significant stakeholders) into our team we now see the impact of the additional management on our communication overhead.
Example: 50 developers + 3 Managers give (50 · (50 – 1) / 2)^3 = 1,838,265,625 channels of communication.
There, that's better. This new formula clearly shows the benefit of adding additional management resources to any project.
You can thank me later, Fred.
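For anyone who wants to check the arithmetic of both examples, a minimal sketch (Brooks's n(n-1)/2 count plus the post's own tongue-in-cheek manager-exponent variant):

    # Brooks's pairwise-communication count and the post's satirical
    # "manager-adjusted" version; the printed values match the examples above.
    def brooks_channels(n: int) -> int:
        # n(n-1)/2 communication pathways among n people.
        return n * (n - 1) // 2

    def manager_adjusted_channels(n: int, managers: int) -> int:
        # The post's joke formula: (n(n-1)/2) raised to the number of managers.
        return brooks_channels(n) ** managers

    print(brooks_channels(50))               # 1225
    print(manager_adjusted_channels(50, 3))  # 1838265625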
Copyright © 1996-2021 by Softpanorama Society. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available to advance understanding of computer science, IT technology, economic, scientific, and social issues. We believe this constitutes a 'fair use' of any such copyrighted material as provided by section 107 of the US Copyright Law according to which such material can be distributed without profit exclusively for research and educational purposes.
Disclaimer:
The statements, views and opinions presented on this web page are those of the author (or referenced source) and are not endorsed by, nor do they necessarily reflect, the opinions of the Softpanorama society. We do not warrant the correctness of the information provided or its fitness for any purpose. The site uses AdSense so you need to be aware of Google privacy policy. You you do not want to be tracked by Google please disable Javascript for this site. This site is perfectly usable without Javascript.
Last modified: June, 07, 2021