Wednesday, December 01, 2010
What is the difference between Computer Science and Software Engineering? That depends on the nature of the problem you are trying to solve - is your problem one of Silicon or one of Carbon?
In my 20-year career working with software, the single biggest issue has been the elephant in the room: the abysmal success rate of large software systems. Plenty of books have been written on the subject with lots of great, if unheeded, advice. Scope creep, ambiguous or missing requirements, poor communication, bad risk management; plenty of reasons have been documented, and just as many if not more solutions have been suggested. Brooks gave us "The Mythical Man-Month", Yourdon wrote of the "Death March", Tom DeMarco has "Peopleware". All of them attempt to answer the question, "Why is it so hard to write software?"
If you took all the books on writing software and put their suggestions in one big list, the first question would be, "If we have all the answers, why don't we use them?" That might be nice for getting it off your chest but isn't as helpful in pointing us in the right direction. At its core, though, the question does have a useful kernel: it assumes that 'we' are the problem. If only we used this process, if only we implemented that practice, if only we hired the right people, if only we kept our teams motivated. As an aside, most of the documented reasons for failure are management reasons, not technical ones. It should be no surprise that many find the case study of the sinking of the Vasa in 1628 so instructive: the blame rests as much, if not more, on the King's unrealistic demands as on the builders' failure to meet the royal requirements. Our endless efforts to build bigger, more complex and complicated software solutions in the face of almost certain failure should make us question our sanity; one definition of insanity being, "doing the same thing and expecting different results." Biting off more than we can chew, choking, and then doing it all again time after time only shows how our hubris can get the better of us, allowing us to continue on our Sisyphean task.
This brings me to an interesting observation. There is a long-running debate over whether computer programming is, or should be, considered a form of artistic expression or of engineering prowess. Computers have such a wide scope that many sub-fields have cropped up, hardware and software being the most obvious. Hardware is typically the realm of the engineer, designing circuit boards or embedded sensors. This changes as the hardware grows in size or scope: CPUs have microcode, missiles have on-board navigation systems with programmable targets.
Most universities that offer degrees related to programming computers name the department Computer Science. Being considered a 'science' may just be a way of expressing the notion that computers are deterministic and worthy of comparison with mathematics or engineering, the field having likely grown as an offshoot of electrical engineering. Whether it be data mining, security, networking, or pure algorithms, most of these areas ignore the question of why software is so hard. They assume the problem is in the silicon, and that improvements to our silicon constructs or their usage will somehow mitigate our human foibles. If only we had better tools, or faster computers, then our carbon-based brains could somehow ascend above the project-management miasma and become a beacon to those who struggle and toil without end. Considering the gargantuan number of man-hours and humongous amounts of money wasted, such an accomplishment would be of heroic proportions, mythic even; a modern-day retelling of Prometheus or Sisyphus.
The truth is that while Computer Science may bring improvements in areas where the problem is in the silicon, it is mute when asked to provide answers to problems of a carbon nature. Software Engineering attempts to develop solutions that include how humans work together. It does not assume that software development is as cut-and-dried as putting the right circuits on a board to achieve the desired product, or breaking a complex problem down into constituent algorithms to make it more manageable. There is no magic bullet, no single development process, no single answer. To understand why humans have a hard time developing software, we must admit that humans are part of the equation. Doing so brings with it a cold truth about humans - they are non-deterministic by nature. Alistair Cockburn describes this as people being non-linear, first-order components in software development. To answer the question of what software development is, we might have to take detours through Psychology, and may even be surprised to end up in Philosophy.
Thursday, September 02, 2010
When the Student is Ready, the Teacher Appears
I remember hearing from a 'higher-up' at a previous company that we shouldn't do Use Cases. "The customer doesn't like them" - something I never heard directly from a customer. A design that came to review with use cases, a domain model, and a set of fleshed-out sequence diagrams was once described as 'over-designed'. It was a mystery to me at the time why they would be so dismissive of better ways to develop software.
Now, in my graduate class at UTA on Software Design Patterns, the assignments are ironically familiar:
- project requirements
- use cases (3 levels)
- domain model
- sequence diagrams, design class diagram
- pair programming
- test driven development
Not even ten years ago, these practices were still gaining acceptance in the industry. A common complaint has been that colleges and universities don't teach these techniques, which only gave the critics more ammunition. Now that this has started to change, hopefully the debate will shift to how best to implement these practices instead of how to dismiss them.
It turns out that the solution to the mystery had nothing to do with the merits of any software development methodology and everything to do with the rejection of in-house software development. This company had a long history of developing its own software and over the years had built one of the largest technology departments I've ever worked in, close to 1,000 people. When the leadership changed, the attitude shifted from "Not built here" to "Don't build it if you can buy it". They no longer valued the capability to develop software, so trying to improve the process had become moot. Preferring to buy vs. build is a perfectly reasonable approach; some companies realize they don't want to be a development shop and make no bones about it. The harm comes when there is a disconnect between management and the teams doing the work. Psychologically, this would fall under cognitive dissonance: keeping a development-shop mindset at the same time the department's actions are turning away from development.
It wasn't a surprise, then, when it was announced that they were 'partnering' with a large three-letter company to rewrite their web site, which happens to be the company's main source of revenue. The team that developed the site was held in high esteem by the department, full of some of the smartest guys around and always pushing the envelope, all while supporting a truly business-critical app 24/7 with close to zero unscheduled downtime. The site happened to be written in C++, which set a high bar for development skills when recruiting. One of the clues to the mystery came at a monthly meeting of technical leads prior to the partnering decision: the use of C++ was being de-emphasized, the reason given being the difficulty of finding qualified developers.
The interesting question becomes: if you devalue software development skills, do you lose the ability to monitor your partner and be a fully engaged one? Inevitably you'll have to take some consultant's word on a technical matter, and at that point you're no longer a partner but a simple client. Keeping your organization's software development skills sharp benefits you even if you don't write the code.
Monday, August 09, 2010
Student, Teach Thyself - A Theoretical Computer Science QnA site
I've been a fan of Stackoverflow for a while. Jeff Atwood and Joel Spolsky have helped the Internet turn a corner when it comes to helping people find answers to their questions. Google may do an incredible job indexing pages but it would be to no avail if the pages themselves were all junk.
I don't have a ton of time to spend on the new site-proposal system at area51.stackexchange.com. During one of my CS grad school classes, I asked a question about my schoolwork on the main Stackoverflow site and was underwhelmed by the responses. Since the main site is geared toward working programmers who typically expect a code snippet as an answer, I figured the proposal for a site on theoretical computer science deserved a shout-out.
Now if they would only restart their podcast.
Tuesday, July 27, 2010
The Battle of Computer Science Training
There is a constant battle between the corporate and academic worlds. Universities want to teach more theory so students are prepared for a wide range of situations. Companies want graduates with technology-specific training so they can contribute immediately. Who is right? Both and neither.
When taking a course in algorithms, you might need to demonstrate some degree of learning by programming a lab assignment. It's a decent bet that the languages the student already knows may not be the best choice to highlight the various algorithms being taught. The student then finds himself learning a new language for the sake of submitting a lab assignment. This is usually done with zero training or prior exposure, and with very little support beyond asking classmates or an instructor who is eternally busy with other things.
Version control is another topic that shows the rift between the two worlds. Most IT shops require the use of some version control system. Most developers learn enough to get the job done but never really understand it well enough to make it an equal partner in the development process. The administrative function is usually lacking: the big IT shops that can afford to hire VCS admins also have so many repositories that the admin's time is spread too thin, unless you happen to be working on the 800-lb gorilla project. Smaller shops muddle through, and usually someone takes the admin role upon themselves because they are tired of the situation and are the first to hit their frustration threshold. Not unlike the newlywed dilemma: "If the full sink of dishes bothers you, you wash them."
Since starting grad school, I figured I'd apply my own coding lessons and use version control for my assignments. What an eye-opening experience! Some tools I'm already familiar with provide a bit of version history on their own. Eclipse, for instance, automatically saves the change history of a file every time you save it. That works for solo Java projects but not for other languages or group projects.
There are two main obstacles to using version control in academia: having to be your own VCS admin and trainer, and the lack of consistency from the university. You need the discipline not only to keep your VCS configured and working but also to extend it to whatever environment is required. It does you little good to have a personal repository set up on your own PC when the professor requires assignments to be done on the campus Unix box. The same holds true when a lab must be done in a language your IDE doesn't support. Eclipse has C support through the CDT, but if you aren't going to use it next semester, the time it takes to get it configured and working is that much time away from finishing the assignment or studying for the next test. When working on a group project, it also means having to teach your teammates how to use the tool and then hope that they actually use it.
This leads to the second obstacle. Without a departmental policy or professorial dictate, version control usage by students will be as rare as hen's teeth. The university, or at least the CS department, would need to support VCS tools in the campus computing environments and handle the administrivia, and the professors and instructors would need to make VCS part of the lab requirements.
Don't hold your breath.
There is no impetus for the faculty to standardize on a VCS tool; the sheer number of languages and environments makes it unlikely that any consensus will emerge. This is odd, because CS classes constantly have issues with turning in code and checking for plagiarism or cheating. Looking through the check-in history would be a simple way to verify a student's progress, and code diff tools would make it almost effortless to check for "borrowed" code. There are enough cross-platform tools that it is not a technology problem; it's a human problem.
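To make that concrete, here is a minimal sketch of such a progress check - assuming, purely for illustration, that the course standardized on git and that the repository path below is hypothetical:

    import subprocess

    def commit_history(repo_path):
        """Return one line per commit: short hash, date, and subject.

        Assumes 'git' is installed and repo_path points to a student's
        checked-out assignment repository (hypothetical for illustration).
        """
        result = subprocess.run(
            ["git", "-C", repo_path, "log", "--date=short",
             "--pretty=format:%h %ad %s"],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.splitlines()

    # A steady trickle of commits over the semester suggests genuine
    # progress; one massive commit the night before the deadline
    # suggests something else entirely.
    for line in commit_history("./student-assignment"):
        print(line)

A plain diff between two students' check-ins would handle the "borrowed" code check just as easily.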
The fundamental problem is that the two worlds are trying to 'train' for different things. In school, technology is a means to an end. Students are being trained to understand how to approach and solve problems. They are being prepared for unknown situations so they can recognize when an approach is guaranteed to fail, or when a problem has been solved before and a stock algorithm exists. Employers, on the other hand, want people trained in the tools they use to solve their problems. New hires aren't the ones who get to analyze a business problem, let alone prioritize which problem to address. Managers still like to think of coding as a factory floor: the workers don't get to design the manufacturing process or choose the tools.
For those of you who like analogies, here is how I'd put this conundrum. College trains blacksmiths who don't make the same thing repeatedly. They can be given a strange piece of hardware and figure out how to fix it. They can be asked to create something new based on ambiguous requests. Business wants someone who is trained to make swords: the enemy is already defined by the leaders, the strategies already determined by the generals.
The problem isn't training sword-smiths, it's wanting nothing but sword-smiths and then discovering you need a telescope, or horseshoes. Technology is still changing at an unheard-of pace, and training only in the current technology is a recipe for obsolescence. The window of current technology keeps getting smaller: COBOL was king-of-the-mountain for 20 years, C/C++ for maybe 15, and Java is already on the downhill slope, with C# and JavaScript taking the spotlight and other languages in the wings (Ruby, Python, Scala) or working backstage (Perl, PHP, SQL).
Pick your tools well and train for anything. You'll have your fate in your own hands and not be surprised by the winds of change.
Friday, May 14, 2010
To Buy or Not to Buy Technical Books, That is the Question
So the rookie coder is working at his new job, fresh out of college, when the team malcontent is overheard a couple of cubes over: "This place is so cheap! They should reimburse me for this technical book. It's a heckuva lot cheaper than a training class." However true that is, it hides a simple choice: keep the book and be out the money, or give up the book and pocket the cash - assuming the boss will foot the bill.
Recently a question on Stackoverflow reminded me of that rookie coder. Being fresh out of college gives you an amazing combination of overconfidence and cluelessness. Without any basis for comparison, it's easy to take the office attitude and run with it: you shouldn't have to buy your tools; your company should.
I disagree.
Why? Because they are tools. It's not even a cost thing; a professional should be responsible for knowing which tools they need to be most productive. Tour an airline maintenance facility and check out the workshop: a large area, maybe 1,000 sq. ft., with nothing but large toolboxes, each with a name and a lock. The mechanics have to buy all their own tools - think wrench sets and so forth, not large things like engine dollies.
At first it seems a bit silly and redundant, but when you start to think about it, it's easy to see that not everyone is going to want to spend the extra money on the high-end tools, or they might want extras of the same set for whatever reason. Also consider the hassles of sharing, e.g. "Where are the 1/2 inch box-end wrenches?" By not making the tools shared, the 'Tragedy of the Commons' is avoided.
Another example is chefs and their knives. You don't touch another chef's knives. The restaurant provides the big things like ovens and stoves, the chefs bring their own knives.
Back in the world of computers, I've brought in extra memory before. I told my manager so he'd know some of my personal property was in the PC in case the internal techs needed to work on it, and I put a note on the box itself. I would have had a bigger problem if they had discouraged bringing your own hardware. Most shops like to provide the box itself because it minimizes their support issues, but they shouldn't have any problem with you bringing accessories like monitors, keyboards, and mice.
To summarize: change how you think about tools and strive to use whatever makes you better at your job, even if you have to pay for it out of your own pocket. It's the 'professional' thing to do.
Wednesday, April 21, 2010
Martin Fowler, Alistair Cockburn, and Optimism
The effort to make Software Engineering actually live up to the 'Engineering' part has been greatly helped by Martin Fowler's book on refactoring. Martin recently posted about his decision to not participate in an effort by others in the industry, not lacking in optimism, to define a common process for software development.
He declined to get involved, and the key quote explains why:
“Why this is so was primarily crystallized for me by Alistair Cockburn who explained that since people are the central element in software development, and people are inherently non-linear and unpredictable - such an effort is fundamentally doomed.”
While my previous post about Alistair's paper covers much of the same ground, I felt this was worthy of an update. There will always be souls who crave consistency, just as there will always be salesmen willing to offer the illusion of it. Facing our own nature is one of life's hardest lessons. Some of those who ignore the human factor are optimists seeking to improve their chances of consistency, which has driven many toward the latest and greatest geegaw: a new language, a new GUI, a new design tool. Others are simply deluding themselves in the hope that if only people would sit up and fly right (read: 'do it their way') all our problems would be solved.
The study of the process of computer programming has been around for decades. Even so, we still have trouble internalizing human foibles - people aren't predictable and follow "standards" about as well as a random walk. The most successful way to get people to be consistent is to lead by example: blaze a trail ahead and show them the way. In this manner Extreme Programming was both its own worst enemy and its own best cheerleader. By showing how a set of practices works in situ, it created a much-needed focus. The "Waterfall" model was the favorite whipping-boy, essentially dead, and the iterative approach was in its infancy, so XP served as a lightning rod for discussions on the craft of software development. It provided an example of practices that could be emulated, and an easy target for the doubters, disbelievers, and denialists; no doubt the negativity from the critics channeled their innate understanding of the dark side of human behavior.
While Extreme Programming hasn't become the standard development model, that doesn't mean it failed. When the history of Software Development is written, XP will be given credit for re-introducing the most important factor: not tools nor process, but people.
Monday, April 12, 2010
Context as a commodity
How much context do you need to do your job? Can we quantify context in any way?
Jeff Atwood has mentioned that he doesn’t shut his machine down every night. He wants his environment to be just as he left it at the end of the previous coding session. Getting all the apps, tools, and other windows opened to the right place takes not only CPU time but mental energy - in other words, context.
Some jobs require no context. A bank teller, for example, can walk in to work on Tuesday without needing to remember any of the transactions that took place on Monday. Contrast that with a novelist in the middle of writing a new book - say, writing the next chapter, not editing. They need to remember all the characters, their personalities, the existing story, and the ways the plot threads are to interact in the new material.
Another good contrast is the professional athlete. A pro baseball player doesn’t need to remember what happened in yesterday’s game in order to pitch a strike, hit a home run, or execute a double play. What limited context there is goes toward optimizing performance: a pitcher does need to know the preferences and history of the batter at the plate, just like the batter needs to know if the pitcher has a wicked slider.
Companies want to be “green” these days and have instituted policies to shut down computers overnight. This may be penny-wise and pound-foolish. Here’s why: a consultant gets paid $50/hr and takes 30 minutes to get their workspace ready after an overnight shutdown lasting from 5pm to 8am the next day, 15 hours. Even a generous estimate of the savings - electricity at $0.25 a kilowatt-hour, with the machine drawing a full kilowatt for all 15 hours - comes to $3.75. So $25 is spent trying to save $3.75. That’s motivation to use the hibernate feature at a minimum. It’s also a very simple way to quantify the cost of context.
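For anyone who wants to rerun the arithmetic, here is a tiny sketch using the numbers above (all of them assumptions from this post, including the implicit guess that the idle machine would have drawn a full kilowatt):

    hourly_rate = 50.00   # consultant's rate, dollars per hour
    setup_hours = 0.5     # time spent rebuilding the workspace each morning
    kwh_price = 0.25      # electricity price, dollars per kWh
    draw_kw = 1.0         # assumed power draw of the machine left on, kW
    idle_hours = 15       # 5pm to 8am

    context_cost = hourly_rate * setup_hours         # $25.00
    energy_saved = kwh_price * draw_kw * idle_hours  # $3.75

    print(f"Context rebuilt each morning: ${context_cost:.2f}")
    print(f"Electricity saved overnight:  ${energy_saved:.2f}")

Plug in your own shop's numbers; unless the machines double as space heaters, the context loss dwarfs the electricity savings.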
Can we describe jobs by how much context is required? If so, is there a relationship between the amount of context needed and the average salary? Those are interesting questions, maybe I’ll cover those in a later post.
Thursday, January 21, 2010
Programming's Best-Kept Secret
Software development is often compared to engineering or construction, a comparison that carries the hidden and debatable assumption that the source code is the product. One camp has declared that the source code is the design. Another declares that programming is a branch of applied mathematics; see Dijkstra's "uncomfortable truth" quotes. Extreme Programming as a movement may have failed, but it at least served as the flashpoint for a new examination of the nature of programming. Instead of arguing about engineering, we can brag about how Agile we are or that we are Software Craftsmen.
Programmers are not engineers, nor are they artists. Even when they are compared to craftsmen, the comparison is always with woodworkers or blacksmiths. Programmers are nothing of the sort; programmers are writers. Danny Thorpe discusses how you might tell what sort of literature your app may be.
Danny is on the right track, but when it comes to software there is an elephant in the room that needs to be discussed in the open. Developers do not produce software, they write code. That means they sometimes need the logic skills of an engineer, but just as often they need the communication skills of a poet. Reading and writing, thinking about how to achieve a desired effect through the written word - those are the qualities of a writer. It just so happens that one of the chosen audiences is a single-minded, overly-literal, pedantic, polyglot idiot-savant, otherwise known as a computer. The other audience is the humans who may never have to actually follow the instructions but do have to understand them well enough to tell whether they are correct.
It would be useful to compare programming to other fields and see how the comparison holds up. Good candidates are publishing a book, producing a movie, and creating a recipe; it is no coincidence that the games industry has adopted the filmmaking model for the production of games.
Programming as publishing
Programmers are the authors. The compiler is the printing press. Publishers are the companies that agree to fund the process in hopes of turning a profit. Self-publishing would be like creating training manuals for internal use; no profit is involved, but the medium is appropriate for other reasons. The book still needs to be edited, proofread, typeset, marketed, etc. I've covered this ground before but there is still much to contemplate.
Programming as filmmaking
Filmmaking is an interesting comparison because it deals with translating the written word into another medium. Programmers are the writers of the scripts and screenplays. The producer is the stakeholder who wants the software. The director is the project lead. Instead of actors and crew we have the compiler, which takes whatever script of commands we provide and executes it; call it a screenplay or call it an application, the actions and actors may be different, but it's still a set of commands given to actors to be performed.
Programming as cooking
Programmers create a recipe for a given dish. The recipe is not the final product; the result of following the recipe is. You don't buy a cookbook for the fabulous chocolate cake recipe, you buy it because you want to eat the cake. In this case the compiler would be the tools available to the chef: pots and pans, utensils, and the stove/oven. Having the right tools really helps, but even the best oven won't correct a recipe that calls for 2 cups of vinegar in the sugar cookies. Eventually software may survive the patent wars and finally be considered on par with a recipe: instructions to a cook for producing the desired product.
Summary
Software is actually a mix of disciplines, all contributing toward a common goal. The game industry may have it right, as evidenced by their being too busy producing games to endlessly debate whether their process is broken or chase the latest silver bullet. They use directors, producers, art and sound specialists, etc. The compiler is their camera, used to capture the intent of the contributors; the CPU is the silver screen on which the final product is projected for consumption by the audience.