The most important piece of advice is the last line of the article.
In my experience, ignoring the above advice and taking a refinement-only approach generates some amazingly horrible 'solutions'.
Here is an example of one such 'solution' that I was forced to implement by a manager and a 'senior developer'. To make matters worse, I was expected to be the one to supervise and troubleshoot this 'solution' every week. Here goes:
1. Output the data on a Unix server.
2. FTP it to a Windows server.
3. FTP it from there to a Windows PC (yes, it was FTP'ed twice on the same LAN).
4. Manually edit it in Access and export.
5. FTP it back onto the Unix server and process it again.
6. FTP it (twice again) back to the Windows PC.
7. Email it to an off-site processor and wait 2-5 hours for a response.
8. FTP the response back to the server and process it.
9. FTP it back to the Windows PC, import it into Access again, and export.
10. FTP it to the client's mainframe, manually remembering to apply some archaic FTP settings.
On average it took 1.5 hours of dedicated work time to complete, not including the wait time or reattempts. It was extremely error-prone because of the number of manual steps, and changing the FTP settings also caused problems for other clients when people forgot to remove the archaic mainframe settings.
For my own sanity I took responsibility for the weekly task and replaced it with a script that did everything in half a minute, including sending the data off for off-site processing - the results of which I never used. I was eventually found out when the client requested the data early and I provided it several hours before the off-site processing was done. I got in trouble because of the thank-you email.