swbluto wrote: Oh, well, since you're working with tons of data, then such high expenditure and performance can be justified. For regular GUI development and relatively simple code bases, it seems many slower computers would work practically as fast as human perception allows but, then again, I'm not a professional software developer so I'm not sure what a typical "software development" load is.
Yeah, it seems like this should be true. But reality is different. I typically use about 3-4GB of memory. It spikes to five or six when loading up two or more solutions (projects). Sometimes it's pretty useful to spin up a pre-configured, entirely different development environment with a different OS etc. for testing inside a virtual machine. You can roll it back to a fresh state and test deployments and installations in a safe, controlled and predictable way. Virtual machines are pretty useful for loading up entirely different toolsets matched to different development/technology platforms. A lot of tools don't install well side by side, and you don't want to be forced to upgrade existing applications that are in production based on a third-party tool vendor's schedule. Having this pre-configured workspace in a VM is a good way to address this need. Another trick with this kind of system memory is to create a RAM drive with the extra memory and install programs on it, or even make it the working directory for the thousands of files in the solutions I work on.
Some of the things that are running while I'm working:
Visual Studio - the IDE or workbench where most of the code gets written. I use a productivity tool called ReSharper - it analyses your code while you type, suggests fixes and improvements, automates a lot of tedious stuff, and analyses all of your types so that you can quickly navigate to different parts of the code base.
On some projects, every functional piece of code is supported by one and often more unit tests that assert that the code still does what it is supposed to do. In a sense, the unit tests are fine-grained executable requirements. When tests break, they usually tell you what to fix. I'm constantly running tests to verify that all my previous assumptions still hold. Another principle that helps is to mercilessly refactor as you go. It is pretty hard to do this without a backing suite of unit tests. Tests that run fast tend to get run more often; it's just human nature. This keeps things on track and makes it easier to integrate with other developers working on different parts of the project.
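To make the "fine-grained executable requirements" idea concrete, here's a minimal sketch in Python (my actual projects are .NET, so treat this as illustrative). The business rule and the `apply_discount` function are invented for the example; the point is that each test states one requirement and fails loudly when the code stops meeting it:

```python
import unittest

# Hypothetical business rule: orders of $100 or more get a 10% discount.
def apply_discount(order_total):
    """Return the total after the bulk-order discount."""
    if order_total >= 100:
        return order_total * 0.90
    return order_total

class ApplyDiscountTests(unittest.TestCase):
    # Each test method is one fine-grained, executable requirement.
    def test_small_orders_pay_full_price(self):
        self.assertEqual(apply_discount(50), 50)

    def test_orders_at_threshold_get_ten_percent_off(self):
        self.assertEqual(apply_discount(100), 90.0)

    def test_large_orders_also_get_the_discount(self):
        self.assertEqual(apply_discount(200), 180.0)
```

Run with `python -m unittest` and the whole requirement set gets re-verified in a fraction of a second, which is why fast tests get run constantly.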
Rather than run the app to test it, I run the unit tests more and fire up the GUI less. On really slick projects, the unit tests run in the background and you get notified when you've broken part of the code base. The old style was to run the app, click through and eyeball the UI, or step through the code with a debugger. Both are pretty tedious and haphazard, and if relied on, they eventually create fragile code bases that developers are afraid to improve.
I still find that the GUI parts of the apps I write are the most tedious. Typically there is a GUI designer that is slow to load and has a million properties to drill into with the mouse, and you have to click around to set widths, colors, and so on. It's probably the most frustrating and error-prone part of the whole app. A fast machine definitely helps in this area - it loads the app and renders the screens you're building in seconds, so you can verify what you've dragged around in the designer.
A key theme with building business systems the way I do is to constantly refine and re-structure your code as you learn more about the underlying problems you're solving. If you feel pain with the architecture, fix it. As I build a system, my understanding of the business grows, so I need to keep restructuring how the code maps to the business as I learn more about it. A lot of the tools I use support this with things like intelligent renaming, moving classes to different parts of the solution, etc. These tools really need to have the whole solution loaded up in memory to do this well. I can do comparisons of current files with incremental snapshots of the whole system going back to day one. I've also got a program running in the background that takes snapshots of my entire monitor space every 10 seconds. When it comes time to fill in my time sheet at the end of the month, I just play the movie back. The codebase I'm working with right now is about 20 thousand lines of code, probably half of which are hand-written or micro-generated. I'm guessing it will hit a hundred thousand or so by the time I'm done. This will likely be spread across two or three thousand different files.
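A toy before/after, again in Python rather than my actual .NET code, of what that rename-driven restructuring looks like. Both functions and the data shape are invented; the point is that after the rename, the code reads in the language of the business rather than the language of the mechanics:

```python
# Before: names that describe the mechanics, not the business.
def process(items):
    return sum(i["qty"] * i["cost"] for i in items)

# After an intelligent rename across the whole solution: identical
# behavior, but the code is now steeped in the inventory domain.
def total_replacement_cost(stock_lines):
    return sum(line["qty"] * line["cost"] for line in stock_lines)
```

Tools like ReSharper make this safe by updating every call site in the solution at once, which is exactly why they want the whole solution in memory.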
Everything is automated. I can check out the current code from source control, build it all and deploy it to many different environments with a few simple commands. When someone checks in code that breaks existing code, our build server glows red, automated emails get sent, and so on.
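The build-server loop boils down to something like the sketch below. This is a deliberately stripped-down illustration in Python; the real server runs actual checkout/build/test commands, and the `echo` placeholders here just stand in for them:

```python
import subprocess

def run_pipeline(steps):
    """Run each (name, command) step; stop and report the first failure."""
    for name, command in steps:
        result = subprocess.run(command, capture_output=True, text=True)
        if result.returncode != 0:
            # On the real server, this is where the light turns red
            # and the automated emails go out.
            return f"BROKEN at step '{name}'"
    return "GREEN"

status = run_pipeline([
    ("checkout", ["echo", "svn checkout ..."]),   # placeholder commands
    ("build",    ["echo", "msbuild Solution.sln"]),
    ("test",     ["echo", "run the unit tests"]),
])
print(status)
```

The value is the all-or-nothing signal: any step exiting non-zero stops the line and names the culprit, so breakage is caught minutes after check-in instead of weeks later.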
My current project has me upgrading an Inventory and Purchasing system. I've got a copy of the production database running locally. As I add new features to the replacement app I'm building, I'll run a data migration to suck the data into the new schema. This verifies in almost real time that I'm on the right track, and I'm eating the migration pain as I go, addressing problems early while there's still a chance.
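The migrate-as-you-go idea, reduced to a toy Python sketch. The legacy field names (`ITEM_NO`, `DESC`, `QTY`) and the new shape are invented for illustration; what matters is that bad rows surface on every run, not on cutover night:

```python
def migrate_items(old_rows):
    """Map legacy inventory rows into the new schema, collecting problems."""
    migrated, problems = [], []
    for row in old_rows:
        if not row.get("ITEM_NO"):
            problems.append(row)  # surface bad data now, not at cutover
            continue
        migrated.append({
            "sku": row["ITEM_NO"].strip(),
            "description": row.get("DESC", "").title(),
            "on_hand": int(row.get("QTY", 0)),
        })
    return migrated, problems

new_rows, bad_rows = migrate_items([
    {"ITEM_NO": " A100 ", "DESC": "WIDGET", "QTY": "7"},
    {"ITEM_NO": "", "DESC": "ORPHANED ROW"},   # legacy junk gets flagged
])
```

Re-running this against the fresh production copy after every feature means each new slice of schema gets exercised against real data immediately.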
Feedback is critical. The sooner and more of it you get, the quicker you can correct course and get more in line with what the end users and the business need. Automated unit tests, real-time code analysis, and the ability to instantly deploy any version of the code base (to QA, end-user beta testers, trainers, documentation people, and early releases of completed slices of functionality to production) are all helped by a honkin' fast machine.
Another key theme is verbal communication. Here's a quick example: You sit down for dinner at a restaurant and read the menu. It says "The entree comes with soup or salad and bread." You ask yourself, does this mean I can have just soup? What if I want soup and bread? Then the waitress comes up, puts the free bread on the table, and asks if you would like soup or salad with what you've just ordered. Written communication is a very poor medium for conveying abstract concepts. Detailed upfront analysis and design, with thick and beautiful requirements documents and architectural diagrams thrown over the fence to the developers, is a sure sign of impending disaster. The teams I run try to defer decisions and definition of features until the last responsible moment. The code is the specification; it is constructed and built by the compiler, not by humans. When it's built and running on your machine, it's just a bunch of magnetic switches bouncing around tubes of copper. For this specification to be maintainable, it has to be readable and steeped in language that mirrors the business problems being solved.
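Amusingly, the menu sentence is exactly an operator-precedence problem, and it shows why "the code is the specification" works where prose doesn't. A compiler forces one unambiguous reading; here's the menu rendered in Python, where `and` binds tighter than `or` (the variable names are just for the gag):

```python
soup, salad, bread = "soup", "salad", "bread"

# English leaves "soup or salad and bread" ambiguous; Python does not.
# 'and' binds tighter than 'or', so this is soup or (salad and bread):
assert (soup or salad and bread) == (soup or (salad and bread))

# With no soup on offer, you get the salad-and-bread reading:
assert (False or salad and bread) == bread
```

The menu writer gets to be ambiguous; the specification that actually ships can't be.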
Ideally, I'm working side by side with real users of the app while I develop it. Now and then I will ask them a question or two, get feedback on alternate courses of action, and so on. I'll show a half-built screen, or draw lines and diagrams on a whiteboard that for that moment in time convey the idea between myself and an end user, but afterward fade away and lose any relevance once we move on. Most of the written notes we have are just index cards with bullet points of future conversations we want to have. The cards are taped to a wall and grouped around key system features. When we finish a feature, we rip up the cards. Face-to-face dialog is the preferred way to elicit requirements. Most people on this board would have no trouble understanding what a given method does in projects I work on, especially for the business-related areas of the code. I frequently sit business analysts and testers down to review code to ensure that I'm on the right track.
Here's a video of a friend of mine with real talent who demonstrates quite effectively what I'm talking about. I'm not as fast as him, but you get a really good sense of the pace at which someone with game can crank out production-quality code.
Silverlight version (like Flash): http://perseus.franklins.net/dnrtvplayer/player.aspx?ShowNum=0071
Different versions can be downloaded from here: http://www.dnrtv.com/default.aspx?showNum=71
Anyhow, the ideas above aren't mine; hopefully they give you a little background on why a machine like that might be used. One thing that impresses me more than some idiot like myself blowing a wad on today's latest components is people who are gaming XP and Linux distributions to run everything in RAM. If you do it right, you can have a fairly low-end laptop with the OS, a web browser, a graphics editor and email booting up and running completely in RAM on one or two GB of total system RAM. You start with a fresh machine every time you reboot. Less worry about getting fragged by viruses, trojans and other exploits.
Tony Stark was able to build this in a cave. With a box of scraps!