I came across an interesting paper posted to Digg regarding parallel computing. The author encourages developers to abandon threads and shift their efforts to true parallel computing. All in all it's a good article, but for the most part it was all said about 60 years ago by John Von Neumann.
It's a shame that most people don't know much about John Von Neumann. Even the Wikipedia page about him is somewhat lacking in detail regarding his incredible contributions to our modern world. I wouldn't hesitate to say that John Von Neumann had a greater impact on our world than almost any other person in the past 100 years.
John Von Neumann is responsible for virtually all modern-day computer architecture. His paper, with the exciting title of "First Draft of a Report on the EDVAC", which he wrote while working on the then-top-secret ENIAC project, is still a very accurate description of basically all computers in use today.
But this man's brilliance isn't in that he invented the machines on which our world is built, but that he considered his invention to be primitive, almost ugly, and that he envisioned a far more advanced form of computer that would make today's most powerful machines look much like the ENIAC does when compared to the machine I'm writing this on. His ideas on the future of computer architecture have yet to be fully realized, but the work being done on parallel computing is as close as we've come.
Almost all of today's computers are, for the most part, serial in nature. They do one thing at a time. To simulate the appearance of "multi-tasking", or doing more than one thing at a time, they very quickly switch between tasks. In fact, one of the primary jobs of an operating system (like Windows, Linux, or Mac OS) is to take charge of this task switching to make the computer as responsive as possible.
A lot of the "advances" in computer speed over the past few decades have been in getting the CPU to kinda do multiple things at the same time. Whether that's through multiple CPUs, multiple cores, or multiple simultaneous threads, the reason we're jumping through these hoops is because our computers are serial.
One example of a parallel computer in use today is the graphics processing unit (GPU) on some of the newer 3D video cards. The GPU is a fairly primitive parallel computer, but its parallelism is what gives it the ability to produce those great graphics we all love so much. Alas, its specialized nature makes it a poor substitute for the generalized parallel computer that John thought up 60 years ago. There has been some interesting work in trying to adapt GPUs to make them usable for more than just graphics, but they have a long way to go.
John Von Neumann's talents were not limited to computing. He made great contributions to economics, quantum mechanics, and even military weapons research. If only he had lived a few more years (he died tragically at the age of 54 due to cancer he likely contracted as a result of the atomic bomb tests he witnessed) we would probably be decades ahead of where we are today. In fact, I'm not sure why he isn't on my list of personal heroes. I think I'll add him.
I've recently migrated to using Microsoft Team Foundation Server for nearly all my source control, work item tracking, and build management. So far I absolutely love it. For the most part, I'll never look back at VSS again... but every once in a while I encounter something that TFS is missing that VSS did easily.
One such case arose a few days ago when I had to figure out exactly which files had changed in a source tree since my last production build. I do this to manually check the scope of changes in a project. It's essentially a sanity check. In VSS this was trivial: just open up VSS, do a search, and refine by label and/or date and time. Done.
In Team Foundation Explorer, I found no way of doing this. I could easily get a list of changesets since a label, but that really doesn't help unless I want to go through each and every changeset and write down which files changed in it. (In other words, double-click on every changeset to open its files window.) Considering the size of my project, this would take hours. Honestly, I was shocked there wasn't a way to do this.
After a little research I decided the only way to do this was to write a quick utility. Luckily, programming against TFS is fairly easy. Despite the relative lack of documentation for the API, it's pretty easy to figure out. After about 15 minutes I had a working app, and after about an hour I had an application that was good enough for anybody to use. It's written in C# 2.0 and leverages the Microsoft.TeamFoundation.* libraries.
I call it TFRecursiveHistory. You give it a server, a project path, a label, and optionally a list of extensions, and it will output a list of TFS source control file paths for files that have changed since that label was applied. Works like a charm. Here is an example:
TFRecursiveHistory.exe /s:MyServer /p:"$/My Project/Main" /l:18.104.22.168 /ext:.cs,.cpp,.aspx
That would give you a list of all the cs, cpp, and aspx files that have changed since the label 18.104.22.168 was applied to the source tree "$/My Project/Main".
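The heart of the utility is a single history query. Here's a minimal sketch of that query using the Microsoft.TeamFoundation.Client and Microsoft.TeamFoundation.VersionControl.Client assemblies — the server name, project path, and label below are placeholders, and this is a rough illustration rather than the actual TFRecursiveHistory source:

```csharp
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

class RecursiveHistorySketch
{
    static void Main()
    {
        // Connect to the TFS server and grab its version control service.
        TeamFoundationServer tfs = TeamFoundationServerFactory.GetServer("MyServer");
        VersionControlServer vcs =
            (VersionControlServer)tfs.GetService(typeof(VersionControlServer));

        // Ask for every changeset between the label and the latest version,
        // recursing through the whole tree and including per-file changes.
        System.Collections.IEnumerable history = vcs.QueryHistory(
            "$/My Project/Main",              // path to search (placeholder)
            VersionSpec.Latest,               // version of the path itself
            0,                                // deletion id (0 = not deleted)
            RecursionType.Full,               // recurse into subfolders
            null,                             // changes by any user
            new LabelVersionSpec("MyLabel"),  // from: the label (placeholder)
            VersionSpec.Latest,               // to: the latest version
            int.MaxValue,                     // no limit on changeset count
            true,                             // include individual file changes
            false);                           // slotMode

        // Flatten the changesets into a list of changed server paths.
        foreach (Changeset cs in history)
            foreach (Change change in cs.Changes)
                Console.WriteLine(change.Item.ServerItem);
    }
}
```

The real utility would additionally de-duplicate the paths and filter by the extension list, but the `QueryHistory` call above is the part that replaces hours of double-clicking through changesets.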
You can download it from my Code section, along with the source and VS.NET 2005 project files.
Buck Hodges, a developer lead for Team Build at Microsoft, posted a link to my utility along with a warning regarding its accuracy. Essentially, because a TFS label is not a point in time, it's possible that the file list won't be completely accurate if the label has changed. So use at your own risk.
Microsoft recently let it slip that there would be no support for "next gen" HD content on 32-bit Vista. Instead, in order to play HD-DVD or Blu-ray content you'll need to get the 64-bit version of Vista, and you'll obviously need some fancy new 64-bit hardware.
The reason for this change is that 32-bit Vista allows unsigned kernel-mode drivers while 64-bit Vista does not. Microsoft allows 32-bit Vista to run unsigned drivers because without this ability upgrading would be a nightmare. Since 64-bit machines are just starting to become mainstream, the restriction won't be nearly as much of an issue for 64-bit Vista.
For those of you who don't know, driver signing is a process of verifying the identity and integrity of a driver before it is allowed to be loaded by Windows. Since the only party who can sign drivers is Microsoft, it allows Microsoft to fully test drivers before signing off on them. This will increase both the security and the stability of Windows more than anything else Microsoft is doing in Vista.
Of course this also means that pirates can't create drivers to bypass the DRM on HD-DVD or Blu-ray content. That's why "Media Companies" like it, and that's why 32-bit Vista won't natively support HD-DVD or Blu-ray content.
The immediate reaction to this was one of scorn, contempt, and anger. As usual, I disagree with these sentiments wholeheartedly.
The basic gist of why people are mad is that they think Microsoft is being a lapdog for the "Media Companies" and placing arbitrary restrictions on how people use their computers to appease the big, bad content producers.
What people seem to be missing is that the "Media Companies" are the ones who are going to be producing all of this content, so it's up to them to determine the terms of sale. It's not a question of DRM-free HD content or DRM-restricted content, it's a question of DRM-restricted content or no content at all.
Microsoft adding this restriction doesn't limit choice or "freedom". If anything, it creates MORE choice. It gives a big incentive to media companies to produce new HD content that people want. If the media companies thought their investments would go to waste thanks to piracy, they would be far less inclined to make those investments, and we wouldn't have the choice to view that content at all because it wouldn't exist in the first place.
It's not like this is going to stop anybody from producing "HD" content that plays without restriction. There are dozens of media formats available right now that allow you to reach HD resolutions. If you don't want DRM in the content YOU produce, this will not affect you at all.
Turns out that the 32-bit version of Vista will be able to play next gen HD content after all, just not natively. You'll need 3rd-party applications to do so, but it will work fine. Regardless, I think my points are still valid.
The best programmers are up to 28 times more productive than the worst programmers.
Rob Walling over at softwarebyrob.com has an interesting piece detailing what he thinks are the personality traits that separate the good programmers from the bad.
I couldn't agree more... both about the traits and the tabs. (Read the article to find out what I'm talking about. :)
Have you ever wondered how the software that runs the space shuttle is written? Well, the answer is very, very carefully.
A decent piece of commercial software typically has around 20 bugs for every 1000 lines of code written. A really good piece of software might have 1 bug for every 1000 lines of code.
How many bugs per 1000 lines of code does NASA have? Try about 0.0024. The software that controls the shuttle has about 420,000 lines of code, and only a single bug was found in each of the previous three versions. Wow.
When you're writing software that absolutely must work or people die, the processes involved are a bit different from all-night coding sessions fueled by Jolt and pizza.
This particular article claims that, eventually, all software will be written this way. I have to disagree. It doesn't make sense to spend incredible amounts of time (and therefore money) decreasing your bug rate from 20 per 1000 lines to 1 per 1000 lines if it's cheaper to fix those bugs after the software has shipped and they're discovered by customers, as long as your customers don't die and they don't lose a $4 billion spacecraft.
It's all about trade-offs. Sometimes having a bug matters a lot more than other times.
This is a test post from Windows Live Writer, some new software from Microsoft that allows you to write your blog posts in a nifty WYSIWYG interface.
So far, so good. :)
Apple's WWDC (Worldwide Developers Conference) was yesterday. This is essentially Apple's version of Microsoft's PDC.
As with almost all Apple-related media events, Apple spent a good amount of time bashing Microsoft and claiming that everything that Microsoft does is a copy of Mac OS. This year's tag line was "Introducing Vista 2.0". Uh huh.
But who is really copying whom? It is very true that Microsoft has taken many UI cues from Apple over the years, but the copying is definitely not a one-way street.
One particularly glaring example is one of the most prominent new features in OS X 10.5, the "Time Machine". This is basically a feature that lets you retrieve old versions of any file or folder. Aside from a much cooler name and a prettier interface, this feature has been in Windows for over 3 years. It's called Shadow Copy.
Apple is obviously preaching to the choir at these events, so I can understand their desire to bash the evil empire. But the fact of the matter is that while Apple turns out releases far faster than Microsoft, Microsoft is often the one to create the features first. It's easy to claim innovation when you're the first to market, but it's also easy to copy features that have been publicly announced for several years.
Update: Looks like Paul over at WinInformant agrees with me. Or I agree with him. Or whatever.
Joel Spolsky (who writes the fairly well known JoelOnSoftware.com blog) has written a fairly interesting post asking "Can Your Programming Language Do This?"
Yes, yes it can! Since version 2.0 of .NET, the CLR has fully supported anonymous methods. In fact, version 3.0 will support anonymous types, partially in order to facilitate some great stuff like LINQ.
I've seen several people comment that delegates (or unmanaged function pointers) are functionally (no pun intended) equivalent to anonymous methods. This isn't really true.
Anonymous methods really shine exactly because you don't have to prototype the method first. Yes, you could do basically the same thing with plain delegates, but you would have to declare the delegate type, create a named method that matches the delegate signature, create a delegate instance, and finally pass that instance to the method in question.
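To make the difference concrete, here's a small sketch in C# 2.0 (the type and method names are my own, invented for illustration). The first call does the full delegate dance; the second just passes an anonymous method inline:

```csharp
using System;

class AnonymousMethodDemo
{
    // The old way starts with declaring a delegate type...
    delegate bool IntPredicate(int value);

    // ...and writing a named method that matches its signature.
    static bool IsEven(int value) { return value % 2 == 0; }

    // A method that takes a delegate, like Array.FindAll or List<T>.Find do.
    static int CountMatches(int[] values, IntPredicate test)
    {
        int count = 0;
        foreach (int v in values)
            if (test(v)) count++;
        return count;
    }

    static void Main()
    {
        int[] numbers = { 1, 2, 3, 4, 5, 6 };

        // The long way: wrap the named method in a delegate instance.
        Console.WriteLine(CountMatches(numbers, new IntPredicate(IsEven)));          // 3

        // The C# 2.0 way: no prototype, no named method, just inline code.
        Console.WriteLine(CountMatches(numbers, delegate(int v) { return v > 4; })); // 2
    }
}
```

Under the hood the compiler still generates a method and a delegate instance for you, so the two calls are equivalent at runtime; the win is that you no longer have to declare and name all that plumbing by hand.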