13. July 2006 20:43
I recently responded to a particularly annoying post on Slashdot regarding Windows and the fact that, before Vista, the default user created was always an administrator. You can read the comment I was responding to here. Below is my response.
I know it's hard sometimes, but please try to actually read the post you're responding to before ranting:
Once a program is running, it can do anything, up to the limits of what you yourself can do on the computer.
As far as Windows being the only OS where the user is admin by default, you're correct. Of all modern desktop operating systems, Windows XP is the only one that makes the first user an admin by default. But did you ever ask yourself why?
You claim it's a simple matter of "twenty years of fraudulent marketing bullshit trying to claim it wasn't a problem". Find me a single example of this. You can't, because you just made it up.
The fact of the matter is that Windows has a very long history on the desktop, and for a large percentage of that history it didn't even have memory isolation or a permissions system. (Read: Win X.XX, Win 9x, Win ME.) In Microsoft's defense, the Internet took them a bit by surprise. Until the Internet, desktop security wasn't an issue for anybody except businesses, and that's why they used NT.
Over those years many, many, many applications were written for those flavors of Windows. These applications all assumed they were running as admin, and for good reason... they were! It wasn't until just 5 years ago that Microsoft finally made the push to get consumers onto the NT kernel, with all its nice security features and the new world of multiple users with varying permissions. Uh oh. Therein lies the problem. Microsoft couldn't simply make users non-admin by default, because now almost all existing desktop applications, the very thing people buy Windows for in the first place, would break.
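To make that concrete, here's a minimal C sketch (the registry key and file path are hypothetical, picked purely for illustration) of the pattern countless legacy apps followed: machine-wide settings in HKEY_LOCAL_MACHINE and data files kept next to the EXE under Program Files. Both writes succeed for an administrator and fail with access denied for a limited user on the NT line, which is exactly why flipping the default would have broken them.

```c
/* Illustrative only: "ExampleApp" and its paths are made up, but the
 * pattern was everywhere in software written for Win9x. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HKEY key;
    /* Settings in the machine-wide hive: fine as admin, ERROR_ACCESS_DENIED
     * for a limited user on NT/2000/XP. */
    LONG rc = RegCreateKeyExA(HKEY_LOCAL_MACHINE, "SOFTWARE\\ExampleApp",
                              0, NULL, 0, KEY_WRITE, NULL, &key, NULL);
    printf("HKLM write: %s (%ld)\n",
           rc == ERROR_SUCCESS ? "ok" : "denied", rc);
    if (rc == ERROR_SUCCESS) RegCloseKey(key);

    /* Data file next to the EXE in Program Files: same story. */
    HANDLE file = CreateFileA("C:\\Program Files\\ExampleApp\\settings.dat",
                              GENERIC_WRITE, 0, NULL, CREATE_ALWAYS,
                              FILE_ATTRIBUTE_NORMAL, NULL);
    printf("Program Files write: %s (%lu)\n",
           file != INVALID_HANDLE_VALUE ? "ok" : "denied", GetLastError());
    if (file != INVALID_HANDLE_VALUE) CloseHandle(file);
    return 0;
}
```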
So Microsoft had to make a hard choice... break all existing applications and go out of business, or have the users run as admin by default. Tough choice.
Admittedly, Microsoft should have done a MUCH better job over the past 5 years to get people to develop Windows applications the correct way. Aside from their "Logo Certification", they've done almost nothing.
Vista's UAC is a huge step forward for Windows, and it solves a very difficult technical problem that is absolutely unique to Windows: a massive legacy software library dating back 20+ years that *must* run flawlessly on every new version of Windows. Microsoft does not have the luxury of breaking every existing application like Apple does (thanks to their extremely small, yet insanely loyal user base), nor do they have the pleasure of having a software library written with multi-user systems in mind from the get-go, like Unix/Linux.
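For what it's worth, a Vista-era program can ask whether its token has actually been elevated; under UAC even an administrator's processes start with a filtered, standard-user token until the user consents to elevation. Here's a minimal sketch using the documented TokenElevation query (nothing here is specific to my setup):

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE token = NULL;
    TOKEN_ELEVATION elevation = {0};
    DWORD size = 0;

    /* Ask the current process token whether it is elevated. Under UAC,
     * even members of the Administrators group run with a filtered token
     * until the user approves an elevation prompt. */
    if (OpenProcessToken(GetCurrentProcess(), TOKEN_QUERY, &token) &&
        GetTokenInformation(token, TokenElevation, &elevation,
                            sizeof(elevation), &size))
    {
        printf("Running elevated: %s\n",
               elevation.TokenIsElevated ? "yes" : "no");
    }
    else
    {
        printf("Query failed (error %lu)\n", GetLastError());
    }
    if (token) CloseHandle(token);
    return 0;
}
```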
Clearly this isn't as simple as Microsoft being "fraudulent", nor is it "marketing bullshit", and they certainly have never claimed it wasn't a problem.