Looking at that example again, I wonder if you would suggest they upgrade to Vista since they are looking to utilize multitasking? Personally I think this is a horrible fit for a business, particularly one that already has XP licenses that they are satisfied with.
Uh. No. I wouldn't suggest they upgrade to Vista. In fact I wouldn't recommend anyone upgrade to Vista. I don't think it's worth it.
However, when it's time to buy a new computer, getting one with Vista makes decent sense. If that time is now, and you don't need to hold onto XP for compatibility reasons, I would recommend Vista. And if you are buying a new computer now, then putting a DirectX 10 card into it is wise; it's not a hard bar to clear... even the new Intel integrated G35 can do it.
OK, there seems to be a miscommunication here, because I have tried to get this point across and it seems like you are missing it. The basic idea is that people with multiple video clips going in Final Cut, with iTunes, etc., etc... aren't building their PCs the same way a gamer does. Gamers build their PCs for what they will do the most with them, and that is of course gaming.
Right. There is a disconnect. I'm saying DirectX 10 is important for the desktop. It's no worse for single-task gaming than DX9 is. (Well, that's not quite true: DX10 is taking a performance hit, but I'm saying it's worth taking a minor performance hit given the desktop benefits and the fact that DirectX 10 and its drivers will improve; I think at this point DX9 is a dead end.) If you're a hard-core gamer spending $3k on a rig and will buy another one in 6 months, by all means buy XP, DirectX 9, and a C2D EE and enjoy your framerate uberness.
But if you're a value gamer aiming at a $1k-2k rig and unlikely to upgrade for 2 years, DirectX 10 on Vista is the better proposition. It will last you longer, and the performance is likely to get better as the DX10 drivers continue to mature.
So I have to ask what iTunes and Final Cut have to do with gaming? I also wonder what kind of things iTunes is doing that would even remotely task modern cards... even with Final Cut going crazy in the background. If anything, iTunes is an insignificant factor and Final Cut is the source of the real workload in that example.
iTunes Cover Flow, or the 3D visualization of music, can both benefit from hardware acceleration. The point was that Vista (or OS X or Linux) can support hardware acceleration in multiple windows at the same time, along with the desktop itself. GPU multitasking is a key part of how it achieves those effects.
How does it affect gamers? First, it doesn't really hurt them. And second, the ability to run games in a window at full acceleration and interact with other (possibly hardware-accelerated) applications at the same time is handy, even for gamers. I play Warcraft and have iTunes, browsers, and so forth all running at the same time, for example. Sometimes I'll even watch a movie while soloing in Warcraft. XP doesn't do this terribly well.
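If you want to see whether that composited, GPU-accelerated desktop path is actually active on a given box, you can just ask the DWM. A minimal C++ sketch, assuming Vista or later (where dwmapi.dll exists):

```cpp
// Minimal check: is Vista's composited (GPU-accelerated) desktop active?
#include <windows.h>
#include <dwmapi.h>
#include <stdio.h>

#pragma comment(lib, "dwmapi.lib")

int main(void)
{
    BOOL enabled = FALSE;
    // DwmIsCompositionEnabled reports whether the Desktop Window Manager is
    // compositing the desktop -- the path that lets many windows share the GPU.
    if (SUCCEEDED(DwmIsCompositionEnabled(&enabled)))
        printf("Desktop composition (DWM): %s\n", enabled ? "on" : "off");
    else
        printf("Could not query the DWM.\n");
    return 0;
}
```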
No, the future is tomorrow. What you are saying is the untranslated advertising propaganda...
No, the future is now. OS X and Linux have had this ability for a while now. GPU multitasking in Vista is a -good thing-, and it catches Windows up with the other desktop OSes.
The desktop *shouldn't* be a GPU-taxing application... it ***can*** be done in a way that is as slick and streamlined as any desktop out there, and without the resource requirements. But that doesn't sell hardware, and it doesn't help them reinforce the business model I translated above.
It's not GPU-taxing in the way that a game is. It just needs the right drivers. Like I said, Macs can ALREADY do this with "low-end cards", and even Vista is satisfied with the new Intel integrated graphics.
The problem is that Microsoft "bundled" GPU multitasking with DirectX 10. Nearly any card made in the last 3 to 4 years, including integrated stuff, can handle the GPU rendering/multitasking requirements of the Vista desktop, and if DirectX 10 drivers were released for those cards, you could use them and benefit from GPU multitasking. However, those DX10 drivers can't really be released, because some of the OTHER technical requirements of DirectX 10 can't be satisfied on that hardware... e.g. part of DirectX 10 is the abolishment of capability bits, and DirectX 10 also requires the DRM stuff... and the vendors can't just decline to support those "features".
I understand why Microsoft did it. DX10 is a 'clean slate'. They needed a new driver model to support GPU multitasking and DRM. And abolishing capability bits at this point to make life better for developers was a good idea too.
BUT doing them all at once made backporting GPU multitasking to DirectX 9 cards an issue.
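For anyone who hasn't written against it, the capability-bits model is roughly this: a DX9 app has to query what the installed card happens to support and code fallback paths around the answers, whereas a DX10 device that creates at all is guaranteed the full feature set. A rough C++ sketch of the DX9 side (the specific caps queried here are just illustrative):

```cpp
// A rough sketch of the per-card capability probing a DirectX 9 app must do.
// Under DirectX 10 a device that creates successfully is guaranteed the full
// DX10 feature set, so these checks (and the fallbacks behind them) go away.
#include <d3d9.h>
#include <stdio.h>

#pragma comment(lib, "d3d9.lib")

int main(void)
{
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        // A few examples of the capability bits whose values differ card to card.
        printf("Max simultaneous textures: %lu\n",
               (unsigned long)caps.MaxSimultaneousTextures);
        printf("Power-of-2 texture restriction: %s\n",
               (caps.TextureCaps & D3DPTEXTURECAPS_POW2) ? "yes" : "no");
        printf("Pixel shader version: %d.%d\n",
               (int)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
               (int)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
    }

    d3d->Release();
    return 0;
}
```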
But your proposition really is one of "pay now for what you will get later... see, look at how it worked in the past". Consumers care about where the technology is when they buy it.
I'm not saying that. I'm just saying "don't -cling- to the past and reject DirectX 10". It's definitely the case that DirectX 10 isn't a compelling upgrade for gamers today. But the -desktop- needs DirectX to take this step. And it's silly for gamers to trash it.
I don't get how you can say "GPU multitasking doesn't require a state-of-the-art card. It just requires an OS kernel and graphics system that CAN do it," and not realize how much that clashes with the reality of the end-user Vista experience?
The confusion arises from the fact that you -do- need a card with DirectX 10, but a DirectX 10 card does not have to be an expensive monster powerhouse. The new Intel G35 integrated chipset, for example, will do nicely. The problem is that most low-end cards are still DirectX 9, and so the experience is worse than it should be. As I said before, a lot of these older DirectX 9 cards are powerful enough, and would work well if they had DirectX 10 drivers... but they don't, and can't (thanks to the capability bits and DRM issues, for example).
When MS gets this to the point that it is useful and actually provides benefit to the customer, that's when the customer will be interested. I don't buy the idea that, because we are in the tough years of the OS cycle and history says things will be OK (which I actually disagree with, but that's a whole other argument), we should stick with them regardless of how illogical it is and it will all be OK again!
Vista does provide benefit to the consumer.
GPU multitasking is an important step forward, because it allows us to have applications like Final Cut Pro, OS X-like user interfaces, and so on. Consumers do want that.
Moving 64-bit memory addressability into the mainstream is also a crucial step forward. Vista x64 may still be in the minority, but it's not the bastard half-brother with almost no driver support that XP x64 is.
Getting rid of the administrator account as the default, requiring escalation to write to the Windows system folder, and the other security features are also all crucial steps Windows had to take, but they broke a lot of applications. Security is a hard sell, especially when it breaks stuff. But it's a necessary step.
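For what that escalation model looks like from a program's side: a process can check whether its token has actually been elevated before it tries to write somewhere protected like the Windows folder. A hedged C++ sketch, assuming Vista or later (the helper name is mine):

```cpp
// Sketch: ask whether the current process token has been elevated through UAC.
// IsProcessElevated is a hypothetical helper name; TokenElevation requires Vista+.
#include <windows.h>
#include <stdio.h>

#pragma comment(lib, "advapi32.lib")

BOOL IsProcessElevated(void)
{
    HANDLE token = NULL;
    TOKEN_ELEVATION elevation = {0};
    DWORD size = sizeof(elevation);
    BOOL elevated = FALSE;

    if (OpenProcessToken(GetCurrentProcess(), TOKEN_QUERY, &token))
    {
        if (GetTokenInformation(token, TokenElevation, &elevation,
                                sizeof(elevation), &size))
            elevated = (elevation.TokenIsElevated != 0);
        CloseHandle(token);
    }
    return elevated;
}

int main(void)
{
    // Without elevation, writes to protected locations like %SystemRoot%
    // will fail (or be virtualized) under Vista's default non-admin token.
    printf("Running elevated: %s\n", IsProcessElevated() ? "yes" : "no");
    return 0;
}
```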
And I agree the hardware requirements and backwards compatibility issues make an immediate upgrade uncompelling in the extreme, and I don't suggest anyone upgrade. But rather, look at Vista the same way we looked at Windows 95 at launch. It wasn't compelling for gamers at all when it launched. It wasn't compelling for businesses either. But it was compelling for its desktop. And it was crucial that we move to a pre-emptively multitasked 32-bit OS. It laid the groundwork for the future.
If MS believes in its vision then they can justify making the investment and they will be vindicated when their advancements blow people away. But when Vista can hardly muster a stiff breeze I don't think it is hard to see why people are taking a pass on this one.
Give it time. I'm not saying you should buy it today, and you definitely shouldn't upgrade an existing machine to use it. But it does take Windows an important step forward, and when you are buying your next machine, it's not like you have to spend extra to support it, and at that point it is a decent value proposition.