Monday, 31 December 2007

Recently I've been playing with my OS installations quite a bit. Set up Linux on this disk, XP on that one, another copy of XP there and maybe Vista over there.
Well, Vista eventually had to go, but I kept the Linux and XP installations. What remains are the memories of all the problems I had installing these operating systems. And guess what - all the problems stem from the fact that none of these systems really read the BIOS settings telling them what I wanted to do.
OK, I really have to be fair to Ubuntu: at least it attempted to install exactly where I told it to. Unfortunately, things didn't go quite as smoothly as expected, and I had to manually place GRUB on the appropriate disk. But at least it didn't mess with the other three disks I have.
The story was completely different for XP and Vista. I set the appropriate drive as the boot drive in the BIOS, but the crap installer put the boot loader on whichever drive it saw fit at that particular time. Notifying me about the chosen disk was, of course, out of the question.
Then I tried disabling (in the BIOS) all disks but the one I wanted to install XP to. One would hope that at least this would work. Wrong!!! XP's IDE drivers simply ignore any such settings in the BIOS and continue showing all the disks that are attached. The only solution to this problem is to physically disconnect the disks.
Modern OSes don't even use the drivers provided in the hardware's BIOS except in some ultra-special fallback modes. Those of us old enough will still remember all those int xx functions with which one could access the hardware in an easy, but rather slow way. Pretty much every piece of hardware still ships with a BIOS providing those interrupt functions, although nowadays they are rather poor in number and functionality. The reason for this, I think, is simple: everybody expects that a driver for your chosen OS will take over and provide the full functionality.
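For those who never poked at them, here's a rough model of how those interrupt services hang together - a schematic sketch in C, not real firmware code. The service numbers mirror the classic int 10h / int 13h conventions, but the handlers and the dispatch function are invented purely for illustration:

```c
/* Schematic model of classic BIOS interrupt dispatch: the interrupt
 * vector table maps an int number to a handler, and the handler
 * switches on a function code (historically passed in register AH). */
#include <stdio.h>

typedef int (*bios_service)(int function, void *args);

static int video_service(int function, void *args) { /* int 10h analogue */
    switch (function) {
    case 0x0E: putchar(*(char *)args); return 0;  /* teletype output */
    default:   return -1;                         /* unsupported */
    }
}

static int disk_service(int function, void *args) { /* int 13h analogue */
    (void)args;
    switch (function) {
    case 0x02: return 0;  /* "read sectors" would live here */
    default:   return -1;
    }
}

/* Vector table: interrupt number -> handler. */
static bios_service vectors[256] = {
    [0x10] = video_service,
    [0x13] = disk_service,
};

static int bios_int(int number, int function, void *args) {
    return vectors[number] ? vectors[number](function, args) : -1;
}

int main(void) {
    char c = 'A';
    bios_int(0x10, 0x0E, &c);  /* like mov ah, 0Eh / int 10h */
    putchar('\n');
    return 0;
}
```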
So, what's the use of BIOS then anyway?
With current solutions, all one needs from the BIOS is that tiny little bit of functionality that allows booting from CD, HDD or some such device. Everything else can be sorted out later with the appropriate drivers for the OS being booted.
Of course, things aren't quite that simple. The most obvious example of hardware that isn't managed by the OS directly is RAM. Overclockers out there will know what I'm talking about: currently there are no drivers for any OS that would manage how RAM works. The same goes for system buses (PCI, PCI-Express, AGP, HTT and the like), and a lot of this goes for the processor as well. There are many aspects of today's computer that still get managed by the BIOS directly. But that doesn't make it a Basic Input/Output System now, does it?
But still, why have any of this in the BIOS when we could just as well have a few extra drivers for the OS and be done with it? It's not like the 512 KB of ROM motherboards carry could ever compete with today's multi-GB operating systems. The fail-safe functionality needed for boot could just as well be placed in 64 KB of ROM; everything else is just unnecessary duplication of code.
What I hate most about this code duplication is how some settings are respected while others are not. For example, to make power management work, it has to be set just right in the BIOS as well as in the OS. Integrated NICs and their drivers obey the on/off switch, and pretty much the same goes for other peripherals, but other settings for those same peripherals are simply moot, since the driver makes sure the peripheral in question works. Disks, on the other hand, are supported in the BIOS just for the purpose of booting, nothing else. A few milliseconds after boot the driver is loaded and everything set in the BIOS is forgotten...
Most ridiculous of all is that we have all the necessary drivers in the BIOS, but your OS of choice won't be able to use the device until you provide it with the appropriate drivers. Which is why I'm writing this post.
You see, I'm not really against drivers in the BIOS. On the contrary: I believe the BIOS should provide the drivers for all hardware in the computer. Without exception!
These drivers should:
1) Provide a fast, overhead-free interface directly to the hardware
2) Manage all IRQ, DMA and similar stuff through universal interfaces
3) Provide just the functionality of the hardware, nothing more, nothing less. Application-specific bug fixes of the kind graphics drivers are full of must remain an OS-specific (or better yet, application-specific) feature
4) Integrate with the BIOS on a code-segment basis, eliminating excessive function calls and such
5) Support long-term standardization of hardware communication protocols
6) Provide an easy method for upgrading the driver at a later time
7) Many other things, but I'm too lazy to think of them right now
What I'm proposing here is a bit of a paradigm shift: currently the OS manages the hardware of interest entirely - by using drivers. I believe this is not optimal for the following reasons:
1) Drivers have to be written separately for each OS, and possibly for each major version of the OS, creating lots of duplicated work (and bugs) for driver developers
2) The plethora of operating systems out there makes it practically impossible to write drivers for all of them, increasing consumers' problems with hardware purchases
3) It creates usability problems, because each hardware manufacturer does its drivers its own way. Drivers and supporting applications look different from manufacturer to manufacturer, steepening the learning curve for users
I've always found it super annoying to have to find the correct drivers for my on-board Ethernet NIC, sound card or anything else for that matter. If you've ever installed an OS on your own, especially one that is a couple of years old, you'll have noticed that half your hardware is not recognized at installation, leaving you with plenty of work installing the latest drivers. That is, if you can figure out what all those "Unknown device" entries in the hardware manager are. The most annoying drivers of all to install are the NIC drivers: if you forgot to download them before installing the OS, you have no way of accessing the internet, which means you have to install them from the CD that came with your computer (if you still have it) or find a friend who can download them for you. Great - if you're like me (installing an OS at night), you won't be so stupid as to go wake your friend up at 3am, will you?
So, my proposition is to make the BIOS really a BIOS again, but this time a bit more modern. DOS had no problem just linking all the BIOS interrupt functions into its own int 0x21. That was fast enough, since the hardware was so slow at the time. Modern operating systems simply became easier to write drivers for than the BIOS, and it all started with Windows 3.x anyway; DOS had no need for a graphics accelerator, you see. That's one of the reasons why we have this ridiculous situation today. At least, it's ridiculous in my opinion.
What we need today is simply a well-designed method that allows fast and reliable access to all the hardware's functions, while at the same time allowing easy integration into any OS supporting the new model.
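To make the idea concrete, here's a minimal sketch of what such a firmware-resident, OS-neutral driver object might look like. Everything here is hypothetical - the struct layout, the names and the fake NIC driver are invented for illustration, not any existing specification:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical driver object the firmware would export per device. */
typedef struct fw_driver {
    const char *device_id;   /* e.g. a PCI vendor:device pair */
    uint32_t    version;     /* so the driver can be upgraded in place later */
    int  (*init)(void);
    int  (*read)(uint64_t off, void *buf, size_t len);
    void (*fini)(void);
} fw_driver;

/* A fake NIC driver standing in for real firmware code. */
static int nic_init(void) { puts("nic: up"); return 0; }
static int nic_read(uint64_t off, void *buf, size_t len) {
    (void)off;
    memset(buf, 0, len);     /* pretend we DMA-ed a received frame here */
    return (int)len;
}
static void nic_fini(void) { puts("nic: down"); }

/* The firmware-resident table an OS would walk at boot,
   instead of shipping its own driver binaries. */
static const fw_driver fw_table[] = {
    { "pci:8086:100e", 0x0100, nic_init, nic_read, nic_fini },
};

int main(void) {
    uint8_t frame[64];
    for (size_t i = 0; i < sizeof fw_table / sizeof fw_table[0]; i++) {
        const fw_driver *d = &fw_table[i];
        printf("found %s, driver v%u.%u\n", d->device_id,
               d->version >> 8, d->version & 0xFFu);
        d->init();
        d->read(0, frame, sizeof frame);
        d->fini();
    }
    return 0;
}
```

The point of the function-pointer table is exactly requirement 1 above: once the OS has the pointers, calls go straight to the firmware code with no interrupt or mode-switch overhead.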
With this, we provide the following benefits:
1) Provide a standard hardware interface for all operating systems
2) Driver developers only have to worry about the 8-, 16-, 32- or 64-bitness of their drivers, and even that could be taken care of by a small p-code-to-machine-code compiler built into the motherboard BIOS. This could even eliminate platform differences (the same driver working with a MIPS, ARM, x86 or any other processor) - see the toy sketch after this list
3) Virtualization becomes a snap. The BIOS would become the platform for virtual servers.
4) No more trouble finding the drivers for your OS of choice
5) Maybe, just maybe, the basic functionality of common hardware could in time be covered by a single, generic driver built into the motherboard BIOS
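Here's what point 2 means in miniature: the driver ships as tiny, CPU-neutral bytecode, and the firmware interprets it (or compiles it down) on whatever processor is present. The opcodes are invented for the example - the same blob of bytes would run unchanged on any architecture:

```c
/* A toy p-code interpreter: a stack machine with four made-up opcodes.
 * Real firmware would JIT this to native code instead of interpreting. */
#include <stdint.h>
#include <stdio.h>

enum { OP_PUSH, OP_ADD, OP_OUT, OP_HALT };

static int32_t run(const uint8_t *code) {
    int32_t stack[32];
    int sp = 0;
    for (size_t pc = 0;;) {
        switch (code[pc++]) {
        case OP_PUSH: stack[sp++] = (int8_t)code[pc++]; break;
        case OP_ADD:  sp--; stack[sp - 1] += stack[sp]; break;
        case OP_OUT:  printf("out: %d\n", stack[--sp]); break; /* "write to a device register" */
        case OP_HALT: return sp ? stack[sp - 1] : 0;
        }
    }
}

int main(void) {
    /* the "driver blob": push 2, push 3, add, write result, halt */
    const uint8_t driver_blob[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_OUT, OP_HALT };
    run(driver_blob);
    return 0;
}
```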
I hear EFI (the Extensible Firmware Interface) already handles a lot of what's proposed here. But if I understand its specification correctly, it is again a compromise, offering just basic functionality. Not to mention it isn't being used, despite its obvious advantages over the now 30-year-old PC BIOS...
I wonder who was too powerful this time...
Saturday, 22 December 2007
Why Vista disappointed me so much
I'm sorry - the sheer flood of recent hate posts about MS Windows Vista certainly doesn't warrant yet another one adding to the negative hype. But I just couldn't help myself; I'm so disappointed that I feel compelled to write a little bit about this OS.
So, without further ado, let's start with the good first:
Finally!!!! I don't have to select Slovenia / Slovenian five times during installation any more. One selection sets them all properly, and if I later want to add support for additional character sets, I have the option to do so. I really like this, and MS certainly delivered on this little tidbit.
It looks good. I was always a sucker for eye candy, and the look of Vista certainly is enjoyable. I think the look is clean and really professionally done. On XP you can choose any theme and load any 3rd-party software, but it will never have that cleanliness and smoothness - at least I can't find anything that duplicates the effect. I've also tried Compiz on Linux, and while it is technologically way more sophisticated than anything Vista delivers, it's the good measure of aesthetics and just the right measure of effects that make Vista superior in my eyes. I know you can set up just about the same thing in Compiz, but I must say that sometimes too many options can also be a pain. Even when I know exactly what I want, I sometimes spend hours looking for the setting that would do it for me, and out of the box Compiz simply doesn't provide a preset that appeals to me. Anyway, Vista looks good.
I like what they've done to some of the Control Panel applets. Some of them look cleaner and make managing their respective settings easier than was possible before. However, also see the Control Panel remarks in "the bad" part.
The open file dialog has a minor addition that makes my life a bit easier: the favorites list. It's sooo cool, but at the same time so unmanageable. It's so hard to set it up so that it always shows my favorite folders at the top and the less favorite ones lower down... This certainly is a feature I immediately liked, but it also delivered tons of frustration.
Ahem, that's about it. Aside from some other minor things, this is just about everything I like better in Vista than in XP. Did I forget anything? Maybe, but I really don't think so. A shame, really, since MS invested so much into the making of this OS. After all this work, everybody just bashes the product... Well, it seems a lot of the work may have gone into the wrong code.
So, what's the disappointment?
There are so many things Vista didn't deliver for me that I don't even know where to start, really.
Let's start with bloat. The thing takes over 8 GB of my disk space and 1.1 GB of RAM just to run the most basic services I require. Sorry - that the OS thinks I require. WTF?!? This is an OS, not a complete replica of the human brain. I hope. At least the functionality doesn't suggest a particularly high level of intelligence. I believe XP is bloated, but compared to Vista it's an angel. I'm sorry, but I really believe an OS should never take more than a few MB - I'm talking less than 100 here. Yes, OK, Vista does have some fancy animations and some pretty pictures, but those most certainly don't take 8 GB. With XP I had the feeling my 1 GB of RAM was more than sufficient for anything, but just installing Vista made me go buy another GB in the hope that after the upgrade the OS would show at least some superiority over XP. Boy, was I wrong :(
Display. Wait a second, I just praised its looks a few paragraphs back. Yes, I did, but at the same time Vista's display capabilities are one of my greatest disappointments. I really expected that the new and shiny WPF would finally dispense with the pixel and just draw vector graphics - properly antialiased and all, mind you. This would finally allow high-resolution displays to gain entrance to the market. And no, I'm not talking 2560 x 1600 resolution here; I'm talking 200 DPI, 300 DPI and the like. Size and final resolution don't matter. Try working with XP on a 300 DPI display (if there were such a display) and you'll see what I'm talking about: fonts so small you can't read them, icons not much better, and just you try to resize that window - you'll never hit those 3 px of headroom you have. Vista doesn't bring anything new to the table in this respect, and just using "large fonts" creates a myriad of problems on both systems. So, yes, this is a huge disappointment for me.
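To put numbers on this, a quick back-of-the-envelope sketch - the 16 px element height is just an assumed typical icon/font size, not anything measured from Vista:

```c
/* A UI drawn in fixed pixels shrinks physically as display DPI rises:
 * physical size in inches = pixels / DPI. */
#include <stdio.h>

int main(void) {
    const int ui_px = 16;              /* assumed typical UI element height */
    const int dpis[] = { 96, 200, 300 };
    for (int i = 0; i < 3; i++) {
        double inches = (double)ui_px / dpis[i];
        printf("%3d DPI: %d px = %.2f in (%.1f mm)\n",
               dpis[i], ui_px, inches, inches * 25.4);
    }
    return 0;
}
/* At 96 DPI those 16 px are about 4.2 mm tall; at 300 DPI only about
 * 1.4 mm - unreadable, which is why resolution-independent (vector)
 * rendering matters. */
```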
I also had high hopes for I/O. You know - when some service really hammers the disk and you can't even start Explorer properly? Waiting, waiting and a bit more waiting is what remains with Vista. And to top it all off, there is a myriad of services in Vista that only add load to the disks, or any other streaming device for that matter. In fact, there are so many that the disks practically have to work constantly. OK, so I'm exaggerating a bit, but it's a pretty good description of what's going on in your system. Anyway, Vista unfortunately doesn't make I/O faster, only slower - and I'm talking real-life usage here, not theoretical peaks. Oh, and did I mention that setting the "compressed" flag on a large file still yields that unresponsive dialog box which tells you absolutely nothing until the file is actually compressed? Yep, it does.
While we're on the amount of services, I have to say that Vista surely delivers on the numbers. A service for this, a service for that, all doing something on my computer - and I'd really like to know what that something is. Then I decide a particular service is of no use to me and I can't even turn it off! And that stupid time service is still a service. (I'm just naming an example here, showing the pattern; pick your own "favorite" service name instead of mine.) Anyway - why on earth would I want something that executes once a week to update my computer's time to run as a service? I for one really don't know. I have a 50 KB utility I picked up somewhere on the net around '96-'97, and I set it to run every midnight. No fuss; it just sets my computer's clock to what the time server says is the correct time. Unlike the super-duper always-running service, which has to run all the time only to remind me that the clock cannot be set because it differs too much from the time server's. Yes, I had manually altered the computer's clock to test some program's functionality. What's it to you, stupid service? Just set my clock so that I won't have to. I changed the date, hours AND minutes, repeatedly, by various amounts - I really can't tell by how much. Do I really have to find another clock and set the time back manually for you to grace me with a properly synchronized computer? Idiotic.
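For the record, this is roughly all a once-a-day clock sync has to do. A minimal SNTP query in C - POSIX sockets, no error handling; it prints the server time instead of setting the clock (which would need privileges), and pool.ntp.org stands in for whatever server you'd actually use:

```c
#include <arpa/inet.h>
#include <netdb.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <time.h>
#include <unistd.h>

int main(void) {
    unsigned char pkt[48] = { 0x1B };  /* LI=0, VN=3, Mode=3 (client) */
    struct addrinfo hints = { .ai_socktype = SOCK_DGRAM }, *srv;

    if (getaddrinfo("pool.ntp.org", "123", &hints, &srv) != 0) return 1;
    int fd = socket(srv->ai_family, srv->ai_socktype, 0);
    sendto(fd, pkt, sizeof pkt, 0, srv->ai_addr, srv->ai_addrlen);
    recv(fd, pkt, sizeof pkt, 0);

    /* Transmit Timestamp: seconds since 1900-01-01, at byte offset 40. */
    uint32_t secs;
    memcpy(&secs, pkt + 40, 4);
    time_t t = (time_t)(ntohl(secs) - 2208988800u);  /* 1900 -> 1970 epoch */
    printf("server time: %s", ctime(&t));

    close(fd);
    freeaddrinfo(srv);
    return 0;
}
```

Run that from a scheduler at midnight and you have the whole "time service" without a single resident process.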
Ooooooooohhhhh. I just remembered the famed SuperFetch. 'Tis a service, you know, taking some 100 MB of RAM, give or take. This beauty really is the masterpiece: when the OS starts, this little service begins reading my disks for everything I touched in the last week. Well, I'm sorry, Mrs. Service, that's some 30 GB of data. Just what do you think you'll achieve reading 30 GB of data into your non-existent buffer and a system disk cache which under no circumstances can be more than 2 GB (which just happens to be the amount of RAM my comp has)? I couldn't believe my own eyes when I saw this pretty little service using up my disk resources reading files it shouldn't be reading...
Oh, and did you know that the Calculator still doesn't provide a square root function? But that's not the point: all the applications that were already in Win 3.0 are still exactly the same! Not a single function added. Why are these apps even installed if the first thing I have to do after installing the OS is go get some decent utilities that cover this functionality for me? Well, dear MS: don't waste MY disk with useless crap. I buy your OS to help me do things, and these API programming samples without the accompanying source code don't help.
Let's say I primarily even bother with Windows for one reason only: games. Unfortunately, pretty much all PC games are for MS Windows. At least this functionality still works in Vista, right? After all, Vista is Windows too, and the newest and shiniest at that. Wrong!!! Pretty much all the games I can run on Vista, I can also run on Linux. The others just don't work. And the ones that do work typically have lower frame rates than when run on Linux. What??? Wait a second. I just bought an OS that gives me a hard time playing my games? But the OS is Microsoft's?!? It's supposed to be compatible! You want to say they kept all those stupid ANSI-char API functions, but didn't keep DirectX compatibility? And what happened to all those performance-improvement promises about the new DirectX API? I just don't see them. Maybe I didn't look hard enough. Or did I?
OK, enough bashing. This post is too long as it is, and I've left out all my concept-changing expectations except the display one. Let's just say I needed some venting, and this amount of text just about does it. Maybe I'll continue next time. Or not.
Just what might be the reason to keep this OS on my hard drives then?
I don't see any.
I'm waiting for Windows 8. I hear MS finally got the message and are doing it properly this time. Maybe...
Thursday, 20 December 2007
What's up with Intel and graphics anyway?
Over this last year I've read quite a lot about Intel getting into the graphics business. Well, maybe I didn't put that 100% correctly: of course, I'm talking about the discrete graphics business. Intel is already quite a big player in the integrated graphics market - as far as I know, still the #1 player there, although NVidia and ATI (now AMD) are catching up quite fast.
To get back to the topic at hand, all the writing about Intel getting into the business never managed to produce any concrete info on what exactly they intend to do. It seems to me that Intel really managed to hush up its employees this time, or that all the rumors were simply untrue.
In the same time frame there was also lots of news about Intel demonstrating its newest processors by showing off how fast they can ray-trace 3D scenes. We've seen four- and eight-core systems that could render ray-traced images pretty much in real time - and I'm not talking about 640x480 resolutions here. See where I'm going with this?
I believe Intel finally went and tried to do something ATI, NVidia or any other GPU maker should have done quite some time ago: they went for the holy grail of graphics - real-time ray tracing.
Of course, this may prove a bit more difficult than it seems. One may well achieve enough muscle to render ray-traced images in real time at decent resolutions, but just where does accomplishing such a feat take us? Currently there are only a few programs available that perform ray tracing, and they would most certainly appreciate an accelerator to perform their work faster. But would that make the entire enterprise profitable? Selling a few accelerators to studios and geeks that need / play with such programs? I think not. If you go for such a goal, you go for it all the way, counting on the millions of gamers out there who would appreciate the realism ray tracing delivers.
But what about backwards compatibility? And all the new games coming out - will the new accelerator be able to support them well? This is where the problems kick in. 2D is not a problem: one only needs to slap in some extra 100K transistors and you have a decent enough 2D engine, and I think Intel has this territory covered well already. What isn't so easy is the 3D games. Looked at from a ray-tracing perspective, the look of current 3D games is all based on deception. A game can look really good and realistic, and the shadows in it can be nothing short of amazing, but this is all based on special algorithms that require data prepared in a very specific way. And then there's the problem of the quantity of data required by the various algorithms: current algorithms require no information about the material an object is made of, whereas ray tracing does. Or did you think those super-duper transparency and reflection effects always boasted about in ray-tracing software come out of thin air?
While both methods use polygons to describe objects and textures to describe the colors of surfaces, that is just about everything they have in common. The main difference between the "new" and the "old" way is in how a particular pixel is drawn on the screen. Current GPUs draw triangles and in the end just make sure that the pixel from the "topmost" triangle - the one closest to the user - is displayed. Ray tracing, on the other hand, starts from the pixel and then determines all the triangles that affect its final color. This is also the reason ray tracing doesn't need the currently-so-famous anti-aliasing. Anisotropic filtering is still needed, but its meaning is completely different than in the "old" method. Pixel shaders: well, let's not even go there. To cut this a bit shorter: taking any current 3D game and trying to play it on a ray-tracing card would result in pretty miserable image quality, since the driver would have to guess the missing parameters as well as guess which code / objects are unnecessary (shadows, for example). Not to mention that shadows in particular require different source objects for ray tracing than for the "old" method.
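That inversion is easy to show in miniature. Here's a toy tracer in C - spheres instead of triangles to keep it short, and depth shading instead of real materials; it writes a PGM image to stdout (compile with -lm):

```c
/* Start from the pixel, shoot a ray, ask what it hits: the core loop
 * of a ray tracer, stripped to one sphere and no lighting model. */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y, z; } vec;

static vec sub(vec a, vec b) { return (vec){ a.x - b.x, a.y - b.y, a.z - b.z }; }
static double dot(vec a, vec b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Distance along the ray (origin o, direction d) to the sphere, or -1 on a miss. */
static double hit_sphere(vec centre, double r, vec o, vec d) {
    vec oc = sub(o, centre);
    double a = dot(d, d);
    double b = 2 * dot(oc, d);
    double disc = b*b - 4*a*(dot(oc, oc) - r*r);
    if (disc < 0) return -1;
    double t = (-b - sqrt(disc)) / (2*a);
    return t > 0 ? t : -1;
}

int main(void) {
    const int W = 256, H = 256;
    vec sphere = { 0, 0, -3 };   /* one object stands in for the whole scene */
    printf("P2\n%d %d\n255\n", W, H);
    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            /* primary ray through this pixel */
            vec d = { (x - W/2.0) / W, (H/2.0 - y) / H, -1 };
            double t = hit_sphere(sphere, 1.0, (vec){ 0, 0, 0 }, d);
            /* shade by distance; a real tracer would now spawn shadow,
               reflection and refraction rays using per-material data */
            int shade = t > 0 ? (int)(350 - 100 * t) : 20;
            printf("%d ", shade < 0 ? 0 : shade > 255 ? 255 : shade);
        }
        printf("\n");
    }
    return 0;
}
```

Notice there is no depth buffer and no triangle order anywhere: the nearest intersection along each ray settles visibility by construction, which is exactly why the rasterizer's bag of tricks doesn't map onto this model.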
So, what really happened with the discrete graphics at Intel?
If my reasoning is correct and they really went for ray tracing, I'm guessing they're currently struggling with making their products backwards compatible. It would take some time before developers started writing software designed for ray tracing instead of the current methods, even if the appropriate cards were already available, so compatibility is an important task for such a product. Making it so that the card can at the same time show some benefits of the new algorithms must be a really tough job, and I really hope the team can do it. I'm looking forward to a third strong player, even if it means this player will be Intel :)
I can't wait to play my favorites (Deus Ex, Morrowind) on such a card and I'm looking forward to a new generation of games even more.