Friday, 23 September 2011

Building Qt library with Visual studio 2010 Express for x86 and x64 targets

Foreword
I have been looking for a decent C++ GUI library / application framework for a pretty long time. I must admit I have no idea which universe I was in when I last looked, but I swear Qt was at that time a pathetic pre-Jurassic-era GDI wrapper, just as MFC or wx still are.

A few weeks ago though I looked at the latest version for the fun of it and was super impressed by their demo app with samples. The smooth animations, the high-res icons, the custom controls - exactly what I had been looking for for so long. So without further ado I went on to see if development in this framework is any easier than in the other two. Of course it turns out it is not, but the framework provides so much more than the others that it's a non-issue when it comes to selecting which one to work with.

Before we go into installation details, a word on MS x64 support:
I like 64 bit. It lifts the restrictions that 32 bit mode imposes (not that those problems are insurmountable) at a modest size penalty, and it brings with it the benefit of using a newer instruction set of your CPU. Besides, there's no reason I should be running a 32 bit app on my 64 bit OS if I can just as well run a native one, is there?
I have no idea why Microsoft insists on Express shipping only an x86 compiler. They are imposing the chicken-and-egg problem on themselves where their x64 operating systems are concerned.

However, the 2010 version makes it extremely easy to get an x64 compiler working and, to make it even simpler, the IDE itself also adapts with just a few mouse clicks. There are plenty of guides for doing this on the net so I will not go into details, but following this particular guide will also get you fully functional cross-compilers.

Please note that I am doing this on my Windows 7 Home Premium x64 computer, so your mileage may vary, but I suppose it should not vary by much. So without prolonging this too much, here are the prerequisites:

The prerequisites
To successfully compile Qt libraries for your VS2010 Express, you will need:
  1. Visual Studio 2010 Express install ISO
  2. VS2010 Service Pack 1
  3. Windows Platform SDK
  4. Compiler update for Platform SDK after SP1 patch
  5. Qt Creator
  6. Qt Libraries for VS 2008
  7. Qt jom
Jom is there only to make use of your multi-core CPU. You can use nmake instead, but I highly recommend you use jom. Your beard will be much shorter after compilation.

Installing Visual Studio 2010 Express with x64 compiler support
This one is pretty straightforward:
  1. Install VS2010 Express
  2. Install VS2010 SP 1
  3. Install platform SDK
  4. Install the compiler update - this one provides the cross-compilers in your VS bin directory, though it isn't needed for IDE x64 development. It absolutely *must* be installed last of these
Installing Qt
There's nothing special about this one. I'd just like to mention that it pays off to think a bit ahead and not take the proposed path when installing the Qt Visual Studio libraries.
I suggest you install all Qt related files into the same base directory for later easy access.
While installing the library, instead of taking the proposed x.y.z (version) directory, I suggest you append _x86 to it for later reference. Then duplicate the library directory into a new one with a _x64 suffix. This is necessary because Qt has no combined x86-and-x64 target: you can have debug and release files in the same directory, but not binaries for both platforms.

Your Qt directory structure should now look like this:
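In plain text, the layout I'm suggesting looks roughly like this (4.7.4 is just the version I happened to install; the names of the non-library directories are whatever you chose during their installs):

```
C:\Qt\
    4.7.4_x86\    <- Qt library, 32 bit build
    4.7.4_x64\    <- duplicate of the above, for the 64 bit build
    jom\          <- jom.exe goes here
    qtcreator\    <- Qt Creator
```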
You see how the x86 and x64 targets are nicely aligned and distinguishable? :D I hope my nagging about suffixes paid off for you as well as it did for me :)

You are all set for compilation by this point.

Compilation
I should mention that installing SP1 and the compiler update is mandatory for the x64 target because the original VS2010 x64 compiler had an optimization bug which produced segfaulting executables (misaligned SSE data). This is extremely important! The code will build even without those two updates, but it will be unusable!
The next thing to take care of is disk space: your library directory will grow from 1.3 GB installed to almost 8 GB compiled (OK, only 6.5 GB for the x86 version ;) ).

To compile, the following steps need to be performed from a command prompt. I made two batch files for this in my Qt base directory; you may choose for yourself, naturally:
  1. call "C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\setenv" /x86 /win7
  2. set QTDIR=C:\Qt\4.7.4_x86
  3. set QMAKESPEC=win32-msvc2010
  4. set path=%path%;%QTDIR%\bin
  5. cd %QTDIR%
  6. configure -debug-and-release -opensource -no-qt3support -platform %QMAKESPEC%
  7. ..\jom\jom -j 4
The above steps are for x86 (32 bit) compilation. To compile for x64, just replace all occurrences of x86 with x64. Be careful not to change "win32-msvc2010" because this Qt profile works for both targets and there is no "win64-msvc2010" profile.

Note that the paths may be different in your case. The path in step 1 is your Platform SDK install dir; the path in step 2 is your Qt library install dir.
Also, you may choose to apply different Qt configure flags in step 6 than I did. The flags I chose build everything but legacy support.
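To illustrate the x64 note above, the second batch file I made differs from the x86 one in only two spots (the paths are my install locations; adjust them to yours):

```
call "C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\setenv" /x64 /win7
set QTDIR=C:\Qt\4.7.4_x64
set QMAKESPEC=win32-msvc2010
set path=%path%;%QTDIR%\bin
cd %QTDIR%
configure -debug-and-release -opensource -no-qt3support -platform %QMAKESPEC%
..\jom\jom -j 4
```

Note that QMAKESPEC stays win32-msvc2010; the setenv /x64 switch is what selects the 64 bit compiler.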

You can easily compile both targets at the same time, but make sure you run the compilation before you go to sleep because your CPU is not fast enough to wait it out :)

Sunday, 22 May 2011

The broken language

Disclaimer: this is a rant. A huge one. A rage really. Consider yourself warned. I make no attempt to go easy on this piece of crap in this entry.

As described, I chose C++ as the language for my projects. It just seemed the right thing to do, given its omnipresence in the programming world.

But I must admit that after 20,000 lines of code written in the darn thing (recently alone), I'm having increasing doubts about it. It just takes too much time to get anything done in it. And before any of the C++ zealots come flaming me, please google your flames before posting; you will get enough counter-arguments.

This language is simply broken!!! And that was the understatement of the year! I have no idea what its proprietors are thinking when they go to the drawing board. But I'm very sure of one thing - it definitely isn't clarity and ease of programming. I've just been reading the proposed changes for the C++0x standard (for the n-th time) and came back just as disappointed as I am using this crap of a language.

I've been thinking long about how to list some of the things that bother me and decided on a plain old list in the end. There just isn't a prettier form for doing this, I think:
  1. The entire language is based on one of the oldest non-assembler languages. In this case, old implies unevolved, archaic and most definitely not modern. Note the triple use of what is essentially the same word.
  2. Even the basic types are completely messed up in this language. A programmer never knows exactly what he's going to get when he declares a simple integer (int). "Platform dependent" they say... Well, platform dependent my ass. I have yet to see an algorithm that scales from 8-bit to 64-bit CPU with the same int declaration. How idiotic. Not to mention that one of the most important (close to metal) integer types (byte or 8-bit integer) is named char. And to top it all off, it's even signed!!! How lame can this get?
  3. But these two are just the tip of the iceberg. The real fun begins when you pop the hood:
  4. There are two ways of using pointers - pointers and references. Well, references are a pathetic attempt at garbage collection. I guess. At least I have no idea what else they would be since the compiler seems to know better than me when to destroy an instance and when not. In the end you're just as forced to use pointers in C++ as you were in C since references are only useful for basic types. Either that or you have copy constructors all over the place. How can such a performance crap piece even be in the language?
  5. Here we have a language with templates, multiple inheritance and whatnot, but this same language has next to no run-time type information support
  6. We have a language with operator overloading, but for some reason class properties are evil and not to be supported even if hell freezes over.
  7. The compiler compiles your code in its entirety every time. The files your code is broken into are just good programming practice, there for your coding convenience. To the compiler it's all just a heap of code in one block.
  8. There is no difference between a .h and a .c / .cpp file. Whatsoever! None! Not even if you wish for it! This one actually took me years to finally understand. Especially because Pascal was my primary language and there the distinction between interface and implementation was a matter of language syntax.
  9. Stemming from the previous point is another language beauty: you can write a fully functional implementation in the declaration section. Sure, it's meant for inlining only the most inlinable methods, but since it doesn't matter to the compiler, why should you - the programmer - care?
  10. Despite its idiotic nature and completely useless existence, the preprocessor is still capable of processing one line only and is not recursive.
  11. In the end, one of the things that disappointed me the most was my fellow programmers' mentality. Whenever I would ask how to shave some 4 bytes of memory usage off a class instance, all I got was: are you working on a 20-year-old 8-bit chip? No wonder a graphics driver control panel applet nowadays gobbles a couple hundred MB of RAM... When minimized, of course...
I could go on like this for thousands of items, but it would serve no purpose, so I'll just stop. You can find a million sites on the net detailing pretty much every flaw of this language many of which I totally agree with so I'll just return to what I think is the cause of this:

I believe C++ is broken as a language and will remain broken because of two things only:
  1. It is based on an archaic language and carries on its legacy despite the fact that they are very different languages and that the compilers actually have two separate code paths to be able to compile either
  2. The language is expanded and extended with the above in mind by people who know only C / C++ and are told of other languages' progress. The latest already late standard extension just proves this: it solves none of the existing shortcomings and introduces quite a few new ones. Sure it adds some nice things into the language, but the syntax for them is even more messed up than it already was.
All things said and done, the C++ standards body should look at a good, popular, modern language and try to adapt C++ to that instead of just complicating it further. I already mentioned one such quite successful project in my previous blog entry and I'm sure there are other, even better attempts.

Ahhhhhh, what the heck:
I vote for an end to the further mutilation of the C++ language. I propose accepting the specification of D 2.0 as the C++0x standard. Then go from there and make it better.

The pain of choosing the right tool to work with

I've been programming in many languages. Some of them were easy to learn, some proved difficult even after months of intensive use. In the end the most important part of learning a language is the libraries provided with it. Until you know the name of that particular function, you spend tons of time looking it up in the documentation.
A while ago I had to make a choice about what I'd use to program my little projects in. I wanted to use Python which I consider programmer's heaven - language wise. They don't say "batteries included" for nothing when it comes to Python. Just about anything you need is already provided for your programming pleasure. And it just feels natural to use it. If you're not sure, just type the code and to your surprise: it will usually just work. This language simply rocks!
Alas, the GIL, some performance issues and the inability to protect my code turned me away from this beauty. So I started looking at other options: the compiled languages of old that I learned programming with and have used all my life. Surely they would provide me with more or less the same level of programming enjoyment, so that I could finish my projects quickly and with minimal fuss?
I looked at Pascal, which was my primary language for years. Even in its latest incarnations I saw that I'd still have to write entire classes just to get me a simple sorted list of records. Not to mention that Borland is no more and the language itself turned to a niche player. FreePascal sure is a fine compiler and tool, but it still didn't evolve the language enough to even attempt competing with Python.
Then I looked at C++, which was never really an important language for me - Pascal was just so much easier. But it has one major advantage: just about everybody uses it. The compilers are the most advanced for this language and they exist for just about any platform you could ever think of. And it has templates and STL and libraries for just about any purpose all over the net. And pretty much everything comes with C++ libraries and samples. However, C++ also wasn't even close to perfect. Despite all its advantages, it's still an archaic language which doesn't really evolve much, and in recent years it certainly didn't evolve into something modern.

So I looked around and found what seemed like a perfect candidate: D. It's a C++ derivative which took ideas from other modern languages and merged them into a language that seems the perfect mixture of Python and plain old C. I'd even go as far as to say that D is pretty much a strictly-typed Python, as much of it as can be implemented with strict types.
The problem with this language I didn't find in the language itself, but in its proprietor. For some reason Digital Mars, the creators of this language, don't want to open the compiler source. Instead of contributing to an open project, they continue to develop their own closed-source (partly open is still closed) compiler. Also, the entire documentation for the language is hosted directly on the company's web site and not on a site devoted to the language itself. There are open source attempts, but they are obviously undermanned and thus too little. The entire thing smells too much of a - as in one single - company. There are simply too many what-ifs.

In the end, C++ prevailed, mostly due to its popularity. But with it came much pain - more on that in the next blog entry.

Saturday, 12 February 2011

Monitor test

As I promised, I've been working on a little program which could replace Nokia Monitor test.

So, finally I can present to you my latest creation (but still one of the first published ones): the Monitor test app.
Presented to you under modified BSD license.

Download the program from here.

The archive is a bit bigger because it contains both 32 and 64 bit Windows binaries. Just unpack anywhere and run the appropriate exe.
I believe it shouldn't be hard to port this app to Linux or Mac OS X, but I currently have neither the time nor the resources necessary.
This program requires the Visual C++ redistributable package to run.

If there is any interest in this app, I will release the source code and / or work on further improvements, like actually writing a manual and stuff like that. Currently you pretty much have to figure out for yourself what each test / pattern is supposed to tell you.

Let's just say that left mouse clicks advance through the patterns.
The scroll wheel also works in the Mandelbrot and in the monitor refresh rate test (rotating wheel).

For now, enjoy and let me know what you think of it.

Wednesday, 26 January 2011

Super high precision wxDateTime

I've been working slowly on the promised program that would show some test patterns for your beloved monitor.
Recently I got stuck on the response time "patterns" for two reasons:
1. wxWidgets, which I used as the platform, doesn't exactly support fast rendering. The only method supported is GDI, and OpenGL has issues with text.
2. Even after deciding what to use, I still had a serious issue: wxWidgets' time functions don't have the required resolution. The best they manage - with wxDateTime::UNow() - is around 15 ms, which suffices for 67 FPS, assuming those frames match screen refresh intervals.

It doesn't help if I can reach 250000 FPS but the numbers on those frames remain the same, so I went looking for a better / more precise version.

Turns out this isn't in such high demand, or at least there aren't many solutions around.
So I just wrote my own.
I used the wxDateTime class, which is already precise to the millisecond in data, though not in implementation. Unfortunate for me, but still precise enough, even though the underlying counter manages quite a bit more: on my particular computer, 1/3339736 of a second. That is better than one microsecond resolution.
Also, my solution is platform specific (Windows) since I don't yet have any relevant Linux development VMs. If anyone cares to add cross platform code, I'm all inbox for the changes required :)

I give you a super high precision timer, courtesy of the WTFPL license. Enjoy.

supertimer.h
#ifndef __supertimer_h
#define __supertimer_h

#include "windows.h"
#include "wx/datetime.h"

void initSuperTimer();
wxDateTime getDateTime();

#endif


supertimer.cpp
#include "supertimer.h"

LARGE_INTEGER offsetCounter, perfFreq;
wxDateTime refTime;

void initSuperTimer()
{
    //Initializes global variables for use in the getDateTime() function
    QueryPerformanceFrequency(&perfFreq);
    wxDateTime a = wxDateTime::UNow();
    //Busy-wait until UNow() ticks over so refTime sits on a tick boundary.
    //This loop really hopes that UNow() has a decent resolution, otherwise it will take forever :(
    while (((refTime = wxDateTime::UNow()) - a).GetMilliseconds() == 0)
        ;
    QueryPerformanceCounter(&offsetCounter);
}

wxDateTime getDateTime()
{
    //Gets system time accurate to the millisecond
    //It could do more, but unfortunately wxDateTime isn't that precise
    wxDateTime now, ref;
    LARGE_INTEGER pc;
    QueryPerformanceCounter(&pc);
    pc.QuadPart -= offsetCounter.QuadPart;  //Performance counter ticks since initSuperTimer()
    pc.QuadPart *= 1000;                    //Convert ticks to milliseconds...
    pc.QuadPart /= perfFreq.QuadPart;       //...using the counter frequency
    ref = wxDateTime::UNow(); //Get system time for reference
    now = refTime + wxTimeSpan(0, 0, 0, pc.QuadPart); //Reference time plus time elapsed since then
    if ((now - ref).GetMilliseconds() > 125)
    {
        //If there is more than 125 ms difference between calculated and system time, reinitialize.
        //This also assumes that wxDateTime::UNow() is at least precise to 125 ms. If it's not,
        //this will constantly reinitialize the globals.
        initSuperTimer();
        return getDateTime();
    }
    return now;
}

Thursday, 13 January 2011

Dell U2711 Part 2 (Calibration)

I can't believe I just did this.
I borrowed a calibrator yesterday. A Datacolor Spyder3 Pro. So that my nice new monitor would display its colors properly. And also so that the colors on the monitor would match those on the TV.

Everything went well. I calibrated the monitor, I calibrated the TV, I calibrated the TV to my HTPC. All the colors looked about the same. The grays on the monitor finally looked correct. The TV and the Monitor matched. But I wanted more...

I wanted to see how my monitor fares and the "Pro" software didn't offer the info I wanted. So for the purpose of this blog, I shelled out a tidy 75€ to buy the Spyder3Elite 4.0 software. Talk about stupid...

Well, since I paid the money, here are the results of 120 nit 6500K calibration:
Gamut:
The red triangle is my monitor, the purple one is Adobe RGB.
A simple calculation shows that my monitor covers about 86% of the Adobe RGB gamut. Counting the excess coverage outside Adobe RGB as well, it would cover some 92% of the gamut.

Gamma:
Gamma looks OK. It isn't on the target curve, but it doesn't stray far from it either. A minor correction in the profile covers this just fine.

Screen backlight uniformity for 100% white:
The maximum deviance is 10% in the upper right corner.
The graph looks far worse than reality, I'm happy to say. I can't say the difference bothers me.

The actual contrast ratio was measured at 400:1. This figure is quite disappointing, especially after reading the AnandTech review which suggested I'd get at least 800:1.

After shelling out the money and getting the results, I wish I hadn't done so. I was aiming for a total bragging piece. The reviews suggested 95+% Adobe RGB, superior contrast ratios and superb viewing angles. Backlight uniformity also wasn't bad. Instead I only got a good gamut, an average contrast and a pretty poor screen uniformity. I can't say I feel bad about purchasing this monitor, but it sure isn't what it's advertised to be.

No more to say, I'm afraid.

Wednesday, 12 January 2011

Why would NVidia seek x86 licenses now?

I read this article today and it mentioned something very interesting: how NVidia wanted to bargain for an x86 license from Intel in this last bout of lawyerfest they had. The deal turned out great for NV anyway, raking in a tidy $1.5 bn over a few years.

So the result was quite favorable for NV, but they wanted more out of the deal? If I understand correctly, they got chipsets back, but chipsets won't help them any more. The chipset is only a minor part now that CPUs got all the goodies. Eventually more and more peripherals will migrate from the chipset to the CPU, and even NV must surely realize that.

But what on earth would make them go for x86? Sure, it's a huge market right now, but also a cutthroat one at that! AMD has been struggling for decades against the behemoth that is Intel, with more or less success. In fact I only know of one instance where they had a short lead (the original Opteron / Athlon 64). VIA retired to a niche years ago when they found they had no chance competing against those two. Any other attempt (remember Transmeta?) went under as well.

Even getting x86 licenses would cost them billions in R&D to get somewhere around Intel's previous generation CPUs or even the generation before. So why even bother?

Instead I think NV should focus on ARM. They have been doing it and doing it relatively well. Sure, they have high power consumption, but that doesn't go for every market segment. Their Tegra 2 won't find a place in a smartphone, but beefed up it sure could find a place in netbooks and even PCs of tomorrow.

ARM, AFAIK, doesn't restrict core optimizations. Since NV knows a lot about caches and similar stuff, they could take what they already have and beef it up into a chip to behold! An ARM chip with a decent graphics core, performing better than, say, Zacate, would surely attract plenty of attention, especially once Windows 8 comes out. If it consumed less than 4 W, all the better.

So, NVidia: why even bother with something that may never be competitive? Instead use the knowledge you already have and differentiate yourselves!