Sunday, 13 December 2009

What exactly do movie studios think they are doing?

I like watching movies. A lot.
So my typical movie usage scenario might go somewhat like this:
  1. I go to the movies. Nothing like the big screen and thundering audio with no wife around to tell you to turn it down already :)
  2. I go to the movie rental shop and rent the movie
  3. I buy a DVD with the movie
  4. I watch the movie on TV
Not all movies go through all these steps; in fact, in my case very few go through even two of them. With today's all-special-FX, no-story Hollywood products it's a wonder even that happens.
Very few movies see me buy a DVD, and most of those that do are blockbusters. European movies are a bit different, though: very few are blockbusters, but quite a few are definitely worth owning :)

Every single one of those actions requires the consumer to pay a percentage to the movie industry. So I may end up paying for the same movie up to four times! Well, at least action #3 (buying a DVD) lets me watch the movie as many times as I want.

Which brings me to the following "calculation":
I have to pay to watch a single movie up to:
+4 - 8€ (cinema)
+1 - 2€ (rental)
+10 - 40€ (buy)
+change (view on TV)
This totals anywhere from practically free (TV only) to some 50€ for one single movie for, say, a family, if it's really good.
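As a quick sanity check, the range above can be totalled up. This is just a throwaway sketch using the prices quoted, with TV counted as zero:

```python
# Per-movie price ranges in EUR, as listed above (TV treated as ~0).
prices = {
    "cinema": (4, 8),
    "rental": (1, 2),
    "buy":    (10, 40),
    "tv":     (0, 0),
}

# Going through every channel for one movie:
lowest  = sum(lo for lo, hi in prices.values())
highest = sum(hi for lo, hi in prices.values())
print(lowest, highest)  # 15 50
```

So a movie seen through all four channels costs between 15€ and 50€; the "practically free" floor only holds if you wait for TV alone.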

I've spent a total of 460€ on movie purchases this year. And come to think of it, it wasn't all that many movies: I got 21 of them for that money, which averages out to about 22€ per movie (Blu-rays are a bit more expensive). If it weren't for box sets, I wouldn't have gotten away this "cheap".
Practically every one of those movies I also went to see in the cinema, and some of them I also watched on TV. I must admit I haven't rented any of these particular titles.
I would have spent some 30 - 40€ more if I had been able to get my hands on the LOTR Blu-ray, but I couldn't, which made my wallet very happy / less starved.

That is one heck of a lot of money to pay for 21 movies, in my opinion.

No wonder Hollywood is all freaked out about piracy. At these prices I can't really blame anyone for pirating. This is beyond ridiculous. Just how much do they want me to pay them? One month's salary? No? Two months? How much, then? I do need to eat too, you know.

Instead of prolonging this rant, I propose that content owners finally accept and embrace modern-day technology. I don't care about the boxes in my closet; they are just gathering dust now. Though I must admit some of them are pretty. OK, OK, pretty much all of them are nice to look at. :)
Movies should be legally downloadable, in the highest quality and without any stupid show-stoppers like DRM. Movies can already be downloaded illegally, DRM-free, before they even hit the shops, so it's completely pointless to try to "protect" the legal downloads. Personalized, maybe, but nothing that might interfere with playback in any way. It would be easy to encode the purchaser's e-mail address into the movie every x > 50 frames, and that would be perfectly adequate "protection", even though with today's trojans and other malware even this would prove nothing, since the user's computer might have been compromised and the movie files simply published to the internet.
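The frame-marking idea above can be sketched in a few lines. This is purely an illustration of my own (the function names and the naive least-significant-bit scheme are assumptions, not any studio's actual method), hiding an ASCII identifier in the low bits of a frame's pixels:

```python
import numpy as np

def embed_id(frame: np.ndarray, purchaser_id: str) -> np.ndarray:
    """Hide an ASCII identifier in the least-significant bits of a frame."""
    bits = np.unpackbits(np.frombuffer(purchaser_id.encode("ascii"), dtype=np.uint8))
    flat = frame.flatten()  # flatten() copies, so the input frame stays untouched
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits  # overwrite the LSBs
    return flat.reshape(frame.shape)

def extract_id(frame: np.ndarray, length: int) -> str:
    """Recover a `length`-character identifier hidden by embed_id."""
    bits = frame.flatten()[: length * 8] & 1
    return np.packbits(bits).tobytes().decode("ascii")
```

To mark "every x > 50 frames" you would apply `embed_id` to, say, every 50th decoded frame. A real scheme would of course need to survive re-encoding, which plain LSBs do not; the point is only that per-purchaser tagging is technically cheap.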
There would still be plenty of opportunities for studios to make money:
Sell movie at base price, say 4€
Sell extended version for 1€ more
Sell director's cut, uncut, etc versions for 1€ more each
Sell extras for 1€ more
etc, etc.
There's plenty of money to be made here, and at *reasonable* prices I'm sure a lot more people would buy movies too. Current prices just make almost everyone wait for the movie to appear on TV, or go rent it. I don't think piracy is as widespread as the RIAA would like to show. With rental prices so low, it just doesn't make sense.

So turn around a bit faster, guys!!! You're losing money because I simply can't buy your movies (no stock / wrong version / not high quality, etc.).

Tuesday, 10 November 2009

Bread and circuses!

In recent months the unions have again been loudly pointing out that some employees in this country earn far too little.

It's true: given Slovenian living costs, 450€ a month is a wage that makes survival practically impossible, at least at any decent level. Especially if both partners earn such a wage, making it from month to month is probably more or less the only thought on their minds from the 18th of one month, when the bills are due, until the 15th of the next, when the next pittance lands in their account.

Personally I strongly agree with raising the minimum wage in our country. The higher, the better. But the unions will have to start realizing what this actually means for jobs whose wages are currently below the new floor. I very much hope they don't expect such jobs to simply survive. Even at today's minimum wages companies are going under, above all precisely those that actually pay such wages.

A rise in the minimum wage would therefore reliably trigger quite a few negative consequences:

The affected companies would have only three options:
a) Go under (costs > revenue)
b) Find cheaper labour, i.e. move production abroad
c) Increase worker productivity, i.e. automate production

All three options thus lead to the elimination of the jobs that had the lowest wages, since for the companies raising those wages is simply not financially feasible. The rare exceptions that prove the rule will be able to afford an actual raise of such minimum wages, mostly of course for production workers.

Which brings us to the workers who receive minimum wages today and whose jobs will be eliminated because of the raise:
1. Will these people be able to retrain and perform effectively in jobs with higher added value, which can therefore also sustain such increased wages?
2. If not, what will the state do this time? Will we pay increased social security contributions (the unemployed) or increased pension contributions (another wave of retirements of people under 50)?

And finally we come to the macroeconomic changes such a wage rise would bring:
In the ideal case of an actual rise in minimum wages with minimal social fallout in terms of increased unemployment or retirement, this would mean that quite a bit more money suddenly appears on the market. The productivity increase that would fund the higher wages would thus also show up on our market as surplus money in free circulation. Who would then rein in inflation, and how? Not that I'm complaining: more money on the market also means everyone else can sell their work at a higher price and thus earn more. The positive effects of such a scenario are of course far more far-reaching than our unionists see. They do, however, bring inflation with them, since prices, above all for services, would start to rise.

The problem is only that this scenario most likely won't come true. Even now less than half of our population is employed, and after such a wage rise the ratio of unemployed to employed would grow further. The reason is fairly simple: if our productivity were at a sufficient level for such a raise, wages would most likely already be that high and the unionists' demands wouldn't even be necessary. Given how such companies keep failing, I strongly doubt they pay minimum wages merely because workers are willing to work for them.

Almost every "battle" the workers win will certainly, already in the medium term (a few years at most), bring consequences in the form of layoffs. In today's world a seamstress can be paid at most a certain amount of money, and the same goes for assembly-line workers and the like. Even now their minimum wages can only be this "high" because cheap workers (Asia, Africa) are so far away that the difference in the final product price is eaten up by the cost of shipping from those distant places, plus the occasional customs duty here and there.

The only hope for minimum-wage workers is therefore to look for jobs with higher added value. Companies will adapt to the shortage of those workers who manage to find such jobs. But it is impossible to expect a seamstress to earn 2000€, at least not one working in a larger company. Established master tailors sewing made-to-measure for individual clients are a different matter entirely.

And we're back at the border again

It looks like my wish is slowly coming true. It seems Mr. Pahor and Mrs. Kosor have somehow managed to push at least the starting points for a final decision on the border to a level where it appears that responsibility, and the consequences a third party's decision will bring, will finally have to be accepted.

As I read a few days ago in a commentary in the newspaper Delo, which gave me a good laugh: the fact that a third party will decide the actual border is a just and fitting punishment for both countries, or rather for their political establishments. I can only hope those same establishments will muster enough sense to bring things to the point where the third party can decide at all...

Great! I'm glad things are finally moving. It even looks like they will soon start taking concrete shape, and that can only be good.
I'm just watching the programme Odmevi (RTV SLO), where our gallant political grandees are arguing about the question to be put on the preliminary consultative referendum. How pathetic! Let experts, with none of our politicians breathing down their necks, formulate the question and be done with it. If there really must be a consultative referendum, that is: our politics never heeds the people's decisions from such referendums anyway, so they are completely pointless.

So a "properly" formulated question certainly also requires properly educating the public. That means that in the run-up to the referendum the media should explain precisely what the arbiters will actually be deciding on. This of course includes detailed maps, with cadastral municipalities (Slovenian and Croatian, as of June 1991) and how the boundaries of those cadastral municipalities relate to the terrain the arbiters will decide about. Colours would work real little wonders here for the information and education of us simple folk. And where cadastral municipalities overlap, all this should be accompanied by an explanation of each side's arguments as to why the boundaries overlap.

Only then will people be able to "decide" properly.

Tuesday, 3 November 2009

Windows 7 pricing and market share

I recently read an analysis of Win7's market share one week after it went public. Anyway, the number was somewhere around 2%, and the article's author was flaming MS for performing miserably with its newest OS. He even went so far as to say that XP and Vista users are migrating to Apple's products instead of Windows 7.

I don't see 2% as a failure, even if there was quite a substantial pre-order campaign. Judging from the product's pre-order pricing, which changed quite a few times between August and the release date, the campaign was a huge success. 2% still means millions of copies in this market.

What I do see a problem with is the pricing of the product now that it is released:
105€ for Home premium (with a new computer)
200€ for Ultimate (also with a new computer)

These are prices for Slovenia (amazon.de is a bit cheaper), but I also got full prices from amazon.de, which are 120€ for Home Premium and 300€ for Ultimate.

Now, I'm sorry, MS, but this is way too excessive. 300€ for an OS?!?!? An operating system! I mean, it's not like I can do much productive work with the OS alone, can I? Sure, the calculator is great, finally improved after what, 29 years of existence? OK, so the scientific mode was added with Win 3.0 "only" 19 years ago. WordPad is also super cool, if I may be a bit sarcastic (I own Office 2007, BTW).

There's not much else productivity-like in there, is there? Funnily enough, the games seem to be the biggest value-add in the OS, and even for them there are many freeware alternatives.

In the end we're still talking only about an OS.

Now, I ordered two copies of Home Premium for my home needs. Fortunately for me, MS decided to offer a pre-order promotion at about 50€ per Home Premium copy.

I will state this for the record: 50€ is an acceptable price for an OS, 300€ is not! I might have paid 80€ for the premium versions, but I definitely passed on that because the premiums were 110€ in pre-order. So in my view, MS just lost 30€ per copy simply because they priced the upper editions too high.
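The arithmetic behind that claim, as a quick sketch (prices as quoted in this post):

```python
# Prices in EUR, as quoted above.
preorder_home_premium = 50   # what I actually paid per copy
would_pay_premium = 80       # the most I'd have paid for a premium edition
preorder_premium = 110       # what the premium editions actually cost

copies = 2  # my order

# Since 110 > 80, I bought Home Premium instead, so per copy
# Microsoft collected 50 instead of the 80 I was willing to pay.
lost_per_copy = would_pay_premium - preorder_home_premium
print(lost_per_copy)           # 30
print(lost_per_copy * copies)  # 60 across my two copies
```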

Need I say more?

Sunday, 11 October 2009

The welfare state

Our politicians so love to stress what a welfare state we have. In the name of this notion they tolerate the constant bluster of the unionists, who apparently make noise when it isn't needed, yet when serious problems arise they miss every boat and merely boast on television afterwards about how it was really they who sorted everything out, even though all you can gather from the statements of the affected workers is that the union did nothing for them.
So now we have the splendid situation where our gallant unionists have brought it about that public-sector workers earn, on average, considerably higher wages than workers in the private sector, who actually bring in the money that pays the public-sector workers. And of course they still aren't satisfied and want more and more, each time for some other non-commercial sector. Of course with Virant's reform everyone took higher wages, yet today the ratios are so damned unfair that they need raising a bit more. And then Virant is to blame for making a stupid reform. I'd say his only fault is that he naively trusted the unions not to blackmail the state after all those years of negotiating and setting pay grades.
And in all this I still haven't mentioned that quite a few public-sector workers are plain lazybones who chase every possible loophole just to avoid having to work. I speak from personal experience: waiting alone in a waiting room (nobody else anywhere around) for three hours before they finally let me into the room where the service would be performed, or having a gallant "overworked" clerk send me to a private provider to get the matter done because she supposedly has far too much work. At 8:00 in the morning, mind you, nobody else in the office, and she's chatting with a colleague, coffee in hand. And then that slacker earns at least as much as I do?!?! Thanks a lot.
The unionists will of course never admit it, but a welfare state does not mean preserving jobs at any cost. A welfare state rather means that every unemployed person has a realistic chance of finding a new job if they want one, and that while unemployed they won't die of hunger, or of illness for want of a roof over their head.
OK, enough complaining, it won't get me anywhere. Let me return instead to the matter that got me writing this:
The government has published the package of measures it intends to carry out as part of the pension reform. A longer working life, excluding student years, military service and the like: all well and good.

But what the hell was going through the head of the CRETIN who excluded maternity leave from pensionable service?!!!?!?!?

I apologize (meh, not really), but if a couple decides to have a child, it means they will bring into the world a new worker who will in the future pay new money into the state coffers so that we can have a pension of at least 10% of our working-age wages. At least that's how it currently looks: that pensions will be that miserable. I've already written about the problem of the pension system, and recently a reader's letter appeared in the papers making the same calculation as mine. To comment briefly: Mr. Jurišič did all the calculations perfectly properly, but unfortunately didn't foresee the solution well. Simply abolishing the current system would sadly cause a serious problem for today's pensioners, and for everyone meeting the conditions in the next 20 years, since they would have no income to live on.
There is really only one solution to this bind: the state should require workers to pay ever more into the second and third pillars as well. Without an obligation nobody will pay in, and under the current system we will in the coming years start seeing the effects of (non-)payment as we get ever more pensioners who are social cases. Start with one percent, gentlemen. In 30 years it should reach 100%, and then we no longer need a pension system at all, since everyone will have saved enough for their own pension. Eh, I've drifted off course again.
So: two people decide to have a child, and the child cries its way into the world. Say a couple is very dedicated and has a child every year of the wife's fertile period. That can be anywhere from age 20 to past 45. Apparently one woman even gave birth at 56!!! Under the new system this woman, or her husband (if he took the child-care leave), would have no basis for a pension at all, while the two of them would have at least 25 children who would certainly be capable of supporting them. A perfect case for civil disobedience, if you ask me. Even a far less drastic case (5 children) cries out for civil disobedience just as loudly, since under the current proposal a mother of five would take a 15% malus on her pension. Not to mention the absurdity this proposal offers: the parents have supplied the very workers who support them along with all the other "pensioners", who perhaps had no children at all!

I therefore propose exactly the opposite: the state should make it possible for a citizen to choose parenthood as a professional career. One child doesn't mean much, but at four or five children, at least one of the parents should have quite a decent wage, say at least 1500€ if not 2000€ (a progressive scale, then). Children are a rather expensive affair, and love doesn't help with the costs. Whoever wants children could then actually even have them. Of course this wage would apply under conditions similar to today's child benefit, i.e. for as long as the child is in the parents' care. The state would simply be investing that wage (child benefit) against the future taxes those children will pay into the state coffers. The parents would of course also be credited 100% pensionable service for the time spent parenting, if that percentage weren't raised even higher.

This would open up the possibility of a parenting career, should a family so choose. In addition, such a measure would help the natural population growth, not to mention that the end result would also be more workers to keep our "welfare state" alive.

Wednesday, 26 August 2009

The sorry state of hardware video playback acceleration

I've been playing around with options for an HTPC for a while now. See my initial HTPC article for an introduction. BTW: it's almost finished, with another post following in a couple of months (I need to do some programming first).

Anyway, while buying hardware, I was looking for a discrete graphics card that would provide HW-accelerated video playback, since my old Athlon X2 3800 isn't quite up to the job when full HD H.264 content is to be played.

After looking around for a while, I decided to go for an ATI 4350 (30€), since Nvidia's entry-level offerings were a tad too expensive for me (>50€). It was a matter of principle rather than price. Since both manufacturers have been advertising HW-accelerated playback for years now, I was confident any offering would do the trick. Also, an initial search for "HW accelerated H264" quickly turned up some promising pages that "discouraged" me from digging deeper.

I could not have been more wrong...

As it turns out, HW-accelerated video playback isn't nearly capable of providing an average user with what they want. Be it ATI or Nvidia, both have their issues and problems, not to mention the very concepts HW acceleration is currently implemented around. I should mention that Nvidia is currently a bit in the lead thanks to their successful CUDA advertising and support (CoreAVC released a CUDA-accelerated codec). ATI, on the other hand, is betting on its OpenCL horses, but currently doesn't even offer a driver.

I compiled a short list of problems I ran into while trying to enable any kind of HW acceleration. Note that I first tried just about every codec mentioned on various net forums. These problems are the best case I found after trying them all and failing with each and every one.
1. DXVA
The famed Microsoft framework for HW-based video playback seems to be a one-way street. There's no return information: either the HW can play a stream or it cannot. If it can, it will be accelerated; otherwise it will not, and some other codec has to take over.
What's worse, the card won't just decode the video stream: the whole system works only if the decoded content is displayed on screen immediately.

2. HW accelerated functions
Actually a very similar problem to the one above. This one was the most disappointing to me, because ATI Avivo is supposed to provide some nice and quite powerful deinterlacing algorithms. Since this functionality works only in "all or nothing" mode, you get deinterlacing only if Avivo can also decode the stream itself. Otherwise you're out of luck. The same goes for any other filter Avivo or PureVideo may or may not provide.

3. Supported video codecs
This one is just great! After playing around for a while, you find out that the number of codecs supported by either of the two chipmakers is frighteningly low. To make matters worse: if a chipmaker says its chip supports H.264, that doesn't mean it will support everything encoded in H.264. It turns out lots of important codec features are unsupported, and the stream also has to be encoded just the right way for even the supported features to work. My own camera clips, carefully transcoded into H.264 (x264), of course can't be played back, and I sure as hell am not transcoding them again. Even when I managed to find a video the card was willing to play, it was a DVD, which my Athlon can already handle quite well. I don't need HW acceleration for low-res MPEG-2. I need it for H.264, dammit: high resolution.

4. Additional filters
With DXVA you can just forget about that. DXVA is a one-way street, which means no additional filters. Yep, that also means no subtitles. A disgrace.
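The all-or-nothing model behind points 1-3 boils down to something like this. The capability table and decision function below are hypothetical, purely to illustrate the problem (real DXVA enumerates supported decoder GUIDs rather than tuples, and real hardware matches levels less rigidly):

```python
# Hypothetical capability table for an entry-level GPU; actual support
# is keyed on codec + profile + level in a similarly brittle way.
SUPPORTED = {
    ("mpeg2", "Main", 4.0),
    ("h264",  "High", 4.1),
}

def pick_decoder(codec: str, profile: str, level: float) -> str:
    """All-or-nothing: either the GPU takes the whole stream (decode,
    deinterlace, display) or a software decoder does everything."""
    return "hardware" if (codec, profile, level) in SUPPORTED else "software"

# An x264 encode of camera footage at an unsupported level falls back
# entirely to software, losing the HW deinterlacer along with the decoder.
print(pick_decoder("h264", "High", 4.1))  # hardware
print(pick_decoder("h264", "High", 5.1))  # software
```

There is no middle ground in this model: no "decode in software but deinterlace in hardware", and no way to bolt a subtitle filter onto the hardware path.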


Quite frankly, I can't help but be disappointed. I had a nice old 6600GT lying around, and I bought the ATI card to help decode the more demanding content (my full HD camera footage) that my poor old Athlon struggles with at 80-90% CPU usage. It turns out I just wasted my money.
Even my friend's pathetic 300MHz ARM in his MediaTank easily plays back just about any content he throws at it, thanks to properly implemented HW-accelerated codecs. I can't believe that after at least five years of bragging, ATI and Nvidia can't provide decent acceleration for this.

It seems both of them will be saved by OpenCL in the end, but even with that coming up I don't believe we'll see a good solution until the FFmpeg project gurus implement it. By then, I'm betting, my HD4350 will turn out to be a pretty weak card offering very little acceleration. Well, no matter; I just need 20 - 30% off my poor old CPU, and I'm hoping the little bugger will at least manage that much.

Until then I'll just have to swallow an occasional dropped frame and no postprocessing for my full HD.

Sunday, 9 August 2009

Grey road, lead me on.... (part two)

Bitnje is a really pretty little village, a couple of kilometres between Kranj and Škofja Loka. A straight road with a single gentle bend, sloping slightly towards Škofja Loka; just enough that on a bicycle you can feel the pedalling is a bit easier in that direction.

A few weeks ago there was a traffic accident there. Some youths were driving through Bitnje, probably looking for the next bar to carry on partying in. Or they were heading home, it doesn't matter. What matters is that these youths were doing well over 100 km/h in a 60 zone, whereupon road + car + speed + late hour presented them with a bill in the form of a rather hard landing in a house by the road.

A week later the 60 km/h limit signs disappeared, which makes the limit through Bitnje 50 km/h now.

If I remember correctly, a good 15 years ago, when I got my driving licence, the signs at the start of the villages of Bitnje (Upper, Middle and Lower) were still the ones without the stripes (which back then meant an 80 km/h limit). When the law changed, there were first 70 km/h signs, and later (likewise after an accident caused by excessive speed) they lowered it to 60.

I may well be wrong, but that's not the point.

The point is that on our roads you see an insane number of limit signs, solid lines (on straight stretches, of course) and similar things. And most of them were put up shortly after some madman / drunk / plain unlucky soul ran off the road for whatever reason.

Apparently our worthies in charge of these things can't tell the concepts of "accident" and "recklessness" (in whatever form) apart.
An accident happens through a moment of weakness or carelessness. It can also happen when a given stretch is actually dangerous, which shows up as a higher number of accidents.
Recklessness, on the other hand, causes accidents through excessive speed, alcohol, or simply showing off on the road.
While an increased number of accidents on a given stretch is a sound reason for speed limits and other warning signs, none of the other causes is or can be a reason for such measures.

Even 60 km/h was a rather low limit for the straight, open road through Bitnje. Driving through it at 50 km/h is pure torture and a completely pointless restriction. I'd love to know whether the person who had the signs removed now drives 50 km/h through that village. I bet a beer he doesn't. I'll drink it in Radovljica, of course, make no mistake. What I can say is that since the signs were removed I cause a queue every single time. Nobody agrees with this decision, and it really is utterly pointless and does absolutely nothing for traffic safety.

Since our worthies complain so much about the cost of traffic signs, I recommend they take a drive for once along the roads they themselves manage and sort out the signs so they better reflect each road's actual condition and safety.

That way we'll all be happier, and there will be fewer accidents too.
Just my 0.02€

Grey road, lead me on....

400 km. 600 with all the additional necessary connections. Maybe a few more, maybe a few less.
So we have the Ljubljana ring, Karavanke - Brežice and Lendava - Koper. Then Postojna - Jelšane, Brežice - Šentilj, plus Žalec - Dravograd and Dravograd - Maribor. Oh, and I forgot Nova Gorica and Ajdovščina. Serious kilometres.

That's how much motorway Slovenia should have built. Maybe a bit more besides, but that's more or less it.

And we've been "building" this miserable pittance for 18 years. In the early years they told us what a cheap kilometre we had, which of course was the excuse for slow construction. Now construction is still slow, but we probably have the most expensive kilometres in Europe. Nothing but constant bragging about what technologically advanced tunnels and viaducts we've built; then you drive a few kilometres through any neighbouring country and see even more impressive ones, and you don't hear nearly so much bragging from them. Meanwhile ours are closed because the road-surface concrete has worn out, because the steel spacers were installed wrong, or just because somebody feels like closing a motorway...

Our southern neighbours build more kilometres of motorway in a single year than we were ever supposed to build in total; I'm not exaggerating much. The Zagreb - Lipovac road alone is over 300 km, and nicely maintained at that, to say nothing of the connections they've built in the years since independence. I think that, with all our bragging about what a fine economy we have, we could easily have built just as much, and we didn't even need to.

And on top of it all, our motorways constantly carry speed limits for roadworks that are nowhere to be seen, for works yet to come, or just because DARS, or its authorized maintenance contractors, can simply afford it. Never mind that these senseless blockades and closures cause a lot of accidents on those very roads. In that case they just post an even lower limit, so that the likelihood of an accident, and the drivers' frustration, is a little higher still.

Just a few banal examples:
1. That bridge near Naklo over the Tržiška Bistrica: someone discovered that the steel girders were installed a few millimetres off, and promptly we had a 60 km/h limit across it for a whole year. Oh please!!! Who the hell is going to do 60 there on a two-lane motorway??? I even tried to, and nearly caused an accident almost every time.
2. One of the tunnels on the Štajerska motorway: all spring one lane was closed, with a similarly splendid limit starting a couple of kilometres before the tunnel; I think it was even 50 through the tunnel, or did they graciously grant us 60 after all? I no longer remember, but no worker was ever to be seen in that closed lane. Until one day they showed up with a tanker truck and hosed down the tunnel walls. Maybe they even painted afterwards, I don't remember...
3. Toll stations: if I'm not mistaken, a decree was even passed to remove them, so we wouldn't have to drive 40 km/h through them, yet they're still there. And so are the police with their laser speed guns...
4. Do I even need to mention those imbecilic 40 km/h limits on every motorway exit? If anyone actually obeys that limit, an accident follows 99% of the time, because the car behind is guaranteed to rear-end them. Is it really necessary to post such (let me stress it again) imbecilic limits? Couldn't it be 80? Couldn't there be no sign at all, as in Austria, for example? Are we drivers really so stupid that these signs are needed to bring us to our senses? I'd rather say they drive us out of them.
5. I ask that some gentleman from DARS drive down any of our motorways for once and observe the speed-limit signs a little. I claim he'll be counting in metres the stretches where you can drive at 130 km/h for more than 10 minutes. It's always something: roadworks (which don't exist), a junction, a slippery surface, anything at all, just so those beloved speed-limit signs can go up.
6. Best of all are the stretches where the motorway doesn't exist yet. A perfect example: Peračica. We have the former expressway (well, OK, a new motorway carriageway, but of exactly the same width). The carriageway is two-way, but with no overtaking (ad-hoc barrier, etc.). So why must this carriageway have an 80 km/h limit, often even 60 km/h through the tunnel, when the old road for motor vehicles was allowed 100 km/h plus permission to overtake the Sunday drivers? In what way is such a protected carriageway less safe than the previous road? The same question of course applies to all stretches with so-called "roadworks" that are in fact a lane closed for 3 months followed by two weeks of actual work.
7. And one little treat to finish: a beer to the first person who can name a stretch of our motorways that was handed over for use (i.e. finished) and then did NOT see maintenance work on it within one month. I've never seen such a stretch. Not to mention that our builders are apparently incapable of finishing more than 10 km at a time... Oh, I forgot: they don't actually finish them, the maintenance works follow immediately.

Meh, now I've gone completely off the rails and written something I never originally intended. So here follows Grey road (part two).

Not a foe, but a neighbour at our border

If our gallant politicians must be given credit for anything, it is certainly that they know how to complicate trifling problems beyond all endurance.

I am of course talking about the 18-year tragicomedy of the southern border. Well, to be truly fair, I must also mention the Croatian politicians, who, just like ours, in eternal fear of losing votes (and with them political power), are likewise utterly incapable of accepting any compromise.

It has long been clear that both sides begrudge every square metre of those miserable ten square kilometres, if that much is even in dispute... And for 18 years the two sides have been pelting each other with such vileness that even paramecia turn in their graves at their imbecility (I apologize for the strong word, but I couldn't find a gentler one that fit). And both nations suffer. People are "sheep", after all, and rather susceptible to propaganda. Especially when it lasts 18 years. On one side they keep hearing "They want to steal our land", on the other "They've been provoking again with this or that incident", and on both sides nothing but mutual slander and blame-shifting.

My apologies, you politicians on both sides: with this imbecility of yours, over a few metres left or right, you have achieved only that ordinary people will keep looking at each other askance for years to come. Tragicomedy at its peak. Neither of you is capable of making a compromise that, of course, neither side will be happy with (since each will lose those few metres at least somewhere), but that would finally settle the question.

Sedaj že pokojna nekdanja predsednika obeh držav sta v izbranem trenutku premogla dovolj treznega razmišljanja, da sta prišla do nečesa kolikor toliko spodobnega za obe strani. Ta predlog je prav toliko sprejemljiv, kot bo kateri koli drug, za katero koli stran. Nobena stran se ne bo nikoli odrekla vsem vprašljivim metrom, tako da je upanje na tak "kompromis" odveč, vse ostalo pa so povsem nepomembni detajli.

Dogovorite se torej za nekaj - KARKOLI - in končno sprejmite mejo, postavite tiste bedne mejnike, ki čez 2 leti tako ali tako ne bodo več pomembni, in zaključite zgodbo. Preden naredite trajno škodo v odnosih med normalnimi ljudmi. Ali je to res tako težko?

In potem sprožite kampanjo na obeh nacionalnih televizijah, ki bo poskušala vsaj malo popraviti škodo, ki ste jo povzročili s svojimi neumnostmi v teh 18 letih. Nekaj v smislu rožic in skupnih piknikov - saj veste: nismo tako različni, da se ne bi mogli imeti radi. Sovraštvo in zamere res nimajo kaj iskati pri tradicionalno prijateljskih sosedih.

Thursday, 11 June 2009

The woes of building a home HTPC / file server

Recently I decided that I want to build myself a nice HTPC. You know - the PC you stick under your TV that's supposed to handle all your media playback, recording and general management.
I just happen to have a nice Athlon 64 X2 3800+ sitting around, doing absolutely nothing, so I thought that sticking it into a nice HTPC case, adding 3 super big disks in RAID-5 and connecting it to my Logitech Z-5500 speakers + a nice full HD LCD TV might fix all my file and media needs.

These were my requirements:
  1. File server - I have lots of projects I'm working on and some of them require substantial storage
  2. Media storage - I also make about 20GB of photos and movies with my digital camera per year
  3. Kids games - not a requirement, but it would be nice if my kids could play some games on the gadget
  4. IP TV - I have a provider that streams live TV through IP. Though their offering is quite user friendly, I still want to have all the functionality merged in one box.
  5. Simple interface - I want the media functionality to be as easily accessible as possible
  6. Instant on - I want my HTPC to be available from shutdown or stand by in a few seconds time
  7. Performance - I want all my movies to play back flawlessly with no dropped frames
To be honest, I didn't expect to bump into so many problems with the first requirement already: my desire to make the data a bit safer by using RAID-5 turned out to be a major obstacle. Hardware controllers are pretty expensive and also a bit of a problem when they die: if you can't get a replacement, your array dies with the controller. So I decided to go software. I found that there's a hack for Windows XP, but none (yet) for Vista or Windows 7. I just can't understand why MS offers this support only for server OS versions. Where have you seen a decent server that used software RAID?!? One point for Linux, since RAID-5 is quite well supported there. I just don't like hacks...

Requirement two also turned out to be a bit of a problem: most HTPC software simply doesn't offer what I need in this respect. I don't want my movies separated from my photos. They were shot in order and I want to watch them in order as well, not in some other menu with a gazillion clicks to find what I'm looking for. It's idiotic, really, since all media software can display images as well as play movies, but few can simply take what I feed them and display it in the photo gallery regardless of file type. We're in the 21st century, guys!!! Even photos aren't what they used to be any more.

I haven't played much with the third requirement - at this point I can only say that Windows 7 Media center has a special menu for games installed in the Games folder. How it works I haven't even checked out yet.

IP TV also poses a significant problem. Lots of PVR / HTPC software recognizes and works with tuners, but simulating a tuner with IP TV is so much tougher. Many won't even think of going there, others just make it hard. Direct separate support for IP TV is of course out of the question.

Simple interface isn't a problem. Most HTPC software offers an easy enough interface. It just starts getting a bit more complicated when you're trying to solve specific problems through plugins.

Instant on again proved to be quite a handful. The computer in question takes about a minute to boot into OS and a minute is not the time I'm willing to wait for it to boot. So sleep / resume is the only option forward. Well, as it turns out, only Windows 7 makes my little Athlon go to sleep and resume, all other OSs simply fail at any step of the process. Some won't go to sleep, some won't properly resume after...

And finally, there are performance issues. It seems my Athlon isn't capable of playing full HD content on one core; it simply needs both, and even then it might skip a frame here and there. Some HW acceleration from my good old GeForce 6600GT graphics card would definitely be appreciated. As it turns out, the card is too old to support H.264 acceleration. And I do recode all my movie clips into H.264 - I'm not exactly fond of the MPEG-2 they are originally coded in. It's way too big for my taste and my quantities. Recoding to H.264 and AAC saves me lots of space. 5/6 to be exact.
Anyway, it turns out that mplayer in Linux is compiled single-threaded, and so is most other media software. I managed to compile mplayer with multi-threading support, but mplayer isn't exactly the front-end I desire in my HTPC. I failed with other programs...
FfdShow for Windows on the other hand seems to handle my files just fine...

Well, currently I'm quite stuck with my little project. Some of my requirements aren't met in Windows, some in Linux, and I'm not exactly sure how I can get myself out of this mess and still have my HTPC.
It seems the problems on windows are easier to solve. Since there's no SW RAID, I can always just buy a HW card and be done with it. Or write a nice little driver that would do things my way. I already know backup is going to be an issue. I'm planning on having an array four times as big as all the disks I currently have...
As for the other problems - Sometimes I understand why everyone wants to build their own HTPC / PVR software...

Friday, 22 May 2009

Solving Zalman Reserator 2 problems

See here for the introduction.

Sure enough, it wasn't even a month before my little corner of peace turned first into a little nuisance and after a while into a nightmare:

Problems #1 and #2 - flow indicator
First I began having problems with cricket-like sounds that the flow indicator was making. I contacted my reseller and asked them for help. Aside from a bit of mumbling I overheard ("customer sucks" style), they said they would check with Zalman and get back to me. Well, in the three weeks it took them to get back to me, I started getting the second problem - the flow indicator stopped spinning which caused the unit to shut down the pump and start beeping violently. And that's not something you want with your water cooling. If the water stops flowing, it's a matter of minutes before the water in the blocks is too hot to provide any cooling for the components.
At this point I just took the unit to the retailer and hoped that it would be fixed ASAP. Naturally, it took them more than a month to actually give up and give me a new unit.
Problems #3 and #4 - service level
Since the retailer's service personnel were ever so helpful, I received the new unit alone - without coolant. Having used up all the coolant on the previous setup, I naturally complained. So I was given some of my original solution from the first unit plus a bit of coolant from the second package. And to top it all off, they poured the stuff into tap-water-washed plastic flasks (originally containing a sweet beverage). No amount of complaining made any difference. Together with this I was also given a lecture on how the coolant is too thick and how the instructions were wrong to suggest a 1:4 mixture - this supposedly caused the original indicator failures, and a 1:10 mixture was more than enough. Blah blah. Naturally I also couldn't convince them to give me a new degassing tube. So much for customer service... :(
Problem #5 - pump too weak
I tried to remove the plugs from the tubes to make myself a new degassing tube, but that's impossible without cutting the tubes themselves. Of course, having no degassing tube made degassing even harder. The pump in the Reserator unit is simply too weak and after 4 hours of trying I had to give up. The pump simply couldn't push the water down the tubes and no amount of shaking or raising / lowering of both the unit and the computer helped. Calling service again had no effect, and writing mails to Zalman support also bore no fruit. So in the end I simply decided to buy a second pump. I chose the Laing DDC-1T and it arrived in two days. The Laing pump has similar properties to the Eheim in the Reserator. While it is also a quiet pump, it has a rather nasty habit of vibrating a lot; only wrapping it in lots of foam made it quiet. Two pumps in the loop finally managed to push the water through.
Having solved the problem, I also assembled the rest of the loop with the NB block and VGA block. After the system was degassed properly, the Zalman pump alone was able to push water through the system, but I rather kept the second pump in operation since the flow indicator turned spookily slow in comparison.
Problem #6 - very high flow resistance
At this point I should mention that the entire system offers significant water flow resistance. If you wish to empty the tubing, both the computer loop subsystem as well as the Reserator unit subsystem offer such resistance that one has to blow really hard into the tubes to make the water go through. I mean *really* hard. My wife saw me a few days ago when I was making the final cleanup / repairs and she yelled at me to stop as she thought I would have a stroke :) My face was just so red from all the blowing. }:-)) This kind of resistance certainly demands a pump with much higher head pressure than the Zalman integrated pump provides.
Let's put it this way:
Initially when I assembled the original CPU only loop I calculated the water flow and it was only 27 l/h (a very generous calculation) instead of the pump rated 300 l/h. Completing the loop with three elements didn't much change this. Maybe the flow fell to 25 l/h. In any case, the flow indicator spins really slow with Zalman pump alone.
Adding the Laing pump into the loop increases the flow to approx 60 l/h (also a generous calculation).
Problem #7 - flow indicator again
After a month of bliss, the problems started again. The flow indicator kept stopping and I couldn't figure out why. The reseller's service refused to help me any further and directed me to Zalman support, which I already knew to be quiet.
Zalman USA
So I turned to online forums, asking my peers for help. I received some great feedback as well as the phone number of the Zalman USA office. Naturally I called them and spoke to a nice guy named Keith, who agreed that I should send him an email describing the problem. So I did, and I got a reply the next day. This was great: finally some support. We exchanged questions and answers three times, all within a single week, and Keith was kind enough to give me all the requested info. He even promised to ask the Koreans to send me a replacement pump, flow indicator, degassing tube and coolant. This was truly a pleasant experience and I was really happy with Keith. To be more precise - I still am.
Well, to be sure, Zalman support was once again quiet. After waiting for two months I finally gave up on the replacement parts and proceeded to fix the unit following instructions by Keith and the forums folks. In order to do that I had to order some coolant, but unfortunately it was unavailable in Slovenia. Yep, both our distributors sell Zalman's water cooling kit, but none of them provide replacement parts of any kind. So I ordered from Germany, paying 13€ for coolant and 17€ for delivery :P
Problem #8 - Algae buildup
Having such a weak coolant solution predictably led to algae buildup in the system. While I was discussing things with Zalman USA and waiting for replacement parts that were never to come, algae started growing in the system. Fortunately for me, the buildup was slow enough that the system wasn't clogged by the time I started fixing things. But a thin film was clearly visible on surface of the coolant solution.
Problem #9 - Screw quality
Disassembling the unit revealed a few more faults. The screws used are of extremely low quality. Even with a proper screwdriver I simply ruined a few of them. They are so soft that any force will damage them; one would think they were made of aluminium. But they react to magnets, so I guess they are some iron alloy after all.
Problem #10 - Flow indicator materials
The flow indicator itself is made of clear plastic and the centerpiece (the spinning gauge) some blue plastic. Here's a pic of what I pulled out:

Note the two metal weights and the plastic around them. One weight is some iron alloy with good magnetic properties so that the sensor can detect the indicator spinning. The other doesn't have magnetic properties (and is also not affected by rust).
I cleaned up the metals and re-glued them into the plastic. To prevent further corrosion, I painted the whole thing with some water resistant paint. Excess plastic was cut off.
Since at the time I still didn't know what caused the indicator to stop spinning, I performed multiple tests for spin resistance. The indicator spun no matter how low a water flow I generated, so I left it at that and proceeded with fixing the rest of the unit.
Problem #11 - Sharp tubing turns in the unit
Here they are:

I should mention that these images don't do the actual situation justice. Both tubes were bent so badly that the internal tube dimensions were at most 2mm x 8mm, severely restricting water flow. The springs that are supposed to prevent this bending don't quite do their job.
Having bought a couple of L joints beforehand, I was able to fix this problem like this:

The bottom joint was particularly problematic since there wasn't enough space and I definitely wanted to keep at least two of the sealing rings. Not being able to cut the L joint to the optimum length left me with a bit of an angle, which surely creates some turbulence, but I'm still betting this solution is a lot better than the original one.
Problem #12 - Air buildup
During all the time having these problems, I've had two occasions where air would start building up in the tubing. Initially I attributed this to algae, but it also happened again even after I had already used the new coolant. I'm guessing I just didn't make a strong enough solution which led to the original algae colony not being completely killed off. I have in the mean time cleaned the system again, also by using quite a lot of alcohol and I'm now using a much stronger coolant solution (1:3).
Anyway, I have also determined that this air buildup was responsible for the flow indicator failures. While I could certainly see the air in the computer loop tubing, I could never see it in the unit tubing. However I am now sure that air buildup was in both cases present also in the unit tubing itself.

So, this is it. After all the troubles and work fixing them I now have what I originally purchased. The conclusions from the previous post are still valid and I'm also happy that I managed to fix a problem where lots of things were working against me.

I still / again believe this is a very good piece of equipment and I think it's a shame Zalman removed it from their portfolio instead of fixing it. Its thermal properties are no doubt much better than those of the Reserator 1, which stays in the portfolio - probably thanks to a much simpler design that lacks many of this unit's shortcomings.

Zalman Reserator 2 (review ?)

Well, it seems I can't keep my own schedules, but still, it's time to start on the promised reviews.
The Zalman Reserator post will be in two parts: first about the product, then about all the issues I've been having with it. Yes, you've read this correctly - even Zalman sometimes makes mistakes. Having used their products for some years now, I thought that nearly impossible, but this particular product has quite a few shortcomings, which I will describe in the second article (hopefully also done today). For now I will focus on the good side: after all the modifications I made, this product now does what it was intended to do. I just had to work a bit for it:

I bought a Zalman Reserator 2 last July. You see, I have my computer set up in a corner of my living room, with the box under the desk and the rest of the stuff on it. Since I like to squeeze every ounce of power from my computers, that naturally also means I have a 150W oven under my desk. So, off to the first online store I went and ordered me a nice watercooling setup. The general idea was, of course, to move the heat source from under my desk to above it.

Just a few days later, the postman rang with a nice little package:

Well, OK, not so little. The box itself was actually quite big :)

Trembling with excitement, I went ahead and opened it:
First thing inside was the radiator / reservoir unit itself. I must admit, I didn't expect to see such quality of workmanship. All corners are nicely rounded, so there's no chance of you cutting yourself. The transitions between various parts are smooth and any screws holding the whole thing together blend superbly into the design of the unit.
The paint finish is superb with silver and matte black adding a high quality touch to the overall unit impression.
Here's a snapshot of the other parts in the package, although I'm sure you've seen all of this in the proper reviews around the net. There are two stands, an expansion slot tube bracket, the CPU and VGA cooling blocks, a degassing tube, 4 meters of tubing and a flask of coolant liquid. Naturally, a manual too.
Following instructions in the manual, I could have the unit assembled in a few minutes, but I decided to enjoy the process and took a good hour to assemble the basic CPU loop, turning and marvelling at every component in the process. This was after all my most expensive needless computer component bought to date.

The degassing process didn't go as well as I expected. I spent a good half hour before I couldn't hear any bubbles any more. Today, almost a year later, I know exactly what to do to degas the unit, but back then I was obviously too much of a n00b. In case you're wondering: you have to tilt the unit clockwise (observed from the front) until you hear the bubbles. Then tilt it some more and just wait until the bubbling stops. The entire procedure can be completed in a couple of minutes with at most one reset necessary. My original problem was that I was tilting and shaking the unit to both sides, as described in the manual...
Having degassed the unit, I plugged it into the simple CPU loop I'd made and let it run overnight to check for leaks. Naturally, there were no leaks whatsoever, so I finished up and closed the computer. Here's how the original loop looked inside my Antec P182 case:


The VGA block was of course useless for my 8800GTS so I immediately went online and ordered the appropriate block for that as well. What would be the point of buying a component to move the heat when I wouldn't even apply it to the hottest component in the computer?

A few days later I received a small box with my 8800GTS cooler:
Again, a high-quality-looking component, nicely presented in two brushed aluminium shades. Attached were instructions and all the required parts to mount the beast onto my gfx card.

Having had to buy a new VGA cooler, I was left with the original block which now suddenly got the appeal of becoming a nice chipset block, even though it originally wasn't designed for that function. Oh well, if the designers only knew what folks do with the stuff they worked so hard to design }:-))

So, here's the snapshot of the final, three-block loop I assembled. The order of components is CPU - Chipset - VGA. I figured the temperature delta is largest on the CPU, so it seemed the best candidate for the cold water. The VGA, on the other hand, can take higher temperatures than the CPU, so it doesn't matter if the water pouring into its water block is already at 70 degrees centigrade... :D OK, OK, I admit, the order was suggested by gurus on the numerous forums I read before even going for the setup.
I did have to clip one of the mounting holes on the "chipset" block to be able to fit it next to the CPU socket, but otherwise the installation went easily. You will also notice that I removed the heat sink from the power regulation modules on my Asus P5K-E WiFi. I figured that since the top row didn't have any, the back ones didn't need it either. For this reason I didn't want to cut the NB / modules heatpipe in case I'd ever need to go back to air cooling. Until today, almost a year later, there have been no adverse effects from this decision, although I must admit that initially I feared it a bit.

After completing the modifications, I went ahead and started testing the whole system. I must say I'm very impressed with the results:
The CPU (E8400 @ 3.44GHz) went from 40 / 65 (idle / load) to 40 / 55 degrees centigrade.
I never had any numbers for the NB, so I can't give any, but it works, so I guess it's cooled sufficiently.
The VGA, however, made the biggest jump: initially it was 55 / 80 degrees; now it's 50 / 55. No matter how hard I push it, the temperature won't go higher than that. And I'm running it overclocked to the max: 650 (core) / 2000 (RAM).

You probably noticed the relatively high idle temperatures. Well, I can also explain that: I'm running this computer 24/7 and at this moment the room temperature is 29 degrees centigrade. Yes, we're having quite a hot week here in Slovenia and no, I have no air conditioning installed.

What I did notice is that the Reserator unit manages to cool all three components: even with the computer constantly running, the water goes no higher than 10 degrees above ambient, the CPU sits at water temperature when idle and at most 15 degrees above it when running Prime95 overnight, and the graphics card always runs 10 - 15 degrees above water temperature, depending on load.

I even managed to push my E8400 to 4.6GHz, but that required some insane voltage - if I remember correctly, I had to use 1.65V or something like that. And it also got to 65 degrees after a good night of prime so that was a bit too high for my personal taste.
Now I just have it overclocked to 3.44GHz, which is the highest it will go with the lowest voltage the MB supports (1.1V). In case you're wondering, my particular CPU overclocks a bit strangely - to go even to 3.85GHz I already need 1.35V, so I figured a measly 12% increase in performance isn't worth it.

Obviously, now I have the heat all placed above the desk with practically none of it coming from the box below. Which was my initial purpose for this little beast. And it worked :)
This setup is also practically completely silent. The pump in the Reserator unit is practically inaudible and I only have two very low RPM fans left in the case to cool the other components. I guess the only quieter PC I ever had was my first 286 AT, which had no hard drive and no fans. And even it lost its quietest status the moment I installed my first second-hand 10MB IBM double-height MFM disk :)

Overall, an excellent product, unfortunately removed from market due to its many shortcomings. See the next post, I'm guessing I had to solve pretty much all of them to make the unit work properly.

Another snapshot of it installed in my corner of zen and peace :) :

Tuesday, 24 March 2009

Faster data fetching from multiple SQL tables with multiple base elements to analyze

What this method gives you:
  1. This method doesn't require you to massively rewrite the code.
  2. The code is relatively easy to understand.
  3. The speed improvements are simply massive (30 - 50 times faster code, up to 200 in some scenarios).
  4. and best of all: you can implement this method step by step, one subquery at a time and observe the speed improvements as you go.
Note that I don't read an insane amount of books. That said, I haven't yet seen this method in a book and therefore don't know its "official" name. But since my fellow programmers never know what I mean when I say its name, I'll state for the record that I call this method "synchronized queries". The name stems from the fact that at fetching time, some synchronization must be performed among the affected queries.

Also, this method is to be considered intermediate level at best. It's no rocket science. But since I implemented it on a fair bit of different code segments, I guess it isn't the typical programmer's first line of thought.

On to the method then:

If you've ever programmed a business type application, like MRP, CRM or some such, you've most certainly had to implement an analysis that took multiple base documents as input and then had to fetch the data about those documents from a series of database tables.

The most common scenario that comes to mind is order analysis.
To analyze an order, one has to fetch data from multiple tables like some of the following:
Purchases (materials, services)
Shipping documents (finished products)
Invoices (finished product)
And then a whole myriad of data in the production which you entered to keep track of the production of the ordered product:
Work orders
Material issues
Semi-product and product completion
Worktime + tools usage
etc, etc.

In such an analysis all these tables have the requested order(s) to be analyzed in common. Meaning that for each table you can by some means get to the order id. Either it's in the table already or there's some other table that creates the connection between the two.

The most common solution to do such an analysis that I have seen goes something like this:

for each order do
analyze purchases
analyze shipping
analyze invoices
analyze work orders
analyze material issues
...
next

As you can see, this solution is OK for situations where you only have to analyze one order and it's respective details. You will have one compile for the order and one compile for each detail.

However, when you are doing, for example, a yearly analysis, it quickly becomes obvious that we have potentially thousands of orders and therefore thousands of compiles for the poor SQL server.

A compile (statement execute) is relatively the most time-consuming part of fetching data from an SQL database, especially if the query itself is complex and returns relatively few records. Having no indexes on the filter and order fields helps a lot too :)

So the above example behaves optimally when there is one order to analyze and decreases in efficiency with each following order since all subqueries have to be recompiled to return data for each subsequent order.

Knowing that a compile is very expensive, the best way to go should be quite obvious - just decrease the number of compiles. But how can one do that? If I make the queries such that they will return data for all requested orders, that data will just be mixed all together making the analysis very hard to do, especially if we consider the volume of all that data. There's no way it can be fit into memory and retrieved when needed.

Well, the solution is quite simple:
In order to keep the actual analysis routine as close to the original algorithm as possible and still reap the benefits of fewer compiles, one needs to fulfill only one additional condition: all queries must be sortable by order number.
Since SQL has an adequate command for that, namely "order by", this condition is easy to meet.

There is one additional issue that typically needs to be solved for such situations: normally you won't get a list of orders that need to be analyzed. Rather, the filter will be more like: orders from - to (date), order state, seller, etc.
So, in order to solve this problem, we have to somehow get order numbers since all these details will not be stored in every table.
The solution is quite simple: we just create a temporary table from the original order query with a select into statement. We can then use this temporary table as a join in all subsequent detail queries.

So, the example for one such detail table will be:

Create the temp table
select f1, f2, f3, f4, f5, ...
into #myTempOrders
from Orders
where c1 = xxx and c2 = yyy...

Make the detail query for purchases:
select f1, f2, f3, f4, ...
from purchases
join #myTempOrders on purchases.orderid = #myTempOrders.orderid
order by purchases.orderid

This purchases query will return data for all orders being analyzed if data for them exists.

So to complete the analysis, our new code looks like this:
Create temp table (if needed)
Compile all detail tables
for each order do
analyze purchases
analyze shipping
analyze invoices
analyze work orders
analyze material issues
...
next

Surprising, how the code is actually the same, isn't it? :)
The only additions are initial compiles that will speed up our analysis.

But there is one significant difference:
In our initial code "analyze" meant: compile the detail query, fetch the records, do what you have to do with them.
In our new code "analyze" means: synchronize the master and detail query, fetch the records, do what you have to do with them.

What is changed is "compile" versus "synchronize".
A compile would look like this (a little remodelled query from before):
select f1, f2, f3, f4, ...
from purchases
where purchases.orderid = xxx

A synchronize on the other hand looks like this (sqlOrder is the master order query, sqlDetail the detail query):
int CmpRec(SQL sqlOrder, SQL sqlDetail)
{
//Compares the "key", that is the order number, in both datasets
if (sqlOrder.asInt("orderid") > sqlDetail.asInt("orderid")) return -1;
else if (sqlOrder.asInt("orderid") == sqlDetail.asInt("orderid")) return 0;
else return 1;
}

//Advance the detail dataset until an equal or greater order id is fetched
while (CmpRec(sqlOrder, sqlDetail) < 0)
if (!sqlDetail.ForEach()) break;

//Process all detail records belonging to the current order
while (CmpRec(sqlOrder, sqlDetail) == 0)
{
//Do your stuff
if (!sqlDetail.ForEach()) break;
}
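To make the pattern concrete, here's a small runnable sketch of the same idea in Python with sqlite3. The schema and column names (orders, purchases, orderid, amount) are made up for illustration; the point is the single compile per query and the two ordered cursors being advanced in lockstep:

```python
import sqlite3

# Hypothetical schema for illustration: one master table (orders) and one
# detail table (purchases), both keyed by orderid.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders(orderid INTEGER, odate TEXT);
    CREATE TABLE purchases(orderid INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, '2009-01-05'), (2, '2009-02-11'), (3, '2009-03-02');
    INSERT INTO purchases VALUES (1, 10.0), (1, 2.5), (3, 7.0);
""")

# One compile per query for the whole analysis, both sorted by the key.
orders = con.execute("SELECT orderid FROM orders ORDER BY orderid")
purchases = con.execute("SELECT orderid, amount FROM purchases ORDER BY orderid")

totals = {}
prow = purchases.fetchone()
for (orderid,) in orders:
    total = 0.0
    # Synchronize: advance the detail cursor until its key catches up.
    while prow is not None and prow[0] < orderid:
        prow = purchases.fetchone()
    # Consume all detail rows belonging to the current order.
    while prow is not None and prow[0] == orderid:
        total += prow[1]
        prow = purchases.fetchone()
    totals[orderid] = total

print(totals)  # {1: 12.5, 2: 0.0, 3: 7.0}
```

Note that order 2 has no purchases at all; the synchronization simply skips past it without ever recompiling the detail query.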

Have fun creating faster programs :)

Autosizing multi-line merged cells in Excel using macros

As part of my previously mentioned Excel project I also had to display multi-line content in cells. Incidentally, most of this content was in the RTF fields mentioned in my previous post.

However, Excel has some issues with auto sizing merged cells containing such content. To be more specific - if the cell containing such text is merged, all Excel methods for auto sizing fail miserably.

So I went ahead and looked for this on the net. After quite a bit of searching I managed to find a forum thread dealing with this problem. Since my memory is worse than that of a fish, I'm afraid I can't give proper credit to the author of the code that solves the problem (just spent another half an hour searching for that thread, but can't find it :( ).

Anyway, the original solution was made so that a macro would first search for all merged cells and then auto-size their respective lines based on the content of those cells. Although this solution may fail if you have single cells that would resize higher than other cells, it was fine for me.

I have modified the original algorithm since I already knew which cells would require auto-sizing. So the "gathering" algorithm is in my case simplified to adding appropriate cell info into the array as I add new cells to the final report.
The actual resizing algorithm is unmodified.

This is what needs to be done:


  'add merged cells into an array
  If iFirst = 1 Then
    ReDim opisi(0)
    iFirst = 0
  Else
    ReDim Preserve opisi(UBound(opisi) + 1)
  End If
  opisi(UBound(opisi)) = "D" & iRow & ":G" & iRow
  'note that columns D to G were used in my .xls. You can use whatever range you want, just make sure you add all the cells

  'Actual resizing code
  If iFirst = 0 Then
    'Do this only if you added any elements into the array
    For i = LBound(opisi) To UBound(opisi)
      sReport.Range(opisi(i)).Select
      With ActiveCell.MergeArea
        If .Rows.Count = 1 And .WrapText = True Then
          'Do the magic
          CurrentRowHeight = .RowHeight
          ActiveCellWidth = ActiveCell.ColumnWidth
          For Each CurrCell In Selection
            MergedCellRgWidth = CurrCell.ColumnWidth + MergedCellRgWidth
          Next
          .MergeCells = False
          .Cells(1).ColumnWidth = MergedCellRgWidth
          .EntireRow.AutoFit
          PossNewRowHeight = .RowHeight
          .Cells(1).ColumnWidth = ActiveCellWidth
          .MergeCells = True
          .RowHeight = IIf(CurrentRowHeight > PossNewRowHeight, _
            CurrentRowHeight, PossNewRowHeight)
        End If
      End With
      MergedCellRgWidth = 0
    Next i
  End If

Saturday, 7 March 2009

Converting RTF to plain TXT

As part of my work on the Excel spreadsheet mentioned in the previous post, I also had to display the contents of an RTF field.
Since Excel doesn't parse RTF in any way (well, except for importing it as a file, but that sucks too), I had to convert the RTF field retrieved from the database into plain text so that I could display it to the user.

The following code converts RTF to plain text.
However, the function makes some assumptions:
  1. The RTF is normal text, maybe with some font formatting
  2. No tables, lists or any special RTF structures are supported. They will be converted to plain text with no special formatting. If you need special formatting, you'll have to add appropriate lines of code into the parser...
  3. The converter assumes that the code page of the RTF is the same as the code page of the client computer. This is important if you have special (language-specific) characters in the RTF itself.
  4. Also regarding code pages, this converter will only work with ANSI code pages. That means it will only convert single-byte characters, not multi-byte ones. Since I don't have a multi-byte RTF available, I don't know how hard that would be to fix.
  5. Also note that I'm no Excel guru, so the code in the macro may be sub-optimal.
So here it goes:

Private Function hexcode(ss)
 'Value of a single hex digit ("0"-"9", "a"-"f", case insensitive); anything else yields 0
 hexcode = InStr("0123456789abcdef", LCase(ss)) - 1
 If hexcode < 0 Then hexcode = 0
End Function

Private Function RTF2TXT(ss)
 While (Right(ss, 1) = Chr(10) Or Right(ss, 1) = Chr(13) Or Right(ss, 1) = " " Or Right(ss, 1) = "}")
   ss = Left(ss, Len(ss) - 1)
 Wend
 If (Len(ss) >= 1) Then
   ss = Right(ss, Len(ss) - 1)
 End If
 iPos = 1
 sResult = ""

 While (Len(ss) > 0)
   If (Mid(ss, iPos, 1) = "\") Then
     If (Mid(ss, iPos + 1, 3) = "tab") Then
       sResult = sResult + Chr(9)
       iPos = iPos + 4
     ElseIf (Mid(ss, iPos + 1, 3) = "par") And (Mid(ss, iPos + 1, 4) <> "pard") Then
       sResult = sResult + Chr(10) 'Chr(13) + chr(10) seems to not work, #13 is displayed as a square char
       iPos = iPos + 4
     ElseIf (Mid(ss, iPos + 1, 1) = "'") Then
       sResult = sResult + Chr(hexcode(Mid(ss, iPos + 2, 1)) * 16 + hexcode(Mid(ss, iPos + 3, 1)))
       iPos = iPos + 4
     Else
       iPos = iPos + 1
       While Mid(ss, iPos, 1) <> "\" And Mid(ss, iPos, 1) <> "{" And Mid(ss, iPos, 1) <> Chr(13) And Mid(ss, iPos, 1) <> Chr(10) And Mid(ss, iPos, 1) <> " "
         iPos = iPos + 1
       Wend
       If Mid(ss, iPos, 1) = " " Then
         iPos = iPos + 1
       End If
     End If
   ElseIf (Mid(ss, iPos, 1) = "{") Then
     iLevel = 1
     iPos = iPos + 1
     While iLevel > 0
       If Mid(ss, iPos, 1) = "{" Then
         iLevel = iLevel + 1
       ElseIf Mid(ss, iPos, 1) = "}" Then
         iLevel = iLevel - 1
       End If
       iPos = iPos + 1
     Wend
   ElseIf (Mid(ss, iPos, 1) = Chr(10) Or Mid(ss, iPos, 1) = Chr(13)) Then
     iPos = iPos + 1
   Else
      sResult = sResult + Mid(ss, iPos, 1)
      iPos = iPos + 1
    End If
    'Note the ">" here: with "=", the last character of the string would get dropped
    If iPos > Len(ss) Then
      ss = ""
    Else
      ss = Mid(ss, iPos)
    End If
   iPos = 1
 Wend

 While (Right(sResult, 1) = Chr(10) Or Right(sResult, 1) = Chr(13) Or Right(sResult, 1) = " ")
   sResult = Left(sResult, Len(sResult) - 1)
 Wend
 RTF2TXT = sResult
End Function
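If you need the same thing outside VBA, here is a rough Python sketch of the same stripping logic, under the same assumptions (plain text with font formatting only; the "cp1250" code page below is my own illustrative choice - substitute your client's ANSI code page):

```python
import re

def rtf_to_txt(rtf, codepage="cp1250"):
    """Very naive RTF-to-plain-text: handles \\tab, \\par and \\'xx escapes,
    drops other control words, skips {...} sub-groups. Not a full RTF parser."""
    rtf = rtf.strip().rstrip("}")
    if rtf.startswith("{"):
        rtf = rtf[1:]                                   # outermost RTF group
    out, i = [], 0
    while i < len(rtf):
        ch = rtf[i]
        if ch == "\\":
            if rtf[i + 1:i + 2] == "'":                 # \'xx : hex-escaped byte
                out.append(bytes([int(rtf[i + 2:i + 4], 16)]).decode(codepage))
                i += 4
            else:                                       # control word, e.g. \par, \fs20
                m = re.match(r"([a-z]+)(-?\d+)? ?", rtf[i + 1:])
                word = m.group(1) if m else ""
                out.append({"tab": "\t", "par": "\n"}.get(word, ""))
                i += 1 + (m.end() if m else 0)
        elif ch == "{":                                 # skip a whole sub-group
            depth, i = 1, i + 1
            while i < len(rtf) and depth:
                depth += {"{": 1, "}": -1}.get(rtf[i], 0)
                i += 1
        else:
            if ch not in "\r\n}":                       # raw newlines are not text
                out.append(ch)
            i += 1
    return "".join(out).strip()

print(rtf_to_txt(r"{\rtf1\ansi Hello\par World}"))
```

Like the macro, this is deliberately not a full RTF parser: unknown control words are dropped, sub-groups (font tables and the like) are skipped wholesale, and only \tab, \par and \'xx escapes produce output.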

Hope it helps anyone. I couldn't find anything like this function after searching for days.

Database queries and how to use them with macros in Microsoft Excel

I'm no Excel guru. In fact, I've never before written an Excel macro. Also, the last time I was programming in Basic, it was Simon's Basic on my beloved Commodore 64 some 25 years ago.

That said, I recently got a request from a client to output a bill of materials from the SQL database directly into Excel. The root cause of this was that Crystal Reports, which our MRP application uses for reporting, is unable to export to Excel if the report contains an RTF field + subreports + whatnot.

Since my knowledge of macros was quite insufficient, what I mostly did was record a macro, see what Excel generated and copy that code into my functions.

What I needed first was some kind of parameter entry. The user needs to enter the item, its variant, the number of levels the BOM should be output with, and the language.
So I looked up "Excel running macro on cell change". This one was pretty straightforward:
  1. You have to run the Visual Basic editor, embedded into Excel. In pre 2007 versions, that would be Tools / Macros / Visual Basic Editor. For Excel 2007 (and presumably up), you first have to enable the developer ribbon (Office button, Excel Options, Popular, Show developer tab in the ribbon). After that you open the developer tab and click the leftmost icon, saying "Visual Basic".
  2. Once in the editor, right click the sheet in question and select "View code" from the popup menu
  3. At the top of the window that opens, there are two combos. Select "Worksheet" from the first and "Change" from the second
  4. Write the function, something like:

Private Sub Worksheet_Change(ByVal Target As Range)
  If Target.Column = 3 Then
    If Target.Row >= 6 And Target.Row <= 9 Then
      'Your code here
    End If
  End If
End Sub


Then I needed the query that would return the bill of materials. So I went recording a macro and defined my query. Note that Microsoft Query isn't capable of designing complex queries - the most it will do is allow you to define a join or two :) But fortunately it does have the functionality needed to run queries of any complexity.
To write a custom query and have its results returned to the Excel sheet, you have to select "Execute SQL..." from the File menu. You will get a dialog which is unfortunately not resizable, but you can just as well paste in the query you made previously in your tool of choice.
After you have written your query, just press the "Execute" button and you will get its results in a new window inside Microsoft Query.
With this window on top (if it's not the only one), select "Return Data to Microsoft Office Excel", again from the File menu.
You now have the query in your sheet of choice in Excel, so you can stop recording the macro and go view its code.
You can now grab the generated query code and beautify it a bit, since your SQL statement was garbled hideously by the macro recorder. Additionally, insert the user input variables into the query so that it responds to the parameters the user types.
For me, the end result is something like this:


iIdent = Range("C6").Value
sVarianta = Range("C7").Value
iStNivojev = Range("C8").Value
sJezik = Range("C9").Value

With ActiveSheet.QueryTables.Add(Connection:= _
    "ODBC;DSN=IIS;UID=juhuhu;", _
    Destination:=ActiveSheet.Range("A1"))
  .CommandText = Array( _
    "select * from (" & vbCrLf & _
    " select KoSumNivo, KoInfZapSt, KoKolMateriala, isnull(KomOpis, KoMPNaziv) as KoMPNaziv, m1.MPTeza," & vbCrLf & _
    " KoPodSifMP, KoMPSifKarKlj, m1.MPMaterial, KoOpomba2, KoMPSifEM1, m1.MPDoNaziv," & vbCrLf, _
    " KoSumKolMatNIzm, KoSumZapSt, m1.MPOpis, KoMasterMP, KoMasterVarianta, GetKosovnica1.datum," & vbCrLf & _
    " m2.MPNaziv as MasterNaziv, m2.MPDoNaziv as MasterDoNaziv, m2.MPTeza as MasterTeza, KoSeNaroca" & vbCrLf, _
    " from GetKosovnica1(" & iIdent & ", '" & sVarianta & "', 2)" & vbCrLf & _
    " join MaticniPodatki m1 on KoPodSifMP = m1.MPSifra" & vbCrLf & _
    " join MaticniPodatki m2 on KoMasterMP = m2.MPSifra" & vbCrLf & _
    " left outer join KomOpisMat on KoPodSifMP = KomSifMP and KomJezik = '" & sJezik & "'" & vbCrLf, _
    " ) as Tmp1" & vbCrLf & _
    " where KoSumNivo <= " & iStNivojev & vbCrLf & _
    "order by KoSumZapSt")
  .Name = "Query1"
  '.... Lots of properties here
End With


I start out by putting the parameters into their respective variables, then use those variables in the query itself. If you look at the SQL, note that GetKosovnica1 is a stored SQL function I wrote, which recursively parses the table that defines the bills of materials.

With that out of the way, you can start modifying the parameters and watch different results come back for different parameters. But there's a catch: every time you change a parameter, you get one more query on the sheet. In my case the previous queries were shifting to the left. So we have to either reuse or delete any previous query on the sheet.

I searched a lot for a way to delete an existing query from a sheet. I found several methods, but none of them really worked for me, so in the end I went for query reuse. This is how you reuse an existing query in Microsoft Excel:
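A word of warning I'd add in hindsight: concatenating user input straight into the SQL, as I did, breaks as soon as a parameter contains a quote, and invites SQL injection. Here's a hedged Python sketch of the same query built with placeholders instead (table and function names taken from my query, everything else illustrative):

```python
def build_bom_query(ident, varianta, st_nivojev, jezik):
    """Build the BOM query with '?' placeholders so user input is never
    spliced into the SQL text itself (column list abbreviated)."""
    sql = (
        "select * from ("
        " select KoSumNivo, KoSumZapSt, KoKolMateriala,"
        " isnull(KomOpis, KoMPNaziv) as KoMPNaziv"
        " from GetKosovnica1(?, ?, 2)"
        " join MaticniPodatki m1 on KoPodSifMP = m1.MPSifra"
        " left outer join KomOpisMat on KoPodSifMP = KomSifMP and KomJezik = ?"
        ") as Tmp1"
        " where KoSumNivo <= ?"
        " order by KoSumZapSt"
    )
    # Values travel separately, in the order the placeholders appear
    return sql, (ident, varianta, jezik, st_nivojev)

sql, params = build_bom_query(12345, "A", 3, "EN")
# Executed later as e.g. cursor.execute(sql, params) with pyodbc (assumed driver)
```

The discipline of keeping the SQL text and the values separate is what avoids the quoting headaches, regardless of which driver eventually runs the query.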

If ActiveSheet.QueryTables.Count = 0 Then
  'The query does not exist yet, so let's create a new one
  With ActiveSheet.QueryTables.Add(Connection:= _
    ..... Query parameters here
  End With
Else
  'The query is already there, so let's reuse it
  With ActiveSheet.QueryTables(1)
    .CommandText = Array( _
      ..... Query parameters here
  End With
End If


Now you will only have one query on the sheet. However, before you can use the data the query returns, you will have to modify it a bit.
By default Excel queries are created as background queries. This means that when you run the query, Excel will run the query in the background and refresh the data only when it is actually returned. Any macro or user operation before this will work on a previous set of data (if any).

In order to have the macro wait for the query to actually execute and replace the data with a new set, you have to set the following query parameters:

.BackgroundQuery = False
.Refresh

Well, actually ".Refresh" is not a parameter - it calls the refresh method. Do this both for the newly created query and in the query-reusing branch.

Now you can use the data from the query and copy it into a nicely formatted table.

Special word of caution: while the query result table will always display the data as it came from the SQL database, the copied data will have leading zeros removed if the entire cell consists of digits. To make Excel understand that you want those zeros kept, you have to copy the data like this:

Dim sODBC As Worksheet
Dim sReport As Worksheet

Set sODBC = Sheets("ODBC result set")
Set sReport = Sheets("Final output")
sReport.Range("K" & iVrstica).Value = "'" & sODBC.Range("H" & iSrc).Text


Note the single quote inside the double quotes. It tells Excel that you're entering a string, not a number. This also works with manual entry.
sReport and sODBC are Worksheet objects I predefined so that I don't have to type Sheets("Final output") and Sheets("ODBC result set") for each copying command.
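The same pitfall exists anywhere a numeric-looking code passes through an actual number type; a tiny Python illustration of what happens to such values:

```python
order_codes = ["00123", "7007", "00042"]

# Treating the codes as numbers silently drops the leading zeros...
as_numbers = [str(int(c)) for c in order_codes]

# ...so keep them as text end to end (the apostrophe trick does the same in Excel)
as_text = list(order_codes)

print(as_numbers)  # ['123', '7007', '42']
print(as_text)     # ['00123', '7007', '00042']
```

The fix is always the same: keep codes, article numbers and the like as text from source to destination.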

This is it for the first part of my Excel tips. Hope you find it useful.

Who's got the balls?

It's here. The economic crisis, I mean.
To be quite honest, it was due for quite some time now. With the global economy booming and everybody spending like there's no tomorrow, without much thought on prices and actual need, we got just what we deserve. Nobody, I mean NOBODY was thinking about saving anything in case things wouldn't be so good in the future.
An economic crisis is certainly no fun, don't get me wrong. Aside from businesses closing down, it has a nasty habit of hitting the average Joe. Lots of people lost their jobs. I wish I could say it was for a greater good, but it wasn't. In my opinion, our current situation is a result first and foremost of greed. A distant second is plain carelessness.

So a couple of trillion US $ just vanished from the market. How miserable an excuse can that be for a crisis?!? The most virtual stuff man managed to invent, easily redefined in any way we see fit - and we call that a good excuse to make several hundred million people unemployed?

World governments failed to react in due time. The measure was quite simple: just replace the vanished money. Yes, I know, that would just boost the inflation in the mid-term. But it would fix the immediate problem in the short term.
But even if the governments did react in time, it would only delay the inevitable cleansing of the economy (by means of a crisis).

It's what the governments are doing now to get us out of the crisis that matters. It is well known that government spending boosts the economy. But while our governments were spending like mad during the good times, now they are squeaking that there's no money to spend. Oh please - I've lived in a country with 2000%-plus inflation, so I know money can be made at will. They won't even issue warranties so that banks would start lending again.

So while Obama is signing bills that will cost U.S. taxpayers in excess of 2 trillion US $, the Euro group is clinging to the monetary criteria like we're all going to die if the criteria would be temporarily forgotten to get us out of the crisis. None of the member countries conforms to the criteria anyway, they just have to lie about it now... Not to even mention the program our own prime minister made: a couple of hundred million € savings in government spending with special emphasis that he would be buying his own coffee from now on. OK, forgetting the coffee thing - you managed to save a bit. But where will you put this money? No idea? Oh, well...

To cut this short:
I believe the Americans are doing the right thing at the moment. Government warranties + direct spending to re-boot the banks is currently all that is needed. Sure it will boost inflation, but it will also revive the industry.
The EU on the other hand has no approach at all. Even when the French president decided to throw a hefty pack of cash at the automobile industry, everybody just cried foul at him. At least he did something...

Guess who's going to come out of this stronger?

I'm back

Long time no hear :)
I've been pretty busy lately and as a result this blog suffered accordingly. But at the same time, material was gathering and now I have quite a few things to say.

This post will just be an intro (to help me remember what I have prepared) :) so here's a kind of TOC for the following posts:
1. First, I'll have to do a short rant on current economic crisis and the way EU + member governments are dealing with it. I just have to rant a little :D

After this I'll follow with some more useful posts, beginning with programming. Since I've had a request from a client to do a rather complex Excel application, I'll begin with that:
2. The first post will deal with Excel and using data queries in macros
3. The second will be about RTF to plain text conversion, also in an Excel macro
4. The third will be about multi-line cell auto-sizing, since Excel doesn't quite cut it with its standard functions
Although I had to research many aspects of Excel macro programming while I was doing this, these three seem worth publishing as the solutions for the particular problems were not readily available on the net.

5. Continuing with programming tips, I'll explain a rather simple method to increase data fetching speed from multiple SQL queries.

Last, but not least, I've been lax in posting about some hardware that I have tested, so you'll have my thoughts on the following:
6. Auzentech X-Fi Prelude 7.1 audio card
7. Zalman Reserator 2
8. Antec P182 computer case

9. To finish up, I have some code that allows one to sort huge amounts of data and still do it in their lifetime. I'm talking billions of records and hundreds of gigabytes of data. This one will take a bit since I have to clean the implementation specifics from the code, but eventually I'll post this too :). Bottom line: this code helped reduce some reporting from 48+ hours (data in an SQL database) down to 20 minutes (data in custom files, but on the same server).
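As a teaser for that post: the standard technique here is an external merge sort - sort chunks that fit in memory, spill each sorted run to disk, then stream-merge the runs. A minimal Python sketch of the idea (the real code is far more involved):

```python
import heapq
import tempfile

def _spill(sorted_lines):
    """Write one sorted run to a temp file and rewind it for merging."""
    f = tempfile.TemporaryFile("w+")
    f.writelines(line + "\n" for line in sorted_lines)
    f.seek(0)
    return f

def external_sort(records, chunk_size=100_000):
    """Sort an arbitrarily large iterable of strings using bounded memory:
    sort fixed-size chunks, spill each to a temp file, then k-way merge."""
    runs, batch = [], []
    for rec in records:
        batch.append(rec)
        if len(batch) >= chunk_size:
            runs.append(_spill(sorted(batch)))
            batch = []
    if batch:
        runs.append(_spill(sorted(batch)))
    # heapq.merge streams the sorted runs without loading them whole
    for line in heapq.merge(*runs):
        yield line.rstrip("\n")

data = ["banana", "apple", "cherry", "date"]
print(list(external_sort(data, chunk_size=2)))  # ['apple', 'banana', 'cherry', 'date']
```

Memory use is bounded by chunk_size plus one buffered line per run, which is what makes sorting hundreds of gigabytes feasible at all.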

So here goes. Lots of work to be done. I'm hoping to have all these articles done in March, but one never knows :)