Wednesday, December 5, 2007

Tango with D

I have been playing around with D for a while; nothing serious, but I am pretty excited about this language. D is best described as “C++ done right” or perhaps as “Java done right”, but I will just quote Walter Bright, D’s author:

Its focus is on combining the power and high performance of C and C++ with the programmer productivity of modern languages like Ruby and Python. … The D language is statically typed and compiles directly to machine code. It is multiparadigm, supporting many programming styles: imperative, object oriented, and metaprogramming. It's a member of the C syntax family.

This post, however, is not about D. Check out the Digital Mars site to learn more about it. Here I just want to let you know that Apress has just published a book about D and its "Tango" library. As far as I know, "Tango with D" is the first book about this wonderful language.

If you have not heard about D, you owe it to yourself to look at it. Most C++ programmers will fall in love with it, and I personally hope it will replace C as the dominant language on Linux, especially after seeing it perform well on the programming language shootout. A good thing about D is that there are already two implementations of it, and one of them is a front end to GCC.

Well... a first book is always great; let's hope it increases D's visibility. Congratulations to the D community!

P.S. My only concern is the name. While "D" sounds awesomely geeky and sufficiently attractive, implying superiority over C++, it is an absolutely terrible name as far as search engines are concerned. Try googling for "arrays in D" to see what I mean.

Thursday, November 29, 2007

Pikluk! Browser and Email for Kids

I get asked a lot about Pikluk, our new startup venture. Questions like how we are doing and why we are better than NetNanny always come up, especially among my tech-savvy friends. We tried our best to make the web site as simple and informative as possible, but it wouldn't hurt to explain it again, using less marketing-speak.

Pikluk is by far the simplest way to allow your kid to use the Internet. Sure, there are multiple content filtering packages out there, and big companies sell scary software that is practically indistinguishable from a rootkit: it spies on your own children and sends you reports by email. Yes, there are plenty of people doing similar things, but not quite the same.

Pikluk is simple. Just define two lists:

One list of web sites
Another list of email addresses

... and those are the only resources your child will have access to. The entire process takes seconds. Once you're done, you'll have a fully functioning children's browser and secure email.

We have kept it simple, just two main features: secure email for kids, and a safe web browser. Check out the screen shot.

Parents can also browse around and select from the most popular sites that other parents have already added for their kids. Everything is, of course, anonymous.

Pikluk Tech

Our browser is built upon Internet Explorer, and everything about the content and the look and feel is controlled remotely. The content is completely customizable by the parents through the Parent Dashboard on www.pikluk.com. The browser can optionally "lock" your kids in, not allowing them to switch to other programs and wander about.

Pikluk is built on the Linux platform using a couple of dynamic languages, and from my posts it is fairly easy to guess which ones. Starting it up was a lot of fun, especially once the first emails from real customers started to pour in. Things got even more exciting once very happy emails started to appear. And of course, we were giddy with excitement when the first paying customers started to trickle in.

Check it out, spread the word! Pikluk - the web browser and email for kids!

P.S. This is my last shameless self-promoting post, I promise :)

Monday, November 19, 2007

Adobe Flash/Flex. Plague of the Web.

Someone posted a question recently on Hacker News asking what kind of new software technologies readers are into. A few years ago I would have been quite eager to jump into such a discussion, raving about something really cool I had just discovered and picked apart in my free time.

I do not follow every announcement from Microsoft, Sun or even Google anymore. After riding this crazy technology train ever since I graduated in '99, I forced myself to slow down and re-evaluate everything I have ever picked up. Turns out it wasn't that much. More than half of what came around has become obsolete or irrelevant, or has been proven dead wrong.

On the other hand, I see the same old (sometimes scary-old) good ideas coming back into fashion over and over again, re-packaged into newer and trendier buzzwords.

Did you notice how web programmers pissed all over themselves when they happily discovered MVC? Do you have any idea how old the concept is? Didn’t you find it ironic how JavaScript suddenly became trendy, and not just for the Web, and how its zealots love to name-drop the first-class functions and closures it borrowed from the second-oldest programming language in the world?
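(In case the name-dropping is unfamiliar: a "first-class function" is simply a function you can pass around and return like any other value, and a "closure" is a function that remembers the variables of the scope it was created in. A tiny, purely illustrative sketch in Python - nothing Lisp-specific about it:)

    # make_adder returns a brand new function: functions are values here.
    def make_adder(n):
        def add(x):
            # 'add' is a closure: it remembers n from the enclosing scope
            # even after make_adder has returned.
            return x + n
        return add

    add_five = make_adder(5)
    print(add_five(10))   # prints 15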

This is why I started moving back in time about two years ago; my latest books are all about “good old stuff”. I met the beauty of Smalltalk and the liberating flexibility of Lisp. Bam-m-m! And Boost with its binding facility did not look so sexy anymore… These languages, or should I say cultures, did not land me any gigs and I haven’t done any projects with them, but they helped me notice and pick up Python and Ruby, discover bash and vim, and enjoy the freshness and simplicity of a plain ASCII text file, free of XML garbage.

Old tech can feel very refreshing after years of Win32, GDI, COM, XML/XSLT/XPath and .NET. Most of that is nearly useless now, and the stuff hasn’t even been around that long... I guess I got lucky by never touching Java after college - that would have been another pile of wasted neurons in my head.


Some "New" Tech is Scary.

In fact (or maybe it comes with age) I have started to dislike "new technologies". They are scary, like Flash/Flex/AIR from Adobe. I see a great danger in the "web runtime" being controlled by a single company, even if they are playing nice and flirting with the open source community. If not stopped, Adobe will soon become the "Microsoft of the Web" and they will be in control of the "Web OS", turning browsers into dumb and irrelevant hosts for the Flash runtime. That is precisely why I want AIR to fail and I hope that more web applications will stick to standards-based HTML/CSS/JavaScript instead of moving into the Adobe world.

It is sad that Miguel de Icaza, whom I have great respect for, does not see this when he enthusiastically speaks out on the Silverlight issue.

Adobe already controls video on the Internet (YouTube, Google Video) and soon, judging by the trend established by startups like Anywhere.fm, Songza and Scribd, Adobe will have the sound and "universal document" formats under their API umbrella as well. Do not underestimate the power that comes with a monopoly on APIs.

People go nuts these days if they discover a useful web application with a "Designed for Internet Explorer" stamp on it, but nobody has ever complained about "Designed for Flash" sites that often have zero HTML content on them. Why is that? If anything, Flash is more proprietary in nature than IE. Not only does Flash have a closed-source implementation, everything about it is proprietary and not standardized: the runtime, the development tools, the data formats, etc.

What is amazing is that nobody seems to notice... Flash is slowly taking over the Web, absolutely unnoticed. What happened? Where are you, open-standards proponents? Where are you, folks who used to whine about IE's monopoly? At least Internet Explorer supports a significant portion of the HTML/CSS/JS standards. Sure, IE has always had Microsoft-specific extensions, but they were only extensions: the basic foundation has always been open. Flash is none of that - Flash represents Adobe's own, complete redesign of the Web.

The only thing stopping them from taking over completely is the somewhat higher cost of entry for content producers, i.e. developers and designers. A simple Flash/Flex web form with back-end server scripts is not nearly as trivial to implement as a typical PHP/Rails one-page tutorial.

I like the addition of the VIDEO and AUDIO tags to HTML5, and I like SVG, but I am afraid that by the time all mainstream browsers adopt these features it will be too late. More specifically, I am afraid that IE8 won't be out for the next 3-5 years. Without a new IE, Flash/Silverlight will win.

Some would disagree, arguing that developing for Flash/Flex frees you from browser incompatibilities. They would say that Adobe has not been hostile the way Microsoft has been. To those, I say that any monarch, however wise, loving and open to his people, dies sooner or later. And nobody can tell who is coming next.

Therefore let's stick to democracy and avoid rulers altogether.

Monday, October 22, 2007

Respect C Programmers

During the last few weeks I have noticed quite a few blog articles and online discussions dedicated to something awful: bashing C as a language, and bashing the C programmers who still use it, especially those who LIKE to use it. The bashing seems to come primarily from adepts of higher-level languages, filled with self-proclaimed superiority mixed with camouflaged attempts to proclaim the other side stupid. Even well-recognized C hackers like Torvalds got their share of criticism in a rather rude manner.

And I have a big problem with that.

But before I explain myself, let's look closer at what the high-level-language geniuses have to say. Summarizing, it predictably sounds like C is too low-level, and using it in 2007 is like programming in assembly language, which is Forrest Gump stupid. More specific drawbacks of being so bare-metal include, but are not limited to, the following statements:

Programming in C is too slow and unproductive.
C programmers by default are performing premature optimization.
C-style of memory management is inferior to automatic garbage collectors found in modern VM implementations.
C programs have more bugs in them.
C programs are prone to security vulnerabilities.
C programmers write too many for loops and look stupid doing it.
C is not that fast. Actually Java is almost as fast now!
C programmers are not cost effective, and picking C as a language of choice is a bad business decision.

You could have a full-blown argument over each of those bullet points, but I won't. Instead, I want to point your attention to something else.

Dear Haskell People...

You see, most of you Haskell/Lisp/Erlang gurus are true intellectuals and are very capable. You write very well: many great “functional-style programmers” like you have nicely written blogs filled with sacks of good thoughts and ideas.

However, when I look around I rarely see any results of your work. I browse the APT repositories on Debian, sometimes I check out various shareware for Windows, and I follow promising software reviews. I see interesting new projects pop up here and there, and almost everything is done in C. Maybe it is just me, but most of the software I'd personally use on my computers is written in that retarded and unproductive language you dislike so much.

I am counting real software, not software jobs. Because, as many of you already know, most software jobs out there exist solely to produce boring crap around an SQL server of some sort for the enterprise, with very few exceptions. That's enterprise software for ya. People still use it, but only if you pay them salaries. Here I want to concentrate on software people actually want to use themselves. Even pay for it. Real dollars, you know? And I don't see much of that written in higher-level languages, let alone functional ones.

Moreover, looking around I can't help but notice that somehow those "stupid and ignorant" C programmers managed to build everything that powers the Internet and computers in general: GCC, Apache, the operating systems they run on, numerous tools, GNOME and friends, Doom, system utilities and personal money managers - the list is huge. Nearly everything is written in C. And it's not just legacy: if you focus on projects started within the last five years, the picture will not change much.

Meanwhile, the higher-level-language intellectuals have produced very little. On my Ubuntu desktop I do not believe I have a single piece of software written in Haskell or Lisp. I tried Eclipse but kept looking for something usable until I discovered Vim. Even mighty Python has mostly been kept as a tool for writing helper scripts, a more powerful replacement for bash. I am not saying there aren't any programs written in higher-level languages, I am just saying that the ratio of C to everything else is profound.

Why is that? Haskell/Lisp hackers, I enjoy reading the books you write and the blogs you maintain, and I find your comments on most programming topics on Slashdot very insightful. However, if your language is so superior and makes you so much more productive, why don't you flood us with actual software written in it? Besides your own server-side web applications? I see so many articles and tutorials that end with "WOOW in 5 lines of code!" or "WOOW re-written in less than a week!" that it seems all that increased productivity and your superior intelligence should compensate somewhat for your smaller number of heads.


Meanwhile, C programmers just get things done. This is why the survival of any new high-level super-language will always depend on having a reliable and easy interface to C. Why? Because the entire world is written in C, by guys who are too busy coding and do not have time to bash cocky Haskell people in their blogs.

Peace. Respect. Love.

Monday, October 15, 2007

Life before Google

While driving to work the other day I caught myself thinking: what did I use for online search before Google came out?

There was Yahoo search, there was definitely AltaVista - I loved the way it sounded... But did I use them on a regular basis? How often? I can't recall.

I can remember my “Google day” very clearly: I was looking to buy a car, and one of my friends stopped by my office right before lunch and suggested running some Google searches to see what would come up. I do not remember being particularly impressed by the search results, but the simplicity of the front page was what caught my initial attention.

Then, almost the next day, someone suggested searching for “dumb m@$%ker” online, and the bio of one of the presidential candidates showed up. Jokes like that were much easier to pull off back then.

And I have been happily googling ever since. However, I honestly cannot answer this question: what did I search with before?

The more I think about it, the more I am starting to realize that I didn't. I did not search back then. Well, not that much. It wasn't part of my daily ritual, you know. “Internet without Search” sounds almost idiotic in 2007, but in 1999 it was OK.

It just occurred to me that I didn't do a lot of things back then. And only now do I realize how different the Internet has become. It appears that Google, all by itself, dramatically changed the way I use the Internet. I am talking about my personal experience here and not trying to generalize, but I strongly suspect it is not just me.

What has changed since Google?

I stopped bookmarking things. Why? I can always search for the damn thing and find it again. In fact, it takes fewer keystrokes to get there, and in case something better comes along I'll get to it right away.
I stopped using specialized sites like edmunds.com. If I want to read up on some vehicle, or anything for that matter, Google will show me the specs and reviews much, much faster than stupid Edmunds with their multi-level menu drill-down, killed by a naïve question at the end: “what is your ZIP code?” Are you out of your minds, Edmunds?
Actually, Edmunds was not a particularly good example. I stopped using sites way better than Edmunds: weather.com, Yahoo movies, financial sites, E*Trade, the Fool, even the calculator, the dictionary and Microsoft Excel in many cases! Everything could be done with a simple text box and a “Search” button. Faster.
In fact, I stopped using the address bar altogether, just like many others - Firefox has an alternative “address bar” that is powered by Google and does it properly: no need to add “.com” or a retarded “www” to everything, and accidental misspellings get fixed. Isn't that sweet? Just hit Ctrl+K instead of Alt+D.
I do not have POP3 email client software anywhere. Gmail, even though it does not have folders and uses “collapsed” conversations without an option for normal people, was by leaps and bounds the best web-based email program. My primary address is 10 years old and does not end with '@gmail', but I tunnel all my messages through Gmail anyway, simply because they eliminate spam. 99.999% of it, like nobody else.
I stopped using my personal site. I still have an account and still pay for hosting for nostalgic reasons, but I don't need it. I used to store family photos and useful files up there on FTP, just in case I needed them when away from home. With ad-free Picasa and the giant Gmail mailbox I don't need that either.
Finally, all sorts of Yellow Pages became history: online, offline - does not matter. Google finds that stuff too. Phone companies finally (!) realized that as well and stopped bombarding me with those useless yellow bricks.



Who can compete with Google? Compete for my “Internet time”? Only Wikipedia comes to mind. In fact, I often throw a short and effective 'wiki' onto the end of my searches just to make sure I get Wikipedia's perspective as my first search result.

And I never paid a dime for all this informational nirvana... Incredible.

Tuesday, October 2, 2007

Is Privacy Overrated?

Recently I noticed an interesting trend among people I know. I am not sure how to express it in full other than by giving you a few examples.

A few people I know have WEP/WPA encryption turned off on their wireless routers. They claim they do not really care if someone uses their Internet connection. Warning them about the potential security issues - strangers breaking in through Windows networking and lurking around inside their computers looking for credit card numbers - did not raise as much concern as I expected. Not as much as it used to, anyway. How did they explain it? Apparently they cannot stand the hassle of dealing with passwords for every WiFi device that makes its way into their house, especially the ones that do not have keyboards.

I also know quite a few people, especially older relatives, who never managed to put their family photos online because they felt it was silly to have their private life exposed for everyone to see on the Internet. The issue was not technical: they have always known how to upload photos. But more and more of them are getting Picasa and Flickr accounts and uploading their pictures now. They have changed their minds.

Meanwhile, I have been approached by more than one person asking me to "build something" that would liberate them from logging into tens of web sites every day. Apparently all the mainstream browsers, with their "remember-your-password" features, are still not helpful enough.

Don't you find it strange? While the tech media cannot get enough of identity theft stories, and companies like Microsoft get hammered by the online press almost weekly for yet another exposed security hole in IE, real customers, real people (around me at least) seem to care about those things less and less. A while ago people used to call the GPS feature on their cell phones spooky because "anyone would be able to track me wherever I am", but look what really happened: even without GPS, you see thousands of happy Twitter users announcing their every step to anyone who'll listen.

Do we Really Care about Privacy?

Privacy is a broad term, and I prefer to think more about online anonymity. In the early Internet years there was a lot more excitement about the coolness of being anonymous. Expressing yourself, letting your voice be heard by thousands without fear of reprisals from an employer or your government - that was how the whole idea of the Internet was perceived by many who had never heard of it before.

Fast forward to 2007 and let me ask you: do you really want to listen to what Mr. Anonymous has to say? Have you noticed how meaningless and short, if not offensive, anonymous comments usually are? Well, enough about others, think of your own experience. Do you like being anonymous? Really? Think of every time a web application wants you to log in again and again, whenever you receive spam, whenever you type your name and credit card number for the millionth time in your life, or whenever your browser pops up a message about cookies or about submitting insecure data to the "scary Internet". And I haven't even asked the web developers how much fun it is for them to deal with anonymous visitors. Don't you find online anonymity simply annoying at times?

Does online anonymity really buy us anything?

I wonder what it would be like if DHCP did not exist.

Nobody is ever anonymous. Try to imagine an absolutely unique "IP address" assigned to every person on earth immediately after birth. You add that ID to their connection's IP address, along with the ID of the device they are accessing the net from. No need to log in: if you are online, "the Internet" already knows who you are. This gives us incredible accuracy in identifying who gets access to what, and when. Sounds like a nightmare, doesn't it? That is the scary future Hollywood warned us about in "Minority Report", huh?

But how different is it, really, from what we already have? After all, we are not really anonymous. Interested parties (think RIAA) will get you, even though they still need a court order to squeeze your real address out of your ISP. And imagine the state of affairs in North Korea: how anonymous are their Internet users, really?

It seems to me that we are getting all the headaches of the anonymous Internet without any real anonymity. And judging by what people around me are saying, the annoyances of online anonymity are eclipsing the fake sense of security it provides.

Don't you think that "scary future" is not that scary but rather convenient? Or am I being naïve?

Thursday, September 13, 2007

On Programmers and Business

Software development, just as science fiction writers predicted some fifty years ago, is becoming a more and more common activity. Software drives not just traditional computers; it now propels literally everything that plugs in: phones, cameras, TVs, cars, even washing machines. Knowledge of at least one simple programming language will soon be as essential as basic writing skills. Why? Because as the complexity of average software increases, professional programmers become less affordable and less accessible for basic programming tasks.

Let me explain.

Just recently a friend of mine was asking for some help with Excel. He needed a fairly intelligent macro for some intense spreadsheet transformations. I had to explain that it would probably cost him about six hours of a programmer's time, and that no programmer will do it for less than $60/hour. Actually... no programmer will do it. Period. Because it is boring. Because they have "better things to do" for $60 an hour. Outsourcing ultimately is not going to help, because the population of laptops, cell phones, TVs, microwaves and other programmable devices is growing much faster than the worldwide population of professional programmers. Therefore nobody will help you with your basic programming tasks.

And you will always have some basic programming tasks, simply because more and more of what we do involves programmable devices. And off-the-shelf components created by professional programmers will not keep up. Charging your customers money, for instance, used to be a purely mechanical operation: you'd put a smile on your face, stick your hand out, and a customer would land a $10 bill on your palm. Now you want a merchant account hooked to a payment gateway, all integrated into your accounting and possibly other software. And you will inevitably want something more, something custom, from your (without a doubt) highly customizable software. Why? Because it is customizable. But mostly because you already know: what you want is possible. Because it, whatever you have on hand, is programmable.

And programmers willing to help you are getting scarce. The alternative? Learn to do it yourself. Visual Basic (or bash scripting - depends on who's reading this) should sit next to English in an average school's curriculum.
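To make the point concrete, here is the kind of "basic programming task" I mean, sketched in Python. Everything in it (the file name, the column headers) is made up for illustration; the point is only that a job like my friend's spreadsheet transformation is an hour of work for anyone with basic scripting skills:

    import csv

    # Read a (hypothetical) sales export and total the amounts per customer -
    # the sort of "intense spreadsheet transformation" people cannot find
    # a programmer for.
    totals = {}
    with open("sales.csv", newline="") as f:
        for row in csv.DictReader(f):
            customer = row["Customer"]
            totals[customer] = totals.get(customer, 0.0) + float(row["Amount"])

    with open("totals.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Customer", "Total"])
        for customer, total in sorted(totals.items()):
            writer.writerow([customer, round(total, 2)])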

Programmers are expensive. Geez... they are getting more expensive every year. And the software they build is getting more complex. And more essential to whatever you do. Starting almost any kind of company these days involves hiring programmers. Your business does not need to be in information technology, but you have to hire programmers anyway.

Bad Programmers

Now let's think about starting companies. Some of them get born, grow, transform, evolve and eventually succeed. But others crash and burn miserably. Books get written about those failures, articles get published, and MBA students get fresh real-world cases to study and learn from. They blame CEOs for their mistakes. They blame ineffective marketing strategies, strong competition, and only god knows what else those MBAs are trained to blame failures on.

But you know what? I have grown to suspect that a lot of companies are failing simply because they hired dumb engineers. There are several factors that lead me to believe this.

First, as I said above, programmers are becoming more important as software becomes more vital to what most companies do. Therefore, the impact the quality of engineers has on a business has grown substantially. Second, it is very hard to find good programmers, because there are fewer and fewer of them measured in "PPD" (per programmable device). And finally, it is damn near impossible to tell a good programmer from a bad one unless you happen to be an engineer yourself. And most companies are not started by engineers.

Do they teach this in business schools these days? Do they teach that dumb engineers will have an immense impact on your business? A sufficiently dumb engineer may hurt you more than most competitors will. When organized in loose formations, even in modest numbers, they can even kill an otherwise healthy business. I'll write some more about those blood sucking yet fascinating creatures a bit later.

You may label this post as "self-important crap", and you are welcome to, but isn't it everyone's belief that his or her profession is the most important one? Much like university professors, who almost without exception believe their course is the most valuable.

If only I had a billion dollars...

... I would... no, it's easier to talk about you :-)

Assuming you want to start your own company, you will want to hire some programmers. You will absolutely have to, and the problem you will face is avoiding dumb engineers, because dumb engineers will ruin your business.

And oh my... Businesses in the US are plagued by dumb engineers. There are many factors that allow these unqualified individuals to get engineering jobs; the lack of decent interviewers is certainly a big one. A few years back I got into the habit of going to interviews just for kicks, probably because I had worked for the same company for nearly five years and had forgotten what it was like. What I discovered was shocking: literally anyone can get a programming job if he or she simply goes to enough interviews and tells the same made-up story in response to "so... what exactly did you do at company X? Your resume says you wrote a reporting system using Java and DB2..."

I am talking 2002-2003, and since then the job market has only gotten hotter.

Do Good Programmers Cost More?

What was even more shocking, however, was that many of those jobs paid a lot more than my previous employer did. And that company was ridiculously over-equipped with brilliant engineers. Heck, even now, after four years, I keep thinking that an average programmer at "Company N" was orders of magnitude smarter than most of the "enterprise-level senior software architects" with fat salaries I've met. I also hear that Google does not pay as much as most competitors, yet they somehow managed to hunt down and hire nearly every well-known open source developer around the globe.

You know what that means? As shocking as it may appear, money is not everything. Smart people like to hang out with other smart people. Similarly, mediocrity comes in volumes too.

The Church Method

Building a great R&D organization is almost like starting a church: the trick is to keep your "business types" in the basement and, most importantly, pick the right God. Lisp or Haskell will probably work. Put it up on a banner, real high! And here they come: your high-quality parishioners... They will build you the highest-quality Java-powered unbreakable billing system (or whatever stupid data-pumping software you may want) in no time. As long as you let them show up to work at 11am and worship their God as much and as often as they wish. And no meetings, of course. Just don't put "Java" in the job description and you'll be fine.

Those who are terrified by the total absence of meetings are worried for a very good reason. If you do not have much to do in such a meetingless environment, maybe your position should... (it is always hard to call someone else's baby ugly) ...not exist to begin with?

The Internet Approach

In fact, I can propose an even more radical way of building something great. Suppose you have money and you are looking for something "hi-tech" to invest in. You know what you can do? (Listen to this nearly anonymous advice from the Internet, right?)

You can, using the church strategy described above, attract as many smart hackers as you can possibly find, put them all in one comfortable place and leave them alone. No projects, no business people, certainly no marketing and no "team building" bullshit. Just give them an opportunity to meet, co-exist and collaborate. Give them computers and free beer. Give them Barnes & Noble memberships for life. Give them cigarettes if you have to.

And I guarantee that a number of fascinating events will occur. First, this ecosystem of smart hackers will reject the "false positives" - naturally, without any HR department or "project managers". They will be expelled by their superior peers and chased away. Then you will start seeing signs of self-organization and, finally, some bright sparks. Those will be semi-interesting projects here and there, probably one per engineer or two. Eventually the most interesting ones will gather followers and evolve further. The not-so-good ones will die off. And I bet that, given sufficient time, you may create something truly spectacular. Something that will revolutionize some markets or possibly create new ones.

Laugh all you want, but look at what the Internet did. Well... it surely is responsible for many things, but think for a second about the Linux phenomenon. An operating system, especially one as fully featured and modern as Linux, is probably the most complex piece of software mankind has ever attempted to create. The scary monster of the corporate world, Microsoft, has failed miserably at it: all they managed to build was Vista, and it took them six years! Yet an army of smart souls managed to self-organize and pull it off seamlessly. The Internet only served as one "big comfortable room" that allowed all the processes I describe to take place.

On Commercial Innovation

But what if true innovation in the commercial software space is dead? That may very well be the truth. You know why? Because the coolest software projects - most of their code base, anyway - were written between 10pm and 4am. And those, folks, are not your normal business hours. Surely some code gets written and checked in before lunch. Remember the last time your browser crashed? That was probably it.

Tuesday, August 21, 2007

Best Notebook for Linux

Ever since I rediscovered Linux I have wanted a dedicated computer to run it on. VMware and dual boot, even though I use both for work, were not good enough for personal matters: I wanted a 3D-accelerated desktop, fast boot times and the "true" Linux experience (whatever that means). Besides, it felt sickening to realize how much time I had been spending in my office when I could have done a lot of that work outside of it. I definitely needed a laptop.


Apple MacBook Pro
At first my mind was set on a MacBook Pro. I adore Apple OS X, and to me it is as much UNIX as Linux claims to be. But there were two major issues with this machine: cost and build. With a 7200rpm drive and some mandatory software the setup is pricey: dangerously close to $3K territory, but I did not see $3K build quality in the Apple hardware. Just by holding the MacBook in my hands and knowing my usage habits (no cell phone of mine ever survives longer than a year), I simply could not see a happy ending there.

What about a Linux notebook?
This brought me back to my original idea of running Linux on a notebook PC. Guess what I did first? I googled "Best Laptop for Linux" and was amazed. Apparently Linux still has a lot of issues with portable hardware, especially with el-cheapo sub-$1K mainstream machines from Dell, HP and the like. To keep costs in check while manufacturing those throw-away notebooks, Dell and friends jump from one cheap component to another, chasing the best deal, I suppose. The brave souls who write open-source drivers simply cannot keep up with all the possible "integrated solutions" found on a typical walmartized laptop. Power management seems to be a big issue as well.

Apparently, the trick to getting Linux running smoothly on a notebook is not to buy cheap crap, and to be very picky about your hardware. In some cases the installer will not even recognize your freakin' hard drive.

IBM/Lenovo ThinkPad T60p

I looked at several machines and tried out more than one. At some point I was prepared to compromise: it seemed obvious that every notebook I looked at had some inherent incompatibility with Linux, and all I was doing was picking which component or feature I could live without: hibernation, decent battery life, a decent graphics card, or working function keys, including the ones that control the brightness of the LCD screen.

Well, I'm glad to report I was wrong - in the end no compromise was needed. The search for a perfect Linux notebook is now over. There has been an elephant in the room, my friends, and a big one. From what I learned, the best Linux laptops ever made have actually been made by IBM, and still are. I know, I know - they are Lenovo now, but they are still the same dudes in the same offices doing what they do best: designing great-looking notebooks that are great performers and last forever.

The Keyboard
To begin, just pick a ThinkPad up, open the lid and type "Hello world" or "I hate you all", depending on your mood. You will feel the difference immediately. The keyboard is simply gorgeous. I have owned full-size PC keyboards that were worse than this! As far as notebooks go, the ThinkPad keyboard cannot be beaten: it has all the important keys, of the right size, in the right places: nicely aligned arrows, "Ins", "Del", the Home/End cluster - everything is proper. This is a full-size PC keyboard without any compromises in the form of misplaced buttons. Up until recently they did not even have the retarded "Windows" button, but they finally gave in...

Linux-compatible Hardware
Secondly, they use fairly common, Linux-compatible hardware. I installed Ubuntu 7.04 on the ThinkPad T60 without a single glitch. As far as I can tell everything works, even all the function keys. It sleeps and hibernates, and I can control MP3 playback and screen brightness. In case you are curious, click to see my exact configuration. I was even able to install drivers for the built-in fingerprint reader and integrate it into the GNOME login and screensaver, although I ended up not using this gizmo simply because typing a password takes less time than scanning. The bottom line, however, is that the ThinkPad T60 is 100% Linux compatible.

Why not ThinkPad T61?
Notice that I did not buy the latest T61, and for a good reason: it is simply too new, and people report that it will not work as flawlessly as the T60 does. For instance, one needs to be careful about which Wi-Fi card to order (go with the older one, the Intel 3945). Some function keys reportedly do not work yet, and standby/wakeup is not reliable. That won't last long, though: IBM is very serious about Linux support; they even sell ThinkPads with SUSE Linux pre-installed, with a full suite of "ThinkAdvantage" applications very similar to their Windows counterparts. You cannot order one of those off their web site, but you can, if you want, do that over the phone.

There is another good reason to go with a T60 if you can still find one. Those gems are available with gorgeous LCDs that use a more professional and useful aspect ratio (i.e. not wide). For instance, you can get a T60 with a 15" LCD at a native 1600x1200. Unfortunately I ended up with a wide-screen display (1680x1050), but at least I got 1050 vertical pixels. Not only that, but some of those screens (check out the 15" 1400x1050 model) produce spectacular colors and are great for someone who's into photography. The screens I've seen on the T61 are not like that: the colors have a "metallic" tint, they're somewhat dimmer, and the viewing angle is much narrower.

Regardless of that, if you are reading this in the 4th quarter of 2007 or later, all those issues with the T61 and Linux are probably resolved already. Go with the T61 then.

UltraNav and why it rocks.

It may sound silly, but this little thing makes all the difference in the world. Unlike others, who ship their laptops with regular touchpads and a couple of buttons, IBM's approach is very much UNIX-like: they want your fingers on the keyboard doing what they do best: typing. To assist you with that, IBM added a second row of mouse buttons above the touchpad. You use your thumbs for mouse clicks and the little red TrackPoint to move the cursor around, while keeping your hands in close-combat position: always ready for sudden bursts of keystrokes.

I can honestly say that UltraNav, coupled with the ThinkPad's perfect keyboard, nearly eliminates the productivity difference between a notebook and a desktop workstation. I have watched enough colleagues "working" on their Dell Latitudes - a profoundly disgusting experience, very much like stop-and-go traffic.

What about Performance?
Are you kidding? Software has stagnated behind hardware for nearly ten years (!) - of course the notebook is fast for everything I do. It's a Linux machine, remember? It does not need to run Windows with the typical 182 pre-installed do-nothing junk processes that you need a 3GHz quad-core CPU for.

Honestly, I did not care about CPU speed at all: anything better than a PIII 800MHz would do just fine. I was unable to find a T60 with a 1.6 or 1.8GHz CPU, and I view my 2GHz clock speed as a waste: both cores run at 1GHz 99% of the time, governed by the power-saving CPU frequency scaling feature (just like in any other OS). The things I did care about were more RAM and a dedicated video card, because "shared memory" solutions put too much strain on the main bus and affect (in my experience) even tasks unrelated to graphics.

The bottom line: the laptop is fast. With 2GB of RAM I constantly run two VMware sessions and a bunch of apps as well. After a while everything gets cached in memory and I don't even see the HDD LED blinking anymore. Somehow Linux manages to get the most out of a surplus of RAM: Windows caching schemes are way more conservative, though I suspect that has changed in Vista. When running Linux with plenty of RAM, getting a slower, larger and more power-efficient 5400rpm drive seems like a good idea now.

Want a problem-free Linux notebook?
Here you have it. If you wonder which notebook to buy to run Linux on, the answer is simple: get a ThinkPad. The reasons are:

Great build quality.
Excellent Linux hardware support.
Best keyboard in the industry.
UltraNav.
Awesome LCD screens with crazy viewing angles.
Subtle and purposeful business look.
Runs Linux.

Sunday, August 5, 2007

Wide Screens and Best Buy

Let me tell you a little something. To begin with, I must confess that I am not claiming this is a true story - I heard it from someone a good while back - but it serves my point down the road nicely, so here it goes:

The Tale of Blue Crystals
Once upon a time there was a company in the market to make a laundry detergent. Its name is unknown. What matters, however, is that its R&D department really had no idea how to make a laundry detergent. The kind that works, i.e. cleans shirts, jeans and the like. Their detergent sucked. It did not work. Consumers kept buying "Tide" and largely ignored the inferior product despite its lower price tag.

Being unable to afford good engineering, I can only guess, the company went ahead and hired a marketing guy soon to become known as the "Blue Crystals Dude". The blue crystals idea was simple: he proposed that the company cut production costs by dropping the expensive ingredients from the product, effectively making matters even worse. And to improve sales, he suggested adding blue crystals to the "formula": a dirt-cheap and harmless substance that did indeed look kind of blue. The crystals did not do anything. They were just blue. However, the magic crystals allowed the company to package a crappy product in a shinier box, slap "Improved! Now with blue crystals!" on it, and sell it at a hefty price.

What do you think happened? Customers loved it!

Cost Cutting in LCD Market.
A similar thing just happened in the market for notebook LCD panels, and the disease of blue crystals is steadily spreading to desktop monitors as well. The disease is called "Wide Screen", and this is how it was born:

Apparently some marketing genius picked up his high school math book and found out that, for the same diagonal, a squarish rectangle has a significantly larger area than an elongated one. In practice it means that the area of a 14" LCD panel with an aspect ratio of 4:3 is larger than the area of a 14" LCD with a ratio of 16:10.

That translates into this: a 16:10 display is cheaper to make than a 4:3 display with the same diagonal.
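For the skeptics, here is the back-of-the-envelope arithmetic, sketched in Python (the 14" diagonal and the 4:3 vs. 16:10 ratios are just the example numbers from above):

    import math

    def panel_area(diagonal, ratio_w, ratio_h):
        # Split the diagonal into width and height using the aspect ratio,
        # then return the panel area in square inches.
        scale = diagonal / math.hypot(ratio_w, ratio_h)
        return (ratio_w * scale) * (ratio_h * scale)

    print(panel_area(14, 4, 3))    # ~94.1 sq. in. for the "square" panel
    print(panel_area(14, 16, 10))  # ~88.1 sq. in. for the "wide" one

Same diagonal on the price sticker, roughly 6% less glass to manufacture.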


Wide Screens - the Blue Crystals of LCDs
Blue crystals ruthlessly strike again. It is unknown which company hit first, but the decision was made to take a full-size SXGA screen (1280x1024) and simply chop some pixels off the top, reducing the overall display area to save on manufacturing costs. That means making the display smaller: only 1280x800.

To make sure consumers wouldn't revolt, the marketing term "Wide Screen" was born, to make suckers feel like they were gaining something from this cost-cutting exercise. Never mind that the horizontal resolution did not get any "wider"; only vertical pixels disappeared, forcing consumers to scroll down on pretty much any web site, any document, any photo, or anything not tiny. The world largely went back in time to a pathetic 800 vertical pixels.


Usability Issues and WebSites running Wide
Eight hundred is your new total display height. Now, if you subtract the pixels needed for your browser’s mandatory menu, title bar and status bar, you end up with roughly 650 pixels available for whatever web site you happen to open. However, all web sites have their own top-level menu, a logo and such. CNN.com takes 130 pixels for those, so even after you maximize your browser window, you end up with only 520 pixels left for the actual content. Welcome to 2007.


How about TVs?
Every time I bitch about this to my friends I hear nonsense like "wide screens are the future: look at the TVs, for god's sake!". Come on, dudes: you cannot compare TV screens to computer monitors. You watch a moving picture on a TV, and you read text on a computer. Well... most of us, most of the time. Video and text are very different in how people perceive and consume them. Books are not wide, folks. Your eyes cannot follow lines that are longer than just a couple of inches: that is why newspapers, the only paper medium in a "wide" format, use freakin' columns. And not only that: newspapers have plenty of "vertical pixels" as well.

Therefore do not kid yourself: N diagonal inches at 4:3 gives you much more usable real estate than 16:10 at the same diagonal.

It is interesting to note that most web designers keep producing vertically oriented designs, one after another. It has become almost the norm to see only a site's top-level navigation in the default browser window: to get a glimpse of what's there you're forced to maximize, enjoy the empty "ears" on both sides of your wide screen and, of course, scroll down.


Marketing at Work
Did customers fight back? Nope. They loved the blue crystals again. Today it is virtually impossible to buy a notebook with a full-sized LCD display. Believe it or not, I have even seen software people write in their blogs "... First thing I want for my new laptop is to have a wide screen, cuz I like to watch movies on a plane..." Whoever you are, blue crystals dude, you are a genius - I must give you that. Making college-educated folks blog "I want my next laptop to be missing roughly 200 pixels" is something you can proudly report back to your marketing professor.

I looked for a notebook with a full-sized 1280x1024 LCD. It does not exist. Or maybe I did not look long enough. I played with Newegg and PriceGrabber, I googled, I even propelled myself to the nearest friendly Best Buy, to no avail.

Which brings me to my second, smaller rant for today.

What the Hell is BestBuy selling?
My experience with Best Buy has always been limited to rare and isolated events, whenever I needed some blank CDs urgently enough to pass on newegg.com and other Internet shops. Basically, it means I have opened my wallet at Best Buy... well... maybe once in my life.

Jesus motherloving GOD!!! Have you been to Best Buy lately? Five or maybe seven years ago it used to be a place where cool geeky teens would go to find out what new and exciting stuff was out there. Best Buy was a perfect place to waste your lunch break drooling over whatever newest gadget they happened to put up on display that day. Not anymore.

Best Buy sells... How do I put it nicely? Well... they sell obsolete shit. I am serious, I am not joking. Check this out: they sell portable CD players! How a CD player counts as "portable" is beyond me, but Best Buy sells them. Yes, these antiques from the 80s, and they call them portable. Sure. Anyone can easily load one of those into a sizeable male purse. (I have semi-European roots.) In case you're still not convinced that Best Buy has turned into a history museum, I have another one for you: they have a dedicated aisle for telephones. The kind you plug into wall-mounted outlets. Remember those? Just like the ones grandmother used to curse at before she passed away in 1988...

I honestly do not know anybody who still uses land lines to talk or rides a horse to work.

By the way, 1GB of decent DDR2 RAM is about $140 at Best Buy, which is roughly 250% more than what newegg.com sells it for. Maintaining an aisle dedicated to land-line phones has to cost you something, I reckon...

Wednesday, August 1, 2007

Linux in 2007

Every now and then I like to take a break from my typical PC routine and install Linux, as any self-respecting geek should.

History
I started playing with Linux back in '98, without any particular goal in mind, and have kept doing so just for kicks about once every two years. Frankly, Windows has always felt good enough to me and I never really dreamed of switching to something better. After all, it is not the OS I use that matters; it's all about the applications.

So every couple of years I install Linux, I look at its somewhat rough desktop, my eyes immediately get fried by its awful, unreadable fonts, and usually I quickly retreat to the black and mysterious world of the text console. I may spend a day or two playing with some command-line toys and call it a nice try. The unreadable text by itself served as a powerful motivation to get back to Windows, every time. However, crappy fonts and primitive font rendering were not all.

Linux GUI and X
The Linux X system has always looked like a giant architectural mistake to me: you don't want to build your GUI subsystem as a "server" that "serves requests" from clients. The UNIX guys made matters even worse by implementing other GUI components in this manner, like the font server, for instance. Frankly, I can only theorize on why the Linux UI has been so painfully unresponsive. Even the mouse has always felt awkward on Linux, and so has the choppy window resizing. Maybe those "servers" are not the reason. Well, let's get back (or move forward) to 2007.
I am happy to report that in 2007, Ubuntu, one of the many Debian-based distributions, looks fine. No, don't get me wrong: the fonts are still screwed up by default, but at least there is Google and great support from the community of Ubuntu fans. Apparently, in 2007 Linux knows how to render fonts properly, with anti-aliasing and font hinting, but due to patent issues with Microsoft, Adobe and Apple, those features are disabled by default. Lucky me, I know how to edit an ASCII file, copy those gorgeous Windows TrueType fonts into the Linux fonts folder and restart my X server. Nice! Not quite Vista/XP quality, but very much usable, especially at the modern something-XGA resolutions we are all accustomed to.


Oh! The fonts!
I wish I could close the issue of fonts, but I can't. As it turns out, many applications simply ignore the system-wide font settings and render their own. "How the heck is that possible?" would be my question, coming from years of Win32 GDI programming, but apparently you can do that in X. OpenOffice happily makes itself totally useless by rendering its own crappy fonts that are as hard to read as they were back in '98 - worse than Windows was in 1991. I guess OpenOffice in 2007 compares favorably to something "graphical" from the 70s that I am not familiar with. Heck, I'm only 30. Firefox seems to have its own ideas about fonts too, but at least they look much better.

Hardware support and Gnome
But guess what - I don't need OpenOffice; GVim works fine in GNOME, thank you. Oh, speaking of GNOME: I have not tried a modern KDE yet, but GNOME has definitely evolved. The overall look and feel is very polished and professional. Everything that I expected to work just worked. Ubuntu even allowed me to use the proprietary, binary nVidia driver for my video card, and everything was responsive. Not quite XP-level responsive, where windows and other graphical objects have an almost physical, real-world feel when you move them around, but certainly more responsive than Vista. When I plugged my fairly basic Motorola phone into USB, a little iPod icon appeared on the desktop and all the MP3 files from the phone showed up in GNOME's music player. Sweet. XP doesn't do that! Well, of course it does, but I never noticed it before...


Installing Software
Ahhrr, it turns out that MP3 is a proprietary, patent-encumbered format, therefore Ubuntu cannot legally install an MP3 codec for you, but it conveniently offers you the option of installing it yourself with 3 mouse clicks. The same applies to video codecs. Downloading and installing the missing codecs worked a lot more smoothly than it does in XP, which due to its age also comes without DivX.
Most of the curious souls who try to compare Linux to Windows start by picking Windows features one by one and comparing them with how Linux does the same thing. That's just wrong, because there are things (at least in Ubuntu/Debian) that Windows simply doesn't do. Take the package management system, for instance. Finding free software and installing or removing it from a central repository is awesome. Windows, with its perpetually broken registry and freakish MSI, makes it scary and generally "not safe" to install new software. In fact, Windows gradually gets more and more broken as you install things. Hey, computer geeks, how often do you get a call from a friend complaining that "my Windows computer got a lot slower"? And what can be said about an OS that discourages you from installing software on it?

And that is the most beautiful part of the Linux experience: there is lots of free software. One can spend days browsing the Ubuntu/Debian repositories, installing, playing with, removing and comparing all kinds of programs. Let's start with the software Ubuntu comes pre-installed with. It was carefully pre-selected and it shows: the standard programs are very well made. Almost every component is better than its Windows counterpart: the instant messenger is better and supports all the IM protocols I care to remember, the default image editor is a lot better, the default "Notepad" is also nicer, and the list goes on. My favorite is Rhythmbox, GNOME's music player. It has a very clean and intuitive UI that easily eats Windows Media Player for lunch. Somehow, even though I've been using Media Player for years, I have to admit that I STILL have very little idea how to DO anything in it. Rhythmbox is intuitive, simple and powerful. Rhythmbox is usable right away.

Is Ubuntu Good Enough for You?
So... is Ubuntu ready for the typical average user? I don't know, I am not average; moreover, I am nearly a computer genius, right? :) But I seriously do not know. As it appears, most people lately spend their time in their browsers, listening to MP3s playing in the background. They may download digital photos from their camera, organize them into albums and possibly email them to their friends. What else? Rip music CDs into MP3s? Check! Burn MP3s onto CDs? Check! Back up files to an external USB drive? Check! Write simple, basic documents? Check! Hmmm... it appears that Ubuntu will work just fine for them, once they figure out the font madness.

But hey, I am not an average user. I am a software developer. And you know what? For software people, Linux is a dream OS these days. It wasn't the case in '98. I do not believe it was the case in 2002 either. But in 2007 it is definitely an OS for software people. Why? I don't know where to begin! Haven't you noticed that most of the newest and coolest stuff people rave about in their blogs is Linux-native? Seriously, look around. It is much easier to try out and play with Ruby, Python, Haskell, Lisp, Squeak, OCaml, the D language, Rails and Django, PHP and friends - all of them are first-class Linux citizens that do not "feel good" on Windows. Even Java does not feel as native on Windows. Do you read tech books, or are you a 9-to-5 kind of programmer? In case you do read, and care to recall: what OS is most commonly used to produce the screenshots in tech books these days? And whenever you find an interesting piece of open source code, I bet it's a tgz file, isn't it?

Who is it For?
What I think is happening is that many bright minds in the computer field are moving away from Windows. And for "computer people" who like to keep up, UNIX *is* the system of choice. I said UNIX, not necessarily Linux, because Mac OS X is a nice programmer's OS too. In fact, OS X is probably the *only* OS for guys who like to build really nice GUIs and get paid for it.

Ahrr... the mandatory conclusion... Here we go: Linux has become a gorgeous OS for "computer people", and a very good alternative to Vista for average-plus users who don't have very specific Windows-only needs, like Photoshop or DSLR RAW converters. All they have to do is enable readable fonts. But truth be told, if my mom asks me which computer/OS to buy, I will have to send her to http://apple.com

Notebook Support
Meanwhile, I am returning my sweet Dell Vostro 1400, because Ubuntu would not run well on it, and getting a somewhat obsolete ThinkPad T60 with Linux-friendly hardware. On a side note, in case you absolutely need to log into the Vista prison every day, the Vostro 1400 is one sweet and cheap piece of hardware - easily the most pleasing item Dell has ever sold me. I will keep looking for a perfect Linux notebook and will be back with my findings.

Desktop Applications are Dead

You think I am speaking old news here? After all, everybody and his sister has been screaming in the streets lately about the death of desktop programming.

The majority, if not all, of those screamers have always been "web app" developers. Who listens to those? They aren't even real developers, right? They don't know any better. They don't even know how to use malloc() and free() properly! Poor souls... writing foreach loops in their toy languages, acting like little girls trying mom's makeup while she's at work. Surely they have naturally been pissed at us, real engineers, who create real applications, allocate and release our own memory, and pass pointer-to-pointer parameters.

Sure, we've been looking down at web-app "developers" - rumor has it some of them are ex-taxi drivers. Do you trust those folks proclaiming that the desktop was dead? Besides, tell me this: what has been MORE glamorous in IT than web apps since the rise of Yahoo? However, get this: the desktop is dead.

Who is Ev?
You are hearing this from a seasoned desktop developer. Yes, from one of those cynical dudes who always puts "online applications" in quotes, because, you know, they are not "real" apps. Oh! Another classic: I am someone who never considered JavaScript to be a "real language". How about that?

Even more! I've been in love with Microsoft for as long as I care to remember! (Still reading?) I used to jump on every new technology coming out of Redmond. Heck, I even seriously believed that every programmer should have a good understanding of his hardware and some assembly language credentials. Moreover, I believed that C++ was the godfather of all programming languages and Java was just a simplistic subset of it, meant to be used by the average Joe with an average degree in something remotely technical - a language created by a corporation for other corporations to have their pathetic RDBMS-wrapping software development done cheaply by armies of disposable chimps. Well, while the former still holds true, the desktop is dead anyway.

Who Killed the Desktop?
Do you know who killed it? Microsoft did. Yes, I am pointing my finger at Redmond, and I am neglecting the advances in HTTP-based approaches to problem solving. The desktop is dead not because the Web is great. Nope. The Web still sucks. After all, the majority of it was created by non-real developers, remember?

The desktop used to be very much alive, but it has been getting worse and worse. It took a long time, but finally the desktop has gotten so bad that *even Web-based* UI now looks decent and usable!

Forget for a second about collaboration, information sharing and all those other goodies that the Web gives you. They are not important in this context. I am talking about overall user experience - the user interface, primarily. Besides, you can collaborate, share and do whatever you please from your favorite desktop application; there isn't a technical obstacle to doing that. Take any first-person shooter and its multiplayer capabilities - people have been killing each other online for years. Need another example? One word: iTunes.

Standard Runtime Support.
Why has the desktop gotten so much worse? What's broken? Well, try installing something on it. Actually, dig a bit deeper: try to develop something for it and then have people download, install and use it. Do you know what kind of runtime support you get *standard*? Old school, circa 1991! Yep, that is what you get. Do not point your finger at .NET: it is still, in essence, a subset of Win32, built mostly on top of it, and it is still not available for us to use! Yes, the .NET runtime is not part of the most popular desktop out there - Windows XP.

Microsoft made a strategic mistake by NOT bundling .NET 1.0 into the initial XP release. Moreover, even jumbo-sized Service Pack 2 did not include the newest .NET.
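
To make the pain concrete, here is a rough sketch (mine, not a canonical recipe) of the kind of check a 2007-era Windows installer ends up shipping, precisely because a stock XP box guarantees you nothing beyond Win32. It pokes at the well-known "NET Framework Setup" registry key for .NET 2.0; a thorough installer would also read the "Install" DWORD under that key rather than just testing for the key's existence.

    /* Sketch: detect whether the .NET 2.0 runtime is present before
       launching a managed app, using only plain Win32 (the one runtime
       you can actually count on). Link against advapi32. */
    #include <windows.h>
    #include <stdio.h>

    static int dotnet20_installed(void)
    {
        HKEY key;
        /* Well-known detection key for the .NET Framework 2.0. */
        if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                          "SOFTWARE\\Microsoft\\NET Framework Setup\\NDP\\v2.0.50727",
                          0, KEY_READ, &key) != ERROR_SUCCESS)
            return 0;   /* missing: the user faces a big runtime download first */
        RegCloseKey(key);
        return 1;       /* present: safe to launch the managed app */
    }

    int main(void)
    {
        printf(".NET 2.0 runtime %s\n",
               dotnet20_installed() ? "found" : "NOT found - go download it first");
        return 0;
    }

Shipping that kind of boilerplate just to find out whether your app can even start is exactly the "old school circa 1991" experience I am talking about.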

Here we go, folks: the only standard runtime available to a desktop application is still the old and rusty Win32 API. Otherwise...

Otherwise your poor users - those who still prefer desktop apps - will have to download and install that fat .NET runtime just to try out your little piece of software. Easy to do, you say? We'll get to downloading and installing in a second.

Downloading Software
The downloading! My favorite area where desktops suck. Have you noticed how scary the downloading process has gotten lately? In some cases your users will face 3 (three!) scary dialog boxes, warning them that they are about to (potentially) screw themselves in the butt by installing your "potentially dangerous" software. Well... let's hope our users are brave and will make their way through.

Let them run the installer. But where is it? I mean, where did it go?

- "I just downloaded it and it went somewhere. Where?".

Sounds familiar? No? If not, you just haven't developed any desktop software yet. A lot - and I mean it, tons - of users are not capable of figuring out where your precious little installer goes. This is why big desktop software companies have step-by-step instructions up on their sites on how to... well, check this out: on how to download an installer and find out where it went.

Desktop Security.
If Microsoft really wanted us all to develop more desktop applications for Windows, why wouldn't Internet Explorer let users drag & drop our awesome desktop applications from a page to... well... to the desktop? After all, people have managed to learn not to put everything they see into their mouths, learned how to stop on red and go on green, how to find a restroom in an unfamiliar restaurant and wipe their little behinds when they're done. They'll learn not to drop viruses onto their desktops.

Ok, a user finds your installer, clicks or double clicks... Waits...

-"Did you download this binary off the Internet?! Are you crazy!? It's me, Vista, talking to you, stupid person! Do you trust this software you just downloaded? How about I make your screen go dark and you type your administrative password? I would also like to play a gunshot in your speakers at maximum volume, but some people here in Redmond figured it was a bit too much."

Buying desktop software versus using online apps these days is like going to a supermarket to buy Excedrin (the desktop) and accepting the trouble of standing in line at the register, only to have the cashier bitch-slap you, hysterically screaming in your face:

- "Are you sure you want these pills!?!? Are you sure?! What if they cause testicular cancer?!"

... Meanwhile, the less effective Advil (online apps) is free, and you don't need to go anywhere... Do you want to be in the business of making Excedrin? Me - not anymore. Excedrin is dead.

- User: "Here! Here! Shut up, you crazy OS that came preloaded on my laptop! Take my password! I still want to install that scary desktop app!"

- Vista: "The password is correct, so if you insist... Oh, in order to protect you better, may I ask you to wear a condom, while using that desktop app? It came from the Internet, by the way..."

Windows Installer and New Computers in US
Pretty easy, huh? Well... easy, that is, assuming your user does not happen to be one of those victimized PC owners who purchased their PCs somewhere on US soil, where most computers come pre-damaged with sacks of crapware: third-party software firewalls, "internet security" suites, anti-spyware and adware programs, and other internet-disabling, I/O-consuming, eating-your-dual-GHz-for-lunch junk. Those programs intercept and twist the calls your application makes into that same old and rusty Win32 API, throwing more and more scary and confusing dialog boxes at your user.

And the poor bastard (assuming you still haven't lost the sucker at this point) has to deal with and calm down this zoo of monsters that has infected his PC to look after the "safety" of his desktop.

Do you still want to be in the market making Excedrin? Seriously, would you? In case you are still not convinced, let me remind you what's ahead.

For starters, you'll hear about a strange application called "Windows Installer" (MSI) that will mysteriously pop up later on and try to "repair" your application for no apparent reason. That, perhaps, is a side effect of your poor user daring to install more than one (!) application on his desktop, leaving MSI to keep things "running properly." Secondly, the state of most Windows machines that have been running for a while - say, two years - without professional "assistance" is absurdly bad: the registry is a mess, reboot times are ridiculously long, and (real-life example) Windows Installer tries to repair Microsoft Office every time a user touches your application.

And you know what? The user is getting tired of this crap. He finds the clunky, slow, Ajax-spiced UI of most web sites far less annoying, by leaps and bounds. He likes to maximize the browser window and never face all the ugly, slow shit that Microsoft, Dell and friends have dumped in there for him. My point is that Windows is a very fragile platform that people are learning to avoid messing with. Windows makes it hard for users to explore various software options. Can you honestly say you would be comfortable installing and uninstalling 50 random desktop apps in one day? Personally, I use VMware for that.


The Finale
Stupid user... Using those inferior clunky web apps... Ignoring the superior "user experience" provided by "rich" desktop applications.

And you know what? I can't blame him. Although I could not resist the urge to use the desktop-based notepad.exe to rant about this crappy deal. Blogspot's HTML "editor" (yet more quotes) is kind of... not quite ready to have text typed into it.