Saturday, November 29, 2008

Death of the Premium

What a great article on Gizmodo! After numerous failed attempts to buy a decent laptop recently, I’ve been ranting about the depressing effects of Sturgeon’s law on the computer industry. My wife tells me I’m getting old. Perhaps... But there is one thing I am sure about: this isn’t just about computers.

It has been universally accepted that 90% of everything available to buy, watch, eat or listen to is crap, but the mere existence of that “normal” 10% made me feel OK about it. Want something that works? Want to see a movie that wasn’t made for a dumbass? Want to eat an actual grown vegetable instead of a toxic manufactured biomass sprayed with “taste” yet approved by the FDA because it hasn’t been proven to kill instantly on contact? Well, you could always pay extra and get the “other 10%” - the stuff that works, movies that make sense, food that’s been grown as opposed to manufactured, software that doesn’t crash, a live customer service rep instead of a robot over email, etc. Yes, premium goods and services usually cost a lot more, but hey! - this food will give your dog a chance to actually live as long as the Wikipedia article says he’s supposed to.

The problem is that we’re seeing a slow death of the premium. Many companies are deciding that the other 10% just aren’t worth the trouble. The market doesn’t want it as much. The consumer prefers free, slow, crashing software over paying $59.99 for something that works. Pre-broken computers are popular because they would have cost $50 more if sold in proper working condition, without the damaging crapware on them. The weird crunchy red objects at my local grocery store are called “strawberries,” and there is a growing generation of kids who actually believe that strawberries are supposed to be like that. You cannot buy a laptop with a usable LCD: today your only option is a TN-based, 6-bit, low-contrast, glossy widescreen with pathetic color reproduction, often spiced up by horrendous light bleed. Yeah, those 12-megapixel noisy photos from your latest Canon camera will look fantastic! Never mind that 4-year-old cameras available for $20 on eBay actually take better-looking photos. And even Apple won’t build you a 16.7-million-color laptop despite their pricing: a dithered 262,000 is “good enough.” And who’s complaining? A $499 freakin’ computer can do no wrong; you can buy one every Christmas (and why shouldn’t you? Next year they’ll drop another megapixel into the integrated webcam and the keyboard will be glossy too!)

Our bottom-line-oriented, cost-driven consumer culture is dragging us into a world of affordable mediocrity, where everything is commoditized, standardized, made in China and very affordable. There are grown-ups now who call shopping their hobby. I guess nobody wants stuff that works simply because most of what people buy never gets any real-world use; the mere fact of buying the goddamn thing, not using it, is the point.

There are two practical and unfortunate effects from all this. First, we can’t dodge the crap anymore by buying less and paying more. That stinks, but well... You can always picture poor kids in Africa or go 200 years back in time to realize how silly you look bitching about those LCDs... But there is another, more troubling aspect of it: the death of the premium means that innovation, engineering and science don’t matter as much as marketing, advertising and packaging. As much as I hate the military, they remain the only customer capable of demanding more. And paying for it. And since the Cold War is long over, I guess we’ll continue living off the tech we built to fight it. Until the aliens, of course, threaten to conquer us all.

Sunday, November 2, 2008

Are Social Networks Underhyped?

Apparently someone thinks so.

Pretty bold statement, in spite of years of Facebook's repeated failure to make any money. If you don't have the time to read the article, the argument goes like this: we haven't seen their full potential yet. In the future the entire Internet will revolve around social graphs because, presumably, our social connections are what guides us in real life: doctor recommendations, business introductions, etc. And these real-life networks are going to transition online, dragging the rest of the Internet along.

Don't think so. All that stuff's online already; there isn't a greater degree of "onlineniness" possible. This is not an early-adopter game anymore. How is someone going to "get more connected"?

In fact I am observing the opposite: the mainstream public (judging by my non-techie friends) has been fully exposed to it, has had enough of it, and is slowly getting tired of it. We're not talking about early adopters anymore: everybody has an online identity and has learned its limitations and implications.

If anything, social networks are getting boring: outside of your real circle of friends you see the same strangers posing as smarter, better looking and happier than they really are. People aren't that different after all, and your real social network stays where it has always been: in your cell phone's address book.

And that's where I'll be turning for advice about finding a doctor or a car mechanic. I don't give a rat's ass about what "people on the internet" have to say. At least half of them voted for Bush. Twice.

Saturday, November 1, 2008

Yes. Windows is a ghetto.

This guy complains that developing for Windows is tough. No kidding. Ironically, his writing is mostly about the easy kind of Windows development. Web apps behind IIS? Not that hard, really.

Try building installable desktop software for Windows. That's where the real fun begins. It is quite common for teams doing Windows work to dedicate as much as 20% of available manpower to the installer alone (!). I've been coding Windows desktop apps in C++ since graduation in '98 and have always looked down on web programmers since they had it so easy. But as I got older I realized that all my Windows-fighting instincts and my in-memory database of gotchas are nothing to be proud of: for most of my career I was boxing against the platform I was working on, while others were having fun building actual software.

Some may ask why on earth someone would spend 20% of their time writing an installer; copying files shouldn't be hard, right?

Wrong. The problem isn't getting files into the right place; the problem lies in how inter-connected and "integrated" everything is on Windows. "Internet Settings" in the Control Panel, while they look like IE's settings, actually affect how the WinINet family of Windows API functions behaves. Then there is a big hairy mess called COM/ActiveX: you can't parse XML without it, yet there will be computers, maybe one out of 100, with broken XML parser COM registration. The same applies to various shell-related COM servers which are essential to desktop integration. And yes, IE is part of the OS, despite the illusion that you can uninstall it.
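
If that sounds abstract, here's roughly what "you can't parse XML without COM" means in code. A minimal sketch with the standard MSXML names, nothing fancy; the point is that the parser is a COM server, so your program lives or dies by registry entries you don't control:

    // Minimal sketch: MSXML is a COM server, so parsing XML starts
    // with CoCreateInstance. Build with MSVC; links against ole32.
    #include <objbase.h>
    #include <msxml2.h>
    #include <cstdio>
    #pragma comment(lib, "ole32.lib")

    int main() {
        if (FAILED(CoInitialize(NULL))) return 1;

        IXMLDOMDocument* doc = NULL;
        HRESULT hr = CoCreateInstance(__uuidof(DOMDocument30), NULL,
                                      CLSCTX_INPROC_SERVER,
                                      __uuidof(IXMLDOMDocument),
                                      (void**)&doc);
        if (hr == REGDB_E_CLASSNOTREG) {
            // Nothing wrong with your code: this machine's COM
            // registration for MSXML is simply broken.
            printf("MSXML is not registered on this machine\n");
        } else if (SUCCEEDED(hr)) {
            doc->Release();
        }
        CoUninitialize();
        return 0;
    }

On that one-in-a-hundred machine, CoCreateInstance fails before your code ever sees a byte of XML, and the usual "fix" is re-running regsvr32 on msxml3.dll, which is exactly the kind of repair work installers end up doing.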

Then there are the cases when other applications break your code: Symantec used to install their own version of the MFC DLL right into System32. Then there are super-aggressive anti-spyware/anti-virus/anti-whatever packages that are basically hacks breaking all kinds of legit software. Windows encourages this style of development, where an application essentially becomes a collection of COM servers scattered across your hard drive, hooked into the system via a complex mesh of registry settings. And there is no way around it: this is what MSDN tells you to do.

Then you'll have users complaining that when they launch your RSS reading software (or whatever it is you build), they get a popup that says "Windows Installer: configuring Microsoft Office" and disappears after about a minute of "collecting system information". That's MSI self-repair kicking in: your app happened to touch a COM component whose Windows Installer registration belongs to Office and has gone bad, so the OS decides to "fix" Office before letting your code run. And users will tell you that everything worked great for 2 months but then "the computer did something" and this popup started appearing.

And they keep adding more shit on top of the existing shit. Now, in addition to COM and MSI and the registry, you get this "side-by-side execution" bullshit, where you can't even tell which version of a DLL is being loaded and Windows Explorer essentially hides your own fucking files from you. Locating a misbehaving DLL becomes a debugging session of its own, where you'll need to decode the cryptic hidden directory names under WinSxS and extract a manifest from some executable's resources to see which DLL it actually wants. Having a DLL side by side with an executable isn't guaranteed to work anymore. When they rolled that out, I felt sick for a while.
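
If you're wondering what that "extract a manifest" step looks like, here's a rough sketch of a toy dumper (improvised here, not a Microsoft tool; the SDK's mt.exe can do the same job). The manifest is plain XML stored as resource type 24 (RT_MANIFEST) inside the binary, and it names the exact assembly versions the loader goes hunting for:

    // Rough sketch: dump the side-by-side manifest embedded in an
    // EXE or DLL, i.e. the XML that tells the loader which DLL
    // versions this binary actually wants.
    #include <windows.h>
    #include <cstdio>

    int main(int argc, char** argv) {
        if (argc < 2) { printf("usage: dumpmanifest <file>\n"); return 1; }

        // Load as a data file so none of its code actually runs.
        HMODULE mod = LoadLibraryExA(argv[1], NULL, LOAD_LIBRARY_AS_DATAFILE);
        if (!mod) { printf("cannot open %s\n", argv[1]); return 1; }

        // ID 1 is the conventional manifest resource for executables.
        HRSRC res = FindResource(mod, MAKEINTRESOURCE(1), RT_MANIFEST);
        if (!res) { printf("no embedded manifest\n"); return 1; }

        // The resource is plain XML; print it verbatim.
        const char* xml = (const char*)LockResource(LoadResource(mod, res));
        fwrite(xml, 1, SizeofResource(mod, res), stdout);
        FreeLibrary(mod);
        return 0;
    }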

And finally, there is no such thing as "the Windows API" anymore. An XP machine connected to a domain controller/AD is a very different beast from XP Home or Win2K. I'm not even mentioning Vista here.

For my last Windows project I went for "xcopy deployment": one single fat executable statically linked to everything it needed to run, with some help from open-source libraries; essentially the way Firefox does it. But this style of development isn't really targeting the "Windows platform" anymore; it's targeting a sane subset of the Windows platform.
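
In build terms it boils down to something like this (file names made up, assuming the Visual C++ toolchain): statically link the C runtime so the EXE carries no CRT DLL or WinSxS dependencies, and "installation" becomes a literal xcopy:

    REM Hypothetical names; the point is /MT, which statically links
    REM the C runtime into the EXE.
    cl /nologo /O2 /EHsc /MT app.cpp /link /out:app.exe

    REM Deployment is now just copying one file.
    xcopy app.exe "%ProgramFiles%\MyApp\" /Y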

There may be a few minor technical mistakes in what I've written. After all, I haven't seen Visual Studio in a long time, but I know that I am largely correct. And I don't hold any excitement about the upcoming Windows 7. Why would I? None of their initiatives is aimed at developers. They're tweaking minor and irrelevant UI pieces, massaging the services installed by default, but in the end it's the exact same old lame turd. If Microsoft wants to impress me: release a version of Windows without the system registry, without MSI, and make c:\windows completely sealed, so that after 3 years of running Windows there isn't a single new file sitting under that folder. Make everything, every little piece of the next version of Windows, fully scriptable, and include JavaScript, Python, Perl and Ruby by default. Make the server edition completely free. Integrate with Xen and the others instead of competing with them. Implement every imaginable suggestion for future HTML/CSS in IE9, strictly according to the spec. Break backwards compatibility whenever you want and keep selling XP for those who need it. Finally, try to reinvent the desktop: the damn thing hasn't changed in decades, and people still lose their own files without any help from faulty hardware: they just forget where the files are and what they're called.

Linux, and especially OS X, aren't perfect either – there are tons of areas for improvement. Innovate, Microsoft, instead of competing with Google in a lame contest of being the most technologically inclined media company. It's very easy to compete with a media company: just make sure you focus on software, and then where do you think the best CS grads will want to work?

The irony of it is that once upon a time Microsoft was kinda cool. I was a teenager, but I still remember those "good old" days of snobby IBM/Sun/Oracle salespeople with their super-expensive software, hardware, development tools and even documentation (I was told you could sell your car for a full copy of the OS/2 SDK). And then there were Microsoft and Borland, with great, inexpensive and innovative products, driving the PC revolution and attracting folks like me and my friends to CS.