Archive for January, 2006


So, by this point we hopefully have all heard about spyware, adware, and malware. We all know this is nasty stuff, and that we should avoid having it on our systems – these programs tend to slow things down, eat resources, get in the way, destabilize things, what have you. Bad stuff.

Today, I want to cover shovelware. Shovelware is software that is of such low quality, or is otherwise so nearly useless to you (or has such an atrocious usefulness-to-resources ratio), that just about the only way the companies behind it can get it in front of you is to pay hardware manufacturers to pre-install it on new machines, or otherwise bundle it with software you do want. If you don’t get a choice about whether to install it, it’s probably shovelware. Shovelware takes up resources, but usually isn’t as malevolent as spyware – it’s not trying to record keystrokes or the web pages you visit. But it does cost you. And often it ends up in front of you because someone somewhere along the supply chain didn’t have the right motivations – they weren’t thinking about how to help you, the customer, but about how to extract the most revenue. Maybe some engineer stopped being excited about going to work and it just became a job.

Who knows?

I’ve hit a bunch of examples of shovelware lately. It bugs me, because I like a neat, clean system. A clean system performs better, feels better. And logs in a whole bunch faster.

The first resource offense most shovelware commits is the task tray icon on Windows. Hey, some software really does have a legitimate reason for starting up when I log in and having an icon there. Palm Hot Sync, a virus scanner, Microsoft AntiSpyware – these all have a legitimate reason for starting up when I log in, and might as well have an easy-access icon. But QuickTime? Java? C’mon, these things can be started lazily. And don’t insult me by giving me an option to hide the darned icon. That’s not what I want – I want you to not eat resources just because I logged in. Some of these things are even more pointless. HP’s printer drivers now install an additional little tray icon utility, and I don’t see the value add – everything it gives you access to is available elsewhere, as far as I can figure out. But can I stop it from loading at startup? Is some critical service wrapped up in that resource theft? Do I really need some WinCinema manager in the taskbar tray making login take an extra 10 seconds? And just why is it so hard to keep Microsoft Messenger from starting up?
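The triage I end up doing by hand – deciding which startup entries earn their keep and which could be started lazily – could be sketched in a few lines. This is purely illustrative: the entry names and the allowlist are made up, and on a real Windows machine the entries would come from places like the `Run` registry keys or the Startup folder rather than a hard-coded list.

```python
# Entries that genuinely earn a place at login (sync tools, security software).
# This allowlist is a hypothetical example, not an authoritative list.
ALLOWLIST = {"HotSync Manager", "AntiVirus", "Microsoft AntiSpyware"}

def audit_startup(entries):
    """Return the entries that don't need to run at login and could be lazily started."""
    return [name for name in entries if name not in ALLOWLIST]

# Hypothetical snapshot of what a shovelware-laden machine might auto-start.
startup = ["HotSync Manager", "QuickTime Task", "Java Update Scheduler",
           "AntiVirus", "WinCinemaMgr", "HP Tray Utility"]

suspects = audit_startup(startup)
print(suspects)
# → ['QuickTime Task', 'Java Update Scheduler', 'WinCinemaMgr', 'HP Tray Utility']
```

The point isn’t the code, of course – it’s that the decision rule is simple, and the software vendors could apply it themselves before grabbing a slice of every login.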

Of course, there’s also the Start menu pollution and disk space munching that happens. I needed to burn a CD from an .iso image on my Media Center the other day, and the program I would normally use (CDRWin) doesn’t like that drive. So, I’d heard about Nero, decided to give that a try. First, the download was 100MB – crazy. Clearly, there’s other junk in there. When I installed the trial, it installed all sorts of shovelware I didn’t want (now, I don’t know if what they installed was good stuff or not – it was just useless *for me*, and that’s the key). I just wanted the burning component. But because I couldn’t just get and buy the burning component, I gave up, uninstalled Nero, and searched a little harder, finding CD Burner XP Pro. Not only was it free, it worked and didn’t come with any shovelware. Nice.

Heck, I think shovelware is one of the things that hurt RealPlayer so much. Once they started bundling all the shovelware with the player, I gave up on it. Who needs all that garbage – some of which verges on adware – when there are ways of getting media content without it?

Then there’s the shovelware that comes with a new system. Both Dell and HP are bad about this (I’m sure others are as well, I just don’t have direct experience with them). It’s why many of us have a policy of wiping the drive and installing from a *real* Windows XP install CD right after getting a new machine, and why I think most of those “restore” disks are practically useless. My policy is to never buy a machine that doesn’t come with a real Windows install CD – a policy I violated with my Media Center PC. Of course, I ended up regretting it when I restored the machine to try to stabilize it, only to find that all the shovelware HP was installing meant the system didn’t start out terribly stable in the first place. Yuck. Wouldn’t it be nice if a system just coming out of the box was actually *clean*, and not the mucked-up, dirty, registry-scrambled mess that most manufacturers seem to think passes for acceptable these days?

Of course, there’s also the case of software that starts out useful and devolves. I like my MusicMatch radio, but over the past two years, the darned thing has gone from eating 10% of my machine’s CPU to just about half. I don’t know what you do to waste that sort of time, but that’s getting near the limit of tolerability – that is software on the verge of becoming shovelware for me (though I really like having music playing while I work, so I can tolerate quite a bit). MP3 players tend to skirt this edge more than other programs, it seems, grabbing memory and keystrokes and otherwise doing things that get in the way.

Now, I’m sure that any and all of this stuff may be useful to someone, somewhere. But it isn’t to me, and I don’t want to have to go digging around afterward shutting it off. I know conventional wisdom says that modern machines have enough power and memory for all this stuff, but there is just so much of this junk that it’s death by a thousand paper cuts. Every percentage of CPU actually counts – it could make the difference between a clean and a glitchy video capture. And every chunk of RAM that is forcefully kept alive counts – when that blue line (free RAM) in my Performance Monitor hits 10MB, things are going to come to a crawl (I always have Performance Monitor running on Windows and X Resource Graph running on Mac, though Activity Monitor is OK in a pinch).
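The thousand-paper-cuts arithmetic is easy to underestimate, so here’s a back-of-the-envelope sketch. The per-process figures are made-up assumptions for illustration, not measurements – the point is just how a handful of small residents adds up before you’ve launched a single program you care about:

```python
# Each tuple: (name, steady-state CPU %, resident RAM in MB).
# All numbers below are illustrative assumptions, not real measurements.
background = [
    ("tray utility",    2.0,  8),
    ("update checker",  1.5, 12),
    ("media manager",   3.0, 20),
    ("quick launcher",  1.0, 15),
    ("sync agent",      2.5, 10),
]

cpu_total = sum(cpu for _, cpu, _ in background)
ram_total = sum(ram for _, _, ram in background)

print(f"{cpu_total:.1f}% CPU and {ram_total} MB RAM gone before any real work starts")
# → 10.0% CPU and 65 MB RAM gone before any real work starts
```

Ten percent of the CPU and 65MB of RAM for five little utilities – and a factory-fresh machine from a big-name manufacturer often ships with two or three times that many.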

I suppose the worst part is that it’s just so darned hard to keep a machine clean. Yeah, I go into Add/Remove Programs regularly and clean up old junk – but sometimes things are bundled in ways that mean you can’t get rid of the bad without getting rid of the good. And even if you could, you still have to have a good registry cleaner, because so many of the uninstallers are half-hearted efforts. And you shouldn’t really have to pull open the Program Files directory to go cleaning up disk files left behind, but you often do. It used to be that you’d have to completely re-install Windows 98 every so often because the system itself became unstable. Now, while Windows XP itself is stable, it’s getting to the point where you have to do a re-install just to really get rid of all the vestiges of all that shovelware.



The Sad State of Hardware

I (vaguely) remember when I was a young, naive programmer, and I assumed that compilers could essentially do no wrong. Gosh, if something was broken or didn’t work right, it _must_ be my fault. Of course, that assumption didn’t last long. I remember when I first realized that my program was compiling differently based on whether and where comments appeared – it must have been some bug in the compiler’s parser, who knows? Once that line was crossed, though, I began to get a sense of which problems might be mine and which might actually be the compiler, when I should pull up the generated code to look for compiler faults – and how to work around them. And I’d get a sense of which compilers were the worst offenders. Heck, there was one compiler which seemed to have the development philosophy of shoving in every optimization they could think of with a major version release, then backing them out in patches as the code generation bug reports came in.

Hardware was different. Because hardware could always break, you’d always be on the lookout for it to go funny. From loose cables, to unseated cards in slots, to overheating problems. You’d expect something to eventually go bad because it lived in the physical world and eventually everything dies. But you didn’t expect design flaws – at least not flakey ones that affected specific software packages. If something was busted, it was pretty consistently busted. And there wasn’t nearly the variety of hardware that you could plug together – not to mention that tolerances for dodgy parts were looser.

Things have changed, and not for the better. It was always the case that chips might have errata – but they’d usually be covered up in the microcode or in the BIOS before things shipped. Devices would have issues, but the driver would be tweaked before the device was shipped to compensate. In general, you could buy a box, and have reasonable confidence it was put together reasonably and would do what you wanted. Sure, there were the manufacturers who had the (very) annoying habit of putting in cheesed-up and dumbed down OEM versions of certain boards, and it would be hard to get driver updates for them. You just learned to avoid them.

Then there was that first interesting case of a design flaw on the motherboard. An undersized capacitor specification on a reference motherboard design would allow the system memory bus to go undervolt if the bus was saturated for more than a certain length of time. Guess which program was best at managing to do that? I think for a lot of the users affected by that problem it was the first time they had encountered real hardware issues – and much like my early experience with compilers, they were hard pressed to believe that it was a hardware issue. Heck, isn’t it always the fault of the software? There had always been bad-RAM issues that Photoshop seemed to be the only piece of software able to tickle, but this was the first really widespread hardware issue. And worse, the only fix was a physical motherboard swap.

Now, of course, it’s even worse. With the internet being an accepted method of delivering drivers and BIOS updates and whatnot, I think most PC manufacturers have gotten lazy. Things no longer necessarily go out the door in a working state. Heck, the Media Center PC I have at home couldn’t burn DVDs for the first 10 months I owned it, until a BIOS update and a driver firmware update (and a re-image).

Don’t get me wrong – I think the internet updates are great for letting slightly older hardware adapt to new things without having to go replacing it all. Much better to just flash your WiFi card to add better security than it is to replace it.

But I think we’ve hit a point where things are just abusive for the average consumer. We now see RAM issues so frequently that they’re part of the FAQ list. Video drivers seem to be perpetually broken in some way or another. I simply don’t feel comfortable recommending machines from any name-brand PC makers anymore, because they all seem to have serious weak points in their product lines. Now, one could argue that price pressures are at least partly to blame for this – if people didn’t try to save every last dollar on their purchases, manufacturers could use more reliable and better-tested parts.

I just nailed down an issue today having to do with Hyperthreading and memory allocation. Yes, a BIOS update solved it (as did turning off hyperthreading). But should the average Photoshop user really have to know how to figure out the motherboard manufacturer, find the BIOS update and apply it? Or turn off Hyperthreading? And why wasn’t the BIOS update pushed out over Windows Update for such a serious issue? To the user, it just looked like Photoshop was freezing up in the middle of painting or other operations.

And, of course, it hits Photoshop more than other applications. It’s the nature of the beast – the data sets we’re dealing with are very large for an end-user application, and we move them around a lot faster than other applications do. It’ll expose marginal hardware more often than the best system diagnostic software.

Some would argue that we should make Photoshop tolerant of bad hardware. On this I have to disagree. We’re talking about penalizing all users because some people bought dicey systems or cheap RAM. For those who get bad hardware, it sucks – but the right place to take that up is with the hardware manufacturer (yes, I know such things generally fall on deaf ears). But until they get some pushback, they’re going to continue to put out flakey stuff, crammed with shovelware, that’ll manage to run your MP3 player or browser, but gives up the ghost when trying to capture video or touch up your pictures. Or, let’s push the PC magazines to put together some real stress tests, and rate the hardware vendors on long-term stability – knowing which machine is fastest at Quake 4 is useless if it reboots after 9 hours of heavy use because of a thermal issue. I’d say start buying Macs, but things don’t seem to be too much better there, either. I think the hardware is (generally) in better shape, but I think the OS could use a bit more bug fixing.

Until then, just realize that marginal hardware can affect software, especially those programs which try and get the best performance out of your machine. Users shouldn’t have to become hardware diagnosticians just to remove red eye from their kids’ pictures, but that’s where we’re at.

Sucks but true.