Posts in Category "Hardware"

Reboot: New Habit, New Machine

Ok, time to get this thing restarted.  I just need to make this a weekly habit.  I’ve been better at forming good, new habits recently – a Google search turns up varying answers for how long that takes.  I think it just takes conscious effort until it doesn’t.

So, there’s an interesting engineer-ism that I’ve seen be true more often than not: software will run best on whatever is on the engineer’s desk.  Well, here at work, for my Windows box, I’ve now been running Vista since it came out, and the 64-bit version of Vista for over a year.  (I spend equal time on my Macs, so don’t get any crazy ideas).  The machine here at work has two quad-cores in it, which is great for compiling, but is kind of overkill for Photoshop.  It’s got 8GB RAM in it, which isn’t overkill anymore.

Knowing that engineer-ism, I let it inform me when I recently bought a new machine for home.  I bet I’m like a lot of you when it comes to such things – I like to look for the best value I can get, including enough speed headroom that the machine will last a good, long time.

The machine I got is a workstation-class (nothing high-end, mind you, but not consumer-level either) refurbished machine.  It’s got a quad-core processor in it – not the fastest, but not far from it – 4GB of RAM, and 3 hard drives (I couldn’t quite justify a NAS RAID setup at home… yet.  The machine came with 1 drive; I just moved the biggest, most recently purchased drives over from the old machine using a couple of SATA-IDE adapters).  I don’t do any 3D work, and most of the time I use the machine remotely from downstairs, so the onboard video was good enough.  That’s always easy enough to upgrade later if need be.  I made sure that the motherboard supported 8GB of RAM, so I’m not hitting a wall there, either.

The machine came from a company which was proud of their "XP downgrade".  Blech.  I suspect it’s just because that’s what’s on most of their engineers’ desks.  Sorry, this geek likes Vista.  I like the faster boot, the more aggressive caching (unused RAM is wasted RAM), the fixed video driver model, and rock-solid stability.  Yes, everything got moved around from XP, and some of the moves still don’t make sense to me.  But things got moved around between NT and Win2k too, and I lived through that.

So, I looked through the pre-installed, uhm, software on the machine and didn’t see anything worth caring about.  I then went through my normal process for re-imaging a machine: bring up Add/Remove Programs on the old machine (yes, the renaming of that item in Vista still annoys me – it’s harder to find in the middle of the Control Panel menu, which I still always set up to be a big flat menu off the Start button list), walk through the programs list, and write down the applications I remember being useful.  Then I went and found updated drivers for the new machine (before re-installing the OS) and got them onto a USB stick.  And finally, I did a full, fresh install of the 64-bit version of Vista Ultimate.  Ah.  Much better.

Is moving to a 64-bit version of Vista for everybody?  My advice is the same as it’s always been: check for driver availability and software compatibility before leaping.  My personal perception is that driver availability for the 64-bit versions of Vista has improved dramatically over the summer, and I haven’t had an issue with the recent versions of any of the software I use.  Your mileage may vary.  My 3-year-old laptop is still running XP precisely because of those caveats.

So, there you have it.  This engineer, on his main Windows machines, is running 64-bit editions of Vista.  Take it for what you will.

64 bits…when?

I’ve gotten a number of questions on the beta forums as to why Photoshop CS3 won’t have a 64-bit version.  It’s definitely a when question, not an if, and there are a lot of factors involved.  I thought I might collect some of the information together here.

First, let’s check all the 64-bit hype at the door.  Being a 64-bit program means, most simply, that pointers in an application are 64 bits wide, not 32 bits.  A 32-bit pointer means that an application can address 2^32 bytes of memory, or 4GB, at the most.  The operating system an application runs on slices that address space up, so the application can actually only allocate a part of it for itself.  Thus, on Windows XP an application can use 2GB of address space; on Macintosh OS X 10.4 an application gets 3GB; and on Windows XP 64-bit edition, a 32-bit application gets nearly 4GB of address space.  Applications don’t allocate RAM on most modern operating systems – that’s a common misconception and a gross oversimplification your computer geek friends tell you because they don’t want to explain virtual memory, TLBs and the like.
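
To make the arithmetic concrete, here’s a tiny C++ sketch – purely an illustration, nothing to do with actual Photoshop code – that prints the pointer width it was compiled for and the 4GB ceiling that 32-bit pointers imply:

    #include <cstdio>

    int main() {
        // The width of a pointer determines how much memory an
        // application can even address.
        std::printf("pointer size: %zu bits\n", sizeof(void*) * 8);

        // 2^32 bytes is 4GB; the operating system then hands a 32-bit
        // process only a slice of that (2GB on 32-bit Windows XP, about
        // 3GB on OS X 10.4, nearly 4GB for a 32-bit process on 64-bit
        // Windows).
        unsigned long long bytes32 = 1ULL << 32;
        std::printf("32-bit address space: %llu bytes (%llu GB)\n",
                    bytes32, bytes32 >> 30);
        return 0;
    }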

A 64-bit application doesn’t have that same limit on its address space, since pointers are 64 bits – they can address a much larger amount of memory.  That’s pretty much it.  64-bit applications don’t magically get faster access to memory, or any of the other key things that would help most applications perform better.  Yes, the x64 instruction set has some more registers available, and that helps in limited circumstances, but the processing throughput of a memory-bandwidth-bound application is pretty much not going to benefit from being a 64-bit application.  In fact, it gets worse in many cases, because the internal data structures the application is dealing with get bigger (since many data structures contain pointers, and pointers in a 64-bit application are twice as big as in a 32-bit application).  Memory bandwidth has not kept up with processor speeds, and has become a precious resource.
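
To put a rough number on that data structure growth, here’s a toy example – TileNode is completely made up, not anything out of Photoshop – showing how the same fields get more expensive in a 64-bit build just because the pointers double in size:

    #include <cstdio>

    // A made-up tile list node, roughly the shape of thing an image
    // editor keeps lots of.  Only the pointers change width between
    // 32-bit and 64-bit builds, but that grows the node from about 20
    // bytes to 32 (padding included).
    struct TileNode {
        TileNode*      next;    // 4 bytes in a 32-bit build, 8 in 64-bit
        TileNode*      prev;
        unsigned char* pixels;
        int            width;   // 4 bytes either way
        int            height;
    };

    int main() {
        // More bytes per node means more memory traffic for the same
        // data -- exactly the wrong direction for a bandwidth-bound app.
        std::printf("sizeof(TileNode) = %zu bytes\n", sizeof(TileNode));
        return 0;
    }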

Let’s contrast this with the old switch from 16-bit to 32-bit computing.  During that switch, 16-bit applications were already not really true 16-bit applications (in order to address 640k of memory, there was a really ugly hack that meant pointers weren’t really pointers and weren’t really 16 bits) – so the data structures an application was dealing with pretty much didn’t change in size at all.  Even where they did, memory bandwidth at that point in history was high enough relative to processor performance that the size increases were easily absorbed and didn’t affect performance significantly.  Not only that, but for many operations, moving to 32-bit computing meant a lot of fixed-point math could be done in a lot fewer instructions, and a lot faster.  For many applications this was a huge win – and one of the reasons why the switch to 32-bit computing happened fast relative to the 64-bit switch.  These days, many 32-bit processors already have math instructions for doing 64-bit fixed-point math to the degree most applications need it, and many things are done in floating point.  Combine the performance penalty for the data structure size increase with the lack of any other performance win, and the number of situations in which being a 64-bit application is a performance win is very small.
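
For the curious, here’s what I mean about fixed-point math – a generic 16.16 fixed-point multiply, nothing Photoshop-specific.  A 32-bit compiler already turns the 64-bit intermediate into a couple of instructions, so there’s no arithmetic windfall waiting in a 64-bit rebuild the way there was going from 16 bits to 32:

    #include <cstdint>
    #include <cstdio>

    // 16.16 fixed-point multiply: the 64-bit intermediate is cheap even
    // on a 32-bit processor, since the compiler emits the few extra
    // instructions needed for it.
    static int32_t fixed_mul(int32_t a, int32_t b) {
        return static_cast<int32_t>((static_cast<int64_t>(a) * b) >> 16);
    }

    int main() {
        int32_t half  = 1 << 15;   // 0.5 in 16.16
        int32_t three = 3 << 16;   // 3.0 in 16.16
        std::printf("0.5 * 3.0 = %g\n", fixed_mul(half, three) / 65536.0);
        return 0;
    }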

And, in the cases where it would be a win – dealing with a very large data set (Photoshop files over 1GB are a good guess as to about where things get interesting) – having the operating system use any unused RAM to cache scratch disk reads and writes can be nearly as effective (depending on how good the operating system is at this – and there are even a few add-on products which can make it even better) as having Photoshop itself hold all of scratch within its own address space.  (Side note: a computer system is most efficient when it has nearly no free RAM – there really is no point in having RAM sit empty.  An aggressive file cache is one good way of accomplishing this.  Vista is supposed to be really good at this, but I don’t have enough direct experience yet to know for sure.)

Ok, so in a very limited set of cases there might be a little bit of a win for Photoshop to be a 64-bit application.  But…

A 64-bit application can’t be run on a 32-bit chip or a 32-bit operating system.  That means a 64-bit version of Photoshop would have to be a completely separate binary from the 32-bit version.  A separate binary has a huge cost associated with it (in terms of features we can’t do because we have to spend resources elsewhere).  Quality assurance is a big part of that – you essentially have to dive in and test every corner of the app for each binary, across the matrix of supported operating systems – but there are also many ancillary pieces that add to that cost as well.  And given that a Universal Binary application really is two separate binaries smashed together (and accounting for the kinds of issues an application can have going from big endian (PPC) to little endian), we already had a lot of extra cost going through this cycle.  Add the cost of a 64-bit version to the mix of things that were already on the have-to-do list – especially in light of the very limited benefits – and doing a 64-bit version this cycle really became an unreasonable thing to think about.
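
To give a flavor of what the endian part of that cost looks like, here’s a generic sketch – not actual Photoshop code – of the kind of byte-order-safe read that has to be found and audited all over an application when moving from big-endian PPC to little-endian Intel:

    #include <cstdint>
    #include <cstdio>

    // Reading a 32-bit value out of a big-endian file byte by byte works
    // the same on PPC and Intel; blindly copying raw bytes into an
    // integer does not.
    static uint32_t read_be32(const unsigned char* p) {
        return (static_cast<uint32_t>(p[0]) << 24) |
               (static_cast<uint32_t>(p[1]) << 16) |
               (static_cast<uint32_t>(p[2]) << 8)  |
                static_cast<uint32_t>(p[3]);
    }

    int main() {
        const unsigned char bytes[4] = { 0x00, 0x00, 0x01, 0x00 };
        // Prints 256 regardless of the host's byte order.
        std::printf("value = %u\n", static_cast<unsigned>(read_be32(bytes)));
        return 0;
    }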

There’s more.  Since a 64-bit application can only run on a 64-bit chip with a 64-bit operating system, just how many users could run a 64-bit version in the first place?  Macintosh OS X 10.4 isn’t fully 64-bit – many of the libraries an application would need to be fully 64-bit aren’t available.  So, on the Macintosh side the answer is zero.  Ok, so consider the Windows side: Windows XP 64-bit edition has been out for a while.  How many Photoshop customers run it?  A vanishingly small percentage – there’s been a lack of compatible drivers, especially for the kinds of things that make up a Photoshop workflow, like scanner drivers, monitor calibration device drivers, and the like.  Things are looking better for Vista – it comes with a wider spread of 64-bit drivers out of the box – but there are still 64-bit versions of some pieces of the puzzle missing there, and the expected Vista adoption rates mean that the number of Photoshop customers running the 64-bit version of Vista will remain very small over the next couple of years.  Yes, there’s the whole look-to-the-future argument, but even considering that, the timing just isn’t right yet.

As for the engineering part of it, well, I want to do it.  I want the transition to 64-bit computing to happen sooner rather than later – I’d really like to not have to worry about address space limitations so much.  Once, long ago, I ported Display PostScript to the DEC Alpha workstations (what a nice processor, sigh…), so I have a good idea of what the level of engineering work will be.

At some point, some of these things will change – certainly the number of systems capable of running a 64-bit version of Photoshop will – and at some point it will make sense to do a 64-bit version.  That wasn’t this time around.  But like I said, it’s a when, not an if.

The Sad State of Hardware

I (vaguely) remember when I was a young, naive programmer, and I assumed that compilers could essentially do no wrong. Gosh, if something was broken or didn’t work right, it _must_ be my fault. Of course, that assumption didn’t last long. I remember when I first realized that my program was compiling differently based on whether and where comments appeared – must have been some bug in the compiler’s parser, who knows? Once that line was crossed, though, I began to get a sense of which problems might be mine, which might actually be the compiler, and when I should pull up the generated code to look for compiler faults – and figure out how to work around them. And I’d get a sense of which compilers were the worst offenders. Heck, there was one compiler which seemed to have the development philosophy of shoving in every optimization they could think of with a major version release, then backing them out in patches as the code generation bug reports came in.

Hardware was different. Because hardware could always break, you’d always be on the lookout for it going funny: loose cables, unseated cards in slots, overheating problems. You’d expect something to eventually go bad because it lived in the physical world, and eventually everything dies. But you didn’t expect design flaws – at least not flakey ones that affected specific software packages. If something was busted, it was pretty consistently busted. And there wasn’t nearly the variety of hardware that you could plug together – not to mention that tolerances for dodgy parts were looser.

Things have changed, and not for the better. It was always the case that chips might have errata – but they’d usually be covered up in the microcode or in the BIOS before things shipped. Devices would have issues, but the driver would be tweaked to compensate before the device shipped. In general, you could buy a box and have reasonable confidence it was put together well and would do what you wanted. Sure, there were the manufacturers who had the (very) annoying habit of putting in cheesed-up, dumbed-down OEM versions of certain boards, and it would be hard to get driver updates for them. You just learned to avoid them.

Then there was that first interesting case of a design flaw on a motherboard. An undersized capacitor specification on a reference motherboard design would allow the system memory bus to go undervolt if the bus was saturated for more than a certain length of time. Guess which program was best at managing to do that? I think for a lot of the users affected by that problem, it was the first time they had encountered a real hardware issue – and much like my early experience with compilers, they were hard pressed to believe that it was a hardware issue. Heck, isn’t it always the fault of the software? There had always been bad RAM issues that Photoshop seemed to be the only piece of software able to tickle, but this was the first really widespread hardware issue. And worse, the only fix was a physical motherboard swap.

Now, of course, it’s even worse. With the internet being an accepted method of delivering drivers and BIOS updates and whatnot, I think most PC manufacturers have gotten lazy. Things no longer necessarily go out the door in a working state. Heck, the Media Center PC I have at home couldn’t burn DVDs for the first 10 months I owned it, until a BIOS update and a driver firmware update (and a re-image).

Don’t get me wrong – I think the internet updates are great for letting slightly older hardware adapt to new things without having to go replacing it all. Much better to just flash your WiFi card to add better security than it is to replace it.

But I think we’ve hit a point where things are just abusive for the average consumer. We now see RAM issues so frequently that they’re part of the FAQ list. Video drivers seem to be perpetually broken in some way or another. I simply don’t feel comfortable recommending machines from any of the name-brand PC makers anymore, because they all seem to have serious weak points in their product lines. Now, one could argue that price pressures are at least partly to blame for this – if people didn’t try to save every last dollar on their purchases, the manufacturers could use more reliable and better-tested parts.

I just nailed down an issue today having to do with Hyperthreading and memory allocation. Yes, a BIOS update solved it (as did turning off hyperthreading). But should the average Photoshop user really have to know how to figure out the motherboard manufacturer, find the BIOS update and apply it? Or turn off Hyperthreading? And why wasn’t the BIOS update pushed out over Windows Update for such a serious issue? To the user, it just looked like Photoshop was freezing up in the middle of painting or other operations.

And, of course, it hits Photoshop more than other applications. It’s the nature of the beast – the data sets we’re dealing with are very large for an end-user application, and we move them around a lot faster than other applications do. It’ll expose marginal hardware more often than the best system diagnostic software will.

Some would argue that we should make Photoshop tolerant of bad hardware. On this I have to disagree. We’re talking about penalizing all users because some people bought dicey systems or cheap RAM. For those who get bad hardware, it sucks – but the right place to take that up is with the hardware manufacturer (yes, I know such complaints generally fall on deaf ears). Until they get some pushback, they’re going to continue to put out flakey stuff, crammed with shovelware, that’ll manage to run your MP3 player or browser but gives up the ghost when trying to capture video or touch up your pictures. Or, let’s push the PC magazines to put together some real stress tests and rate the hardware vendors on long-term stability – knowing which machine is fastest at Quake 4 is useless if it reboots after 9 hours of heavy use because of a thermal issue. I’d say start buying Macs, but things don’t seem to be too much better there, either. I think the hardware is (generally) in better shape, but I think the OS could use a bit more bug fixing.

Until then, just realize that marginal hardware can affect software, especially those programs which try and get the best performance out of your machine. Users shouldn’t have to become hardware diagnosticians just to remove red eye from their kids’ pictures, but that’s where we’re at.

Sucks but true.

Saluting General Specific (Canon 20Da)

So, I find this camera fascinating. Not for its specific features, but more for what it represents and means. Here is a camera, based on a popular DSLR, that just by tweaking a few things could target the fairly specific needs of a not-very-large target market. Now, if you read the reviews of the camera on the general photography sites, you might not understand how neat this camera is. But if you know someone who does astrophotography as a hobby (like my father), you would know different.

The review here can be helpful in understanding part of why this camera is so interesting. Able to take images nearly as good as sensors costing four times as much, it’s quite a bargain. But that’s only the beginning. Because, you see, with those other sensors, you have to have a tethered laptop with you. Not only that, the images are stored in a proprietary file format that comes with some, well, annoying software that you have to use. I actually tried to help my father with some processing one evening and was tearing my hair out by the end of it. (Though, if you’re lucky, you can get the files into FITS format, in which case the FITS Liberator is your friend.) Now, if you are picky enough to know that you just don’t want to deal with Bayer filter geometrical artifacts, those more expensive solutions are probably still for you. But for most, it is a hobby, and like most hobbies, you never get to spend enough time at it – anything like the 20Da that saves you that much time and lets you have more fun is worth it.

What is interesting to me is that a large company such as Canon would not only stumble into having a general camera like the 20D with some nice properties for this hobby, but would actually listen to their customers and add the incremental improvements needed to create the market-specific 20Da – and then listen a second time and make the camera available worldwide after a Japan-only initial release.

I think that’s kinda cool.