Author Archive: Scott Byer

Reboot: New Habit, New Machine

Ok, time to get this thing restarted. I just need to make this a weekly habit. I've been better at forming good new habits lately – a Google search turns up varying answers for how long that takes. I think it just takes conscious effort until it doesn't.

So, there's an interesting engineer-ism that I've seen hold true more often than not: software will run best on whatever is on the engineer's desk. Well, here at work, on my Windows box, I've been running Vista since it came out, and the 64-bit version of Vista for over a year. (I spend equal time on my Macs, so don't get any crazy ideas.) The machine here at work has two quad-cores in it, which is great for compiling, but is kind of overkill for Photoshop. It's got 8GB of RAM, which isn't overkill anymore.

Knowing that engineer-ism, I let it inform me when I recently bought a new machine for home. I bet I'm like a lot of you when it comes to such things – I look for the best value I can get, including enough speed that the machine will last a good, long time.

The machine I got is a workstation-class (nothing high-end, mind you, but not consumer-level either) refurbished machine. It's got a quad-core processor in it – not the fastest, but not far from it – 4GB of RAM, and 3 hard drives (I couldn't quite justify a NAS RAID setup at home… yet. The machine came with one drive; I just moved the biggest, most recently purchased drives over from the old machine using a couple of SATA-IDE adapters). I don't do any 3D work, and most of the time I use the machine remotely from downstairs, so the onboard video was good enough. That's always easy enough to upgrade later if need be. I made sure the motherboard supported 8GB of RAM, so I'm not hitting a wall there, either.

The machine came from a company that was proud of its "XP downgrade". Blech. I suspect that's just because XP is what's on most of their engineers' desks. Sorry, this geek likes Vista. I like the faster boot, the more aggressive caching (unused RAM is wasted RAM), the fixed video driver model, and rock-solid stability. Yes, everything got moved around from XP, and some of the moves still don't make sense to me. But things got moved around between NT and Win2k too, and I lived through that.

So, I looked through the pre-installed, uhm, software on the machine and didn't see anything worth caring about. I then went through my normal process for re-imaging a machine – bring up Add/Remove Programs on the old machine (yes, the renaming of that item in Vista still annoys me – it's harder to find in the middle of the Control Panel menu, which I still always set up as a big flat menu off the Start button list), walk through the programs list, and write down the applications I remember being useful. Then I went and found updated drivers for the new machine (before re-installing the OS) and got them on a USB stick. And finally, I did a full, fresh install of the 64-bit version of Vista Ultimate. Ah. Much better.

Is moving to a 64-bit version of Vista for everybody? My advice is the same as it's always been: check for driver availability and software compatibility before leaping. My perception is that driver availability for the 64-bit versions of Vista has improved dramatically over the summer, and I haven't had an issue with the recent versions of any of the software I use. Your mileage may vary. My 3-year-old laptop is still running XP precisely because of those caveats.

So, there you have it.  This engineer, on his main Windows machines, is running 64-bit editions of Vista.  Take it for what you will.

Photoshop World – Heavy Lifting

Well, another Photoshop World has come and gone. This year Adam Jerugim and I did a presentation on setting up a machine for doing very large file work. Thank you again to those who attended – we were up against some interesting stuff. I know I promised to have this entry up on Monday, but I've been struggling to get PowerPoint to let me extract the information in a good way. I'll probably have to update this entry a couple more times as I figure out how to pull out what I want.

I wanted to call out a couple of things that are currently buried in the speaker notes, since I'm not sure I got them across clearly. First, setting Photoshop's memory percentage to 100% only makes sense if you've got more than 4GB of RAM in the machine, are on CS3, and haven't run into trouble running the filters you need with it set that high. With every version we've improved Photoshop's ability to back off when the machine starts to page. However, it's still important to watch free RAM (or, in the case of Vista, the amount still being used for the system file cache). It's important to watch what's going on on your system when pushing things to their limit. If you regularly see free memory (or free memory plus system cache on Vista) go below 20MB, it's time to back off that memory percentage setting and try again.
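If you're on Windows and would rather not keep one eye glued to Performance Monitor the whole session, here's a minimal sketch of that advice in code – it just polls available physical RAM every five seconds and complains when it dips under the 20MB danger zone. The 20MB figure comes from the talk; the rest (the loop, the printout) is just one way you might wire it up, not anything Photoshop itself does. Note that on Vista you'd also want to count what the system file cache is holding, which this simple poll doesn't attempt.

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* ~20MB is the "back off the memory percentage" threshold from the talk. */
    const DWORDLONG danger = 20ULL * 1024 * 1024;

    for (;;) {
        MEMORYSTATUSEX ms;
        ms.dwLength = sizeof(ms);
        if (GlobalMemoryStatusEx(&ms)) {
            printf("free physical RAM: %llu MB\n",
                   (unsigned long long)(ms.ullAvailPhys / (1024 * 1024)));
            if (ms.ullAvailPhys < danger)
                printf("  -> under 20MB free: time to back off Photoshop's memory %%\n");
        }
        Sleep(5000);  /* same 5-second cadence I use in Performance Monitor */
    }
    return 0;
}
```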

Someone asked a question at the end about RAID types, and I wanted to repeat the answer here: for the Photoshop primary scratch disk, RAID 0 with 2-6 drives is the fastest way to go, but it isn't a good place for storing files unless you're going to back them up very frequently (daily?) to a big server on the network. RAID 0+1 or RAID 5 would add reliability at some cost in performance or the need for additional drives. We still need to measure which of those is the better way to go. It'll come down to throughput.

So, here's the current version of the presentation. The animations don't come across, and I still have to go through the speaker notes, clean them up, and get a good export of those. Hopefully that will happen within the next day or so, and I'll update this post.

psworldperformancepresentation_export

[Update: I meant to get the version of the presentation with extended notes up before my travels, apologies for not getting that done.  I'll get that up and catch up on the comments when I return in October. -Scott]

[Update: Sorry this took so long, but PowerPoint and I had to come to an, um... understanding. So below is the link to the annotated version of the presentation, with all slide builds manually exploded apart so that they are actually useful. -Scott]

PSWorldPerformancePresentation_Expanded

64 bits…when?

I've gotten a number of questions on the beta forums as to why Photoshop CS3 won't have a 64-bit version. It's definitely a when question, not an if, and there are a lot of factors involved. I thought I might collect some of the information together here.

First, let's check all the 64-bit hype at the door. Being a 64-bit program means, most simply, that pointers in an application are 64 bits wide instead of 32. A 32-bit pointer means an application can address at most 2^32 bytes of memory, or 4GB. The operating system an application runs on slices that address space up, so the application can actually only use part of it for itself. Thus, on Windows XP an application can use 2GB of address space; on Macintosh OS X 10.4 an application gets 3GB; and on Windows XP 64-bit edition, a 32-bit application gets nearly 4GB of address space. Applications don't allocate RAM on most modern operating systems – that's a common misconception and a gross oversimplification your computer geek friends tell you because they don't want to explain virtual memory, TLBs and the like.
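If you want to see that arithmetic for yourself, here's a tiny, generic C program (nothing Photoshop-specific about it) that reports how wide pointers are on whatever it's compiled for and what that caps the address space at; the per-OS splits in the comment are the ones described above.

```c
#include <stdio.h>

int main(void)
{
    /* A pointer's width caps how much address space one process can use. */
    unsigned ptr_bits = (unsigned)(sizeof(void *) * 8);
    printf("pointers here are %u bits wide\n", ptr_bits);

    if (ptr_bits == 32) {
        /* 2^32 bytes = 4GB total, and the OS keeps a slice for itself, so the
           app actually sees roughly 2GB (Windows XP), 3GB (OS X 10.4), or
           nearly 4GB (a 32-bit app on 64-bit Windows). */
        printf("max addressable: %llu bytes (4GB)\n",
               (unsigned long long)1 << 32);
    } else {
        /* 2^64 is far beyond any installed RAM, so address space stops being
           the limiting factor. */
        printf("max addressable: 2^%u bytes\n", ptr_bits);
    }
    return 0;
}
```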

A 64-bit application doesn't have that same limit on its address space, since its pointers are 64 bits wide – it can address a much larger amount of memory. That's pretty much it. 64-bit applications don't magically get faster access to memory, or any of the other key things that would help most applications perform better. Yes, the x64 instruction set has more registers available, and that helps in limited circumstances, but the processing throughput of a memory-bandwidth-bound application is pretty much not going to benefit from being 64-bit. In fact, it gets worse in many cases, because the internal data structures the application is dealing with get bigger (many data structures contain pointers, and pointers in a 64-bit application are twice as big as in a 32-bit application). Memory bandwidth has not kept up with processor speeds, and has become a precious resource.
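And here's the data-structure bloat in miniature. This is a made-up node type – not Photoshop's actual tile bookkeeping – but it shows how a structure that's mostly pointers roughly doubles in size when rebuilt 64-bit, which is exactly the memory-bandwidth tax I'm talking about.

```c
#include <stdio.h>

/* A toy node with a few pointers in it, like the linked bookkeeping
   structures an image editor might keep per tile. Invented for
   illustration; not Photoshop's real data layout. */
typedef struct TileNode {
    struct TileNode *next;
    struct TileNode *prev;
    void            *pixels;
    int              dirty;
} TileNode;

int main(void)
{
    /* Typical 32-bit build: 3*4 + 4 = 16 bytes.
       Typical 64-bit build: 3*8 + 4 (+ padding) = 32 bytes.
       Same number of nodes, twice the memory traffic. */
    printf("sizeof(TileNode) = %lu bytes\n", (unsigned long)sizeof(TileNode));
    return 0;
}
```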

Let's contrast this with the old switch from 16-bit to 32-bit computing. During that switch, 16-bit applications were already not really true 16-bit applications (in order to address 640k of memory, there was a really ugly hack that meant pointers weren't really pointers and weren't really 16 bits) – so the data structures an application was dealing with pretty much didn't change in size at all. Even where they did, memory bandwidth at that point in history was high enough relative to processor performance that the increases were easily absorbed and didn't affect performance significantly. Not only that, but for many operations, moving to 32-bit computing meant a lot of fixed-point math could be done in a lot fewer instructions, and a lot faster. For many applications this was such a huge win – and one of the reasons why the switch to 32-bit computing happened fast relative to the 64-bit switch. These days, many 32-bit processors already have instructions for doing 64-bit fixed-point math to the degree most applications need it, and many things are done in floating point. Combine the performance penalty of the data structure size increase with the lack of any other performance win, and the number of situations in which being a 64-bit application is a performance win is very small.

And, in the cases where it would be a win – dealing with a very large data set (Photoshop files over 1GB is a good guess as to where things get interesting) – having the operating system use any unused RAM to cache scratch disk reads and writes can be nearly as effective (depending on how good the operating system is at this – and there are even a few add-on products which can make it even better) as having Photoshop itself hold all of scratch within its own address space. (Side note: a computer system is most efficient when it has nearly no free RAM – there really is no point in having RAM sit empty. An aggressive file cache is one good way of accomplishing this. Vista is supposed to be really good at this, but I don't have enough direct experience yet to know for sure.)

Ok, so in a very limited set of cases there might be a little bit of a win for Photoshop to be a 64-bit application.  But…

A 64-bit application can't be run on a 32-bit chip or a 32-bit operating system. That means a 64-bit version of Photoshop would have to be a completely separate binary from the 32-bit version. A separate binary has a huge cost associated with it (in terms of features we can't do because we have to spend resources elsewhere). Quality assurance is a big part of that – you essentially have to dive in and test every corner of the app for each binary, across the matrix of supported operating systems – but there are also many ancillary pieces that add to that cost as well. And given that a Universal Binary application really is two separate binaries smashed together (and accounting for the kinds of issues an application can have going from big-endian (PPC) to little-endian), we already had a lot of extra cost going through this cycle. Add the cost of a 64-bit version to the mix of things that were already on the have-to-do list – especially in light of the very limited benefits – and doing a 64-bit version this cycle really became an unreasonable thing to think about.
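As a small aside on that endian point: any code that treated file bytes (or network bytes, or shared memory) as raw multi-byte integers had to be audited when moving from big-endian PPC to little-endian Intel. The usual fix is a byte-order-independent reader like the generic sketch below – an illustration of the class of problem, not actual Photoshop code.

```c
#include <stdint.h>
#include <stdio.h>

/* Read a 32-bit big-endian value from a byte buffer, regardless of the
   CPU's own byte order. */
static uint32_t read_be32(const uint8_t *p)
{
    return ((uint32_t)p[0] << 24) | ((uint32_t)p[1] << 16) |
           ((uint32_t)p[2] << 8)  |  (uint32_t)p[3];
}

int main(void)
{
    const uint8_t bytes[4] = { 0x00, 0x00, 0x01, 0x00 };  /* 256, stored big-endian */
    printf("%u\n", (unsigned)read_be32(bytes));  /* prints 256 on PPC and Intel alike */
    return 0;
}
```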

There's more. Since a 64-bit application can only run on a 64-bit chip with a 64-bit operating system, just how many users could run a 64-bit version in the first place? Macintosh OS X 10.4 isn't fully 64-bit – many of the libraries an application would need aren't available in 64-bit form. So on the Macintosh side the answer is zero. Ok, so consider the Windows side: Windows XP 64-bit edition has been out for a while. How many Photoshop customers run it? A vanishingly small percentage – there's been a lack of compatible drivers, especially for the kinds of things that make up a Photoshop workflow, like scanner drivers, monitor calibration device drivers, and the like. Things are looking better for Vista – it comes with a wider spread of 64-bit drivers out of the box – but there are still 64-bit versions of some pieces of the puzzle missing, and the expected Vista adoption rates mean that the number of Photoshop customers running the 64-bit version of Vista will remain very small over the next couple of years. Yes, there's a whole look-to-the-future argument, but even considering that, the timing just isn't right yet.

As for the engineering part of it, well, I want to do it. I want the transition to 64-bit computing to happen sooner rather than later – I'd really like to not have to worry about address space limitations so much. Once, long ago, I ported Display PostScript to the DEC Alpha workstations (what a nice processor, sigh…), so I have a good idea of what the level of engineering work will be.

At some point, some of these things will change – certainly the number of systems capable of running a 64-bit version of Photoshop will – and at some point it will make sense to do a 64-bit version.  That wasn’t this time around.  But like I said, it’s a when, not an if.

Beta!

It’s real.  A big, public beta of Photoshop CS3.  Now, if you own CS2, you can see what’s been keeping us so busy lately.

Now, you can read all about features on other sites – I don’t think this needs to be yet another me too list of features.  And this isn’t the place for reporting bugs or asking questions – use the Labs forums for that, I’ll be there along with some of the other engineers, time allowing (hey, we have to finish this thing).

We are doing this mostly because we wanted to get the Macintosh Universal Binary of the product into your hands as soon as we could.  It really wasn’t possible to do that with an updated CS2, it really did take that much effort, and it really wasn’t ready until recently.  But then, it wouldn’t be fair to do just a beta on the Mac side and not let the Windows users along for the ride.

This is a beta, but I think it’s in pretty darned good shape.  Part of why is that as a development team we broke free of the old waterfall methodology.  If you’re a developer, visit Waterfall 2006.  I find it pretty funny, maybe that’s just me – I’ve been living the waterfall nightmare for so long, the laughter comes from a dark place…

What it means is that we've kept the bug count low the entire cycle – especially for the nastiest bugs. Now, your "favorite" long-time bugs and annoyances may or may not be fixed (yes, we do listen, I keep a list, and we get to as many as we can), and after all, this is still a beta – there will be dragons. But for something still this far from release, it's in pretty good shape – and this wasn't a special, lots-a-pain bug-fix build, but pretty much just pulling out a daily. Yeah, I know, some of you developers will be going "but we've been doing things that way for a long time". Hey, around here, a lot of the groups fall right back into waterfall at the first sign of trouble. Nasty.

A couple of notes – due to the platform rules, the Macintosh version, when running native on an Intel-based Macintosh, will not load and run PPC-only plug-ins. You'll have to run under Rosetta to access those old scanner plug-ins – or better yet, just use the scanner software in stand-alone mode. Also, brush-sized cursors aren't working when running Macintosh Intel native yet (they do work in Rosetta). With a little help from Apple, the replacement for the old framebuffer-fiddling methods is well underway, but it didn't make the beta.

The new UI exercises Windows XP video drivers a little more, so you may want to make sure those are updated, especially if you see problems. I like the new palette panes, but they do take getting used to, so give them a chance. There are legacy workspaces in there if you really need the floating palettes back. Vista support is in there, too.

As for performance, you should see startup times that are much lower than CS2's. We've really worked hard on this – there's more to go, of course; this is just a beta. Oh, and as for comparing performance between platforms now that they use the same chip and we're a Universal Binary on the Macintosh – well, I just don't think such comparisons are valid. There are so many variables involved: number of processors, system memory bandwidth, disk I/O bandwidth, OS scheduling, API performance. So if you see sites claiming that one platform or the other performs better, take it with a big grain of salt – I do.

So, if you’ve got the time and inclination (and Photoshop CS2) go ahead and grab the beta from Adobe Labs tomorrow and see what we’ve been up to.

[Edit - 12/15/06 9:30amPST - The links still aren't live on the labs pages yet. It's being worked on. - Scott]

[Edit - 12/15/06 12:15PST - They're live now. Visit the forums at Adobe Labs for more help and info.]

Vegas, baby!

Ah, Photoshop World!  A chance to see some of the wacky and wonderful ways people use Photoshop in their work.  And, of course, to listen and watch for the kinds of problems people see in their everyday work.

I’ll be there, and so will several other members of the Photoshop engineering team.  Find me if you can, say hello, wish me a happy 40th birthday (it’s only a few days early), tell me your one biggest annoyance (but keep it to one).  And if you ask, I can show you some of the pictures I took while in the Galapagos in June.

Macintosh and the Intel switch.

By now you have probably figured out that we aren’t releasing Universal Binaries of our current application versions.  If you haven’t, all you need to know is pretty explicitly spelled out here.


“But, c’mon”, I hear people saying, “Steve said it was just a recompile!”  Or, “Back during the PowerPC transition, you guys released a patch!”


Well, this time is different.  And I really wish it weren’t.  But let me tell you how…


When that original PowerPC transition was done, Apple did something clever. Very clever. The emulator that ran 68k code would recognize when it was calling out to PPC code, and would fiddle with things on the stack using the Universal Procedure calling vector. That's a lot of gobbledygook meaning that a 68k binary could call out to PPC code that could then execute at native speed. Well, for those that don't know, Photoshop has a bunch of routines all tucked away to do the real heavy lifting – the bottlenecks. Most of Photoshop's CPU time is spent in these routines. Even better, you can replace these routines using a plug-in. There's the Multiprocessor extension plug-in, which replaces some routines with ones that know how to divide work up among multiple processors. And some which use the multimedia instruction sets that are available to varying degrees on different processors. And, in the case of the PPC transition, we could replace them with PPC-native versions. With a plug-in, Photoshop could get most of the speedup of a fully native application, but – and this is the key point – without having to recompile the vast majority of the Photoshop code, along with the resulting testing hit, mounds of debugging, and everything else that would imply. Most of the gain for a fraction of the cost – it made sense to do a mid-cycle update consisting, essentially, of that plug-in.
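If you've never seen that pattern, here's a bare-bones sketch of it in C. The names and the toy "blend" routine are invented for illustration – this is not Photoshop's actual plug-in API – but the shape is the same: the app calls its hot routines through a table of function pointers, and a plug-in can install a faster implementation without the rest of the application having to be recompiled.

```c
#include <stdio.h>
#include <stddef.h>
#include <stdint.h>

/* Invented names, for illustration only. */
typedef void (*CompositeRowFn)(uint8_t *dst, const uint8_t *src, size_t n);

typedef struct Bottlenecks {
    CompositeRowFn composite_row;
    /* ...a real app would have dozens more hot routines here... */
} Bottlenecks;

/* Plain, portable fallback that always works. */
static void composite_row_generic(uint8_t *dst, const uint8_t *src, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = (uint8_t)((dst[i] + src[i]) / 2);   /* toy 50% blend */
}

static Bottlenecks g_bottlenecks = { composite_row_generic };

/* A plug-in calls this at load time to swap in its faster version
   (multiprocessor, MMX/AltiVec, or -- during the PPC transition -- native). */
void install_composite_row(CompositeRowFn fn)
{
    g_bottlenecks.composite_row = fn;
}

/* The rest of the app only ever calls through the table, so it never needs
   to know, or be recompiled to learn, which implementation is installed. */
void composite_row(uint8_t *dst, const uint8_t *src, size_t n)
{
    g_bottlenecks.composite_row(dst, src, n);
}

int main(void)
{
    uint8_t dst[4] = { 0, 64, 128, 255 };
    const uint8_t src[4] = { 255, 255, 255, 255 };
    composite_row(dst, src, 4);   /* uses whatever is currently installed */
    printf("%d %d %d %d\n", dst[0], dst[1], dst[2], dst[3]);
    return 0;
}
```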


Doing that this time around was just not possible, for a variety of reasons. What it means is that this time, there's no limited-cost option for getting most of the performance available on the platform for Photoshop in a short amount of time. In other words, no shortcuts.


That leaves doing the work for real – taking the whole application over into XCode and recompiling as a Universal Binary. And that's no small task. You see, as software has matured, so have the development environments we've used – Visual Studio and Metrowerks – and they've adapted to handle the ever-growing applications built with them: from having projects with large numbers of files that open quickly, to having compact debugging information, to having stable project formats that are text-merge-able in a source control system. These are things XCode is playing catch-up on. Now, Apple is doing an amazing job of catching up rapidly, but the truth is we don't yet have a shipping XCode in hand that handles a large application well. And switching compilers always involves more work than you would think in a codebase of this size.


Now, I’m an engineer, and I’m all for getting products out in front of customers so they can use their machines to their fullest as soon as possible, but there is just no way putting out a Universal Binary of Photoshop CS2 would make any sort of sense.  If you think about switching tool sets, with the resulting huge amount of work for both engineering and quality engineering, if you think about how far past the Photoshop CS2 release we already are, and if you include not having the workstation-class machines ready yet, I think you’d have to agree – far better to focus on making sure Photoshop CS3 is able to absolutely squeeze every ounce of power out of what I’m sure will be pretty spankin’ Intel-based towers by that point than to do tons of work moving an old code base to new tools.


-Scott

Reap what you Measure

You reap what you measure. Measure twice, cut once. Don't tweak without measuring.


So you want Photoshop to perform faster?  You’re gonna want to measure.  But measure what?


There are many things on a system that can go wrong.  You can run out of RAM, a disk can be going bad, you can be bottlenecking on that disk, and you could, in fact, simply be trying to do too much at once.  Unless you watch most of this at once, you’re not going to know what could be going wrong.


Fortunately, both the Macintosh and Windows systems have tools available – built-in or easily available – which measure most of what you need to know.  On Windows, I use the built-in Performance Monitor.  On the Macintosh, there’s Activity Monitor, but I personally prefer a great little application called X Resource Graph, because I like the visual history of the strip chart format, and it has most of the indicators I like to use. 


On the Windows side, I have a standard set of things I like to map on each machine. I use a consistent color scheme, so that I can go from machine to machine, glance at Performance Monitor, and know what I'm looking at – if you only have one machine, this may not be an issue for you. I choose blue for memory, red for processor (and green for a dual-processor system), yellow for networking items, and various dark colors for disk items, with a dashed style. I map both Available KBytes and Available MBytes so I can play the dual-sensitivity trick – I adjust the scaling on Available MBytes so that the blue line sits somewhere in the middle of the strip window, picking a scale so that the line ends up somewhere between 10 and 100 (on a machine with 1GB, you'll end up with it in the upper half using a scale of 0.1; on a machine with 2GB to 5GB, you'll end up with it in the bottom half using 0.01), then I adjust the scaling on Available KBytes to match, setting its scale to 0.001. What this does is give you a gross-level idea of free memory on the system, and when memory gets below 100MB free (the danger zone), you get a detailed view. For each physical disk, I like to add % Disk Time; I set the line style to dashed and a dark color, for no particular reason beyond that's the way I've always done it. And of course, mapping % Processor Time to a big bright red makes it obvious. I then hit the General tab and turn off all the Display Elements. You may need the legend turned on until you've learned what all the colors are, but I'm used to this setup by now, and it lets me have a really uncluttered window with no wasted real estate. While I'm there, I set the update interval to something useful like 5 seconds. Yeah, it's not as much fun to watch that way, but you usually get a better sense of trends over time.


Once I've done that, I go back into the Performance Monitor part and use View / Customize View to turn off everything. I then go into File / Options and set Console mode to User Mode, limited access, single window. I then use File / Save As to save out perfmon.msc, putting it into my Documents and Settings\<USERID>\Start Menu\Programs\Startup folder, so that it loads up when I log in. Yeah, you could save it somewhere else and open it when you need it, but I find that having it up all the time is a big advantage. I then close Performance Monitor and select perfmon.msc from the Start Menu, All Programs / Startup (you'll see it there if you saved it there in the previous step). That gets you a nice, clean, compact monitor window that you won't mind always having up and that you can glance at (when it's not buried behind the Photoshop main window – it's a great window to go along with all the palettes on your new second monitor, where you can see what's going on).


On the Macintosh side, things are a little easier. You can go with the standard Activity Monitor, in which case I'll leave it up to you to figure out where to watch, or you can install the excellent X Resource Graph, which will let you quickly map all the items you need, and again has a nice strip chart format that gets you that sense of history that can often help figure out what's going on. I know some of the folks around here like some of the other monitor programs, but when I glance over to see what's going on, I don't need my brain trying to interpret fancy colored circles or anything. The key thing here, again, is turning on enough items in the preferences and setting up a color scheme you like. Again, I set this as one of my login items. I really like X Resource Graph's ability to stalactite/stalagmite some of the graphs, but I really wish I could have a strip graph of the memory items instead of the paging graph (well, in addition to it, really). And as on Windows, I like to slow the refresh down to once every five seconds, to get a reasonable amount of history into those strip charts. There are color schemes you can download which aren't so colorful and won't be so intrusive on your image editing, and I usually have mine set up a little more subdued than this. And again, it's a great thing to tuck over on that second monitor.


On both systems, the question becomes, just what are you looking for?


Well, let's start with memory. Photoshop uses a lot of memory. And it has its own virtual memory system, because it has to deal with data sets far larger than the address space allowed to 32-bit applications (and we can't just do a 64-bit version, and it's still not clear if it would be worth it), plus additional reasons I'll cover some other time. What it means is that Photoshop is going to use the memory percentage of free RAM you've told it to, and try to live within that amount. What's key here is that while it's living within that amount, you really, really don't want the amount of system free memory to get below 20MB. What happens if you get somewhere below that amount of free RAM (the exact amount shifts around between the operating systems, and there are other variables – 20MB seems to be a reasonable safe zone on both platforms) is that the operating system starts trying to get free RAM back by paging things out. Well, given that Photoshop at that point is probably occupying most of the RAM the operating system wants to reclaim, it's highly likely that parts of Photoshop's memory will get paged out. When Photoshop then goes to operate on the part of the image that was in the memory that got paged out, it has to wait for it to get paged back in. Even worse, if the system paging file and Photoshop's scratch file are on the same physical drive (meaning you only have one set of drive heads), what will often happen is that Photoshop wants to read a new tile into memory from its scratch files, but the memory it's trying to read that into has been paged out – so a little gets paged in (operating systems have annoyingly small page sizes), Photoshop continues trying to read in its tile, a little gets read in, then more of the place it's trying to read it into needs to get paged back in, then a little more gets read, then… Well, you get the idea. When both Photoshop scratch and the paging file are on the same physical disk, each of those little flips between reading the scratch file and paging in the memory to read it into forces the drive heads to seek to that part of the disk. Photoshop will now be running about the slowest it could possibly run on your machine. So, let me repeat myself: you really, really don't want free RAM to go below 20MB. Complicating matters are the filters that like to allocate a single chunk of memory to hold the part of the image that's going to be operated on. You'll know these filters because memory gets used up in a big chunk. Photoshop will try to be good about giving up tile memory to trade in, but it's probably not getting it all back (I'll cover that some other time, too). If you use those filters often (or if you use a few other memory-hungry applications at the same time), then your overall session performance may be better if you back off on the Photoshop memory percentage. Again, the key is that free memory not go below 20MB (I think I heard that somewhere before…).


The next thing to watch is disk. There's disk activity related to paging, of course, which will also show up in the paging graphs. And then there is disk activity related to reading files, and disk activity related to reading scratch. The latter is more interesting, especially as it relates to CPU usage. Kick off a global filter on a big file and, depending on what the filter is, you'll see a bunch of CPU activity, then a bunch of disk activity. There should be very little paging activity during this period (you did watch that free memory amount, right?). Shortening the time the CPUs spend waiting for the disk is the purpose of setting up a RAID array for scratch, if you're so inclined. You'll notice that on a heavily fragmented disk, the disk read amounts won't form a smooth graph. To avoid scratch file fragmentation, setting up the Photoshop scratch on its own disk is a good thing to do. Concentrating on scratch over actual file open/save reads and writes is preferable because you usually exercise the disk for scratch many times more than for file reading and writing. It's often instructive to watch how what you are doing in Photoshop affects CPU and disk activity.
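For the Windows side, the same counters I map in Performance Monitor – Available MBytes, % Processor Time, and % Disk Time – can also be read programmatically through the PDH API, if you'd ever rather log them than eyeball the strip charts. This is only a rough sketch: error handling is skimpy, the counter paths are the standard English names (they're localized on non-English systems), and it assumes a plain ANSI build.

```c
#include <windows.h>
#include <pdh.h>
#include <stdio.h>
#pragma comment(lib, "pdh.lib")

int main(void)
{
    PDH_HQUERY query;
    PDH_HCOUNTER mem, cpu, disk;

    if (PdhOpenQuery(NULL, 0, &query) != ERROR_SUCCESS)
        return 1;

    /* The three counters discussed above. */
    PdhAddCounter(query, "\\Memory\\Available MBytes", 0, &mem);
    PdhAddCounter(query, "\\Processor(_Total)\\% Processor Time", 0, &cpu);
    PdhAddCounter(query, "\\PhysicalDisk(_Total)\\% Disk Time", 0, &disk);

    PdhCollectQueryData(query);   /* rate counters need two samples */

    for (;;) {
        PDH_FMT_COUNTERVALUE v;

        Sleep(5000);              /* the same 5-second interval as above */
        PdhCollectQueryData(query);

        PdhGetFormattedCounterValue(mem, PDH_FMT_DOUBLE, NULL, &v);
        printf("free RAM: %6.0f MB   ", v.doubleValue);
        PdhGetFormattedCounterValue(cpu, PDH_FMT_DOUBLE, NULL, &v);
        printf("CPU: %5.1f%%   ", v.doubleValue);
        PdhGetFormattedCounterValue(disk, PDH_FMT_DOUBLE, NULL, &v);
        printf("disk: %5.1f%%\n", v.doubleValue);
    }
    /* not reached */
}
```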


So, who else is hogging all that memory anyway? That's where Windows Task Manager or Activity Monitor can come in handy once in a while. Simply bring one up (and switch to the Processes tab in Task Manager) and look for the other applications that are using big chunks of memory. Sometimes you'll find surprises there, like an e-mail program taking up 80MB, or a web browser eating 40MB, or an MP3 player taking 60MB (yes, these amounts seem big to me – I don't know what's going on inside them; they could be being efficient, but I doubt it). Now, if you run Photoshop occasionally and you want to keep those other programs running, no problem – just turn back Photoshop's memory percentage. With that level of use, tuning Photoshop's memory percentage to the last megabyte isn't going to buy you enough over that short a term to justify the likelihood that you'll drop below that magic 20MB free number. In the busy-machine case like this, less is often more. If a machine is used to run Photoshop, and pretty much only Photoshop, from morning to night, then watching and tuning the memory percentage is definitely going to save you some time over the long run. And, even better, if there are unavoidable reasons why your machine acts slow sometimes (like a disk already being busy before you start saving a file), you can catch it and plan for that inevitable coffee break.


Note that there are side issues, such as the current pause on Macs with more than 4GB of RAM in them (a stale leftover of the Unix days is kicking in and spoiling the fun – the disk cache that Photoshop writes the scratch file through in large memory situations is getting flushed occasionally, and the operating system ends up forcing Photoshop to wait until that’s done), but this should get you started with the gross-level tuning of your Photoshop environment that is fairly easy and, best of all, cheap.


-Scott

Modifiers

Modifiers. I have a love-hate relationship with modifiers. On the one hand, they allow instant access to a lot of functionality in a way that's easy to manipulate with your non-dominant hand. On the other hand, they have, in the past, hidden useful functionality in ways that are almost impossible to discover. This was exacerbated with the introduction of Tablet PCs and their complete lack of useful non-dominant-hand input buttons or other devices (which is a very annoying shortcoming of Tablet PCs – c'mon, I may not be able to control a second mouse or whatever with my left hand, but for gosh sakes I can push a darned button!).


In some ways, they are a necessary evil – we’ve already essentially used the entire keyboard for shortcuts, the menu system is already complicated enough without being overloaded with more command variants, and while we’ve been on a general kick to make sure all previously hidden modifier only behavior is exposed in the UI somewhere, if you’re really in a groove you don’t want to have to track your mouse all the way over to the options bar just to add a chunk onto a selection.  Pointer locality is an important part of usability (even if we do sometimes fall short – a subject for later).  Heck, I’ve always meant to either find or implement some sort of floating window modifier substitute widget in my copious free time.


We're in such desperate straits for additional modifier keys that we treat the space bar as one (which has caused more than one funky issue in the past, with keyboard and locale switching). I've joked in the past that we need Emacs-style prefix keys, but I really was only joking, as nobody would get those. It doesn't help that when you're trying to squeak in a feature at the 11th hour, and it's already too late to modify the documentation, attaching behavior to a modifier doesn't make a liar out of the screen shots. But it also doesn't make for a discoverable feature.


We have tried to fix this in general by exposing what was previously modifier-only behavior in the user interface – particularly in the options bar. We've also tried to rectify this a little bit for the Macintosh folks by adding the tool hint text to the Info palette in Photoshop CS2. Windows users will recognize that this is the same text that was in the status bar previously (there were other reasons for getting rid of the status bar, plus I'm a stickler for trying to maintain platform parity). Unfortunately, the Info palette is underneath the Navigator palette by default. So, bring that Info palette to the front (the tool hint text is enabled by default – check out the Info palette options from the palette flyout menu for more stuff) and watch the tool hint text while you use different tools.


Yes, of course that doesn't explain everything. For example, it misses explaining what hitting a modifier before the first mouse click with one of the lasso tools does (which is different from what happens with the modifiers after the first click). In that case, the cursor should give some hint of what will happen.


It also doesn’t help when you’re not trying to do something with a tool.  Like with the Option/Alt key in most dialogs (hint: watch the button labels).  Or in Photoshop 6, 7 and CS, holding the shift key while modifying a linked type layer will operate on all of the type layers (in Photoshop CS2, you can just multi-select the type layers).


Now, most of this is documented in the manual I'm sure you've read (nudge, nudge, wink, wink). But there are always the undocumented ones, too, like holding the shift key on Windows when dragging the Photoshop CS2 application window (I'm going to make you try it instead of describing it – and no, it's not perfect, but it really helps in multi-monitor situations).


Remember, nothing goes boom when you press a modifier.  And there are times when us engineers really want to put in useful functionality when it’s too late to modify the UI.  So don’t forget to give your local friendly modifiers a try.


-Scott

Shovelware

So, by this point we hopefully have all heard about spyware, adware, malware. And we all know that this is nasty stuff, and that we should avoid having it on our systems – they tend to slow things down, eat resources, get in the way, destabilize things, what have you. Bad stuff.

Today, I want to cover shovelware. Shovelware is that software which is of such low quality or is otherwise so nearly useless to you (or has an atrocious usefulness to resources ratio) that just about the only way the companies can get it in front of you is to pay hardware manufacturers to pre-install it on new machines, or otherwise bundle it with other software you do want. If you don’t get a choice to install it or not, it’s probably shovelware. Shovelware takes up resources, but usually isn’t as malevolent as spyware – it’s not trying to record keystrokes or the web pages you visit. But it does cost you. And often it ends up in front of you because someone somewhere along the supply chain didn’t have the right motivations – they weren’t thinking about how to help you, the customer, but how to extract the biggest revenue dollars. Maybe some engineer stopped being excited about going to work and it just became a job.

Who knows?

I’ve hit a bunch of examples of shovelware lately. It bugs me, because I like a neat, clean system. A clean system performs better, feels better. And logs in a whole bunch faster.

The first resource offense most shovelware commits is the task tray icon on Windows. Hey, some software really has a legitimate reason for starting up when I log in and having an icon there. Palm Hot Sync, a virus scanner, Microsoft AntiSpyware – these all have a legitimate reason for starting up when I log in, and might as well have an easy access icon. But QuickTime? Java? C’mon, these things can be lazy started. And don’t insult me by giving me an option to hide the darned icon. That’s not what I want – I want you to not eat resources just because I logged in. Some of these things are even more pointless. HP’s printer drivers now install an additional little tray icon utility, and I don’t see the value add – everything it gives you access to is available elsewhere as far as I can figure out. But can I stop it from loading at startup? Is some critical service wrapped up in that resource theft? Do I really need some WinCinema manager in the taskbar tray making login take an extra 10 seconds? And just why is it so hard to keep Microsoft Messenger from starting up?

Of course, there’s also the Start menu pollution and disk space munching that happens. I needed to burn a CD from an .iso image on my Media Center the other day, and the program I would normally use (CDRWin) doesn’t like that drive. So, I’d heard about Nero, decided to give that a try. First, the download was 100MB – crazy. Clearly, there’s other junk in there. When I installed the trial, it installed all sorts of shovelware I didn’t want (now, I don’t know if what they installed was good stuff or not – it was just useless *for me*, and that’s the key). I just wanted the burning component. But because I couldn’t just get and buy the burning component, I gave up, uninstalled Nero, and searched a little harder, finding CD Burner XP Pro. Not only was it free, it worked and didn’t come with any shovelware. Nice.

Heck, I think shovelware is one of the things that hurt RealPlayer so much. Once they started bundling all the shovelware with the player, I gave up on it. Who needs all that garbage – some of which verges on adware – when there are ways of getting media content without it?

Then there's the shovelware that comes with a new system. Both Dell and HP are bad about this (I'm sure others are as well, I just don't have direct experience with them). It's why many of us have a policy of wiping the drive and installing from a *real* Windows XP install CD right after getting a new machine, and why I think most of those "restore" disks are practically useless. My policy is to never buy a machine that doesn't come with a real Windows install CD – a policy I violated with my Media Center PC, and of course ended up regretting when I restored it to try to stabilize it, only to find that all the shovelware HP was installing meant the system didn't even start out terribly stable. Yuck. Wouldn't it be nice if a system just coming out of the box was actually *clean*, and not the mucked-up, dirty, registry-scrambled mess that most manufacturers think passes for acceptable these days?

Of course, there’s also the case of software that starts out useful and devolves. I like my MusicMatch radio, but over the past two years, the darned thing has gone from eating 10% of my machine CPU to just about half. I don’t know what you do to waste that sort of time, but that’s getting near the limit of tolerability – that is software that is on the verge of becoming shovelware for me (though I really like having music playing while I work, so I can tolerate quite a bit). MP3 players tend to skirt this edge more than other programs it seems, grabbing memory and keystrokes and otherwise doing things that get in the way.

Now, I'm sure that any and all of this stuff may be useful to someone, somewhere. But it isn't to me, and I don't want to have to go digging around afterward shutting it off. I know conventional wisdom says that modern machines have enough power and memory for all this stuff, but there is just so much of this junk that it's death by a thousand paper cuts. Every percentage of CPU actually counts – it could make the difference between a clean and a glitchy video capture. And every chunk of RAM that is forcefully kept alive counts – when that blue line (free RAM) in my Performance Monitor hits 10MB, things are going to come to a crawl (I always have Performance Monitor running on Windows and X Resource Graph running on Mac, though Activity Monitor is OK in a pinch).

I suppose the worst part is that it’s just so darned hard to keep a machine clean. Yeah, I go into Add/Remove Programs regularly and clean up old junk – but sometimes things are bundled in ways that you can’t get rid of the bad without getting rid of the good. And even if you could, you still have to have a good registry cleaner because so many of the uninstallers are half-hearted efforts. And you shouldn’t really have to pull open the Program Files directory to go cleaning up disk files left behind, but often do. It used to be that you’d have to completely re-install Windows 98 every so often because the system itself became unstable. Now, while Windows XP itself is stable, it’s getting to the point where you have to do a re-install just to really get rid of all the vestiges of all that shovelware.

Sigh.

-Scott

The Sad State of Hardware

I (vaguely) remember when I was a young, naive programmer, and I assumed that compilers could essentially do no wrong. Gosh, if something was broken or didn’t work right, it _must_ be my fault. Of course, that assumption didn’t last long. I remember when I first realized that my program was compiling differently based on if and where comments were – must have been some bug in the compiler parser, who knows? Once that line was crossed, though, I began to get a sense of what problems might be mine, what actually might be the compiler, when I should pull up the generated code to look for compiler faults – and figuring out how to work around them. And I’d get a sense of which compilers were the worst offenders. Heck, there was one compiler which seemed to have the development philosophy of shoving in every optimization they could think of with a major version release, then back them out in patches as they got code generation bug reports.

Hardware was different. Because hardware could always break, you'd always be on the lookout for it to go funny – from loose cables, to unseated cards in slots, to overheating problems. You'd expect something to eventually go bad because it lived in the physical world, and eventually everything dies. But you didn't expect design flaws – at least not flakey ones that affected specific software packages. If something was busted, it was pretty consistently busted. And there wasn't nearly the variety of hardware that you could plug together – not to mention that the tolerance for dodgy parts was looser.

Things have changed, and not for the better. It was always the case that chips might have errata – but they’d usually be covered up in the microcode or in the BIOS before things shipped. Devices would have issues, but the driver would be tweaked before the device was shipped to compensate. In general, you could buy a box, and have reasonable confidence it was put together reasonably and would do what you wanted. Sure, there were the manufacturers who had the (very) annoying habit of putting in cheesed-up and dumbed down OEM versions of certain boards, and it would be hard to get driver updates for them. You just learned to avoid them.

Then there was that first interesting case of a design flaw on the motherboard. An undersized capacitor specification on a reference motherboard design would allow the system memory bus to go undervolt if the bus was saturated for more than a certain length of time. Guess which program was best at managing to do that? I think for a lot of the users affected by that problem, it was the first time they had encountered a real hardware issue – and much like my early experience with compilers, they were hard pressed to believe that it was a hardware issue. Heck, isn't it always the fault of the software? There had always been bad-RAM problems that Photoshop seemed to be the only piece of software able to tickle, but this was the first really widespread hardware issue. And worse, the only fix was a physical motherboard swap.

Now, of course, it’s even worse. With the internet being an accepted method of delivering drivers and BIOS updates and whatnot, I think most PC manufacturers have gotten lazy. Things no longer necessarily go out the door in a working state. Heck, the Media Center PC I have at home couldn’t burn DVDs for the first 10 months I owned it, until a BIOS update and a driver firmware update (and a re-image).

Don’t get me wrong – I think the internet updates are great for letting slightly older hardware adapt to new things without having to go replacing it all. Much better to just flash your WiFi card to add better security than it is to replace it.

But I think we've hit a point where things are just abusive for the average consumer. We now see RAM issues so frequently that it's part of the FAQ list. Video drivers seem to be perpetually broken in some way or another. I simply don't feel comfortable recommending machines from any name-brand PC makers anymore, because they all seem to have serious weak points in their product lines. Now, one could argue that price pressures are at least partly to blame for this – if people didn't try to save every last dollar on their purchases, the manufacturers could use more reliable and better-tested parts.

I just nailed down an issue today having to do with Hyperthreading and memory allocation. Yes, a BIOS update solved it (as did turning off hyperthreading). But should the average Photoshop user really have to know how to figure out the motherboard manufacturer, find the BIOS update and apply it? Or turn off Hyperthreading? And why wasn’t the BIOS update pushed out over Windows Update for such a serious issue? To the user, it just looked like Photoshop was freezing up in the middle of painting or other operations.

And, of course, it hits Photoshop more than other applications. It’s the nature of the beast – the data sets we’re dealing with are very large for an end-user application, and we move it around a lot faster than other applications. It’ll expose marginal hardware more often than the best system diagnostic software.

Some would argue that we should make Photoshop tolerant of bad hardware. On this I have to disagree. We're talking about penalizing all users because some people bought dicey systems or cheap RAM. For those who get bad hardware, it sucks – but the right place to take that up is with the hardware manufacturer (yes, I know such things generally fall on deaf ears). Until they get some pushback, they're going to continue to put out flakey stuff, crammed with shovelware, that'll manage to run your MP3 player or browser, but gives up the ghost when trying to capture video or touch up your pictures. Or, let's push the PC magazines to put together some real stress tests, and rate the hardware vendors on long-term stability – knowing which machine is fastest at Quake 4 is useless if it reboots after 9 hours of heavy use because of a thermal issue. I'd say start buying Macs, but things don't seem to be too much better there, either. I think the hardware is (generally) in better shape, but I think the OS could use a bit more bug fixing.

Until then, just realize that marginal hardware can affect software, especially those programs which try and get the best performance out of your machine. Users shouldn’t have to become hardware diagnosticians just to remove red eye from their kids’ pictures, but that’s where we’re at.

Sucks but true.