Reap what you Measure

You reap what you measure.  Measure twice, cut once.  Don’t tweak without measuring.


So you want Photoshop to perform faster?  You’re gonna want to measure.  But measure what?


There are many things on a system that can go wrong.  You can run out of RAM, a disk can be going bad, you can be bottlenecking on that disk, or you can simply be trying to do too much at once.  Unless you watch most of these at once, you’re not going to know what’s going wrong.


Fortunately, both Macintosh and Windows systems have tools – built-in or easily obtained – that measure most of what you need to know.  On Windows, I use the built-in Performance Monitor.  On the Macintosh, there’s Activity Monitor, but I personally prefer a great little application called X Resource Graph, because I like the visual history of the strip chart format, and it has most of the indicators I like to use.


On the Windows side, I have a standard set of things I like to map on each machine.  I use a consistent color scheme, so that I can go from machine to machine, glance at Performance Monitor, and know what I’m looking at – if you only have one machine, this may not be an issue for you.  I choose blue for memory, red for processor (and green for the second processor on a dual-processor system), yellow for networking items, and various dark colors with a dashed style for disk items.


I map both Available KBytes and Available MBytes so I can play the dual-sensitivity trick.  First, I adjust the scale on Available MBytes so that the blue line ends up somewhere between 10 and 100 on the graph (on a machine with 1GB, a scale of 0.1 puts it in the upper half; on a machine with 2GB to 5GB, a scale of 0.01 puts it in the bottom half).  Then I set the scale on Available KBytes to 0.001, which keeps that line pegged at the top of the graph until free memory drops below about 100MB.  What this does is give you a gross-level idea of free memory on the system, and when memory gets below 100MB free (the danger zone), you get a detailed view.


For each physical disk, I like to add % Disk Time, with the line style set to dashed and a dark color, for no particular reason beyond that’s the way I’ve always done it.  And of course, mapping % Processor Time to a big bright red makes it obvious.  I then hit the General tab and turn off all the Display Elements.  You may need the legend turned on until you’ve learned what all the colors are, but I’m used to this setup by now, and it gives me a nice uncluttered window with no wasted real estate.  While I’m there, I set the update interval to something useful, like 5 seconds.  Yeah, it’s not as much fun to watch that way, but you usually get a better sense of trends over time.
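

The numbers behind that dual-sensitivity trick are easier to see worked out.  Here’s a minimal sketch in Python (not part of the Perfmon setup itself – Performance Monitor does this scaling for you) showing how the two scale factors map the same free-memory reading onto the 0–100 strip chart:

```python
# Dual-sensitivity trick: the same free-memory reading, plotted twice.
# Performance Monitor clips plotted values to the visible 0-100 range.

def plotted(counter_value: float, scale: float) -> float:
    """Where a counter lands on the strip chart (clipped to 0-100)."""
    return min(100.0, counter_value * scale)

for free_mb in (900, 300, 100, 60, 15):
    free_kb = free_mb * 1024
    coarse = plotted(free_mb, 0.1)    # Available MBytes, scale 0.1 (1GB box)
    fine = plotted(free_kb, 0.001)    # Available KBytes, scale 0.001
    print(f"{free_mb:4d}MB free -> coarse line at {coarse:5.1f}, "
          f"fine line at {fine:5.1f}")
```

Above roughly 100MB free, the fine line sits pegged at the top and the coarse line carries the information; once you drop toward the danger zone, the fine line comes on-scale and gives you the detail.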


Once I’ve done that, I go back into the Performance Monitor part and use View / Customize View to turn off everything.  I then go into File / Options and set Console mode to User Mode, limited access, single window.  I then use File / Save As to save out perfmon.msc, putting it into my Documents and Settings\<USERID>\Start Menu\Programs\Startup folder, so that it loads when I log in.  Yeah, you could save it somewhere else and open it only when you need it, but I find that having it up all the time is a big advantage.  I then close Performance Monitor and select perfmon.msc from the Start Menu, All Programs / Startup (you’ll see it there if you saved it there in the previous step).  That gets you a nice, clean, compact monitor window that you won’t mind always having up and that you can glance at (when it’s not buried behind the Photoshop main window) – it’s a great window to go along with all the palettes on your new second monitor, where you can see what’s going on.


On the Macintosh side, things are a little easier.  You can go with the standard Activity Monitor, in which case I’ll leave it up to you to figure out what to watch, or you can install the excellent X Resource Graph, which lets you quickly map all the items you need, and again has a nice strip chart format that gives you that sense of history that can often help figure out what’s going on.  I know some of the folks around here like some of the other monitor programs, but when I glance over to see what’s going on, I don’t want to make my brain interpret fancy colored circles or anything.  The key thing here, again, is turning on enough items in the preferences and setting up a color scheme you like.  Again, I set this as one of my login items.  I really like X Resource Graph’s ability to stalactite/stalagmite some of the graphs, but I really wish I could have a strip graph of the memory items instead of the paging graph (well, in addition to it, really).  And as on Windows, I like to slow the refresh down to once every five seconds, to get a reasonable amount of history into those strip charts.  There are color schemes you can download that aren’t so colorful and won’t be so intrusive on your image editing, and I usually have mine set up a little more subdued than this.  And again, it’s a great thing to tuck over on that second monitor.


On both systems, the question becomes, just what are you looking for?


Well, let’s start with memory.  Photoshop uses a lot of memory, and it has its own virtual memory system, because it has to deal with data sets far larger than the address space allowed to 32-bit applications (we can’t just do a 64-bit version, and it’s still not clear it would be worth it – plus there are additional reasons I’ll cover some other time).  What this means is that Photoshop is going to use the percentage of available RAM you’ve told it to, and try to live within that amount.  What’s key here is that, while it’s living within that amount, you really, really don’t want the system’s free memory to get below 20MB.


If you do get somewhere below that amount of free RAM (the exact amount shifts around between the operating systems, and there are other variables – 20MB seems to be a reasonably safe zone on both platforms), the operating system starts trying to get free RAM back by paging things out.  Given that Photoshop at that point is probably occupying most of the RAM the operating system wants to reclaim, it’s highly likely that parts of Photoshop’s memory will get paged out.  When Photoshop then goes to operate on the part of the image that was in the memory that got paged out, it has to wait for it to be paged back in.


Even worse, if the system paging file and Photoshop’s scratch file are on the same physical drive (meaning you only have one set of drive heads), what will often happen is that Photoshop wants to read a new tile into memory from its scratch files, but the memory it’s trying to read that into has been paged out – so a little gets paged in (operating systems have annoyingly small page sizes), Photoshop continues trying to read in its tile, a little gets read in, then more of the place it’s trying to read into needs to get paged back in, then a little more gets read, then…  Well, you get the idea.  With both Photoshop scratch and the paging file on the same physical disk, each of those little flips between reading the scratch file and paging in the memory to read it into forces the drive heads to seek back and forth.  Photoshop will now be running about as slowly as it can run on your machine.  So, let me repeat myself: you really, really don’t want free RAM to go below 20MB.


Complicating matters are the filters that like to allocate a single chunk of memory to hold the part of the image that’s going to be operated on.  You’ll know these filters because memory gets used up in one big chunk.  Photoshop tries to be good about giving up tile memory in trade, but it’s probably not getting it all back (I’ll cover that some other time, too).  If you use those filters often (or if you use a few other memory-hungry applications at the same time), then your overall session performance may be better if you back off on the Photoshop memory percentage.  Again, the key is that free memory not go below 20MB (I think I heard that somewhere before…).
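

If you’d rather get nagged than eyeball the graphs all day, a little watcher script can do the 20MB check for you.  This is a minimal sketch, assuming the third-party psutil package (not anything Photoshop provides), and it treats the operating system’s “available” number as the free RAM to watch:

```python
import time

import psutil  # third-party: pip install psutil

DANGER_MB = 20   # below this, the OS starts paging Photoshop out
WARN_MB = 100    # the detailed-view zone from the Perfmon setup above

while True:
    free_mb = psutil.virtual_memory().available / (1024 * 1024)
    if free_mb < DANGER_MB:
        print(f"DANGER: only {free_mb:.0f}MB free - back off Photoshop's memory %")
    elif free_mb < WARN_MB:
        print(f"warning: {free_mb:.0f}MB free and falling")
    time.sleep(5)  # same five-second interval as the strip charts
```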


The next thing to watch is disk.  There’s disk activity related to paging, of course, which will also show up in the paging graphs.  And then there’s disk activity related to reading files, and related to reading scratch.  The latter is the more interesting, especially as it relates to CPU usage.  Kick off a global filter on a big file and, depending on what the filter is, you’ll see a bunch of CPU activity, then a bunch of disk activity.  There should be very little paging activity during this period (you did watch that free memory amount, right?).  Shortening the time the CPUs spend waiting for the disk is the whole point of setting up a RAID array for scratch, if you’re so inclined.  You’ll notice that on a heavily fragmented disk, the disk reads won’t form a smooth graph.  To avoid scratch file fragmentation, setting up the Photoshop scratch on its own disk is a good thing to do.  Concentrating on scratch rather than on actual file open/save reads and writes is preferable because you usually exercise the disk for scratch many times more than for file reading and writing.  It’s often instructive to watch how what you’re doing in Photoshop affects the CPU and disk activity.
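

You can get a numeric version of those bursts the same way.  Here’s another psutil sketch (same assumptions as above) that samples the per-disk I/O counters over one strip-chart interval and prints read/write throughput for each physical disk – the disk names will be whatever your system calls your scratch volume:

```python
import time

import psutil  # third-party: pip install psutil

INTERVAL = 5  # seconds, matching the strip-chart update rate

# Two snapshots of the per-disk I/O counters; the deltas make the
# CPU-burst-then-disk-burst pattern of a global filter visible as numbers.
before = psutil.disk_io_counters(perdisk=True)
time.sleep(INTERVAL)
after = psutil.disk_io_counters(perdisk=True)

for disk, stats in after.items():
    read_mb = (stats.read_bytes - before[disk].read_bytes) / (1024 * 1024)
    write_mb = (stats.write_bytes - before[disk].write_bytes) / (1024 * 1024)
    print(f"{disk}: read {read_mb / INTERVAL:6.1f} MB/s, "
          f"write {write_mb / INTERVAL:6.1f} MB/s")
```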


So, who else is hogging all that memory, anyway?  That’s where Windows Task Manager or Activity Monitor can come in handy once in a while.  Simply bring them up (and switch to the Processes tab in Task Manager) and look for the other applications that are using big chunks of memory.  Sometimes you’ll find surprises there, like an e-mail program taking up 80MB, a web browser eating 40MB, or an MP3 player taking 60MB (yes, these amounts seem big to me – but I don’t know what’s going on inside them; they could be efficient, but I doubt it).  Now, if you run Photoshop occasionally and you want to keep those other programs running, no problem – just turn back Photoshop’s memory percentage.  At that level of use, tuning Photoshop’s memory percentage to the last megabyte isn’t going to buy you enough over that short a term to justify the likelihood that you’ll drop below that magic 20MB free number.  In a busy-machine case like this, less is often more.  If a machine is used to run Photoshop, pretty much only Photoshop, from morning to night, then watching and tuning the memory percentage is definitely going to save you some time over the long run.  And, even better, if there are unavoidable reasons why your machine acts slow sometimes (like a disk being busy even before you start saving a file), you can catch it and plan for that inevitable coffee break.
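

If you’d rather have that hog list as text than hunt through Task Manager or Activity Monitor, one more psutil sketch (same caveats as above) prints the top memory users by resident size:

```python
import psutil  # third-party: pip install psutil

# Resident memory per process, largest first - roughly the number
# Task Manager's Processes tab shows for each application.
procs = []
for p in psutil.process_iter(["name", "memory_info"]):
    mem = p.info["memory_info"]
    if mem is not None:  # None when access to a process is denied
        procs.append((mem.rss, p.info["name"]))

for rss, name in sorted(procs, key=lambda t: t[0], reverse=True)[:10]:
    print(f"{rss / (1024 * 1024):7.1f}MB  {name}")
```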


Note that there are side issues, such as the current pause on Macs with more than 4GB of RAM in them (a stale leftover from the Unix days kicks in and spoils the fun – the disk cache that Photoshop writes the scratch file through in large-memory situations gets flushed occasionally, and the operating system ends up forcing Photoshop to wait until that’s done), but this should get you started with the gross-level tuning of your Photoshop environment – tuning that is fairly easy and, best of all, cheap.


-Scott

11 Responses to Reap what you Measure

  1. Claus Jacobsen says:

    For the Windows platform you should try out http://www.samurize.com – configure it to your delight, and as minimal as you want!  In fact, you have access to pretty much every kind of system info you would ever want via this program.

    Claus

  2. Cadyellow says:

    Scott, I really appreciate your article on what to monitor with regards to performance, as it’s just what I’ve been wishing to know.  I have 1GB of RAM and plan to upgrade to 2GB (any recommendations on which brand of RAM is super-reliable?), which is my system’s limit (bummer).  When I use Photoshop, I almost always need to have open Word for indexing what I’m working on, CS2 Bridge, and another piece of software that I use to lay out my images once I’ve retouched them.  So that’s four memory-hogging apps running besides NAV and all the other shovelware.  What ends up happening is that either Photoshop or, more often, Bridge just closes suddenly.  They don’t freeze, they just close.  Is that unusual?  That’s why I’m monitoring performance, but I haven’t known how to judge when the situation gets problematic.  So again I thank you.  And I sure am gratified to read of 2 Sr. Engineers actually going out to a user’s site to troubleshoot!  That’s super!  Thanks!

    CadYellow

  3. Scott Byer says:

    Applications just closing indicates something is seriously going wrong.  You could be running out of Windows paging space – check to make sure the boot drive has plenty of free space.  Free space on your drives is something you can also watch in Performance Monitor.  I usually like to set my own Windows paging file settings, with min equal to max, at least 1.5x RAM size (so, at least 3GB of paging file on a 2GB machine).

    On the Mac, there’s nothing to do but make sure the boot drive always has some free space.

    This is one of the reasons it can be important to set Photoshop up so that its primary scratch is on a different hard drive than the operating system’s paging file.

    -Scott

  4. Joel Sciamma says:

    Scott,

    Thanks for these insights, both into performance monitoring and the thinking that lies behind it.  As users, it can be very difficult to demystify the black box that is our computers and the challenges faced by developers in making their applications reliable.  The more good information like this there is, the better we can understand each other and make rational decisions about our systems.

    Joel.

  5. Chris Ogden says:

    Swapfiles – ugh.  When I work on relatively small files in PS, it creates more and more swap files, slowing the machine down as it goes.  I recently added another GB of RAM to my dual 2.5GHz G5, bringing the total up to 3.5GB (less than the 4GB+ that’s supposed to cause problems).  No change.  Looking at memory usage (via MenuMeters) shows 2.6GB free, including 858MB wired.  I have experimented with PS’s “Memory Usage” setting, going from 50% to 90%, and don’t notice much, if any, change.  Any ideas would be much appreciated!

    -c

  6. Greg says:

    For measuring MSWIN processes and structures, MS recently bought Mark Russinovich.  His new work is found at http://www.microsoft.com/technet/sysinternals/default.mspx – the “Process Monitor” and other tools are killer!  Enjoy.

    [Sysinternals has always been one of my go-to places when needing to dive deep into Windows things. – Scott]

  7. Steven Bland says:

    Scott, I have been looking for the appropriate place to introduce this, and since this blog topic is about performance and you are the system performance guru, I will do it here.

    I know you usually try to encourage most PS users to have a separate drive for their scratch space, and to have that drive be as fast as possible, like RAID-0.  Can we take it one step further and see more performance gains?  I, like you, have had perfmon in my startup for quite some time now, and I have observed that there are times when PS performs both reads and writes to the scratch drive during the same time span, which is a killer on performance.  My guess is that this is about fragmentation of the scratch space, but the reason doesn’t matter.  It seems to me that avoiding the read-write during the same time span would result in improved (almost double) performance.

    Proposal: why not use two drives with a scratch on each, not to increase scratch space, but to improve performance?  When the need to read and write at the same time occurs, read from scratch drive A and write to scratch drive B.  When working with the data on drive B and the need to read/write at the same time comes up, write the data back to drive A.

    I have dual Ultra 320 SCSI.  Channel A is reading as high as 210MB/s and writing at around 190MB/s, but when the heads are challenged by trying to do both, the performance suffers.  In the meantime, channel B is capable of the same throughput and sitting there idle.

    Would love to hear your thoughts on this.  Actually, I’m starting to love hearing your thoughts on everything you write.  Thanks for being such a great communicator of great topics like the recent 64-bit blog.

    Thanks,
    Steven Bland

    [While this sounds simple, it would be quite complex to implement in practice.  We are doing work to try and gather the writes together better.  Even better is to stuff the machine with 6GB or more of RAM, which lets us read/write scratch through the system file cache – the systems themselves will then be able to better gather reads and writes. – Scott]

  8. Steven Bland says:

    Thanks Scott, I will be adding 4GB to my 2GB soon.  However, this will just raise the threshold of when scratch starts getting used.  When I manually stitched five 2GB images together, the scratch space was over 50GB.  The good news is PS does not hit a ceiling and crash.  The bad news is performance becomes sluggish even when PS settings are tweaked for working with gigantic data sets.  It just seems a shame to have another 200MB/s sitting there idle. ;)

    Steve

    [Actually, at 6GB, with scratch file reads and writes going through the system file cache, the system has a lot more leeway to batch up those delayed writes, and the number of reads that have to come from the disks drops.  Also, you could consider bonding the two hardware RAIDs using an overlaid software RAID 0.  We’re starting to set up experiments here on this stuff; more info as we have it. -Scott]

  9. Steven Bland says:

    So excited, I just installed a second 3.6GHz Xeon proc!

    I tried exactly what you are experimenting with as I was setting up my two arrays.  My results indicated that there would actually be a small performance hit rather than an improvement.  There are a couple of factors that I think should be considered in your testing:

    #1 – test software-RAIDing two slow RAIDs using a fast CPU *versus* two fast RAIDs using a slow CPU.  Also fast RAID w/fast CPU.

    #2 – I am using Adaptec HostRAID, which may be the reason I did not see gains when soft-RAIDing the two HostRAIDed arrays.  So, test both HostRAID and 100% hardware RAID.

    I will be very interested in the results.  If 100% hardware RAID works, I could see that ZCR card in my future.

    Congrats to Adobe for the recent awards,
    Steven Bland

    [We’ll add those ideas to our RAID testing. – Scott]

  10. Steven Bland says:

    Proc = 2 x 3.6GHz HT
    FSB = 800MHz
    Mem = 6GB
    RAID-0 = 230MB/s

    Added second proc and 4GB RAM.  I just love chasing these bottlenecks!  Noticed that there is a HUGE difference in performance when saving a multi-layered image vs. saving a single-layered image.  I can save a 2GB single layer in the same time that I save a 200MB two-layer file.  That’s like 10 times slower, and it begs the question: why?  I routinely open a 16bpc image, add a curves layer, and save as PSD.  To investigate further, I used perfmon.msc.

    During the first 90% of the save time, the graph data shows:
    50% utilization = proc1 + proc2
    Disk write speed = 8MB/s

    During the last 10% of the save time, the graph data indicates that the processors are not busy and the write speed jumps to 100MB/s or more.

    #1 – What is being processed during that 90% that has the processors so busy that disk write is limited to 8MB/s?

    #2 – Assuming that the processing is necessary, why aren’t both processors pegged?  The graph indicates that they are inversely proportional, such that as proc1 approaches 100%, proc2 approaches 0%.

    #3 – Why does adding a curve double the file size, especially since the curve’s mask is solid white?

    Thanks for your analysis!

    [Note that when adding that adjustment layer and saving with backwards compatibility, the composite of the image must be generated and stored with the file.  That’s also why the file gets so much larger.  It’s also where the processing time is going during the save.  There are parts of that process that currently aren’t threadable, thus the use of only one CPU.  Save without maximum compatibility and the file size should go back down, and you shouldn’t pay the penalty for generating the composite (though there was a bug in that area at some point in the past). – Scott]

  11. Steven Bland says:

    Scott,

    Congratulations on producing an efficient architecture!  I have been using CS3 for the past 3 weeks and am impressed with the performance improvements over the CS version!  I have been able to put away the performance monitoring tools and get some work done!!

    Honorable Thank You!
    Steven Bland