By David Stephens
On September 27, 2012

There is a lot of misinformation going around about Adobe Muse. Much of it is rooted in evaluations of early Public Beta versions of the software, which were released many months before the initial release of Adobe Muse 1.0. Adobe released Adobe Muse 2.0 in August 2012, and the code it generates is quite different from that of those early pre-1.0 Public Betas.

You Design, We Code

The promise of Adobe Muse is that it allows Designers to design without worrying about the code. Adobe keeps this promise by releasing frequent updates that bring new features for Designers along with significant improvements to the generated code.

Muse is a tool that targets traditional Designers so that they do not have to know about code. Therefore, we put a lot of emphasis on creating features that are familiar to Designers, such as master pages, paragraph and object styles, ruler guides, and other graphic design paradigms. However, as engineers and developers ourselves, we care a great deal about the quality of the code that Muse generates. From the very beginning, our mantra has been “Please the Designer, Honor the Developer.” To that end, we continue to make significant improvements to the code that Muse generates. All you need to do to obtain those improvements is to re-publish your site with the latest version of Muse.

How do we evaluate code improvements?

We are committed to improving the HTML/CSS/JavaScript code that Muse generates and we evaluate it based on the following criteria:

  1. Cross-browser compatibility
  2. Load performance of the site
  3. Accessibility and Usability

We also work to ensure all of our code follows good SEO practices and guidelines whenever possible.

Frequent Releases

Adobe Muse is offered via subscription, which allows Adobe to release updates as soon as they are ready. Adobe Muse 2.0 was released in August 2012, three months after the initial Adobe Muse 1.0 release, and it is packed with a number of new features for Designers as well as significant code generation improvements.

Some Benchmark Metrics

We frequently put Muse through a series of automated tests, not only to find bugs and improve stability, but also to analyze the output it generates. Let’s look at some of the metrics collected after publishing a collection of sites with the various public releases of Adobe Muse.

One measure of output is the total size of the site, including images, HTML files, CSS files, JavaScript files, and all other assets. There are limits on how small a site can be made without sacrificing rendering quality, but the reasoning goes that the smaller the total data, the less time the site takes to load in your browser. Similarly, the fewer files there are, the fewer HTTP requests are needed to load their contents.
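
As a rough illustration of how such totals can be measured on a published site, here is a minimal sketch that walks an export folder and tallies file sizes and counts by type. The folder path and the extension groupings are illustrative assumptions, not part of Muse itself.

```python
# Minimal sketch: tally size and count of files in an exported site folder,
# grouped roughly like the metrics in the table below.
# EXPORT_DIR and the extension groupings are assumptions for illustration.
import os
from collections import defaultdict

EXPORT_DIR = "my-muse-site"  # hypothetical folder produced by Export/Publish

groups = {
    ".html": "HTML", ".htm": "HTML",
    ".css": "CSS",
    ".js": "JavaScript",
    ".png": "Images", ".jpg": "Images", ".jpeg": "Images", ".gif": "Images",
}

totals = defaultdict(lambda: {"bytes": 0, "files": 0})

for root, _dirs, files in os.walk(EXPORT_DIR):
    for name in files:
        path = os.path.join(root, name)
        size = os.path.getsize(path)
        group = groups.get(os.path.splitext(name)[1].lower(), "Other")
        # Accumulate per-group and overall totals
        for key in (group, "Total"):
            totals[key]["bytes"] += size
            totals[key]["files"] += 1

for group, t in sorted(totals.items()):
    print(f"{group:>10}: {t['bytes'] / 1024:8.0f}K in {t['files']} files")
```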

Benchmark metrics across Adobe Muse releases

| Metric                           | Beta 5 | Beta 6 | Beta 7 | Muse 1.0 | Muse 2.0 | Avg. Improvement |
| -------------------------------- | ------ | ------ | ------ | -------- | -------- | ---------------- |
| Total Output Size                | 7643K  | 7764K  | 7947K  | 6915K    | 5966K    | 22% smaller      |
| Total HTML Size                  | 1498K  | 1549K  | 1527K  | 1428K    | 922K     | 38% smaller      |
| Total Image Size                 | 4239K  | 4231K  | 4224K  | 4208K    | 3781K    | 11% smaller      |
| Total JavaScript Size            | 222K   | 224K   | 226K   | 226K     | 193K     | 13% smaller      |
| Total Number of Images           | 547    | 541    | 433    | 187      | 109      | 80% fewer        |
| Total Number of JavaScript Files | 10     | 10     | 10     | 10       | 8        | 20% fewer        |

Note: Figures are averages from a collection of customer sites published with the listed releases. Results vary based on the design of the site.

This table of metrics reflects a concerted, ongoing effort to improve load times by decreasing both the overall size of the site and the total number of files generated.
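
To relate the improvement column to the raw numbers, here is a quick check that assumes each percentage simply compares the Beta 5 and Muse 2.0 columns (the note above describes the figures as averages across customer sites, so this is only an approximation):

```python
# Quick check of the "Avg. Improvement" column, assuming (for illustration)
# that each figure compares the Beta 5 column directly to the Muse 2.0 column.
beta5 = {"Total Output Size": 7643, "Total HTML Size": 1498,
         "Total Image Size": 4239, "Total JavaScript Size": 222,
         "Total Number of Images": 547, "Total Number of JavaScript Files": 10}
muse2 = {"Total Output Size": 5966, "Total HTML Size": 922,
         "Total Image Size": 3781, "Total JavaScript Size": 193,
         "Total Number of Images": 109, "Total Number of JavaScript Files": 8}

for metric in beta5:
    pct = (beta5[metric] - muse2[metric]) / beta5[metric] * 100
    print(f"{metric}: {pct:.0f}% reduction")
# e.g. Total Output Size: (7643 - 5966) / 7643 ≈ 22%, matching the table
```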

Specific Code Improvements

What specific changes led to these size improvements? That will be the subject of my next post.