Tag Archives: CSS

CSS Wish List

Don’t get me wrong, I think CSS is awesome. It is a great way of defining the UI, but it could be even better. I’m excited about the special effects, transitions, and graphic elements currently being added to the CSS specification. They will help us write faster pages by eliminating the need for UI graphics for things like rounded corners. On the other hand, we also need to add structure to the language to make it easier for designers and developers to author new pages and applications. I hope the CSS Working Group will consider my suggestions.

  • Make CSS a better declarative language
  • Abstract repeating patterns in the code
  • Do not make CSS a programming language

Suggestions for the CSS Working Group

The community has been discussing some form of constants or variables, in various incarnations, for almost a decade; it is time to make this a reality. While it is possible to duplicate this functionality with a preprocessor (an excellent stopgap until browser support catches up), ultimately this is a tool which should live within CSS itself. It is a powerful way of expressing more about the objects we are building.
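As a purely illustrative sketch (the syntax below is hypothetical, loosely modeled on the variables proposals that have circulated, and is not implemented in any browser), a constant could capture a brand color once and be reused throughout the style sheet:

```css
/* Hypothetical syntax: declare the value once, reference it everywhere */
@variables {
  brandColor: #1a5d8e;
}

.hd { background-color: var(brandColor); }
.ft { border-top: 1px solid var(brandColor); }
```

Change the value in one place and every rule that references it updates, which is exactly the kind of expressiveness a preprocessor fakes today.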

The mixins and prototypes (and the associated includes and extends properties) are designed to allow document authors to abstract reusable bits of code and better manage and maintain their style sheets. The goal is to mimic the effect of using multiple class names in the HTML without the drawbacks associated with current implementations. Authors need extends and includes in order to write faster, smaller, more efficient style sheets.

CSS is a powerful expressive language. It needs certain modifications so that it will be robust, maintainable, and easy to implement.

From JSConf.eu

How did this begin?

At some point, I realized I was coding CSS with a bunch of “rules” and best practices that weren’t the currently accepted norm. I found myself writing comments to try to express these rules like:

/* ----- faq (extends mod) ----- */

and

/* margin top = bottom height - corner height */

At that point I realized that I was trying to express via comments something which was missing from existing CSS syntax. I looked at other proposals and my own work building web sites for the last 9 years (eek! almost a decade). I first cataloged many of the rules because, at this point, they were mostly intuition and I needed to figure out why I do things the way I do.

When that had stabilized, I began to explore a syntax for this new language of declarative objects, trying to balance standard programming practices with a desire that the syntax fit document authors’ expectations of how the language works.

CSS Inheritance Today

It is currently possible to implement CSS inheritance in three ways, however, each has its drawbacks.

  1. Multiple class names clog the HTML with classes that have the same semantic value (just more or less abstract), and they introduce an unnecessary dependency because CSS inheritance is defined in the HTML.
  2. By chaining comma-delimited selector strings, authors lose the ability to group related rules into discrete blocks of code, and the quantity of code increases dramatically. A preprocessor could make this option more feasible, but abstracting a necessary part of the language into another layer isn’t a permanent solution, and the resulting CSS is not human readable; this step should take place in the document tree.
  3. With a JavaScript parser, the heavy lifting could be done on the client side; however, this would likely have a negative performance impact, particularly on older browsers and slower machines.

A Concrete Example – Based on YUI Standard Module Format

I found myself writing the following code:

/* weatherMod (extends .mod) */
.weatherMod{...}
.weatherMod .hd{...}
.weatherMod .bd{...}
.weatherMod .ft{...}

The html looked like this:

<div class="mod weatherMod">
  <div class="hd">Local Weather for SF</div>
  <div class="bd">Cloudy with 100% chance of rain, and fog.</div>
  <div class="ft"><a href="newLoc.html">Change your location?</a></div>
</div>

I was using multiple class names (which is great because ultimately it reduces the footprint of the CSS file by an order of magnitude), but it would be better if the browser could understand my intention and process any instance of weatherMod as if it were also a mod.

<div class="weatherMod"> 
<!-- Can the browser treat this as an instance of mod? -->
  <div class="hd">Local Weather for SF</div>
  <div class="bd">Cloudy with 100% chance of rain, and fog.</div>
  <div class="ft"><a href="newLoc.html">Change your location?</a></div>
</div>
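With something like the proposed extends property, that relationship could be declared once in the CSS rather than repeated in every instance in the HTML. The syntax below is hypothetical:

```css
/* Hypothetical syntax: any .weatherMod is also treated as a .mod */
.weatherMod {
  extends: .mod;
}
```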

Thanks so much to the following people who gave invaluable advice, feedback, and (in some cases) tolerated an enormous number of iterations. Nothing gets created in a vacuum and dissenting opinions were as helpful as supporters. :) William Cook, David Federman, Gopal Vijayaraghavan, Douglas Crockford, Stoyan Stefanov, Ryan Grove, Marcel Laverdet, Eric Meyer, Dan Cederholm, Ethan Marcotte, Jeffrey Zeldman, Gonzalo Cordero, Jenny Han Donnelly, Chris Klaiber, Tantek Çelik, Jeremy Keith, Wendy Chisholm, Aaron Gustafson, Fred Sauer, Dave Hyatt, Eric Schurman, Brad Neuberg, John Allsopp, Tom Hughes-Croucher, Chris Mills, Nicholas C. Zakas, Peter-Paul Koch, Doug Schepers, Amy Hoy, Kyle Simpson, Brian LeRoux, Chris Blizzard, Philippe Le Hegaret and (of course) Daniel Glazman.

The Year of Business Metrics – Don’t make your users run away!

Performance at Velocity Conference

A marked change has occurred since the first Velocity Conference a year ago, and while the effects are not yet obvious, they will be. The web is still slow, but we now have something we didn’t have a year ago: business metrics. This was the year we quantified the impact of performance choices on our businesses, and the results were astounding.

Those of us who worked in the field had a gut feeling that users want a fast web experience, but most of the studies done previously were lacking something, either in experiment design or reliability of the data. They were all strong indicators that more research needed to be done, but they weren’t damning enough to provide real certainty. This year we found a real correlation between a website’s speed and its ability to establish and keep relationships with visitors. Not everyone could attend, so I’d like to share with you some of the key moments of an amazing conference. Please feel free to add others in the comments.

David Artz at AOL

David Artz from AOL presented their findings from a study which measured page views per visit against performance. They divided users into buckets based on response time and plotted response time against page views. The results were startling. Across six AOL sites there was a clear inverse correlation.

The Take Away: AOL

Users who had a slower experience viewed far fewer pages.

AOL PV-speed correlation

Goog and Bing sitting in a tree, K-I-S…

Goog and Bing got together (whoa!) to do a study looking at search behavior when performance is worsened in very narrow increments. This study was unique in that it followed the same users over a period of time. The data can be used to determine the threshold at which clicks, refined searches, revenue, satisfaction, and time to click are likely to be impacted by features which slow a website. Their methodologies were a bit different, but the conclusions were remarkably similar. A 50 ms delay seemed to have no impact, but as little as 200-500 ms changed user behavior across the board. Revenue, clicks, and time to first click were most profoundly impacted.

The Take Away: Bing

One key point was that users seem to lose their focus if you make them wait too long. Progressive rendering and flushing the header (which are also recommended by Yahoo!) can help. Bing had this to say:

Notice that as the delays get longer the Time To Click increases at a more extreme rate (1000ms increases by 1900ms). The theory is that the user gets distracted and unengaged in the page. In other words, they’ve lost the user’s full attention and have to get it back.

~ Google & Bing

We’ve all experienced that. We open a new tab and run a search. Multitasking fools that we are, we flip to a new tab or open our email if the results take too long to load.

The Take Away: Google

The most interesting data to come out of the Google tests took place long after the experiment had finished. As much as five weeks later, some users, especially those who saw delays greater than 400 ms, were still searching less than before. Performance is a feature users want. Fail them, and they may never come back.

The percentage change recorded was very small. For instance, a half-second delay caused a 1.2% loss of revenue per user. What does that mean? We need to think big, and simultaneously work on incremental and profound ways to make the web faster.

Shopzilla – Profound improvements

Shopzilla also presented their (profound) performance improvements. They decreased their response time by around 3.5 seconds, and the data showed their conversions increased by 7-9% and their page views skyrocketed 25%. This is good stuff. This is how we go to the business and make the case that performance is an important feature that deserves attention, not a band-aid that you stick on afterwards. Dave Artz has more details.

JavaScript versus CSS versus Network Latency: Which is killing our sites?

In a separate session, Mike Belshe from the Chrome team discussed his experiments, which measured total time spent executing JavaScript, rendering the page (CSS), and waiting on network latency. He found that the vast majority of the time is spent on network latency. There was a subtle flaw in his methodology: his rendering time included only one full render and paint, and because all resources were already in cache and no JavaScript was used, there were no unnecessary reflows.

The Take Away: Reflows

This got me thinking about images and other fixed-dimension media. We should always set the height and width of images to avoid the reflow caused when the resource is finally downloaded and available.
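For example (the file name and dimensions here are placeholders), reserving the image’s box up front means the late-arriving bytes cannot move the layout:

```html
<!-- Width and height reserved up front: no reflow when the image arrives -->
<img src="weather-map.png" width="300" height="150" alt="Local weather map">
```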

I agree with him that, except in extreme cases (and a lot of selector/reflow experiments have been too extreme to really reflect reality), rendering will be much less important than network latency. It is much more important to keep page weight and HTTP requests as low as possible. Over-complicating our CSS selectors to reduce render time would be a mistake. Browsers are really good at parsing selectors; we need to be really good at writing the minimum number we actually need. This is clearly not handled correctly in the current suite of testing tools, such as Page Speed.

My talk included (not yet released) suggestions for coding performant selectors. More on that later. ;)

Further Reading

  • Aladdin Nassir spoke about linking performance and business metrics via Performance-Based Design.
  • Lindsey Simon spoke about reflows and an open source tool he is building to better measure these things. The methods for accurately measuring reflows are still a WIP, and the numbers are fuzzy, but that makes this a really interesting project to get involved in.

Reflows & Repaints: CSS Performance making your JavaScript slow?

I’ve been tweeting and posting to delicious about reflows and repaints, but hadn’t mentioned either in a talk or blog post yet.

I first started thinking about reflows and repaints after a fiery exchange with Mr. Glazman at ParisWeb. I may be stubborn, but I did actually listen to his arguments. :) Stoyan and I began discussing ways to quantify the problem.

Going forward, the performance community needs to partner more with browser vendors, in addition to running our more typical black-box experiments. Browser makers know what is costly or irrelevant in terms of performance. Opera lists repaint and reflow among the three main contributors to sluggish JavaScript, so it definitely seems worth a look.

Let’s start with a little background information. A repaint occurs when changes are made to an element’s skin that change its appearance but do not affect its layout; examples include outline, visibility, and background color. According to Opera, repaint is expensive because the browser must verify the visibility of all other nodes in the DOM tree. A reflow is even more critical to performance because it involves changes that affect the layout of a portion of the page (or the whole page). Reflow of an element causes the subsequent reflow of all child and ancestor elements, as well as any elements following it in the DOM.

For example:

<body>
<div class="error">
	<h4>My Module</h4>
	<p><strong>Error:</strong> Description of the error…</p>
	<h5>Corrective action required:</h5>
	<ol>
		<li>Step one</li>
		<li>Step two</li>
	</ol>
</div>
</body>

In the HTML snippet above, a reflow on the paragraph would trigger a reflow of the strong because it is a child node. It would also cause a reflow of the ancestors (div.error and body, depending on the browser). In addition, the h5 and ol would be reflowed simply because they follow the paragraph in the DOM. According to Opera, most reflows essentially cause the page to be re-rendered:

Reflows are very expensive in terms of performance, and is one of the main causes of slow DOM scripts, especially on devices with low processing power, such as phones. In many cases, they are equivalent to laying out the entire page again.

So, if they’re so awful for performance, what causes a reflow?

Unfortunately, lots of things. Among them some which are particularly relevant when writing CSS:

  • Resizing the window
  • Changing the font
  • Adding or removing a stylesheet
  • Content changes, such as a user typing text in an input box
  • Activation of CSS pseudo classes such as :hover (in IE the activation of the pseudo class of a sibling)
  • Manipulating the class attribute
  • A script manipulating the DOM
  • Calculating offsetWidth and offsetHeight
  • Setting a property of the style attribute

Mozilla has an article about reflows that outlines their causes and when they can be reduced.
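To make the cost of the last two items in the list above concrete (reading offsetWidth/offsetHeight and setting style properties), here is a small simulation. This is a sketch, not browser code: a plain object stands in for a DOM node, and a counter stands in for the forced layout flushes a real browser would perform when a layout property is read while layout is dirty.

```javascript
// Simulated DOM node: writing `width` dirties layout; reading `offsetWidth`
// while dirty forces a flush, just like a real browser's synchronous reflow.
function makeNode() {
  return {
    _width: 100,
    _dirty: false,
    flushes: 0, // stands in for forced synchronous reflows
    set width(w) { this._width = w; this._dirty = true; },
    get offsetWidth() {
      if (this._dirty) { this.flushes += 1; this._dirty = false; }
      return this._width;
    }
  };
}

// Interleaved reads and writes: every read after a write forces a flush.
var a = makeNode();
for (var i = 0; i < 3; i += 1) {
  a.width = a.offsetWidth + 10;
}
// a.flushes is now 2: the first read found clean layout, the next two did not.

// Batched: read once, compute locally, write once.
var b = makeNode();
var w = b.offsetWidth;
for (var j = 0; j < 3; j += 1) {
  w += 10;
}
b.width = w;
// b.flushes is still 0.
```

The numbers a real engine produces depend on the browser, but the pattern holds: grouping reads together and writes together avoids forcing layout mid-script.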

How to avoid reflows or at least minimize their impact on performance?

Note: I’m limiting myself to discussing the CSS impact of reflows. If you are a JavaScripter, I’d definitely recommend reading my reflow links; there is some really good stuff there that isn’t directly related to CSS.

  1. Change classes on the element you wish to style (as low in the DOM tree as possible)
  2. Avoid setting multiple inline styles
  3. Apply animations to elements that are position fixed or absolute
  4. Trade smoothness for speed
  5. Avoid tables for layout
  6. Avoid JavaScript expressions in the CSS (IE only)

Change classes as low in the DOM tree as possible

Reflows can be top-down or bottom-up as reflow information is passed to surrounding nodes. Reflows are unavoidable, but you can reduce their impact: change classes as low in the DOM tree as possible to limit the scope of the reflow to as few nodes as possible. For example, you should avoid changing a class on wrapper elements to affect the display of child nodes. Object-oriented CSS always attempts to attach classes to the object (DOM node or nodes) they affect, but in this case it has the added performance benefit of minimizing the impact of reflows.

Avoid setting multiple inline styles

We all know interacting with the DOM is slow, so we group changes in an invisible DOM tree fragment and then cause only one reflow when the entire change is applied to the DOM. Similarly, setting styles via the style attribute causes reflows. Rather than setting multiple inline styles, each of which causes a reflow, combine the styles in an external class, which causes only one reflow when the class attribute of the element is manipulated.
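When you cannot add a class and must set styles from script, one way to follow this advice is sketched below. buildCssText is a hypothetical helper (not a DOM API) that serializes a set of declarations so they can be applied in a single assignment to el.style.cssText, triggering one style change instead of one per property.

```javascript
// Turn {marginTop: "10px", width: "50%"} into "margin-top: 10px; width: 50%;"
// so all declarations can be applied to an element in one assignment.
function buildCssText(declarations) {
  return Object.keys(declarations)
    .map(function (prop) {
      // Convert camelCase property names to their CSS hyphenated form.
      var cssProp = prop.replace(/[A-Z]/g, function (ch) {
        return '-' + ch.toLowerCase();
      });
      return cssProp + ': ' + declarations[prop] + ';';
    })
    .join(' ');
}

// Usage (in a browser):
//   el.style.cssText = buildCssText({ marginTop: '10px', width: '50%' });
// rather than:
//   el.style.marginTop = '10px';  // first reflow
//   el.style.width = '50%';       // second reflow
```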

Apply animations with position fixed or absolute

Apply animations to elements that are positioned fixed or absolute. They don’t affect other elements’ layout, so they will only cause a repaint rather than a full reflow. This is much less costly.
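A minimal sketch, with a made-up class name: because the element is removed from the normal flow, animating its top and left moves only its own pixels.

```css
/* Hypothetical example: the tooltip animates without reflowing its neighbors
   because position: absolute removes it from the normal flow. */
.animated-tooltip {
  position: absolute;
  top: 0;
  left: 0;
}
```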

Trade smoothness for speed

Opera also advises that we trade smoothness for speed. What they mean by this is that you may want to move an animation 1 pixel at a time, but if the animation and subsequent reflows use 100% of the CPU the animation will seem jumpy as the browser struggles to update the flow. Moving the animated element by 3 pixels at a time may seem slightly less smooth on very fast machines, but it won’t cause CPU thrashing on slower machines and mobile devices.

Avoid tables for layout (or set table-layout fixed)

Avoid tables for layout. As if you needed another reason to avoid them, tables often require multiple passes before the layout is completely established because they are one of the rare cases where elements can affect the display of other elements that came before them in the DOM. Imagine a cell at the end of the table with very wide content that causes the column to be completely resized. This is why tables are not rendered progressively in all browsers (thanks to Bill Scott for this tip) and yet another reason why they are a bad idea for layout. According to Mozilla, even minor changes will cause reflows of all other nodes in the table.

Jenny Donnelly, the owner of the YUI data table widget, recommends using a fixed layout for data tables to allow a more efficient layout algorithm. Setting table-layout to "fixed" triggers the fixed layout algorithm and allows the table to render row by row, according to the CSS 2.1 specification. Quirksmode shows that browser support for the table-layout property is good across all major browsers.

In this manner, the user agent can begin to lay out the table once the entire first row has been received. Cells in subsequent rows do not affect column widths. Any cell that has content that overflows uses the ‘overflow’ property to determine whether to clip the overflow content.

Fixed layout, CSS 2.1 Specification

This algorithm may be inefficient since it requires the user agent to have access to all the content in the table before determining the final layout and may demand more than one pass.

Automatic layout, CSS 2.1 Specification
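In CSS, opting in to the fixed algorithm is a one-line change (the class name below is made up):

```css
/* Let the browser lay out the table row by row instead of waiting
   for all the content before determining column widths. */
.data-table {
  table-layout: fixed;
  width: 100%; /* the fixed algorithm needs an explicit table width */
}
```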

Avoid JavaScript expressions in the CSS

This rule is an oldie but a goodie. The main reason these expressions are so costly is that they are recalculated each time the document, or part of the document, reflows. As we have seen from the many things that trigger a reflow, that can happen thousands and thousands of times per second. Beware!

Further study

The Yahoo! Exceptional Performance team ran an experiment to determine the optimal method to include an external stylesheet. We recommended putting a link tag in the head because, while it was one second slower (6.3 versus 7.3 seconds), all the other methods blocked progressive rendering. While progressive rendering is non-negotiable (users hate staring at a blank screen), it does make me curious about the effects of rendering, repaints, reflows, and the resulting CPU thrashing on component download and overall response time. If we could reduce the number of reflows during loading, could we gain back a tenth of the lost time (100 ms)? What if it was as much as half?

At SXSW I was trying to convince Steve that reflows are important by telling him about an experiment I’ve been meaning to run for a long time, but just haven’t had time. I do hope someone can pick up where I left off (hint! hint!). While loading the page I’d like to intentionally trigger reflows at various rates. This could perhaps be accomplished by toggling a class name on the body (experiment) versus the last child of the body with no descendants (control). By comparing the two, and increasing the number of reflows per second, we could correlate reflows to response time. Measuring the impact of reflows on JavaScript responsiveness will be harder because anything we do to trigger the reflows will likely impact the experiment.

In the end, quantifying the impact is only mildly interesting, because browser vendors are telling us it matters. Perhaps more interesting is to focus on what causes reflows and how to avoid them. That will require better tools, so I challenge browser vendors and the performance community to work together to make it a reality!

See it in action

Perhaps you are a visual person? These videos are a really cool visualization of the reflow process.

  1. http://www.youtube.com/watch?v=nJtBUHyNBxs
  2. http://www.youtube.com/watch?v=ZTnIxIA5KGw
  3. http://www.youtube.com/watch?v=dndeRnzkJDU

Reflow gone amok

In order to improve performance, browser vendors may try to prevent reflows from affecting adjacent nodes, or combine several reflows into one larger change, as Mozilla does with dirty reflows. This can improve performance, but sometimes it can also cause display problems. You can use what we’ve learned about reflows to trigger them when necessary and correct related display problems.

For example, when toggling between tabs on our image optimization site, http://smush.it, the height of the content is variable from tab to tab. Occasionally the shadow gets left behind as it is several ancestor nodes above the content being toggled and its container may not be reflowed. This image is simulated because the bug is difficult to catch on camera as any attempts to shoot it cause the reflow that corrects it. If you find yourself with a similar bug, move the background images to DOM elements below the content being toggled.

Smush.it! with un-reflowed shadows

Smush it on tab change in Firefox only.

Another example is dynamically adding items to an ordered list. As you increase from 9 to 10 items, or 99 to 100, the numbers in the list will no longer line up properly across all browsers. When the total number of items increases by an order of magnitude and the browser doesn’t reflow siblings, the alignment is broken. Quickly toggling the display of the entire list, or adding a class (even one with no associated styles), will cause a reflow and correct the alignment.
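The class-toggling trick can be sketched as a tiny helper. The function name and the class name below are made up, and a mock object stands in for the real list element so the sketch runs outside a browser; in practice you would pass the actual DOM node.

```javascript
// Force a reflow by toggling a class with no styles attached and reading a
// layout property, which makes the browser flush and recompute layout.
function nudgeLayout(el) {
  el.className += ' reflow-fix';   // invalidates layout
  var height = el.offsetHeight;    // reading a layout property forces the flush
  el.className = el.className.replace(' reflow-fix', '');
  return height;
}

// Mock element standing in for the <ol>; in a browser, pass the real node.
var list = { className: 'steps', offsetHeight: 240 };
nudgeLayout(list);
```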

Tools

A few tools have made waves lately. Stoyan Stefanov and I have been looking for decent ways to measure reflows and repaints and there are a few tools which show promise (despite being very early alpha). Beware, some of these seriously destroyed my browser before I got them working correctly. In most cases you’ll need to have installed the latest nightly builds.

When Mozilla announced the MozAfterPaint Firefox API, the internets were abuzz.

Has anyone else seen any cool tools for evaluating reflows? Please send them my way!

And a couple of other tools not directly dealing with reflows.

Ultimately, we need a cross browser tool to quantify and reduce reflows and repaints. I’m hoping that the performance community can partner with browser vendors to make this tool a reality. The browser vendors have been telling us for a while that this was where we needed to look next, the ball is in our court.

Object Oriented CSS video on YDN

Yahoo! Developer Network has released a video of my Object Oriented CSS talk at Web Directions North just in time for Ada Lovelace day. I’ve also been included in a feature on Women in Technology. I’m absolutely flattered to be included among these fantastic technical women. Wow.

Object Oriented CSS: for high performance websites and web applications.

Find out more about object oriented css

  1. Open source project on github (GitHub is having some DNS issues, be patient)
  2. Follow along with the slides on slideshare
  3. Join the OOCSS google group

Thanks to Havi, Julie, Ricky, Yahoo! Developer Network, and the whole Web Directions North team for their hard work putting this together!

Design Fast Websites – Don’t blame the rounded corners! on YUI Theater

Nicole at the Design Fast Websites Presentation by Eric Miraglia

I visited Yahoo! last week to record a talk I had given at the Front End Summit in October. If you are a designer or an F2E it is essential that you understand the ways in which design choices impact overall site performance. This talk establishes guidelines for High Performance Design including 9 Best Practices.

9 Best Practices

  1. Create a component library of smart objects.
  2. Use consistent semantic styles.
  3. Design modules to be transparent on the inside.
  4. Optimize images and sprites.
  5. Avoid non-standard browser fonts.
  6. Use columns rather than rows.
  7. Choose your bling carefully.
  8. Be flexible.
  9. Learn to love grids.

Baby Steps to a Faster Site

In honor of the video being made available on YUI Theater, I’ve removed the non-standard browser fonts from my site. While the design changed slightly, it is infinitely more maintainable, and I eliminated an unnecessary HTTP request at the same time. One more step to a faster page.

Web Directions North, Denver, February 2-7

I’ll be speaking more about Design and also CSS best practices at Web Directions North in February where I’ve been invited to give both a Performance Bootcamp Workshop and a CSS Performance for Websites and Web Apps Presentation. I look forward to seeing you there!

Check out the Web Directions North Program.

They have some really amazing speakers lined up. I’m especially excited to talk to Dan Cederholm, who wrote one of my favorite books.