Category: Blog

  • Selectivizr with CSS on a sub-domain

    Updating a WordPress starter theme recently (among other things, I was porting it to HTML5), I needed to decide which shims and/or polyfills to use. I started with Remy Sharp’s HTML5 enabling script, but another to consider was Selectivizr, which improves IE’s support of CSS3 selectors.

    One of the disadvantages of using Selectivizr is that it rules out using a CDN for one’s style sheets. To quote their site:

    Style sheets MUST be hosted on the same domain as the page due to browser security restrictions. Likewise, style sheets loaded using the file: protocol will not work.

    After umming and ahhing for a couple of days, the following solution, involving conditional comments, occurred to me:

    <!--[if gte IE 9]><!-->
    <link rel="stylesheet" href="http://cdn.example.com/styles.css" type="text/css">
    <!--<![endif]-->
    
    <!--[if lte IE 8]>
    <link rel="stylesheet" href="/styles.css" type="text/css">
    <![endif]-->

    With a few lines of conditional comments, browsers that support the relevant selectors natively can take advantage of the performance boost from a CDN, while developers can take advantage of the advanced selector support Selectivizr provides for IE < 9 users.

    I’ve set up a quick demonstration in which three paragraphs have different ARIA roles – featured, unfeatured and neverfeatured – and different styles are applied to each paragraph using [role=something] selectors. The demo renders fully in IE 6–9 beta, Firefox (Win & Mac), Chrome (Mac), Safari (Mac) and Opera (Mac).
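
    The styling in the demo boils down to attribute selectors along these lines – a sketch with illustrative colours, not the demo’s exact CSS:

    p[role=featured] { background: #ffc; }
    p[role=unfeatured] { background: #eee; }
    p[role=neverfeatured] { background: #fff; }

    Without a helper like Selectivizr, IE 6 ignores attribute selectors entirely, so the paragraphs would all render unstyled there.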

  • A half-baked (CSS) idea

    Spritebaker has been doing the rounds a fair bit in web development circles over the past few weeks, for the simple reason that it’s a great idea, done well. The best description comes from the site itself:

    It parses your css and returns a copy with all external media “baked” right into it as Base64 encoded datasets. The number of time consuming http-requests on your website is decreased significantly, resulting in a massive speed-boost.

    While baking images into your CSS to lower HTTP requests reduces the overall rendering time of your site, the downside is that CSS files block the initial rendering of the page. While an unbaked site may render and be built up as its external images load, a baked site will not render until both the CSS and the baked images have loaded. This has the strange effect of making the page seem to take longer to load.
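
    For illustration, a baked rule embeds the image as a base64 data URI directly in the stylesheet – something along these lines (the base64 payload is truncated here; a real one runs to thousands of characters):

    /* unbaked: the image costs an extra HTTP request */
    h1 { background-image: url(logo.png); }

    /* baked: the image data travels inside the CSS itself */
    h1 { background-image: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAA...); }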

    I’ve thrown together a quick example of an unbaked page and its baked equivalent. In this example, the unbaked page begins rendering earlier than its baked counterpart but finishes later.

    In an attempt to kick off rendering earlier, I tried what I’ve named a half-baked idea: splitting the standard CSS into one file and the baked images into another. My hope was that browsers would render the standard CSS while the other file was still loading. As you can see on the example page, this failed.

    With CSS-only solutions delaying rendering of the page, it’s time to pull JavaScript out of our toolbox. Anyone who’s read my article on delaying loading of print CSS will find the solution eerily familiar. The CSS is still split into a standard file and a file containing the baked-in images, but the latter is wrapped in <noscript> tags in the HTML head.

    <link rel="stylesheet" href="halfbaked-1.css" type="text/css" />
    <noscript><link rel="stylesheet" href="jshalfbaked-2.css" type="text/css" /></noscript>

    This prevents the second/baked stylesheet from loading during the initial rendering of the page. Without this file blocking rendering, this version of the example begins rendering as quickly as the first, unbaked, example.

    The second/baked stylesheet needs to be added using the JavaScript below:

    <script type="text/javascript">
    window.onload = function(){
      var cssNode = document.createElement('link');
      cssNode.type = 'text/css';
      cssNode.rel = 'stylesheet';
      cssNode.href = 'jshalfbaked-2.css';
      cssNode.media = 'all';
      document.getElementsByTagName("head")[0].appendChild(cssNode);
    }
    </script>

    Using the method above for baking images into your CSS gives you the best of both worlds: your page renders quickly with its basic structure, then a single HTTP request loads all of your images.

    I used Web Page Test to measure the first-run load times using IE9 Beta, averaged over 10 tests. On the test pages, with only a few images, the advantage of a baked stylesheet isn’t apparent; on a site with more images it would quickly become so.

    Version      Starts Render   Fully Loaded
    Unbaked      0.490s          1.652s
    Baked        1.862s          1.836s
    Half baked   2.138s          2.114s
    JS baked     0.499s          1.993s

    As the Spritebaker info page says, IE versions prior to IE8 don’t understand data URIs, so you’ll need a generous sprinkling of conditional comments to load images the old-fashioned way in those browsers.
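
    A sketch of how that might look, assuming the baked rules are split into their own stylesheet (the file names are illustrative):

    <!--[if lt IE 8]>
    <link rel="stylesheet" href="unbaked.css" type="text/css">
    <![endif]-->
    <!--[if gte IE 8]><!-->
    <link rel="stylesheet" href="baked.css" type="text/css">
    <!--<![endif]-->

    Older IE loads a conventional stylesheet while IE8 and all other browsers receive the baked version.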

    The examples above were tested on IE9 Beta, Chrome 6.0, Safari 5.0, Opera 10.10 and Firefox 3.6.9. Individual results may vary.

    We’d love to hear of your experiences with baking stylesheets, or other techniques you use to speed up apparent rendering of your page, especially if it slows the total load/rendering time.

  • Why we host Big Red Tin on US servers

    Some time ago, I wrote a post in which I stated I’d be sticking with Australian web hosting provider Quadra Hosting. Shortly after writing that post, I relocated the Soupgiant sites to an American service provider.

    Even though I’ve done similar things before, it’s not because I’m a compulsive liar. I promise. It’s because situations change, and they can change quickly.

    About the time we switched hosting providers, Soupgiant became responsible for hosting the Boxcutters podcast. Each week Boxcutters releases an MP3 of between 35 and 45 MB which is, in turn, downloaded at least 1,000 times.

    The maths is pretty simple: at 35 MB a file and at least 1,000 downloads, Boxcutters alone uses 35 GB or more of bandwidth each week. With our current host, Linode, we’re paying a little under $US60 for a VPS with 600 GB of bandwidth. To get the equivalent bandwidth in Australia, we would be charged at least ten times that amount.

    It’s not entirely the fault of Australian hosts that they’re pricing themselves out of an international market. According to the OECD, retail bandwidth in Australia is 50% more expensive than in the US. It’s safe to conclude this reflects wholesale pricing.

    To state the obvious, hosting prices have to follow the lead of bandwidth prices. So if Australian web hosting providers are being gouged, gouging of their customers must follow.

    I’d love to host the Soupgiant websites in Australia, partly due to home-town pride but mostly because the 25,000 km (15,500 mi) round trip to California is pointless.

    The simple fact is: Soupgiant can’t afford to host our sites locally, given that most of the bandwidth is used for a loss-making podcast.

    Hosting the sites in the USA means we may break even some months; hosted locally, there’s no chance.

    Update: Since writing this post I’ve relocated Soupgiant’s hosting to Media Temple’s dv service. A managed service is a better fit for Soupgiant.

  • jQuery 1.5 as jQuery 1.5.0

    In early 2009, I wrote a post on browser caching times for the Google AJAX Libraries API.

    The cheat notes are that three different URLs point to the current edition of jQuery, and each URL is cached in the browser for a different length of time:
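
    • http://…/jquery/1/jquery.min.js – cached for 1 hour
    • http://…/jquery/1.4/jquery.min.js – cached for 1 hour
    • http://…/jquery/1.4.0/jquery.min.js – cached for 1 year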

    The point of my post was that, when using Google’s AJAX Libraries API to host jQuery, or any of the other libraries, it is best to specify the exact version to receive the full benefit of hosting on the same server as gazillions of other web sites – namely, that your visitors don’t need to re-download the library for every site they visit. I was reminded of this when Dave Ward appeared on episode 32 of the official jQuery podcast.

    When jQuery 1.4 was released, the Google URL being publicised by the jQuery team was http://…/jquery/1.4/jquery.min.js – while Google had set it up as http://…/jquery/1.4.0/jquery.min.js. I had two problems with this:

    • The publicised URL, coming from the official jQuery team, was only cached for 1 hour;
    • Anyone using the publicised URL would automatically be upgraded to jQuery 1.4.1 upon its release, regardless of their expectations.

    My request to John Resig and the jQuery team is to avoid confusion by officially numbering the next version as jQuery 1.5.0 and publicising the URL that contains the full version number.

    That means that, for most people, the default version of jQuery they download will be the one updated least often – and therefore cached longest. It also means the people building websites have more control over which version of jQuery their end users download, and when.

    It may seem trivial now but if your visitors leave your site while they’re waiting for jQuery to download, you’ll think it less so.
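
    In practice, that means referencing the library with the full three-part version number in the URL – for example, the fully versioned 1.4.0 file on Google’s CDN:

    <script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.0/jquery.min.js"></script>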

    Update: In the blog post announcing version 1.5, the jQuery team did publicise the 1.5.0 URL on the Google CDN.

  • JavaScript Localisation in WordPress

    Recently on Twitter, @iamcracks asked:

    Attention WordPress Wizards & Gurus. Is it possible to “get WordPress to write a custom field into a javascript variable”?

    source

    While I wouldn’t be so bold as to claim I’m either a wizard or a guru, I happen to know the answer to @iamcracks’ question.

    A while back I wrote a two-part tutorial on using JavaScript the WordPress way; the code below builds on that. The first step is to load the JavaScript in functions.php using wp_enqueue_script(), as detailed in the earlier tutorial:

    <?php
    function brt_load_scripts() {
      if (!is_admin()) {
        wp_enqueue_script(
          'brt-sample-script', //handle
          '/path/2/script.js', //source
          null, //no dependencies
          '1.0.1', //version
          true //load in html footer
        );
      }
    }
    
    add_action('wp_print_scripts', 'brt_load_scripts');
    ?>

    This outputs the HTML required for the JavaScript when wp_footer() is called in footer.php.

    Localising the script is done using the function wp_localize_script(). It takes three arguments:

    • $handle – (string) the handle defined when registering the script with wp_enqueue_script
    • $javascriptObject – (string) name of the JavaScript object that contains the passed variables.
    • $variables – (array) the variables to be passed

    To pass the site’s home page and the theme directory, we’d add this function call below the wp_enqueue_script call above:

    <?php
    ...
    wp_localize_script('brt-sample-script', 'brtSampleVars', array(
      'url' => get_bloginfo('url'),
      'theme_dir' => get_bloginfo('stylesheet_directory')
      )
    );
    ...
    ?>

    The output HTML would be:

    <script type='text/javascript'>
    /* <![CDATA[ */
    var brtSampleVars = {
      url: "http://bigredtin.com",
      theme_dir: "http://bigredtin.com/wp-content/themes/bigredtin"
    };
    /* ]]> */
    </script>
    <script type='text/javascript' src='/path/2/script.js?ver=1.0.1'></script>

    Accessing the variables within JavaScript is done using standard dot notation – for example, brtSampleVars.theme_dir to access the theme directory.
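
    For instance, a script could use the passed values like so (the image path is illustrative):

    // use the localised theme directory to build a path to a theme image
    var logoSrc = brtSampleVars.theme_dir + '/images/logo.png';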

    Using a post’s custom fields is slightly more complicated so I’ll write out the code in full:

    <?php
    function brt_load_scripts() {
      if (is_singular()) {
        wp_enqueue_script(
          'brt-sample-script', //handle
          '/path/2/script.js', //source
          null, //no dependencies
          '1.0.1', //version
          true //load in html footer
        );
    
        the_post();
        $allPostMeta = get_post_custom();
        wp_localize_script('brt-sample-script', 'brtSampleVars',
        array(
          'petersTwitter' => $allPostMeta['myTwitter'][0],
          'joshsTwitter' => $allPostMeta['joshsTwitter'][0]
          )
        );
        rewind_posts();
      }
    }
    
    add_action('wp_print_scripts', 'brt_load_scripts');
    ?>

    Only posts and pages have custom fields, so the check at the start of the function has become is_singular(), which tests whether the visitor is viewing either a post or a page; earlier we were testing whether the visitor was anywhere on the front end. The arguments for wp_enqueue_script have not changed.

    the_post() needs to be called to start the loop and set up the $post object, so the associated custom fields can be accessed on the following line and put in an array.

    With the custom fields easily available, the information can be passed to wp_localize_script() as demonstrated earlier. The final step is to rewind the loop so that the next time the_post() is called, from either single.php or page.php, the post data is available.

    The HTML output from the sample above would be:

    <script type='text/javascript'>
    /* <![CDATA[ */
    var brtSampleVars = {
      petersTwitter: "@pwcc",
      joshsTwitter: "@sealfur"
    };
    /* ]]> */
    </script>
    <script type='text/javascript' src='/path/2/script.js?ver=1.0.1'></script>

  • Delay Print Stylesheets Plugin

    A few weeks ago I wrote a post in which I adapted an idea from a zOompf article to delay the loading of print stylesheets until after a web page has fully rendered. I finished that post with the following point/question:

    Another question to ask is whether all this is actually worth the effort – even when reduced through automation. On Big Red Tin, the print.css is 595 bytes, the delay in rendering is negligible.

    Chris and Jeff at Digging into WordPress picked up the article and posted it on their site. In turn it was picked up elsewhere and became the surprise hit of the summer at Big Red Tin. Not bad when one is shivering through a bitter Melbourne winter.

    As a result of the interest, I decided to convert the code from the original post into a plugin and add it to the WordPress plugin directory.

    Further Testing

    As I warned in the original article, I’d tested the code in very limited circumstances only and found it had worked. That’s fine for a code sample, but not enough for a plugin, even a pre-1.0 release. Additional testing showed:

    1. Stylesheets intended for IE, through conditional comments, were loading in all browsers
    2. When loading multiple stylesheets, the correct order was not maintained in all browsers

    If jQuery was available, I wanted to use it for JavaScript event management; otherwise I’d use purpose-written JavaScript. There’s no point, after all, in worrying about the rendering delay caused by 600–1000 bytes only to load a 71 KB (or 24 KB gzipped) file in its place.

    Other things I wanted to do included:

    1. Put the PHP in a class to reduce the risk of clashing function/class names
    2. Put the JavaScript in its own namespace
    3. Keep the output code as small as possible

    Supporting conditional comments for IE required adding each stylesheet within a separate <script> tag. Using this method, the output HTML takes the following form:

    <script>
      // add global print.css
    </script>
    <!--[if IE 6]>
      <script type='text/javascript'>
        // add ie6 specific print.css
      </script>
    <![endif]-->

    This violates my aim of keeping the output as small as possible, but footprint has to take second place to bug-free code. I could have converted the code to use JScript conditional compilation, mapping each IE version to the JavaScript engine it uses, but that could lead to future-proofing problems.

    To maintain the order of stylesheets, I add each stylesheet’s loader to an array of functions, then use a single event to loop through that array. If jQuery is available, I simply add multiple events, because jQuery runs events on a first in, first out basis.
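
    As a rough sketch of the no-jQuery path – not the plugin’s exact code; only the brt_print.add(url, id) signature comes from the plugin, the internals here are assumed:

    var brt_print = {
      queue: [],
      add: function (url, id) {
        // queue a loader rather than running it immediately,
        // preserving the order the stylesheets were registered in
        this.queue.push(function () {
          var cssNode = document.createElement('link');
          cssNode.type = 'text/css';
          cssNode.rel = 'stylesheet';
          cssNode.href = url;
          cssNode.id = id;
          cssNode.media = 'print';
          document.getElementsByTagName('head')[0].appendChild(cssNode);
        });
      }
    };

    window.onload = function () {
      // a single event loops through the queued functions in order
      for (var i = 0; i < brt_print.queue.length; i++) {
        brt_print.queue[i]();
      }
    };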

    Putting the PHP in a class and the JavaScript in its own namespace is fairly self-explanatory. Google is your friend if you wish to read up further on this.

    Minimising the footprint was also a simple step. I wrote the JavaScript out in full with friendly variable names. Once I was happy with it, I ran the code through the YUI JavaScript compressor, commented out the original JavaScript in the plugin file and output the compressed version in its place.

    The JavaScript is output inline (within the HTML) to avoid an additional HTTP request. I was in two minds about this, because browser caching is lost in the process, so it may change in a later version.

    I’ve worked out another way to keep the footprint small: passing each stylesheet’s URL and ID to brt_print.add(url, id) rather than writing out the full function for each stylesheet, as the current version does. I’ll fix that in the next release.

    You can download the Delay Print CSS Plugin from the WordPress plugin repository.

  • Delay loading of print CSS

    Recently I stumbled across an article on zOompf detailing browser performance with the CSS print media type. In most recent browsers, Safari being the exception, the print stylesheet held up rendering of the page.

    The zOompf article suggests a solution: load print stylesheets using JavaScript once the page has rendered (i.e. on the window.onload event), with a backup for the JavaScript-impaired. You can see their code in the original article.

    Automating the task for WordPress

    Most sites I develop are in WordPress, so I decided to automate the process. This relies on using wp_enqueue_style to register the stylesheets:

    function enqueue_css(){
      if (!is_admin()){
        wp_enqueue_style (
          'bigred-print', /* handle */
          '/path-to/print.css', /* source */
          null, /* no requirements */
          '1.0', /* version */
          'print' /* media type */
        );
      }
    }
    add_action('wp_print_styles', 'enqueue_css');

    The above code will output the following HTML in the header:

    <link rel='stylesheet' id='bigred-print-css'  href='/path-to/print.css?ver=1.0' type='text/css' media='print' />

    The first step is to wrap the above HTML in noscript tags; the WordPress filter style_loader_tag is ideal for this.

    function js_printcss($tag, $handle) {
      global $wp_styles;
      if ($wp_styles->registered[$handle]->args == 'print') {
        $tag = "<noscript>" . $tag . "</noscript>";
      }
      return $tag;
    }
    add_filter('style_loader_tag', 'js_printcss', 5, 2);

    The filter runs for all stylesheets, regardless of media type, so the function checks for print stylesheets and wraps only them in the noscript tag; other media types are left alone.

    The first two arguments are the filter and function names respectively, the third argument specifies the priority (10 is the default) and the final argument tells WordPress how many arguments to pass to the function – two in this case: $tag and $handle.

    With the new filter, WordPress now outputs following HTML in the header:

    <noscript>
    <link rel='stylesheet' id='bigred-print-css'  href='/path-to/print.css?ver=1.0' type='text/css' media='print' />
    </noscript>

    The next step is to add the JavaScript that loads the stylesheets. We can do this by changing our original function, js_printcss, and making use of a global variable:

    $printCSS = '';
    
    function js_printcss($tag, $handle){
      global $wp_styles, $printCSS;
      if ($wp_styles->registered[$handle]->args == 'print') {
    
        $tag = "<noscript>" . $tag . "</noscript>";
    
        preg_match("/<s*links+[^>]*hrefs*=s*["']?([^"' >]+)["' >]/", $tag, $hrefArray);
        $href = $hrefArray[1];
    
        $printCSS .= "var cssNode = document.createElement('link');";
        $printCSS .= "cssNode.type = 'text/css';";
        $printCSS .= "cssNode.rel = 'stylesheet';";
        $printCSS .= "cssNode.href = '" . esc_js($href) . "';";
        $printCSS .= "cssNode.media = 'print';";
        $printCSS .= "document.getElementsByTagName("head")[0].appendChild(cssNode);";
      }
      return $tag;
    }

    The code creates the PHP variable $printCSS globally, which is then pulled into the function using the global keyword.

    After wrapping the tag in the noscript tags, the new function uses a regular expression to extract the URL of the stylesheet from the link tag, placing it in the variable $href.

    Having extracted the stylesheet’s URL, the function then appends the required JavaScript to the PHP global variable $printCSS.

    The final step is to add the JavaScript to the footer of the HTML using the wp_footer action in WordPress. The PHP to do this is:

    function printCSS_scriptTags(){
      global $printCSS;
      if ($printCSS != '') {
        echo "<script type='text/javascript'>n";
        echo "window.onload = function(){n";
        echo $printCSS;
        echo "}n</script>";
      }
    }
    
    add_action('wp_footer', 'printCSS_scriptTags');

    The above code uses window.onload, as dictated by the original article. A better method would be to use an event listener to do the work; for those using jQuery, we would change the function to:

    function printCSS_scriptTags(){
      global $printCSS;
      if ($printCSS != '') {
        echo "<script type='text/javascript'>\n";
        echo "jQuery(window).ready(function(){\n";
        echo $printCSS;
        echo "});\n</script>";
      }
    }
    
    add_action('wp_footer', 'printCSS_scriptTags');

    The above solution has been tested in very limited circumstances only and found to work. Were I to use the function in a production environment, I would undertake further testing.

    Another question to ask is whether all this is actually worth the effort – even when reduced through automation. On Big Red Tin, the print.css is 595 bytes, the delay in rendering is negligible.

    Update Aug 23, 2010: Fixed a typo in the code block redefining js_printcss.

    Update Aug 27, 2010: I’ve decided to release this as a plugin, get the skinny and the plugin from the followup article.

  • Thesis V WordPress, Pearson V Mullenweg

    Reading my WordPress feeds this morning, I found a war of words had broken out overnight between Matt Mullenweg (the lead developer of WordPress) and Chris Pearson, the developer of the Thesis theme.

    In brief, Mullenweg believes that, because WordPress is released under the GPLv2 license, all themes and plugins developed for WordPress must also be released under the same license. Pearson disagrees.

    This situation has never affected us directly at Soupgiant, so we haven’t needed to – and this is important – ask our lawyer whether my interpretation is correct. This is a layman’s opinion and should be treated as such.

    The battle comes down to these clauses in the GPLv2 license:

    You must cause any work that you distribute or publish, that in whole or in part contains or is derived from the Program or any part thereof, to be licensed as a whole at no charge to all third parties under the terms of this License.

    If identifiable sections of that work are not derived from the Program, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works.

    –source GPLv2 license

    Due to the second clause quoted above, I believe that Mullenweg is wrong. WordPress themes can operate on other blogging platforms with minimal changes. This has been done before: the Sandbox theme for WordPress was successfully ported to Movable Type.

    WordPress themes output HTML with a series of calls to the blogging platform. To output the post’s title and contents in our base theme, we use the code:

    <h2 class="entry-title"><?php the_title() ?></h2>
    <div class="entry-content">    
        <?php the_content("Continue reading " . the_title('', '', false)); ?>
    </div>

    To output the same HTML in a Movable Type theme, we would use:

    <h2 class="entry-title"><$mt:EntryTitle$></h2>
    <div class="entry-content">
        <$mt:EntryBody$> <$mt:EntryMore$>
    </div>

    In terms of a page’s output, the above code is a minor part of the page. A theme’s templates are mostly made up of HTML and CSS, and HTML and CSS operate in the browser, not in the blogging platform. It’s for that reason that I believe Pearson is correct in this case.

    I acknowledge that WordPress hooks may complicate the matter, but these hooks output such a minor part of a theme’s HTML that I consider the theme uses the platform rather than being derived from it. I’ve left plugins out of this discussion as they’re a more complicated matter: they can output HTML or they can build on the platform.

    The above said, were I to release a WordPress theme I would probably release it under the GPL, as a hat tip and thank you to the community that has assisted me so much. However, if the theme were as complicated as Thesis, I might feel differently about the matter when it’s crunch time.

    Again, this is a layman’s opinion and should be treated as such. If you have a layman’s opinion too, we’d love to hear it in the comments.

  • Getting the bloginfo correctly

    A previous version of this site ran on a WordPress MS install.

    As with most WordPress sites, we use plugins to enhance WordPress, including Donncha O Caoimh’s excellent WordPress MU Domain Mapping plugin. As the name implies, the domain mapping plugin allows us to use top-level domains for each site rather than being stuck with sub-domains.

    Taking care with plugins

    Many plugins are tested on the single-site version of WordPress only. I don’t have a problem with this, as most plugins are released under the GPL and are free in terms of both speech and beer. If I’m not paying for software, it’s up to me to test it in the fringe environment of WordPress MS.

    Now that WordPress is WordPress MS is WordPress, more developers may test in both environments, but they certainly can’t be expected to test every combination of plugins.

    The standout problem

    One of the standout problems when using plugins with WordPress MS arises when they define a constant for the plugin’s URL as the script starts executing. The PHP code may look similar to:

    <?php
    
      define('PLUGIN_DIR', get_bloginfo('url') . "/wp-content/plugins/peters-plugin");
    
      function plugin_js_css(){
        wp_enqueue_script('plugin-js', PLUGIN_DIR . '/script.js');
        wp_enqueue_style('plugin-css', PLUGIN_DIR . '/style.css');
      }
    
      add_action('init', 'plugin_js_css');
    
    ?>

    The same applies to themes that map the stylesheet directory at the start of execution:

    <?php
    
      define('THEME_DIR', get_bloginfo('stylesheet_directory') );
    
      function theme_js_css(){
        wp_enqueue_script('theme-js', THEME_DIR . '/script.js');
        wp_enqueue_style('theme-css', THEME_DIR . '/style.css');
      }
    
      add_action('init', 'theme_js_css');
    
    ?>

    The get_bloginfo and bloginfo functions return information about your blog and your theme settings, including the site’s home page, the theme’s directory (as in the second code sample above) and the stylesheet URL. bloginfo outputs the requested information into your HTML; get_bloginfo returns it for use in your PHP.

    Outside of code samples, bloginfo and get_bloginfo are interchangeable throughout this article.
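
    In code terms, the difference is simply echo versus return:

    <?php
      bloginfo('url');            // echoes the site URL into the HTML
      $url = get_bloginfo('url'); // returns the site URL for use in PHP
    ?>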

    The problems occur when a subsequently loaded plugin needs to change something retrieved through bloginfo. In this site’s case, Domain Mapping changes all URLs obtained through bloginfo, but it could equally be a plugin that simply moves the stylesheet URL to a sub-domain to speed up page load.

    In a recent case, a plugin – let’s call it Disqus – was defining a constant in this manner. As a result, an XSS error was occurring when attempting to use Facebook Connect. Replacing the constant with a bloginfo call fixed the problem.

    The improved code for the first sample above is:

    <?php
    
      function plugin_js_css(){
        wp_enqueue_script('plugin-js', get_bloginfo('url') . '/wp-content/plugins/peters-plugin/script.js');
        wp_enqueue_style('plugin-css', get_bloginfo('url') . '/wp-content/plugins/peters-plugin/style.css');
      }
    
      add_action('init', 'plugin_js_css');
    
    ?>

    bloginfo doesn’t hit the database every time

    I presume the developers set their own constants because they’d like to avoid hitting the database repeatedly to retrieve the same information.

    Having run some tests on my local install of WordPress, I can assure you this is not the case. Running bloginfo('stylesheet_directory') triggers a database call on the first occurrence; the information is then cached for subsequent calls.
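
    If you’d like to check this on your own install, a quick sketch using WordPress’s get_num_queries() makes the caching visible:

    <?php
      // count database queries before and after repeated calls
      $before = get_num_queries();
      get_bloginfo('stylesheet_directory');
      get_bloginfo('stylesheet_directory');
      get_bloginfo('stylesheet_directory');
      // only the first call can trigger a query, so this prints 0 or 1
      echo get_num_queries() - $before;
    ?>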

    I realise I sound incredibly fussy and that I’m suggesting we protect against edge cases on our edge cases. You’re right, and it’s not the first time, but as developers it’s the edge cases that we’re employed to avoid.

  • ‘Skip to Content’ Links

    Big Red Tin co-author Josh and I were discussing the positioning of Skip to Content links on a website. In the past I’ve placed these in the first menu on the page, usually positioned under the header.

    According to the Fangs plugin, the JAWS screen reader reads the opening of Soupgiant.com as:

    Page has seven headings and forty-three links Soupgiant vertical bar Web Production dash Internet Explorer Heading level one Link Graphic Soupgiant vertical bar Web Production Heading level five Heat and Serve Combine seventeen years of web production experience, twenty years of television and radio experience, put it all in a very large pot on a gentle heat. Stir regularly and serve. Soupgiant goes well with croutons and a touch of parsley.List of five items bullet This page link Skip to Content bullet Link Home bullet Link About bullet Link Contact bullet Link Folio

    – my emphasis

    That’s a lot of content to get through, on every page of the site, before the Skip to Content link. It would be much better if the Skip to Content link appeared earlier in the page.

    As the HTML title of the page is read out by JAWS, the best position would be before the in-page title. The opening content would then read as:

    Page has seven headings and forty-three links Soupgiant vertical bar Web Production dash Internet Explorer This page link Skip to Content Heading level one Link Graphic Soupgiant vertical bar Web Production

    – again, the emphasis is mine

    That gives the JAWS user the title of the page and immediately allows them to skip to the page’s content. I don’t read the header on every page of a site, nor should I expect screen reader users to.

    I realise screen readers most likely have features to skip around the page relatively easily, regardless of how the page is set up, but our aim should not be relative ease; our aim should be absolute ease.

    As a result, we’ve decided to move the skip to content links on future sites to earlier in the page.
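
    In markup terms, that means the skip link becomes the first element inside the body, ahead of the site header – a sketch of the structure (the id values are illustrative):

    <body>
    <a href="#content">Skip to Content</a>
    <div id="header">
      <h1>Soupgiant | Web Production</h1>
    </div>
    <div id="content">
      <!-- page content -->
    </div>
    </body>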

    Sadly, this revelation came about as a result of what I consider a limitation of the WordPress 3.0+ function wp_nav_menu: the inability to add items at the start of the menu. I should have considered the accessibility implications much earlier. It serves as a reminder to all web developers that we should constantly review our practices and past decisions.