
RESPONSIVE DESIGN: Coding for change

With devices, platforms and technologies all constantly changing, keeping up can be difficult, but your code shouldn’t have to


Think HTML first

Since its conception in the early Nineties, HTML has been considered and maintained as a standardised language.

The simple and consistent structure of HTML tags means that, even if a document contains errors, it is easy for an application to compensate and predict the intended structure. This keeps the information held within HTML accessible, which is, and always has been, the main purpose of the world wide web. HTML has evolved since the Nineties, but the basic concept and use of it have remained the same.

HTML is the most consistently interpreted language across browsers and devices, so it is important to rely on HTML wherever possible within a responsive build. Try to avoid building your markup with JavaScript and ensure that all of your vital information is present within the page as soon as possible. There are circumstances when you will need to render data within a page using JavaScript, but by thinking HTML first you can save on page-load performance at every opportunity. An added benefit is that search engines will be able to index all of your important content.

Data separation

A web interface that requires a large dataset can cost you performance if all the data is initially held within the markup. If you have a filtered list of 500 items, you would be rendering all 500 on the initial page load. As this is HTML, it will load relatively quickly. However, once you start to use JavaScript to filter and adjust this list, the data you are filtering against has to be read straight from the markup. One of the slowest things you can do in JavaScript is query DOM elements, and as this has to happen every time the list is filtered, it can really slow down the interface.

Alternatively, the data can be held within a JSON object, served to the page as a prerendered JavaScript object or through a web service. This dataset can then be held within your JavaScript and filtered without having to extract anything from your markup. The DOM elements will still need to be updated by JavaScript, but there are some great performance benefits to this approach.
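As a minimal sketch of this approach, the dataset below is a plain JavaScript array (the item shape and category names are illustrative assumptions, not from any real project). Filtering touches no DOM elements; the DOM would be updated once with the result.

```javascript
// Dataset served as a prerendered object or fetched from a web service;
// the shape here is purely illustrative.
var items = [
  { name: 'Phone',   category: 'mobile' },
  { name: 'Tablet',  category: 'mobile' },
  { name: 'Desktop', category: 'pc' }
];

// Filter against the in-memory data, not the DOM.
function filterByCategory(data, category) {
  return data.filter(function (item) {
    return item.category === category;
  });
}

var mobileItems = filterByCategory(items, 'mobile');
// The DOM is then updated once from the filtered result, e.g.
// list.innerHTML = mobileItems.map(renderItem).join('');
```

The key saving is that each filter pass works on plain objects; the slow DOM queries the article warns about happen only once, when the result is written back.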

Page load vs render

A page load is completed when all data or assets required to display the page have been loaded into the browser, whereas the page is only rendered once those assets have been interpreted by the browser and displayed to the user as intended. When looking at site performance you should consider both. The two can pull against each other: if you make savings in page load, you are asking the browser to do more work piecing the data or assets together during the render.

To decrease your page load you can reduce the number of HTTP requests your page makes. Browsers queue HTTP requests, so each pending request can hold up the assets behind it. The largest assets are normally image files; these can be lazy loaded with JavaScript after the initial page load, which will dramatically decrease your page-load time.
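One hedged sketch of the lazy-loading idea: images below the fold are written out with a data-src placeholder instead of src, so the browser skips them on the initial load, and a small script swaps the attribute in afterwards. The function name and fold index here are illustrative.

```javascript
// Build img markup, deferring everything below the fold.
// foldIndex is how many images load normally; illustrative only.
function buildImageMarkup(sources, foldIndex) {
  return sources.map(function (src, i) {
    return i < foldIndex
      ? '<img src="' + src + '">'        // loads with the page
      : '<img data-src="' + src + '">';  // deferred until after load
  });
}

// After the initial load (or as each image scrolls into view),
// a browser snippet like this would swap the placeholder in:
// document.querySelectorAll('img[data-src]').forEach(function (img) {
//   img.src = img.getAttribute('data-src');
// });
```

Because the deferred images have no src at first, they never enter the browser's request queue during the initial load.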

To decrease your page-render time, ensure that as much information as possible is held within your HTML, which removes the need to use JavaScript to render markup. Any JavaScript should be referenced after your page markup, as browsers not only load JavaScript but parse it as it is loaded, which delays the render of any HTML/CSS that follows in your page.
Another thing to consider is the user’s perception of load speed: for example, loading the images at the top of the page normally and then lazily loading the rest as the user scrolls. The truth is that there is no single silver-bullet approach to perfect page loads and renders, but it is vital to strike the correct balance for your project.

Precompile your code

We have all had to work with a colossal CSS file at some point. A file like this makes ongoing development of a platform very difficult, especially when working with CSS, due to its inheritance structure.

The most popular precompilation tools are for CSS, namely Sass, SCSS and Less. These tools provide a wide range of extensions to the standard CSS syntax enabling you to manage your styles in separate files and bring them together into one file. You can also utilise mixins or variables which let you declare sets of CSS properties once and reuse them throughout your code. This makes ongoing development and changes very easy to manage, and on compilation, these files are all brought together into one, ensuring that as few HTTP requests as possible are needed to load a page.
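As a small illustration of what this syntax offers (the variable and mixin names below are invented for the example), an SCSS partial might declare a colour variable and a reusable mixin, then include them wherever needed:

```scss
// _theme.scss - illustrative names, not from any real project
$brand-colour: #e84d0e;

@mixin rounded($radius: 4px) {
  border-radius: $radius;
}

.button {
  background: $brand-colour;
  @include rounded(6px);
}

.panel {
  @include rounded; // falls back to the 4px default
}
```

On compilation the mixin is expanded into plain CSS properties inside each rule, and all partials are merged into the single output file the browser requests.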

“The browsers that run our code, and the tools and frameworks we use, are ever-evolving and improving, and we need to keep up.”

JavaScript files can be precompiled in a similar way. Many of the tools for this, such as Grunt, run on Node.js, and they also provide minification of your JavaScript, further increasing your page performance.

Imagine two developers working to produce the same UI and functionality. Both solutions go through the same testing process and fulfil all the defined requirements, yet coding is such a creative medium to work in that it is extremely unlikely for the two solutions to match. We can therefore assume that there are some differences, and those differences could help both developers improve their code. Once the two approaches have been merged, a third developer can take the result and make further improvements. This is the nature of web development today and is the core reason behind the success of many open source projects.
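Returning to the precompilation tools mentioned above, a minimal Grunt setup for minification might look like the following. This is a sketch using the common grunt-contrib-uglify plugin; the file paths are assumptions for the example.

```javascript
// Gruntfile.js - illustrative paths; assumes grunt and
// grunt-contrib-uglify have been installed via npm.
module.exports = function (grunt) {
  grunt.initConfig({
    uglify: {
      build: {
        files: {
          // Merge and minify every source script into one file,
          // cutting both file size and HTTP requests.
          'dist/app.min.js': ['src/**/*.js']
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.registerTask('default', ['uglify']);
};
```

Running `grunt` then produces the single minified file, so the page needs only one script request instead of one per source file.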

The browsers that run our code, and the tools and frameworks we use, are ever-evolving and improving, and we need to keep up. This rate of evolution, combined with the range of tools available, means that no one person can hold and keep up with everything. As soon as you realise this, be it within your team or a wider open source community, you will put less pressure on yourself to stay up to date, and you will benefit from the wealth of knowledge around you. Being a jack of all trades isn’t that great when you are a master of none, but being one part of a collective group of experts is.