
Do DECs add up?
November 2010

Mandatory energy certification aims to give building owners a true picture of their energy consumption. How well is the system working? Andrew Geens and Richard Hillyard investigate.

BSRIA voluntarily displays a DEC for its HQ building (2010 DEC shown)

Existing buildings in both the private and public sector consume significant amounts of energy. Although there are plenty of new-build projects designed to be energy efficient, the UK's existing stock of buildings is going to be occupied for many years to come. Even in a construction boom we replace only one per cent of our building stock each year, which means eighty-seven per cent of the buildings that will be around in 2050 have already been built. The pressure to reduce energy consumption in all buildings can therefore only grow, particularly in refurbishment. This in turn will push building operators to improve their understanding of how they use energy, and of the scope for reducing waste.

This is why the UK now has two forms of mandatory energy performance certification, courtesy of the Energy Performance of Buildings Directive: the Energy Performance Certificate (EPC) and the Display Energy Certificate (DEC).

In 2008 the Department for Communities and Local Government (CLG) took the bold step of requiring mandatory DECs for all public buildings over 1,000 m2 that are regularly visited by members of the public. In recent months, the Government has been consulting on extending DECs to cover public buildings over 250 m2 that are regularly visited by the public.

But what are we learning from the Display Energy Certificates issued to date? Is the assessment system working well? Can we tell if building energy efficiency is improving? What needs to be done next?

BSRIA has been looking at the statistics behind 28,259 completed DEC assessments lodged up to November 2009. It is useful to examine this dataset now, as it represents the first year of DEC assessments and paints an interesting picture of the current state of building energy management. The dataset was released under environmental information legislation following a request by the BBC. (Note that Landmark manages and maintains the database of EPCs and DECs on behalf of the CLG, and the data is not otherwise made publicly available.)

A DEC is designed to give an Operational Rating (OR) ranging from 0 (band A) to over 150 (band G). A default score of 200 is assigned to buildings with insufficient energy data to complete the assessment. A score of 100 is defined as typical for the building type.
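The scale described above can be sketched in a few lines. This is a minimal illustration, assuming the standard 25-point-wide DEC bands (A up to 25, B up to 50, and so on, with G for anything over 150); it is not the official calculation, which derives the OR from metered energy and benchmarks.

```python
DEFAULT_SCORE = 200  # assigned when energy data is insufficient

def dec_band(operational_rating: float) -> str:
    """Map an Operational Rating to its A-G band (assumed 25-point bands)."""
    bands = [(25, "A"), (50, "B"), (75, "C"), (100, "D"),
             (125, "E"), (150, "F")]
    for upper, band in bands:
        if operational_rating <= upper:
            return band
    return "G"  # anything above 150

print(dec_band(100))  # a 'typical' building for its type -> D
print(dec_band(113))  # the dataset average reported below -> E
```

On these assumed boundaries, the dataset average of 113 would sit just inside band E rather than D.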

The first thing that stands out is that the database represents 28,259 assessments, not 28,259 different buildings. Some properties have had multiple assessments. It is likely, in a process that is new to everyone, that errors have been discovered after lodgement, and having been corrected, the certificate has been re-lodged. Nevertheless, this large amount of data gives a good insight into the pattern of UK public sector building energy consumption.

Figure 1: The distribution of 28,259 assessments against the A to G scale

Figure 1 shows the distribution of the 28,259 assessments against the A to G scale. When the DEC system was introduced it was expected that the average for public sector buildings would sit at around the D rating, which has proved to be the case. The average DEC operational rating is 113, just above the typical rating of 100 - the 'normal' consumption for a given building type. One can infer from this that the typical-building benchmark is somewhat optimistic.

Approximately 98 per cent of the buildings surveyed and assessed for DECs fall in the 0-200 range, which is also good news. That leaves around two per cent of buildings with operational ratings between 201 and 4,574, of which 661 fall between 201 and 500 - indicating a significant number of buildings without effective energy management.

Only 24 buildings appear to have an operational rating greater than 500, four of which have operational ratings in excess of 1,000 (two are over 4,000). At the far end of the scale there is a DEC rating of 4,574. This immediately calls into question the quality of the data being gathered or made available for assessment. Moreover, dig deeper, and one often finds that such high figures belong to a modest building such as a community school.

Another interesting figure from the analysis is that 3,001 buildings have been given an operational rating of 200, which is the software's default score for buildings with insufficient energy data to enable a score to be calculated. Without sight of each assessment it is not possible to establish how many of these are genuine scores of 200 rather than defaults. There is also a possibility that the guidance has been misunderstood, which could result in new public buildings being awarded default ratings when an asset-only DEC should have been issued.
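The screening applied to the dataset amounts to counting the suspect categories discussed here: zeros, the 200 default, and extreme outliers. A hedged sketch of that screening, with the thresholds taken from the figures above (the function name and sample data are illustrative):

```python
def screen_ratings(ratings):
    """Count the suspect categories of operational rating discussed in the text.

    Note that a genuine score of exactly 200 cannot be distinguished
    from the software default without sight of the assessment itself.
    """
    return {
        "zero": sum(1 for r in ratings if r == 0),
        "default_200": sum(1 for r in ratings if r == 200),
        "201_500": sum(1 for r in ratings if 201 <= r <= 500),
        "over_500": sum(1 for r in ratings if r > 500),
    }

# Hypothetical sample mixing plausible and suspect records
sample = [0, 45, 113, 200, 200, 350, 4574]
print(screen_ratings(sample))
# {'zero': 1, 'default_200': 2, '201_500': 1, 'over_500': 1}
```

Applied to the full dataset this kind of tally yields the headline figures quoted here: 3,001 at the 200 default, 661 between 201 and 500, 24 above 500 and 55 at zero.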

These high value results are highly questionable. The precise cause of the extreme data in those specific assessments clearly requires further investigation.

At the other end of the spectrum, there are 55 buildings with DEC ratings of 0. Are these buildings powered and heated? Are they not in use, or was there no available energy data? Or it may simply be a data-entry error.

Another possibility is that assessors do not understand the CLG guidance documents. This has led to some new public buildings being awarded zero ratings instead of asset-only DECs. An asset-only DEC is issued for a newly constructed public building that does not yet have 12 months' energy consumption data. The dataset made available does not list the number of asset-only DECs.

While the CLG's process is not likely to be at fault, these examples at each end of the spectrum may be symptomatic of a deeper malaise - the competence of the assessors conducting the surveys, and their level of knowledge, training and experience.

It is also a concern that the 3,001 buildings with an operational rating of 200 simply do not have sufficient energy data, or the systems in place to monitor energy consumption. This is especially worrying when energy efficiency and cost savings should be a high priority for all public buildings.

While CLG is to be applauded for bringing in a system that reveals this energy management gap, the trick now is to close it. As the BRE has said: "Display Energy Certificates fail to provide evidence of real energy efficiency improvements and carbon emission reductions in public sector buildings".

It could be argued that there are problems throughout the entire dataset. Without the means to validate all the data, aggregating the consumption figures into something meaningful would be misleading - there are simply too many distortions within it.

To its credit the CLG has recognised the shortcomings and is working with accreditation providers to identify the problems and to devise solutions. It is vital that the quality delivered by the various assessment schemes is continuously improved to meet the requirements of the Energy Performance of Buildings Directive. So what are the pressure points?

One area that the DEC scheme does not account for is process energy, which is discounted during a DEC assessment if it can be separated out. However, this energy is still being consumed and needs to be accounted for - after all, its use may not be efficient. The requirement that separable energy merely has to have been reviewed for energy efficiency within the past two years is vague.

As of 2010, while certain DEC benchmark categories (such as offices) take into account extended hours of operation, the calculation does not take into account occupancy levels and whether they have risen or fallen between annual DEC assessments. An organisation that runs a building with low occupation densities could easily earn a C-rated DEC, but they could well be consuming more energy and generating more carbon dioxide emissions per person than another organisation with high levels of occupancy in, say, an E or F-rated building.
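The occupancy point above is simple arithmetic worth making explicit: a building with a better DEC band can still emit more carbon dioxide per person. All figures below are hypothetical, chosen only to illustrate the comparison.

```python
def emissions_per_person(annual_kg_co2: float, occupants: int) -> float:
    """Annual CO2 emissions per building occupant (kg/person)."""
    return annual_kg_co2 / occupants

# Hypothetical C-rated building, sparsely occupied
low_density = emissions_per_person(annual_kg_co2=120_000, occupants=60)

# Hypothetical E-rated building, densely occupied
high_density = emissions_per_person(annual_kg_co2=200_000, occupants=250)

print(f"C-rated, low occupancy:  {low_density:.0f} kg CO2/person")   # 2000
print(f"E-rated, high occupancy: {high_density:.0f} kg CO2/person")  # 800
```

Despite the lower absolute consumption and better band, the sparsely occupied building here emits two and a half times more per person - exactly the distortion the annual OR comparison cannot see.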

DECs don't take into account building age, so it's not possible to examine and compare buildings by age. This is a shame as it would be good to separate buildings built in the last few years from the older building stock for the purposes of comparison.

Ultimately there is still a lot of work to do to improve building energy performance and efficiency. For instance, if newer buildings could be drawn out of the data it would be possible to monitor the success of attempts to deliver zero-carbon schools by 2016 and other public buildings by 2019. The Government's aim must be to establish whether the performance curve of energy use in buildings moves away from the current average and closer to the A-rated zone.

What the introduction of DECs has done, though, is bring energy consumption in buildings to the attention of industry. DECs have also provided a foundation of data (in excess of 30,000 sets) that is growing all the time. Before DECs there was no standard means of collecting and analysing this data. Fortunately something is now being done, and the process will improve over time.

Identifying and correcting data collection errors will rectify the problem of poor data; it could also help build a pool of professionals equipped to gather important energy consumption and building data.

The fact that DECs are renewed on an annual basis provides building occupiers, owners and operators with a simple visual guide for year-on-year comparison of performance. This should encourage proactive energy management decisions in a bid to achieve energy performance improvements.

The data shows that improving the management of energy use is still very much a work in progress. However, over the next few years, as annual renewal assessments are conducted, a trend of improvement should emerge, reflected in the curve of the graph shifting left as energy efficiency improves.

An improved data distribution would not only indicate a positive response to Display Energy Certificates, an increased awareness of building energy consumption and an improvement in management, but might also indicate better overall building performance. After all, only the best buildings deserve to survive in the long term.

Richard Hillyard was with BSRIA and is now an environmental and sustainability manager with Norland Managed Services.  Andrew Geens is Professor of Building Services Engineering at the University of Glamorgan.

BSRIA has accredited DEC assessors and provides independent consultancy to improve energy efficiency in buildings.  For more information contact BSRIA:

T: 01344 465600