Twinkle twinkle happy stars

HBF Star Rating logos (such as the one pictured) are frequently displayed by volume housebuilders as an ‘independently-awarded’ quality assurance hallmark.

 

These logos appear on builders’ websites, in marketing material, around developments and generally anywhere prospective customers might see them and (we assume this is the point) conclude that lots of stars means lots of quality.

 

However, look more closely and you might wonder whether such an assumption is warranted.


The UK housebuilding industry goes to considerable lengths to promote the quality of its new homes and its customer satisfaction levels, both of which it claims have been improving year on year since 2005.

 

In promoting this message, the industry relies heavily on an annual Customer Satisfaction Survey carried out by HBF in conjunction with the NHBC (National House Building Council, the UK’s biggest provider of new home warranties). The survey is sent out to homebuyers around 8 weeks after they move into their new homes, and each year's results are published in or around March of the following year.

 

For their Star Ratings, HBF work out the percentage of “yes” answers to one of their survey questions, and this percentage determines the number of HBF Stars each participating builder gets. 80%+ gets you 4 stars. 90%+ gets you 5 stars. The percentages that would get you fewer than 4 stars are irrelevant, because no-one gets fewer than 4 stars.
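
For the arithmetically minded, the published banding is simple enough to write down. The sketch below is ours, not HBF’s: the function is invented for illustration, and the bands below 4 stars are guesses, since HBF do not publish them (and, as noted, no-one ever needs them).

    def hbf_stars(yes_answers, total_answers):
        """Map a 'recommend-a-friend' yes-rate onto an HBF Star Rating.

        Only the 4-star (80%+) and 5-star (90%+) thresholds are published;
        the lower bands are illustrative guesses.
        """
        pct = 100 * yes_answers / total_answers
        if pct >= 90:
            return 5
        if pct >= 80:
            return 4
        # Hypothetical lower bands - HBF do not publish these, and
        # no participating builder has ever needed them.
        return 3 if pct >= 70 else 2

    print(hbf_stars(88, 100))   # 88% "yes" -> 4 stars
    print(hbf_stars(91, 100))   # 91% "yes" -> 5 stars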

 

So who is HBF?

 

 

HBF is the “Home Builders Federation”, the industry’s political lobby group, which represents over 80% of the UK housebuilding industry by volume.

 

HBF’s primary purpose is to promote their members’ interests. They are funded by their members and their agenda is their members’ agenda.  They are a separate legal entity from their members and so are “independent” in that sense, but they are not “independent” in the sense of owing duties or loyalty to anyone but their members.

 

They do not, for instance, promote the interests of new build homebuyers (save to the extent that doing so might further their members' broader interests). Nor are they impartial in relation to activities that are seemingly meant to be independent, such as monitoring and reporting on construction quality and customer satisfaction in response to high-level government[1] and OFT[2] criticism of endemic poor construction quality and low customer satisfaction – which is why HBF do their annual Customer Satisfaction Survey at all.

 

Satisfied Customers?

 

The question HBF asks homebuyers in order to generate its Star Ratings is:

 

Would you recommend your builder to a friend?
[answer: yes or no].

 

This is not the last word in sophisticated customer satisfaction analysis:

 

  • It is a vague and subjective question to which neither of the definitive answers offered is particularly suitable;
  • It may be in people’s nature to answer “yes” to such questions even if not fully deserved, especially if “no” were not fully deserved either – people are often more reluctant to unfairly criticise than to offer unwarranted praise;
  • There is no direct correlation between a “yes” answer to this question and the implicit endorsement of the builder’s general build quality and customer satisfaction levels that the Stars are used to promote.

 

Still, at least the statistics, even if they are a bit vague about what they measure, are reliable, right?

 

Wonky statistics?

 

HBF publish their survey results and award their members Star Ratings in or around March each year, and have done since 2005.

 

Each year HBF refer back to previous years’ results and almost invariably promote the message that quality and customer satisfaction have improved again on the previous year.

 

However, to meaningfully compare something from one year to the next, you would generally need to measure the same thing in the same way each year. You might also think it appropriate to include the answers from all the surveys you collect, to give the most accurate results.

 

1.  Variations in (stated) survey methodology

 

The following facts and figures are taken from published surveys over the period 2005-2012, and reflect the percentage of respondents who answered “yes” to the ‘recommend-a-friend’ question:

 

  • 2005 - 22 homebuilders building over 500 units per year - 75%
  • 2005/06 - 20 homebuilders building over 500 units per year - 77%
  • 2006/07 - 17 homebuilders building over 500 units per year - 75%
  • 2007/08 - 15 homebuilders building over 500 units per year - 76%
  • 2008/09 - 15 homebuilders building over 500 units per year - 88%
  • 2009/10 - Customers of “the 15 HBF members” (no volumes given) plus “buyers from a representative sample of home builders from across rest of the industry” (none of whom were identified in the report) - 86%
  • 2010/11 - 16 home builders (no volumes given) - 90%
  • 2011/12 - 16 homebuilders building over 300 units per year plus “5 medium and small builders” - 91%

The (stated) methodology during the first 5 years is consistent, which is fine. But in 2009/10, HBF indicated that they used survey results from “a representative sample of home builders from across rest of the industry”, which is not the same as in the first 5 years.  They made this change without identifying the builders whose customers were surveyed, how many surveys from each were used, or why they had changed their methodology in such a dramatic way.
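
Lining the published percentages up with their year-on-year changes makes the pattern easier to see. The short sketch below does just that; the figures are HBF’s, the code is ours:

    # Published "yes" percentages from the list above.
    yes_rates = {
        "2005": 75, "2005/06": 77, "2006/07": 75, "2007/08": 76,
        "2008/09": 88, "2009/10": 86, "2010/11": 90, "2011/12": 91,
    }

    previous = None
    for year, rate in yes_rates.items():
        change = "" if previous is None else f" ({rate - previous:+d} on the year before)"
        print(f"{year}: {rate}%{change}")
        previous = rate
    # The +12-point jump lands in 2008/09 - the year BEFORE the stated
    # change in methodology, as discussed below.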

 

HBF did acknowledge that their results for 2009/10 were “not strictly comparable” to previous years’ results, but this perhaps should have read “not comparable at all”.  Nevertheless, HBF represented that these incomparable results were comparable.

 

Another curiosity is the big jump in the percentage of “yes” answers in 2008/09, up from a steady 75-77% during the first 4 years to 88% – a performance that appears to have been maintained or improved upon (according to the statistics at least) every year since.

 

The results for 2008/09 are similar to those for 2009/10 – the year the methodology was said to have changed – but the stated methodology for 08/09 was the same as for the previous 4 years.

 

The obvious question is, if the original methodology was used for the 08/09 results, which recorded a leap in satisfaction levels, why change the methodology the following year so as to render those results incapable of comparison?

 

Another question might be, was it coincidence that the very different 09/10 methodology generated similarly impressive results to the previous year?

 

Another question might be, did HBF change the methodology the year before, but just not mention it?

 

We do not know the answers to these questions, but we do consider it apt to raise such questions of surveys whose uses are likely to include informing public policy on the housebuilding industry.

 

2.  Lots of surveys not included in the data analysis

 

Aside from the changes in methodology, it appears that large numbers of the surveys collected each year are not then used to generate the results:

 

  • 2005 - 15,295 surveys collected, 4.9% discarded
  • 2005/06 - 26,000 surveys collected, 15.5% discarded
  • 2006/07 - 28,479 surveys collected, 23.8% discarded
  • 2007/08 - 20,879 surveys collected, 15.7% discarded
  • 2008/09 - 16,741 surveys collected, 14.1% discarded
  • 2009/10 - 20,335 surveys collected, 12.4% discarded
  • 2010/11 - 23,778 surveys collected, 15.3% discarded
  • 2011/12 - 29,330 surveys collected, 13.5% discarded

 

HBF do not say why they discard so many questionnaires, who decides which ones are counted and which are not, or on what criteria those decisions are made. We think they should, given the obvious potential for manipulating the results when up to 23.8% of the surveys received can be discarded.
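
To put a number on that potential, take 2006/07 as a worked example: 23.8% of the 28,479 surveys collected were discarded, and 75% of the remainder answered “yes”. Depending on how the discarded surveys would have answered, the “yes” rate across everything collected could have been anywhere from roughly 57% to 81%. A minimal sketch of the bounding calculation (the figures are HBF’s; the arithmetic and the extreme-case assumptions are ours):

    # Bound the 2006/07 "yes" rate across ALL surveys collected, using
    # HBF's published figures: 23.8% discarded, 75% "yes" among the rest.
    discard_rate = 0.238
    reported_yes = 0.75

    # Extreme case 1: every discarded survey would have been a "no".
    low = reported_yes * (1 - discard_rate)
    # Extreme case 2: every discarded survey would have been a "yes".
    high = low + discard_rate

    print(f"{low:.1%} to {high:.1%}")   # roughly 57% to 81%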

 

Conclusion

 

Star Rating logos are, in our view, badges awarded by builders to themselves via their independent (but not impartial) political lobby group, HBF, for doing well on a survey that HBF designed (badly, in our view) and then redesigned for reasons unknown, and whose results are based on varying and opaque methodologies.

 

Of course, it may be that build quality and customer satisfaction have risen over the last few years, but HBF Star Rating logos are not, in our view, evidence of that.

 

If you really want to know how well a builder builds houses and how satisfied its customers are, go and look at their houses and ask their customers. Knock on the doors of people who have already moved in and ask them if they are happy with quality and customer care. Walk around site with a critical eye and look for yourself - you don't need to be an expert to see if things look like they are being done properly. Satisfy yourself of what you should expect from your builder and don't rely on your builder to tell you anything other than what they want you to hear.

 

 

Footnotes

[1] Barker Review of Housing Supply 2004 (section 6 relates to the housebuilding industry)

[2] Office of Fair Trading Housebuilding Market Study 2008
