Ever fill out a survey and wonder if anyone ever reads it? I do -- and I'm in the business of teaching people how to conduct a survey. So imagine my surprise when I got an email from Will Maloney, the General Manager of the Hilton hotel in Pearl River, NY, the day after I completed a survey about my stay at his hotel. I was surprised -- and thrilled -- to see the manager take the survey so seriously. Not surprisingly, I inquired about Hilton's survey design practices, and he put me in touch with Stephen Hardenburg, who is the Director of Syndicated Customer Research for all the Hilton brands.
I always learn something from other survey practitioners, and Stephen was kind enough to share with me information about the Hilton Hotel survey program. In particular, Stephen spoke about how to improve survey response rates, how Hilton uses its survey program to drive customer satisfaction measurement and improvement, and the challenges of international surveys.
Stephen, you have an interesting title. What exactly does a Director of Syndicated Customer Research do?
We are part of the market research department that covers all the Hilton Family brands. I am responsible for the “syndicated” research, meaning the regular surveying program we do or surveys that we subscribe to, for example the JD Power and Associates syndicated hotel guest satisfaction studies. A colleague in our department handles the custom research including more ad hoc research, conducting focus group research and such.
How did you get into this?
I went to school for hotel & restaurant administration. First, I worked at the O'Hare Hilton, and then moved on to a number of different Hilton hotels in the Los Angeles area handling food and beverage event services. I then went to the Hilton corporate office in Beverly Hills as regional food & beverage director. I was in operations for the first half of my career and then got into brand management.
For the Hilton full service brand I started doing performance management to track how well the Hilton brand was doing. That got me involved in their Balanced Scorecard and all the performance metrics -- financial, revenue, customer service, quality assurance. I did this for 6 to 7 years. I then started to focus on the customer service levels information, which led to the surveying program and my current job. I started here in early 2007.
Tell me about the Hilton surveying program.
We do transactional surveys after a guest's hotel stay has ended. We conduct surveys for all 10 of our brands at close to 3,000 hotels worldwide. This past year we had over 1.5 million completed surveys, administered in 25 languages across 78 countries.
We partner with Medallia, Inc. to conduct the surveys. Most of the surveys are completed online using a web form, but we supplement this with paper surveys sent by postal mail for brands where we don't have email addresses, for example, smaller hotels like Hampton and Homewood Suites, which tend to have more pop-in stays in smaller markets. If the guest hasn't made a reservation, then we might not have an email address.
Usually we send out the survey 24 to 48 hours after the stay. We're getting better and better with the timeliness. It was 72 hours a year ago, and then we got it back to 48 hours. Now we send the records [about guests who have completed their stay] to Medallia each night. We have a goal of cutting it back to 24 hours.
The majority of the responses we receive are within 24 hours after sending the email invitations out. Therefore, in most cases we are getting feedback on guest stays within 48 to 72 hours after they have checked out. So there is a higher recollection of the satisfaction with their stay than there would be with a stay 2 to 4 weeks prior.
One of the great things about having 10 different brands is that it allows for experimentation. Each brand has its own survey, so we get to learn about surveying practices from these comparisons.
So what have you learned?
A key goal has been to get our response rate up. We've looked at trends over the years, and the response rates and abandonment rates have everything to do with the length of the survey. The brands with longer surveys have lower response rates and higher abandonment rates. It doesn't take a rocket scientist to figure out that the shorter the survey, the better.
The response rate is now about 30% for all online surveys, and the abandonment rate is about 6%, down from 7% last year. We've done a better job of shortening the length of the survey. They were on average about 12 minutes, and we've gotten them down to 10 minutes. We need to do better and get them down to 7 minutes.
How have you shortened the time it takes someone to complete the survey?
One of the things we've done this year is to use branching to shorten the survey. For example, we ask, “Did you use the restaurant?” and only if you check off “yes” do we ask you to rate it. That was a change we made in 2007. So, that helped us cut down the time of the survey, and gave us better data.
So, how sensitive is the length of the survey questionnaire to the response rate?
The longest survey is for the Hilton full service brand. The mean time to complete the survey is 11.3 minutes. 16% [of respondents] take more than 15 minutes to complete [the survey]. For this brand the response rate is 27.5%.
The shortest survey is for the Waldorf Astoria Collection. The mean time to complete the survey is 4.8 minutes. 5% take more than 15 minutes, probably due to open-ended comments, while 85% take less than 7 minutes. Our response rate here is 32%.
I think the length of the surveys impacts future response rates. There is a lag. Initially, customers do respond and take the survey because they don't know the length, but they will remember if it's a really long survey and not respond to a future request. The Conrad brand has the highest response rate at 36%, but this is the first year we've conducted surveys for that brand. The Hilton brand is the lowest at 27.5%.
How do you decide who gets a survey invitation? Or do you send an invitation out to every guest?
Our goal is to get a certain number of responses for each hotel each month. For Hilton brands it's 75 completed surveys, 45 for Hampton, and 35 for Homewood. We do random sampling to get our invitation list, and we target the same mix of Hilton Honors members versus non-members.
One of the controls we have in place is to not over-survey a customer. If they stay within a particular brand, we will not survey them more than once every 30 days. If they stay within the family of brands, we won't survey them more than once every 90 days. We have several other rules in place as well; for example, we also maintain an opt-out list.
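The suppression rules Stephen describes can be sketched as a simple eligibility check. This is a hypothetical illustration; the field names and guest record shape are my assumptions, not Hilton's actual system.

```python
from datetime import date, timedelta

# Hypothetical eligibility check based on the rules described:
# at most one survey per brand every 30 days, at most one per brand
# family every 90 days, and opted-out guests are never invited.
BRAND_WINDOW = timedelta(days=30)
FAMILY_WINDOW = timedelta(days=90)

def eligible(guest, brand, today):
    if guest.get("opted_out"):
        return False
    last_brand = guest["last_surveyed_by_brand"].get(brand)
    if last_brand is not None and today - last_brand < BRAND_WINDOW:
        return False
    last_family = guest.get("last_surveyed_family")
    if last_family is not None and today - last_family < FAMILY_WINDOW:
        return False
    return True

# Example guest record (assumed structure):
guest = {
    "opted_out": False,
    "last_surveyed_by_brand": {"Hampton": date(2008, 5, 1)},
    "last_surveyed_family": date(2008, 5, 1),
}
print(eligible(guest, "Hampton", date(2008, 5, 15)))  # within 30 days -> False
print(eligible(guest, "Hilton", date(2008, 8, 15)))   # past 90 days  -> True
```

A random sample would then be drawn only from guests who pass this check for the month's target counts.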
Do you offer any incentive to get people to complete the survey?
No, at this time we don't offer any incentive.
Why do people fill out the survey?
Good question. We do get a pretty good distribution of responses from positive to negative. So I can't say we only get those who complain or compliment. I think customers just want their voices to be heard.
How about Hilton Honors Members? Is their response rate higher?
There's no real difference in response rate. However, Honors members tend to be tougher graders as well as business travelers.
How is the survey instrument structured?
We use a 10-point scale now. We switched to that 4 years ago from a 7-point scale. We were trying to align with the JD Power survey, which uses a 10-point scale. The endpoint anchors are Extremely Satisfied to Extremely Dissatisfied. And of course, we use the branching I mentioned earlier.
Now let's get to the interesting part since this is what got me connected to you. What do you do with the data?
The great thing about Medallia reporting is that it is live data, updated on a nightly basis. So, a hotel could see how the front desk is being rated on, say, friendliness today versus yesterday. If there's a change, they can take quick action to address the issue.
Will Maloney, the General Manager at the Pearl River Hilton, wrote to me that “each morning over my coffee I review the surveys from the day before and write a response to every customer survey our hotel receives, whether it's positive, negative, or somewhere in between.” I guess he wasn't joking!
No, he wasn't. Our goal is to get the hotel managers to be responsive and to fix problems quickly. By getting them this feedback faster, we're giving them the tools to do this. That's part of the reason why we're pushing to get the surveys out faster after the hotel stay ends. It also shows our customers that their feedback is important to us and we really do value their input.
At an aggregate level, how do you report the data?
One of the changes we made this year to our reporting was how we report our “Top Box” scores -- on a 10-point scale -- for our overall loyalty calculation. We used to report the percentage of responses in the top 3 boxes [that is, 8s, 9s, and 10s]. Now we report only the top 2 boxes [that is, 9s and 10s]. We felt this was a better definition.
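The top-box calculation itself is simple arithmetic: the share of responses at or above a cutoff on the 10-point scale. A minimal sketch, with made-up scores for illustration:

```python
# Top-box percentage on a 10-point scale: the share of responses at or
# above a cutoff. Hilton moved from top-3 boxes (8-10) to top-2 (9-10).
def top_box_pct(scores, threshold):
    return 100.0 * sum(1 for s in scores if s >= threshold) / len(scores)

scores = [10, 9, 8, 8, 7, 10, 9, 6, 8, 10]  # hypothetical responses
print(top_box_pct(scores, 8))  # top-3 box: 80.0
print(top_box_pct(scores, 9))  # top-2 box: 50.0
```

As the example shows, tightening the definition from top-3 to top-2 boxes lowers the reported score, which raises the bar for what counts as a loyal guest.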
The data are used in the Balanced Scorecard process both at the brand level and at the hotel level as part of their bonus plans. What gets measured gets results. These bonuses go deep into the staff. The hotels will have different incentives for each department, for example, front desk staff and housekeeping, and for the hotel as a whole by week, by month, etc. The bonuses then roll up for the region level and the brand level. It's a very complex bonus structure.
Sounds like a Six Sigma program with some influence from the Net Promoter Score® concept.
We don't do Six Sigma formally, but we do have hotel performance staff that work with groups in the hotel to improve performance. We don't use the Net Promoter Score® directly either. One of the problems I see with the Net Promoter Score® is that it reports just that one calculated number. You could get the same number with different combinations of promoters and detractors, but the combinations would really tell different stories.
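Stephen's objection can be shown with a quick worked example. The Net Promoter Score® is the percentage of promoters minus the percentage of detractors, so very different response mixes collapse to the same single number (the counts below are hypothetical):

```python
# Net Promoter Score: % promoters minus % detractors.
# Different mixes of promoters, passives, and detractors can
# produce the same score, masking very different customer bases.
def nps(promoters, passives, detractors):
    total = promoters + passives + detractors
    return 100.0 * (promoters - detractors) / total

# 40/40/20 and 60/0/40 both score 20, yet the second base is far
# more polarized, with twice as many detractors.
print(nps(40, 40, 20))  # 20.0
print(nps(60, 0, 40))   # 20.0
```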
You mentioned you survey worldwide. Is that new?
We just launched internationally in January this year, and cultural issues are presenting new challenges. This is what we're struggling with now. In Asia, guests rate much more harshly than Americans or Europeans. The Japanese are most definitely the harshest. However, not only do the Japanese not give 10s, they also give few 1s. So their scores tend to cluster more in the middle. Surprisingly, the response rates are still near 30% in Asia.
The Middle East presents other challenges. In the Middle East, guests just don't want to give out their email addresses. So we're having trouble getting enough of a sample, though the response rate is about the same.
We're still learning.
So, how is it being the “Survey Guy” in your company?
There is some disdain for the person who leads the survey program. Hotel managers [who don't like their scores] will say, “You're talking with the wrong people…” When we first launched, the biggest misimpression was that a larger sample would raise scores: “If you got me more surveys, my scores would go up.” Of course, that's not likely to be true once we get past a certain number of responses.
Do you have any recommendations for others in a similar situation?
It's a constant learning process, especially with the international piece now.
Thanks, Stephen, for your time and your insight.
-- Fred Van Bennekom, Dr.B.A., Principal, Great Brook
Please contact us with feedback on this article or other topics that you might find interesting.
Add your name to our Service Insights newsletter which highlights newly published articles.