as a vehicle with the same technical features. Participants reported trusting it more, were more relaxed in a (simulated) accident, and blamed their vehicle less for an accident caused by another car. Furthermore, research suggests that the more humanlike a robot is perceived to be, the less aggressive people tend to be toward it, so there may be some benefit to consider in those infamously frustrating customer experiences that inevitably arise. In a 2015 lab experiment conducted by Kate Darling et al., participants who were asked to strike a robot hesitated significantly more when it had been introduced with anthropomorphic framing, such as a name and a backstory. Thus, people may be more forgiving of Siri or Alexa than they would be of a nameless block of text.

2. Be transparent: Honesty is best--and control is sacred

New technology intended to personalize customer service can sometimes be met with backlash--reactance, in psychological terms. Research on targeted advertising, for example, indicates that people sometimes react negatively to messaging they perceive as unnecessarily personalized. This is where transparency is key. Tiffany Barnett White and her colleagues found that when customers were told why the video store ads they were viewing mentioned their zip code (only customers in that area were eligible for a discount), they were more likely to click on the ad. In this case and others, explicit justification can mitigate the reactance customers feel toward newly personalized experiences. When introducing a new service technology like a chatbot, try alleviating the creep factor with a transparent justification, e.g., "We are using a chatbot because it will make things faster for you." This is especially true in Maritz's context of consumer loyalty and reward sites, where trust is higher than on the average e-commerce site.

Along the same lines, it's important to remind customers that they are in control. Research on Facebook advertising finds that people are more likely to click on targeted ads after being reminded that they own their data and have the right to change their permissions at any time. In the case of a chatbot, customers should be reminded of their right to skip the bot and wait for a real person.

3. Don't overdo the automation--make the bot "show its work"

In their research on operational transparency, Ryan Buell and his colleagues uncovered an intriguing phenomenon: people like to see how the sausage is made. Among other contexts, the researchers demonstrated this with students participating in a simulated online dating site. Students filled out input fields describing their dream partner, submitted the forms, and waited for their results to load. In one condition, the dating algorithm produced results immediately. In the other, the computer "showed its work" and took significantly more time to load the identical results, indicating to the subjects that it was reviewing potential matches one step at a time, based on a sequence of criteria such as location, height, and interests. Though it took longer, the students preferred the second condition, in which they felt the system put more thoughtful effort into choosing their matches. This preference even carried over to their ratings of the matches, which, though identical in both conditions, were judged superior in the transparent condition.

The lesson here is that just because a service can be automated to produce immediate results doesn't mean it should.
Thanks to the labor illusion, it can be wise to take the customer's perspective and insert a signal like "Hold on, I'm researching that for you," so they feel you are putting in some work on their behalf. Whether they are interacting with humans or with a bot, customers value effort. Remind them of the work behind the scenes to show you care.
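To make these recommendations concrete, here is a minimal sketch of a chat handler that applies all three ideas: an upfront justification for the bot, a reminder that the customer can reach a person instead, and staged "labor illusion" progress messages. It is written in Python under assumed names; send_message, PROGRESS_STEPS, and the pacing are illustrative placeholders, not a description of any particular chat platform's API or of Maritz's actual systems.

```python
import time

# Hypothetical greeting: a transparent justification for the bot,
# plus a reminder that the customer stays in control.
GREETING = (
    "Hi! I'm an automated assistant. We're using a chatbot because it can "
    "answer most questions faster. You own your data, and you can type "
    "'agent' at any time to wait for a real person instead."
)

# Steps surfaced one at a time to create the labor illusion: narrating
# the work signals effort, as in Buell's dating-site experiment.
PROGRESS_STEPS = [
    "Hold on, I'm researching that for you...",
    "Checking your account and recent activity...",
    "Reviewing current reward offers...",
]


def send_message(text: str) -> None:
    """Stand-in for the chat platform's outbound-message call."""
    print(f"BOT: {text}")


def handle_turn(user_input: str) -> None:
    # Honor the opt-out first: control is sacred.
    if user_input.strip().lower() == "agent":
        send_message("No problem -- connecting you with a team member now.")
        return  # hand off to the human queue here
    # Show the work with brief, deliberate pauses; don't overdo the delay.
    for step in PROGRESS_STEPS:
        send_message(step)
        time.sleep(1.0)
    send_message("Here's what I found: ...")  # the real answer goes here


send_message(GREETING)
handle_turn("Where are my reward points?")
```

The design point worth noting is that the pauses are short and narrated: the goal is perceived effort, not artificial slowness.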
Charlotte Blank