Connected cars present an attractive revenue opportunity for service providers and advertisers, particularly through in-car payments, which are currently gaining traction in the North American market. But this level of opportunity also makes it highly likely that some service providers will employ techniques that exploit user assumptions to subscribe them to services they do not want or need.
Some of these techniques, currently seen in more established, non-automotive platforms, are known as ‘Dark Patterns’.
What are Dark Patterns?
Dark Patterns are deceptive UX/UI interactions designed to exploit human psychology in order to get users to do things they don’t actually want to do. Dark patterns aren’t a new phenomenon, but they are likely to be found more frequently as our connected, digital lives expand and infiltrate more of our everyday devices, including our vehicles. While not directly illegal (though some ways of implementing them certainly can be), dark patterns reside in a space of questionable morality and are certainly damaging to the user’s experience; rarely do they offer any perceived or tangible benefit to the end user. If the term “dark pattern” is not familiar, the strategies grouped under this umbrella almost certainly will be. Once a user recognizes these techniques, their obvious use can have an extremely negative effect on that user’s view of the brand. Here are some dark pattern techniques, with fictional examples (you may recognize some):
"Sneaking"
This technique targets the way we perceive and prioritize the information shown to us, ‘sneaking in’ additional commitments or information that, if noticed, the user would likely object to.
Common example: A pre-ticked box consenting to the use of personal information for marketing during a sign-up process, or additional services quietly added to a final bill.
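To make the mechanics concrete, here is a minimal sketch in browser-style TypeScript; the form fields, labels, and styling are all invented for illustration and do not come from any real service:

```typescript
// Illustrative sketch only: a hypothetical sign-up form that 'sneaks in'
// a pre-ticked marketing consent. All names and labels are invented.

function buildSignUpForm(): HTMLFormElement {
  const form = document.createElement("form");

  const email = document.createElement("input");
  email.type = "email";
  email.placeholder = "Email address";
  form.append(email);

  // The sneak: consent is opt-OUT, pre-checked, and visually buried.
  const consent = document.createElement("input");
  consent.type = "checkbox";
  consent.checked = true; // defaults to 'yes' without an explicit user choice
  const label = document.createElement("label");
  label.style.fontSize = "0.7em";
  label.style.color = "#999"; // low-contrast text discourages reading
  label.append(consent, " Share my data with selected partners");

  const submit = document.createElement("button");
  submit.textContent = "Create account";

  // The consent row is placed *after* the primary call to action,
  // where few users will look before submitting.
  form.append(submit, label);
  return form;
}
```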
"Misdirection"
Misdirection is a visual manipulation technique that steers the user toward or away from certain choices. It often uses color, or more prominent design, to highlight the option the user would likely find undesirable, while deprioritizing the selection the user most likely wants. Although a form of choice manipulation, in most cases all of the necessary controls remain available – the user is not trapped.
Common example: Ending a subscription – the option to end it is colored or presented so that it appears to be the opposite of the choice the user is looking for.
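As a purely illustrative sketch (again browser-style TypeScript, with invented labels and colors), the entire technique can live in a few style properties and a carefully negative phrasing:

```typescript
// Illustrative sketch only: a hypothetical cancellation dialog that uses
// visual weight and wording to steer the user away from cancelling.

function buildCancelDialog(): HTMLDivElement {
  const dialog = document.createElement("div");

  const heading = document.createElement("p");
  heading.textContent = "Are you sure you want to lose your benefits?";
  dialog.append(heading);

  // The choice the provider wants: large, colorful, reads like 'confirm'.
  const keep = document.createElement("button");
  keep.textContent = "Keep my benefits";
  keep.style.background = "#2e7d32";
  keep.style.color = "#fff";
  keep.style.padding = "12px 24px";

  // The choice the user came for: small, grey, and phrased as a negative,
  // so it reads like the 'back out' option rather than the goal.
  const cancel = document.createElement("button");
  cancel.textContent = "No, I don't want my benefits";
  cancel.style.background = "none";
  cancel.style.border = "none";
  cancel.style.color = "#aaa";
  cancel.style.fontSize = "0.75em";

  dialog.append(keep, cancel);
  return dialog;
}
```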
"Obstruction"
Also known to UX experts as ‘the Roach Motel’ (named after the popular North American insect trap), this technique gives users a very easy, often streamlined process for enrolling in a service or subscription, but a difficult, complicated, or confusing exit/cancellation process, in the hope that the user simply gives up trying.
Common example: Deleting a social media account, or confusing privacy opt-outs.
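The asymmetry is easy to express as a sketch; the step names below are invented, but the shape (one tap in, a gauntlet out) is the pattern itself:

```typescript
// Illustrative sketch only: the asymmetry at the heart of the 'Roach Motel'.
// Enrollment takes one step; cancellation is a gauntlet. Step names invented.

type Step = string;

const enrollFlow: Step[] = ["tapSubscribe"]; // one tap and the user is in

const cancelFlow: Step[] = [
  "openSettings",
  "findBuriedAccountPage",        // not linked from the main menu
  "reenterPassword",              // friction: forced re-authentication
  "sitThroughRetentionOffer",     // "Wait! Here's 20% off..."
  "explainWhyYouAreLeaving",      // mandatory exit survey
  "callSupportDuringOfficeHours", // the final step leaves the platform entirely
];

console.log(`Steps to enroll: ${enrollFlow.length}`); // 1
console.log(`Steps to cancel: ${cancelFlow.length}`); // 6
```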
"Forced Action"
A technique whereby the target outcome is obstructed by the need to carry out an additional (sometimes seemingly unrelated) process. This is often also known as “Forced Enrollment”, as the forced action frequently requires the user to add a payment method before they can achieve their intended goal.
Common example: Creating an “Account”/”Profile” before a free-to-use feature or service can be accessed, or adding payment methods so a free trial can auto-roll into a paid subscription.
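Here is a hypothetical sketch of the gating logic; the function, type, and messages are all invented for illustration:

```typescript
// Illustrative sketch only: a 'free' feature gated behind account creation
// and a stored payment method, so a trial can later auto-convert to paid.

interface User {
  hasAccount: boolean;
  hasPaymentMethod: boolean;
}

function startFreeTrial(user: User): string {
  // The forced actions: neither is technically required to deliver the
  // free feature, but both are demanded before access is granted.
  if (!user.hasAccount) {
    return "Redirect: create an account to continue";
  }
  if (!user.hasPaymentMethod) {
    return "Redirect: add a payment card to start your FREE trial";
  }
  // Only now does the user get what they originally asked for,
  // with an auto-renewing subscription quietly armed behind it.
  return "Feature unlocked (trial auto-converts to paid in 30 days)";
}

console.log(startFreeTrial({ hasAccount: false, hasPaymentMethod: false }));
```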
"The continued rapid increase of connectivity and functionality within vehicles is absolutely certain" says Adam Jefferson, User Experience Consultant at SBD Automotive. "so, the issue facing OEMs is not only that they risk alienating their clients by implementing these techniques, but that the third-party services they add to their vehicles may already be doing this." The emerging challenges for manufacturers over the next few years will be not which features to include in the in-vehicle user experience, but which features and functions to avoid.