Member Article
The Trust Paradox: How Much Trust is Enough?
Author - Freyja Lockwood
As the adoption of autonomous systems, artificial intelligence and data-driven innovation to develop ‘smart cities’ accelerates, an emerging challenge is one of trust and public acceptance. These technological advancements are beginning to solve some of our most intractable problems and are shaping nearly all aspects of our daily lives.
Polarised
Yet they are viewed as a double-edged sword. This duality is clearly reflected in the media: either the narrative is one of utopian optimism in which technology is the silver bullet, or it heralds a nightmare future of unintended consequences and societal breakdown. Neither of these polarised views is helpful, but they highlight the important role that perception plays in the development of these technologies and their ultimate impact on public acceptance. They point to some fundamental questions we should be exploring to understand how to develop ‘trusted’ systems and how to unlock the immense opportunities digital innovation could bring.
The Trust Paradox
What I want to briefly explore is the emergence of a ‘trust paradox’1 associated with new, unfamiliar technologies and its implications for emerging technology. A recent article noted that “people are both increasingly dependent on, and distrustful of, digital technology”1. Yet here is the paradox: despite our wariness we do not, for the most part, shun technology; quite the contrary, we are embracing it. To illustrate, take two voice-activated virtual assistants, Apple’s Siri and Amazon’s Alexa. Despite ongoing, high-profile debates about privacy concerns and publicity about ‘rogue’ devices, we are, for the most part, still enthusiastically adopting these life aids. We appear to be consciously choosing to ‘trust’ these devices. However, the concept of trust is a complex thing to unpick. It is inherently fluid, dependent on many contextual factors and innately personal.
An Invasion of Privacy?
Returning to the story of the rogue digital assistant, the owner had accepted the manufacturer’s claims that Alexa would not invade her privacy2. Was this trust a conscious choice? Probably not; we are more often weighing up how useful we think something is, tolerating less-than-perfect systems when the overall usefulness of the solution outweighs other considerations. This highlights how vital it is to understand what solutions people will value.
When Trust is Lost
We stop trusting and start questioning when something triggers our distrust, such as when a private conversation is recorded and then emailed to a contact without our explicit permission. Often it is only then that we consciously weigh up perceived benefits against perceived risks and revise our original choices. Once lost, trust is fiendishly difficult to regain - as the owner of the rogue device stated: “I felt invaded… I’m never plugging that device in again because I can’t trust it.”
Contributing to this scenario is a common pattern of significantly overplaying the advantages of technology whilst understating its limitations. Unfortunately, this can foster a belief that technology is infallible, making us less forgiving when the inevitable failure happens. It can be easy to overlook the importance of user-centred approaches in our race to implement new technology cost-effectively, but we risk jeopardising our goals by doing so. Ultimately the goal of digital innovation is to change people’s behaviour and encourage new habits – be that in a team, an organisation, or a city. Success is therefore not simply about resolving technical challenges, but about addressing the social aspects of technological innovation: for example, how to actively build trust through design, how to develop inclusive solutions that increase adoption, or how to address concerns around data use through transparent systems and clarity about what constitutes acceptable use.
Understanding the Impact
Fundamental to this will be a greater understanding of the public impact, as well as of what influences our willingness to engage with and trust new technologies. User-centred design, with people at the heart of the technology development process, is vital, especially in a smart city context such as Bristol (where our focus is), where generating greater social value sits alongside economic benefits. The need to consider societal acceptance, especially during the early phases of design and development whilst the technology is still ‘novel’, will require greater partnership and collaboration between all stakeholders involved in technological change - from innovators, technology providers and citizens, through to investors, regulators and policy makers.
Each brings diverse perspectives and know-how that can usefully inform the design, development and deployment of trusted digital solutions.