Technology intervenes in our lives and is almost never a neutral object. Even with classic information systems, technology alters the human mind, influencing whether we hold a particular set of beliefs and, consequently, whether we follow or reject a certain course of action.
To illustrate, the decision to read a certain book is influenced by how easy the book is to get hold of, whether it is likely to be useful, and whether others found it worthwhile and recommend it. When shopping for the same book from an online bookstore, perceptions of ease, usefulness, and social norms can all be shaped deliberately: using persuasive design principles, technology can steer buyers towards certain actions.
An example is the use of the social proof principle for persuasion. When browsing a website or using an app, a statement such as “100 people bought the book in the last hour” signals the book’s popularity and makes buying it appear a sensible option. The principle reflects the fact that we are social creatures, often influenced, consciously or unconsciously, by what others believe to be the right behaviour. We are also influenced by the principle of scarcity and our tendency to seek resources we perceive to be rare. A statement like “only two items left”, for example, activates the desire to possess the resource before it becomes unavailable.
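As a minimal sketch of how a storefront might generate such cues (the function names and thresholds here are invented for illustration, not taken from any real system), the messages can be derived from simple live counters:

```python
def social_proof_message(purchases_last_hour: int) -> str | None:
    """Popularity cue, shown only when the number is persuasive."""
    if purchases_last_hour >= 10:
        return f"{purchases_last_hour} people bought the book in the last hour"
    return None

def scarcity_message(stock_remaining: int) -> str | None:
    """Scarcity cue, shown only when stock is genuinely low."""
    if 0 < stock_remaining <= 3:
        return f"Only {stock_remaining} items left"
    return None

for message in (social_proof_message(100), scarcity_message(2)):
    if message:
        print(message)
```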
Interventions and persuasive messages are all around us in our physical environment. We see labels on cigarette packs warning about the dangers of smoking and labels on sugary drinks warning of their risks to our health. Such messages can become much more powerful when facilitated by technology – persuasive technology – and grounded in data. Messages and interventions can then be issued dynamically and intelligently, personalised and reflecting real-time behaviour.
Imagine an intelligent coffee cup that senses how many times a person uses it, or an intelligent chair that senses how long a person sits. Such objects can gather behavioural data continuously and so become better able to tailor recommendations accordingly.
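A minimal sketch of the idea, assuming a hypothetical SmartCup class (no real product’s API is implied): the object logs each use with a timestamp and tailors its advice to the day’s running count.

```python
from datetime import date, datetime

class SmartCup:
    """Hypothetical connected cup that logs each use and tailors advice."""
    def __init__(self, daily_limit: int = 3):
        self.daily_limit = daily_limit
        self.uses: list[datetime] = []

    def record_use(self) -> str:
        self.uses.append(datetime.now())
        today = sum(1 for t in self.uses if t.date() == date.today())
        if today > self.daily_limit:
            return f"This is cup number {today} today - consider water instead?"
        return f"Enjoy! Cup {today} of {self.daily_limit} today."

cup = SmartCup()
for _ in range(4):
    print(cup.record_use())
```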
Messages of persuasion need not be explicit and may also exploit our unconscious biases. For example, creating friction can prompt a person to pause and reconsider the alternatives. When someone tries to prepare another cup of coffee, the machine might ask them to confirm the choice by entering a code sent to their mobile, or show a reminder such as “this will be your fourth cup today”.
Similarly, it can throttle mindless activity by giving you 10 seconds to reverse the decision after pressing the ‘prepare’ button. Other subtle examples include making healthier options appear first: in a vending machine, water or fruit juice may feature at the top of the list instead of coffee or energy drinks.
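The friction and throttling mechanics just described are easy to express in code. The sketch below is purely illustrative: the printed confirmation code stands in for an SMS, and the 10-second window and menu ordering are taken from the examples above, not from any real machine’s firmware.

```python
import random
import time

def confirm_with_code() -> bool:
    """Friction step: require re-entry of a code (printed here in place of an SMS)."""
    code = f"{random.randint(0, 9999):04d}"
    print(f"Confirmation code (imagine this arriving on your mobile): {code}")
    return input("Enter the code to continue: ").strip() == code

def prepare_with_undo_window(seconds: int = 10) -> None:
    """Throttling step: leave a window in which the decision can be reversed."""
    print(f"Preparing coffee in {seconds}s - press Ctrl+C to cancel.")
    try:
        time.sleep(seconds)
    except KeyboardInterrupt:
        print("Cancelled - no coffee prepared.")
        return
    print("Coffee prepared.")

# Subtle ordering nudge: healthier options listed first.
menu = ["water", "fruit juice", "coffee", "energy drink"]
print("Menu:", ", ".join(menu))
if confirm_with_code():
    prepare_with_undo_window()
```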
While it may appear easy and cost-efficient to use persuasive technology to change behaviour, it is often risky to do so without proper analysis and testing. It can create pressure and stress, and even lead to adverse effects. For example, persuasion in an e-learning environment might reward students for answering quickly and for the number of correct answers, using mechanisms such as badges, progress bars, points, leaderboards, counters, and timers. However, by doing so, we may encourage workarounds and cheating, since the approach can be extra stressful for some students, who may feel they are under an uncomfortable level of scrutiny.
Measuring performance by correct and speedy answers and translating it into points can also be overly simplistic for creative activities. Instead of focusing on quality, students may become more interested in satisfying the algorithm. Indeed, Goodhart’s law states that when a measure becomes a target, it ceases to be a good measure. Additionally, such rewards may replace the intrinsic motivation for a behaviour, i.e. genuine interest and joy, with extrinsic motivation, i.e. winning the reward or gaining recognition from others.
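To make the risk concrete, here is a deliberately naive scoring rule of the kind the passage warns about (the point values and the 30-second threshold are invented for illustration). Because the metric sees only correctness and speed, the surest way to maximise it is to rush:

```python
def score_answer(correct: bool, seconds_taken: float) -> int:
    """Naive metric: points for being right, bonus for being fast.

    Once this score becomes the target, the quickest way to earn points
    is to rush - quality and creativity are invisible to the metric.
    """
    if not correct:
        return 0
    points = 10
    if seconds_taken < 30:
        points += 5  # speed bonus invites gaming
    return points

# A careful, thoughtful answer scores less than a quick guess that happens to be right.
print(score_answer(correct=True, seconds_taken=120))  # 10
print(score_answer(correct=True, seconds_taken=5))    # 15
```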
Sound responses to these and other tensions were discussed during the 16th International Conference on Persuasive Technologies in April (PT2021), hosted virtually by Bournemouth University, UK, and co-sponsored by Hamad Bin Khalifa University (HBKU). 
Speakers agreed that the design, development, and evaluation of these persuasive techniques – including those used in social media apps, games, e-learning websites, and persuasive AI – need an interdisciplinary approach: an understanding of behaviour and persuasion theories together with software and technology design methods.
Dr Raian Ali, professor, and Dr Dena al-Thani, assistant professor, in information and computing technology at the College of Science and Engineering, HBKU, were on the chairing committee of the 16th International Conference on Persuasive Technologies.