The Psychological Paradox of Airline Artificial Intelligence (AI)
"Unlike other technologies before, AI impacts the human psyche of staff differently"
There are many jokes about psychologists, as in psychotherapists. Did you know anybody can be a therapist? You just need to repeat the last two words as a question.
The patient says: “Last night, I had a bad and disturbing dream.” “Disturbing dream?” the therapist asks.
And so it goes on. All kidding aside, humor was historically framed negatively by psychologists, who perceived it as a defense mechanism, a mask, or a reflection of assumed superiority. Luckily – as psychology is still one of the youngest sciences – it has evolved.
One more lame joke: How many psychotherapists does it take to change a lightbulb? Well, how many do you think it takes?
Origins of Psychology and AI
Psychology is a field that has evolved rapidly of late, predominantly due to globalization and the internet.
Nonetheless, with the first laboratory founded in 1879 in Germany, research quickly went from studying dreams (1900) to behaviorism, and on to cognitive psychology, the scientific study of mental processes, problem-solving, and memory, by 1956.
Around the same time, the origins of artificial intelligence began. And of course, it is no coincidence. Technologists and psychologists started to work on using machines to mimic how people solve problems.
AI has had one big ‘winter’ (when credibility, funding, and enthusiasm fall) before. This happened from the 1970s to the 1990s, and some in my field say we are hitting another one. The first winter happened because machines simply did not have enough memory capacity.
"Psychologists are increasingly involved and brought into airlines. Specifically, to help with adoption of artificial intelligence (AI). In their work to update change management practices, experienced psychologists stumble upon new challenges."
Today, the concerns relate more to ethics (bias and DEI) and to fears created by unfounded talk about sentient machines that run amok, or that can think, perceive, and feel like humans and thus have emotions.
But that is not true. AI goals are defined by humans, and Ethical AI is a practice that even regulators are looking at.
Also, there are many successful AI applications in flight operations where AI is helping to optimize routes (Eurocontrol) and even reduce fuel burn in-flight (offered by companies like OpenAirlines). Amadeus’ Sky Suite also uses AI in network optimization and airport slot planning.
But AI is also helping to accelerate recruitment and the review of CV content in HR, as well as staff planning in airline maintenance and overhaul.
Increasingly, machine learning and AI are applied to obtain travel-intention and audience-segmentation insights that can be fed into updated demand-forecasting functions to enable better revenue management.
There are many more applications on the way (some that cannot be shared publicly, yet), but none of them include goals of feeling emotions like humans or disrupting society. That is simply not the problem-solving role of AI.
Yet, psychologists are increasingly involved and brought into airlines. Specifically, to help with adoption of artificial intelligence (AI). In their work to update change management practices, experienced psychologists stumble upon new challenges.
This first article explains why, and how successful bridges are built to modernize organizations made up of people who interact with machines. It is also the reason I recently completed another degree in applied organizational psychology, concentrating on AI in airline management.
Current State of AI in Aviation
Recently, an airline in the USA developed the perfect AI model to detect payment fraud. It was ready to be deployed but failed, like the 55% of AI projects that stall in the industry.
Yet, the typical promises were all there. Great value, efficiencies, fewer boring tasks, and more time to think strategically and experiment.
The training sessions were well attended; everybody said they worked.
Often, the assumption airlines make is that an AI tool that is cool, cutting-edge, and delivers value will motivate people to adopt it.
But this only worked in conversational chatbots and automation in call centers, such as DigitalGenius being deployed at KLM.
Aircraft and jet engines are also full of hidden AI; the Boeing 737 MAX demonstrated what happens when we do not explain it well to pilots, with disastrous results.
Yet AI is everywhere; consumers encounter it in all sorts of apps, and they love it. At work, it is different.
The Adoptability Issue
How come the tried-and-tested change management principles that worked so well before failed? It is for one specific reason.
Unlike other technologies before, AI impacts the human psyche of staff differently. While other technologies impact the work itself, AI impacts and triggers emotions related to personal identity.
More specifically, it raises questions in people about how well they do their jobs, whether they work fast enough, whether their experience is still good enough, and whether they are good at solving problems to begin with.
It often leads people to doubt their skills or creates feelings of incompetence.
"Unlike other technologies before, AI impacts the human psyche of staff differently. While other technologies impact the work itself, AI impacts and triggers emotions related to personal identity."
And thus it triggers fears of becoming redundant. AI itself cannot reason about or communicate this with people, not even a conversational AI bot of the kind already proposed for automated therapy sessions in psychotherapy.
But these things matter, because in an era of IoT and connected devices, more decisions must be made with higher volumes of data, in real time, to interact with customers.
There is no escaping the automation of some tasks. Staff cannot cope otherwise, and customers won’t wait.
How AI Should Be Approached
So, what do applied psychologists do in AI application development and deployment?
Psychologists start by looking at the 7 emotional systems and how they are triggered when intelligent work requiring experience is (partly) done by machines. They are seeking (exploring), fear, rage (anger), lust (pleasure), care, grief (panic), and play (fun).
These emotional systems trigger hormonal changes, and we should enable positive ones and avoid triggering negative emotions. That’s caring about people and colleagues.
The primary ingredient in AI is people. The glue in the approach is the combination of actions that (1) positively balance the emotional systems, (2) meet psychological needs, and (3) recognize personality traits.
Applied psychologists on AI projects remove fears, enable play, and instill curiosity.
They also boost feelings of autonomy, safety, and achievement around what the AI-powered tools do to lift people.
But they recognize that the approach will be a mix of tactics depending on personality types. Much like leadership, that is an art that works best with authenticity.
Psychologists also help create the future role of staff before the AI tool is deployed. Staff further have a role in designing how they want to interact with the decision intelligence tool.
This is also the focus of hybrid intelligence, which makes the best of human-AI interaction.
You may ask, why does this matter? The answer lies in the fact that ‘local’ AI applications within departments will become part of a layer above individual departments, creating automated workflows across them.
"Applied psychologists on AI projects remove fears, enable play, and instill curiosity. They also boost feelings of autonomy, safety, and achievement around what the AI-powered tools do to lift people."
This is necessary to service end customers better, and in real time. That is the Enterprise AI level. Applied psychologists are preparing for this level of facilitation, too.
In the meantime, as an overall tactic, AI can be used to upskill people in their work environment; it helps them feel more appreciated by doing more strategic work, toward measurable objectives they can be proud of.
It is therefore important that airline staff have a hands-on role in improving the quality and effectiveness of AI models, supported by visual “glass box” technology such as Explainable AI.
In practice, they can follow these steps that often work well:
Define the goal of the AI model (application) in human terms (how it will help them feel better about their role and tasks)
Develop the model around people (where their unique skills will be required)
Ensure that the model’s results show the people’s contribution (demonstrate how their input has added value)
Make sure that the interfaces of the applications are intuitive, easy to use, and visually appealing, so that people feel the tool is an extension of how they logically think and work.
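To make the “glass box” idea behind these steps concrete, here is a minimal sketch in Python. The feature names and weights are hypothetical, chosen purely for illustration: a simple fraud-scoring model whose output is broken down into per-feature contributions, so staff can see exactly why a payment was flagged, question the weights, and demonstrate their own input.

```python
# Minimal "glass box" sketch: a linear scoring model whose output is
# decomposed into per-feature contributions, so an analyst can see
# why a payment was flagged as potentially fraudulent.
# Feature names and weights are hypothetical, for illustration only.

from typing import Dict, Tuple

# Weights an analyst could inspect, challenge, and refine over time.
WEIGHTS: Dict[str, float] = {
    "amount_vs_typical": 0.5,   # how far the amount deviates from the norm
    "new_payment_method": 0.3,  # first use of this card or payment method
    "route_mismatch": 0.2,      # billing country differs from route origin
}

def score_payment(features: Dict[str, float]) -> Tuple[float, Dict[str, float]]:
    """Return the overall fraud score and each feature's contribution."""
    contributions = {
        name: WEIGHTS[name] * features.get(name, 0.0) for name in WEIGHTS
    }
    return sum(contributions.values()), contributions

score, explanation = score_payment(
    {"amount_vs_typical": 0.9, "new_payment_method": 1.0, "route_mismatch": 0.0}
)
print(round(score, 2))  # prints 0.75
for name, part in explanation.items():
    # Each line shows one feature's share of the score.
    print(f"{name}: {part:+.2f}")
```

The design point is the second return value: instead of a bare score from a black box, people get a breakdown they can verify against their own experience, which is exactly the contribution-visibility that step (3) above calls for.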
There are many additional things applied psychologists do in helping airline organizations make the best use of AI-powered automation. In the next article, we will look at a few more practical cases.
Feel free to write to me for comments or reach out on LinkedIn:
ricardo DOT pilon AT millavia DOT com