This goes hand in hand with our brain’s tendency toward cognitive laziness.
Repeated interactions with chatbots trigger the construction of a new mental model that informs subsequent interactions.
This happens because bots create a false mental perception of the interaction, encouraging the user to ascribe to the bot human-like features it does not possess.
This may seem strange, but attributing human characteristics to animals, events, or even objects is a natural tendency known as anthropomorphism.
Communication with a bot is different — the gratification derives from a change of mental state, a sort of detachment: you can achieve your goal (getting help, information, even a feeling of companionship) with no immediate “cost.” No investment is required: there’s no need to be nice, to smile, to be involved, or to be emotionally considerate.
It sounds convenient — but the problem arises when we become addicted to this form of bot interaction and slowly develop a preference for “easy communication.” This can lead to secondary problems.
Power refers to the capacity to influence another’s behavior, to make demands and have those demands met (Dwyer, 2000).
A chatbot doesn’t need the emotional involvement and interpretation of nonverbal cues that humans require, which makes our interaction with it much easier. As such, businesses might be better off trying to connect with consumers in the apps where they are already spending plenty of time. And a bot can potentially provide greater convenience than apps and web searches because it can understand natural speech patterns and offer a personal touch in an otherwise impersonal user interface.

After all, computers have a memory and speak a language; they can contract viruses and act autonomously. In recent years, these personal characteristics have been increasingly emphasized in an effort to present these inanimate objects as warm and humanoid. Unconsciously, this makes users feel better about themselves and regain a sense of control over their lives.