It’s been a while since I have written.
Actually, that isn’t quite true. It’s been a while since I have published.
Sometimes, my work takes me down paths that are particularly hard for me to explain to others. The reasons may not be obvious at first glance, but almost always, the findings are but a small part of an overall architecture.
Personality is one of those areas. The beginning of a much larger cognitive process.
On a psychological level, personalities are vital to creating relationships. They are varied and beautiful and magical. They are one of our greatest complexities.
On an engineering level (while considering the psychological level), creating a personality solution is, to be honest, quite difficult. Not because we don’t understand what the outcome should be, but because a personality, and all the magic that results, must “fit” into an architecture.
Personality brings in some seriously tough challenges: fluid, flexible structures must be created to support changes in logical and emotional direction that can, and will, arise at any point in a conversation.
There are a couple of parts to the equation, and they get progressively more complex once you take memory, the concept of time, and emotional variation into account.
At its simplest view (considering an input from a person and an output from the AI host), we start with the design of a personality.
We then take the input from a user and analyse it for emotions. We push those two things, along with a context and a collection of emotion-based outcomes, into the engine, and based on the personality models, it figures out whether the response should be angry, sad, happy, and so on.
This results in a meaningful and fluid conversational exchange that is emotionally based.
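To make the flow above concrete, here is a minimal sketch of that pipeline in Python. Everything here is hypothetical: the keyword-based emotion detector, the `Personality` mapping, and the function names are illustrative stand-ins, not the actual engine's internals.

```python
from dataclasses import dataclass, field

# Toy stand-in for a real emotion classifier: map keywords to emotions.
EMOTION_KEYWORDS = {
    "angry": {"furious", "annoyed", "unacceptable"},
    "sad": {"miss", "lost", "unhappy"},
    "happy": {"great", "thanks", "wonderful"},
}


def detect_emotion(text: str) -> str:
    """Analyse the user's input for emotion (crudely, via keywords)."""
    words = set(text.lower().split())
    for emotion, keywords in EMOTION_KEYWORDS.items():
        if words & keywords:
            return emotion
    return "neutral"


@dataclass
class Personality:
    """A designed personality: how the host responds to each detected emotion."""
    name: str
    response_map: dict = field(default_factory=dict)


def choose_tone(personality: Personality, user_input: str) -> str:
    """Push the detected emotion through the personality model to pick a tone."""
    emotion = detect_emotion(user_input)
    return personality.response_map.get(emotion, "neutral")


# A caring, sympathetic personality, e.g. for mental-health or pet-care support.
caring = Personality(
    name="caring",
    response_map={"angry": "calm", "sad": "sympathetic", "happy": "warm"},
)

print(choose_tone(caring, "I miss my dog so much"))  # sympathetic
```

A real engine would replace the keyword lookup with a proper emotion model and fold in conversational context and memory, but the shape is the same: detect, combine with the personality model, and select an emotionally appropriate response.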
The interesting thing is that when done correctly, we as humans start to forget we are conversing with a synthetic intelligence. We form the very beginnings of relationship bonds and that means we imbue more emotional information into our contribution to the conversation. It builds on both sides from there.
What are the applications for personality?
Having a personality means an AI host can respond better in various applications, such as health and wellbeing, or, commercially, represent a brand in customer service and sales-based tasks.
For example, a caring and more sympathetic host could be deployed in industries that deal with emotionally charged issues, such as mental health or pets.
Or a happy and excited AI host could work better in hospitality and tourism when helping plan that next amazing holiday.
Think about the volatility in customer service we all see from day to day. One experience can be great, another really bad. Both could be from the same company, just different people having good and bad days respectively.
For the most part, people want to be listened to. They want to communicate with someone that understands their problems, and connects on an emotional level.
Personality in an AI host can deliver consistent, yet variable experiences. And with the right amount of empathy, sympathy and charm, most people engage with a deeper level of trust.
When can you create these personalities?
One of the other reasons I haven’t published for a year is that I have been building an Editor for my engine. That desktop software will be released very soon.
Creating bespoke and custom personalities will be one of our key selling points in the first version.
It’s great fun to play around and experience the variations in the conversations that occur.
As I mentioned at the start of this post, personality leads to a much larger set of cognitive processes, which I will unveil in due time.
For now, it’s a start: allowing others to create generic personalities, and for those personalities to be completely decoupled from the conversational blueprints that use them.