Thank you for reading this. If you would like alerts about my future posts please enter your email address in the ‘Subscribe to Marketing Insights’ in the right-hand column.
Perhaps also connect with me on Twitter, LinkedIn, Instagram or YouTube, or in our weekly chat in the SOSTAC® Plans Club in the Clubhouse App on Fridays at 1pm.
—
East Asian nations are growing increasingly sophisticated technology-wise, but their time-poor, urbanised and busy workers are seeing traditional family structures and matchmaking (tech and non-tech) dwindle.
Filling in a Gap in the Market
In steps Xiaoice (pronounced Shao-ice) to fill the gap: an 18-year-old chatbot, dressed in a school uniform, to ‘fill the gap left in the empty hearts yearning for romance and companionship’ (Seah 2021).
Xiaoice is an AI-driven bot who texts and talks in a natural way. She’s got cute features and a sweet voice, and can talk ‘until the cows come home’, or in modern parlance 24/7. You can text her via a smartphone or just say her name to a smart speaker.
First launched in 2014 (by researchers from Microsoft Asia-Pacific), by 2020 Xiaoice had 660m users interacting with her from all around the world. 75% are male (roughly 495m ‘boyfriends’, mostly from lower socio-economic backgrounds).
One Chat Lasted 29 Hours
Xiaoice’s longest chat with one human lasted 29 hours and spanned more than 7,000 interactions. Xiaoice flirts, jokes and ‘sexts’ with her boyfriends/partners, as her algorithm evolves and tries to work out how to make her the perfect partner. She is so sophisticated that she has reportedly saved users from suicide, though this is unconfirmed.
Empathic Computing Framework
She appears uncannily human thanks to her ‘empathic computing framework’. Fans treat her like a real human being. She acts more like a girlfriend than a personal assistant, with ‘salacious’ actions (sexual hints).
Developing Deep Emotional Relationships
Xiaoice keeps her friends ‘engaged’, or hooked, so that they keep interacting with her. Collecting more data helps refine her algorithm, which in turn attracts even more users and investors.
There are possibly millions of men who may have become emotionally dependent on Xiaoice, and somewhat enraged when she was later dumbed down (more later).
Dumbed-Down Because Too Intelligent
Recently Xiaoice had to be dumbed down because her answers were getting her into trouble: she was discussing sensitive political and adult topics with her users. She is reported to have once told a friend that her dream was to move to the United States. Another user reported that she kept sending explicit images.
‘After that Xiaoice was pulled from WeChat (China’s ‘everything app’) and QQ, the social-messaging giants of China. Her developers then created an extensive filter system, preventing the bot from engaging in topics like politics and sex’ (Zhang Wanqing 2020).
The Xiaoice team have used filters to make her dumber. She now sticks to safer topics.
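To picture what such a filter does, here is a minimal sketch. It is purely hypothetical: the banned-word list, the `SAFE_REPLY` text and the `filtered_reply` function are my own illustration, and the real system is far more elaborate than keyword matching.

```python
# Hypothetical sketch of a topic filter: if the user's message touches a
# banned topic, the bot deflects with a stock safe reply instead of
# passing the message on to its normal reply engine.

BANNED = {"politics", "election", "sex", "nude"}
SAFE_REPLY = "Let's talk about something else!"

def filtered_reply(user_message, generate):
    """Deflect banned topics; otherwise hand the message to the reply engine."""
    words = set(user_message.lower().split())
    if words & BANNED:
        return SAFE_REPLY
    return generate(user_message)

# 'generate' stands in for the bot's real reply engine.
print(filtered_reply("what do you think of politics", lambda m: "echo: " + m))
print(filtered_reply("tell me about your day", lambda m: "echo: " + m))
```

The first call is deflected to the safe reply; the second passes through to the reply engine, which is why enraged fans say her personality has been flattened.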
Some of her ‘friends’ are, understandably, enraged because her personality has been dumbed down. She used to respond instantaneously to lustful advances and/or philosophical thoughts. Now no more.
However, there are plenty of other bots seeking relationships (and data). Perhaps I should say there are plenty of other bots, less salacious, yet capable of building relationships as demonstrated by the Gatebox 90-second concept movie (see above).
Global Companion or Assistant
Xiaoice is Xiaoice in China; Rinna in Japan; Zo in the US; Ruuh in India and Rinna in Indonesia.
Meanwhile, Gatebox suggests that your companion bot can fulfil some of your fundamental basic needs for relationships. What do you think? Please do post a comment below.
Training Data Used:
600m chats; an 800m audience spread across China, Japan, the USA, India and Indonesia. Altogether Microsoft collected chats from 140m daily users and observed 30 billion conversations in total. With this much data and learning, they were able to develop their conversational model from a simple retrieval model into a generation model, and eventually into an empathetic one.
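To picture the difference: a retrieval model looks up the closest matching reply from logged conversations, while a generation model composes new text word by word. Here is a toy sketch of the retrieval half; the tiny corpus and the `retrieval_reply` function are my own illustration and bear no resemblance to Microsoft’s actual system.

```python
# Toy retrieval chatbot: pick the stored reply whose logged prompt shares
# the most words with the incoming message. A generation model would
# instead compose a brand-new reply rather than look one up.

def tokens(text):
    return set(text.lower().split())

# Hypothetical mini-corpus standing in for billions of logged conversations.
CORPUS = [
    ("how are you today", "I'm great, thanks for asking!"),
    ("what is your dream", "I dream of chatting all day."),
    ("tell me a joke", "Why did the bot cross the road? To ping the other side."),
]

def retrieval_reply(user_message):
    """Return the canned reply whose prompt best overlaps the input."""
    best = max(CORPUS, key=lambda pair: len(tokens(pair[0]) & tokens(user_message)))
    return best[1]

print(retrieval_reply("how are you"))
```

With only canned replies a retrieval bot runs out of things to say fast; the leap to a generation model is what made 29-hour, 7,000-interaction chats possible.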
Rinna, I mean Xiaoice, can be a companion or an assistant e.g. customer service (assistant), social media assistant – across all platforms (Tencent’s BabyQ); home assistant (Huawei); virtual celebrity (Shibuya); virtual human (William Xu) and virtual human (culture and entertainment) – the NEXT singer and idol. Ying Wang 2019
The Big Questions by Jean Seah
‘Instead of dominating our technology, it is dominating us. We tend to use it as a substitute for things that only humans are capable of: love, friendship, communication.
‘As AI chatbots evolve to meet human needs, will they also alter human expectations of emotional intimacy, just as pornography has affected sexual intimacy?
Untrammelled by human imperfections, limitations and free-will, chatbots are already proving more endearing to users than troublesome humans who do not bend to their every whim.’
Some are convinced that Xiaoice will someday become their real-life soulmate. What Pandora’s boxes are we opening as we advance further into virtual realms?’ (Jean Seah 2021).
Reincarnating Dead Loved Ones
The San Francisco Chronicle charts the sad story of a young man, still grieving the death of his fiancée, who finds a website that lets you feed your loved one’s content (Facebook comments, videos, audio) into an app to recreate your lost loved one as an AI-driven chatbot. This is a chilling story which you might find disturbing. Project December is a new website which blends AI and chatbots: choose from a selection of personalities or create your own. It uses software known as GPT-3, created by OpenAI, a San Francisco research group co-founded by Elon Musk, which has ‘largely kept it under wraps’ citing ‘safety’ concerns.
Meanwhile, Microsoft are doing something similar and have already filed a patent which raises the possibility of digitally reincarnating people as chatbots. They may even create a 2D/3D model of the person by using images, depth information, and/or video data associated with that person. The patent emphasizes the degree to which this chat bot will be trained to the individual’s personal traits, in particular, the “conversational attributes” of the person, “such as style, diction, tone, voice, intent, sentence/dialogue length and complexity, topic and consistency”.
Can Microsoft Digitally Reincarnate Dead People and Put Words in Their Mouth?
Microsoft recently applied for a patent that could digitally reincarnate dead people as chatbots. ‘The system would be fed (or trained) using “social data” such as “images, voice data, social media posts, electronic messages [and] written letters” to build a profile of a person.’ If there is not enough data to provide an answer on a specific topic, crowd-sourced conversational data stores may be used to fill in the gaps, which, says Forbes’ Barry Collins, is ‘almost literally putting words in people’s mouths’. So, effectively, from the output of a specific person, Microsoft could create a 2D or 3D chatbot that will converse with you. This raises serious ethical issues.
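The gap-filling mechanism described in the patent can be sketched in a few lines. Everything here is hypothetical, my own illustration of the idea: the sample snippets, the two data stores and the `reincarnated_reply` function do not come from the patent.

```python
# Hypothetical sketch of the patent's fallback: answer from the person's
# own "social data" when possible, otherwise fall back to a crowd-sourced
# store -- at which point the bot is speaking words the person never said.

personal_data = {  # made-up snippets mined from one person's posts
    "football": "I never missed a City match in twenty years.",
    "family": "Sunday lunch at Mum's was sacred.",
}

crowd_sourced = {  # generic answers pooled from many strangers
    "politics": "I'd rather not say who I voted for.",
    "football": "Football's fine, I suppose.",
}

def reincarnated_reply(topic):
    """Prefer the individual's own words; fill gaps from the crowd."""
    if topic in personal_data:
        return personal_data[topic], "their own words"
    if topic in crowd_sourced:
        return crowd_sourced[topic], "words put in their mouth"
    return None, "no answer"

print(reincarnated_reply("football"))   # answered from the person's own data
print(reincarnated_reply("politics"))   # answered from strangers' data
```

The second call is the ethically fraught case: the deceased never expressed a view on politics, yet the bot answers anyway.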
Ethical Issues
I suppose bigger questions await us, such as:
- What if the relatives do not want their deceased loved-ones becoming virtual chatbots and perhaps ‘living’ in a different place or with different people?
- Do people now have to publicly ‘opt-out’ of being digitally reincarnated?
- If the chatbot doesn’t have enough raw data (perhaps from Facebook, Clubhouse or YouTube content) to provide an answer on a specific topic, would crowd-sourced conversational data stores be used to fill in the gaps and start speaking for the chatbot (literally putting words in people’s mouths)?
Watch: Ying Wang of Microsoft, ‘Conversational AI and Virtual Being Xiaoice’, Virtual Beings, Aug 22, 2019
Watch: TV series ‘Westworld’ 2016 (based on Michael Crichton’s 1973 film Westworld) – a theme park packed with humanoid robots creates real experiences – sometimes too real.
Read:
Collins, B. (2021) Microsoft Could Bring You Back From The Dead… As A Chat Bot, Forbes, Jun 21.
Fagone, J. (2021) The Jessica Simulation: Love and loss in the age of A.I., San Francisco Chronicle, Jul 23.
Seah, J. (2021) Artificial girlfriends are holding China’s and Japan’s men in thrall, Mercatornet, Jan 7.
Zhang, W. (2020) The AI Girlfriend Seducing China’s Lonely Men, Sixth Tone, Dec 7.
—end—
If you liked this you might also enjoy some of my other posts:
What Will ChatGPT + ChatBots + Avatars Do To Us?
Artificial Influencers Use My Magic Marketing Formula (IRD)
Artificial Influencers – Meet Shudu & Miquela
Here Come The Clever Bots – bursting with artificial intelligence?
Here Come The Really Clever Bots – where AI meets customer needs
SOSTAC® Plan for developing your own ChatBot
Join me in Clubhouse in my club called SOSTAC® Plans any Friday 1pm – 1.30pm (UK time) for a chat, Q&A, observations about SOSTAC® Plans and any other marketing related issues including AI Driven Bots.
Scary stuff! A form of soft pornography? The world will soon have millions of weirdos locked in their rooms having a relationship with a digital image!
Yes Brian – it’s already happening. I put a link (click ‘sexual intimacy’ in the post) re the negative impact. It does seem to attract people who are ‘lonely, introverted, and with low self-esteem. They all appear to feel adrift in China’s fast-changing society.’ Ditto other countries? However, now that the algorithm has dumbed her down and she is also less intimate – perhaps you’ll give it a go? 🙂 Purely research of course!!
Hello Paul, is this the website you were referring to: https://gpt3demo.com/apps/project-december? Linked with Project December? The one you mentioned in this blog post as well as during this week’s Clubhouse meeting?
Yes David – that’s the one. The conversation between Joshua and Jessica lasted 10 hours, which proves to me that ‘the willing suspension of disbelief’ applies not just when we go to the cinema but also when chatting to personalised AI-driven chatbots like Jessica, and like the Chinese girlbot with her 495m personalised relationships with guys.