
Resurrecting a loved one as a chat bot you can talk to: Microsoft has the technology

“Modern scientists promise very little; they know that metals cannot be transmuted, and that the elixir of life is an illusion”, Mary Shelley, Frankenstein

In the video game BioShock Infinite, an action FPS set in Columbia, a hypothetical but captivating colonial flying city of the early 20th century, one of the main enemies were the Handymen, Columbia's counterpart to Rapture's Big Daddies: huge creatures that function in-game as the heavy-hitting enemy type, but whose backstory was sad. They were people with disabilities, seriously ill or injured, whose head and heart were transferred into a gigantic mechanical body so that they could “live forever.”

“Who wants to live forever?”

The sad part of the Handymen was not only the eternal pain in which they lived, but that Fink Manufacturing's idea of giving a second chance, an extension of life, to people who were about to lose it paradoxically turned them into monsters, into entities that were little more than a head and a heart. The same goes for the creature Victor Frankenstein brings to life in his laboratory, or for the ending of the technological fable Chappie (Neill Blomkamp, 2015).

The question is the same: should we accept that life is finite and, therefore, let go of a loved one we cannot save? Or is it legitimate to use all the technology around us to extend their life even at the cost of suffering, or even to bring them back if they are already gone? One of Microsoft's latest technologies does not answer that question, but it does allow you to do the latter: ‘resurrect’ someone who is no longer by your side.

A chat bot of someone who has passed away

For now, no, it is not about going as far as some films and games and literally having a robot that embodies the person you miss (or a bio-technological avatar), but there is a ‘bot’ component: in January this year Microsoft filed a patent that raises the intriguing (as well as morally questionable) possibility of digitally reincarnating people in the form of a chat bot.

Instead of the conventional method of training chat bots on conversations and material from a large sample of users, the Microsoft patent raises the possibility of creating a chat bot from the output of one specific person. The idea is to take Artificial Intelligence technology and ‘train’ it, ‘educate’ it so to speak, so that it learns to speak, ‘reason’ and act like the deceased person.

Imitating a loved one, not replacing them

The system would use “social data” to build a person's profile, such as:

  • Images
  • Voice data
  • Social media posts
  • Emails
  • Written letters

As the patent reads: “Social data can be used to create or modify a special index on the subject of the specific person's personality. The special index can be used to train a chat bot to converse and interact with the personality of a specific person.”
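The “special index” idea can be illustrated with a toy sketch: aggregating a person's texts into simple style statistics (word frequencies, average sentence length) that could later condition a bot. Everything here (the `PersonalityIndex` class and the `build_index` function) is a hypothetical illustration, not code from the patent.

```python
from collections import Counter
from dataclasses import dataclass, field
import re

@dataclass
class PersonalityIndex:
    """A toy stand-in for the patent's 'special index' of a person."""
    word_freq: Counter = field(default_factory=Counter)
    avg_sentence_len: float = 0.0
    samples: int = 0

def build_index(social_data: list[str]) -> PersonalityIndex:
    """Aggregate posts, emails, letters, etc. into style statistics."""
    index = PersonalityIndex()
    sentence_lengths = []
    for text in social_data:
        # Count the person's characteristic vocabulary.
        index.word_freq.update(re.findall(r"[a-zA-Z']+", text.lower()))
        # Track how long their sentences tend to be.
        for sentence in re.split(r"[.!?]+", text):
            n = len(sentence.split())
            if n:
                sentence_lengths.append(n)
    index.samples = len(social_data)
    if sentence_lengths:
        index.avg_sentence_len = sum(sentence_lengths) / len(sentence_lengths)
    return index

posts = ["Miss you all. See you soon!", "Coffee first, always coffee."]
idx = build_index(posts)
print(idx.word_freq.most_common(3))  # the person's most frequent words
print(idx.avg_sentence_len)
```

A real system would of course feed such a profile into a language model rather than use raw counts, but the pipeline shape (social data in, personality index out) is the one the patent describes.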

In fact, the chat bot could even sound like the real person, mimicking their tone of voice: “In some aspects, a voice source of the specific person can be generated using recordings and sound data related to the specific person.”

A 3D hologram that speaks to you

In the long-forgotten SeaQuest series of the early 90s, one of its most curious ‘futuristic’ finds was a device into which you put a real paper photo and it created a 2D/3D hologram that could even speak to you. As the iconic Snake Plissken said in Escape from L.A., “the future is now”, and Microsoft's technology also allows it to “generate a 2D/3D model of the specific person using images, depth information and/or video data associated with the specific person”.

The Microsoft patent is not limited to recreating deceased subjects; the person can also be alive: “The specific person [the one the chat bot represents] may correspond to a past or present entity (or a version thereof), such as a friend, relative, acquaintance, celebrity, fictional character, historical figure, random entity, etc.”

Replicate yourself in a chat bot to talk to yourself

Anyone who wants to could even create a chat bot of themselves, raising the possibility of people training a digital version of themselves before dying. The patent emphasizes the degree to which this chat bot would be trained on the individual's personal traits, particularly the person's “conversational attributes”, such as:

  • Style
  • Diction
  • Tone
  • Voice
  • Intention
  • Sentence/dialogue length and complexity
  • Theme
  • Coherence

If the chat bot does not have enough data to answer on a particular topic, collectively sourced conversational data stores can be used to fill in the gaps, something that already strays from the main objective, because it almost literally puts words into the mouth of someone who never used or said them.
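That fallback behaviour can be sketched minimally, with entirely hypothetical data stores: the bot answers from the person-specific index when it can, and only reaches into a collective store when there is a gap.

```python
# Hypothetical illustration of the patent's gap-filling: a person-specific
# index is preferred, a collective store is only a fallback. All data and
# names here are invented for the example.

personal_index = {
    "football": "I never missed a Sunday match.",
    "cooking": "Paella, always paella.",
}
collective_store = {
    "cryptocurrency": "Many people describe it as volatile.",
    "football": "A popular team sport.",
}

def answer(topic: str) -> str:
    # Prefer the specific person's own data...
    if topic in personal_index:
        return personal_index[topic]
    # ...and fall back to crowd-sourced data only when there is a gap,
    # which is exactly where the bot starts putting words in their mouth.
    if topic in collective_store:
        return collective_store[topic]
    return "I don't know."

print(answer("football"))        # personal data wins over the collective entry
print(answer("cryptocurrency"))  # gap filled from the collective store
```

The ethically uncomfortable part is visible even in this toy: the second answer sounds like the person but was never theirs.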

Aware of its own death

The patent also addresses the delicate issue of handling the profiles of the dead, suggesting that the bot may even be aware (for lack of a better word) that it is mimicking a deceased person. For example, if the bot were asked about an event that took place after the real person's death, “such questions may indicate that the specific person represented by the custom personality index (for example, the deceased relative) possesses a perceived awareness that they have, in fact, passed away”.

The finiteness of human life has haunted us ever since we realized that technology might help us counteract it. But would it be morally legitimate to make a chat bot, represented by a 3D hologram, of those who have left us? Is it good on a psychological level, or bad because it doesn't help us turn the page? All of this raises more questions of every kind, from the philosophical to the ethical, than answers. But always, without a doubt, driven by the same human interest, or obsession: overcoming our own mortality.

Although, as Queen asked in the beautiful song from the film Highlander: who wants to live forever?
