Lemonchest
Joined: 18 Mar 2015
Posts: 1771
Posted: Thu Apr 20, 2017 5:03 pm
Guys, our Siri knockoff is slow to respond, prone to misinterpreting questions, and refuses to forget user errors. What should we do?
Call it a girlfriend simulator!
$$$$$$$$$$
Hoppy800
Joined: 09 Aug 2013
Posts: 3331
Posted: Thu Apr 20, 2017 5:46 pm
Three days of free recording + a cloud storage fee after that, and the storage most likely can't be utilized for anything else = Money Pit.
You cannot hide your bad business practices in the mobile industry anymore. Just stop it, please.
Mr. Oshawott
Joined: 12 Mar 2012
Posts: 6773
Posted: Thu Apr 20, 2017 6:10 pm
Yes... an AI reverting back to Stage 1 after only three days, while the user is forced to shell out cash indefinitely just to maintain progress, screams money grab.
DerekL1963
Subscriber
Joined: 14 Jan 2015
Posts: 1130
Location: Puget Sound
Posted: Thu Apr 20, 2017 7:56 pm
o.0
Getting people to pay for the features of an app is a "money grab"? To paraphrase Inigo Montoya - I don't think that phrase means what you think it means.
leafy sea dragon
Joined: 27 Oct 2009
Posts: 7163
Location: Another Kingdom
Posted: Thu Apr 20, 2017 11:15 pm
How lonely must you look if you're using this app in public?
Hoppy800 wrote: | Three days of free recording + a cloud storage fee after that, and the storage most likely can't be utilized for anything else = Money Pit.
You cannot hide your bad business practices in the mobile industry anymore. Just stop it, please. |
Well, considering mobile app-making became a billion-dollar business through those practices, I can't really say it's a bad idea.
samuelp
Industry Insider
Joined: 25 Nov 2007
Posts: 2253
Location: San Antonio, USA
Posted: Fri Apr 21, 2017 7:32 am
Mr. Oshawott wrote: | Yes... an AI reverting back to Stage 1 after only three days, while the user is forced to shell out cash indefinitely just to maintain progress, screams money grab. |
Okay, I have a plan, guys.
Use this app and start paying for it for a few months, then claim that you can't afford it anymore, and when it reverts back, sue the company for causing you emotional distress by effectively killing your girlfriend.
I've actually wondered about this aspect of AI: clearly it's relatively easy to create AI or characters that humans can get emotionally attached to. But what responsibility does the software company have for the emotional well-being of the user after this attachment is created? What happens when someone kills themself because their AI partner's hard drive crashed and it forgot who its user is? Or when use of their "girlfriend" suddenly goes from free-to-use to a monthly subscription they can no longer afford? Can the company be sued for negligent or harmful practices?
And what if AI begins to be used for things like companions for the elderly, or other therapeutic purposes, and bugs in the software or hardware cause people severe emotional distress or death? Would a software company be just as liable as, say, the maker of a drug with deadly side effects?
Kadmos1
Joined: 08 May 2014
Posts: 13635
Location: In Phoenix but has an 85308 ZIP
Posted: Fri Apr 21, 2017 8:25 am
At least this isn't as creepy as a lot of those bishoujo body pillows I've seen.
Hikarunu
Joined: 23 Jul 2015
Posts: 950
Posted: Fri Apr 21, 2017 10:57 am
Oh boy, this is so obsolete. We already had Love Plus back in 2009. And this waifu is not very appealing, just normal-looking.
leafy sea dragon
Joined: 27 Oct 2009
Posts: 7163
Location: Another Kingdom
Posted: Fri Apr 21, 2017 11:24 am
samuelp wrote: |
I've actually wondered about this aspect of AI: clearly it's relatively easy to create AI or characters that humans can get emotionally attached to. But what responsibility does the software company have for the emotional well-being of the user after this attachment is created? What happens when someone kills themself because their AI partner's hard drive crashed and it forgot who its user is? Or when use of their "girlfriend" suddenly goes from free-to-use to a monthly subscription they can no longer afford? Can the company be sued for negligent or harmful practices?
And what if AI begins to be used for things like companions for the elderly, or other therapeutic purposes, and bugs in the software or hardware cause people severe emotional distress or death? Would a software company be just as liable as, say, the maker of a drug with deadly side effects? |
I think we're getting ahead of ourselves now--this app isn't even out, so we have no idea how realistically it can mimic a human. Do you really think this app will be the legendary AI that will pass the Turing Test? So far, none has been able to pull it off, which suggests to me that it's actually extremely difficult to make an AI advanced and creative enough to pass for a human. (An AI that humans can get attached to, well, that's an issue that hasn't had quite as much thought and may actually be more complicated due to different people having different reasons for getting attached to things. I mean, theoretically, any AI can be something a human can get attached to, in the same way that someone can get attached to a chair or to their car.)
samuelp
Industry Insider
Joined: 25 Nov 2007
Posts: 2253
Location: San Antonio, USA
Posted: Fri Apr 21, 2017 12:53 pm
leafy sea dragon wrote: | (An AI that humans can get attached to, well, that's an issue that hasn't had quite as much thought and may actually be more complicated due to different people having different reasons for getting attached to things. I mean, theoretically, any AI can be something a human can get attached to, in the same way that someone can get attached to a chair or to their car.) |
I think it's quite clear that an AI doesn't have to pass the Turing test, or even come close to it, to be an object of human love. Like you said, people can fall in love with, like, small animals or reptiles or even inanimate objects...
I guess the difference is when the purpose of the AI IS to get you emotionally attached to it. I suspect that helper AIs like Siri, as they advance, will probably be programmed in a way to avoid people getting emotionally attached to them (because they can do their job better that way), but with apps like this (and ones in the future) the purpose would be a fake relationship, so my original question becomes more relevant.
leafy sea dragon
Joined: 27 Oct 2009
Posts: 7163
Location: Another Kingdom
Posted: Fri Apr 21, 2017 2:23 pm
samuelp wrote: | I think it's quite clear that an AI doesn't have to pass the Turing test, or even come close to it, to be an object of human love. Like you said, people can fall in love with, like, small animals or reptiles or even inanimate objects...
I guess the difference is when the purpose of the AI IS to get you emotionally attached to it. I suspect that helper AIs like Siri, as they advance, will probably be programmed in a way to avoid people getting emotionally attached to them (because they can do their job better that way), but with apps like this (and ones in the future) the purpose would be a fake relationship, so my original question becomes more relevant. |
All right. I see what you mean then. But considering this program is going to charge, subscription-style, for users to remain with this virtual girlfriend, I don't think they're going to be too concerned with the emotional distress part.
It reminds me of an episode of Batman Beyond where robots were built to serve as romantic companions for lonely people. A fat, dorky classmate of Terry's got one, and she doted on him and got too clingy even for him. The whole time, everyone knew she was a robot, and even she made no attempt to hide it, but the issue was that she was programmed to exploit the guy's sense of self-worth. (Of course, the episode diverges when the gynoid goes rogue and turns violent, requiring Batman's intervention, but the guy was still emotionally devastated and never fully recovered.) I'm sure there must already be some Astro Boy story or five covering this too, though.