thomas8166
Joined: 25 Apr 2012
Posts: 85
Location: Tainan, Taiwan
Posted: Sat Jan 11, 2020 9:33 pm
Huh. I get the feeling these guys are missing the point (with regard to the end of the first season, at least), since it's revealed that actual criminal brains are doing the judging, not AI.
Quote: | What about thoughts that aren't acted upon? |
We've been there before, with Article 58 of the Soviet Russian penal code; see how well that turned out...
AkumaChef
Joined: 10 Jan 2019
Posts: 821
Posted: Mon Jan 13, 2020 9:27 am
thomas8166 wrote: | Huh. I get the feeling these guys are missing the point (with regards to the end of the first season, at least), since it's revealed that actual criminal brains are doing the judging, not AI. |
I'm not sure that reveal actually matters in this context. A computer running on human brains instead of silicon chips is still a computer, and the show implies that statistics and reason are used to perform the judging, as opposed to opinion. The Sibyl System might be a bunch of human brains hooked together, but for the purposes of this discussion it's no different from a bunch of silicon CPUs hooked together. It's how the computing power is used that matters.
What I find fascinating about this is the fact that researchers willingly work on this sort of thing. Do these researchers not take a step back and think about just exactly what it is that they are working on? Or do they somehow believe that their creation is infallible and cannot be twisted towards evil?
maximilianjenus
Joined: 29 Apr 2013
Posts: 2913
Posted: Mon Jan 13, 2020 10:49 am
Nah, some scientists (unlike the climate-change-related ones, deniers and supporters alike) are not self-absorbed freaks who think they should control how the rest of us inferior humans think because we lack their superior scientific minds. They limit themselves to doing the science and leave the moral decisions to the people who are actually involved in them.
Chrono1000
Posted: Mon Jan 13, 2020 11:59 am
thomas8166 wrote: | Huh. I get the feeling these guys are missing the point (with regards to the end of the first season, at least), since it's revealed that actual criminal brains are doing the judging, not AI. |
Yeah, that is one of the main issues of Psycho-Pass: the Sibyl System was a complete lie. There was no magical AI system, just a bunch of sociopaths. There's a reason this show was banned in China, since it was an indictment of placing complete trust in government authority. There is some irony in people debating the merits of the Sibyl System; it would be like debating the merits of the government system in 1984.
AkumaChef
Joined: 10 Jan 2019
Posts: 821
Posted: Mon Jan 13, 2020 1:42 pm
maximilianjenus wrote: | Nah, some scientists (unlike the climate-change-related ones, deniers and supporters alike) are not self-absorbed freaks who think they should control how the rest of us inferior humans think because we lack their superior scientific minds. They limit themselves to doing the science and leave the moral decisions to the people who are actually involved in them. |
I wasn't accusing the scientists of being self-absorbed control freaks.
I was asking why they seem not to be concerned about what they are creating, especially given its vast potential for abuse.
Suppose you're an engineer. Your job is to make things that other people request. If someone asks you to design a new car transmission, there really isn't much for you to worry about. Yeah sure, maybe someone uses a car with your transmission to run over their cheating husband, but that's a pretty unlikely thing and you probably wouldn't worry about it one bit. On the other hand, if someone asks you to design something nefarious, wouldn't you start wondering what it might be for and have concerns about working on it?
It's true that nearly anything could be used for evil purposes--after all, people use cars as murder weapons, they use cell phones to deal drugs and traffic humans, they use computers as hacking tools. But those things are mainly used for benign purposes. Heck, even an outright weapon such as a knife or a gun is usually used for benign purposes rather than crime. But this is different. It's hard to imagine anything other than evil from a system like this one; it has no benign purpose, which makes me wonder why people would ever work on it.
@Chrono1000
Quote: |
Yeah, that is one of the main issues of Psycho-Pass: the Sibyl System was a complete lie. There was no magical AI system, just a bunch of sociopaths. |
Is that the case? It's been a long time since I've watched Psycho-Pass, but my understanding was that there was a "magical AI system"; it simply ran on neurons instead of silicon. The reason the Sibyl System used sociopaths' brains is that they worked like a proper computer and were not subject to whims and emotion. The fact that human brains were being used was a neat twist to the story, but it had little to do with how the Sibyl System functioned. Running on brains instead of chips doesn't make it not an AI.
Agreed 100% with the rest of your post.
DavetheUsher
Joined: 19 May 2014
Posts: 505
Posted: Mon Jan 13, 2020 4:21 pm
AkumaChef wrote: | It's true that nearly anything could be used for evil purposes--after all, people use cars as murder weapons, they use cell phones to deal drugs and traffic humans, they use computers as hacking tools. But those things are mainly used for benign purposes. Heck, even an outright weapon such as a knife or a gun is usually used for benign purposes rather than crime. But this is different. It's hard to imagine anything other than evil from a system like this one; it has no benign purpose, which makes me wonder why people would ever work on it. |
An AI algorithm isn't inherently bad. Companies and governments already use them daily in various areas, including crime and terrorism analysis. Nothing as huge or absolute as the Sibyl System, obviously, but they work and have helped out a lot. Sure, it's a bit spooky how YouTube knows exactly what I'm going to search for after watching a video, before I even type anything in the search bar, but I don't see AI as inherently bad or evil myself.
AkumaChef
Joined: 10 Jan 2019
Posts: 821
Posted: Mon Jan 13, 2020 5:36 pm
DavetheUsher wrote: |
An AI algorithm isn't inherently bad. Companies and governments already use them daily in various areas, including crime and terrorism analysis. Nothing as huge or absolute as the Sibyl System, obviously, but they work and have helped out a lot. Sure, it's a bit spooky how YouTube knows exactly what I'm going to search for after watching a video, before I even type anything in the search bar, but I don't see AI as inherently bad or evil myself. |
AI? No, it's not inherently bad. My comments were not directed at something as vague as AI in general. But when you start talking about, say, accusing people of criminal activity because of something they haven't actually done... well, now we've moved well beyond just talking about AI. And just because something is in widespread use by companies or governments doesn't mean that use is morally justified or acceptable either.
Say you're an engineer or a programmer working on computer facial recognition: sure, one might argue the work is justified if it manages to stop a known terrorist from committing some heinous act. But wouldn't you, the engineer/programmer, be worried that you yourself might be among those false-positived by the very tech you are creating? That's a point Psycho-Pass makes very clear: oftentimes these fancy systems we create have serious and fundamental flaws.
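To put some rough numbers on the false-positive worry, here's a minimal sketch of the base-rate problem. All of the figures (prevalence, accuracy rates, population size) are illustrative assumptions on my part, not anything from the show: even a scanner that is 99% accurate, screening for something rare, ends up flagging far more innocent people than guilty ones.

```python
# Base-rate sketch: a highly accurate screening system still produces
# mostly false positives when the condition it screens for is rare.
# All numbers below are illustrative assumptions.

def flag_counts(population, prevalence, sensitivity, false_positive_rate):
    """Return (true_positives, false_positives) for a screening system."""
    actual = population * prevalence          # people who really qualify
    innocent = population - actual            # everyone else
    true_pos = actual * sensitivity           # correctly flagged
    false_pos = innocent * false_positive_rate  # wrongly flagged
    return true_pos, false_pos

# Assume 1 in 1,000 people would actually act; the scanner catches 99%
# of them but also wrongly flags 1% of everyone else.
tp, fp = flag_counts(1_000_000, 0.001, 0.99, 0.01)
precision = tp / (tp + fp)
print(f"true positives:  {tp:.0f}")   # 990
print(f"false positives: {fp:.0f}")   # 9990
print(f"precision: {precision:.1%}")  # 9.0%
```

So under these assumptions, roughly 10 out of every 11 people the system flags are innocent, which is exactly why the engineer building it should wonder whether they'd be one of them.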