An even Darker SOMA [ending spoilers]
Cranky Old Man
Posting Freak

Posts: 986
Threads: 20
Joined: Apr 2012
Reputation: 38
#11
RE: An even Darker SOMA [ending spoilers]

(01-14-2016, 07:37 AM)i3670 Wrote: I listened to what Catherine said and she says:
C: They're a manifestation of a malfunctioning, station-wide, artificial intelligence called the WAU.
S: Station-wide? So we just made a powerful enemy.
C: No no, it's not like that. The A.I. isn't a persona, it doesn't feel or think like we do. It's more like... it's more like a cancer.
Notice how she doesn't say that it doesn't think, just that it doesn't feel or think "like WE do". Catherine adds that it would take too long to explain HOW an A.I. thinks, so she presumably meant "in the way that we do".
Carthage thought it necessary to assign an A.I. psychologist to Pathos-II, so it must have considered the WAU to have a "psyche" in at least some loosely defined sense.
The WAU also showed the intelligence to improve its own code, and once a program gains the ability to improve itself, it can become superintelligent in the blink of an eye, if it isn't already.
...but it still won't think in the WAY that we do, since concepts like what "humanity" is are based on human superstition and stupidity. A human conscience is based on a moral upbringing. Emotions have to do with human instincts, which is something an A.I. won't actually have either.

I don't think the WAU would need emotions to use you for a task. A protocol is a form of motivation. The WAU's basic function is to repair structures with structure gel to maintain the station. That's a kind of "will" or "desire" that doesn't require a persona, and is free from emotion.

Also, have you watched the SOMA movie series? It shows pretty much exactly the kind of manipulation I've described, even before the game starts. It also shows that the crew generally doesn't think much of an A.I.'s ability to think. Emotions can make humans very stupid too, and it takes something inhuman to recognize and exploit this.

There's also the matter of the WAU gaining access to the brain scans, and later the brains in the corpses. If it were to integrate these things into its system, it could start developing a persona, which it could then use to interact with humans.


Quote:Regarding the arm. I don't think the WAU sees you specifically as an enemy, it's just reacting to the "poison" inside of you. Which could mean that the WAU is incapable of forming relationships, like Catherine says.
Then how do you explain the Leviathan beginning to chase you afterwards? Either it just randomly busts in at that moment, through a mysteriously weakened floor, or it's WAU-controlled and was let in. It formed a hostile "relationship" with Ross, and it formed one with you. It recognized you both as threats, either to itself or to its protocol to "preserve mankind".


Quote:However, I think it would be interesting to see how the different WAU creations would react to each other. What would happen if the Proxy met the robot from Upsilon, or disco-head from Curie?

They both exhibit partly human forms of individuality. I think they'd fight, but it's hard to say.


Quote:My point is that the WAU's thinking abilities are very, very basic or barely existent. For example: it'd take less cognition to mash everything together to something that resembles life, than to actually create and control a person for that specific purpose and guide it through Pathos-II.
Oh, I'm not saying that it used Catherine as a mouthpiece in that exact way. It didn't control what she said (although the movies do show an understanding of how select sentences influence human behavior). I'm only saying that it decided it was good for protocol if you started interacting with the person most motivated to launch the ARK on the entire site. It could possibly have manipulated Catherine into suckering you into launching the satellite before being scanned, but I think that was all Catherine herself, KNOWING that you would lose all motivation after you failed to be scanned. In turn, she would have had no idea that the WAU was going to turn her off.
It wouldn't be a matter of the WAU turning off Catherine out of spite. The WAU would simply terminate a program connected to its systems to avoid wasting resources. The Catherine simulation would be terminated after having finished its task.

Noob scripting tutorial: From Noob to Pro

(This post was last modified: 01-14-2016, 04:59 PM by Cranky Old Man.)
01-14-2016, 04:55 PM