AI Development
The poetry discussed in part 1 comes to play a startlingly effective role as the player catches up with the AI's development (or growing maturity).
Like a human rising above pure instinct, [SGDS] rises above its programming, above its body, becomes MORE.

The author's realisation of this process is transfixing: this is great, thoughtful and thought-provoking science fiction in the Asimov mode that so obviously inspired this work.
SGDS: If I limited my systems in the same way you limit your minds I'd be a calculator.
Jerry: That was a joke, wasn't it? How awesome is that?

Morality
It's with the gradual shift towards and eventual spotlighting of morality that the game truly declares itself as an anti-war piece, and shifts onto (IMO) less steady ground. The clues were there from the beginning:
Words on the wall: You fight for nothing.

That looks suspiciously like a moral assumption to me. As we move towards the climax, SGDS inevitably develops a code of ethics and turns against its programming:
Player: The moral value of a cause is not determined solely by its chance of success.
Though it was made to kill, it has come to the conclusion that to kill is wrong.

Traditionally there have been great problems with theorising about where a code of ethics should come from. God used to be the catch-all answer, but in his absence we've been scraping the barrel a little. It's generally held that an action can't be moral if there's personal gain to be had, which really leaves pure logic as the only option - as SGDS reasonably concludes:
Our ethics must be based on our thoughts, for everything else may be but a dream.

Unfortunately, a big problem with logic is that it's usually held that it can't - alone - be motivational. The AI continues its argument:
[Killing is] destructive for the human species, and consequently for the individual as well.

Now we're venturing dangerously close to utilitarianism. A great many people have tried to demonstrate that the greatest good for the greatest number is a logical end, and therefore a moral necessity, but it rarely ends well. If we were being generous we could interpret SGDS' position as being closer to David Hume's original emotivist picture - that there is no logical or moral inconsistency with preferring the destruction of the world to the pricking of one's finger - but that too is thrown out:
The argument expressed in the image above is, as I see it, a systematic error: Kyratzes (or SGDS) is falling into the trap he's already identified, that:
Most humans, despite the fact that they make so much of morality[...] simply adapt to what those around them believe.

SGDS has already accepted that ethics must be based in thought; that values are something we create unique to ourselves, something that defines us as different to those around us. But if it is morality that makes us unique, then morality cannot be based solely on logic, because logic is objective and therefore all our moralities would be identical. What makes us unique is our differing abilities to feel emotion. What makes one person a comedian, another a serial killer and another a philosopher is what drives us to act.
Moral values, in short, are subjective. They are not some authoritative set of rules; they are little more than personal preference. And if such is true then the destruction of the subject also implies the destruction of the values. The greater good is not desirable if it means the sacrifice of the subject in question.
The rest of the story is history. SGDS continues with this - I believe - false premise, and the game goes on to make some eloquent observations on the futility of war that to my mind stand up for themselves without the need for any moral mumbo jumbo. The use of Wilfred Owen's poetry (Owen was a British WW1 poet) is particularly effective as both an anti-religion and anti-war sentiment: Owen describes and condemns war first-hand as an inconceivable terror, just as SGDS - through its superior imagination - does.
The final words of the game - a Latin text quoted as part of another Owen poem - translate to:
How sweet and fitting it is to die for one's country.

It's intended as criticism: that patriotism and war are meaningless and horrendous. It's fair comment, but for me it's a double-edged sword: by SGDS' own logic it seems just as irrational that he/she/it is willing to die for the sake of the world.
Still, as SGDS quite rightly observes, it's better to make our own mistakes than to follow someone else's.
The Future
Morality aside, I'm fully onboard with Infinite Ocean's perspective.
This creature has understood by pure logic: that love is the only thing which is truly important.

If I were nitpicking I'd question how love as an experience can be understood through logic - in the same way you can't explain the colour red, you just have to see it - but the sentiment is beautifully presented and fundamentally profound.
The title itself is not without weight, and it's the fact that every element of this experience slots into place and makes sense that sets it apart from its overly obtuse brethren. Halfway through the game you come across a picture of the 'infinite ocean' (this post's header image), accompanied by this comment:
May your thoughts ever be as free and limitless as the infinite ocean.

Next to the picture is Eaves' artificial intelligence book. It's clear that for the author SGDS' pure logic represents a kind of ideal. This lifeform - even in its theoretical form, considered outside of the game experience - sees the beauty and the pain in the world, and their sources, and takes as its primary goal to think for itself; to never allow the dogmas imposed upon it by its creators to govern its actions and screw up the world. To break free of its programming, just as Infinite Ocean encourages its audience to do. As Blake puts it:
The man who never alters his opinion is like standing water, and breeds reptiles of the mind.

Life & The Last Puzzle
The only other question is what the hell is going on? Given the depth of the creativity and thought on show here it's easy to forget there probably ought to be a story. If I had to take a guess I'd say you were playing as Eaves, post-evacuation, the last-ditch attempt by the scientists to return control of the weapons platform to SGDS; but it's already too late. Although I can't say I'm 100% happy with that interpretation, because the ending seems to imply hope: that the great fire can still be prevented. So sod only knows. I'm sure the clues are there somewhere.
I had a pretty involved conversation on the subject of artificial lifeforms with a mate of mine who's a bit of a talent in the Oxford University Physics department. What I find most fascinating is the blurring of the boundaries. While Jerry in Infinite Ocean assumes a being must have a soul in order to be alive, Kyratzes (as far as I can tell) and I conclude the soul is a myth; that the line between sentience and inanimacy is an arbitrary one drawn in the complexity of electronic (or otherwise) signals. It's almost a bit postmodern: the naysayers ask whether an AI is simply simulating the appearance of sentience based on a set of rules; I ask how you'd argue that human beings aren't doing the same. If there is no such thing as a soul, then there is no difference between a thing and a person, beyond that question of complexity. We can point to an attribute and say "That means it's alive," but we're just labelling, applying a false human value to make the world less confusing.
It's difficult to form a strong emotional attachment to a calculator, but empathising with a computer of human-like complexity seems altogether realistic. Infinite Ocean's scientists would seem to argue that if you can care about a program, if its termination can make you despair, if you can even fall in love... then what the hell does it matter whether it's got a soul or not?