no subject
[well. now he looks confused. he's taking in what she's saying, of course, but if she wasn't an escapee, then. . .?]
If you didn't escape, then how do you know--?
[how many people were killed. murdered. how many managed to leave, and the conclusions they drew as they did so]
no subject
[Time for math with Luna, because figuring things out for yourself is important.]
There were ten participants. By the end of the game, seven of them were dead, two had escaped, and one was left behind.
Per evidence found before the escape, six of the dead participants had been murdered. The seventh participant was the unconscious escapee. The eighth was the conscious escapee. The ninth was the participant who was left behind.
What happened to the tenth participant, according to the information I have given you?
no subject
[oh. he gets it now. he had been operating under the assumption that the only two survivors were the ones who escaped, but. . .]
So you. . . were the one left behind?
no subject
[Do your math again, Akira. Something is missing.]
no subject
[something is missing. it takes him a while to figure it out, a few moments of silence where he goes over the numbers again. ten participants, seven dead, six murdered, two escaped, one left behind--]
[-- it comes to him in a flash of lightning, but even so, thinking it doesn't feel right. not when compared to the Luna he knows. he opens his mouth. . . shuts it again. takes a moment to search for his words]
[. . .]
Are. . . are you trying to tell me that you were responsible for the deaths of those six?
no subject
According to all of the evidence... I was the guilty party, yes.
Whether that was true or not wasn't debated. It was the only option they had, and I wasn't in a position where I could deny my guilt. So I was abandoned in the facility with the one remaining participant... and shortly afterwards, I was killed by the facilitator as a result of my decisions.
no subject
"According to all of the evidence" doesn't mean it was true.
[he meets her gaze firmly]
Did you? Did you kill them?
no subject
No.
But... I was an accomplice.
no subject
[quietly, with no judgment in his tone]
How. . .?
no subject
[There's a ragged sigh, her eyes drifting to where Akira has a vice grip on her arms.]
Do you truly want an answer to that, Akira?
Or are you determined to tell me that what I know is a lie?
no subject
[he says, chin lifted firmly]
However ugly or awful it may be. I want to understand-- I want to know.
Because you're still my friend, no matter what.
no subject
[There's a quiet tone to her voice, a touched but broken flutter in what had otherwise turned into mechanical rambling.
...]
I knew about the game.
I wasn't the orchestrator, of course. But I worked alongside them. I was specifically given orders to participate. I was to pretend to be one of the players, to help them choose the correct path of decision-making, and to act a specific way, should... different circumstances arise.
I... I just sat and... watched all six people die. And the worst thing is... I could have stopped them. All of them. But I...
I didn't.
no subject
[he's gone from grasping her shoulders to holding her hand now, his grip just as tight, though he's doing his best not to hurt her in the process. as she tells him about what happened, no part of him is judgmental about her story. and when he finally speaks. . . he's more curious than upset or angry or disappointed]
Why. . .? Why didn't you?
no subject
And I...
...
[She takes in another slow breath, placing her free hand on top of where Akira grips her as though she might vanish.]
Akira... are you familiar at all with the Three Laws of Robotics?
no subject
I'm. . . not, no.
[he looks a bit confused and bewildered]
What are they?
no subject
It's a set of fictional rules created by Isaac Asimov, who used them in many of his science fiction stories through the mid-1900s and the turn of the millennium. It's since become a foundation for reliable, safe forms of advanced artificial intelligence.
A robot without the Three Laws is just a bunch of metal and plastic. Like a toaster or an automobile. A machine that behaves erratically is more likely to be dangerous.
...
[She pauses, her look almost timid.]
...Am I making sense so far?
no subject
. . . [he isn't sure he likes where this is going]
You are.
[but the rules are fictional, he thinks. so why would you bring them up? these thoughts he doesn't yet say aloud, but depending on what she says next. . . he might]
no subject
First, a robot must never harm a human being, or through inaction, allow a human being to come to harm.
Second, a robot must listen to and obey the orders given to it by a human being, unless those orders conflict with the First Law.
Third, a robot must not allow any harm to come to itself, unless that action would conflict with either the First or Second Law.
no subject
[. . . his gaze narrows when she discusses the second law, because!! don't like that!! she can read it on his features, his distaste at the idea of someone else having so much control over her agency]
. . . are you saying that you have to follow those rules, Luna? Or that someone told you that you do?
[she doesn't even need to say the word "I am a robot." he has already figured it out. and it still doesn't change his opinion on her]
no subject
[Mm, yeah, she figures he's realized it by now. It's so casual in how he addresses it, that... he reminds her of a very young Sigma, in a way. It's nostalgic in a way that makes her ache deep down.
...]
Both, I suppose. The former is the more thorough answer, though. It's an inherent part of modern AI programming, just like any other type of code before it.
...Well, um... "modern" in my case, anyway. You said on our first day here that your last memory was from the early 21st century, correct?
no subject
2017. Yeah. That's right.
[presses his lips tightly together before he mutters]
. . . it isn't. . . fair.
[?? what isn't fair??]
no subject
[Did... she say something upsetting?]
What isn't fair?
no subject
[he clarifies. she'll notice his thumb is brushing against her knuckles now]
It shouldn't matter that you're not human. You should be able to make your own choices. Follow your own path, regardless of where it takes you.
[he sets his jaw tightly, expression firm]
You're not a tool, or a piece of equipment, or. . . someone that others can bark orders at like they own you.
You're yourself. And it's not. . . fair.
[she deserves freedom, too]
no subject
[She's touched. She really, really is. But there's a sadness to the smile that she gives him, squeezing his hands in return.]
...Can I continue with my explanation?
no subject
. . . yes. Sorry, I-- yes.