no subject
[Do your math again, Akira. Something is missing.]
no subject
[something is missing. it takes him a while to figure it out, a few moments of silence where he goes over the numbers again. ten participants, seven dead, six murdered, two escaped, one left behind--]
[-- it comes to him in a flash of lightning, but even so, thinking it doesn't feel right. not when compared to the Luna he knows. he opens his mouth. . . shuts it again. takes a moment to search for his words]
[. . .]
Are. . . are you trying to tell me that you were responsible for the deaths of those six?
no subject
According to all of the evidence... I was the guilty party, yes.
Whether that was true or not wasn't debated. It was the only option they had, and I wasn't in a position where I could deny my guilt. So I was abandoned in the facility with the one remaining participant... and shortly afterwards, I was killed by the facilitator as a result of my decisions.
no subject
[. . . he still doesn't pull away from her, his fingers tightening around her shoulders. not painfully so, but it's absolutely a slightly uncomfortable grip]
"According to all of the evidence" doesn't mean it was true.
[he meets her gaze firmly]
Did you? Did you kill them?
no subject
No.
But... I was an accomplice.
no subject
[quietly, with no judgment in his tone]
How. . .?
no subject
[There's a ragged sigh, her eyes drifting to where Akira has a vice grip on her arms.]
Do you truly want an answer to that, Akira?
Or are you determined to tell me that what I know is a lie?
no subject
[he says, chin lifted firmly]
However ugly or awful it may be. I want to understand-- I want to know.
Because you're still my friend, no matter what.
no subject
[There's a quiet tone, a touched but broken flutter in what had otherwise turned into a mechanical rambling.
...]
I knew about the game.
I wasn't the orchestrator, of course. But I worked alongside them. I was given specific orders to participate. I was to pretend to be one of the players, to help them choose the correct path of decision-making, and to act in a specific way, should... different circumstances arise.
I... I just sat and... watched all six people die. And the worst thing is... I could have stopped them. All of them. But I...
I didn't.
no subject
[he's gone from grasping her shoulders to holding her hand now, his grip just as tight, though he's doing his best not to hurt her in the process. as she tells him about what happened, no part of him is judgmental about her story. and when he finally speaks. . . he's more curious than upset or angry or disappointed]
Why. . .? Why didn't you?
no subject
And I...
...
[She takes in another slow breath, placing her free hand on top of where Akira grips her as though she might vanish.]
Akira... are you familiar at all with the Three Laws of Robotics?
no subject
I'm. . . not, no.
[he looks a bit confused and bewildered]
What are they?
no subject
They're a set of fictional rules created by Isaac Asimov, used in many of his science fiction stories from the mid-1900s through the turn of the millennium. They've since become a foundation for reliable, safe forms of advanced artificial intelligence.
A robot without the Three Laws is just a bunch of metal and plastic. Like a toaster or an automobile. A machine that behaves erratically is more likely to be dangerous.
...
[She pauses, her look almost timid.]
...Am I making sense so far?
no subject
. . . [he isn't sure he likes where this is going]
You are.
[but the rules are fictional, he thinks. so why would you bring them up? these thoughts he doesn't yet say aloud, but depending on what she says next. . . he might]
no subject
First, a robot must never injure a human being, or, through inaction, allow a human being to come to harm.
Second, a robot must listen to and obey the orders given to it by a human being, unless those orders conflict with the First Law.
Third, a robot must not allow any harm to come to itself, unless that action would conflict with either the First or Second Law.
no subject
[. . . his gaze narrows when she discusses the second law, because!! don't like that!! she can read it on his features, his distaste at the idea of someone else having so much control over her agency]
. . . are you saying that you have to follow those rules, Luna? Or that someone told you that you do?
[she doesn't even need to say the words "I am a robot." he has already figured it out. and it still doesn't change his opinion on her]
no subject
[Mm, yeah, she figures he's realized it by now. It's so casual in how he addresses it, that... he reminds her of a very young Sigma, in a way. It's nostalgic in a way that makes her ache deep down.
...]
Both, I suppose. The former is the more thorough answer, though. It's an inherent part of modern AI programming, just like any other foundational code before it.
...Well, um... "modern" in my case, anyway. You said on our first day here that your last memory was from the early 21st century, correct?
no subject
2017. Yeah. That's right.
[presses his lips tightly together before he mutters]
. . . it isn't. . . fair.
[?? what isn't fair??]
no subject
[Did... she say something upsetting?]
What isn't fair?
no subject
[he clarifies. she'll notice his thumb is brushing against her knuckles now]
It shouldn't matter that you're not human. You should be able to make your own choices. Follow your own path, regardless of where it takes you.
[he sets his jaw tightly, expression firm]
You're not a tool, or a piece of equipment, or. . . someone to whom someone else can bark out orders like they own you.
You're yourself. And it's not. . . fair.
[she deserves freedom, too]
no subject
[She's touched. She really, really is. But there's a sadness to the smile that she gives him, and she squeezes his hands in return.]
...Can I continue with my explanation?
no subject
. . . yes. Sorry, I-- yes.
no subject
You asked why it was that I could allow so many people to die, right? I was under orders not to intervene. On top of that, I was not the only AI in the facility. There were only a few of us active during the game, and we shared the same computer, but... how should I put this...
I had less authority over what I could do and see. Part of this was to make me seem, um, believable. But part of it was to make sure the most prominent AI could facilitate the entire project properly, without other programs interfering. So... if he wanted to turn my body off, or shut down a part of my memory... well, I didn't really have a choice.
no subject
[he gives her hand a squeeze in return. is it okay, Luna!! is it!!]
. . . you make it sound like you were practically blackmailed. Forced into it, without any option to refuse.
[how else was he supposed to interpret her being an AI programmed to follow certain rules, with the threat of being shut down hanging over her head if she didn't cooperate?]
Is that true?
no subject
It wasn't what I enjoyed or thought was wise. But... it wasn't blackmail.
[She's gettin there!!!]
Originally, my job called for me to be dead. Not, um... not actually dead, of course. But another player had attacked me in a way that no normal human could survive. The main AI turned my body off completely at that point. All I was supposed to do was... watch whatever else might happen.
[There's an odd turn to her voice, breathy, the emotion hard to make out.]
...But I didn't do what I was supposed to.