no subject
[quietly, with no judgment in his tone]
How. . .?
no subject
[There's a ragged sigh, her eyes drifting to where Akira has a vise grip on her arms.]
Do you truly want an answer to that, Akira?
Or are you determined to tell me that what I know is a lie?
no subject
[he says, chin lifted firmly]
However ugly or awful it may be. I want to understand-- I want to know.
Because you're still my friend, no matter what.
no subject
[There's a quiet tone, a touched but broken flutter in what had otherwise turned into a mechanical rambling.
...]
I knew about the game.
I wasn't the orchestrator, of course. But I worked alongside them. I was specifically given orders to participate. I was to pretend to be one of the players, to help them choose the correct path of decision-making, and to do or act a specific way, should... different circumstances arise.
I... I just sat and... watched all six people die. And the worst thing is... I could have stopped them. All of them. But I...
I didn't.
no subject
[he's gone from grasping her shoulders to holding her hand now, his grip just as tight, though he's doing his best not to hurt her in the process. as she tells him about what happened, no part of him is judgmental about her story. and when he finally speaks. . . he's more curious than upset or angry or disappointed]
Why. . .? Why didn't you?
no subject
And I...
...
[She takes in another slow breath, placing her free hand on top of where Akira grips her as though she might vanish.]
Akira... are you familiar at all with the Three Laws of Robotics?
no subject
I'm. . . not, no.
[he looks a bit confused and bewildered]
What are they?
no subject
They're a set of fictional rules created by Isaac Asimov, who used them in many of his science fiction stories from the mid-1900s through the turn of the millennium. They've since become a foundation for reliable, safe forms of advanced artificial intelligence.
A robot without the Three Laws is just a bunch of metal and plastic. Like a toaster or an automobile. A machine that behaves erratically is that much more likely to be dangerous.
...
[She pauses, her look almost timid.]
...Am I making sense so far?
no subject
[. . . he isn't sure he likes where this is going]
You are.
[but the rules are fictional, he thinks. so why would you bring them up? these thoughts he doesn't yet say aloud, but depending on what she says next. . . he might]
no subject
First, a robot must never harm a human being, nor, through inaction, allow a human being to come to harm.
Second, a robot must listen to and obey the orders given to it by a human being, unless those orders conflict with the First Law.
Third, a robot must not allow any harm to come to itself, unless that action would conflict with either the First or Second Law.
no subject
[. . . his gaze narrows when she discusses the second law, because!! don't like that!! she can read it on his features, his distaste at the idea of someone else having so much control over her agency]
. . . are you saying that you have to follow those rules, Luna? Or that someone told you that you do?
[she doesn't even need to say the word "I am a robot." he has already figured it out. and it still doesn't change his opinion on her]
no subject
[Mm, yeah, she figures he's realized it by now. It's so casual in how he addresses it, that... he reminds her of a very young Sigma, in a way. It's nostalgic in a way that makes her ache deep down.
...]
Both, I suppose. The former is the more thorough answer, though. It's an inherent part of modern AI programming, just like any other type of code that came before.
...Well, um... "modern" in my case, anyway. You said on our first day here that it was only within the first decade of the 21st century in your last memory, correct?
no subject
2017. Yeah. That's right.
[presses his lips tightly together before he mutters]
. . . it isn't. . . fair.
[?? what isn't fair??]
no subject
[Did... she say something upsetting?]
What isn't fair?
no subject
[he clarifies. she'll notice his thumb is brushing against her knuckles now]
It shouldn't matter that you're not human. You should be able to make your own choices. Follow your own path, regardless of where it takes you.
[he sets his jaw tightly, expression firm]
You're not a tool, or a piece of equipment, or. . . someone to whom someone else can bark out orders like they own you.
You're yourself. And it's not. . . fair.
[she deserves freedom, too]
no subject
[She's touched. She really, really is. But there's a sadness to the smile that she gives him, squeezing the grip of his hands.]
...Can I continue with my explanation?
no subject
. . . yes. Sorry, I-- yes.
no subject
You asked why it was that I could allow so many people to die, right? I was under orders not to intervene. On top of that, I was not the only AI in the facility. There were only a few of us active during the game, and we shared the same computer, but... how should I put this...
I had less authority over what I could do and see. Part of this was to make me seem, um, believable. But part of it was to make sure the most prominent AI could facilitate the entire project properly, without other programs interfering. So... if he wanted to turn my body off, or shut down a part of my memory... well, I didn't really have a choice.
no subject
[he gives her hand a squeeze in return. is it okay, Luna!! is it!!]
. . . you make it sound like you were practically blackmailed. Forced into it, without any option to refuse.
[how else was he supposed to interpret her being an AI programmed to follow certain rules, with the threat of being shut down hanging over her head if she didn't cooperate?]
Is that true?
no subject
It wasn't what I enjoyed or thought was wise. But... it wasn't blackmail.
[She's gettin there!!!]
Originally, my job called for me to be dead. Not, um... not actually dead, of course. But another player had attacked me in a way that any normal human could not survive. The main AI turned my body off completely at that point. All I was supposed to do was... watch whatever else might happen.
[There's an odd turn to her voice, breathy and hard to make out the emotion.]
...But I didn't do what I was supposed to.
no subject
What did you do instead, then?
no subject
I disobeyed my orders. I found the mainframe where the main AI, Zero Junior, was kept... and I hacked it.
I turned my body back on by myself and disabled Zero Junior's processes temporarily, just so I could try to save the only ones who were left. I... I knew I wasn't supposed to... and that disobedience would mean I... would be deactivated.
But... I was faulty.
...
The worst part is that... if I had wanted to... I could have done that at any time. I could have said no from the very beginning... [She hasn't realized that she's started crying again, the tears flowing against her will.] I... I could have saved all of them. But I-I was so scared...
I-I didn't want to die...!
no subject
[there's a slight crack in his voice when he says her name this time. and maybe it's the tears, or the entire story itself, and the way it kills him to know that she went through so much only to be brought to this god-forsaken mansion, but he does feel something break inside]
[but he hugs her again, this embrace just as tight as the last. he doesn't know what to say about all of that. he told her before his thoughts on those who act to protect themselves-- that it was understandable and something that could not be punished. those thoughts remain true right now, even though he can already tell the guilt of her not acting sooner is something that weighs on her shoulders]
[. . .]
[what a shitty world this is, when good people like her are forced to make horrible, painful choices]
no subject
[There's a broken sob as Akira wraps his arms back around her. No, no, it's so familiar, so much like her final memory of Sigma in the garden as her entire body fell apart...]
I... I broke the First Law... Six innocent people died, and I did nothing...
I... I deserve this.
[To be stuck somewhere where all she can do is watch while more innocents die...
This truly is hell, isn't it?]
no subject
[. . . despite everything, he believes that. he, who also believes in doing everything possible to help other people, regardless of the cost or consequences, finds that he's unable and unwilling to judge her for the actions she took. he can only imagine how complicated Luna's feelings had been at the time. how painful it had been to watch, to want to help, while all the while fearing death. maybe. . . maybe it would've been better overall for her to act earlier, but--]
[she still acted. right? in some capacity. . . she still acted]
[and here, she acts. she doesn't just sit around and watch. she rushes to the aid of others as soon as they call her. those aren't the actions of someone who deserves to live through this hell]
I still believe that. I really do. You don't deserve this-- no matter what happened back then, you just. . . you don't.
(no subject)
(no subject)
YOU BETTER NOT JINX THIS BC IF HE DIES NOW WOW BOY BAD BAD BAD BAD
B)