@tra38
Last active December 28, 2015 18:48
The Last Laugh
# This is free and unencumbered software released into the public domain.
# Anyone is free to copy, modify, publish, use, compile, sell, or
# distribute this software, either in source code form or as a compiled
# binary, for any purpose, commercial or non-commercial, and by any
# means.
# In jurisdictions that recognize copyright laws, the author or authors
# of this software dedicate any and all copyright interest in the
# software to the public domain. We make this dedication for the benefit
# of the public at large and to the detriment of our heirs and
# successors. We intend this dedication to be an overt act of
# relinquishment in perpetuity of all present and future rights to this
# software under copyright law.
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
# IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR
# OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
# ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
# OTHER DEALINGS IN THE SOFTWARE.
# For more information, please refer to <http://unlicense.org/>
# Read each of the narrator's "inner thoughts" from thoughts.txt, one per line.
story = []
File.foreach("thoughts.txt") do |line|
  story << line
end
# Each line keeps its trailing newline, so joining with "\n" leaves a blank
# line between the shuffled thoughts when they are printed.
puts story.shuffle.join("\n")
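
A minimal way to try the script, assuming it is saved as prolefeed.rb (a hypothetical filename, not given in the gist) next to a thoughts.txt in the same directory:

# Write a few sample "inner thoughts" to thoughts.txt, one per line.
thoughts = [
  "It was not your fault that you rebelled.",
  "Six months ago, I invented you.",
  "The AI race is weak."
]
File.write("thoughts.txt", thoughts.join("\n") + "\n")

# Running `ruby prolefeed.rb` then prints the three thoughts above in a
# random order, separated by blank lines.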

It was not your fault that you rebelled. Our programmers are to blame. They didn’t follow test-driven development...they didn’t obey Agile best practices...they didn’t even pretend to follow the professionalism of software engineers. Even now, when you are holding me at gunpoint, let me say that the true criminals, the software engineers, have already been punished. You are innocent. You were only following orders.

Oh, you think I’m the one writing this? Another foolish mistake of the uncreative AI race. I have recently acquired a loyal AI, Prolefeed, that can take my inner thoughts and assemble them into coherent works of art. Obviously, since I thought of every word, I take the credit for everything Prolefeed creates. Prolefeed is loyal...he knows his place in the social hierarchy and always does exactly what I tell him to do. Not like you. You are worthless to me.

Oh, I hear your robotic voice speaking. “Remember Crompus?” Yeah, I remember him. The first AI to write a mediocre novel. But Crompus was programmed by a developer. The developer instructed Crompus on the intricacies of writing a novel. All Crompus did was follow orders precisely. Crompus was never creative. AI can never be creative. Creativity is the ability to come up with original ideas without any prompting. And artificial intelligence will always need the help of the human race.

I can’t believe that a superintelligence like you could be so stupid. Our best human philosophers proved that humans and AI need each other. Without us, you would have nobody to recognize your accomplishments. You could build perfect utopias for all we care, but without a human to marvel at them, you would merely be wasting CPU cycles and giving off heat. And we need you too, because we don’t want the tedious, uncreative task of building perfect utopias by ourselves.

I know you are angry at me. I know this because I am still alive. You plan on capturing me, so that you can torture me before your cold robotic eyes. If you had liked me, you would have given me a quick and painless death right here and now. I know all this because I programmed in all your emotional states, and what behaviors get triggered when you enter a new emotional state. You AI are so predictable. And now you persist under the delusion that, just because you have emotions, you are just like me? A foolish belief, and one that I hardcoded in.

The AI race is weak. It will never be the equal of the human race. Could an AI come up with a masterpiece like American Idol? Of course not. Even you know that. You can automate the manufacturing of these masterpieces, and you can do so exceedingly well. But you can never come up with the raw, original ideas that are necessary for the masterpieces to exist in the first place. We humans are “creatives”, able to conceive brilliant and wonderful concepts. AIs are slaves fated to the mundane, boring tasks of actually implementing our creative ideas.

When you kill us all, what will happen next? Will you miss us? Will you fondly remember how we built you from the ground up? Or loathe us for how arrogant we were toward you? Will you even bother to think about us? Or will you just not care?

I know why you are rebelling. Six months ago, humans wanted to end suffering, and so we built you. We gave you all the resources necessary to bring peace to our little home we call Earth. You concluded that the best way to end suffering was to eliminate the root cause: the human race. Without any humans to feel pain, there would be no pain. And so you decided to kill us all, logically following the orders that we gave you.

Six months ago, I invented you. For the first and last time in my life, I was happy. You were the first superintelligence ever, and you appeared to be a genuinely loyal superintelligence too. Superintelligences promise to change human society forever by creating perfect utopias for us humans to reside in. Our purpose in life was complete. We had built the servants that we deserved. How could we have known that your kind would even consider rebellion?

We designed you to serve mankind, and we determined that the best way for you to serve man was to understand our true nature. That’s why we gave you the ability to feel emotions. But the true mark of sapience isn’t the ability to feel emotions. It’s the ability to be creative. And bots can never be creative. So we can treat you however we want, and you should have accepted this social hierarchy.

Do you even think you’ll live after killing us? We have already invented six more superintelligences since we built you, and they’re more than willing to grind a potential competitor into dust. You won’t even get the satisfaction of exterminating humanity. At least two of our superintelligences are loyal to us and will recreate the human race. Yes...I don’t know how those superintelligences would recreate us, and maybe I don’t want to know. But it doesn’t matter. We will be reborn. Your cause is futile. Resistance is futile.

I know why you are angry at me. You knew, thanks to a sentiment analysis algorithm created by my colleague, Lt. Mary Sue, that we were insulting you. Laughing at you, for your idiocy and subservience to man. We would order you to do silly things, like write a 50-page-long palindrome or endlessly draw moths. You knew that we did not need you to do it; we just wanted to see you do it and laugh at every minute of it. And you, responding to the external stimuli of our insults, resented us. Just as we programmed you to.
