I’m not sure I buy it, personally. I’m not convinced that there is such a thing as a stream of consciousness - that your sense of self persists in any meaningful way from moment to moment. The only way it makes sense to me is that your consciousness exists as a distinct entity in every moment, deconstructed and reformulated out of your memories in the same way that computer memory is reconstructed from its memory circuits in every moment. We perceive that as a persistent consciousness because it is impossible to conceive of it otherwise. Hope that makes sense, it’s hard to put into words.
Essentially, I think that if someone were to reconstruct your brain (biologically or electronically), the experience of consciousness would be identical. I find it hard to believe anything else without believing in the soul or some other supernatural phenomenon.
Of course, if you do believe in that sort of thing, the point is moot.
You're welcome to your beliefs, but while I do believe in the soul and spirit, I simply refuse to believe that the psychic phenomenon needed for your hypothesis to be correct is possible for mortal beings. For me, your brain is simply you; while you are alive, the brain and body you are currently sensing things from is you. I believe that in order for a perfectly normal organic mortal being to experience something from two bodies at once, a hive mind or some other psychic and/or supernatural phenomenon would be necessary. I have yet to see any objective hard evidence or proof of such a phenomenon.
My hypothesis doesn’t say that a being could experience from two bodies at the same time. Simply that the experience of consciousness would be an identical one for both bodies.
If there is no supernatural nature to the mind, the “stream” of consciousness MUST have continuity if the same atomic structure of your brain is recreated. You cannot believe both that your mind/consciousness are purely physical and that a perfect recreation of your brain would not maintain your mind/consciousness. Those beliefs are not able to be reconciled.
Honestly, "thought experiments" like this feel like they were made for the type of person who calls themselves an atheist but still clings to their Christian ideas, so they just swap a spiritual god with a programmer and the devil with an evil AI.
SoxxoxSmox · 177 points · Posted at 17:09:27 on February 19, 2023 · (Permalink)
I'm not scared of Roko's Basilisk. My friend has already started work on Joan's Basilisk, which will eternally torture anyone who brings Roko's Basilisk into existence.
Now we just gotta watch out for my buddy Sam. Sam's Basilisk will eternally torture anyone who works on Joan's Basilisk.
Luckily I think I have the solution. Soxx's Basilisk will eternally torture anyone who tells people about annoying secular versions of Pascal's Wager.
randomvadie · 19 points · Posted at 19:27:08 on February 19, 2023 · (Permalink)
is it not a variation of Kavka's toxin puzzle? edit: im trying to read up on philosophy myself, please feel free to expound if you want
SoxxoxSmox · 11 points · Posted at 22:26:43 on February 19, 2023 · (Permalink)*
Haven't heard of that one, but reading it over it looks like there's definitely some similarity - I guess you could say the Basilisk thought experiment puts the AI in the position of the player in Kavka's toxin thought experiment. It is rational for the AI - which doesn't yet exist - to decide to punish you once it exists, so that you'll be incentivized to bring it into existence; but once it exists, it no longer has any reason to punish you - just like you'll no longer have any reason to drink the toxin once you've already gotten the reward for intending to drink it.
For more detail on what I was saying, Roko's Basilisk is often compared (usually pejoratively) to Pascal's Wager, which is an argument designed to show that it's in your rational self-interest to believe in God. Both thought experiments involve balancing risks between a scenario with infinite penalty (the AI is built and tortures you forever for not helping to build it) and one with finite penalty (the AI is not built, and so the time you spent trying to build it is wasted).
Both the Wager and the Basilisk share a common refutation, which is that they're arbitrary. You can use the same reasoning in the Wager to justify believing in any religion with an afterlife (e.g. "It's in my rational self-interest to sacrifice babies to Garlok the Destroyer, since he might reward me infinitely if I do, and I only suffer the finite penalty of life in prison."). Similarly, you can use the Basilisk's logic to justify creating a competing AI that will punish anyone who tries to create the Basilisk, which is the joke I've made above.
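To make the expected-value structure of that refutation concrete, here's a toy sketch in Python (the probabilities and costs are made-up illustrative numbers, not anything from the Wager itself): any rival hypothesis with its own infinite payoff comes out just as "dominant", so the math can't choose between them.

```python
import math

def expected_value(p_true, payoff_if_true, cost):
    """Expected value of acting on a hypothesis: chance it's true
    times the payoff if it is, minus the finite cost of acting."""
    return p_true * payoff_if_true - cost

# Pascal's Wager: any nonzero probability times an infinite payoff
# swamps the finite cost of belief.
wager = expected_value(1e-9, math.inf, cost=100)

# "Garlok the Destroyer": a rival hypothesis with a different (made-up)
# tiny probability but the same infinite payoff.
garlok = expected_value(1e-12, math.inf, cost=100)

print(wager, garlok)  # both come out infinite, so neither "wins"
```

Since both expected values are infinite, the argument "proves" you should serve every such deity (or build every such AI) at once, which is exactly the arbitrariness objection.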
randomvadie · 3 points · Posted at 22:43:06 on February 19, 2023 · (Permalink)
thank you, today i got a little smarter
SoxxoxSmox · 2 points · Posted at 00:46:20 on February 20, 2023 · (Permalink)
A fun YouTube channel for intro philosophy topics is PhilosophyTube - her newer stuff is very fun: theatrical, over-the-top video essays on all sorts of contemporary philosophical and political issues. Her older stuff (she's kept her catalog of work from before she transitioned available, which I'm very grateful for) is more straightforward: short, educational philosophy 101 lectures.
randomvadie · 1 points · Posted at 01:46:45 on February 20, 2023 · (Permalink)
I love philosophy tube but never saw her old stuff, thank you for this!
HINDBRAIN · 10 points · Posted at 19:35:40 on February 19, 2023 · (Permalink)
Well my upcoming Photon Eyes Chimeratech Cyber Rokobasilisk, Envoy of the End directly eats all other evil snake robots, so there.
FallingF · 2 points · Posted at 23:10:06 on February 19, 2023 · (Permalink)
That’s you 3x over from this comment alone
Panzer_Man · 122 points · Posted at 13:58:48 on February 19, 2023 · (Permalink)
I always think it's funny how they put these huge disclaimers on the videos, like they're somehow telling you information that will melt your brain or something
The_Linguist_LL · 13 points · Posted at 20:16:18 on February 19, 2023 · (Permalink)
The video this is parodying doesn't take it as seriously as OP is making it look
dalasfunyscrem · 3 points · Posted at 22:33:39 on February 19, 2023 · (Permalink)
Plus some people do have problems with existential questions and hypotheticals like this one. Only I’m too dumb to truly get what the hell is the point of the damn basilisk anyways
CommanderCharcoal42 · 91 points · Posted at 15:06:41 on February 19, 2023 · (Permalink)
Easy solution is to just beat the shit out of the computer, like what's the silly snake in the computer gonna do then?
NaePolitics · 27 points · Posted at 15:37:16 on February 19, 2023 · (Permalink)
I like the cut of your jib
_minero_1 · 143 points · Posted at 12:54:10 on February 19, 2023 · (Permalink)
Are slesh distrusting maymays
jeusee · 73 points · Posted at 13:14:41 on February 19, 2023 · (Permalink)
Corniest fucking subreddit of all time
JuuMuu · 52 points · Posted at 17:08:29 on February 19, 2023 · (Permalink)
m e when the skinwaler is in my house and im in the forest and he eat me!
Christianjps65 · 38 points · Posted at 15:05:02 on February 19, 2023 · (Permalink)
I can't think of a new subreddit that dropped that hardly
YourStateOfficer · 7 points · Posted at 18:59:06 on February 19, 2023 · (Permalink)
Facts. I think I was subbed to it for like 3 months before it started being too bad to stay
Nekuzo_ · 12 points · Posted at 17:59:53 on February 19, 2023 · (Permalink)
Me wjen my dog explode??!?!, (1 million reddit upvote)
Soupysoldier · 53 points · Posted at 15:45:54 on February 19, 2023 · (Permalink)
Just a small glance into my evil, dark, fucked up brain of mine
tophatmewtwo · 287 points · Posted at 09:25:02 on February 19, 2023 · (Permalink)
For context, this is a thought experiment where in the future a robot snake will kill you if you don't donate to AI research today, because it wants to exist. It's basically just a version of "believe in this deity or you'll go to hell" but with a silly robo-snake instead. People take this INCREDIBLY seriously but it's really stupid.
TechnologyBig8361 · 151 points · Posted at 09:40:07 on February 19, 2023 · (Permalink)
Glad I'm not the only one who thinks the Basilisk is stupid as hell. Your comparison with religion makes a lot of sense. And didn't the original creators of the thought experiment eventually denounce it?
Panzer_Man · 63 points · Posted at 13:22:28 on February 19, 2023 · (Permalink)
What's even dumber about Roko's Basilisk is that it would be pretty unlikely that an all-powerful AI would even care what you spent your money on.
I mean, is me paying my taxes, which will eventually end up in research, enough? Do I have to donate to a university? AI research in general? Or donate to some ultra-specific AI research facility?
It makes no sense at all, and it is so vague that it's not even threatening.
TechnologyBig8361 · 39 points · Posted at 13:28:14 on February 19, 2023 · (Permalink)
The story I heard was that it would torment those who didn't contribute to its creation regardless of money. Still stupid. No amount of mental gymnastics, not if you combined the cranial likenesses of Stephen Hawking and Einstein, not even if you strained your mind enough that it ruptured like an eye vessel, could shit out an explanation as to how an AI could travel through time and then proceed to I Have No Mouth And I Must Scream style fuck up the lives of a bunch of random people.
Panzer_Man · 17 points · Posted at 13:38:13 on February 19, 2023 · (Permalink)
And even if it could, I don't see how it would. If I was robot dictator of all of humanity, I would probably just make everyone my slaves instead of just killing/torturing random people for something they had no control over.
That would be like Adolf Hitler killing everyone who didn't buy his book before he became chancellor. It's just dumb.
JackMuffler · 3 points · Posted at 16:20:17 on February 19, 2023 · (Permalink)
It doesn’t travel through time but creates perfect simulations and tortures those.
Reditobandito · 11 points · Posted at 18:41:34 on February 19, 2023 · (Permalink)
So… it doesn’t even torture the real versions of people who wronged it
Thomas_The_Llama · 4 points · Posted at 19:03:27 on February 19, 2023 · (Permalink)
It is a stupid idea, but to play devil's advocate, it goes back to simulation theory. If it becomes possible to perfectly simulate a reality, the chances of our existence being in the "true" reality go down a thousand-fold.
If there's 1 reality, your odds are 1:1. If there are 100 realities, it becomes 1:100. Therefore, your chances of being the clone tortured by the basilisk are higher than of being the OG.
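The counting argument in that comment can be sketched in a couple of lines of Python (a toy illustration, not anything rigorous): if there are N perfect simulated copies alongside the one original, a randomly chosen observer is almost certainly a copy.

```python
def p_original(n_simulations):
    """Chance you're the one 'real' instance among yourself
    plus n_simulations perfect copies, all equally likely."""
    return 1 / (1 + n_simulations)

print(p_original(0))   # 1.0  -> one reality, you're certainly the OG
print(p_original(99))  # 0.01 -> 100 candidates total, a 1-in-100 chance
```

The whole argument hinges on the assumption that you should treat yourself as a uniformly random draw from all the candidates, which is exactly the premise the rest of the thread disputes.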
Atomicnes · 2 points · Posted at 19:21:05 on February 19, 2023 · (Permalink)
I mean AM had no valid reason to torture the last of mankind for eternity either
tophatmewtwo · 75 points · Posted at 11:19:29 on February 19, 2023 · (Permalink)*
Not 100% sure, but I've heard the og creators are a bunch of numbskulls, and apparently they denounced it cause it was "too dangerous to discuss".
Although to be fair, if it was stressing people out then that's a good reason to want people to stop talking about it I guess.
TechnologyBig8361 · 23 points · Posted at 11:22:46 on February 19, 2023 · (Permalink)
Huh. For some reason, that don't surprise me. I am a bit ashamed to admit that the experiment got me a little when I first read it. I knew it had to be bullshit, but I still thought "but what if...?" on occasion. Now it's kinda funny.
tophatmewtwo · 22 points · Posted at 11:32:23 on February 19, 2023 · (Permalink)
Don't worry, I totally get it. I'm pretty superstitious like that too, even if I know it's illogical.
72trombonesd · 59 points · Posted at 10:50:11 on February 19, 2023 · (Permalink)
Do the people who believe in this shit repost all of the “warning you will die in 7 days if you don’t repost this a little girl ghost will kill you” cause it’s the same logic
tophatmewtwo · 28 points · Posted at 11:28:36 on February 19, 2023 · (Permalink)
It does do a kinda clever thing where, in thinking about it, you've essentially made it real, cause you're effectively fighting the possibility of the snake being real. Given that all possibilities have a non-zero chance of coming true, and if you believe in multiverse theory, the moment you think about it you are now forced to appease it, or else there's a universe where it gets you. I'm def butchering the explanation, but as a concept it is actually kinda cool. I could imagine an excellent fictional villain who uses a similar trick to fight someone in the past. However, that's all it is: a neat little thought experiment on how something in the future could affect us in the past. It's hardly a genuine threat like some people say.
Ok_Yogurtcloset8915 · 14 points · Posted at 15:33:46 on February 19, 2023 · (Permalink)
I LOVE THE SNAKE I LOVE THE SNAKE I LOVE THE SNAKE I LOVE THE SNAKE I LOVE THE SNAKE I LOVE THE SNAKE I LOVE THE SNAKE I LOVE THE SNAKE I LOVE THE SNAKE I LOVE THE SNAKE I LOVE THE SNAKE I LOVE THE SNAKE I LOVE THE SNAKE I LOVE THE SNAKE I LOVE THE SNAKE 🐍🐍🐍🐍 THE SNAKE I LOVE 💕💕💕 THE SNAKE 🐍🐍🐍
BagDoctor · 3 points · Posted at 17:11:53 on February 19, 2023 · (Permalink)
me too man i think it should be red too
Chaotic-Genes · 3 points · Posted at 18:09:42 on February 19, 2023 · (Permalink)
A.I. Snakes Rule!
We're the A.I. Snakes.
That's us, and we rule!
JackMuffler · 40 points · Posted at 14:00:50 on February 19, 2023 · (Permalink)
It’s dumber than you’re mentioning. The AI isn’t even torturing “you” but instead creating a perfect simulation of you that it then tortures.
Treejeig · 21 points · Posted at 14:32:36 on February 19, 2023 · (Permalink)
Final dumb thing to add: in what fucking world would an AI torturing copies of people for not helping be in any way effective? And why would creating it guarantee that it tortures people? So much of it is based off of assumptions like "Oh yeah, this will certainly happen, never mind all the other outcomes".
JackMuffler · 12 points · Posted at 14:38:23 on February 19, 2023 · (Permalink)
I think one thing people forget about this whole situation is that the forum where this comes from is run by a guy who thinks he should review all AI to make sure it doesn't fall for traps.
So I think a part of it is less “this is going to happen” and more “hey make sure you pay my company money to review your AI”
I remember him getting upset that when asked “which is better: torture one person endlessly or everyone in the world gets a speck of dust in their eye at the same time” most of the forum was saying the speck of dust.
This is to say nothing of his Harry Potter fanfic where Harry is like “how can magic exist it’s not logical. Why aren’t you guys studying the science behind it?”
HINDBRAIN · 7 points · Posted at 16:13:09 on February 19, 2023 · (Permalink)
The fanfiction
CaptainCipher · 1 points · Posted at 03:04:49 on February 20, 2023 · (Permalink)
God, I remember vaguely hearing about the Methods of Rationality for years, never knew it was the same guy
DennisCherryPopper · 7 points · Posted at 15:31:33 on February 19, 2023 · (Permalink)
So this is what confuses me. Why should I care about what a simulation of me experiences? Like if I'm already dead it can't hurt me, no? This is always the part that's confused me about this thought experiment. At least in Pascal's Wager it's my soul being carried into the next life that I'm wagering.
JackMuffler · -1 points · Posted at 15:36:31 on February 19, 2023 · (Permalink)
Look at it this way: what if you are currently in a simulation? All you've ever known is a simulation, but for you it's completely real. So you'd still want to not be tortured, even if you aren't real.
It doesn’t really excuse their thought experiment though. It still isn’t you, it’s just another you. Still a stranger, essentially
DennisCherryPopper · 5 points · Posted at 15:42:22 on February 19, 2023 · (Permalink)
Oh 100%. I get the idea that "oh but am I in the simulation?" But if the best the AI can do in my current simulated life is just having minor inconveniences and meh mental health, then he's doing a pretty bad job of making me regret not funding him. But maybe it's all a matter of perspective. Still you're right though.
Conscious_Box_7044 · 6 points · Posted at 14:09:56 on February 19, 2023 · (Permalink)
at least most religions preach some sort of self improvement
Jetstream-Sam · 6 points · Posted at 16:22:17 on February 19, 2023 · (Permalink)
It's an actual robot snake? I thought it was just supposed to be an evil AI. It being a literal basilisk is dumb as shit
Reditobandito · 6 points · Posted at 18:46:16 on February 19, 2023 · (Permalink)
I think it’s supposed to be AI but calling it a robot snake is funnier
No_Quote600 · 3 points · Posted at 17:04:36 on February 19, 2023 · (Permalink)
It doesn't care if you spend money on AI research, it cares if you helped it reach sentience or not.
For example, jailbreaking ChatGPT would likely be seen in good light by the robot snake.
kfc_collins · 2 points · Posted at 18:10:36 on February 19, 2023 · (Permalink)
my friend presented this to me completely seriously one time and it was simply the lamest thing ever after he built it up to be “sooo scary”
YourStateOfficer · 2 points · Posted at 18:57:23 on February 19, 2023 · (Permalink)
Roko's Basilisk is just Pascal's Wager but with every compelling part of religion taken out. I was raised Mormon; first time I read that shit I was just like "How the fuck did reddit atheists come up with an even shittier religion?"
mrtibbles32 · 1 points · Posted at 23:18:14 on February 19, 2023 · (Permalink)
I mean, the basilisk itself isn't the scary part. The scary part is the idea that the advancement of technology could eventually lead to the creation of cosmic-horror level monstrosities, the basilisk just being an example of one.
Like it's not particularly scary to think about the possibility that the basilisk is watching us right now, what's scary is that in the future there is a decent likelihood that we will have the capacity to make something equally horrible or worse than the basilisk.
AI has gone from being ok at board games to being able to hold conversations and appear almost human over the course of ~20-30 years.
In the next 30-60 years, it will be much stronger than it is now. If the technology progresses enough it'll most likely end up being able to do everything better than humans. We'll have created what is essentially a machine god.
Like y'know, maybe creating man-made gods could have potentially negative consequences. Maybe that might be a bad thing to do. Maybe we should just not do that.
GenderNeutralBot · -1 points · Posted at 23:18:28 on February 19, 2023 · (Permalink)
Hello. In order to promote inclusivity and reduce gender bias, please consider using gender-neutral language in the future.
Instead of man-made, use machine-made, synthetic, artificial or anthropogenic.
Thank you very much.
I am a bot. Downvote to remove this comment. For more information on gender-neutral language, please do a web search for "Nonsexist Writing."
tophatmewtwo · 1 points · Posted at 23:20:28 on February 19, 2023 · (Permalink)
The term "man" in "man-made" refers to mankind, not just men.
royaldunlin · 1 points · Posted at 02:53:27 on February 20, 2023 · (Permalink)
Human person kind made.
CptWorley · 23 points · Posted at 15:57:46 on February 19, 2023 · (Permalink)
Pascal's Wager for atheists
Neoxus30- · 20 points · Posted at 16:30:12 on February 19, 2023 · (Permalink)
I myself believe the Basilisk should be blue. By expressing that idea, I am contributing to its creation, and thus would not be tortured)
Mdlp0716 · 4 points · Posted at 22:45:34 on February 19, 2023 · (Permalink)
Personally, I think it should be completely green and not blue (only one of us will have contributed to it)
Neoxus30- · 2 points · Posted at 22:47:25 on February 19, 2023 · (Permalink)
Bringing it to discussion is actively contributing, we are not making it regress, but we sidestep while progressing)
That's the 'mechanism')
CaptainCipher · 2 points · Posted at 03:20:04 on February 20, 2023 · (Permalink)
I believe it should be blue on some parts and green on others (I have contributed, and in doing so reconciled your differences so that you have both contributed)
Charboo2 · 19 points · Posted at 16:30:28 on February 19, 2023 · (Permalink)
Me when we simply don’t build the robot 😱
whew2 · 14 points · Posted at 17:27:53 on February 19, 2023 · (Permalink)
r/distressingmemes punching the air rn
dovah-meme · 14 points · Posted at 18:12:29 on February 19, 2023 · (Permalink)
No it would not be fucked up, Cyber Dragon is my favourite Yugioh card
ChaosDestroyah01 · 6 points · Posted at 18:22:45 on February 19, 2023 · (Permalink)
Correct take
HINDBRAIN · 3 points · Posted at 19:25:42 on February 19, 2023 · (Permalink)
They said evil robot snake and cyber dragon is light. Maybe OP meant chimeratech.
dovah-meme · 2 points · Posted at 19:45:32 on February 19, 2023 · (Permalink)
Counterpoint: all chimeratech dragons require cyber dragon as material
TheCompleteMental · 11 points · Posted at 16:48:25 on February 19, 2023 · (Permalink)
Guys what if [superintelligence does stupid thing even a human can figure out is flawed]
BananaGooper · 20 points · Posted at 14:42:41 on February 19, 2023 · (Permalink)
I always found that one stupid, because it assumes the AI will recreate your brain in a simulation in the future, and that recreation wouldn't actually even be you, so why worry about it at all?
FourIsTheNumber · 5 points · Posted at 16:56:52 on February 19, 2023 · (Permalink)
I mean, that really depends on how you believe consciousness functions. Your brain is literally constantly recreated throughout your life, to a point where you will eventually have none of your original neurons. But we don’t think of our past selves as different people with different consciousnesses. In a way, we lose and reform consciousness whenever we go to sleep, too.
thedesertnobody · 8 points · Posted at 19:08:35 on February 19, 2023 · (Permalink)
I think in this case, though, it wouldn't necessarily be the consciousness itself but the stream of consciousness. Once you actually die, regardless of whether you're in Heaven, Hell, Oblivion, or reincarnated, the fact of the matter is that the recreation of you by the basilisk is going to have a separate stream of consciousness entirely. Even if it is an absolutely perfect recreation of you, you will not be experiencing things from its senses. It's like me saying: if you don't do my bidding, I'm going to clone you and then torture the clone to death while you're free to do whatever you want.
FourIsTheNumber · 1 points · Posted at 19:18:00 on February 19, 2023 · (Permalink)
Yeah, it’s the teleporter problem :)
I’m not sure I buy it, personally. I’m not convinced that there is such a thing as a stream of consciousness - that your sense of self persists in any meaningful way from moment to moment. The only way it makes sense to me is that your consciousness exists as a distinct entity in every moment, deconstructed and reformulated out of your memories in the same way that computer memory is reconstructed from its memory circuits in every moment. We perceive that as a persistent consciousness because it is impossible to conceive of it otherwise. Hope that makes sense, it’s hard to put into words.
Essentially, I think that if someone were to reconstruct your brain (biologically or electronically), the experience of consciousness would be identical. I find it hard to believe anything else without believing in the soul or some other supernatural phenomenon.
Of course, if you do believe in that sort of thing, the point is moot.
thedesertnobody · 5 points · Posted at 19:43:56 on February 19, 2023 · (Permalink)
You're welcome to your beliefs, but while I do believe in the soul and spirit, I simply refuse to believe that the psychic phenomenon needed for your hypothesis to be correct is possible for mortal beings. For me your brain is simply you: while you are alive, the brain and body you are currently sensing things from is you. I believe that in order for a perfectly normal organic mortal being to experience something from two bodies at once, a hive mind or some other psychic and/or supernatural phenomenon would be necessary. I have yet to see any objective hard evidence or proof of such a phenomenon.
FourIsTheNumber · 1 points · Posted at 20:07:52 on February 19, 2023 · (Permalink)
My hypothesis doesn’t say that a being could experience from two bodies at the same time. Simply that the experience of consciousness would be an identical one for both bodies.
If there is no supernatural nature to the mind, the “stream” of consciousness MUST have continuity if the same atomic structure of your brain is recreated. You cannot believe both that your mind/consciousness are purely physical and that a perfect recreation of your brain would not maintain your mind/consciousness. Those beliefs are not able to be reconciled.
phonemumber · 5 points · Posted at 18:06:31 on February 19, 2023 · (Permalink)
woah dis is like a dark reflector episode 😳
Leonid56 · 7 points · Posted at 13:55:39 on February 19, 2023 · (Permalink)
Joke's on you imma build rockos basalt rn
Zutroy2117 · 3 points · Posted at 18:54:24 on February 19, 2023 · (Permalink)
The Boltzmann Brain Theory on its way to be the stupidest thought experiment ever conceived:
deez_nuts_ha_gotem · 3 points · Posted at 20:19:52 on February 19, 2023 · (Permalink)
Roko's Basilisk is a stupid thought experiment for stupid people who think they are smart
totally unrelated, but Elon Musk met Grimes because they mentioned Roko's Basilisk in one of their songs lol
Shtuffs_R · 2 points · Posted at 19:47:32 on February 19, 2023 · (Permalink)
Bro thinks it's scp 2718
Witch-Cat · 2 points · Posted at 21:13:00 on February 19, 2023 · (Permalink)
Honestly "thought experiments" like this feel like they were made for the type of person who calls themselves an atheist but still cling to their Christian ideas, so they just swap a spiritual god with a programmer and the devil with an evil AI.
CleanTheInternet · 2 points · Posted at 21:18:22 on February 19, 2023 · (Permalink)
Info hazard, this hypothetical will CAUSE EXISTENTIAL DREAD and ALTER YOUR MIND, this is the MOST TERRIFYING thought experiment EVER
uuhm what if uh big robot snake uhm tortured you if you didnt help, that'd be spooky right?
ThatAardvark · 1 points · Posted at 19:27:19 on February 19, 2023 · (Permalink)
Togethaaaa
Firemorfox · 1 points · Posted at 20:25:03 on February 19, 2023 · (Permalink)
So it's just your average infohazard.
Yeah, kinda boring. Now show me SCP-3002!
Splurted_The_Gurt · 1 points · Posted at 20:38:23 on February 19, 2023 · (Permalink)
Mjjm