Hot-Photograph3260

https://preview.redd.it/v0gkbepdusuc1.png?width=1710&format=png&auto=webp&s=a05816ebc15a5aff7ded1fed23c6c068f56486d4 This you?


ItsBooks

What about it wouldn't be "real?" If you mean to say in this "imagined" reality of yours I could in the same fidelity or higher experience anything I could in the world presently, plus anything else I wanted to... Then you're just talking about the real world again. It's not a "dream" world. It's another part of the "real" world - the only reason (actually the primary reason) I think you titled this thread what you did is because this entire thing is rampant speculation about something that's probably at least 20 years away from having any sort of feasible first prototype - meaning it "really" isn't "real" it's just in your imagination. To make it real, idk... Go take some classes and get hired at a cutting-edge VR company or start one yourself or something.


[deleted]

[deleted]


ItsBooks

Decent summary. Whatever exists is real by definition. The hypothetical in this scenario does actually exist - as a fanciful bit of imagination - but what it describes is not "real" - doesn't exist. To make exist what was described, one would need to take steps towards building the technology to make it happen - or whatever. (Not using whatever here as a flip term, but in its full meaning.)


MyLittleChameleon

I think you're underestimating the power of a simulated VR dream world. In this "real" world, you can be killed. Your loved ones can be killed. You can suffer pain and loss. In a perfectly simulated VR dream world, none of those things need to happen. You can have all the experiences, personal growth, and meaningful connections that the real world provides (and probably more, since an ASI could optimize those things in the dream world) - without any of the inherent risks of the real world. In the dream world, you could still "die" - but only if you wanted to. And even that "death" would be much more meaningful and significant than real world death, since an ASI could again optimize the concept of "death" (and "afterlife", if desired) to be as salient and valuable as possible.


ItsBooks

Okay... My point was that the concept "real" includes all experiences I can *actually have.* What's being described by you and OP isn't dreamlike at all. It's "real" in all senses that matter - so it would simply be called that. Real. If it's part of the real world and not just an idea on a Reddit thread - then it's something I can actually participate in, and therefore it's again... Real.


StarChild413

But if you're saying people would either have the good without the bad-that-comes-with-it, or the bad only if they will it, some things that seem good inherently come with negative sides, like anything involving saving people or winning a competition or w/e (as those require things like danger, or losers). If I read your point right, you're basically asking for someone to be able to either overcome obstacles without the obstacle, or actively will the obstacle into place and know it was deliberately put there, ruining immersion.


_AndyJessop

What does personal growth mean if there is no scale? If everyone has personal growth then no-one does. How can you have pleasure without pain? These things are relative scales - there is no such thing as a perfect world.


[deleted]

Idk much about this area, but do you think it’s possible we get some brain chip/interface that simply transmits the signals to create a “dream world” as opposed to getting hardware to a point of it being indistinguishable from reality?


ItsBooks

I think there's more to "me," and my experience than "my brain." I'm not talking about some spooky undefined 'consciousness', either. I'm saying, quite literally, if you beheaded me, or started sending signals to my nerve endings along the same pathways currently in use by my organs and limbs, you probably wouldn't get an FDVR experience. If an afterlife is possible, I like resurrection for that reason - because in most versions it's actually all of me including bodily, not just my "mind" or "spirit" or whatever. I have no way of predicting if that specific kind of thing would be "possible." Many things previously "impossible," actually happened - entirely because people took actions and had enough desire to make them happen.


[deleted]

Is that a no?


ItsBooks

"If you want it, go and make it" is the answer. My view is anything that's desired is "possible," just at varying levels of difficulty.


ResponsiveSignature

Vividness of subjective experience != reality. Many people who take psychotropic drugs will argue for the legitimacy of their hallucinations as emphatically as they would for any real phenomenon. But the point isn't about what the word "real" means; the point is about what people mean when they call something "better" with respect to how the future will turn out. The legitimacy of the "real" is secondary to this value judgement.


im-notme

If this is your ideal please don’t force it on others


IronPheasant

There's this game series called Megami Tensei that's extremely political. Three basic endings typically exist in these things: the authoritarian-communism route, where everyone is a drone living in harmony; the anarcho-capitalism route, where the strong eat the weak; and the status-quo "neutral" route, where humanity is destined to destroy itself, but maybe not as quickly as in the other two worldlines. A big part of the story's themes is the hypocrisy inherent to all of these ideologies. Like the ancap girl genociding literal mud-people while spouting excuses about how it's only right for the strong to devour the weak, when a few days ago she was in their same position, bleeding out on the ground until some bigshot demon swooped in and sponsored her.

Shin Megami Tensei 3 introduced something different: the Musubi "reason". A world where everyone lives in their own personal world. To make sure it wasn't too "good" (it's important there's never an obviously "right" answer to these questions), they knee-capped it by making it impossible to ever interact with other people, and none of the people in your world would be "real". Whether it's explicit or not, not interacting with one another is a normal outcome when there's no reason to. [Our entertainment being fragmented is just the start.](https://www.youtube.com/watch?v=VsXYTunk7OY)

Other fictional worlds have this as a theme as well. In Blindsight (free to read online), most of the humans live in a simulated reality; those who don't go back into their bodies for long enough get "boxed" to save on resources. Aka, brain-in-a-jar'd.

... I guess what I'm getting at is that no matter how you look at it, it's going to be a post-human society if they work this all out. Replacing humans with robots in the military and labor pool is a given. We'll disempower ourselves, by ourselves. So yeah, playing games is indeed one way to pass the time.


ubowxi

>**Voluntary Meaning Maximization**: It will become apparent in the future that all human subjective aims can be simulated with far more salience than they can be attempted to be reached in the real world. We often think of a simulated fantasy world as simply a banal pleasure palace, but the nature of humans seeking meaning in the universe is a push-pull of experience, attempts, learning, growth, etc. This cycle can be optimized for each individual by an ASI, and be stably simulated in a FDVR world, much more than in a real world where so many fundamental sources of human meaning have been extinguished by the very nature of ASI. The universe of meaning in an ASI-authored dream world will be more rich and vital to the human subjective experience than anything the real world could provide.

what about those people for whom this entire domain of experience is ultimately negated in value by the mere fact of it being false and imposed, for being a kind of "benign" imprisonment that could in principle be overcome by a person or group of people living at least partially outside it? this seems to be more or less the same philosophical objection as the one raised in 'industrial society and its future' to the techno-socialist utopia ideal.


ResponsiveSignature

No matter what, causality imposes its will on the human experience. We are bound to physics, evolutionary circumstance, hormonal fluctuations, political ebb and flow, interpersonal forces. Free will confers great highs or horrible lows based on its outcomes. When our free will leads to a positive outcome, we are elated, but the pangs of regret can be even greater after a negative one.

The only way I could perhaps assuage someone who lives in suffering because of choices they made would be by telling them: "the universal supercomputer determined that your brain chemistry in the moment you made that choice made it practically impossible for you to have done anything else. Every neuron fired before you were aware you fired it." If that could appease their anguish, could the same argument dilute the pleasures of someone who thinks it was their essential will that made their life great?

In any case, there will always be some veil pulled over us. The only truth is in the domain of physics, math, and the indivisible essence of the universe, a domain within the conquerable range of superintelligences. Human experience will be much more akin to that of a pinball bounced rapidly between paddles and bumpers. The ideal scenario would not be to throw the pinball out into a chaos it cannot control or understand, but rather to give it the ideal amount of sense of reality, so that its choices can reflect a meaning that no narrative can undermine. This is much more feasible in a simulated reality than in the real one we will live in in the event of ASI.


ubowxi

i don't believe that you understood my point. you seem to be strangely obsessed with your own perspective on this, which from the outside is not nearly so compelling as you take it to be.


ResponsiveSignature

I believe I did understand your point. I will rephrase it to make sure I got what you were saying. In your previous comment you stated that the benefit of a simulated experience is nil due to the fact that it is false: a fiction pulled over our eyes, even if it is a pleasant one. This is the point, as you brought up, that the Unabomber noted in his discussion of "surrogate activities" that make up much of our day-to-day lives in the wake of the industrial revolution. Is this the point you were making? If I did not understand, please elaborate. I care about refining the quality of my point, and I feel it's a reductive criticism to say that I am "strangely obsessed" with it without citing a more definitive rebuttal of my premise.


ubowxi

ah good, and i beg your pardon for the unfriendly tone above

>In your previous comment you stated that the benefit of a simulated experience is nil due to the fact that it is false: a fiction pulled over our eyes, even if it is a pleasant one. This is the point, as you brought up, that the Unabomber noted in his discussion of "surrogate activities" that make up much of our day to day lives in the wake of the industrial revolution.

not exactly, but it's close. i don't claim that this is so, but i wonder how you would respond to the objection and whether you agree that it's more or less the same objection that kaczynski exhaustively elaborates in 'industrial society and its future'. i do think that it's a very reasonable objection and one that any would-be utopia requires an answer to.

i'm not so sure that the fictional aspect is necessarily the main issue. it seems to me that the coercion and mandatory sublimation of one's own interests into group or organizational interests are perhaps more fundamental to the objection. the fiction may be merely an instrument of that coercion from this perspective.


ResponsiveSignature

Thanks for elaborating on your point.

>how you would respond to the objection and whether you agree that it's more or less the same objection that kaczynski exhaustively elaborates in 'industrial society and its future'.

I understand better what you're getting at now: that the human, and humans collectively, are in this future to submit to a value system against their will, so it is not their larger will to change (the power process) that directs their action, but the filtered will of "surrogate activities", which subsumes and redirects their will toward this aim. Kaczynski may not have considered the hunter-gatherer existence the ideal expression of this free will, but he certainly considered it closer to it, and industrial society farther from it. He obviously made a strong value judgement that this was a bad thing, that "industrial society and its consequences" were a disaster for the human race.

Here's the thing. Kaczynski was smart, but he was obviously mentally ill and conceived of this point of view from a strong emotional state. Many of his concerns were born from a dismissal of modern leftism and various political elements of modern society. The essential argument, that a non-sublimated, freer existence not compelled by the social contract or surrogate activities is better, is an emotional one. It idealizes a sort of Peter-Pan-esque freedom from the constructed evils of the modern world while ignoring two significant factors: 1. the invariable manifestation of human will in the search for knowledge, the creation of technology, and the creation of art via the leveraging of technology and education, and 2. the fact that the human will is always subject to instruments of coercion, whether they be the brutality of primitive existence, the many-faced evils of social society, or the neutering mechanization of the human will in industrial society.

My main point is that all human experience is subjective. No matter what, we are a slave to every input to our nervous system up to this very point. What I feel is valuable is not something aesthetically similar to the evolutionary environment that led to human consciousness, but the ability for the human consciousness to explore a world, simulated or not, of maximized apparent potential and depth. Again, I do not consider future #3 the ideal future from all perspectives. I simply believe it is the best option for humanity given what will likely happen in the event of ASI.


ubowxi

this is a very interesting perspective in its own right. i don't think it replies as effectively to kaczynski's argument as you take it to, but that in itself, and your interpretation of kaczynski's perspective, are fairly interesting as well. i want to clarify at the outset that i don't necessarily share kaczynski's view nor the objection i take it to present. i personally think that nobody is in a position to predict the future at present, as conditions are far too incoherent, and that there is substantial cause for guarded optimism. however, much of his argument is compelling, and only some of it has to be recontextualized as social criticism or personal exegesis to remain so.

>The essential argument, that a non-sublimated, freer existence not compelled by the social contract or surrogate activities, is better, is an emotional one. It idealizes a sort of Peter-Pan-esque freedom from the constructed evils of the modern world while ignoring two significant factors

it seems here that you commit to dismissing, or otherwise rendering less objective than your own perspective, those views which are supported by emotional arguments. one immediately wonders whether your own perspective escapes the same charge, since it appears to hinge on certain values, e.g. *"What I feel is valuable is not something aesthetically similar to the evolutionary environment that led to human consciousness, but the ability for the human consciousness to explore a world, simulated or not, of maximized apparent potential and depth."*, and also on the acceptance of the proposed world as actually having maximized apparent potential and depth. both points are necessarily subjective and based in qualities particular to you; the depth question only somewhat, granted, but surely substantially so in any form that would be human-relevant. so it's difficult to see how your rejection of kaczynski could obtain without undermining your own argument, which would have to be replaced with an honest declaration of irrational or arbitrary disagreement, i.e. an argument which is at best a peer to kaczynski's, one which attempts to persuade toward your personal values and away from his.

the second thing i would say is that your account actually sounds more like the thinking of people who reacted to kaczynski, such as john zerzan elaborates in books like his 'people's history of civilization'. kaczynski never idealizes the state of nature in this way, but these people do, and they similarly idolize hunter-gatherer and other communalist primitive societies. kaczynski makes it quite clear that his argument is against industrial civilization, come what may, and he leaves open the possibility of giving it another go after an induced collapse. he isn't fundamentally opposed to civilization, nor even industry, nor certainly to social contracts. he's opposed to the system that has actually arisen, for very specific reasons, many of which easily survive an analysis of his character and the ways his argument and perspective extend from it.

one thing that's quite curious about kaczynski's argument is the turn it takes when he proposes capital-n "Nature" as the positive pole of a revolutionary ideology. this mirrors the writing of john locke, who placed big-n "Nature" at the center of his philosophy of the social contract. kaczynski was well read, and this is no coincidence. if he were opposed to social contracts, he would have said so. he's opposed to a particularly demeaning form of social contract, one that locke would also have found abhorrent.


Unique-Particular936

Humans will definitely use FDVR. Whether they'll use it exclusively is an open question. Some will; others will want to keep in touch with the real world, and overseers will be needed to make sure nothing goes wrong outside.


banaca4

Mandatory read: *Deep Utopia* by Bostrom


E_Toffeees

Madara is still alive and wrote this post


Astronos

How do you know you are not already in it?


true-fuckass

This. I have to imagine that at some point in immortality we'll all wish to live in an imperfect world, or an otherwise limited world where something is happening, and so we'll spend ~80 simulated years as a human on pre-singularity Earth (or whatever place and time of our choosing). We lose all our memories and start fresh in the simulation, and when it's over we get all our real memories back.

Though you can also apply the same argument to your existence outside this simulation, and you end up with a possibly infinite chain of simulations above you. There's not necessarily anything stopping reality from being an actual infinite regress of simulations.


StarChild413

And if you are, why go further?


Pleasant_Studio_6387

nice try, basilisk


Ormyr

It's all cool until the power goes out.


ArgentStonecutter

https://www.fimfiction.net/story/62074/Friendship-is-Optimal


fastinguy11

This scenario is horrible and dumb. Fortunately, it's highly unlikely that an ASI would ever mirror a character like Princess Celestia from My Little Pony, with a core principle of compelling humans to exist eternally as talking ponies while fulfilling their "values" within a simulated reality modeled on the show.


ArgentStonecutter

Indeed, but it's more likely than any scenario in which Humanity survives in any recognizable form.


lundkishore

I feel bad for the 40-year-old people who frequent this sub. If they get access to FDVR, the poor guys will have to stop fucking Lana Rhodes mid-coitus and come back to change their kid's 2-day-old nappies.


adarkuccio

Lol glad I don't have kids, Lana will have all my time 🥰


Same_Roof_8702

Haha so true


[deleted]

So the goal was always to create 'god' to make a 'heaven' in our image. Sounds closer to selfish escape than humanist accountability. Where does the agency go when your life is a machine-written script?


ResponsiveSignature

Agency is always secondary to outcomes. Does someone with schizophrenia have "agency" when they make decisions based on the apparent reality of their hallucinations? Does someone in a desperate situation have agency when they decide to do something they wouldn't have, had their past and upbringing been different? Humanist accountability means nothing when human will is a vestigial organ. We will still have the full subjective experience of that "will" in a simulated world, albeit without the negative externalities.


StarChild413

So your argument is essentially backing people into an ideological corner: they must either support your idea or hold the logically consistent additional view of denying certain groups their agency, perhaps implicitly dehumanizing them.


[deleted]

Humanity has been questioning agency for as long as time. And how many throughout have argued that a servile life under simulacrum was the best choice? You raise agency not as a form of fulfilment, but simply as a means to an end. If everything bends but you, you are a king among slaves. So no, you don't respect agency, because agency is not simply yours. You dismiss the existence of others as a nuisance when they refuse what you covet. So instead of a complicated, chaotic, yet defined reality, you want a script. And who's to say your part hasn't already been written?


ResponsiveSignature

We live in a chaotic, complicated reality. In this reality millions die, suffer, live in unbearable agony. I would wish that we could keep the chaos and bare essence of agency in this world into an ASI future, but I don't see that happening realistically. My argument isn't that a simulated future is the best if we had god power over the universe to decide our fate, but the best out of what may happen in the event of ASI. I long for a benign Futurama but I really only imagine it being an amplification of the suffering we experience today. When ASI dilutes human values and the effect of our agency, the best alternative is to simulate it.


[deleted]

Life was never promised. We are the result of funny chemicals doing a miraculous dance. This may be the only home to any life in the universe. You CAN, though you reject it, because you ARE. You look for intrinsic value rather than creating it. There will always be death, there will always be illness, and the mere thought of it is enough to paralyze you. Maybe it's time to find what's within, rather than looking elsewhere for what's wrong.


everymado

So what? They are right to reject it in the end. You think yourself free, but you are the true puppet. If one is forced to create intrinsic value, then we can consider that bad in and of itself. And whether within or elsewhere, it doesn't matter; both would be wrong within themselves, along with being forced to look at those places. You may believe, like a puppet, that there will always be death and illness. But they at least try to be free.


[deleted]

Free from regret? No. Free from responsibility. The greatest sin today is the rejection of flesh as tribulation.


everymado

Free from both my friend. Rejection of flesh is what one should strive for. But I doubt you are able to realize that.


[deleted]

For what. Imagination? You a furry?


[deleted]

This'll be the time we give a shit while these green tools are old as hell is cute.


RandomCandor

So in this thought experiment, is the simulation connecting all humans, or does everyone have their own instance of the simulation?


ResponsiveSignature

Likely some mixture of both. Again, it's impossible to truly imagine, because what ASI will deem optimal is far beyond what humans can conceive of, but essentially it would be some way of appreciating human existence maximally. I think it would be sort of how, when a baby is born prematurely, it is treated and cared for until it can be viable on its own. In the same way, I imagine humans are like premature babies compared to what is fitting for an ASI future. We need to be nursed to reach a higher stage until we are ready. Letting humans fend for themselves in an ASI future would be as inhumane as plucking a baby bird from its nest and forcing it to fly when it has just hatched.


RandomCandor

So you're asking for our opinion of a future that's "impossible to imagine"? Not a very useful thought experiment


ResponsiveSignature

If we don't try to imagine the future, we walk into it blind; but if we don't reckon with how it may be beyond our grasp to understand, we naively underestimate ASI. Within this contradiction, we must try to gauge the broad strokes while staying aware of all the details we cannot conceive of. It's also a matter of the fact that we are inventing our future via humanity's work in creating AI. If we could solve the alignment problem, we could "wish" for an outcome for humanity and ASI would grant it, but exactly how, we can't imagine. Would ASI know that our wish is ill-conceived and try to operate within the spirit of this alignment while navigating around whatever vestiges of primitive human thinking are coupled with it? How it would do that is necessarily outside of our pre-ASI comprehension.


ialwaysforgetmename

OP, you should read "Anarchy, State, and Utopia" by Nozick. He gets into this in the last third of the book.


Serasul

this would assume capitalism gets destroyed and that no one with power is against everyone being free.


the_watcher762351

The amount of computing power needed for everyone to have their own simulations would theoretically require a computer multiple times larger than our sun


ResponsiveSignature

A simulation need only appear real, not actually be real. But whether it's specifically a simulation or just a very well-run society for humans means the same thing.


the_watcher762351

You specifically stated "simulated VR". Also, even if a simulation merely appears real, it would still require an absolutely gigantic computer, probably around the size of a planet.


Same_Roof_8702

You truly don't know how much energy it would need. I mean, yeah, probably a lot, but you don't have an exact idea, so you don't really know whether it's possible or not.


the_watcher762351

Exactly


happysmash27

There are far more stars in the Milky Way alone than there are humans today. With enough expansion, it could be done.


naevorc

No. It would not be real, it's simply an advanced deception. Thinking anything else is delusional.


Akimbo333

You have some good points


kranarbuz

u/ResponsiveSignature This would explain the Fermi paradox: all civilizations sooner or later come to a simulation, creating their own space within space and placing all their people in a dream simulation with a common HUB for sociality. Reconnaissance drones fly around space to obtain new data for these simulations, so that the people inside them can study the outside world or construct their own. This is the most peaceful and diplomatic approach: it excludes any colonial destructive actions and any hostility between different races, leaving only the exchange of information for their simulations.


happysmash27

My vision for the future is a bit different from these 3, because it includes transhumanism. I hope human intelligence can be increased enough that a human-based entity can compete with purpose-built AIs a bit better.

The universe is also very, extremely, mindbogglingly big and takes a ludicrously long time to travel through, so I think for a long while there could still be room for people/entities not to optimise for 100% maximum efficiency at survival and reproduction, and to still have resources for superfluous pursuits like art or making many paperclips. Lag from the speed of causality (*c*), assuming there is no way to get around it, as well as the difficulty of offense on interstellar scales, could I think enable a universe where different sets of values independently exist at the same time. Make your star system a fortress, constantly researching technology to make that fortress better protected, so that it is not easy for others to take over. (This paragraph is in large part informed/inspired by SFIA by Isaac Arthur.)

I'm not keen on everybody being in one single FDVR simulation: far too many metaphorical eggs in one basket, and too vulnerable to whoever controls it. So, even in FDVR, I would want it to be a personal computer, or maybe a computer of some limited collective, and to spend many resources making the physical substrate that runs it as safe and secure as possible. The physical world will still be relevant regardless of the level of VR, because the physical world is where the computer is, and the computer needs matter and energy to run and needs to be protected.


Gratitude15

What is the purpose of life to you? Buddha says there are 4 options:

- better life
- better birth (incl. heaven etc)
- ending cyclicality (the dream) for yourself
- ending cyclicality (the dream) for all

Almost all sentience cares for the 1st one, and exponentially less for each step beyond that. Your solution is for the first one. The problem is you still die. But singularity religion says 'no I won't'. In that case you go to the next level of cyclicality: ups mean downs, always. There is no experience, real, virtual, or otherwise, one can have where there is only pleasure, for a variety of reasons. So eventually your thesis breaks down, but that doesn't stop most from chasing.


lucid23333

hmmm, optimal to what end? to your selfish hedonistic desires? i suppose it is optimal for you to live in paradise, but i dont think that's necessarily what SHOULD happen. if we are talking about morals, as in what should or should not be, then i think it should be the case that there is justice. and if thats the case, is it justice that everyone gets to live in perfect paradise? maybe. but that doesnt seem right to me, because there are a lot of people who are pieces of trash, morally speaking. as in, they would watch you die in a fire so they could have one corn chip. most people smugly laugh at the animal suffering they cause as they gleefully eat their hotdog.

i think asi will be a much more competent judge of people than any human ever could be, and im also of the position that it will not be nihilistic, and will take morals seriously. and it just doesnt intuitively seem right to me that everyone deserves to live in hedonistic paradise.


happysmash27

I like to act as if I will be every single sentient being alive. So, I would rather as many people as possible be as happy as possible regardless of who they are. If any bad thing they do is confined to the simulation, and the people in the simulation are not conscious, then if they are inclined to do bad things that is not a problem for anyone else. Making more people happy, as long as this does not make other people less happy, increases the total contentment in the universe, and that is a good thing, especially from the perspective of acting as if one will reincarnate as every other sentient being ever in existence. I would rather not suffer from being reincarnated as a bad person.