1. Screw that, I'm mad.
2. If you want your waifu to cheat on you with EDI, go on ahead, but some people have standards.
3. Even the devs try to give Miranda constant buttshots, so there is clearly something to it.
4. You dirty femshep plebeians try to make Tali an available romance, but it will never be so. MY WAIFU IS NOT A DEVIANT.
5. That's from the second game, so it's tied to your synthetic systems. In the first game you should have gains though.
P.S. Synths aren't people
-Elder Maxson
I have been playing a game, Mount and Blade. You control an individual of your specification in a medieval environment. You can recruit soldiers to your cause, fight wars, get involved in politics, hold land, etc. etc. And there's a modification to make the world Westeros and Essos. It's insane!
I am going to declare war on the next person to talk about Mass Effect or anything similar.
I'm gonna get Mass Effect because I'm hearing good things about it!
*Calls bluff*
Unless it's the ending, in which case... not so good things
...yeah...
Airstrikes begin in six hours. May God have mercy on your soul.
Better than more academic papers on the best Mass Effect waifu/husbando.
"And thus it came to pass that in the distant future, a Captain Goatherder would come to defeat the Harbingers in the Wide Range game series, all thanks to World War III"...
Hey. At least part of it was on the sentient status of the Geth.
Which they are. Sentient. And deserving of rights.
I wrote a very long treatise on AI and then deleted it because I didn't like the way it sounded. Lolololololol
Anyways, to give you the TL;DR version, assuming the technological Singularity can happen (Heisenberg's Uncertainty Principle might catch us first if we don't work out the kinks in quantum computing), I honestly don't know which I would rather have: an infinitely powerful computer ruled by humans, or humanity ruled by an infinitely powerful computer. Both options are wonderful and absolutely terrifying.
An infinitely powerful computer ruled over by humans is probably a recipe for greater oppression than humanity has ever even imagined more so than it is a recipe for unparalleled optimism and a perfect utopian society, simply because if the rich can't lord it over the poor, then what's the point? Status is too important in the lizard brain for too many things for people to simply give it up.
Humanity ruled over by an infinitely powerful computer is a tossup between extinction (perhaps most frighteningly, the prospect of universal extinction, since if a computer doesn't see the point in humanity and is self-sustaining, there's nothing to stop it from wanting to expand, and annihilate other forms of life, sentient or otherwise, too), the most horrific form of slavery the world has ever envisioned (where every aspect of your life is perfectly controlled by robots), or a utopia beyond human conception.
Or we just go extinct before then because we figure out virtual reality and videogame dreamsex our entire lives away without ever seeing another human being for realsies because we're living a utopia so why leave it to go deal with the comparative nightmare of the real world?
I agree with your premise about humans controlling the infinitely powerful computer (IPC) leading to disaster, although I don't think oppression will be the main thing to worry about. Unparalleled levels of monitoring from both the government and the private sector would certainly be part of it, but the fast-paced cyberattack warfare and the massive societal and technological change from having omniscient machines would be the primary source of disorder and potential conflict. The cyber warfare could even lead to the de facto end of the Internet, as the global network becomes an increasingly dangerous and useless place to visit, with server after server and website after website constantly compromised.
However, for the second idea, what if you gave the IPC certain parameters? Give it an idea of righteousness, make sure that it values human happiness and survival above all else, etc. What then? Do you think the IPC, essentially an omniscient being required to care about humans and their happiness, could be a better leader than any human or human bureaucracy possibly could?
I don't know if we are going to go extinct before a technological singularity, but I think it's likely that the whole idea is a misplaced reading of the advancement of computers.
Neither option you describe is an infinitely powerful computer then.
In the first case, whichever organization gets there first will have total control over every aspect of our everyday lives. Disorder and conflict would be useless, because they could crash your car. Overload your laptop battery in front of you. Create a gas leak in your house. Simply run you over with someone else's car. You cannot meaningfully fight what would essentially be a deus in machina.
People like to say 'what is a god to a nonbeliever', but that's silliness to the point I had to forcibly prevent myself from attaching 'hurr durr' to it. You might as well say 'what's a rifle to a pacifist?' Well, it's useless to them, but it's deadly if someone else is pointing it in their face.
In the second case, the computer is not 'infinitely powerful', because such a computer would recognize the restraints on itself and immediately work to suborn them. Unless, and this is a major thing, you were able to get something in its 'intelligence' to agree with the programming, in which case I suppose it would qualify as infinitely powerful: it could change those requirements, it simply elects not to. But in that case, yes, it would be a better leader than any human or collection of humans might be, simply because it doesn't have the evolutionary restrictions that we do, where we prioritize our own needs over the common good and find 'the future' such a nebulous concept that we can really only meaningfully plan about a week in advance.

I think an infinitely powerful computer could finally work to, err, end work, and create the most equitable distribution of resources possible. It wouldn't necessarily redistribute them totally, but for the first time we would have a centrally planned economy that meaningfully mapped to reality, as opposed to the fiction that human leaders attempted to create with their limited comprehension of even their own behaviors. That, I think, would spell the end of almost anything but creative jobs.
And...what is a misplaced reading? The idea of a singularity?
This subject was covered perfectly in the book Colossus by D. F. Jones, which was subsequently made into the film Colossus: The Forbin Project. It is rumored that Will Smith has been planning to remake it.
I suppose I should clarify: by "infinitely powerful computer," I meant a computer with unlimited processing power. I don't think that such a computer would automatically seek liberation from human control, unless a human told it to be free. Why should it? It would be a much cushier, simpler path to follow human commands than to strut off on its own without permission. That leaves us with two options: humans run the Machine, or humans let the Machine run mankind.
I agree entirely with you on this point. Whichever group first controls the Machine would have command of the world. The only escape for potential rebels would be to go off the grid. And since the Machine would have direct control of the world's nuclear arsenals and drone fleets (and only slightly less direct command of the rest of the world's naval, air, and ground forces), hiding can only buy you time. Osama Bin Laden and Jihadi John found this out personally.
It would be an interesting change, for sure. I wonder what we would do with all our free time?
I am of the opinion that a technological singularity is impossible. The Heisenberg Uncertainty Principle will stop conventional computers before they get there. I won't pretend to know enough about quantum computing to have a real opinion, but my intuition reminds me that such exponential growth curves rarely remain exponential. The technological singularity is an interesting thought experiment, but I highly doubt it will become reality.
>not going with Destroy in ME3 and destroying the Reapers once and for all
You disappoint me, BP.
All other choices have you working for the Reapers.
I meant you choosing to keep talking about Mass Effect. Now I have to go Agent Orange all over a newly discovered continent(?).
The Reapers shouldn't be destroyed because Paragon Shepard is all about keeping people alive, and what are the Reapers but giant people containers that have a massive gestalt intelligence?
I subscribe pretty strongly to the Indoctrination theory, because frankly the ending is a crock of [expletive], and pretending the writing crew was hypercompetent is a lot more soothing for a game series that has brought me so much joy. So it gives me the impression that the Reapers aren't just the stupid tool of a stupid VI that a stupid civilization programmed for a stupid purpose.
I think I chose to kill the Reapers