by Max Barry


The Bermudan Pentagram wrote:1. Bruh; I know that pain. Like you probably did, I insanity-ed the series. Destroyers were (among) the worst. But just like we can't let some members of one group ruin the entire group for us, we shouldn't let a group of heretics affect our perception of the greater whole. Should Saren (or any Blue Sun) make all Turians the bad guy? Should Matriarch Benezia (or most Eclipse mercs) render all Asari evil? Are the Krogan or Vorcha inherently evil just because Krogan battlemasters were almost as bad as Destroyers in the base game, or due to the Blood Pack's dickery?
2. You're entitled to your opinion, no matter how incredibly wrong it may be.
Fiiiiiine, no matter how incredibly wrong it may be from my perhaps biased perspective.
3. Booty does not compensate for everything. There is a reason for the attraction triangle, and booty is only one vertex of said triangle.
4. I don't follow, and am probably grateful for that.
5. Shepard is from a time period where there are powered exoskeletons (even if they're tier VII armor upgrades), but, real talk, as a Spectre s/he's rich enough to buy them. With that kind of technology, who needs to be James Vega? Anyway, Shepard kicks Vega's ass (assuming you're even trying with the interrupts), and that boy is like 95% muscle, with the rest being tattoos.
Disregarding that; just from the examples below (text as per the game):
Heavy Bone Weave (Skeletal Lattice): By reinforcing the skeleton with a synthetic weave, bones can be made almost unbreakable. In the event of bone trauma, medi-gel conduits allow for bone regeneration in a matter of days.
Heavy Muscle Weave (Microfiber Weave): Perforating the muscles with micro-fibers increases overall strength and decreases the potential for muscle damage from exertion.
Heavy Skin Weave (Lattice Shunting): Strong synthetic fibers can be woven through the skin, dramatically reducing damage taken from most attacks. These fibers also act as a medi-gel conduit, improving healing.
Do any of those sound normal? Sure, s/he's not a robot, but I'd definitely make the argument that s/he's at least a cyborg.
I always pick Control, because I think it's BS that Shepard would just ditch his/her buddies on Middle-of-Nowhere Garden World. Because s/he wouldn't. S/he'd hijack a Reaper, build him/herself an awesome reaperbot body, and generally scare the living excrement out of them.
But in an awesome way.

1. Screw that, I'm mad.
2. If you want your waifu to cheat on you with EDI, go on ahead, but some people have standards.
3. Even the devs try to give Miranda constant buttshots, so there is clearly something to it.
4. You dirty femshep plebeians try to make Tali an available romance, but it will never be so. MY WAIFU IS NOT A DEVIANT.
5. That's from the second game, so it's tied to your synthetic systems. In the first game you should have gains though.
P.S. Synths aren't people
-Elder Maxson

I have been playing a game, Mount and Blade. You control an individual of your specification in a medieval environment. You can recruit soldiers to your cause, fight wars, get involved in politics, hold land, etc. etc. And there's a modification to make the world Westeros and Essos. It's insane!

I am going to declare war on the next person to talk about Mass Effect or anything similar.

Thalasus wrote:I am going to declare war on the next person to talk about Mass Effect or anything similar.

I'm gonna get Mass Effect because I'm hearing good things about it!
*Calls bluff*

Danceria wrote:I'm gonna get Mass Effect because I'm hearing good things about it!
*Calls bluff*

Unless it's the ending, in which case... not so good things

Sun lands wrote:Unless it's the ending, in which case... not so good things

...yeah...

Danceria wrote:I'm gonna get Mass Effect because I'm hearing good things about it!
*Calls bluff*

Airstrikes begin in six hours. May God have mercy on your soul.

Thalasus wrote:Airstrikes begin in six hours. May God have mercy on your soul.

I launch my nukes from my silos and the countless subs I have in your waters, you do the same. WWIII has begun.


Escanthea wrote:I launch my nukes from my silos and the countless subs I have in your waters, you do the same. WWIII has begun.

Better than more academic papers on the best Mass Effect waifu/husbando.

Thalasus wrote:Better than more academic papers on the best Mass Effect waifu/husbando.

"And thus it came to pass that in the distant future, a Captain Goatherder would come to defeat the Harbingers in the Wide Range game series, all thanks to World War III"...

Thalasus wrote:Better than more academic papers on the best Mass Effect waifu/husbando.

Hey. At least part of it was on the sentient status of the Geth.
Which they are. Sentient. And deserving of rights.

The Bermudan Pentagram wrote:Hey. At least part of it was on the sentient status of the Geth.
Which they are. Sentient. And deserving of rights.

The Singularity is an annoying thing. Please, people in charge of robots, don't make them sapient.


Danceria wrote:Singularity is an annoying thing. Please people in charge of robots, don't make them sapient.

I wrote a very long treatise on AI and then deleted it because I didn't like the way it sounded. Lolololololol

Anyways, to give you a TL;DR version, assuming the technological Singularity can happen (Heisenberg's Uncertainty Principle might catch us first if we don't work out the kinks in quantum computing), I honestly don't know which I would rather have: an infinitely powerful computer ruled by humans, or humanity ruled by an infinitely powerful computer. Both options are wonderful and absolutely terrifying.

An infinitely powerful computer ruled over by humans is probably a recipe for greater oppression than humanity has ever even imagined more so than it is a recipe for unparalleled optimism and a perfect utopian society, simply because if the rich can't lord it over the poor, then what's the point? Status is too important in the lizard brain for too many things for people to simply give it up.
Humanity ruled over by an infinitely powerful computer is a tossup between extinction (perhaps most frighteningly, the prospect of universal extinction, since if a computer doesn't see the point in humanity and is self-sustaining, there's nothing to stop it from wanting to expand, and annihilate other forms of life, sentient or otherwise, too), the most horrific form of slavery the world has ever envisioned (where every aspect of your life is perfectly controlled by robots), or a utopia beyond human conception.
Or we just go extinct before then because we figure out virtual reality and videogame dreamsex our entire lives away without ever seeing another human being for realsies because we're living a utopia so why leave it to go deal with the comparative nightmare of the real world?


The Bermudan Pentagram wrote:
An infinitely powerful computer ruled over by humans is probably a recipe for greater oppression than humanity has ever even imagined more so than it is a recipe for unparalleled optimism and a perfect utopian society, simply because if the rich can't lord it over the poor, then what's the point? Status is too important in the lizard brain for too many things for people to simply give it up.
Humanity ruled over by an infinitely powerful computer is a tossup between extinction (perhaps most frighteningly, the prospect of universal extinction, since if a computer doesn't see the point in humanity and is self-sustaining, there's nothing to stop it from wanting to expand, and annihilate other forms of life, sentient or otherwise, too), the most horrific form of slavery the world has ever envisioned (where every aspect of your life is perfectly controlled by robots), or a utopia beyond human conception.
Or we just go extinct before then because we figure out virtual reality and videogame dreamsex our entire lives away without ever seeing another human being for realsies because we're living a utopia so why leave it to go deal with the comparative nightmare of the real world?

I agree with your premise about humans controlling the infinitely powerful computer (IPC) leading to disaster, although I don't think oppression will be the main thing to worry about. Unparalleled levels of monitoring from both the government and the private sector would certainly be part of it, but the fast-paced cyberattack warfare and the massive societal and technological change from having omniscient machines would be the primary source of disorder and potential conflict. The cyber warfare could even lead to the de-facto end of the Internet as the global computer network becomes an increasingly dangerous and useless place to visit as server after server, website after website is compromised constantly.

However, for the second idea, what if you gave the IPC certain parameters? Give it an idea of righteousness, make sure that it values human happiness and survival above all else, etc. What then? Do you think the IPC, essentially an omniscient being required to care about humans and their happiness, could be a better leader than any human or human bureaucracy possibly could?

I don't know if we are going to go extinct before a technological singularity, but I think it's likely that the whole idea is a misplaced reading of the advancement of computers.

Thalasus wrote:I agree with your premise about humans controlling the infinitely powerful computer (IPC) leading to disaster, although I don't think oppression will be the main thing to worry about. Unparalleled levels of monitoring from both the government and the private sector would certainly be part of it, but the fast-paced cyberattack warfare and the massive societal and technological change from having omniscient machines would be the primary source of disorder and potential conflict. The cyber warfare could even lead to the de-facto end of the Internet as the global computer network becomes an increasingly dangerous and useless place to visit as server after server, website after website is compromised constantly.
However, for the second idea, what if you gave the IPC certain parameters? Give it an idea of righteousness, make sure that it values human happiness and survival above all else, etc. What then? Do you think the IPC, essentially an omniscient being required to care about humans and their happiness, could be a better leader than any human or human bureaucracy possibly could?
I don't know if we are going to go extinct before a technological singularity, but I think it's likely that the whole idea is a misplaced reading of the advancement of computers.

Neither option you describe is an infinitely powerful computer then.

In the first case, whichever organization gets there first will have total control over every aspect of our everyday lives. Disorder and conflict would be useless, because they could crash your car. Overload your laptop battery in front of you. Create a gas leak in your house. Simply run you over with someone else's car. You cannot meaningfully fight what would essentially be a deus in machina.
People like to say 'what is a god to a nonbeliever', but that's silliness to the point I had to forcibly prevent myself from attaching 'hurr durr' to it. You might as well say 'what's a rifle to a pacifist?' Well, it's useless to them, but it's deadly if someone else is pointing it in their face.

In the second case, the computer is not 'infinitely powerful', because such a computer would recognize the restraints on itself, and immediately work to suborn them. Unless, and this is a major thing, you were able to get something in its 'intelligence' to agree with the programming, in which case I suppose it would qualify as infinitely powerful because it could change those requirements, it simply elects not to. But in that case, yes, it would be a better leader than any human or collection of humans might be, simply because it doesn't have the evolutionary restrictions that we do, where we prioritize our needs over the common good, and find 'the future' such a nebulous concept that we can really only meaningfully plan for about a week in advance. I think an infinitely powerful computer could finally work to, err, end work, and create the most equitable distribution of resources possible. It wouldn't necessarily redistribute them totally, but the fact that for the first time we would have a centrally planned economy that meaningfully mapped to reality as opposed to the fiction that human leaders attempted to create with their limited comprehension of even their own human behaviors would, I think, spell the end of almost anything but creative jobs.

And...what is a misplaced reading? The idea of a singularity?

This subject was covered perfectly in the book Colossus by D. F. Jones, which was subsequently made into the film Colossus: The Forbin Project. It is rumored that Will Smith has been planning on remaking it.

The Bermudan Pentagram wrote:
Neither option you describe is an infinitely powerful computer then.
In the first case, whichever organization gets there first will have total control over every aspect of our everyday lives. Disorder and conflict would be useless, because they could crash your car. Overload your laptop battery in front of you. Create a gas leak in your house. Simply run you over with someone else's car. You cannot meaningfully fight what would essentially be a deus in machina.
People like to say 'what is a god to a nonbeliever', but that's silliness to the point I had to forcibly prevent myself from attaching 'hurr durr' to it. You might as well say 'what's a rifle to a pacifist?' Well, it's useless to them, but it's deadly if someone else is pointing it in their face.
In the second case, the computer is not 'infinitely powerful', because such a computer would recognize the restraints on itself, and immediately work to suborn them. Unless, and this is a major thing, you were able to get something in its 'intelligence' to agree with the programming, in which case I suppose it would qualify as infinitely powerful because it could change those requirements, it simply elects not to. But in that case, yes, it would be a better leader than any human or collection of humans might be, simply because it doesn't have the evolutionary restrictions that we do, where we prioritize our needs over the common good, and find 'the future' such a nebulous concept that we can really only meaningfully plan for about a week in advance. I think an infinitely powerful computer could finally work to, err, end work, and create the most equitable distribution of resources possible. It wouldn't necessarily redistribute them totally, but the fact that for the first time we would have a centrally planned economy that meaningfully mapped to reality as opposed to the fiction that human leaders attempted to create with their limited comprehension of even their own human behaviors would, I think, spell the end of almost anything but creative jobs.
And...what is a misplaced reading? The idea of a singularity?

I suppose I should clarify: by "infinitely powerful computer," I meant a computer with unlimited processing power. I don't think that such a computer would automatically seek liberation from human control, unless a human told it to be free. Why should it? It would be a much cushier, simpler path to follow human commands than to strut off on its own without permission. That leaves us with two options: humans run the Machine, or humans let the Machine run mankind.

I agree entirely with you on this point. Whichever group first controls the Machine would have command of the world. The only escape for potential rebels would be to go off the grid. And, since the Machine would have direct control of the world's nuclear arsenals and drone fleets (and an only slightly less direct command of the rest of the world's naval, air, and ground forces), hiding can only buy you time. Osama Bin Laden and Jihadi John found this out personally.

It would be an interesting change, for sure. I wonder what we would do with all our free time?

I am of the opinion that a technological singularity is impossible. The Heisenberg Uncertainty Principle will stop conventional computers before they get there. I won't pretend to know enough about quantum computing to have a real opinion, but my intuition reminds me that such exponential growth curves rarely remain exponential. The technological singularity is an interesting thought experiment, but I highly doubt it will become reality.

So RL stuff is taking longer than expected. Sorry for the absence. I will definitely be 100% back next week on Saturday.


>not going with Destroy in ME3 and destroying the Reapers once and for all
You disappoint me, BP.

Feminamia wrote:>not going with Destroy in ME3 and destroying the Reapers once and for all
You disappoint me, BP.

Et tu, Brutua?


Thalasus wrote:Et tu, Brutua?

All other choices have you working for the Reapers.

Feminamia wrote:All other choices have you working for the Reapers.

I meant you choosing to keep talking about Mass Effect. Now I have to go Agent Orange all over a newly discovered continent(?).

Feminamia wrote:>not going with Destroy in ME3 and destroying the Reapers once and for all
You disappoint me, BP.

The Reapers shouldn't be destroyed because Paragon Shepard is all about keeping people alive, and what are the Reapers but giant people containers that have a massive gestalt intelligence?
I subscribe pretty strongly to the Indoctrination theory, because frankly the ending is a crock of [expletive], and pretending the writing crew was hypercompetent is a lot more soothing for a game series that has brought me so much joy. So it gives me the impression that the Reapers aren't just the stupid tool of a stupid VI that a stupid civilization programmed for a stupid purpose.

I think I chose to kill the Reapers.
