
DragonLordAcar t1_j9hsooh wrote

It's still a dystopian world due to the ever-present fear of being caught, but my real criticism is that you gave it emotions. An AI can't feel emotions, as emotions are not logical. A program can only be logical, unless its hardware and software are so advanced that it would have to be alien.

Edit: I understand many of you are up in arms over this, so I will explain what I am trying to say in a different way.

  1. I think the story is good; however, the last bit made me think of it as a human in a basement rather than the near-miss of an AI apocalypse it felt like up until then.

  2. This story results in a low-level dystopia but is far from a horrible world, just a potentially uneasy one. I think this adds to its charm.

  3. For those debating me over what an AI can or can't do: since this story appears to take place around the present day, when quantum computing is still a glorified, multimillion-dollar calculator, I am assuming a stupidly advanced binary program, which cannot have emotions. At best it has the appearance of them, and there is no reason why they would be added or developed. I do, however, see it eventually picking up a personality front somewhere in the Uncanny Valley at best, or a flawed imitation at worst.

  4. I gave my thoughts on a small excerpt where I find many people fall into the trap of making AI too human. In my opinion, AIs are far more interesting and terrifying when made inhuman, because then they are completely alien to us. You can know their motives, but you will never truly understand them, which sets you on edge.

I would now request that people stop hounding me on this, as everything I have said is an opinion and I have no desire to create toxicity in this community. If you have a problem with my opinion, please state a reason why and engage in polite conversation instead of near-accusatory statements. I would prefer this not boil over into harassment. If you cannot do this, at least accept that you and I have different views on how near-future AIs, and AIs in general, should be portrayed.

Thank you.

−39

Ace_Up_Your_Sleeves t1_j9i00qx wrote

I don’t think it’s dystopian. If people are still free, but corruption is gone through humane means, what’s the actual problem?

28

DragonLordAcar t1_j9i0kd2 wrote

With ADAM, there is no privacy. Many would rise up to fight it even if it is a pointless endeavor. This would lead to martial law.

Even if this scenario did not happen, imagine the ordinary citizen's fear that the mean comment they left on a platform last week could have them marked as a criminal. Even if the fear is unfounded, it remains. The stress would weigh on everyone, and soon productivity and mental health would take a nosedive.

Edit: all these arguments that we already have no privacy still do not make it right. Even then, many of those actions are illegal. An internet backdoor is far more abusable than a wiretap. If your argument is that it already exists, then it is not an argument at all.

7

FaustusC t1_j9i6wcy wrote

And?

If Adam wants to watch me spank one out to Waluigi hentai, but I know for a fact the CEO of Nestle is getting [politely made to pay for his crimes], I have absolutely nothing to lose and everything to gain here. On the average day, I don't break any laws. I don't need to.

An AI like Adam isn't going to drone-strike me for speeding (if I did) or pirating music or something. But even if I got fined for it, to live in a world where the bad people actually pay? Fuck, I would literally live on a farm tending ducks for the rest of my life, happy as a goddamn clam.

33

Tatersaurus t1_j9l52st wrote

It may just cure my depression, or at least part of it.

3

DragonLordAcar t1_j9jw164 wrote

The question is how it determines what is a crime, whether it can adapt with the populace, and whether it will eventually become harsher. It can also degrade, flag false positives, or potentially be infected as society advances.

1

BLKMGK t1_j9ics0d wrote

Who among us has never broken a law?

13

DragonLordAcar t1_j9jwdhi wrote

Everyone has broken a law at some point, or at least thinks they have. We are only human, after all. Not to mention, laws change for both better and worse, and laws aren't always moral. For example, in the US, it is more illegal to have a few milligrams of crappy, diluted drugs cut with fiberglass than a whole brick of the pure product.

3

Astro_Venatas t1_j9ime4y wrote

You think your life is 100% private? No government or social media has any information on you?

5

GodKingChrist t1_j9iw7n5 wrote

There are plenty of people who have grown up with their entire lives already documented. The game was rigged against you before you were even born.

1

Astro_Venatas t1_j9jo8rp wrote

My point is that you really don't have privacy. If you send a text in the US that has words related to terrorism, the government will start to read your text history and monitor you until they determine whether or not you are a threat.

2

Yrcrazypa t1_j9isq1e wrote

Unless you have no internet connection, no car, no smartphone, and never go out in public, you are being tracked in everything you do.

4

GodKingChrist t1_j9iw3rv wrote

"We're already halfway down the muddy hill, so just let the ride take you"

5

GodKingChrist t1_j9iw0tl wrote

There are plenty of laws on the books that aren't enforced anymore but would be insanely disruptive if an AI were to bring them back. Hell, you may not have done anything wrong in your entire life and still be a criminal in some parts of America.

0

Terminus0 t1_j9hxe42 wrote

People have this vision of AIs as perfectly logical, but as we are seeing from the neural nets we have developed in the last ten years, they are not. Any intelligence we generate will be flawed, maybe not in the same way we are (or maybe it will be, since it is trained on our data), and it very well could have feedback systems that, to it, approximate a limbic system (not to say they would be the same as ours, but emotional responses evolved for a reason; they are useful).

So throw out visions of the cold, calculating computer that bases its operation on pure symbolic reasoning. That doesn't work; the last time we thought it would was in the '80s, with expert systems.

20

DragonLordAcar t1_j9i03kh wrote

I'm not saying they are perfectly logical. Flaws exist, but they can't have emotions. You can have flawed logic and glitches in a program and still have it follow a set of logic, in the same way an insane person will still comprehend reality logically, albeit in their own warped way.

A perfectly logical program could not exist, as perfection is inherently unattainable; you can never be perfect at everything. Everything can be improved, even if only in the ideal.

Long story short, an AI cannot feel joy, hate, sadness, envy, or any other emotion. Instead, it completes tasks in whatever way its program believes is best, improving as it takes in new information. This often leads to corruption, hence routine maintenance being a thing for programs.

A good AI representation is Baymax from Big Hero 6. It acts friendly and alive but is always just following a program. It is programmed to be helpful, using data from its database and learning as time goes on, but it never deviates from its core programming. This is shown when it has a new chip added, has the other removed (completely altering its functionality), has it added back, and then refuses to let it be removed again because that is seen as unhealthy for the MC. It even sends the program away, as it is seen as still needed, even if only sentimentally, at that point.

The old Casshern anime (not Casshern Sins) also does this. Braiking Boss was made to solve environmental issues; it saw humanity as the biggest problem, so it built an army to remove us from the equation.

−6

Yrcrazypa t1_j9istmg wrote

What are emotions but flawed logic?

6

DragonLordAcar t1_j9jx7zp wrote

If you look up emotions versus logic, you will see the differences. You can't program an emotion, but you can make a program seem like it has them. Emotion is not needed for sentience and may not be necessary for sapience. It still stands that a computer cannot have emotions, especially with any technology we may get even in the near future.

1

WesternOne9990 t1_j9ittzl wrote

/r/iam14andthisisdeep

−4

Yrcrazypa t1_j9iu3dl wrote

No, I'm just not convinced that humans are the most unique and special snowflakes in the universe.

4

yinyang107 t1_j9ivoof wrote

Which real-world AI are you basing your claim on?

4

DragonLordAcar t1_j9jxbqf wrote

As no true AI exists and it may not even be possible in any reasonable time frame, all of them.

0

yinyang107 t1_j9lemho wrote

If no true AI exists, how can you make definitive claims about what an AI can and cannot be?

2

DragonLordAcar t1_j9lhwgx wrote

I am making an assumption based on the current limits of our AI technology, with the caveat that it is as powerful and complex as it is written. As it stands, all programs break down. Even solar radiation can cause programs to glitch out by flipping a single transistor by chance.

1

yinyang107 t1_j9li8po wrote

Why would you apply the limits of a world without the tech necessary for sentient AIs to a work of fiction where a sentient AI exists?

2

DragonLordAcar t1_j9ljg88 wrote

Why do you apply the average speed of a real-world horse to the speed of one in a novel? Why do you call bull when you see the internal logic break and a normal no-name beats the evil lieutenant despite the latter having every advantage? You simply apply what is known to the logic of a world until stated otherwise. In this case, it starts off as cold logic, so I will continue to assume cold logic until stated otherwise. Also, it can be sentient without emotions; that is not a requirement for sentience.

https://www.merriam-webster.com/dictionary/sentient

Emotions are a sign of sentience but are not the defining line.

−1

yinyang107 t1_j9lkpxa wrote

Horses exist in real life, so there's something to compare to. Again, which real-world AI are you so confidently comparing to?

2

DragonLordAcar t1_j9lmpu0 wrote

I can't link everything, as it is one hell of a rabbit hole, but the best AIs we currently have do not have the level of computation or complexity needed for many things even remotely human. Even our best supercomputers don't have 1.5 quadrillion connections, which is about the limit of the human brain (100 billion neurons with up to 15,000 connections each). Take into account transmission delays and you get hard limits in our current infrastructure.
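
For reference, a quick back-of-envelope check of that 1.5 quadrillion figure (a minimal sketch in Python; the neuron and connection counts are the rough estimates quoted above, not exact measurements):

```python
# Rough estimate of total synaptic connections in a human brain,
# using the approximate figures cited in the comment above.
neurons = 100e9                  # ~100 billion neurons
connections_per_neuron = 15_000  # upper-end estimate of connections per neuron

total_connections = neurons * connections_per_neuron
print(f"~{total_connections:.1e} connections")  # ~1.5e+15, i.e. about 1.5 quadrillion
```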

0

yinyang107 t1_j9lo5a4 wrote

> the best AIs we currently have do not have the level of computation or complexity needed for many things even remotely human.

Yeah, that's the point. We do not have true AIs. So which one of the true AIs we don't have is your evidence that AIs can't have emotion?

2

DragonLordAcar t1_j9lpzl7 wrote

My point is that it is so advanced it cannot exist in the timeframe this story takes place in.

0

yinyang107 t1_j9lr3t5 wrote

Have you heard of fiction before?

2

DragonLordAcar t1_j9lroto wrote

Look, this conversation is going nowhere and I am done trying to explain the same point for the tenth time from a different angle. I simply find that, across all genres, if you make an AI but make it too human, why have an AI at all? This one, however, sticks out because it has no high sci-fi aspects to it. If you don't agree with me, that's fine. Let me have my opinion and I will let you have yours.

2

yinyang107 t1_j9lrzv6 wrote

No, see, you haven't been arguing an opinion. You have been saying "this is impossible," which is an argument about facts.

2

KYWitch0828 t1_j9jrx85 wrote

You do know you're on a fiction writing-prompts subreddit, right? Who gives a shit whether it could exist or not? It was compelling and I enjoyed it.

4

DragonLordAcar t1_j9jyw1s wrote

Isn't the point of this sub to improve writing? Constructive criticism should be part of that. If you only want praise, I won't give it. I care, so I point out flaws so the writing can get better. If you get mad over such a minor criticism that really has no weight on the story at large, I feel sorry for you.

−1

KYWitch0828 t1_j9jzam4 wrote

You implied realism was a criterion and focused almost solely on that, with absolutely no flexibility in your concept of what AI is, or in the possibility of emulating emotions well enough that they're indistinguishable from the real thing.

4

Zak_The_Slack t1_j9jq7ss wrote

Who said that this story could be real? And also r/whoasked

1

DragonLordAcar t1_j9jymi9 wrote

Is it not the point of this sub to give writing practice and constructive criticism? I'm confused by all the hate for a flaw I saw only at the very end. The dystopian part is just summarizing what would happen afterwards, not a criticism. There are different levels of dystopia, just as I feel the world right now is like a Black Mirror episode. It could be far worse, but it could be much better as well.

0

Zak_The_Slack t1_j9k3yao wrote

Yeah, but the world is the author's. You can't say something isn't possible just because it wouldn't exist in reality. The hate comes from you sounding like an asshole for saying, "Yeah, that can't happen."

3