reconditedreams t1_j6vsju1 wrote

I don't think this is possible. There is some amount of inherent bias in all media, including news articles. What information is deemed relevant enough to report on in the first place, what details are included in a story, the wording and grammar used to describe the events: these are all influenced by normative factors.

Our conception of the "objective facts" will always necessarily be subjective.

1

reconditedreams t1_j285ng2 wrote

This is a good point. I would recommend anyone interested in this difference between subjective understanding and functional understanding read "Blindsight" by Peter Watts. It's an interesting hard sci-fi novel which explores the nature of sentience and subjective awareness.

7

reconditedreams t1_j285a5h wrote

Yeah, this is my entire point. I often see people mistake the metaphysics question for the engineering question. It doesn't really matter if we understand the metaphysics of human qualia, only that we understand the statistical relationship between human input data (sensory intake) and human output data (behavior/abilities).

It's no more necessary for ML engineers to understand the ontology of subjective experience than it is for a dog catching a ball in midair to have a formal mathematical understanding of Newton's laws of motion. It only needs to know how to jump towards the ball and put it in its mouth. How the calculus gets done isn't really important.

Midjourney probably isn't capable of feeling sad, but it certainly seems to understand how the concept of "sadness" corresponds to pixels on a screen. Computers may or may not be capable of sentience in the same way humans are, but there's no reason they can't understand human creativity on a functional level.

11

reconditedreams t1_j2831tc wrote

It's a fallacy to say that we need to fully understand the processes underlying human consciousness in order to emulate its function well enough to be practically indistinguishable from the real thing.

Obviously computers are nothing like a human brain; they're two completely different kinds of physical systems. One is made of silicon and logic gates, the other of carbon and neurons.

Computers are also nothing like the weather, but that doesn't mean we can't use them to emulate the weather closely enough to be practically useful for predicting storm fronts and temperatures.

We don't need to fully understand how human consciousness works in order to have AGI. We only need to quantify the function of human consciousness closely enough to practically mimic a human: that is, to develop a decent statistical understanding of the input-output relationship of human consciousness.

It is reasonable to predict that modern digital computers will never be able to truly simulate the full depth of human consciousness, because doing so will require hardware more similar to the brain.

It is not reasonable to say that they will never come close to accurately predicting and recreating the output of human consciousness. This is frankly a ludicrous claim. The brain is a deterministic physical system and there is nothing magical about its output. There is no inherent reason why human behavior cannot be modelled algorithmically using computers.

The hard problem of consciousness, the philosophical zombie, the Chinese room, etc. are all totally irrelevant to the practical/engineering problem of AGI. You shouldn't mistake the philosophical problem for the engineering problem. Whether an AGI running on a digital computer is truly capable of possessing qualia and subjective mental states is a problem for philosophers to deal with. Whether an AGI running on a digital computer can accurately emulate the output of the human brain to a precise degree is an altogether different question.

27

reconditedreams t1_j1surwv wrote

You don't need actual sentience to emulate the functional output of sentience to a precise enough degree, any more than you need actual Windows to emulate the functional output of the Windows OS to a precise enough degree.

There's no reason in principle why the output of human sentience can't be emulated to a close enough degree to be almost indistinguishable from the real thing.

The actual hard problem of consciousness is totally irrelevant to the practical question of whether the function of consciousness can be emulated.

1

reconditedreams t1_j1ss8p6 wrote

The whole autistic AI trope is completely unrealistic; art AIs like Midjourney show that emotional expression can in principle be captured algorithmically.

I think you're overestimating the degree to which "real" sentience and "real" self-awareness are necessary to emulate the function of sentience and self-awareness to a sufficiently precise degree.

0

reconditedreams t1_j1somom wrote

I would tread very carefully about how you think of "bias" and objectivity.

If you synthesize points of view from a variety of different news sources (left-leaning, mainstream, right-leaning) and leave out stories which are only being reported by one or two sources, that could still lead to a strong centrist/establishment bias. Being biased towards the middle is still a bias.

For example, after 9/11 the majority of news sources in the US were (knowingly or not) pushing US government propaganda about WMDs. A news AI trained to provide a synthesis of several different sources might've done the exact same thing, if that's what most of the data points included in the synthesis were doing.
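To make the failure mode concrete, here's a toy sketch (my own illustration, not anything an actual news AI does) of a hypothetical "cross-source agreement" filter. If every source repeats the same false claim, the filter keeps it and discards the unique stories:

```python
from collections import Counter

def synthesize(reports):
    """Keep only claims reported by more than one source (hypothetical rule)."""
    counts = Counter(claim for claims in reports.values() for claim in claims)
    return {claim for claim, n in counts.items() if n > 1}

reports = {
    "left-leaning":  {"WMD program exists", "protest coverage"},
    "mainstream":    {"WMD program exists", "market report"},
    "right-leaning": {"WMD program exists", "border story"},
}

# The claim shared by all three sources survives the filter,
# regardless of whether it's actually true.
print(synthesize(reports))  # {'WMD program exists'}
```

The point being that "reported by many independent outlets" is a proxy for truth that breaks down exactly when the whole media ecosystem shares a bias.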

It's arguably better for a news source to declare biases openly than it is for a news source to pretend to be "bias free".

Rather than focus on being neutral, I think it's better to focus on factual accuracy and complex, in-depth analysis. The problem with CNN and Fox News isn't so much that they're biased as that they're often sloppy with fact-checking and push oversimplified clickbait narratives.

There are "biased" sources which still consistently produce complex, factually accurate coverage and analysis of events, like Jacobin on the left or the Economist on the right.

1