[deleted] t1_itnyrio wrote

[deleted]

15

Kingflares OP t1_ito0m34 wrote

8K probably isn't enough for a home theatre system, which is what enthusiasts are using it for. It's barely noticeable for some when you are at TV viewing distance. But the further away you are, the more noticeable it is, at least for me.

There are 16K displays in the works as well.

This entire bill is stupid though; 8K sets and projectors consume more power, and that's what rich people are going for.

That said, the power consumption cap is set too low. They should focus on more efficient ACs or literally any other power hog before TVs. A TV is the least of your concerns in your home: fridge, oven, microwave, etc.

7

aintbroke_dontfixit t1_ito66l2 wrote

> It's barely noticeable for some when you are at TV viewing distance. But the further away you are, the more noticeable it is, at least for me.

Sorry but that's bollocks and working backwards. The further away you are, the less you can notice higher resolution and therefore higher levels of detail. A person with 20/20 vision can resolve 60 pixels per degree, which corresponds to recognizing the letter “E” on the 20/20 line of a Snellen eye chart from 20 feet away.

This chart of viewing distance vs. screen size, showing when resolution increases become noticeable, was calculated using the figure above; the arithmetic is sketched below.
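
For anyone who wants to check it, here's a minimal sketch of that calculation in Python. The 60-pixels-per-degree (1 arcminute) figure comes from the comment above; the 16:9 aspect ratio and the 65-inch example are my own assumptions:

```python
import math

# A 20/20 eye resolves about 1 arcminute, i.e. 60 pixels per degree.
ARCMIN_RAD = math.pi / (180 * 60)  # one arcminute in radians

def max_useful_distance_m(diagonal_in: float, horiz_px: int,
                          aspect=(16, 9)) -> float:
    """Distance beyond which a 20/20 eye can no longer resolve
    individual pixels, so extra resolution buys nothing."""
    w, h = aspect
    width_m = diagonal_in * 0.0254 * w / math.hypot(w, h)
    pixel_pitch_m = width_m / horiz_px
    # The pixel pitch subtends exactly 1 arcminute at this distance.
    return pixel_pitch_m / ARCMIN_RAD

for px, label in [(1920, "1080p"), (3840, "4K"), (7680, "8K")]:
    print(f'65" {label}: pixels resolvable inside ~{max_useful_distance_m(65, px):.1f} m')
# 65" 1080p: pixels resolvable inside ~2.6 m
# 65" 4K: pixels resolvable inside ~1.3 m
# 65" 8K: pixels resolvable inside ~0.6 m
```

In other words, on a 65-inch set you'd have to sit closer than about 1.3 m before 8K could even theoretically look sharper than 4K.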

19

projecthouse t1_itql3k3 wrote

There's a mental phenomenon in subjective matters where the better you think something is, the more you enjoy it. This has been proven with wine: the more expensive a taster believes a wine to be, the better they will rate it. It's also most likely why Steinway pianos don't sound any better in the lab, but do in the concert hall.

Comments like this that have absolutely no scientific basis immediately remind me of that. I'd bet that the OP believes he sees a difference, but couldn't prove it in a blind test.

1

conanf77 t1_itoh9tf wrote

“Vampire” power consumption of cable boxes should be one item addressed; the last one I had pulled 80 W on standby and 300 W while active. It was a while ago though, and that was in North America.

11

happyscrappy t1_itoo8pz wrote

That's dropping as the hard drives are removed from the boxes and the storage moves to the cloud.

The energy use moves to the cloud too though. It doesn't completely go away.

0

wierdness201 t1_itp149g wrote

I highly doubt a hard drive would be pulling 30 watts.

7

C0rn3j t1_itpd6xr wrote

300*, and yes, what the poster above said makes no sense.

1

ACCount82 t1_itq04wi wrote

Especially not when it's spun down, as a hard drive that's not being accessed would be.

1

happyscrappy t1_itqnlq7 wrote

They generally are not spun down, because the box uses the drive to buffer TV content all the time so you can "rewind live TV". The drives are not just spinning but actively reading, writing, and seeking. They basically copied that from TiVo.

https://www.latimes.com/nation/la-na-power-hog-20140617-story.html

The figure before HDDs was 18 W average usage by a cable box. Then it went to 35 W with HDDs. Now that HDDs are being removed, the figures are dropping again.
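
For scale, a quick sketch of what those always-on figures mean per year (my own arithmetic, not from the article):

```python
def annual_kwh(watts: float) -> float:
    """Continuous draw in watts -> kilowatt-hours per year."""
    return watts * 24 * 365 / 1000

for w in (18, 35):
    print(f"{w} W box: ~{annual_kwh(w):.0f} kWh/yr")
# 18 W box: ~158 kWh/yr
# 35 W box: ~307 kWh/yr
```

So the HDD era roughly doubled a box's annual consumption, which is why pulling the drives makes a real dent.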

1

happyscrappy t1_itoo6v1 wrote

They already spend a lot of time regulating ACs and fridges. I dunno about ovens or microwaves; both of those are already very efficient.

The people who make such regulations do have statistics on how much energy is typically consumed by different appliances. And TVs really do add up.

1

CelebrationNo4962 t1_itq0kgn wrote

It's not stupid.

Because TVs keep getting bigger and denser, power consumption has shot upwards, so it's good to have some caps on this effect. Also, because every family has at least one TV, it's a rather quick and easy way to save some energy (rough numbers sketched below), just like with vacuum cleaners back in the day.
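
Even modest per-set savings add up. A toy sketch with entirely made-up numbers (30 W saved per set, 4 hours of viewing per day, 100 million TVs; none of these figures are from the thread):

```python
# Hypothetical illustration only: every number below is assumed.
watts_saved = 30            # per-TV saving under a cap
hours_per_day = 4           # average daily viewing
tvs = 100_000_000           # number of TVs affected

kwh_per_tv = watts_saved * hours_per_day * 365 / 1000  # ~43.8 kWh/yr
total_twh = kwh_per_tv * tvs / 1e9                     # ~4.4 TWh/yr
print(f"~{kwh_per_tv:.0f} kWh/yr per TV, ~{total_twh:.1f} TWh/yr overall")
```

Under those made-up numbers, that's roughly the annual output of a 500 MW power plant running flat out.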

Having said that, I do agree other appliances need to be held to the same standard.

1

eugene20 t1_ito6v8z wrote

If the display is physically big enough, such high resolutions start to make sense, but under 50 or 60 inches I'm not so sure.

3