Submitted by randburg t3_103bh7f in technology
AdRelevant3167 t1_j2z6u0k wrote
Reply to comment by Ottobahn- in LG’s latest Signature OLED TV receives all of its audio and video wirelessly by randburg
Digital signals can be transmitted without any loss of data.
Edit: not sure why everybody thinks that 4K Netflix video, which arrived over a 100 Mbps internet connection and was then transmitted through a home wifi router, would bottleneck at another wireless transmission point.
TheFriendlyArtificer t1_j2zecjw wrote
If there's a consumer wireless AP that can transmit at the 18 Gbps HDMI standard, I'd love to see it.
LeLefraud t1_j30qh9x wrote
Don't forget HDMI 2.1, which goes up to 48 Gbps and will be the mainline speed going forward.
sybesis t1_j2zs7oo wrote
Well damn, I'll have to replace my AP with HDMI cables then; that will certainly make the internet faster! /s
eugene20 t1_j30z5hn wrote
There you go mistaking highly compressed video for quality. Some people use their screens for actual high-quality media, or for displaying games rendered in real time, where input latency is also an issue; that isn't a problem for Netflix content.
yungplantdad t1_j31mo2b wrote
How much can a TV use per second? I worked with Netflix on a project to optimize their data transmission performance for phones, and it was nowhere near 18 Gbps.
Gwthrowaway80 t1_j2zhkxk wrote
If you have no requirements, that’s a true statement.
The issue is that HDMI runs at a very high data rate, up to 48 Gbps. That's mostly trivial to do over a wire, but extremely difficult to do wirelessly. With enough bit errors, there will absolutely be data loss.
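Back-of-the-envelope math on where those link rates come from (raw pixel data only; this ignores blanking intervals and HDMI link-layer overhead, so real requirements sit somewhat higher):

```python
# Uncompressed video bandwidth = pixels/second * bits per pixel.
# Ignores blanking intervals and link overhead, so actual HDMI
# link-rate requirements are somewhat above these figures.

def raw_gbps(width, height, fps, bits_per_channel, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

print(f"4K 60 Hz,  8-bit RGB:  {raw_gbps(3840, 2160, 60, 8):.1f} Gbps")   # ~11.9
print(f"4K 120 Hz, 10-bit RGB: {raw_gbps(3840, 2160, 120, 10):.1f} Gbps") # ~29.9
print(f"8K 60 Hz,  10-bit RGB: {raw_gbps(7680, 4320, 60, 10):.1f} Gbps")  # ~59.7
```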
sybesis t1_j2zsyqs wrote
People seem to be missing the point: if all you need is 100 Mbps to stream video in 4K, having an extra 48 Gbps isn't going to make the video buffer faster, since your internet connection speed is way under that (even on a LAN), and there's no way you'd encode the video such that you'd need more than a fraction of the potential speed HDMI gives you.
riffruff2 t1_j30fklm wrote
It has nothing to do with buffering or your internet speed. The video file you're streaming over the internet is compressed. If you're using a set-top box (like a Roku), it receives the video and decodes it, then sends the uncompressed video over the HDMI cable. That needs significantly more bandwidth than the compressed video, which is why HDMI supports significantly faster speeds.
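To put rough numbers on that (a sketch; the 16 Mbps figure is a ballpark 4K streaming bitrate I'm assuming, not a published Netflix number):

```python
# Compressed stream in vs. uncompressed pixels out of a set-top box.
STREAM_MBPS = 16  # assumed ballpark 4K streaming bitrate; varies by service

# Uncompressed 4K at 24 fps, 8 bits per channel, RGB:
uncompressed_mbps = 3840 * 2160 * 24 * 8 * 3 / 1e6  # ~4,778 Mbps

print(f"HDMI output is roughly {uncompressed_mbps / STREAM_MBPS:.0f}x "
      f"the compressed stream's bitrate")
```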
Nearby-Ad5092 t1_j30nvkk wrote
Can they put the decompression into the TV?
riffruff2 t1_j31kdoi wrote
Not with HDMI. The HDMI spec uses uncompressed video data. The audio can be compressed though.
Regardless, the issue with this TV is that there's likely a second compression and decompression happening between the TV and the wireless box. That increases latency and almost certainly causes quality loss too. The article is light on details, so there's not really much to go on.
sybesis t1_j3272dr wrote
It doesn't matter. A Roku receives a stream and decompresses/decodes it to send through HDMI. With that TV, what's likely going to happen is that you'll just stream the encoded video directly to the TV without having to decode it first.
So even if you used another system, what would happen is that the box would stream the video from the internet without decoding/decompressing it, and it would be sent directly over the network to your TV.
HDMI supporting significantly faster speeds is irrelevant, because you basically don't need them for video streaming. The only case where it could make sense is real-time video streaming for gaming. That could be an issue, but in reality I doubt it's much of one. It's not like encoding/compressing video is terribly hard to do; games could output an encoded stream without having to resize frames or do anything "expensive".
riffruff2 t1_j35ukuf wrote
>So even if you used another system, what would happen is that the box would stream the video from the internet without decoding/decompressing it, and it would be sent directly over the network to your TV.
It doesn't work like that. HDMI carries uncompressed video. Yes, it'd be great if it could do what you're talking about. But that's not possible with HDMI.
>HDMI supporting significantly faster speeds is irrelevant, because you basically don't need them for video streaming. The only case where it could make sense is real-time video streaming for gaming. That could be an issue, but in reality I doubt it's much of one. It's not like encoding/compressing video is terribly hard to do; games could output an encoded stream without having to resize frames or do anything "expensive".
Look at the bandwidth requirements for uncompressed video.
sybesis t1_j373e6g wrote
> It doesn't work like that. HDMI carries uncompressed video. Yes, it'd be great if it could do what you're talking about. But that's not possible with HDMI.
It works exactly like that, because we're talking about not using HDMI. I'm not sure why you're talking about sending uncompressed video over HDMI when I'm talking about sending compressed data over the network. The speed requirement for uncompressed video over HDMI is irrelevant, because we're talking about sending compressed/encoded video over the network.
That's the whole point here. Nobody here is saying that HDMI speed is unnecessary for uncompressed video. We're saying sending uncompressed video isn't necessary, since video formats have lossless compression and can be sent over a lower-bandwidth network just fine.
MarkedZuckerPunch t1_j37r7l7 wrote
The streaming stick still uses HDMI to connect to the box, so it has to send uncompressed video (Edit: and it might then get recompressed by the box, which further degrades an already degraded video). Also, there are many more sources than just a Roku device where you'd get significant quality loss from sending compressed video, plus high wireless latency, like gaming consoles and Blu-ray players.
So it's not about getting rid of HDMI cables, just the cable to the TV. Even if they magically solve all those issues, it's still stupid because there needs to be a power cable. Just do it like Samsung: one cable for video AND power.
sybesis t1_j37zftj wrote
HDMI sticks mainly exist because TVs usually didn't have the processing power to decode/decompress video streams, and you can't install an alternative OS to make your TV a Roku TV or a Chrome TV or whatever you want.
So what you do is plug in an external device and use your TV as a display. It just happens that HDMI is a widely used standard. It's not a necessity; we don't have to use HDMI, it's just convenient.
The point is that if you can receive video on a 100 Mbps connection, that's the minimum you need to somehow send it to the TV. That you send uncompressed video over HDMI is completely irrelevant, because HDMI isn't the bottleneck; the internet connection is the bottleneck.
It's a bit like how Bluetooth is a drop-in replacement for RS232. What LG seems to be doing is introducing a wireless standard to use as a drop-in replacement for HDMI.
The TV still provides HDMI ports on a hub, but see that as a convenience... because nobody else uses this wireless protocol yet. Eventually we could see an HDMI stick connected to a gaming console that emits the video directly over the air, just like you can replace RS232 cables with pairs of RS232 <-> Bluetooth <-> RS232.
> So it's not about getting rid of HDMI cables
It's all about removing cables. It's just that it's still too early to completely drop HDMI, for obvious reasons.
MarkedZuckerPunch t1_j389llo wrote
- Streaming sticks exist for more reasons than that. Choice being one of them, replaceability another. Imagine building the tech required for this to work without problems directly into one, skipping the connect box while keeping the small form factor and the $50 price tag. Not gonna happen.
- The tech is gonna be proprietary. So get ready for every device incorporating it directly to be vendor-locked. Or for buying expensive transmitters to plug into the HDMI port of every device you own, which will still require an uncompressed signal because it's still HDMI. Or, you know, a single connect box, just like what LG just presented.
- HDMI IS the bottleneck for anything other than streaming services. That and storage space. Even movies on Blu-rays are compressed, because storage space for uncompressed video would be insane, and even those require the full HDMI bandwidth. Sure, they could be compressed even further, but with added latency on top of the wireless latency, and even worse quality than direct Netflix streams because of real-time encoding of an already degraded source. Which brings me to
- Incredibly high latency and video degradation on video games, even IF consoles incorporated the tech directly, which they won't (because of all the reasons above).
And don't even try to compare this to high-compression, audio-only tech like Bluetooth, which no audiophile would ever use in their home cinema. If this is going to be like that, they might as well not do it.
- The power cord, nuff said
sybesis t1_j38uq9c wrote
Ok, you clearly don't understand what a bottleneck is. If HDMI cables can handle up to 48 Gbps, the cable clearly can't be the bottleneck when no other medium has that kind of bandwidth.
That's why we compress video: if we didn't, we'd need many more terabytes of storage, or more than 48 Gbps of network bandwidth, to transmit it.
When you're able to compress, the bottleneck still remains the slowest part of your system. In the case of streaming, that's your internet connection. If you don't have the required internet speed to download the stream, you can't hope to watch it, even with 48 Gbps-capable cables.
Just to show how ridiculous the claim is: if we had a video that required 2 Gbps of bandwidth, that would mean an internet connection of at least 2 Gbps, or about 15 GB of storage for 60 seconds of footage.
In reality, the footage is compressed, and it can be compressed in a lossless format so quality doesn't degrade and no latency is necessarily introduced. One example: say 90% of a frame's pixels are identical to the previous frame. There's no point in sending all of the pixels; you'd send only the 10% that changed and update those. Even before compressing that 10% any further, the frame is 90% smaller than a full frame, and it won't be slower, because updating 10% of a frame takes less time than updating the whole thing. (A minimal sketch of this idea is below.)
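Here's that delta-frame idea as a toy NumPy sketch (illustrative only; this is the intuition behind inter-frame coding, not how real codecs like H.264 actually implement it):

```python
import numpy as np

def encode_delta(prev_frame: np.ndarray, frame: np.ndarray):
    """Send only the pixels that changed since the previous frame."""
    changed = np.nonzero(frame != prev_frame)  # indices of changed pixels
    return changed, frame[changed]             # positions + new values

def decode_delta(prev_frame: np.ndarray, changed, values) -> np.ndarray:
    """Rebuild the current frame from the previous one plus the delta."""
    frame = prev_frame.copy()
    frame[changed] = values
    return frame

# Toy example: a 100x100 grayscale frame where only ~10% of pixels change.
rng = np.random.default_rng(0)
prev = rng.integers(0, 256, (100, 100), dtype=np.uint8)
cur = prev.copy()
mask = rng.random((100, 100)) < 0.10
cur[mask] = rng.integers(0, 256, mask.sum(), dtype=np.uint8)

changed, values = encode_delta(prev, cur)
assert np.array_equal(decode_delta(prev, changed, values), cur)  # lossless round-trip
print(f"delta carries {values.size} of {cur.size} pixels")       # ~10% of the frame
```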
> The tech is gonna be proprietary. So get ready for every device incorporating it directly to be vendor-locked.
It's possible, but I sincerely doubt it. Creating vendor-locked technology like that means nobody would want to integrate with their TVs... then it's just a matter of time until a consortium is created to replace the vendor-locked technology with an alternative that everyone uses. That's why USB is used everywhere instead of FireWire, why Bluetooth is so common nowadays, how we got RCA cables and then HDMI cables instead of vendor-locked cables, and why wireless charging supports a common protocol instead of reinventing the wheel. Having a vendor-locked system would be a terrible move nowadays. I'd imagine they'll build a consortium and use their work as the basis for a future standard that's backward compatible with what they currently made.
MarkedZuckerPunch t1_j39j7zb wrote
I know how lossless compression works. Netflix doesn't use lossless compression, because the file size would still be way too large. Blu-rays can't be further compressed losslessly, because they already use lossy compression; trying to compress them losslessly would at best do nothing and at worst make them larger. So I don't even know why you mentioned lossless compression, unless you were either actually suggesting that or thought that Netflix uses lossless compression.
Also, I don't know why you're acting like Netflix streams are in any way representative of the speed this connect box would need to transmit reliably and with near-zero latency (you can't buffer video games), when 4K 120 Hz 4:4:4 chroma wasn't even possible with HDMI 2.0 (sound like a bottleneck to you?). That means you need more than 14 Gbps for it, quite a bit more actually, 20 Gbps or more. That's double the Wi-Fi 6 max speed. There are also TVs now with 240 Hz, which even HDMI 2.1 can't do at 4K, and once we get to 8K TVs, those 42 Gbps probably won't be enough anyway. That's more than 4 times the Wi-Fi 6 max speed. Remember: reliable and near-zero latency.
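Rough math behind those multiples (raw pixel rates only, and Wi-Fi 6's ~9.6 Gbps is a theoretical ceiling that real-world links don't reach):

```python
# Comparing raw video bandwidth needs against Wi-Fi 6's theoretical max.
WIFI6_MAX_GBPS = 9.6  # on-paper maximum; real throughput is far lower

def raw_gbps(w, h, fps, bits_per_channel, channels=3):
    return w * h * fps * bits_per_channel * channels / 1e9

for label, need in [
    ("4K 120 Hz 8-bit 4:4:4 ", raw_gbps(3840, 2160, 120, 8)),   # ~23.9 Gbps
    ("4K 240 Hz 8-bit 4:4:4 ", raw_gbps(3840, 2160, 240, 8)),   # ~47.8 Gbps
    ("8K  60 Hz 10-bit 4:4:4", raw_gbps(7680, 4320, 60, 10)),   # ~59.7 Gbps
]:
    print(f"{label}: {need:.1f} Gbps = {need / WIFI6_MAX_GBPS:.1f}x Wi-Fi 6 max")
```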
>That's why USB is used everywhere instead of FireWire, why Bluetooth is so common nowadays, how we got RCA cables and then HDMI cables instead of vendor-locked cables, and why wireless charging supports a common protocol instead of reinventing the wheel.
Why did those even have to be replaced? Because some companies were doing it before everyone else and tried to profit off of it. Why did they get replaced? Because the technology got important enough to warrant building consortiums and standards. Will this eventually be the same? Probably. Are we there yet? No. Why? Because it's still experimental. Did other TV manufacturers announce a similar feature? No. So they probably didn't work together on a new standard. Also, you act like no company would do anything proprietary these days, which is just false.
Note: I'm not saying they'll create vendor lock-in. I'm saying they won't do it as you described at all. At least not any time soon.
Note 2: we had open HDR10 at first, but the proprietary Dolby Vision Codec won against HDR10+ and Samsung doesn't want to pay royalties for it.
riffruff2 t1_j39d49i wrote
Dude, the entire argument was that if you're using an HDMI device, then you will have quality loss with this wireless solution as opposed to a standard non-wireless TV. You're pulling a classic strawman argument; taking HDMI out of the equation is not an argument. I can't connect my Roku, computer, Xbox, PlayStation, or whatever other device I have to this TV without quality loss.
sybesis t1_j39givz wrote
You're basing your argument on assumptions. Here's a review by Linus Tech Tips from a year ago of a wireless drop-in replacement for HDMI. Note that it's a replacement for HDMI, not simply ditching it.
https://www.youtube.com/watch?v=kojTyPdhp3s&ab_channel=LinusTechTips
> I can't connect my Roku, computer, Xbox, PlayStation, or whatever other device I have to this TV without quality loss.
Based on what evidence? Do you sincerely believe LG would release a device with a technology that makes video look shitty on their TVs?
riffruff2 t1_j39k0m9 wrote
I mean, that device severely limits the quality; the evidence is in your video. It doesn't even support HDR, for example. Is that fine for you?
Yes, I believe LG would release it, to be the first. I work very closely with both LG and Samsung on their hospitality displays and have weekly meetings with their display engineers. My evidence is from experience.
Gwthrowaway80 t1_j30cd72 wrote
Regarding your edit: your statement that 4K Netflix is streamed to you at a lower data rate than a full-bandwidth HDMI 2.1 stream is almost true. However, viewing that movie would take the lossy compression from the Netflix stream, decompress it at the receiver, then do a second lossy compression to stream it wirelessly to the TV, where it is decompressed a second time. There will almost certainly be quality loss.
However, this scenario totally ignores use cases that aren't streaming movies. 4K Blu-ray is one example that could lose quality, but a bigger one is gaming. Pushing a 4K display at 120 Hz will fill the 48 Gbps bandwidth of a wired connection. I presume the frame rate will suffer as a consequence of the shift to a wireless interconnect.
Without any documentation, I acknowledge that I’m speculating, but it is informed speculation.
Dawzy t1_j316521 wrote
Regarding your edit… Because that’s exactly what can happen?
It might not, but it certainly could.
Vybo t1_j31dn3v wrote
If you're not streaming it directly to the TV's operating system (using the app on the TV itself), you're introducing another recompression. That's compressed video of a compressed video, and it worsens the quality a lot. Try saving a JPEG a few times over itself and then compare the initial image with the latest save; you'll see that the latter has more artifacts. (A quick sketch of that experiment is below.)
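A quick way to see that generation loss yourself (a minimal sketch using the Pillow library; "photo.jpg" is a placeholder path, and the artifact severity depends on the image and quality setting):

```python
from io import BytesIO

import numpy as np
from PIL import Image

# Re-save a JPEG through 25 lossy generations and watch it drift away
# from the original. Swap "photo.jpg" for any photo you have on hand.
img = Image.open("photo.jpg").convert("RGB")
original = img.copy()

for _ in range(25):
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=75)  # each save is a lossy re-encode
    buf.seek(0)
    img = Image.open(buf).convert("RGB")

# Mean absolute per-pixel difference vs. the original grows with generations.
diff = np.abs(np.asarray(img, dtype=int) - np.asarray(original, dtype=int))
print(f"mean per-pixel error after 25 generations: {diff.mean():.2f}")
```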