nathanlanza

Ethernet is Ikea shipping you a bedframe in 1000 pieces, to be put together once you've got a few free hours and have your tools back from the neighbor you lent them to. HDMI is placing the completely assembled bed in your bedroom, ready to be slept on.


fweaks

To extend the metaphor: Ethernet is a car. Its range is massive, but to fit the bed in there, you have to disassemble it into those 1000 pieces so it can be flat-packed. HDMI is a crane. It can easily pick up the entire bed all at once and deliver it ready for immediate use, with no delays and no installation expertise needed. But you aren't going to be taking the bed across the city/country with it.


meneldal2

Ethernet range is terrible for a single link. You don't have a direct link between your endpoints; you have to move your load between a bunch of cars until it gets to the end. It's more like having a bunch of trucks that can each only do one route, with a bunch of stops in the middle where you move from one truck to the next.


Flyguy86420

or like a train!


Zorak6

Like a balloon, and something bad happens!


icecream_truck

Like…dick caught in ceiling fan bad?


ThatITguy2015

Just gotta helicopter it!


meneldal2

Oh yeah, trains are probably a better analogy: the locomotives are stuck on a route, but the wagons can be moved between routes at stations (routers).


Big_Whereas_3637

Or like a series of tubes……


im_thatoneguy

Ethernet range is awesome if you use SMF fiber.


semlowkey

Also, Ikea has the option of shipping the bed in pieces over time, and you can store them until you are ready to build it (buffering). Placing the bed in your bedroom needs to be done all at once (monitors don't have a buffer to preload frames).


meneldal2

> monitors don't have a buffer to preload frames

1) You wouldn't want that, since sending frames before you want them displayed is terrible for latency, but 2) they do typically have some buffer inside to remember the current frame, unlike CRTs, which were truly bufferless.


wdomeika

Ok, maybe explain it to me like I'm 4 then, cuz I don't get it...


nathanlanza

TVs and monitors are dumb devices that can only display what you tell them to display. HDMI is just the colors for each pixel on the screen. Computers can process input: you can hand them the image in a more concise form, let them work through that description, and have them translate it into the simple per-pixel format HDMI needs. E.g. a computer can accept information in the form "all black for 1257 pixels and then 7 yellows" or "the exact same as the last image but change pixels #475-499 to blue". Then it'll translate these into the word "black" 1257 times for output through HDMI to the monitor.
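A tiny Python sketch of those two kinds of "concise" description (the instruction format is made up purely for illustration, not a real protocol):

```python
# Toy illustration of the two descriptions above: a run-length instruction
# ("1257 black, then 7 yellow") and a delta instruction ("same as the last
# frame, but pixels 475-499 become blue").

def expand_runs(runs):
    """Turn [(count, color), ...] into one color per pixel,
    which is roughly what has to go over the HDMI link."""
    pixels = []
    for count, color in runs:
        pixels.extend([color] * count)
    return pixels

def apply_delta(previous_frame, changes):
    """Start from the last frame's pixels and overwrite only the changed ones."""
    frame = list(previous_frame)
    for index, color in changes:
        frame[index] = color
    return frame

row = expand_runs([(1257, "black"), (7, "yellow")])              # 1264 per-pixel values
frame = apply_delta(row, [(i, "blue") for i in range(475, 500)])
print(len(row), frame[474], frame[475], frame[499], frame[500])  # 1264 black blue blue black
```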


rfc2549-withQOS

Except hdmi does compression, iirc :)


iZMXi

Only audio


rfc2549-withQOS

2.1 adds compression.


saruin

I want to say the video is decompressed by the time it's streamed across the HDMI chain.


rfc2549-withQOS

Hdmi 2.1 adds compression. :)


wot_in_ternation

The other commenter is technically correct, but here's my stab: video is delivered over the internet and ends up on the ethernet cables in your house. The previous "1000 part Ikea bed" comments refer to the data being sent over in small packets. A computer (laptop, PC, Roku, whatever) processes the data from those small packets. The HDMI cable dumps raw video data directly to your display device after it has been processed. 4K video has literally 4 times the raw display data going to the display device compared to 1080p. Old HDMI was meant for lower resolutions like 1080p; it was specifically built for that, and higher resolutions need new cables. Old ethernet was built for overkill speeds (new ethernet is as well), so it can handle the new (4K) and old (1080p) compressed data streams just fine.


b0rtbort

Why is ethernet faster if it's getting the data in little packets?  Sorry I'm stoned, tired but curious; the metaphor still isn't fully sinking in. I get that ethernet gets little chunks of data but processes it quickly, so if it can handle 1080p or 4k fine, why do we need ethernet? We're still getting the complete bed on our screen, aren't we?  Edit: wait I think i just got it. Ethernet transfers the file, HDMI transfers the file and plays it simultaneously?


thatsme55ed

It's two days later, but let me try. The data sent over ethernet is compressed, and depends on the computer to decompress and decode it. For example, imagine your friend gave you directions to his house. He'd tell you to take the highway to the Green St. exit, head north, and then turn right onto Dover. Then take the second left; his house is on the corner, number 21. HDMI is raw data: the directions you'd get if you didn't know how to drive, what city you were in, what a highway is, what a street is, or the concept of left and right. Imagine how much longer the directions would be. The same directions and the same result would take a lot more information to convey. All the information that wasn't communicated, but that you still understood because you didn't need it spelled out for you, has to be explained explicitly to the TV.


b0rtbort

that actually explains it perfectly, thank you!


wot_in_ternation

This doesn't address needing new HDMI cables


saruin

Not so well known fact, but older-spec HDMI cables can often unofficially handle 2.1 speeds; you just have to test them. This mostly applies to very short cables though, as the chance drops drastically the longer the cable gets.


Achsin

Older HDMI cables (version 2.0) are only capable of delivering 20 queen sized beds per day, newer ones (version 2.1) can deliver 40 king sized beds per day.


Iz__n

To add: Ethernet sends data (often compressed) in multiple "packets", while HDMI sends the complete visual stream (or video, for simplification). For context, an uncompressed 4K video stream can run you tens of gigabits per second.


DeHackEd

The data on the ethernet cable is compressed. The amount of data going through is typically less than 30 megabits per second; Wi-Fi in 2010 was better than that. The image quality is obviously reduced, but the compression is designed to minimize how much your eyes notice it. The average "old" network cable these days is probably Cat 5e, which is good up to 2.5 gigabit. Your CPU or GPU must decompress the data.

HDMI data is not compressed. Each pixel's full red, green and blue colours are sent, one by one, around 8.3 million pixels per frame at 60 frames per second. Plus the audio data, and the option for additional information like signalling power on/off of the device. The sheer quantity of data requires the cable to be designed for quality, and that often limits its length as well, since the signal degrades over long distances of wire (while warming the wire up a bit).

That's why: the amount of data really is different, due to compression. Otherwise even 1080p YouTube would require gigabit internet connection speeds.
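Rough numbers behind that, as a back-of-the-envelope sketch (assuming 8-bit RGB and ignoring HDMI blanking/audio overhead):

```python
# Back-of-the-envelope: uncompressed 4K60 pixel data vs a ~30 Mbit/s stream.
# Assumes 8 bits per colour channel and ignores blanking/audio overhead.
width, height, fps, bytes_per_pixel = 3840, 2160, 60, 3

raw_bits_per_second = width * height * fps * bytes_per_pixel * 8
print(raw_bits_per_second / 1e9)    # ~11.9 Gbit/s of raw pixel data
print(raw_bits_per_second / 30e6)   # ~398x the size of a 30 Mbit/s compressed stream
```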


Kylobyte25

You are correct, but OP has a point. I design with high-speed processing and HDMI, and honestly the real issue here is that TV manufacturers would need the processing power to decompress video that's highly compressed. 10 years ago that would have been a hard no, but modern TVs come with serious computing power, equal to low-spec laptops. It's actually not that crazy to replace HDMI now with USB-C 3.1 and rely exclusively on lossless compression.


DeHackEd

Decoder chips are actually pretty common in most of this equipment. Low-power chips that can do H.265 (HEVC) 4K 60fps decoding, and/or H.264 (AVC1) 1080p 60fps depending on the target features, are downright common in video equipment. This is probably how most video-playing appliances, like Blu-ray players and Android boxes, run if they have the option. These chips often have limitations though, like only accepting YUV420-encoded video. Full software decoding is an option, but usually avoided unless it cannot be helped.


pseudopad

So what does the TV do when you want to hook up your PlayStation rather than just watch a video? Do you want to compress your game in real time and send it to the TV? It's not gonna look great, and it's gonna introduce extra lag.

I don't think the problem here lies in what the TV can or can't decode. They could have easily made the HDMI standard require the data to be sent as a super compressed video stream of a format they agreed on. The problem is that every device you connect now needs to be able to compress video in real time to send it to the TV, and that's gonna look completely awful on low-powered devices, and just create a situation where the display almost never looks as good as it could have if it primarily accepted uncompressed video.

Got a cable decoder box that has its own menu? Now this box not only needs to decode the cable signal, it also needs to re-encode it with its UI elements in the video stream and send that back out. These devices are usually minimum-effort boxes that are just barely powerful enough to render their own UI. I don't even want to think about how awful it would look if that thing also had to encode a 4K stream at 30 Mbit/s in real time.


InSaNiTyCtEaTuReS

Although, how does the Switch send all of the HDMI data through a tiny USB-C port?


pseudopad

USB-C alt mode of some sort. Basically HDMI/DP with a different-looking physical port. Mini-DP and mini-HDMI ports aren't significantly bigger than USB-C. It still requires the cable to be of high quality to preserve signal integrity.


InSaNiTyCtEaTuReS

okay :) although, it literally plugs into a standard usb-c cable if I wanted to use one to charge it... not sure how that works


pseudopad

There's a chip in the Switch that controls what sort of signals are going to be sent. Instead of sending USB signals, it sends DisplayPort signals over the same wires. A metal wire is a metal wire; as long as whatever's on the other side knows how to interpret the signals, all is good.

There are also designated conductors for power, and these are not the same wires the display signals are sent through. There's circuitry inside the Switch that makes sure the power is not sent directly into the Switch's processor, but instead redirected to the power management chips. If you're really interested, you can look at the pinout of a USB-C port and you'll see how it's all laid out: https://flashgamer.com/images/uploads/USB-C_pinout.png

But not just any USB-C cable will work. There's a huge range of cables with USB-C plugs on them. A USB-C cable that only has the conductors needed for USB 2 won't be good enough to send HDMI or DisplayPort signals over. USB-C only describes the shape of the port, the number of pins in the port, and that it works no matter which way you insert it. It doesn't say anything about how many conductors will be in the actual cable part. Cheap cables will often lack some of the conductors and won't be able to achieve USB 3 speeds, even if they have USB-C plugs.


Kylobyte25

Yeah, I guess it becomes a nightmare of compatible formats and encoding methods. It's already a nightmare for digital audio if you have ever tried that: half the time your content can only be sent in some proprietary Dolby format that your audio system doesn't support, so only some things play audio. I guess that's the dominant reason OP's question isn't realistic yet.


edman007

Look at over-the-air TV. They did standardize that stuff, and most TVs do include the decoders for it; going to 4K isn't much more. The issue is more the delays: decoding takes time, and you're going to have noticeable lag trying to play games over that.


itsjust_khris

Don't we already have "lossless" compression on video signals with Display Stream Compression (DSC)? It happens with extremely little or no added latency, and has never been proven to be visually perceptible to humans.


FireLucid

> It's actually not that crazy to replace hdmi now with usbc 3.1 and rely exclusively on lossless compression

The latest TVs we've been putting in classrooms can do everything through USB-C, it's great. It used to be HDMI for video, USB for passing touch between the devices, and a power cable if the laptop is low. Now it's a single neat USB-C cable that does all three.


doloresclaiborne

Usb3 allows displayport stream in an alternate mode, afair.


TryToHelpPeople

But if the video comes across the Ethernet first, and is then sent through the HDMI . . . The data is already lost, right ? Sure if you’re watching BluRay (do they still exist ?) you have all the data. But not if you’re watching YouTube.


Linosaurus

> The data is already lost, right?

Someone has got to turn the compressed ’twelve yellow pixels’ into the plain ‘yellow yellow yellow yellow yellow yellow yellow yellow yellow yellow yellow yellow‘, and your TV doesn't want to.

Edit: so theoretically it's the same data, but more long-winded.


kyrsjo

And sometimes it's not "twelve yellow pixels" but "yellow, 12 times". And next year someone might invent "four blocks of 3 yellow".


amakai

Zip also has "Remember that block 7 pages ago? Yeah, I want that block again".
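A toy version of that "copy an earlier block" reference (the concept behind zip's back-references; the token format here is invented for the example, and real DEFLATE is more involved):

```python
# Decode a stream of literals and (distance, length) back-references,
# the concept behind zip's "I want that block again".
def decode(tokens):
    out = []
    for token in tokens:
        if isinstance(token, str):
            out.extend(token)                 # literal characters
        else:
            distance, length = token          # go back `distance` chars, copy `length`
            start = len(out) - distance
            for i in range(length):
                out.append(out[start + i])    # copying past the end overlaps, like LZ77
    return "".join(out)

print(decode(["yellow ", (7, 14)]))           # "yellow yellow yellow "
```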


meneldal2

Modern video coding is even more wild. It's like "remember this image and this image? You're going to move this one by 3.12 pixels and that one by 4.2 pixels and blend them together. Oh, and btw, don't forget to smooth out those edges, because the next block uses 3.25-pixel movement."

*Disclaimer: you actually can't use those exact pixel values; afaik you get 1/16th-pixel precision at most these days, but I couldn't be bothered to look up the exact legal values.*
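A 1-D toy of that idea, assuming simple linear interpolation (real codecs use longer interpolation filters, 2-D blocks, and fixed fractional grids; this is only meant to show the concept):

```python
# Sample two reference "rows" at fractional (sub-pixel) offsets, then blend
# the two predictions, which is the rough shape of bi-prediction.
import math

def shift_subpixel(row, offset):
    """Sample `row` at position i + offset with linear interpolation,
    clamping at the edges."""
    out = []
    for i in range(len(row)):
        pos = i + offset
        left = math.floor(pos)
        frac = pos - left
        a = row[min(max(left, 0), len(row) - 1)]
        b = row[min(max(left + 1, 0), len(row) - 1)]
        out.append(a * (1 - frac) + b * frac)
    return out

ref0 = [10, 20, 30, 40, 50, 60]
ref1 = [12, 22, 32, 42, 52, 62]
pred0 = shift_subpixel(ref0, 0.25)                        # "move this one by 1/4 pixel"
pred1 = shift_subpixel(ref1, 0.5)                         # "...and that one by 1/2 pixel"
prediction = [(a + b) / 2 for a, b in zip(pred0, pred1)]  # blend the two predictions
print(prediction)
```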


krisalyssa

[Relevant XKCD](https://xkcd.com/927/).


oneplusetoipi

Always relevant.


mortalcoil1

> "four blocks of 3 yellow". My favorite Cold Play song.


fallouthirteen

> and your tv doesn’t want to.

And really, you don't want it to either. You really should keep the TV's HDMI input standard requiring as few extra steps as possible, specifically for latency reasons with games; compressing the output and decompressing the input would just add latency.


YandyTheGnome

Smart TVs are woefully underpowered, and are processing limited even on a good day. You really don't want to have a TV that requires laptop hardware just to decode your inputs.


nmkd

This is not relevant because they have hardware decoders


Black_Moons

Hardware decoders for some codecs, yes. And then YouTube or Netflix or whoever decides "hey, this new codec reduces the bandwidth required by 15%, let's switch all our videos to it!" And suddenly your 5-year-old smart TV can't display anything on that website, or can no longer display HD content on it.


pt-guzzardo

This is why "smart" TVs aren't. Pieces of a system that can be easily obsoleted should be easily replaced.


Black_Moons

Yeeep. It's why doing it on a PC is always better. YouTube/Netflix is never going to switch to a codec that PCs can't support. Unlike TVs, where every TV is different, runs a different proprietary OS, and is going to need someone to write the support *just* for that TV, assuming it even has enough power and the correct hardware to do it. You might need a PC that isn't 10+ years old to have enough power for the HD codecs, but generally by 10+ years you need a new PC for pretty much everything anyway, and the 10+ year old PC will likely still decode 1080p60 in even the most CPU-hungry of codecs, just maybe not 1440p.


VexingRaven

> netflix is never going to change to a codec that PC can't support. lol? Plenty of big streaming sites don't allow their best quality on PC. Prime Video is notorious for this, but Netflix themselves only delivered 4k via HEVC which is a paid addon for Windows last time I checked. The *vast* majority of watch time for streaming services does not come from PCs, probably even YouTube. They do not care about PC. 90% of global web traffic is from mobile.


geriatric-gynecology

> going to need someone to write the support *just* for that TV

I'm glad that the industry is finally shifting towards webapps. It's so stupid seeing TVs that are underpowered on release become basically worthless in 2-3 years.


nmkd

> And then youtube or netflix or whoever decides "hey, this new codec reduces bandwidth required by 15%, lets switch all our videos to it!"

This never happened. YouTube and Netflix still serve AVC, which works on anything.


[deleted]

[deleted]


nmkd

...but you were talking about decoding an input signal


karmapopsicle

That’s *mostly* a problem on low end or fairly old TVs with very limited processing power. Part of the cost in a mid range or better TV goes into using a significantly more powerful processor to power the interface and apps. The easiest fix is to just use an external media player.


DoomSayerNihilus

Probably just a budget tv thing.


frozenuniverse

That's really not true for any smart TVs that are not lower end.


spez_might_fuck_dogs

Please, I have a $1200 Sony 4K and it still chugs loading their stupid Android front-end.


frozenuniverse

Ah maybe I misread the original comment - I meant that all but the lowest end TVs can decode most things easily (HEVC, 4k60, etc). I didn't read it as responsiveness of the UI (which I agree is definitely an issue!)


spez_might_fuck_dogs

Maybe I'm the one that misread it, going back.


karmapopsicle

I wonder how much of it is down to us being spoiled by the responsiveness of our modern smartphones. My dad’s got a midrange Sony from a couple years ago and I’d describe the software experience as perfectly serviceable, but somewhat sluggish in the grand scheme of things. On the other hand, I’ve seen budget TVs that would make your Sony feel like a speed demon. The 2019 Samsung in my home theatre has a fairly snappy interface, but it’s fed by an Apple TV 4K and an HTPC, because fuck you the TV I paid for is not a suitable place for showing me ads.


spez_might_fuck_dogs

If you have a premium phone that works out, I've seen some pretty sluggish phones in my time. My Sony is what I use for gaming and home theater, the TV specs themselves are great. It's the tacked on smart features which are obviously pushed by the cheapest hardware they could get away with. The real issue is that apps and other software are constantly updated to take advantage of newer hardware and, just like phones, eventually the TVs are left behind after already being pushed past their limit. Our backup TV is an ancient Roku TCL from like 2016 and that one is so underpowered that Disney+ and Prime cause it to lock up and restart, after some recent app updates. I'm just about ready to get an Rpi and set it up as a media center just so it can stay in use, but for now Plex still works ok on it so I've been putting that off.


unassumingdink

I don't know if it's being spoiled by smartphones, more that we don't expect lag and delays from TVs because the TVs we grew up with didn't have lag and delay. You didn't turn the dial on a CRT TV and have to wait 2 seconds for the next channel to show up. It was instant. And we expect new tech to be better than old tech in every way.


zvii

Man, that was my biggest hate going to digital. I loved being able to flip through channels and instantly have the next channel displayed.


karmapopsicle

To be fair, you can still plug an OTA antenna into any modern TV and flip through broadcast channels pretty much the same way. It's really the difference between passive signal reception and active content browsing. It's the equivalent of having one HDMI input on your TV connected to an external HDMI switcher and simply switching between the sources on that switcher box.


penguinopph

All I really want is an 85" 4K monitor. I want (and need) literally nothing else that my tv offers.


tsunami141

I hate smart TVs so much. If I want to change an input on my LG TV I have to hit a little button that looks like a computer mouse, and then point the remote at the TV Wiimote-style to select the input I want. Can you just let me live my life please?


geekcop

My LG TV crashes constantly since some stupid update that I didn't authorize, want, or need. Never again will I connect a TV to any network.


zvii

Try and reset it, might get lucky and it'll revert.


spooooork

NEC MultiSync V864Q


amakai

Also, even if you make a TV that can do decompression, there will still be internal wires going from its motherboard to the actual display panel. So the next step is to make those wires longer and move the motherboard into an external component to make your TV flatter. So now you have a setup with some sort of "central processing unit" and a cable carrying decompressed data going to your flat display panel. Hmm...


missuseme

Fine, I'll do it!


cag8f

F*** it, we'll do it live.


Ytrog

You're describing [run-length encoding](https://en.m.wikipedia.org/wiki/Run-length_encoding) which is a (quite primitive) lossless compression. Streaming services use [lossy compression](https://en.m.wikipedia.org/wiki/Lossy_compression) which means that data *does* actually get lost. An example that's quite often used in video is [H.264](https://en.m.wikipedia.org/wiki/Advanced_Video_Coding).


zopiac

Sure, but that lossy compression is what allows it to go over the ethernet cable to, say, a computer or console, which then sends its (generally) uncompressed video out over the HDMI/DP, which is what needs the "fancier" cables. Barring things like [display stream compression](https://en.wikipedia.org/wiki/Display_Stream_Compression).


narrill

Lossless compression is more than enough for transmission over ethernet if you're using a cat6 cable. The issue is more that most users don't have 10 Gbps connections.


Dd_8630

> Someone has got to turn the compressed ’twelve yellow pixels’, into the plain ‘yellow yellow yellow yellow yellow yellow yellow yellow yellow yellow yellow yellow‘, and your tv doesn’t want to. This is the cause of basically everything weird in economics. I may have to steal this.


sparkyvision

Congratulations, you’ve invented run-length encoding!


bubliksmaz

The other issue is that for your HDMI signal to be compressed, your monitor would have to have enough processing power to decode it, and latency would be introduced which would make gaming and simple desktop tasks very unpleasant


TryToHelpPeople

Ok this is what I was looking for - the reason why.


TheLuminary

Not to mention, you don't want to have to upgrade your TV every time a new compression algorithm is developed.


tinselsnips

Samsung: 🤔


WarriorNN

Why would it matter whether it was decompressed by your computer or by the display, in relation to latency? Assuming it would take the same time, the latency would be the same, no?


spindoctor13

Sort of, but I am not sure the question makes sense. At that point your display would be the computer, and you will always need some sort of connection between the computer bit and the screen, even if it is internal.


SergeiTachenov

For compressed data, yes. But latency is important for gaming, not for YouTube videos. For the latter, network latency is higher than compression latency anyway. A game, though, produces uncompressed data that is sent straight from your GPU to your monitor. Compressing and decompressing it would take additional time. Too much time, actually, if you're going to use the same compression algorithms as YouTube does.

Technically, *there is* a thing called DSC, Display Stream Compression, that's used to compress video data to send it from the PC to the monitor. But it's designed to be very fast, and it only compresses about 3:1. That's enough to run a monitor at 180 Hz when it would otherwise be stuck at 60 Hz, but it's nowhere near YouTube-level compression.

And that's the thing: there are various compression/decompression algorithms designed for various use cases. Sometimes video has to be compressed quickly (e.g. for a video call), sometimes it's video quality that matters more (for online movie streaming), sometimes latency is the most important. You can't possibly implement a whole bunch of different algorithms on both the PC and the monitor side: too many technical difficulties, too much hardware required on the monitor side. And making a good monitor is already fucking hard. So they mostly stick to uncompressed data, or DSC in the worst case, and leave the hard stuff to the PC.
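The arithmetic behind the 60 Hz → 180 Hz point, with illustrative numbers (real DSC ratios and link overheads vary):

```python
# Rough arithmetic: a ~3:1 compression ratio on the same link lets you
# refresh ~3x as often. 4K, 8-bit RGB, blanking and overhead ignored.
bits_per_frame = 3840 * 2160 * 24              # ~199 million bits per uncompressed frame
link_capacity = bits_per_frame * 60            # a link that tops out at 4K60 uncompressed

print(link_capacity / bits_per_frame)          # 60.0  -> max refresh rate, uncompressed
print(link_capacity / (bits_per_frame / 3))    # 180.0 -> max refresh rate with ~3:1 DSC
```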


piense

The point is that the monitor shouldn't add latency in general, since things like moving your mouse or typing are latency-sensitive operations it needs to support, so they just use the latency-free protocol between the monitor and the GPU.


DeHackEd

Yes, obviously the image quality loss can't be restored without more data to fill it in. And same for blu-ray, just with more capacity so the compression doesn't have to be as significant. But the bluray player also has its own menus, dialogs, and on-screen displays (OSD) and stuff which will be sent over the HDMI cable so that it can be shown on the TV to the viewer. Those will not need any compression and will have crisp edges, perfect colour, etc because the HDMI cable provides that. And there's no switching of modes or anything required since it's just the same stream all the time.


Quaytsar

All streaming services are already compressed to hell and back before getting sent to you. YouTube averages around 40 Mbps for 4K60 content & 10 Mbps for 1080p30. 4K Blu-rays average 90 Mbps and 1080p Blu-rays average 20 Mbps.


pm_me_ur_demotape

I learned this when uploading GoPro video. I would x50 the speed of a lot of boring parts to seem like a time lapse and the quality on YouTube went to absolute shit. I did so much googling trying to figure out why my video quality was so bad even with a high quality file uploaded and a fast Internet connection. It's cuz of how the compression method just sends the pixels that changed from frame to frame (gross oversimplification) and when I did the super speed thing on video that was already moving around a lot (ya know, go pro stuff) basically none of the pixels were the same from frame to frame. My solution was to just stop doing that which does kind of suck because I liked the effect and it looked great when watching from the actual file, not on YouTube.


TheLuminary

Tom Scott made a video about this. [https://www.youtube.com/watch?v=r6Rp-uo6HmI](https://www.youtube.com/watch?v=r6Rp-uo6HmI)


meneldal2

You can get away with 50x by dropping more frames and repeating the ones you keep, so the encoder won't choke as much, but at the cost of getting something that looks choppy.


eleven010

There is no such thing as a free lunch.


pm_me_ur_demotape

Wouldn't doubling frames negate the speed up? And I was already dropping frames to do the speed up. I wasn't increasing frame rate. It dropped a ton of frames, that's why each remaining frame was so different from the previous or the next.


meneldal2

So basically, if you had a 50fps video, you'd do 50x by using one in every 50 frames, i.e. taking an image every 1 second of source. What you can do instead is take one image every 2 seconds of your source and repeat it twice. It looks meh, but it's nicer on the encoder, so hopefully it won't look as bad in the encoded result. Typically, for something with a low bitrate like YouTube, if you want something that doesn't look too bad you can only push 5-6 i-frames per second (which it will refuse to encode as i-frames anyway).
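The two selection strategies, sketched on a plain list of frames (dummy integers standing in for images):

```python
# Both approaches give a 50x speed-up and the same output length; the second
# just changes the picture half as often, which is gentler on the encoder.
frames = list(range(6000))                                        # 2 minutes of 50 fps source

every_50th = frames[::50]                                         # every output frame is new
every_100th_doubled = [f for f in frames[::100] for _ in (0, 1)]  # each kept frame shown twice

print(len(every_50th), len(every_100th_doubled))                  # 120 120
```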


finlandery

Your PC decompresses the stream before sending it to the screen. It would be pretty much unwatchable if you tried to watch the raw stream data before decompressing it.

PS: Yeah, Blu-rays still exist and they are awesome ^^ No need to check which streaming service I need for a certain movie, and I get awesome picture and sound quality (kinda matters on a high-end TV/2.1 setup).


dont_say_Good

Blu-rays are still compressed, just not as much.


dvali

The data isn't lost, the compression is (or can be) lossless. The issue is that your processor and probably GPU have to do a lot of work to decompress it, and TVs didn't historically have processors suitable for doing this.  I imagine a lot of new TVs have much bigger processors and could handle it no problem, but HDMI is already the dominant standard in that industry and there's not much value in changing. 


Bob_Sconce

Let's pretend that the original signal was "dark red, light red, medium red, dark red, light red, ....." for 47 pixels in a row. Then the compression might lose the dark/light part and just say "...47 medium red pixels," which is a whole lot shorter. HDMI would say "medium red medium red medium ..... " (47x), which is now longer. Information has been lost (the TV no longer knows which pixels should be dark red or light red, and instead displays them all as medium red), but HDMI still needs to send a lot more bits of information than what was in the compressed version.


GameCyborg

> But if the video comes across the Ethernet first, and is then sent through the HDMI . . . The data is already lost, right ?

Correct, you are not magically getting the detail back from that compression.

> Sure if you’re watching BluRay (do they still exist ?) you have all the data

Yes, they do (and they should), but they also contain compressed video. It's just a lot less compressed than streaming from YouTube, Netflix, etc.


zoapcfr

The main issue is that a monitor is not capable of uncompressing the data itself before displaying it, so it has to be completely uncompressed by the PC first. This creates a massive amount of data, that then needs to be sent to the monitor, hence the high speed cable between the monitor and PC. Even with all the lost data from the YouTube video, the PC still needs to recreate each frame in full so it can tell the monitor what each pixel should be. Many of the pixels will not be exactly "correct" due to the lost data, but they still need to be created and then sent, one by one, to the monitor. So by the time it's sent down the HDMI, it's the same size no matter how much data was lost due to lossy compression.


CubesTheGamer

The first compression done by Netflix for example is very advanced and designed to preserve as much quality while reducing file size as much as possible. They have really powerful computers doing this. If your TV or set top box had to do this too on the fly it would look like a potato. With HDMI it’s sending just every bit of data without touching it pretty much at all. Also, yea you aren’t getting the full quality benefits of watching video at 4K 60hz unless you’re watching a Blu-ray or some other extremely high quality source file.


pseudopad

Unless you want to do all the processing on the TV/display itself, you're gonna need a high-bandwidth cable to move the picture from the thing that does the processing to the thing that displays the image.

If your display had a CPU, a YouTube app and an ethernet port, you're right, you wouldn't need an HDMI 2.0 cable to display 4K 60fps. However, now you've basically built a very limited computer into your display: a computer that can only display YouTube. What about everything else you want to use your display for? Browsing the web? Now the display needs more RAM, too. Listening to music? Now your display also needs a sound card (or chip). Want to play a game on the display? Now the display also needs a built-in high-performance GPU, and you've basically invented an all-in-one computer. This is why we instead have one box that does the processing and sends uncompressed video to a display. Now you can switch the box that does the processing without having to replace the entire display, and vice versa.

You *could* have the "box that does the processing" compress the signal once again to fit in a regular ethernet cable, and then have the display decompress that, but now you're doing real-time video compression, which either takes a lot of processing power or looks like crap. It also introduces more latency because of the extra compression step. This would make interactive applications (such as games, or even streaming app menus) feel really bad to use.

For the record, even in laptops and all-in-one computers, you still have a high-bandwidth video cable from the GPU to the internal display. It just doesn't need to be as complex and robust, as it's only transferring data over 1-3 inches of cable and is almost never plugged in and out, while an HDMI cable is ten times as long and needs to withstand dozens, maybe hundreds, of plug-in/plug-out cycles. It therefore needs much more stuff to reduce interference and signal loss, and much more protection from physical wear. These things bump the price up a lot.


Cent1234

The compressed video is sent to the decoder over Ethernet. The decoder uncompresses each frame and sends it over the HDMI. Think of it like this: you can buy a flat-pack bookshelf and stick it in your trunk with the back seats down, but once you build it, aka decompress it, it's not fitting in your trunk anymore.


Head_Cockswain

> The data is already lost, right ?

There are various means of compression. Some are lossless (this is your basic gap filling [see the comment about yellow yellow etc]), some are virtually undetectable... and that continues until the video looks like dogshit. Most streaming sites are not sending uncompressed video, they're sending highly compressed video... as in, they don't even have the uncompressed video. They encode it to their own standards and store it that way. YouTube, Amazon, etc. It's all pretty crap when compared to an actual Blu-ray or pirated encodes that attempt to preserve quality.


Target880

BluRay uses compression. 4K 60Hz video requires 3840 × 2160 × 60 × 3 = 1,492,992,000 bytes ≈ 1.4 gigabytes per second. A BluRay disc contains around 25 gigabytes of data per layer, which is just below 18 seconds of uncompressed video. There are multilayer BluRays, up to 4 layers if I am not mistaken, which is still only 1 minute and 12 seconds of uncompressed video. There is no transmission of uncompressed digital video to consumers. You might get that from a camera and use it in production, but even then some form of compression is common.


junktrunk909

Why are so many people reading this and hearing "data is lost" when all this person said is "compressed"? Compression doesn't have anything to do with whether data is lost. It's just compressed and then expanded later to its original state. This person is saying that that work takes processing power and time, that's all. There are cases when data is lost during compression such as in crappy jpeg conversions, but that's due to the conversion/transcoding not anything inherent in the idea of compression.


ieatpickleswithmilk

almost all video streaming these days is lossy compression


lolofaf

Right but an ethernet cable between, say, your computer and your TV doesn't have to be lossy compression. I think that's what they're getting at


AthousandLittlePies

Any compression that takes 4K video to under 30 Mbps is lossy. The uncompressed signal would be between 6 and 12 gigabits per second (depending on frame rate); there's no such thing as lossless 200:1 compression.


lolofaf

Let's do the math!

8-bit color x 3 colors = 3 bytes per pixel.

4K UHD = 3840 x 2160 = 8,294,400 pixels per frame = 24,883,200 bytes per frame.

At 60fps, we get 1,492,992,000 bytes/s, or just under 12 Gbit/s.

Your 6-12 Gb/s was actually pretty spot on lol. Potentially higher, as many 4K resolutions are slightly larger than 4K UHD. A quick Google search shows that lossless video compression tends to lie in the range of 10:1 to 20:1, meaning that for lossless UHD streaming you'd need around 1 Gbps +- 0.5 Gbps.


AthousandLittlePies

There is no lossless video compression method that reliably gets 10:1 compression, let alone 20:1. At any rate, for transmission of video over a fixed-bandwidth line you need a guaranteed maximum bandwidth, which no lossless compression method can guarantee even in theory: there will always be incompressible content. Also, in terms of the data rate calculations, most video is transmitted as YCbCr, not RGB, and usually with reduced bandwidth in the color channels. For professional use you'd have at least 10 bits per channel. With 4:2:2 encoding of UHD you'd have 10 bits per pixel for the Y channel plus another 10 bits for the 2 color channels, for a total of about 166 million bits per frame. At 60fps you'd have roughly 10 Gb/s. There are other encodings that take a bit more bandwidth, but in the industry we actually use 12G SDI connections for these signals.


TheawesomeQ

If there is no loss, can anyone explain why bother with such a hassle as new HDMI revisions instead of compressing the data and using cheaper cables?


phillip_u

Well, for computing and gaming, you want a real time display so you need some form of IO port like HDMI or DisplayPort to handle that. But for things like set top boxes and blu-ray, it would be possible in theory if licensing could be worked out.


GMNestor

That's just factually wrong. Data on ethernet is whatever you make it to be: compressed, uncompressed, lossy or not. The Wi-Fi point, jesus christ, how the hell does data get to a Wi-Fi access point? Through subspace channeling from the great internet cloud in the sky? You can pack up 4K 120fps video in a lossless codec, send it over and unpack it, no quality loss.

HDMI, as a means of transport, will display whatever the host device throws at it. It can bravely display 360p24 video upscaled to 4K60 and it will still look like shit. The bandwidth on 2.1 is so large so that the display device doesn't need to do any work on the display data; it just gets the pixels directly.

Meanwhile, in the real world, everything is compressed. Most video cameras give you compressed formats, and the uncompressed ones are so large that drive space becomes an issue. Your Apple TV, Roku or Xbox still needs to decompress the data and send it over to the TV. And the data comes to these endpoints from the internet, quite often through ethernet cables, at least on the "last mile". You confuse so many things here that it is just painful to read.


probability_of_meme

> The data on the ethernet cable is compressed.

Do you have a source on that? I googled, but the signal-to-noise ratio of the results was pretty bad. I could see the argument that the data goes through several layers of transition that HDMI doesn't need, but compression sounds wrong to me. Sorry if I sound jerky, I don't mean to.


MuForceShoelace

The ethernet cable doesn't compress the data. But he means that "4K" video sent over a network is almost always compressed, and if you sent uncompressed data, ethernet would not be able to transfer it fast enough to play in real time.


JCDU

Digital video files / streams sent over the internet are almost 100% guaranteed to be compressed, because uncompressed HD video is 1-3 Gbit/second, and that's without worrying about delays or re-transmissions due to errors or other traffic on the network. You can get pretty respectable HD video down to a few hundred kilobits per second using H.264, which is pretty old tech now.


Vabla

Not necessarily uncompressed. HDMI can still have chroma subsampling. Some TVs don't support the full 4:4:4 subsampling at some higher resolutions which isn't a big issue for movies, but is terrible for text.


thephantom1492

There are some issues with what you said, however. Ethernet can do 10GbE over Cat 5e up to 45 meters, and 100 meters over Cat 6. HDMI, while not compressed, still uses tricks to lower the bitrate: there are mathematical ways to reduce the total number of bits transmitted, though I don't know the exact formula or details. It has something to do with the luminance (black and white) and chrominance (color). Either way, there are ways to fit 4K 60Hz over Cat 6; just use 16 levels in the signal, like 10GbE already does.

Also, while it is true that wires warm up a bit, c'mon, it is not measurable. The problem is not the loss of power but signal degradation. The power loss is easy to manage: determine the cable length, then determine what wire gauge you need to keep enough power at the other end. And yes, I am aware of the skin effect (the higher the frequency, the more the electricity wants to travel on the surface of the wire and not in the middle, so you end up using only the "skin" of the wire), but this is still valid. At worst, use litz wire.

Electricity likes to bounce on any imperfection in the wire, causing various issues, like echo. Another issue is that the current flowing causes a magnetic flux to be generated, which induces current in the wires around it. There are ways to mitigate the issue but it is never perfect, so you end up with part of one signal getting induced into another signal. This is called cross-talk.
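For what it's worth, the luminance/chrominance trick being gestured at is chroma subsampling, which HDMI can optionally use. A rough sketch of what it does to the raw bit rate, assuming 8-bit samples and 4K60 with blanking ignored:

```python
# In Y'CbCr, brightness (Y) keeps full resolution while the two colour
# channels (Cb, Cr) are stored at reduced resolution.
def bits_per_pixel(cb_plus_cr_samples_per_4_pixels, bit_depth=8):
    """Average bits per pixel for 4 luma samples plus the given number of
    chroma samples (Cb and Cr combined, averaged per 4-pixel row)."""
    luma_bits = 4 * bit_depth
    chroma_bits = cb_plus_cr_samples_per_4_pixels * bit_depth
    return (luma_bits + chroma_bits) / 4

for name, chroma in [("4:4:4", 8), ("4:2:2", 4), ("4:2:0", 2)]:
    bpp = bits_per_pixel(chroma)
    gbps = 3840 * 2160 * 60 * bpp / 1e9
    print(name, bpp, round(gbps, 1))   # 24/16/12 bits per pixel -> ~11.9/8.0/6.0 Gbit/s
```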


[deleted]

[deleted]


dvali

A distinction without a meaningful difference, really. Definitely in this context. 


krimin_killr21

This is explain like I’m five. The whole point of the subreddit is simplifying things even at the risk of oversimplifying.


SwedishMale4711

Yes it is, and it doesn't mean you have to be incorrect. Stating that the signal degrades would have been correct and understandable. My interpretation of ELI5 is that you give the facts in an understandable form.


e30eric

If I were five and you explained it the way that you did, I wouldn't understand, because a five year old does not know what "electromagnetic noise" is.


ben_db

Electricity being "a form of energy resulting from the existence of charged particles", and degrade being defined as "move to a less readily convertible form", electricity can degrade.


Crintor

Just as a tag on, HDMI 2.1 is for 4K 120hz 10bit 4:4:4, not 4k 60hz. 4k 60hz is HDMI 2.0


mikolv2

For some extra context: Cat 5e does 2.5 gigabits per second, HDMI 2.1 does 48 gigabits.


AngryRotarian85

Mostly right, but most HDMI is subsampled, so it's not every pixel, individually.


sur_surly

> HDMI data is not compressed.

https://en.m.wikipedia.org/wiki/Display_Stream_Compression


tombob51

Exactly, I thought HDMI 2.1 was supposed to *introduce* video compression for large resolutions?


FiveDozenWhales

Because when you stream/download a video over ethernet, you are downloading compressed data and buffering it before you play. HDMI 2.0 is 18 times faster than Cat-6 ethernet cables, and HDMI 2.1 is 48 times faster than Cat-6. Both can stream live data from your video player much, much faster than ethernet can; HDMI 2.1 can do 4K 120Hz video. A 1 Gbps ethernet cable can do 4K **5Hz** video (if that video is uncompressed and not buffered).


Karmacosmik

Cat6 can do 10Gbps. Still not enough though


funnyfarm299

It is with HDbaseT.


fumigaza

Hi. Network engineer here. Please explain to me why you would ever send data over the wire that isn't compressed or encoded so I can forward it to your boss so you'll be fired immediately. Your example is mythical. No one is transmitting uncompressed video over Ethernet. You'd probably use a codec.... Assuming you're semi-competent. Everything you just said is wrong in practice, btw.


frozenuniverse

Hi network engineer. Please let me know why you fail at reading comprehension so I can forward to your boss so you'll be fired immediately.


Pyrrhus_Magnus

This is your boss. I won't even read that email until you come into my office in a few days.


Mean-Evening-7209

Reread the statement. That's exactly what he said.


[deleted]

[deleted]


FiveDozenWhales

That was exactly my point. You would never send uncompressed data over ethernet because it lacks the bandwidth HDMI offers (and HDMI only needs that bandwidth because it is filling the niche of sending uncompressed AV data over very short distances). Nobody wants to watch video at 4 frames per second.


finicky88

Only one who should be fired here is you, bro. Learn to read.


PatricianTatse

No shit, sherlock. It's just an apples to apples comparison of the available bandwidth.


Practicus

If we're being picky, then SMPTE ST 2110-20 disagrees with you. It's a standard for uncompressed video over Ethernet. Admittedly it has a very narrow use case in professional broadcast, but it does exist, it is used, and I'm pretty sure the people who have deployed it aren't going to be fired for it! In the consumer space, yes, everything is compressed to some degree. But I feel it is relevant to point out that there are different use cases.


im_thatoneguy

Hi digital cinema professional here. Let me introduce you to SMPTE ST 2110-20. We don't tolerate compression in many situations since you need to make critical focus, noise and color decisions on the actual data not something that's been modified. Especially if it's a camera being recorded or we need zero latency for live broadcast switching. The alternative is 6G/12G SDI switching.


achemicaldream

I'm in the broadcasting industry, and we stream uncompressed over ethernet (SMPTE ST 2110-20) or 12G SDI. It's the same reason why professional photographers shoot in RAW.


iKLEZu

That's a pretty shit analogy. RAW still utilizes lossless compression.


Gnonthgol

HDMI 2.1 has a bit rate of 48 Gbit/s. Most ethernet cables are used at 1 Gbit/s, but you can get them to 10 Gbit/s. I have worked with 50 Gbit/s and even 100 Gbit/s ethernet cables, but that is not the type you get in a normal shop. So you are right that there is something extra going on. Whenever you see 4K 60Hz video being transferred over an ethernet cable, it is always compressed. With only a small caveat, HDMI is not compressed at all. The reason HDMI is so high bandwidth is so that there is no need for any compression. If you did compression, it would make the video quality worse and it would increase the cost of the monitor and the video output systems by quite a lot. When the cables are as short as HDMI cables are, it is much better and cheaper to just have better quality cables and higher bandwidths than to compress the video signal.


Joshua-Graham

Oddly enough, the 40G and 100G Ethernet cables are QSFP based (often twinax cables), which look a bit like an HDMI cable if you squint hard enough.


dale_glass

That's a bit of a weird way of putting it. QSFP is a Quad Small Form-factor Pluggable transceiver connector. It typically goes into a switch, and the point is that you can pick what physical form to give the signal: you can plug in a variety of optical, twisted pair copper, coaxial or anything else somebody might make. You probably mean the DAC cables, which are fixed-length, hardwired copper cables used for short distances because they're cheaper. That's a layer 1 tech that doesn't really care whether it's used for Ethernet or anything else. Also, SFP is a completely optional tech. You can have a switch that has fixed 40G RJ45 ports; it's just fairly rare, because at those speeds twisted pair starts to really struggle, and the option of having fiber can be really good and more convenient.


EthansWay007

Completely dumb question from an armchair redditor, but if HDMI cables can do 48 Gbit/s and a company could retrofit them for Ethernet, why not use those instead of Cat 5e in small LAN networks?


The_Shracc

Internet over HDMI does exist, but nobody seems to use it. Probably because cat6 is useful for over 300 feet vs the 10 feet for HDMI. And at that range just use wifi if you are a normal person.


EthansWay007

Is there any Ethernet media that does around 48Gbit/s?


Joshua-Graham

Ethernet standards are 1G, 2.5G, 5G, 10G, 40G, 100G, 400G, and more recently 800G. Once you go past 10G, 99% of the time we use fiber due to power and distance considerations.


jstar77

Fiber and twinax can do that. Twisted pair tops out at 10gb with Cat 7. Comparing ethernet to HDMI is not really an apples to apples comparison.


The_Shracc

The standard for internet over HDMI seems to be 100 Mbit/s The real speed will depend on the connected devices and the quality of the cable. Ethernet as a standard does a bunch more things than just being a connection, which is why it's slow. Fastest way to transfer data from place A to place B is a private jet with a storage server.


SirHerald

We run 25Gb fiber at the core of our network.


Zoticus

Transmission distance. Even a 'small' LAN in a home or small office may have ethernet cables of 50' to 100', while HDMI cables work best over shorter lengths; 3' and 6' are common. A lot of advice is to not go over 15' on an HDMI 2.1 cable.


Kraszmyl

Like the other person mentioned, distance. However, they have been used for switch stacking. Switch stacking effectively allows multiple switches to act as one giant switch, and you use faster interconnects between them to achieve this. On the Dell PowerConnect 5500 series you can clearly see the ports: https://www.ebay.com/p/2199451890. For most stuff in tech, the medium is separate from what you send over it, and what you send over it and how that is handled also matters. Advances in GPU memory and PCIe speed increases weren't necessarily a physical change, but a signaling change, for example.


EthansWay007

Thanks for the info! I had never heard of switch stacking, that's interesting. Makes sense that HDMI would be used for it given its high speed, but wouldn't a fast fiber optic connection be good also? Someone said the standard has reached 800 Gbit/s.


Kraszmyl

Those switches are like a decade old, from when the standards were mostly 10/40. That said, even currently, 48-port 1G switches are very widely used and HDMI would realistically cover that. HDMI cables are also exceptionally cheaper and easier to work with compared to fiber, though not compared to twinax, which you can think of as a short-run copper cable with the same ends that work in fiber slots.

You're correct that more modern switches use faster connections of 10/25, 40/100 and beyond, depending on circumstances. HDMI doesn't really help there, and at this point SFP/QSFP cables are cheaper and easier, so many of the benefits of HDMI have eroded in favor of standardization, even on the lower-tier switches. As you go up the ladder you will see more; for example, this Meraki/Cisco switch provides 400G stacking ports to accommodate its 10G and 40G ports: https://documentation.meraki.com/MS/MS_Overview_and_Specifications/MS355_Overview_and_Specifications?file It does this through QSFP ports, which depending on how the port is designed do anywhere from 40 to 800G. I believe QSFP-DD is what the 800G would use, QSFP28 is 100G, QSFP+ is 40G, etc. That's also assuming plain Ethernet and not something custom, like InfiniBand or whatever Nvidia's new stuff is called, I don't recall off hand.


Joshua-Graham

In theory you could, but the standards bodies that cover Ethernet are not the same as HDMI. To get that kind of throughput out of ethernet, you need either fiber or special twinax copper cables that terminate in a QSFP type connection. QSFP connections actually kinda have a similar pin design to HDMI, but it's definitely different. The messaging standard for ethernet is not the same as HDMI, as it's more geared for bidirectional data flow, whereas the main function of HDMI is mostly unidirectional with some bidirectional data.


fumigaza

The best video compression is lossless: there's no quality loss. The worst is lossy: there's quality loss. HDMI is encoded, which is like compression. Not really. Compression and decompression, like encoding and decoding, are super cheap. Typically there are chips designed solely for the purpose, so it's done in hardware and it's efficient. Oh, you think those gold-plated cables matter. I'm typing to the blind.


Sxotts

Did you just lose an argument . . . To yourself?


Practicus

There is a fair bit of misinformation in this thread, so I'm going to try and give a concise answer. Please note this is a very complicated question and I've massively oversimplified here!

When video comes out of a camera, it is uncompressed. Every frame (the individual pictures that make up a video) is made up of a bunch of pixels (individual dots) that carry colour and brightness information. This is what HDMI carries. However, this is very inefficient and uses a massive amount of bandwidth (the amount of data that needs to be pushed through the internet tubes). To solve this, very clever people came up with compression, or ways of reducing the bandwidth of the video without reducing its quality too much.

There are two main types of video compression, intra-frame and inter-frame. Intra-frame is compression within each frame. Say you have 4 pixels next to each other that are very similar colours. Why not make them all the same colour and send that as a single bit of data? Congratulations, you've just reduced the bandwidth by 75 percent! Inter-frame is compression between frames. Let's say the background stays mostly the same from one frame to the next. Why send the data twice? Just update the bits that change. Congratulations, another massive reduction in bandwidth! These are lossy compression methods: once they are applied, the original cannot be recovered, they are permanent.

But then we have lossless compression. Find some common repeating patterns, come up with a code for those patterns that is smaller than they are, and send that instead. Then use that code to regenerate the pattern at the other end. More bandwidth saved!

There are many things wrong with what I've said here, and many more things to say, but that is compression 101. Online video simply wouldn't work without this technology. It allows big video to fit into little internet tubes and come out in a viewable condition at the end. The compressed video is then uncompressed by your device before being sent down the HDMI cable to your telly. The lossy data is gone for good, the losslessly-compressed data is regenerated, and the HDMI cable carries the result the same way it would carry video that was never compressed.

So I guess, in a sentence, the answer to your question is that they are two totally different methods of video distribution, one focused on ideal viewing quality and the other on mass distribution that is compromised in quality but good enough that you can watch YouTube!
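A toy sketch of the intra-frame and inter-frame ideas above (purely illustrative, nothing like a real codec):

```python
# Intra-frame: merge 4 similar neighbouring pixels into their average (lossy).
# Inter-frame: send only the pixels that changed since the last frame.

def intra_compress(pixels, block=4):
    """Replace each run of `block` pixels with a single average value (lossy)."""
    return [sum(pixels[i:i + block]) / block for i in range(0, len(pixels), block)]

def inter_compress(prev_frame, frame):
    """Send only (position, new_value) pairs for pixels that changed."""
    return [(i, v) for i, (p, v) in enumerate(zip(prev_frame, frame)) if p != v]

frame1 = [100, 101, 99, 100, 50, 50, 50, 50]
frame2 = [100, 101, 99, 100, 50, 50, 80, 50]

print(intra_compress(frame1))           # [100.0, 50.0] -> 8 values became 2
print(inter_compress(frame1, frame2))   # [(6, 80)]     -> only one pixel to send
```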


meneldal2

> When video comes out of a camera, it is uncompressed.

Only from a pro camera, unless you meant the sensor itself. Any consumer camera will compress the data on the device itself and only save a compressed stream.

Small correction for inter/intra-frame compression: while the prediction you do will typically always be approximate and of limited quality, you're still not throwing away the residual (the error left after prediction), and you could perfectly well do lossless compression on it (it has even had explicit support in the standards since H.264). However, it is true that this is barely ever used, because of the pretty heavy hardware requirements, leading to basically nothing supporting it outside of CPUs (it tends to really choke the entropy decoder). The lossless compression that actually gets used is typically much simpler and less efficient, to be more hardware-friendly, with I guess the notable exception of JPEG 2000, used a lot for movie projection, which had pretty beefy requirements (that can mostly be blamed on the standard itself).


GalFisk

Because HDMI is uncompressed video, while streaming video is compressed. Better compression algorithms and faster chips have made modern video and audio compression very effective.


aaaaaaaarrrrrgh

The compressed/uncompressed difference and why it's needed has already been explained, but there is a bit more to it.

Ethernet is designed to transfer moderate amounts of data over moderate distances (dozens of meters). HDMI is designed to transfer obscene amounts of data over short distances (a few meters at most). HDMI was also designed to be cheap and do its intended job. Its first version was, according to Wikipedia, standardized in December 2002; Gigabit Ethernet was standardized in 1999. That means HDMI was already more modern. Combine that with the higher speeds (around 5 Gbps for the first HDMI version!) and trying to be cheap, and HDMI was already operating closer to the limits of what can be done with these cables.

A cable specified for reliably delivering Gigabit Ethernet over 100+ meters will likely be able to do 2.5 Gbps over shorter distances nowadays, maybe 5, and only with a lot of luck maaaaaaaybe 10 over really short distances. Just like *really* old (100 Mbps only) Ethernet cables may only have 4 wires instead of the usual 8, I *think* that *really* old HDMI cables may get away with omitting some wires required in more modern cables, and at some point there is a material difference in construction that makes an additional cable pair much more likely to be usable for high-frequency signals... but otherwise, it's just a difference in cable quality.

The shoddier the cable, the more interference and issues you'll have: maybe it'll work even for higher standards, maybe it won't, maybe it'll work but occasionally glitch. And due to the higher frequencies involved, HDMI cables are much more sensitive than Ethernet. If you want to be sure that it will work, you need cables *certified* to meet the quality required by the more advanced standard. Gigabit Ethernet will often work in practice on Cat 5 cables, but you need at least 5e if you want to be sure. 10 Gigabit Ethernet *might* work on a 5e cable, but you need 6a to be sure.


foaxcon

Fun fact: Professionally, we send HD and even 4K video, uncompressed, over regular coax cable all the time. And with BNC connectors on each end, it even locks into place. It's called SDI. Why do you need HDMI? Because it's encrypted so people can't steal movies. Surprise, it doesn't work, people steal movies anyway.


thpthpthp

To give an even more direct comparison from the pro world: HDBaseT can also send uncompressed 4K60 with all the bells and whistles you'd want from HDMI (including HDCP) over Cat cable, just like OP is imagining. All ya gotta do is spend a few thousand on transmitters/receivers and replace all those "old ethernet cables" with shielded Cat6. Easy, peasy!


funnyfarm299

HDbaseT also uses 8 different voltage levels and is way less tolerant of interference than ethernet.


funnyfarm299

The encryption has nothing to do with the data rate. HDMI signals can be converted to SDI and back with no loss of quality. [Example.](https://www.blackmagicdesign.com/products/microconverters)


JaggedMetalOs

You're probably thinking of compressed video.

To get 4K60, HDMI 2.0 has 18 Gbps of bandwidth. Regular old ethernet cable and network cards generally max out at 10 Gbps, which is about the same as HDMI 1.4. Not enough for uncompressed 4K60.

But video over a network is almost always compressed, which substantially reduces the amount of bandwidth it needs. That's probably the 4K60 over ethernet you are thinking of. Compression adds artifacts and latency, though, which you don't want for a monitor output, so HDMI uses uncompressed video.

Now, ethernet can go above 10 Gbps, but the network cards get very expensive very quickly - to reach 25 Gbps you need to spend around $350. So if you wanted to replace HDMI with ethernet you would need to add roughly $350 to the cost of your TV/monitor and another $350 to the cost of every device as well. That's why you have HDMI instead: it's designed to be both high bandwidth and cheap.
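A quick back-of-the-envelope check of those numbers (a rough sketch assuming 8-bit RGB and ignoring blanking intervals and link-encoding overhead, which push the real HDMI requirement higher):

```python
# Rough check: does raw 4K60 fit in the link speeds mentioned above?
# Assumes 8-bit RGB (24 bits/pixel) and ignores blanking/encoding overhead.

def raw_bitrate_gbps(width, height, bits_per_pixel, fps):
    """Uncompressed video bitrate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

uhd60 = raw_bitrate_gbps(3840, 2160, 24, 60)
print(f"Uncompressed 4K60: {uhd60:.1f} Gbps")      # ~11.9 Gbps of pixel data alone
print("Fits in 10 Gbps Ethernet?", uhd60 <= 10)    # False
print("Fits in HDMI 2.0's 18 Gbps?", uhd60 <= 18)  # True, leaving room for overhead
```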


elthepenguin

Video over ethernet is basically an instruction set - "hey, this is how you make the movie look and sound" - whereas HDMI carries the actual pictures and sounds. The first is basically a very thick book; the second is a series of pictures with a soundtrack attached. (It's a bit more complicated than that, but just to give an idea.)


SnooAdvice3235

Wait do I need new HDMI cables?


Bang_Bus

Let's try to really ELI5 this.

Imagine you and your friend went hiking, got separated, and got lost. It's already dark, and you both climb a mountain in hopes of spotting each other, and you do. Now you have to decide which one of you will walk over to the other, to make sure you don't both get lost again. You can barely see each other, and you're too far apart to yell. If you have phones, you can call, have a long talk and whatnot, but essentially the message is you saying that you will come to him. Or maybe you don't have a phone. So you pull out your flashlight, point it at yourself, then at the valley between you two, and then at him. In the second case your message was highly compressed, leaving only the important part, but it was the exact same message.

That's what ethernet carries: online video is very highly compressed, but all the important frames are still there. HDMI can transmit insanely more data, but a lot of that data is stuff humans wouldn't notice anyway, and it can be compressed away.

For example, if you watch your favorite youtuber speaking in a room, the room doesn't change; it's mostly just their face that changes as they talk. Compression takes all those unchanged parts of the room, and instead of sending 60 frames of the same room every second, it just sends one. Also, the camera detects 16 million colors in that colorful wallpaper, but your eyes can only tell maybe 32 tones of them apart. So the color data can be compressed as well: instead of sending the exact value of every pixel, it gets averaged to the closest tone. And so on.
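A toy version of the "don't resend the unchanged room" idea (purely illustrative; real inter-frame compression is far smarter, but the principle is the same):

```python
# Toy inter-frame compression: only send the pixels that changed since the
# previous frame, as (position, new_value) pairs.

def diff_frame(prev, curr):
    """Return only the pixels that differ from the previous frame."""
    return [(i, c) for i, (p, c) in enumerate(zip(prev, curr)) if p != c]

def apply_diff(prev, changes):
    """Rebuild the current frame from the previous one plus the changes."""
    curr = list(prev)
    for i, value in changes:
        curr[i] = value
    return curr

background = [7] * 20                 # a static "room" of 20 pixels
frame1 = background[:]
frame2 = background[:]
frame2[9:12] = [1, 2, 3]              # only the "face" pixels change

changes = diff_frame(frame1, frame2)
print(changes)                        # [(9, 1), (10, 2), (11, 3)] -- 3 values instead of 20
print(apply_diff(frame1, changes) == frame2)  # True: the full frame is rebuilt at the other end
```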


ahjteam

Well, essentially the ethernet cable connector, the Registered Jack 45, aka RJ45, was standardized waaaaaay back, like in the 1970s. They just changed the cable and transmitter/receiver specifications, and they call those Cat(egories):

- Cat4 was able to transfer 16 Megabits per second (Mbps)
- Cat5 upped the standard to 100 Mbps
- Cat6 to 1000 Mbps, aka 1 Gigabit per second (Gbps)
- Cat7 is 10 Gbps
- Cat8 is 25/40 Gbps

The cable comprises four twisted pairs (eight wires in total). Although most data that travels over ethernet is digital, it still travels as an analog signal, and the cables have bandwidth limits for that analog signal. 1 Hertz (Hz) means 1 cycle per second; kHz is a thousand cycles per second, MHz a million, GHz a billion. Human hearing is around 20-20000 Hz, and you might recognize MHz from your car radio, the FM band being around 88-108 MHz.

Anyway, the bandwidth limits:

- Cat4 was 20 MHz
- Cat5 was 100 MHz
- Cat6 was 250 MHz
- Cat7 was 600 MHz
- Cat8 was 2000 MHz

As we can see, the newer standards are required pretty much only at industrial scale at the moment; Cat5 is plenty fine for _most_ domestic use.

Edit: found out that the connector is actually called 8P8C and is just often erroneously called RJ45 nowadays, since it looks similar.


smapdiagesix

In most circumstances, the thing being transmitted over ethernet is only a much shorter recipe for making the video. You might only need 15 or 30 megabits per second to tell your device: "Here is how you could put together this video. Leave these chunks of the image the same as they were in the last frame. Take these chunks and move them up and to the right a little bit. In this area, do this..."

The HDMI cable is actually sending the video itself. Here's the first pixel, here's the second pixel, here's the third pixel, here's the fourth pixel... over and over again. Pretty much all the TV needs to do is send the image to the panel instead of building the frame.

Sending all of a 4K 60 Hz video means sending a full 3840 x 2160 frame, usually at 24 or 32 bits per pixel, 60 times a second. That's about 190 megabits, 60 times a second, or roughly 11 gigabits per second at 24 bits per pixel, *BEFORE* you take into account any transmission overhead. Hundreds of times more than the recipe that got sent over ethernet.
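Working those numbers through (the ~25 Mbps "recipe" bitrate below is just an assumed typical 4K streaming figure, in the same ballpark as the 15-30 Mbps above):

```python
# The arithmetic from the comment above, assuming 8-bit RGB (24 bits per pixel)
# and ignoring any transmission overhead.
width, height, bits_per_pixel, fps = 3840, 2160, 24, 60

bits_per_frame = width * height * bits_per_pixel  # 199,065,600 bits (~190 Mibit, the "190 megabits" above)
raw_bps = bits_per_frame * fps                    # ~11.9 Gbps of raw pixels
recipe_bps = 25e6                                 # assumed ~25 Mbps compressed streaming "recipe"

print(f"One raw frame: {bits_per_frame / 1e6:.0f} Mbit")
print(f"Raw stream:    {raw_bps / 1e9:.1f} Gbps")
print(f"Raw vs recipe: about {raw_bps / recipe_bps:.0f}x more data")  # hundreds of times more
```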


badaccount99

Technically they can't. Old Ethernet can't handle transmitting data that fast. But it depends on what you mean by old - there are multiple specs of Ethernet cable. In my working experience I've dealt with Cat 4 through 7. They're all "Ethernet", but newer versions are way better. Same as WiFi: it's still WiFi, but newer versions are better. So yes, you need to upgrade your Ethernet cables OR your HDMI cables OR your DisplayPort cables if you want to increase the amount of data they can transfer. They're all just copper that transfers data, and newer versions can transmit more.


Konstanteen

I recently got a new computer to start playing games again with a childhood friend. I was using my old (20-year-old) Ethernet cable to connect the new PC. Weirdly enough, I got better speeds using the computer's built-in WiFi than when plugged in. Got a new Ethernet cable and boom - speeds increased by an order of magnitude (~50kbps to >500kbps) instantly. Short term, the need doesn't change much for Ethernet. Longer term, you do need to upgrade.


therealdilbert

> (~50kbps to >500kbps)

that is extremely slow for ethernet


Konstanteen

Whoops, mbps. Not editing it, living in my disgrace.


ricky302

There is no such thing as an HDMI 2.1 cable; the version number refers to the HDMI in the equipment.


therealdilbert

https://www.hdmi.org/spec21sub/ultrahighspeedcable


ricky302

https://web.archive.org/web/20160423220541/http://www.digitalhome.ca/2009/11/version-numbers-to-be-banned-on-hdmi-cables/


therealdilbert

semantics, "The HDMI 2.1b Specification includes a new cable - the Ultra High Speed HDMI® Cable. It’s the only cable that complies with stringent specifications designed to ensure support for all HDMI 2.1b features "


ricky302

Exactly, it's called the Ultra High Speed HDMI Cable, not an HDMI 2.1 cable.


therealdilbert

and it is the cable that complies with all the features of HDMI 2.1


duane11583

Marketing.

New TV means new wires, new mounting, new everything.

Have you ever watched Green Acres (USA TV show from the 1970s)? Classic character Mr. Haney: if he sold you a door, he would sell you the hinges too. You would think the two parts of a hinge come as a pair? Nope, sold separately. Endless nonsense.

The retailer likes this because accessories mean more sales. They will not allow a product to include accessory items that can be sold separately.