
Iyellkhan t1_j1wnwiw wrote

Unfortunately the answer is that it depends. IMO a 4K Blu-ray is always worth it when it's the most accurate reproduction of the film available at the time of release. Part of this is because it has a MUCH higher bitrate, so you'll be getting more of the original picture out of it. If you have a high-end HDR setup, I also think that's worth it. Normal Blu-rays have to fit into the rec709 signal standard, which only gives you around 5 stops of latitude/dynamic range. An HDR version can have 12+ stops of dynamic range and more closely approximate the film print or theatrical DCP.
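
If it helps to put numbers on that: each stop is a doubling of light, so the gap between roughly 5 stops and 12 stops is huge in contrast-range terms. A quick back-of-the-envelope sketch (illustrative only, not exact display specs):

    # Each stop doubles the light, so stop count -> rough contrast ratio.
    # Purely illustrative; real results depend on the grade and the display.
    for label, stops in [("rec709 / SDR grade", 5), ("HDR grade", 12)]:
        print(f"{label}: ~{2 ** stops}:1 contrast range")
    # rec709 / SDR grade: ~32:1 contrast range
    # HDR grade: ~4096:1 contrast range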

Now, where it gets wonky is that on some Blu-rays (1080p and 4K), especially ones mastered well after the original release, the color grade gets "updated." The new Hurt Locker "master" is now greener than its first release. The new editions of the LOTR movies have been re-graded and de-noised to look more modern. Granted, in that case it was the director making the changes, but it is a certifiably different experience than what you got originally. Blu-rays released closest to the theatrical release tend to be what we call a "trim pass," i.e. the original color off the print or DCP reproduced as faithfully as possible in the 5-stop rec709 color space. The only time you don't have to worry about this is with a Criterion release; they're pretty nuts about preserving the original experience.

That all being said, 1080p Blu-rays used to be (and I think still are) 8-bit color depth, whereas 4K is 10-bit (or at least 10-bit capable). It's the difference between being able to encode millions of colors vs billions of colors.
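
The "millions vs billions" falls straight out of the bit depth: three channels at 8 bits vs three channels at 10 bits. A quick sketch of the arithmetic:

    # Colors representable per pixel at 8-bit vs 10-bit per channel (3 channels).
    colors_8bit = 2 ** (8 * 3)    # 16,777,216  -> ~16.7 million
    colors_10bit = 2 ** (10 * 3)  # 1,073,741,824 -> ~1.07 billion
    print(f"8-bit:  {colors_8bit:,} colors")
    print(f"10-bit: {colors_10bit:,} colors")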

21

Iyellkhan t1_iuj180r wrote

Honestly, it probably depends on the post production house that prepared them. Everything has been scanned or copied to digital at this point. Sometimes that means the film was rescanned (if it still exists); sometimes it's just an NTSC master upscaled. Your black and white points may be at the whim of a junior color artist, which will partially determine the saturation and sharpness.

3

Iyellkhan t1_iu69vsi wrote

Generally speaking, if you get drift over the course of a single take, either the camera or the sound recorder needs repair. That, or someone screwed up the base timecode settings (say the camera is set to 24fps for theatrical but the sound recorder is set to 23.976 for TV), but if that happens on a professional set, boy is someone getting yelled at. Quite possibly fired.
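
To put a number on that mismatch: 23.976 is 24/1.001, so the two clocks disagree by about 0.1%, which is already a full frame of slip in well under a minute. A rough sketch of the arithmetic (illustrative only; real drift also depends on how clean each clock is):

    # Drift from running one device at 24 fps and the other at 23.976 (= 24/1.001).
    mismatch = 1 - (24 / 1.001) / 24      # ~0.001 -> about 0.1% rate error
    frame = 1 / 24                        # one frame at 24 fps, in seconds
    print(f"~1 frame of drift every {frame / mismatch:.0f} seconds")   # ~42 s
    print(f"~{3600 * mismatch:.1f} seconds of drift per hour")         # ~3.6 s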

2

Iyellkhan t1_iu69bnv wrote

You usually do it at the end if the talent is really in the zone and you want to go again immediately, or you're just grabbing a small piece (otherwise known as a pickup). You'll also sometimes do it at the end if the scene is especially intense for the actors and you don't want that loud clap sound fucking up their state of mind.

1

Iyellkhan t1_iu68n8y wrote

They are, but the cheap ones don't last long, whereas a $100 one will tend to last years.

There are also fancier ones, around $1,500, that can display timecode numbers jam-synced to the timecode on the sound recorder. When the clapper drops, it freezes the TC (and shows some other code info), which helps automate the sync process. It also lets you set your clip timecode to match what's on the screen; not a huge deal with digital cameras running wireless timecode, but a big deal if you're shooting film, where there is no metadata.
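
For what it's worth, the sync math itself is simple once the slate hands you a frozen timecode. A toy sketch, assuming 24 fps non-drop-frame TC and made-up numbers (not any particular tool's workflow):

    # Convert a frozen slate timecode into a frame offset so the clap in the
    # audio can be lined up against the picture. 24 fps, non-drop-frame assumed.
    def tc_to_frames(tc, fps=24):
        hh, mm, ss, ff = (int(x) for x in tc.split(":"))
        return ((hh * 60 + mm) * 60 + ss) * fps + ff

    slate_tc = "01:02:03:12"        # what froze on the slate at the clap (made up)
    audio_start_tc = "01:01:58:00"  # TC at the head of the audio file (made up)
    offset = tc_to_frames(slate_tc) - tc_to_frames(audio_start_tc)
    print(f"the clap sits {offset} frames into the audio clip")  # 132 frames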

2

Iyellkhan t1_iu67taj wrote

It's probably due to not having a dedicated crystal oscillator. One would think the computer's own clock could hold it in sync, but I suspect that with bad software processing, things sometimes don't hold.

1

Iyellkhan t1_iu676ry wrote

This was also an old technique with 16mm documentaries and, to a lesser extent, 8mm. There was briefly a period where you could get 16mm and 8mm film with magnetic sound stripes that could survive the chemical bath and hold sync, but that died out when home video took over.

2