High framerates give the appearance of, and in some cases actually deliver, substantially better images, significantly reducing various temporal aliasing effects. From a technical point of view they are without doubt better in every way.
There is, however, a cultural issue, worst amongst film creators, that "film" is regarded as expensive, high-end and good, so that things lacking its artefacts (low frame rate, film grain and colour) are regarded with at least suspicion. This will probably pass with time, especially as the number of high-budget, high-quality TV series increases.
I feel like I'll go mental if I hear one more film buff talk about the essential nature of various imperfections in traditional film. It's so obvious that nobody would have chosen to go with film grain, low frame rates, or any of the other limitations that were foisted upon them if they had had a choice. But now these technical limitations get enshrined as the medium's supposed true nature, at least by some...
As in so many creative endeavours it's the limitations which make the art. It's not always about going for perfection, but it always has to be emotional.
There's no question that the latter is more technically accurate, more lifelike, more realistic, but it lacks the gut punch of the Monet. Monet was a master, but only by exploiting the limitations of his medium could he attain that mastery.
So it goes with film. A decent cinematographer uses the inherent faults of film to convey emotion. Without the low dynamic range of film, the dark corners of the Nostromo in Alien would have looked like a plywood movie set. Without lens flare and blown highlights, the plight of a dehydrated hero in the desert would be much harder to get across. Those things - 24fps, lens flare, low range, depth of field - have become part of our shared culture now. So much so that even media which aim for perfect realism (eg video games) mimic some of them to aid immersion and, once again, heighten the emotional response.
It's one thing to creatively take advantage of limitations to make art better. It's quite another to insist that an arbitrary limitation must be present on every work, regardless of whether it's good or not.
If 24fps is somehow good for certain types of film, then by all means, keep using it. But it's silly to say that 24fps is universally better, as I've heard some people say.
Your point is well taken, but just as a minor nit, I wanted to add that artists did achieve photorealism first and then moved on to abstraction. Arguably, that was driven by the advent of photography, which commoditized photorealism.
Fair point. Impressionism was in no small part a reaction to the overly stuffy and formal schools of realistic painting of the time. You've made me think: I wonder how much photography contributed to their frustration with that academic painting style?
But still, painting is painting. Take the Dutch Old Masters: incredibly realistic, lifelike pictures, but also extraordinarily powerful. One doesn't preclude the other, but in every case something about the medium contributes to its power. It might be the fact that a sitter for one of Caravaggio's Christs was actually suffering due to holding his body in place for so long; it could be a need to invent some aspect of light in a scene which ends up illuminating a girl's face in a particularly lovely way.
There's plenty of emotion in photography, too, but it tends to come as a result of skilful use of that medium's own characteristics: spontaneity, completeness, and presence. Press photographs are a great example, exploiting the medium's immediacy to steal a few shocking milliseconds of reality. That applies even to powerful landscape photographs, in the opposite way: they are very carefully staged, manipulated and contrived, all simply a way to align the limitations of the medium (only 1/500th of a second to make an image) with a particularly beautiful instant of passing space-time.
The second one, no question about it. It makes me think of storms and fog and winter.
The first one made me think of cartoon depictions of the evil lair and children's drawings. It also looked like those ink-blot psychological tests where you can see anything you want in the picture. (Which is presumably why you like it better.)
I'm a bit of a film buff; by that I mean I like shooting on film and I own 16mm film cameras. (I shoot most of my work in digital though, so I don't consider myself all that snobby about it.)
To me some arguments for film hold up, but are rapidly becoming less relevant as digital sensors improve. The biggest case for film to me is the dynamic range (in film terms, "latitude") compared to digital. Film is currently still better at retaining detail in high-contrast shots (for example a person dimly lit against a bright sky). And when film does "blow out" it has a tendency to be a smooth curve, and even the over-exposed parts of the frame that may be fully white still have some texture. Whereas digital has a harsh drop-off where the information is just gone, and it has a rather ugly look.
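To make the highlight behaviour concrete, here's a toy sketch (purely illustrative, not any real camera's measured response): a linear "digital" sensor hard-clips at full scale, while a film-like shoulder compresses highlights smoothly, preserving some differentiation even in near-white regions.

```python
# Toy illustration only -- not a model of any real sensor or film stock.

def digital_clip(exposure):
    """Linear response that clips: all detail above 1.0 is simply gone."""
    return min(exposure, 1.0)

def film_shoulder(exposure):
    """Film-like curve: approaches white asymptotically, so even very
    bright regions stay slightly differentiated ("texture" in the whites)."""
    return 1.0 - 2.0 ** (-exposure)

for ev in [0.5, 1.0, 2.0, 4.0, 8.0]:
    print(f"exposure {ev:4.1f}: digital={digital_clip(ev):.3f}  film={film_shoulder(ev):.3f}")
```

Past an exposure of 1.0 the clipped response is a constant 1.000, while the shoulder keeps inching toward white without ever flattening completely.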
Newer camera sensors are starting to get very, very close to film in terms of latitude. But for the moment even the most expensive Hollywood level cameras are not quite as good. I expect this to change and eventually film won't have this advantage anymore.
Film does tend to have a pleasing, organic look. But plugins are getting very good at emulating the good parts of that without all of the hassles of film. I liken it to audio recording, which went through a similar analog-vs-digital revolution in the '90s. There are still people who choose analog for artistic reasons, but digital is accepted as the primary recording medium now.
As for 48fps, though, that's irrelevant in the digital-vs-film argument because both types of cameras can shoot at either frame rate.
Latitude is something I am more interested in too. But once I tried the Sony SRW-9000PL I was blown away. A full 12 stops, the image (straight from the camera) looked like it was hand-delivered from heaven without any lighting used (I had it only for a test), and even the digital noise was pleasing, film-like, in low-light situations. I can only imagine what the F-65 does if the 9k did that. Hell, even RED, post-MX sensor, looks really great.
Film has a charm of its own though (even just watching rushes projected vs. on a monitor), but from a practical standpoint it's dead and gone. When you factor in the cost of stock alone for a 90-minute film (with a standard coverage of 20:1 or 30:1), you pay as much as for a new digital camera. Not to mention developing, scanning, handling, storing it...
Yeah, it seems like in the last year or two sensors are finally starting to bring latitude very close to film. I've heard film is 13-14 stops and digital cameras are just starting to hit near the 13-stop level. It's getting to the point where I can't even tell sometimes, and I consider myself pretty well tuned in to that!
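For anyone unfamiliar with "stops": each stop is a doubling of light, so a dynamic range of N stops corresponds to a contrast ratio of 2^N : 1. A quick sketch of the arithmetic:

```python
import math

def stops_to_ratio(stops):
    """Each stop doubles the light, so N stops = a 2**N : 1 contrast ratio."""
    return 2 ** stops

def ratio_to_stops(ratio):
    """Inverse: how many doublings fit between darkest and brightest."""
    return math.log2(ratio)

print(stops_to_ratio(13))     # 8192 -- a 13-stop camera spans ~8192:1
print(ratio_to_stops(16384))  # 14.0
```

So the jump from 13 to 14 stops is not a small increment: it doubles the brightest-to-darkest ratio the sensor can hold.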
I think it's a pretty exciting time actually in camera technology. I think film will still be an artistic choice for a while. Directors like Spielberg and Tarantino have claimed they will never stop shooting on film. But, I heard that same kind of stuff about audio back in the 90s, so we'll see!
I'm also one of those guys waiting on my pre-order of a Blackmagic Cinema Camera, which claims 13 stops, so I'm pretty excited to get my hands on it!
I'm perfectly fine with preferring film for things that film is better at. Stuff like the dynamic range you mention is an excellent example of this. I'm not sure if digital has caught up in terms of raw resolution, either. Arguments like this make a ton of sense: if film is actually better for something, then that's a reason to prefer it.
I only have problems with weird arguments where people prefer stuff that's obviously worse, like the aforementioned grain and low frame rates.
Hmm? There are all sorts of "noise" that are generally considered a positive thing, at least in certain contexts. After all, the end goal in most cases isn't perfect information capture, it's communicating emotion/feeling/message/etc, and "degrading" an image can add something (just not the original thing).
Of course the correlation between certain types of noise/distortion and the resulting interpretation is in many cases culturally determined, and that changes over time... One minute shakycam is considered the height of immersion, the next, it's an embarrassing affectation.
But still, there's nothing inherently wrong with preferring something "worse," especially when the latter judgment is made on a narrow technical basis which misses the larger picture.
Film is more than just the quality of the image - it's about the mindset with which you approach shooting. If you're shooting on film you'll be lucky if you can have more than a 3:1 overrun. On digital it's much higher... you can afford to shoot more because you're not paying processing costs. Directors working with film shoot in a very different way to digital, and this does affect the final product: I know not because I'm a film buff, but because I used to crew on shoots with both formats.
I love digital, I love film. I'd like directors to have a choice, but I suspect we'll see studios effectively enforcing digital for all but the most influential directors going forward.
Of course. Anything of this nature can be used to good effect, but they should never be imposed on everything just because people are used to inferior technology.
So, yes, increased framerates do make things like aliasing concerns -- long the bane of many an action movie director's existence -- go away. However, there is a concrete and visceral difference between 24fps and 30/48 that is impossible to ignore (unlike, say, vinyl vs. digital, which requires an expert and a dollop of wishful thinking to recognize). This is not something that only the snooty directors and film buffs will notice -- everyone (and I mean everyone) in the theater will comment on it.
Now, perhaps we all as a species just need to adjust to the new normal. We'll see. Personally, I find that 24fps adds a sense of larger-than-lifeness that I find really attractive. Maybe I'll just get over it, but I suspect that I'll miss 24fps for quite a while.
I thought that was an actual issue dealing with "loudness" and the dynamic range of the sound. I.e., newer media is mastered so loud that parts clip off before you can hear the softer bits in it. I could be way off though, I'm not really an audio buff in that sense...
It was more a case of hard clipping vs. soft clipping. You could get tube (valve) amplifiers swinging high-tension volts way over spec, at the cost of lots of (mostly third-harmonic) distortion, by providing things like DC amplifiers. Semiconductor amplifiers would simply turn into a digital switch with enough current. With fast enough current they're more or less flip-flops.
Normalising within 16 bits has led to loudness wars. That's the medium. Analogue chains all overflow towards a chaotic end. Some people prefer that chain of events.
You're confusing a couple of things here. DR is substantially increased with digital recording; so is SNR. Behaviour at the upper end of the DR differs between digital and analog media. Whereas tape, for instance, saturates (which may induce pleasant distortion!), digital audio clips (which induces headaches). However, that doesn't really matter in practice if the audio engineer in question is worth his salt. Levels at every point of the digital chain need to be adjusted accordingly: the old strategy of keeping everything close-to-maximum doesn't work anymore. One keeps everything well below clipping point instead.
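A rough sketch of the distinction (the `tanh` curve here is just a generic stand-in for tape saturation, not a model of any particular machine): digital audio is perfectly linear right up to full scale (0 dBFS) and then flat-tops, whereas tape squashes the peaks gradually, which is why engineers now leave generous headroom instead of mixing everything hot.

```python
import math

FULL_SCALE = 1.0  # digital full scale (0 dBFS); anything above hard-clips

def dbfs(amplitude):
    """Signal level relative to digital full scale, in dB."""
    return 20 * math.log10(amplitude / FULL_SCALE)

def hard_clip(sample):
    """What a digital converter does past full scale: flat-tops the wave."""
    return max(-FULL_SCALE, min(FULL_SCALE, sample))

def soft_saturate(sample):
    """Very rough stand-in for tape saturation: gentle squash near the top."""
    return math.tanh(sample)

# Tracking around -18 dBFS leaves ~18 dB of headroom before clipping:
print(round(dbfs(0.125), 1))                        # -18.1
print(hard_clip(1.5), round(soft_saturate(1.5), 3))  # 1.0 vs ~0.905
```

The hard-clipped sample is an abrupt flat line; the saturated one still follows the waveform's shape, which is the "pleasant distortion" the comment above mentions.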
Overeager compression (specifically, limiting) is a whole different problem, especially with modern pop production.
"Loudness" is an orthogonal debate revolving around mixing (pop music). There's nothing about the various media that requires or prohibits one from mixing 'loud'.
vinyl vs CD, tube vs solid-state/digital -- those are more similar to film's 24fps vs higher frame rates: straightforward arguments against higher-fidelity technologies because we've grown emotionally attached to the particular artifacts of legacy technologies.
Someone in the industry told me it felt like they were watching actors in a stage show, and not in a good way. All sense of immersion and suspension of disbelief was eroded away into watching people in costumes fight on fake sets against CG monsters. He said with such a high frame rate it's almost like live footage has reached the uncanny valley. It seems quite on par with the statement from Entertainment Weekly.
Is this uncanny valley 1) a fundamental nature of the higher frame rates, 2) a temporary phenomenon of 24fps-acclimatized eyes viewing 48fps, or 3) a temporary phenomenon of fps technology getting ahead of costume/CG/etc technology that looks real at those rates yet? Or (I'm guessing) do we not really know yet?
This is exactly right. I don't understand why some people are jumping up and down as though this is some massive technological and artistic advance. TV has been able to do 60fps for decades (and HD, digital 60fps for at least one decade) and yet no television dramas (and even most comedies) are filmed/broadcast at that rate. Why? Exactly the reasons you describe.
High framerates are great for live events and reality-based programming. 24-30fps is suited to taking the viewer "out of reality" into an artistically constructed world. This is not going to change because some movie theaters get 48fps projectors.
Indeed, Douglas Trumbull invented Showscan in the early '80s. It was a high-FPS 65mm film process, and it resulted in an overly "real" subjective effect, very much the one this article is complaining about. Showscan was demoed at some conferences and some critics (Ebert?) were big fans, but it was never used for a feature film (although he tried to get the studio to use it for his film Brainstorm). It was positioned against IMAX and lost (although to my eye, IMAX suffers greatly from strobing and could use a frame-rate boost).
But there's something else going on with the Hobbit. I watched the trailer, which is definitely not 48 fps, and more than once I got the feeling of watching an actor-on-a-stage-on-video. This might be more related to the use of RED cameras than the framerate.
Using my film degree about once a year here on HN!
My guess would be that the ad network they are using doesn't give anything for German views/hits. So they are saving on traffic costs.
If it were the case that they have some local brand (as is the case when you can't buy music/games/etc online), it would probably redirect somewhere else, as Google does.
For some reason I found myself at an IATSE event. IATSE is a union for motion picture professionals. At the end, they projected a scene, once in 24fps, the second time in 48fps.
48fps looked "real" to my eyes, insomuch as it looked how I imagine it would've looked to have been in the room.
Unfortunately, that's not the look I like in films and movies. It truly looked like a soap opera.
There's a chance that 30 minutes into the Hobbit, people's eyes will acclimate and you'll forget, but I think those initial scenes are going to be a bit of a disappointment.
My friend saw the 10 minute Hobbit footage and said the exact same thing.
He felt like he was looking at a play of the Hobbit--watching costumed people walk around a set. He couldn't get it to shift into that "watching a movie" place in his mind.
But you're right--it might be that 10 minutes is just not long enough to adjust. I'm skeptical but I do want to try it.
I half wonder if PJ has just spent so much time seeing his movies as they are being made that he's frustrated audiences don't see them -that way-... maybe on-set is normal for him, and this is his way of getting the audience to see "normally"...
48p and 60p can't come too soon. I'm with Ebert on this one. The biggest problem with film is we're stuck with 1920s frame rates that end up destroying a lot of otherwise impressive shots. It's just too jerky under a variety of common circumstances.
I agree. I'd take framerate over resolution, and beyond a doubt I'd take framerate over 3D (though when I tried 3D, it in effect just looked like three layers parallaxing).
A lot of "3D" films are basically parallax because they add in the 3D after the fact, painting it on in a special compositing tool. Only a few live action films are genuinely shot in 3D from the start, like Avatar.
I've found that computer animated films often translate better to 3D since, obviously, it doesn't require any fancy cameras.
60fps isn't even that good in video games, but so many gamers these days have been raised on 60Hz consoles and PC monitors. They don't remember the old days of CRTs, where 60Hz was painfully flickery, and you wanted at least 75Hz to relieve eye strain. 75 or 85Hz with corresponding fps was much slicker than 60Hz.
IMO the purists complaining about 48Hz are complaining about a subconscious association, perhaps a bit like the smell of popcorn in the lobby. It's not an objective complaint, but as a subjective one it may be shared by a large proportion of the audience. Hard to tell until someone takes the risk with a broad audience, like Jackson is doing here.
As micampe said, you're mixing refresh rates with frame rates, but actually you aren't entirely wrong, because I remember the heady days of upgrading my graphics card and having Counter-Strike hit 100-120 frames per second on my 120 Hz CRT monitor, and the visual difference was very noticeable vs. when I was back in 60fps zone.
The main visual difference was when rapidly moving the mouse; turning quickly was much more visually appealing at a high frame rate.
I meant that even though you appeared to be confusing refresh rate with frame rate, and certainly higher refresh rates made things look a lot better on old CRTs, higher frame rates than the max 60 fps possible on today's average LCDs* actually make a difference as well.
*(only "3D-capable" LCDs run at 120 Hz or 240 Hz and are therefore capable of actually showing more than 60 frames per second).
The reason a CRT seems to flicker at low refresh rates is that the screen goes dark between beam traces; an LCD does not. So a static image on an LCD with a 60 Hz refresh rate is stable, whereas a CRT would flicker at that rate.
It's much more than that; the flickery 60Hz CRT is only incidental, it doesn't relate to the perception of smoothness. Like a sibling comment to yours says, it's very noticeable with the mouse, and even more so when moving around a window on the desktop. It looks like crappy stop-motion animation compared to what it can be at higher frame rates.
Just how old are your lights? Fluorescent lights haven't flickered at line frequency in at least a decade.
Well, obviously old fixtures still exist, but if you have one in your office it's time to get rid of it. The new ones are much more energy efficient, but more importantly they are more pleasing to the eye.
I'll reserve judgement. Technical advances are of course an integral part of film history, but the idea that absolute fidelity to some Platonic ideal of an image is an end in and of itself is purest bafflegab. You don't get to throw away the existing language of film just because some consumer electronics consortium wants to sell new TV sets (see: stereo projection).
It doesn't help the case that the last time this subject came up with a big tentpole release it was Dinosaurs Fighting Helicopters.
> "the idea that absolute fidelity to some Platonic ideal of an image is an end in and of itself is purest bafflegab"
Particularly in artistic media, where emotion and tone are regularly conveyed through distortions.
48/60p may be great for some projects. Their technical advantages may enable shots that simply can't be done well at 24fps. But even that does not make them inherently "better". Merely the right tool for a given job.
As there's no technological requirement for us to use either one format or the other, I see absolutely no reason we can't go forward allowing people to choose technology on an as-appropriate basis without slandering particular choices as illegitimate in all cases.
I'll be seeing The Hobbit at a theatre that doesn't feature 48fps first, because I know I'll enjoy it.
My experience with 48fps is exactly the same as my experience with those 120Hz televisions -- any significant motion looks like it's been artificially sped up. Anyone know why this is?
There is a thing that happens with 120Hz TVs where they fill in the intermediate frames that the source doesn't have. It's this filling-in that is the culprit, since it's interpolating motion in a way that may not be accurate or realistic, or perhaps that the eye is just not used to, i.e. too smooth, or perhaps too averaged.
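A crude 1-D sketch of why synthesized in-between frames can look "off" (real TVs estimate per-pixel motion vectors; plain blending, shown here, is the most naive version and produces ghosting rather than true motion):

```python
# Toy example: a single bright pixel moves one position between two frames.
# Naively blending the frames produces a ghosted double image at the
# midpoint, not an object halfway between the two positions.

def interpolate(frame_a, frame_b, t=0.5):
    """Blend two frames (lists of pixel intensities) at position t."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

frame1 = [0, 0, 255, 0]   # bright pixel at index 2
frame2 = [0, 0, 0, 255]   # the pixel has moved to index 3
mid = interpolate(frame1, frame2)
print(mid)  # [0.0, 0.0, 127.5, 127.5] -- two half-bright ghosts
```

Motion-vector-based interpolation avoids the ghosting but can still guess wrong, which is one plausible source of the "too smooth / too averaged" feel.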
This effect is often called the "Soap-opera effect" because so many of us first noticed it on 1980s dramatic afternoon television shows. Weren't they also filmed at 24fps like everything else?
Really? I'm still in love with the higher definition. I'm a big sports fan, and I could never go back to a lower resolution. Especially for hockey, my goodness does HD make it look so much better. Higher frame rates would be nice too, don't get me wrong, but I think resolution is more important to me. (I could be wrong, I'll let you know when they start shooting/airing sports at 60fps.)
I think he's referring to even higher resolutions, above 1080p (approximately 2K in resolution, since it's 1920 x 1080). At the moment, there are no consumer (as in, under a few $K) displays that'll even show 4K (3840 x 2160), so going even higher to 8K is not really helping anyone out any time soon. Whereas all these systems are capable of running at 60fps, so that's an improvement you can actually use.
Apart from that, we've been stuck at 24fps for a long time. It's very noticeable on any film that has any action in it: jarring, blurred, choppy sequences.
There is a bit of a flaw in this article. A 24 fps film projector displays 1 full frame at a time, there is no scan line. This is very different from TV and the CRT originated scan line. Comparing frame rates between TV broadcast and Film projection is flawed.
You and the person voting me down seem to be missing the point of what I'm saying.
A piece of film is illuminated and shown fully on the screen. There is no scan line.
A TV is drawn one line at a time. There is a scan line.
Interlaced vs Non-Interlaced has nothing to do with what I am talking about as both draw one line at a time. Interlaced just means it draws half the first tick and half the second tick. Non-interlaced draws the full frame each tick. Both draw a line at a time.
No, NTSC on a CRT is drawn one line at a time (it is physically scanned by the electron gun). But TV is not NTSC any more and CRTs are dead/dying.
Your HD flatscreen (plasma or LCD) does not draw one line at a time. HD is decoded into a framebuffer and that framebuffer is drawn on the screen in some hardware specific way. It may be rectangles, or the whole screen, or different vertical/horizontal slices (with very high refresh rates so you don't see flicker).
Edit: Moreover, "film" is almost always digital nowadays (I don't think I've seen a non-digital projection in the last 5 years), which means that the picture gets to the screen via some form of LCD projection. So your home TV and film are basically the same at this point. You would've been right 10 years ago, though.
That's not true. A film projector has a moving shutter - only part of the image is visible on the screen at a time.
You can prove it by using a camera with a fast shutter and trying various shutter speeds. If you have a leaf shutter in your camera you'll have to hold the camera in all 4 direction to test it properly - you don't want the projector shutter and the camera shutter to interact weirdly.
Seems like we have this debate every time someone takes a step forward. It was amusing the first couple of times, but really, we're still arguing that the particular technical limitations of the last generation really were The One True Cinema Format? That time, we got it for sure, not like all the previous iterations where we thought that, this time for sure.
Pfooui. Can't believe we're even having this discussion. You'll take your higher resolution and higher frame rates and in five years you'll have carefully edited your memory so that you knew all along that it was a great idea and you sure were telling everybody about how awesome it was going to be against all the naysayers.
I have experimented a bit with this, and this is what I can tell you about it. As soon as DPs get the hang of it, there will be no more talk about it except in "hipster" circles, like vinyl or celluloid. Getting lighting and motion blur (especially motion blur) to look and feel the same is something DPs will need to adjust to. The only thing I am worried about now is render times: there are now 48 frames per second, and twice that if in 3D.
With a high FPS (in movies), each frame is less blurry, everything becomes sharper hence the 'uncanny' feeling.
And you can't compare video games and motion pictures - there's no blurring in video games at all. (Take a screenshot in them, compare and you'll see why)
Most video games since a few years ago render frames with a degree of artificial motion blur to simulate speed. This is particularly the case with racing games but also exists in first-person shooters (moreso on the PC than on consoles, since rapid mouse movements translate to rapid screen motion).
Are you thinking of situations (shooters) where you get hit by a flashbang and your screen goes all blurry and shaky? That kind of blur? Or is the blur happening during just 'normal' firefight/gameplay all the time?
No, not that kind of effect; there is genuine "motion" motion blur available in games in response to rapid movement (e.g., sprinting in Battlefield or Mass Effect - the scenery closest to the player [and therefore moving most rapidly relative to them] blurs in the appropriate direction) or rapid orientation change (Portal 2 does this; I think Mirror's Edge did so also).
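One common way games approximate this kind of motion blur is an accumulation-buffer approach: render several sub-frame samples along the motion path and average them. A toy 1-D version (purely illustrative; real engines typically use per-pixel velocity buffers instead):

```python
# Toy accumulation-buffer motion blur: a bright object sweeps across
# four pixels during one frame's "exposure window".

def render(position, width=8):
    """Render one sub-frame sample: the object lights up a single pixel."""
    return [255 if i == position else 0 for i in range(width)]

def motion_blurred_frame(positions):
    """Average several samples taken during the frame's exposure window."""
    samples = [render(p) for p in positions]
    n = len(samples)
    return [sum(px) / n for px in zip(*samples)]

# Object moves through pixels 2, 3, 4, 5 during this frame:
print(motion_blurred_frame([2, 3, 4, 5]))
# [0.0, 0.0, 63.75, 63.75, 63.75, 63.75, 0.0, 0.0] -- a smeared streak
```

The faster the object moves per frame, the longer and dimmer the streak, which is exactly the directional blur described above for sprinting or rapid turns.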