Hollywood Confronts the Digital Revolution --McIlroy
Ah yes, been there, done that.
The most touching paragraph in the article concerns that old bugaboo, resolution. Gosh, they should have dropped by the IPA technical conferences over the last 20 years if they wanted to find out more than they could ever digest about that subject! The article informs us that "film normally has a resolution of about 4K (4,100 pixels across by 3,000 deep). Most post-production houses convert that film into a computer file by scanning it at a resolution of 2K (2,048 pixels wide by 1,556 deep)." Eeek, half as much (actually, closer to a quarter of the pixels, since halving both dimensions quarters the total, but obviously no one is really counting). The article continues: "the lost resolution can't be regained when the file is converted back into film."
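The arithmetic behind "much less than half" is worth spelling out, since "2K versus 4K" describes linear resolution only. A quick sketch using the article's own figures:

```python
# Checking the article's numbers: how much resolution does a 2K scan
# keep relative to 4K film? ("Half" refers to linear resolution;
# total pixel count is what the eye ultimately gets.)
film_w, film_h = 4100, 3000    # "4K" film, per the article
scan_w, scan_h = 2048, 1556    # "2K" digital intermediate scan

film_pixels = film_w * film_h  # 12,300,000
scan_pixels = scan_w * scan_h  # 3,186,688

ratio = scan_pixels / film_pixels
print(f"The 2K scan retains about {ratio:.0%} of the film's pixels")
# -> about 26%, i.e. roughly a quarter, not half
```

Halving each dimension quarters the pixel count, which is why the loss sounds worse on paper than "2K versus 4K" suggests at first glance.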
Oh my gosh, you mean it's like lost forever?! That sounds tragic.
But no, apparently not tragic. Most people will never notice the difference. The article admits that "while the difference may not be visible to most viewers in a normal theater, it would be obvious in some IMAX theaters."
Yes, opinions in the printing industry were always shaped by an elite old guard, too. Folks who, with all due respect, really could see the difference between a drum scan and a flatbed scan. But we soon enough learned that most customers couldn't see a difference, and that buyers were decreasingly willing to pay a significant premium to satisfy the tiny minority who could.
And so, in Hollywood, a digital trend is finally emerging: "Almost a hundred films using digital intermediate will appear in movie theaters this year, compared with about a dozen last year," the article points out. Many of this past summer's blockbusters used the technology, including "Pirates of the Caribbean," "Terminator 3," "Seabiscuit" and "S.W.A.T."
But even once you settle the question of "good enough" digital resolution, the detractors throw up yet another barrier: "Still, there are questions about how well digital intermediates will age when compared with film," the article suggests.