  • Fearless Forecast v2.0 – 2011 Edition, Part 2

    Posted on January 8th, 2011 by Mike · 4 comments

    In Part 1 of this year’s forecast, we looked at cameras and production trends. In Part 2, we’ll concentrate on post production, distribution, and technology.

    In 2010, the appearance of Blackmagic’s DaVinci Resolve on the Mac at an unheard-of price point was probably the single most significant event in a year that also saw Autodesk release a Mac-based version of its Smoke finishing software at a very attractive price. In some ways, both releases were attempts by their manufacturers to determine exactly where the market sweet spot would be for the customer base they were trying to create and attract. In Blackmagic’s case, they clearly felt that the growing popularity of Red, and perhaps to a lesser degree of Canon DSLRs as video cameras, was opening up something of a mass market for a category of software that had previously appealed only to the professional end of the post production market. By almost giving the software away (in the professional world, $1,000 for a program that formerly cost 100 times that is essentially giving it away), they were attempting to corner a market that they had no proof actually existed.

    To date, I think the results of that gamble have been mixed. I’m not aware of their actual sales numbers, but it is clear that, at least in the mainstream Los Angeles market, they have not significantly heightened their profile. Facilities that already had commitments to FilmLight, Autodesk, Quantel, Digital Vision, and others have not abandoned those investments in favor of Mac-based Resolves (or Linux-based Resolves, for that matter). And in the “indie” world, especially the segment dominated by Red, the presence and continuous updating of programs that are even cheaper than Resolve – in particular Red’s own Redcine X (which is distributed, legally, for free) and, more recently, The Foundry’s Storm (which will sell for about $400 but is currently being distributed for free as well) – have likely undermined Blackmagic’s original plan to some degree. So the jury, it seems, is still out on exactly how effective Blackmagic’s moves will eventually prove to be.

    However, I don’t think that will prevent other companies from trying similar, albeit less drastic, approaches to widen the market for their flagship products. The NAB convention in April will be an interesting one for some of these companies, Autodesk in particular. Their release of Smoke on the Mac has been reasonably successful, at a price point that makes it a sensible investment for small facilities and some individuals while still generating significant revenue for Autodesk. I believe that policy will be expanded: Autodesk will likely unveil Mac versions of at least one, and possibly two, of its other Systems products, priced to place them in the same market as Smoke. The most likely candidate is Lustre, which has a natural affinity with the market that embraced the Mac-based Smoke, and which can be paired with it on the same hardware, including the Euphonix trackball controller. A Mac-based Lustre would likely have some limitations, much as the Mac-based Smoke is missing some major features of the Smoke Advanced product on Linux (Batch FX being the primary omission, but a major one). It would also likely be placed under the Subscription support program rather than Systems product support, in the same manner as the Smoke on Mac product.
    But a combined Smoke/Lustre product, under one license, would make for an extremely attractive package, and certainly a reason for many to look at Autodesk rather than DaVinci for a professional grading solution affordable enough for individual artists as well as small facilities. Although some might say that Flame is the next logical candidate for a Mac port, I think Lustre would be the better marketing move, and the one I believe Autodesk will make, though I could see Flame being a possibility. What would be even more surprising – although possible – is an attempt by Autodesk to move all of its Systems products from Linux to Mac OS X by embracing PCI expansion as a way of opening up the current Mac platform, ultimately de-emphasizing Linux as its primary platform. Unlikely, but possible.

    2011 will also be the year that Sony gets some competition in the high-resolution projection arena. The manufacturers of DLP-based digital cinema projection equipment – currently Barco, NEC, and Christie – will all release 4K DLP units this year, and JVC and Epson will likely release 4K projection products for markets other than digital cinema. DLP technology has a potentially significant advantage over Sony’s design specifically in the area of stereoscopic 3D projection. The Sony unit essentially splits the 4K imaging chip into two 2K chips when doing stereo, reducing the horizontal resolution but allowing simultaneous projection of the left and right eyes. The DLP system uses a multiple-flash technique to project the two images sequentially, but with a very fast refresh rate – usually three times the “normal” 24 frames per second for each eye, or 144 flashes per second in total – that “tricks” the viewer into perceiving the images without any noticeable flicker. Since the 4K DLP technology uses essentially the same approach as the 2K version, it is possible for 4K DLP units to project “true” 4K stereoscopic images rather than the dual 2K images of the Sony system.

    Of course, for this to happen there needs to be a delivery system that supports it, and right now that is not the case. The Digital Cinema Package – the current standard for digital cinema delivery – does not have a 48 frames per second, 4K format. This year, however, a number of factors could change that. The DCP format could be updated, as it has been more than once already. Or another technology could emerge as an alternative solution, provided the studios and distributors – as well as the theater owners – would support it. Red’s Red Ray format could be such a solution: its technical specs, at least on paper, would meet the specific need of delivering 4K stereoscopic images in a small enough file with acceptable quality for theatrical projection. But studio-level distribution involves many things the Digital Cinema Package was designed to provide, including a very high level of security and the use of open source software for all of its components, so that no one company controls the standard via a proprietary format that is not publicly documented and not guaranteed to stand the test of time. As long as Red keeps the Red Ray format proprietary, that will hinder its acceptance for studio-level distribution and mainstream industry use. It is, of course, possible that Red would open up the format in order to gain such acceptance, but I’m certainly not going to predict that here.
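
    For those who like to see the flash-rate arithmetic spelled out, here is a quick sketch. The figures are the commonly quoted ones, not taken from any particular projector’s spec sheet:

    ```python
    # Back-of-the-envelope arithmetic for DLP triple-flash stereo projection.
    # Commonly quoted figures; not any one projector's published spec.
    BASE_FPS = 24          # standard cinema frame rate, per eye
    FLASHES_PER_FRAME = 3  # "triple flash": each frame is shown three times
    EYES = 2

    flashes_per_eye = BASE_FPS * FLASHES_PER_FRAME  # 72 flashes/sec per eye
    total_flash_rate = flashes_per_eye * EYES       # 144 flashes/sec on screen

    print(f"Per eye: {flashes_per_eye} Hz, total: {total_flash_rate} Hz")
    ```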

    And speaking of stereoscopic 3D, it seems to be everywhere these days, to the delight of some and the consternation of others. This year we will see many more 3D-enabled monitors appear on the market, with great fanfare. We might see some that use “passive” glasses, which at the moment make the monitor more expensive but the glasses cheaper. We will see the arrival of a number of Blu-ray 3D titles, along with some cable-delivered and possibly broadcast 3D programming. ESPN 3D is already available on many cable systems, as is pay-per-view 3D. So we’ll see a lot of attention paid to 3D in the home. But my prediction is that this won’t translate into consumer acceptance or significant sales numbers, at least not this year. My further prediction is that the major studios, which have already committed a number of tentpole pictures to 3D production, will ultimately begin to scale back their commitment to live action 3D, concentrating primarily on CG-animated 3D releases instead. The backlash against 3D in theaters has already made itself known to some degree, and although I certainly don’t think 3D is going to go away, I also don’t think it will become the “norm,” as many others have predicted. Time will tell, but the fact is that the technology has moved a bit too fast on this one. Consumers have already spent thousands of dollars on large flat-screen displays, and the notion that they will do so again in such a short time frame for 3D is rather fanciful at best. But, as I said, time will tell.

    Which brings me to the world I know best – television. There have been some real changes in the television landscape over the last two years, driven by a number of factors: the financial meltdown, the SAG/AFTRA situation of almost two years ago, the rise of alternative platforms based on Internet and wireless delivery, the growing popularity of downloaded rather than physical media, the growth of Internet-based entertainment providers such as Netflix and Hulu, the rise of cable networks as producers of original scripted series, and the arrival of 3D – none of which really existed as recently as three years ago. The television business as we have known it is changing, in some ways rather drastically, and some unmistakable trends will continue and become more pervasive. The cable networks have pioneered the notion of a shorter season, in which series orders are based on 13 to 16 episodes (sometimes even fewer) rather than the “traditional” 22 episodes of a broadcast network program. This change has in many cases proven both financially and creatively stimulating, and this year it will be adopted to some degree by essentially all of the broadcast networks as well. The 2011 fall season will likely be the one in which film is almost completely abandoned in favor of electronic acquisition for all but a select few television series, at least on networks not named HBO or AMC. I also expect that this fall we will begin to see a significant erosion in the use of videotape for in-camera recording, replaced by file-based recording on all new cameras and many existing ones as well. That in turn will lead to the replacement of videotape deliveries by file-based media for broadcast and cable networks, though that will not likely happen this year. Hopefully some standard file formats will be agreed upon for this purpose, and that very well could happen this year.

    There are other topics I haven’t touched on in this year’s forecast. I’m going to save them for more specific discussions over the next few weeks, as I think I’ve gone on a bit too long already. As always, I welcome all of your comments and look forward to hearing what you think. If nothing else, 2011 should prove... interesting. At the very least.

     

    4 responses to “Fearless Forecast v2.0 – 2011 Edition, Part 2”

    • “although I certainly don’t think it’s going to go away, I also don’t think that 3D will become the “norm,” as many others have predicted”

      The more I talk to people in the industry, the more it seems this is the prevailing thinking about 3D (and I really do hope you and the others are right about it).
      If you read certain forums, it almost looks like every movie three years from now will be shot in 3D, but it’s very difficult not to see the bias there.

      I would have expected to read something about new cameras, especially from Aaton and Panavision, not to mention the rumored digital IMAX, but it’s a great follow-up to your first part nonetheless. Always a pleasure reading your blog, Mike!

    • We are overdue for a file-based deliverable standard. My money is on Sony, which has the street cred in the broadcast arena and has recently expanded the SR spec. Assuming the SR codec is considered open enough, there are enough variants to cover a wide range of users.

      It would have been fun to sit in on the meetings where Sony debated the right time to take SR into the file-based world, knowing it would severely cut deck sales. Fun times, unless you’re a Sony stockholder 😉

    • “The Digital Cinema Package – the current standard for digital cinema delivery – does not have a 48 frames per second, 4K format.”

      The Cameron article in Monday’s WSJ (“James Cameron Explains Why the 3D Experience Will Be Better On ‘Avatar 2’”) reminded me of this post. It just makes me wonder where the give will be, or are you contemplating a complete redesign?

      Isn’t this also a problem of bandwidth? I can’t imagine that many manufacturers would have overdesigned their I/O pipes. The specs call for the server to provide roughly 307 Mbit/s: 250 for image, plus audio (37.87) and 20 for subtitles. That was the limit of the TI input card, but I don’t know about the Series 2 revisions.

      Looking at the DCI spec, 48-frame 2K already means a 50% reduction in bytes per color component per frame. 3D also already allows 4:2:2 color. Given the low light levels of 3D projection, this seems like begging for failure.
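
      Rough arithmetic, assuming the 250 Mbit/s image ceiling stays fixed across frame rates (my assumption; the spec may carve things up differently):

      ```python
      # Sanity check of the bandwidth figures quoted above (assumed numbers).
      IMAGE_MBPS = 250.0   # DCI maximum image bitrate
      AUDIO_MBPS = 37.87   # audio figure quoted above
      SUBS_MBPS = 20.0     # subtitle figure quoted above

      total = IMAGE_MBPS + AUDIO_MBPS + SUBS_MBPS
      print(f"Server output: ~{total:.0f} Mbit/s")  # ~308 Mbit/s

      # With a fixed image ceiling, doubling the frame rate halves the
      # byte budget available to each frame.
      for fps in (24, 48):
          print(f"{fps} fps: {IMAGE_MBPS / fps:.2f} Mbit per frame")
      ```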

      Sony might have it easier, since they control all parts of their system, but they already have light and contrast problems. By the way, they consider the ability to show 3D without triple flashing a feature. It certainly helps reduce the light loss that flashing causes.

      Ultimately, low light has to be handled before or simultaneously with frame rate. Lasers could be out next year, but imagining standards getting changed in less time seems unrealistic.

      Sorry for the length. A simple question got extended.

    • Stephen Birdsong

      The point about DCPs being open, and not proprietary, is interesting when you apply it to mastering: the standard mastering format is currently a proprietary system, requiring decks and stock. It’s silly if you think about the technology involved in recording to a tape deck – we are storing a digital codec on a magnetic tape. A move to an open mastering format that requires only disks and processors makes sense. I’m not sure that Sony would have success even if they were to open up the HDCAM SR codec as a licensed codec, given the demand for an open solution. I think Apple and ProRes already beat them to the punch in that regard anyway, though I’m not proposing QuickTime as a professional mastering format (not to mention that ProRes is not exactly open).

      In response to the comment about a tapeless mastering format, it is already in the works. The ETC has defined version 1.0 of the IMF, or Interoperable Master Format, which SMPTE will likely take and make minor changes to, giving us a standardized format. You will find that even before SMPTE publishes a spec, some are already moving forward using IMF as it is. I think the stock shortage we are experiencing due to the tragic events in Japan will accelerate the adoption.

      I, for one, look forward to the day when I don’t have to route SDI to a deck to make a master, and especially to the day when audio laybacks don’t take the entire runtime of the show to lay down.
