Trendspotting

Posted on May 18th, 2012
The recently concluded NAB Convention was, as always, interesting at the very least. Like most NABs, this one saw new products, updates of existing products, a lot of flash, some new faces, and a lot of old familiar ones. One can go to NAB and other trade shows to look at products, both existing and future, talk to manufacturers, form some opinions, and think about which products best suit one's needs. And while that's valuable, I prefer to look for trends, not specifics.
Products almost never exist in a vacuum. There are always competing versions of essentially the same things, because a good idea is, well, a good idea. But products are developed to serve needs, and needs exist because of trends. It’s easy to look at this year’s NAB and conclude that “4K” is a goal of many manufacturers. But that would be a rather literal interpretation, because 4K is specific, and the movement towards resolutions beyond HD and delivery systems yet to be determined is not specific. It is, rather, a trend, influenced by an industry now confronted with a future that consists of various delivery systems, fewer specific standards in some market segments, mobility of viewing platforms, and a competitive playing field. One cannot really say that “4K” is a singular format to which all market segments and delivery systems will gravitate any more than one can say that HD is “dead.” Neither view is true because both are really indicative of trends. Those trends are what determine where companies invest their time and effort in the hope that the public or the industry will embrace what they have to offer. In evaluating this year’s NAB, I try to relate the new products that were shown to what I perceive as the trend that is influencing their development. So in no particular order, here are some of my observations.
Trend 1: Interaction between Production and Editorial Growing Tighter
Many products were shown that facilitate creation of dailies. Many are based on single-workstation approaches, some are based on existing color grading products, some are based on editing products, and still others are based on unified approaches. Products from companies such as Blackmagic (an updated DaVinci Resolve), Assimilate (Scratch Lab), MTI (Cortex/Convey), Colorfront (On Set Dailies and a "lite" version), YoYo, and numerous other vendors attest to the growing number of entries in this market segment. Almost all have been heavily influenced by Blackmagic's pricing policy on the DaVinci line (the Lite product in its next version will essentially be a feature-complete dailies system for free). Some are targeted at individuals or budget-limited productions, while others have features that allow for multiple deliverables, more flexibility in color correction, tighter integration with larger storage, integrated backup, extensible databases, and in general, more robustness in the toolset for higher-end productions. Most of them can ingest many different native file formats, and all can generate dailies in various formats as well. At the same time, nonlinear editors have become even more portable, as evidenced by programs such as Avid Media Composer, Premiere Pro, Autodesk Smoke, and Lightworks – all of which were shown running quite well on laptop computers. The emergence of Thunderbolt-based storage peripherals allows for a portable system that rivals or exceeds the desktop systems we've used for quite some time, in terms of storage capacity, speed, and overall power. Laptop editing is certainly not for every editor and every project, but it has now become a lot more practical in situations that can benefit from it. And the growing use of metadata, with products to support both its creation and its maintenance through the post production chain, represents another piece of the puzzle.
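To make the metadata point concrete, here is a minimal, hypothetical sketch of the kind of per-clip record a dailies system might create and later departments might extend. The field names and structure are my own illustration, not any vendor's actual schema.

```python
# A hypothetical per-clip metadata record, created at the dailies stage
# and carried through the post chain. Illustrative only.
from dataclasses import dataclass, field

@dataclass
class ClipMetadata:
    reel: str                # camera reel / card identifier
    clip_name: str           # camera-original file name
    start_timecode: str      # SMPTE timecode of the first frame
    fps: float               # capture frame rate
    look: str = "none"       # CDL/LUT applied for dailies color
    notes: list = field(default_factory=list)  # accumulated department notes

    def add_note(self, department: str, text: str) -> None:
        """Append a note so later departments see the full history."""
        self.notes.append(f"{department}: {text}")

# Dailies creation attaches the look; editorial and VFX append notes later.
clip = ClipMetadata("A001", "A001_C003.mov", "01:02:03:04", 23.976,
                    look="CDL_day_ext")
clip.add_note("editorial", "selected take")
clip.add_note("vfx", "needs sky replacement")
```

The point of a structure like this is that the original creative intent (the dailies look, the notes) travels with the clip rather than being reconstructed at each handoff.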
Finally, the use of portable devices for viewing daily material, both via file delivery and cloud-based distribution, allows production to see the material very quickly and without the need for a specific viewing environment.
While it’s important to look at these products directly, it’s more useful to look at the trend they represent. Traditionally, production and editorial have been very separate environments. The handoff from one to the other was never direct, as a dailies process was necessary before editorial could have their materials. In film days, both a film lab and, in later years, a video transfer facility were necessary to create those materials. With modern digital workflows, and portable dailies systems, that is no longer the case. Editorial dailies can be created on or near the set in some cases, in the editorial department in others, and in a third party facility in still others. Production can see what’s been shot almost immediately, or with a very short delay if color correction and double system sync is included. All of this is allowing for a much more direct and immediate link between the production unit and the editorial department in a way that has never previously been the case. Over time, this has the potential to allow for a lot more efficiency as well as better communication between the director and the editor. It also allows for better and more direct maintenance of the original creative intent of both the director and the cinematographer, which in turn can facilitate later steps such as the digital intermediate stage. The products may center around things like dailies creation, but the trend is much larger and more far reaching than that.
Trend 2: Decentralization of Post Production
The introduction of smaller and more portable editing systems certainly represents one product segment that is contributing to a larger trend. We are now also seeing products that facilitate remote collaboration, both over short and long distances. Products such as CineSync have led to a lot of changes in the visual effects industry over the last few years, allowing direct collaboration between artists around the world and supervisors in other locations. It is very common today to be creating visual effects for large projects in places like Los Angeles, Vancouver, London, New Zealand, India, Singapore, and San Francisco simultaneously, sometimes even working on the same shots in multiple locations. Connectivity has made that possible, but clever software has led to very wide adoption of those techniques. For production, the ability to either directly upload camera files to a central data dropoff point, or to have fully capable remote labs with the production unit has transformed the logistics of location shooting. It’s no longer necessary to be in a large production center to have post services directly accessible to the director, the cinematographer, the producers, and in many cases the editorial team – all while on location. Some of the larger post vendors have the capability of creating full digital intermediate viewing environments on location, allowing for dailies screenings and preliminary color work, all with or without a direct tie to a central facility. For smaller productions, files can be created on location and uploaded to editorial at “home base,” regardless of where in the world the production happens to be.
Post production, of course, also means finishing. Traditionally, finishing has been done in purpose-built facilities, both large and small, for both features and television work. These facilities not only have the infrastructure, but also the talent and connectivity to facilitate a very efficient finish of very high quality. For that reason, I believe finishing facilities will certainly continue to exist and succeed for quite some time, although they will ultimately be smaller and leaner. But the same technical changes and economic advantages that have come to bear on the front end of post production are influencing many in the industry to explore alternative models for finishing as well. The existence of far more economical finishing tools, such as DaVinci Resolve and a lower-priced Avid Symphony and Autodesk Smoke, allows for "in-house finishing" as an adjunct to basic editorial. On the sound side of things, this movement has been going on for a few years, with a number of television productions opting to have a sound editor/mixer employed directly by the production, working in a Pro Tools equipped room within the editorial department. This has the advantage of eliminating a lot of uncontrollable costs that are often incurred by using outside vendors for sound services, particularly on shows that have a lot of late changes or a lot of "busy" scenes, such as action shows. By allowing the sound editing to be in lockstep with picture editing, the time allowed for sound editorial is increased significantly, permitting a "deeper" mix with more tracks to be created within the limited time frame for each episode. There is certainly still a place for the traditional dub stage mixing environment, but the time it's needed for is greatly reduced. On the picture side, this type of arrangement hasn't really taken hold yet. That's due in part to economics, but it's also due to the talent pool and how they've traditionally been employed.
But the commonality of tools between basic editorial and finishing has grown, to the point where many shows are finished on the same platform they're cut on (Avid Media Composer/Avid Symphony). Color grading remains a specialty step, but the availability of low-cost/free grading software like Resolve has already created a much wider talent pool than has previously existed, and one that is not tied to the traditional post house employment model. There is, of course, a need for storage and backup systems, as well as some other infrastructure pieces, in any location finishing environment, but it is now all achievable when packaged correctly. For shows that go through a lot of changes, or involve a lot of visual effects, there is a lot to be gained by eliminating the need for facility scheduling and the exposure to hourly rates.
Taken together, these point to a clear trend towards a mixed model, in which location-based front- and back-end post production is a reality on many productions, while others will stay in a streamlined facility model. For features, a proper environment for digital intermediate work is still a necessity, but preliminary work can conceivably be done in a temporary environment set up specifically for the production, wherever it's needed. But the movement towards a general decentralization of the post process is obvious, with many steps in the process being done in many physical places depending on the production's particular logistics and the needs of the creative talent involved.
Trend 3: Platform Independence and Immersive Presentation
The entertainment industry loves buzzwords. One buzzword for the last few years has been “3D,” or more specifically, stereoscopic presentations. Another buzzword, particularly during the last year, has been “4K,” especially since the Sony F65 announcement during last year’s NAB. In fact, this year some were saying that “4K is the new 3D,” actually combining buzzwords to create a new one. It’s true that the 3D craze has abated somewhat, due largely to the public’s seemingly growing indifference to the format for all but some specific productions (animation in particular). And it’s also true that interest in 4K is growing due to the entrance of a number of manufacturers into creating actual products to support it, both in terms of production and display. And having observed some new 4K displays (courtesy of Canon), I can say that I do see some of the things that the 4K proponents have been claiming for some time now, particularly in terms of perceived depth when viewing a display of sufficient size at a viewing distance that allows for somewhat immersive display (i.e., the screen largely fills your field of view). I would say that when the image is composed to illustrate depth, as it is with shots that include some noticeable parallax (helicopter aerials of large cities are particularly effective), the sense of depth is much more natural than it generally is with a stereoscopic 3D approach. In many ways, you get the sense of real depth without the artificiality.
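The "screen largely fills your field of view" point can be made concrete with a little arithmetic. The sketch below uses the common rule of thumb that normal (20/20) visual acuity resolves about one arcminute, so the closest comfortable seat is where one pixel subtends one arcminute; the 2 m screen width is an arbitrary example, not a figure from any particular display.

```python
import math

ARCMIN = math.radians(1 / 60)  # ~limit of normal visual acuity per pixel

def closest_seat_m(screen_width_m: float, h_pixels: int) -> float:
    """Distance at which one pixel subtends one arcminute: any closer
    and pixel structure becomes visible to a 20/20 viewer."""
    pixel_pitch = screen_width_m / h_pixels
    return pixel_pitch / math.tan(ARCMIN)

def field_of_view_deg(screen_width_m: float, distance_m: float) -> float:
    """Horizontal angle the screen fills at a given viewing distance."""
    return math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))

width = 2.0  # an illustrative 2 m wide display
for name, pixels in [("HD (1920)", 1920), ("4K (3840)", 3840)]:
    d = closest_seat_m(width, pixels)
    print(f"{name}: sit at {d:.1f} m, screen fills "
          f"{field_of_view_deg(width, d):.0f} degrees")
```

Doubling the horizontal pixel count halves the closest viewing distance, which roughly doubles the angle the screen fills, which is exactly the immersive effect described above.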
Another approach that is being both talked about and employed on some specific projects is the use of higher frame rates for both capture and presentation. 48 frames per second is being used on Peter Jackson's new Hobbit films, and James Cameron has been talking about using either 48 or 60 frames per second on his planned Avatar sequels. The idea of higher frame rates has been around for a long time, and for systems such as stereoscopic 3D it does present an opportunity to alleviate some of the characteristics that many viewers find particularly uncomfortable about that format. Both filmmakers also feel that the higher frame rate provides a more "real" quality, although it is also true that many others in and out of the industry feel that the similarity in "feel" to live television actually detracts from fictional storytelling, providing less opportunity for a "suspension of disbelief" in the average viewer. Only time will tell how higher frame rates for cinema presentation are received by the general public.
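Higher resolutions and higher frame rates both carry a real cost in data, which is part of why these decisions ripple through the whole production and post chain. A rough uncompressed data-rate calculation shows the scale; the 10-bit RGB (30 bits/pixel) figure is an illustrative ceiling, since actual camera and distribution codecs compress well below it.

```python
def uncompressed_rate_MBps(width: int, height: int, fps: float,
                           bits_per_pixel: int = 30) -> float:
    """Uncompressed video data rate in megabytes per second.
    30 bits/pixel assumes 10-bit RGB; real codecs compress far below
    this, so treat these as ceiling figures, not delivery rates."""
    return width * height * fps * bits_per_pixel / 8 / 1e6

hd_24  = uncompressed_rate_MBps(1920, 1080, 24)   # today's HD baseline
uhd_48 = uncompressed_rate_MBps(3840, 2160, 48)   # 4x pixels, 2x frames

print(f"HD @ 24 fps:  {hd_24:.0f} MB/s")
print(f"UHD @ 48 fps: {uhd_48:.0f} MB/s ({uhd_48 / hd_24:.0f}x)")
```

Quadrupling the pixel count and doubling the frame rate multiplies the raw data by eight, which is why storage, backup, and connectivity keep coming up alongside every resolution and frame-rate conversation.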
Both of these things, however, are symptoms of a more general trend, which is an industry moving towards both platform and resolution independence and a more immersive experience, particularly in theatrical presentation. I for one don't really feel that "4K" as a specific format is either an ideal or an inevitable aim. We now live in a world in which visual entertainment is viewed on many different devices, at many different resolutions, in many different form factors, and in many different situations. The same movie that you see on a large screen in a theater might be viewed by some on a smart phone, an iPad, a small screen, a large screen, or a computer screen. Some of these devices will benefit from larger images, some won't. But not everything is made for the large screen; in fact, most entertainment material is never viewed that way. As the industry makes its final moves towards purely digital capture and distribution on all platforms, it is clear that "standards," as we have known them in both the television and cinema worlds, are the last vestiges of a world in which all devices were the same. It is no longer necessary to have a specific aspect ratio, or a specific frame rate, or a specific resolution in order to have the display system work. At the same time, the theatrical experience is in need of a differentiator, something to make it unique in a marketplace in which home viewing quality and screen size have increased dramatically. Hence the need for "immersiveness," which becomes far more possible as the size of the screen increases. Since theaters have much larger screens than most homes, techniques that add to that immersive quality create a different feel when they're viewed in a cinema. Higher resolutions (and experiments are already being done with systems considerably higher than 4K) provide a lot of that immersive quality. And, at least potentially, so do higher frame rates.
Stereoscopic 3D remains, for the moment, a bit of a special case, because while some are willing to accept its current limitations (glasses are required, the ticket price is higher, the image is less bright, and for some the whole system is physically uncomfortable), many are not. Until those limitations are minimized or eliminated, it will continue to have limited, non-universal appeal.
So that’s about it for now. Instead of looking at specifics, look for trends. That’s where the real signposts to the future lie.