As someone who is a video pro cutting commercials in NYC and LA (and a former post-facility engineer), I'm not seeing it. I don't know of one editor or post facility that has moved from Mac to Windows or Unix. One River Media (the company that posted the blog post about switching) is using DaVinci Resolve as an NLE, a far more niche choice than cutting in FCPX. Resolve is a color-correction tool (a very popular one that I've used to grade features) that has added editing support. I've yet to meet anyone in the wild using it for editing.
Even the editors I know who cut on Adobe Premiere, which is available for both PC and Mac, aren't switching from Mac, which honestly has surprised me a bit given the greater choice in hardware. But for most video editors at this level, you're just trading speed in one area for problems in another. Editors whine and complain every time there is a tiny change in the interfaces they use; they hate change. They have been forced to embrace FCP and Premiere over the years (and complain about it incessantly). Very few will choose to make the jump to Windows, for the same reason.
As you step down the ladder, the move will make sense for some: your all-in-one facilities or one-man bands (production and all aspects of post handled by one or two people). But in my experience, this group has already been heavily invested on the Windows side because of the cheaper initial costs (the money you save early will be spent later, and a Windows post house will cost as much as or more than a comparable Mac post house, at least it did when I was an engineer).
And the other aspects of video post-production, the CG, 3D, and compositing sectors, already lean heavily toward Windows or Linux and have for over a decade.
There just isn't a huge need for massive speed increases on the hardware side for most video editors. We've gone from needing very fast, high-end systems with fast (and expensive) SAN storage to laptops and SSDs that let us do more, faster than ever. An iMac or MacBook Pro is all the average editor needs, with more and more working remotely from home. I cut a project for the NBA over the holidays on the first-gen USB-C MacBook, and years ago cut a project for Reebok on the just-released MacBook Air. Both projects came up unexpectedly while I was traveling but went off without a hitch on underpowered hardware (that I bought for web surfing and writing).
That's not to say that I wouldn't appreciate (and most likely purchase) a new and expandable Mac workstation. But for the most part, I'd be spending money just to spend money. It wouldn't speed up 98% of my job. And that other 2% isn't slow enough to cause me any issues.
It takes more than just a competent film crew to adjust for 8k and HDR. New lighting, makeup, and set-design techniques have to make up for the higher fidelity. Suddenly the line where makeup is applied is highly visible, as happened during the switchover to HD (which led makeup artists to switch to airbrushing in the early days). Set design needs to look even more real than before (if you've ever been on a set, you probably noticed how fake everything looks, yet on screen the flaws disappear). This will take time and experimentation. And money, money the industry really doesn't seem willing to spend yet, having just spent tons moving over to a 4k pipeline. It'll take a blockbuster success, someone like Cameron releasing an 8k HDR film, for the industry to even seriously entertain the idea (and they'll be more gun-shy since the 3D push wasn't as successful as they hoped). And unless annual attendance shrinks drastically, they have little reason for the extra expense (and theater owners won't want to bear the cost of the upgrade since they're still working on the upgrade to 4k).
Douglas Trumbull, a pioneer of cinema techniques, is developing technology to allow mixed frame rates and resolutions. So those panorama shots could be 8k HFR, while the close-up shots of the actors are 4k 24fps. It will be interesting to see if this actually works in a film that requires suspension of disbelief. I wouldn't hold my breath for this to reach cinemas in large numbers anytime soon.
All this is to say, it is much more complicated than you make it out to be.
> It takes more than just a competent film crew to adjust for 8k and HDR....
Your list boils down to "use it appropriately and don't assume old techniques are appropriate". Obviously there is a lot of learning the industry would need to do to use 8k well.
> Douglas Trumbull, a pioneer of cinema techniques, is developing technology to allow mixed frame rates and resolutions. So those panorama shots could be 8k HFR, while the close-up shots of the actors are 4k 24fps. It will be interesting to see if this actually works in a film that requires suspension of disbelief. I wouldn't hold my breath for this to reach cinemas in large numbers anytime soon.
Sounds interesting, and promising, though I agree that it seems unlikely to be widespread anytime soon, even if it works beautifully.
> All this is to say, it is much more complicated than you make it out to be.
That's an odd comment to end on. At no point did I say it was simple. I said it's absurd to treat 4k as the pinnacle, as if anything beyond it is somehow actually a loss.
> All this is to say, it is much more complicated than you make it out to be.
Sorry, I think this was meant for another comment, not yours, as I don't see the sentence I thought I was responding to in yours.
>Your list boils down to "use it appropriately and don't assume old techniques are appropriate". Obviously there is a lot of learning the industry would need to do to use 8k well.
This takes more than a "competent film crew". A competent film crew should have no problem working within well-established techniques and workflows but wouldn't necessarily be prepared to venture outside them. If I were directing a production in 8k, HDR, VR, 3D, or any other edge case, I'd want more than a competent film crew. I'd want creative thinkers and problem solvers. I'd want crew members with experience on a wide range of projects, everything from digital video to IMAX (you might be surprised at how often even the crews of big-budget productions have limited experience outside the status quo).
In the early days of the RED camera, the best footage came from cinematographers who had worked on lower-budget HD productions, not from film cinematographers. The HD crews had already been working in similar workflows, but the competent film crews were flummoxed by this one piece of equipment, and even though they could see the results on set, they would still send back footage that was way underexposed and often unusable (and this was often from very well-respected and experienced cinematographers).
I believe few people in the industry think that 4k is the pinnacle. But most do believe that the technology to move to 8k is not even close to ready or worth the added cost, and that workflows for 4k are only now becoming standard (the majority of projects are still finished in 2k, although that will change with distributors like Netflix now requiring 4k delivery). And that audiences won't care enough about anything beyond 4k to pay extra. Are you ready to pay extra for an 8k screening? The theaters would have to recoup the cost of new projectors while they're still paying off their brand-new 4k installs. Oh, and there aren't many cinema lenses that can cover an 8k image (especially since many DPs prefer the quality of older lenses).
There are old-timers who lament the loss of film and resist moving from 24fps. But they'll be replaced by a younger generation more open to experimentation and to pushing the medium beyond its limits. The industry is driven first and foremost by profits, so once the pencil pushers see profit in 8k and HDR, the whole industry will move in that direction.
In fairness, I did not say that the transition to 8k was easy or that any "competent film crew" could make a beautiful 8k film or leverage 8k to its max. I said that a competent film crew could soften the look if appropriate. The lazy fix for "too sharp" is to just scale down to 4k or throw a slight blur at the frames. A slightly less lazy fix would be to use lenses that yield a softer focus (which I assume some of the insanely expensive film lenses can deliver).
But I wasn't saying it's trivial to leverage 8k well, just that if 8k is too much to deal with in some cases, effectively reducing the resolution seems a tractable problem.
What makes it difficult to mix frame rates and resolutions? I would naively assume that you could just record each scene at whatever frame rate and resolution you wanted, combine the whole thing into a single movie at the maximum resolution and frame rate of the bunch, and be done. For example, if you put 24fps material into a 48fps video, it still displays at 24fps.
Edit: since it's an ongoing theme in this discussion, I should point out that my "just record each scene" description is from a technical perspective only. Making it result in a nice-looking work of art is, of course, another matter entirely.
Current playback technologies only play back one frame rate. Yes, you can mix frame rates in an edit, but they will be converted to the master frame rate of the timeline (so that 24fps footage would be converted to play back at 48fps, not its native frame rate). Sometimes this looks fine; other times it can cause image problems.
And it's only recently that mixing frame rates in the same timeline has worked well. Five to ten years ago, we would have to convert the footage, typically using hardware built specifically for conversion (a Teranex or Alchemist). Then came desktop software that could do a decent job converting. Now, if I'm cutting in Premiere or FCPX, I can just drop the footage in and the software will take care of it, usually without issue (Avid still has problems with non-native frame rates, and it's recommended to convert before importing the media).
And it's been this way since playback frame rates were standardized (and automated). Projectors had no way of changing playback speed on the fly depending on what frames were projected. Television was locked into one broadcast frame-rate spec (29.97 in North America, 25 in Europe), and TVs were locked into one of those specs. Tape and disc playback was typically locked into one in the early days; DVD eventually allowed for multiple playback options, as does Blu-ray, but the hardware typically converted them for playback on 29.97 screens (pre-HD). We're still limited to what the screen can play back, to some extent: the broadcast specs of 23.976, 24, 25, 29.97, 30, 50, 59.94, and 60.
With computers and monitors we have the capability to play back multiple frame rates, if the software allows it, and that is where the current issue is. I can play back QuickTime files at different frame rates on the same screen, at the same time, without issue. But there is no software (that I know of) to create or play back a single video consisting of multiple frame rates. Game engines might be able to change playback on the fly, but I have zero knowledge of that tech.
And it would be advantageous to have tech that allowed switch-on-the-fly playback. I'm currently consulting on a documentary that uses source footage in at least three frame rates (24, 25, and 29.97). And the editor is cutting in Avid, so we have to convert before import, which slows down the creative process and adds complications to the finishing process.
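The conform step described above, where 24fps footage gets played back at the timeline's master rate, can be sketched with naive frame repetition and dropping. This is a toy illustration of the rate-ratio math only, not how an NLE or a Teranex actually resamples footage (real converters do motion-aware interpolation rather than just repeating frames); the function name `conform` is my own:

```python
from fractions import Fraction

def conform(frames, source_fps, timeline_fps):
    """Naively conform a clip to a timeline frame rate by repeating
    (or dropping) source frames, the crude equivalent of dropping
    24fps footage into a 48fps sequence."""
    ratio = Fraction(timeline_fps) / Fraction(source_fps)
    out = []
    budget = Fraction(0)
    for frame in frames:
        budget += ratio
        while budget >= 1:  # emit this source frame once per timeline slot it fills
            out.append(frame)
            budget -= 1
    return out

# A 4-frame 24fps clip in a 48fps timeline: every frame is doubled.
print(conform(["A", "B", "C", "D"], 24, 48))
# → ['A', 'A', 'B', 'B', 'C', 'C', 'D', 'D']
```

Non-integer ratios (24 into 29.97, say) work the same way on paper, but the uneven repeat pattern is exactly where the "image problems" mentioned above come from.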
HFR and 8k require rewriting the rules of set design, makeup, lighting, and post-production for non-documentary work, as they expose the artificiality of fictional narrative. Rather than feeling like you're in the movie, you feel like you're on the set, seeing all the fake backdrops, cables, and acne. HD (and 4k) had similar issues, but not to the extent that HFR and 8k do. Although both HFR and high resolutions work extremely well for sports and nature documentaries, where suspension of disbelief is unnecessary, at the moment both are worse for narrative filmmaking using current techniques.
The Hobbit (shot 48fps 5k 3D) had mixed results using HFR. The scenes shot on location with less CGI looked awful (like low-budget, early-'80s BBC midday dramas), while the green-screen, CG-heavy scenes felt like being in a video game (in mostly a good way). Billy Lynn's Long Halftime Walk (shot at 120fps 4k 3D) looked absolutely horrible, like early HDCam home movies. It was impossible to get swept up into the movie, and it made an OK script and good acting feel much worse than they actually were.
I do believe someone will crack the code on HFR, but it will first require the right source material (The Hobbit and Billy Lynn's Long Halftime Walk were not it). I'd suggest utopian sci-fi or something set in a sterile environment. But even beyond that, they have to figure out the lighting, makeup, and set design (and the extra burden HFR and 8k put on post-production, especially for CGI/VFX).
One element that actually helps a film feel cinematic is a slight softness to the image. The best-looking digital cinema uses on-camera filters and/or post-processing to achieve the look that comes naturally from film shot at 24fps.
Younger audiences who've grown up on HD and HFR video games are less bothered by the differences, but audiences usually don't really know what they want until they see it (one reason early audience feedback is poison to the process).
Background note: 20 years of experience working in production and post-production.
I don't like the "feel cinematic" argument. It's rather circular: what feels cinematic is defined by what we're used to, so of course anything new won't feel the same. That's not an argument against the change, unless you want everything to stay static forever.
As for the rest, I don't doubt that it's hard, requires new techniques, isn't always the best choice, etc. But I don't buy this idea that it's always worse. Which doesn't seem to be the argument you're making, but it is the one I was responding to.
I'm getting a lot of good arguments about why certain videos should be shot using less than the maximum possible. But that's quite different from saying 8k and HFR is just plain worse.
I had a great boss (founder of the company) who said, after I just screwed up, "There is not a mistake you can make, that I haven't already made. Just don't make the same mistake twice."
That's awful. So they train people never to ask for clarification or a refresher if they misunderstand or forget, and instead they go on to make a far worse mistake acting on incorrect information.
I don't know what your prerequisite for an engineer is, but my brother, an aerospace engineer, has been using Mac laptops since the '90s. And in the classes he teaches, nearly all the students use Apple laptops.
If the segment includes a fashion editor/advisor from a magazine, the clothing and accessories featured come from the magazine's advertisers, who aren't paying GMA directly (to my knowledge).
I did a little freelance work on the magazine side, putting packages together to show the fashion labels what coverage the magazine was able to get. I have no idea if this cost extra or was more of an incentive to purchase bigger ad buys in the magazine. This was nearly a decade ago, but at the time I don't believe the morning shows were paying labels or the fashion consultants; it was more of a quid pro quo.
Obfuscation of the mutual back-scratching arrangements does not obviate the fact that the segment airing on national broadcast television is not being disclosed as an advertisement, nor that many viewers will not necessarily recognize it as such without outside assistance.
It is really very well done if you consider it objectively. The show that is itself loaded with inlined advertisement is also able to sell traditional advertisement time in its segment breaks.
But in my own opinion, shows that do that sort of thing need to be nuked from orbit, and the glass broken up with jackhammers so that salt may be plowed into the dust underneath. I simply cannot abide the ubiquity and intrusiveness of advertisements in the current culture.
News companies have tools to help schedule advertisements in ways that don't make the advertiser look bad (e.g., if you air a segment on oil spills, don't run a BP ad next to it).
While advertisers don't explicitly ask media companies not to run negative segments, they can book a large commercial inventory and incentivize as much.
Oh, I agree, and didn't mean to imply otherwise; just doing my bit to illuminate the process (a process that might have changed since I was last involved a decade ago).
It could have been an interesting learning experience had they brought up a scorecard after you'd been playing a while (or at the end of the game), tabulating all the death and destruction you caused. But that really only works once; otherwise you'll come into the game with the intention of death and destruction.
It'd be one thing if he had been born and raised in NY, but he grew up and spent most of his life in Minnesota. Surely as a writer he is able to draw on his past memories with sincerity in creating this fictional narrative. Or do you believe that as soon as he crossed the Minnesota border all that was lost, and as he stepped into a NYC Starbucks he became a jaded, cynical New Yorker?
When I look for places near my neighborhood in Greenpoint, Brooklyn, Google Maps pulls up places in New Jersey about 30% of the time, even if I type in the exact name of a place several blocks away. I haven't been to NJ in years (and only a handful of times).
And the last time I was in Palo Alto using Google Maps to find a dry cleaner, it sent me on a 20-minute goose chase and had me going in circles. On the third try of re-entering the address (using copy/paste all three times; the address was identical each time), it finally gave me an accurate route.
I still use Google Maps more than Apple Maps, mainly because of the subway/bus integration and because I've been using it for so long, but in my anecdotal experience Apple Maps isn't any worse than Google Maps and has been less irritating when it's wrong.