

New Phantom v2640 Breaks Another Slo-Mo Speed Barrier

Vision Research is billing the new Phantom v2640 high-speed camera as “the fastest four megapixel camera available.” The company said its custom CMOS sensor can reach speeds of up to 11,750 fps at HD resolution and up to 6,600 fps at its full resolution of 2048×1952.

That’s a bit of an oddball raster for film and TV shooters, but it should capture a pretty nice image, especially for a VFX element or other bit of footage that’s meant to be composited into an HD or 4K/UHD frame. Still, many users will prefer to stick with the cinema-style Phantom Flex4K, which can hit a mere 1,000 fps at 4K and 2,000 fps at 2K or HD resolution.

Beyond pure speed and resolution, the company said the v2640 boasts 64 dB of dynamic range and has an ISO of 16,000 for monochrome versions of the camera and 3,200 for color. The camera comes with a standard Nikon F-mount, but C-mounts and Canon EOS mounts are available as options. (The lens is not included.)

The v2640 ships with up to 288 GB of internal RAM — adequate, the company says, for recording up to 7.8 seconds of 12-bit footage at full resolution. (Less than eight seconds might not sound like much, but in the world of super-slow-motion, a little bit of time goes a very, very long way.) The camera can record longer clips directly to 1 TB or 2 TB CineMag memory cartridges, but only at lower frame rates. That’s because internal memory supports throughput of up to 26 gigapixels/sec, but a 2 TB CineMag can handle a mere 1 gigapixel/sec.
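The figures above can be sanity-checked with some back-of-the-envelope arithmetic. This sketch uses only the numbers quoted in the article and assumes decimal gigabytes, tightly packed 12-bit samples, and no memory-formatting overhead, which is presumably why it lands slightly under the official 7.8-second figure:

```python
# Back-of-the-envelope check of the Phantom v2640 figures quoted above.
# Assumptions: decimal GB, packed 12-bit pixels, zero overhead.

FULL_RES_PX = 2048 * 1952          # ~4 megapixels per frame
BITS_PER_PX = 12
RAM_BYTES = 288e9                  # 288 GB of internal RAM

frame_bytes = FULL_RES_PX * BITS_PER_PX / 8

# Record time at the full-resolution maximum of 6,600 fps
seconds = RAM_BYTES / (frame_bytes * 6600)
print(f"~{seconds:.1f} s of 12-bit full-res footage")

# Maximum full-resolution frame rate each storage path can sustain
for label, gigapixels_per_sec in [("internal RAM", 26), ("2 TB CineMag", 1)]:
    fps = gigapixels_per_sec * 1e9 / FULL_RES_PX
    print(f"{label}: ~{fps:,.0f} fps at full resolution")
```

The same arithmetic shows why CineMag recording is limited to lower frame rates: at 1 gigapixel/sec, a 4-megapixel frame caps out around 250 fps.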

The camera also sports two HD-SDI ports and a component viewfinder port for monitoring, while a breakout box can provide analog NTSC and PAL output as needed.

Vision Research: www.phantomhighspeed.com

 

Did you enjoy this article? Sign up to receive the StudioDaily Fix eletter containing the latest stories, including news, videos, interviews, reviews and more.



Never doubt what a great editor brings to the table. In this video essay, writer David Welch and editor Joey Scoma slice and dice Star Wars, using script drafts, deleted footage, and historical accounts to reconstruct the crucial, late-in-the-game changes executed in the edit — where director George Lucas led the editorial team of Richard Chew, Paul Hirsch and Marcia Lucas — that elevated Star Wars beyond the status of just another outer-space movie.

 




It’s a truism of the filmmaking craft: Audiences will tolerate bad picture but never bad sound. And so it is that many shooters, especially in the nonfiction world, must also capture the best sound possible, if for no other reason than that better sound produces better-looking pictures.

Audiences judge the professionalism of a production by the quality of the sound. Camera people like me are understandably more interested in the size and type of a camera sensor, the recording format and lens performance, but these things mean little if we can’t decipher what the heck an actor or news reporter is saying.

For many documentary, news, wedding and event shooters, it makes sense to acquire an inexpensive sound kit. We don’t necessarily need a top-of-the-line package — professional sound recordists will cover that — but we do need gear that is well designed, functional, and can work satisfactorily in a professional context.

In my own work, I’ve carried a minimal audio package for years, since my days at the National Geographic Society (NGS), where I specialized in humpback whales and wading birds of various ilk. I often had to capture the impromptu squawk or gluck, or a line or two of pithy dialogue from an avian expert. Shooting in remote locations meant my audio kit had to be simple. In those days, that meant a small cassette recorder, boom pole, suspension, mic, and windscreen. Of course, operating with no dedicated sound recordist is a familiar scenario today, especially for solo news shooters who must wear many hats.

Manufacturers of high-end products face a significant challenge designing entry-level versions of their flagship gear. At K-Tek, expertise gleaned from years of experience is infused into the low-cost Airo line.
K-Tek

In 1996, Manfred Klemme founded K-Tek to produce high-quality products for audio professionals. Operating out of a small manufacturing plant in Vista, CA, K-Tek continues to produce high-end boom poles, suspensions, and other accessories for the feature film and commercial industries. It makes sense, given the company’s more than 20 years of expertise, that K-Tek should be the first to produce a high-quality line of accessories for entry-level filmmakers.

Reasonably priced (surprisingly so), the new Airo line from K-Tek will appeal most broadly to student filmmakers and schools, church groups, and other prosumer users. The Airo boom pole is smooth operating and lightweight and will work well in most documentary, non-theatrical environments.

Constructed of aircraft aluminum rather than carbon fiber, the pole will never be confused with the company’s top-of-the-line model. The Airo lacks the balance and feel that most top professionals expect; the pole’s compression fittings are not as robust and its operation in general is considerably more vague. Still, at under $100, the Airo boom pole is a competent performer for a wide swath of potential users. Even top audio professionals may want to adopt the Airo as a low-cost wireless antenna mast, for example, or as a second or third backup boom.

The Airo line of products is plenty rugged. Indeed, many Airo parts are pulled from the same bins as their higher-priced brethren. The ruggedized bands used in the Airo suspension are identical to the top models. Remarkably, the Airo suspension, despite its pedigree and extra-rugged construction, is assembled at the California plant and retails for $29.95.

At a fraction of the price, the Airo suspension feels no less hefty or professional than its more costly cousins. It may not be as robust overall or withstand the same level of abuse, but for many students and entry-level shooters the Airo suspension and other accessories in the line might be just the ticket.

The Airo boom pole holder (left) is a huge improvement over the traditional pole hanger (center top) based on the fishing-rod holder found on docks and wharves everywhere (lower right).
Barry Braverman

My favorite Airo product is the nifty boom pole holder — a tour de force of industrial design. The pole holder is constructed of laser-anodized sheet aluminum with a pro grade stainless steel pin — the same baby pin used on other, costlier K-Tek products. The fixed pin makes it easier to line up the holder in a gobo head, eliminating the needless fumbling and frustration that happens often on jobs with small, less experienced crews.

Which brings up another key point. Since my NGS days I’ve toted around the same miserable boom pole hanger/holder. Unwieldy and déclassé, it looks just like a traditional fishing pole holder, because, well, it is a fishing pole holder.

Professional sound recordists have used these awkward things for years not because they are better but because they are cheap and they work, sort of. Equipment manufacturers even started to create their own version of the hanger, copying the dreadful fish-pole design. I don’t think I’m going out on a line when I say the traditional pole holder leaves a lot to be desired. The Airo is so much better at $49.95.

Craft, of course, is at the center of everything we do, whether we’re shooting or recording sound. The Airo boom pole is designed for the less-experienced user; for example, its configuration eschews the internally routed cable. In this way, the novice operator need not worry about inadvertently rattling the pole with the cable inside, producing unwanted noise.

Whether you’re under attack by a police water cannon or an enraged hippo, sometimes you need to run like hell! The low cost quasi-dispensable Airo might be just the ticket as a crash audio kit.

For many news and reality-TV type shooters there’s another thing to consider. While top recordists will always prefer the toughest, most efficient gear, regardless of price, there are threatening situations, albeit quite rare, when the preservation of one’s life is the primary objective. In the course of my own work, I’ve encountered a few such occasions — at the business end of a water cannon, or facing down a hungry polar bear on the Beaufort Sea. Low-cost quasi-expendable gear like the Airo, with its professional capabilities, may be just the thing for shooter-warriors facing such do-or-die perils.

There’s nothing really revolutionary about K-Tek’s new Airo line. What’s notable is K-Tek’s ability to manufacture such products to high professional standards without driving up cost or producing useless, fragile accessories. K-Tek’s Airo proves that intelligent design coupled with high-end experience can pay off for shooter-filmmakers at every point on the spectrum. Professional camera people and others can now afford to carry a basic sound kit without investing a lot of dollars.

 




Doritos Blaze vs. Mountain Dew Ice Super Bowl commercial with Peter Dinklage and Morgan Freeman

Peter Dinklage and Morgan Freeman go together like Doritos and Mountain Dew — or like Busta Rhymes and Missy Elliott — in this rap-battle spot promoting both brands (with an ice-and-fire nod to Game of Thrones) that looks to bring double-barreled star power to Super Bowl Sunday.

 




Cinematographer Paul Cameron, ASC, on The Commuter

In The Commuter, Michael MacCauley [Liam Neeson] seems to be having one of those days. First, the ex-cop loses his insurance job. Then, while commuting home, he finds himself drawn into a conspiracy that will play out with deadly consequences on — and eventually right off — the rails. This film marks the fourth time Neeson has worked with director Jaume Collet-Serra, after their popular collaborations on the thrillers Unknown, Non-Stop and Run All Night. The filmmaker selected Paul Cameron, ASC, as his director of photography. Cameron, whose work on the pilot episode of HBO’s Westworld was profiled in Studio Daily last year, relates the challenges involved in filming a set-in-New York tale entirely on U.K. soil, with all its train interiors shot on soundstages.

Studio Daily: With all the action and camera movement, did you rely on any previs?

Paul Cameron: It wasn’t necessary. I had worked with the director years back on commercials, so we were already comfortable designing sequences together on our own. There was some previs [from Nvizible] for visual effects supervisor Steve Begg to plan out the actual train crash, but that was about the CG effort [handled principally by Cinesite, aided by Iloura], not the live-action. However, we did do some storyboarding for our end of things, which helped with planning the dynamics. There were a lot of really extreme and dramatic moves to accomplish in very tight spaces, so I chose the ARRI Alexa Mini [provided by ARRI London].

Liam Neeson in The Commuter
Lionsgate/Jay Maidment

I’ve read it was the director’s idea right from the outset to shoot all of the live-action train sequences entirely on stage with an articulated 30-ton set. How did that impact your approach?

First off, it meant we were going to be shooting blue screen for the windows, so I had to figure out a lighting plan that would take us all the way through the afternoon and into evening for the character’s ride. In New York, I shot 5D reference plates while traveling north on the actual train route, at the times of day that were reflected in the script. So that gave us a solid idea of what we’d need to emulate in terms of light levels and interactives. Most of it takes place on the train after it leaves Grand Central; they’re in the tunnel for 10 minutes, then pop out in late afternoon. The art department [under production designer Richard Bridgland] built a representation at Pinewood of the Grand Central platform, so we actually drive our train in and out of that station.

So your recon of the route gave you an idea of the various looks. What kinds of units were used to recreate that on stage?

My gaffer Mark Clayton did a terrific job, building a rig with 60 vari-lites [Martin Mac Viper Performance moving lights] and 60 ARRI SkyPanels. We used LEDs that let us make various color changes, programming patterns of light that gave us the dappling effect of sunlight through trees. Once that was programmed, we were able to switch very quickly during shooting from scenes taking place at one time of day to another.

Vera Farmiga
Lionsgate/Jay Maidment

Were you able to use the same lighting for close-ups, or did you usually enhance those shots?

Many times, when you want to get some real interest for the faces or create a pattern on the wall, it is smart to bring in another bar of effects lights. Whenever we’re in a tunnel, I like to play some other interactive aspect. It’s always about how to best enhance the natural reality of the moment while at the same time embellishing it for the drama, so I often pushed as far as I could.

I got a kind of Das Train vibe from the shots that rush ahead on the Z-axis through the compartment.

I think it works pretty well, and that immersive you-are-there approach helped to ensure audience suspension of disbelief, which becomes a real concern when you’re faking the whole thing with blue screen. I needed to develop a tracking system for use while moving through the length of the train. We buried our track up inside the ceiling of the cars, then used a special stabilized remote head called Stabileye that is currently only available in the U.K. We packed a stripped-down Mini in there, which made for a super-small profile, letting us race down the aisle between the seats and passengers. We could also pull the camera from the rig, plug it into a backpack and go stabilized handheld right away.

It sounds like you must have taken some time to engineer all this.

The rig featured a computerized winch system. Not only could we track up and down, but there was an arm so we could spin right, which let us fly right around a stationary character at high speed. There was potential danger when maneuvering this close to the actors, but the system was quite reliable. We also disengaged the winch system quite a bit of the time. Key grip Paul Hymns pushed the rig up and down manually, so he could react to the way an actor moved, which was important for telling this story, since there are several characters who are suspects and the camera looking at them is kind of reflecting a certain paranoia at times, not knowing who to trust.

The other big challenge was where to hide our lighting on our set; we had lights up at the top of the frame and hidden down below the frameline as well, along with lights on the sides that we could cover with blue screen. [Additional units were mounted above the windows, which helped provide pools of illumination during the tunnel passages.] I used [uncoated Zeiss] Master Primes [from CW Sonderoptic] in spherical 2.40, so that aspect ratio gave us a little space to hide things at the top and bottom. Then again, we were on wider lenses most of the time, so the battle of where to light from raises its head. I used lots of LED mats taped to the ceiling, plus small panels wherever we could find space, and some handheld units too when we could get away with it, to provide eyelight or some fill. It was definitely tricky when dealing with this enclosed reality, like shooting in a storage facility.

It looks like you used a bit of atmosphere on the train interiors.

It helped to have some atmosphere in the daytime scenes, giving us some shafts of light in the interior that gave a cinematic dimension to the light inside the train. There’s a fine line when using smoke, and we did find it advisable to back off somewhat a few times. Like so much of the job, determining the aesthetic is very subjective, and as much about taste as it is about experience. We found the smoke affected our vari-lites and also the sunlight effects coming in through the windows. Digital is just so sensitive that when there are color shifts in the light, it becomes very apparent. It is very difficult even with newer light meters that read different spectrums of light; it seems your eyes can pick up better on those subtle differences than any tool, at least right now. So we got into a routine of dropping the lights down to clean the filters inside them every three or four days to do what we could to maintain the color temperature as precisely as possible.

Can you discuss the camera workflow?

We captured in ARRIRAW. DIT Tom Gough built a couple of look-up tables, one more contrasty than the other, and we switched between them depending on the levels of smoke used on the train. I chose them to emulate a print stock look. We did some work on the ASC CDLs later on, but these LUTs translated well all the way through editorial to visual effects. Pinewood Digital handled our dailies, and I worked with Goldcrest colorist Adam Glasman on the DI.

Did you wind up using drones for establishing views of the train?

We shot helicopter aerials on the Mini with the Shotover system for establishing shots of the city and all up and down the line, with trains emerging from tunnels and crossing the countryside, building a path from New York to the final destination. During the main unit shoot in London, we only used aerials [by Flying Pictures] on the final sequence, with the train settled after crashing, which was all shot practically on the backlot. In the studio, we built the set where the train settles after the crash.

Liam Neeson
Lionsgate/Jay Maidment

I noticed one of those simultaneous zoom lens/dolly shots of Neeson as well. Were there any other tools and tricks you tried out on this film?

There are a couple of moments when I was able to use Cinefade, which is an in-camera way to alter depth of field. It is a complicated piece of equipment that uses two spinning polarizers, so there’s severe light loss involved. I have to say it is much easier to use when shooting in daylight, but since we were lighting everything on the train set ourselves, that complicated matters. We’d have to light things up by five or six more stops to be able to make it work. When Liam finds out he is fired, we go into slow motion as the news begins to register on his face, and Cinefade let us lose the depth on the background while he reacts. It was a very different feeling from just doing a typical push-in; except for his nose and eyes, everything is soft by the end of the shot.
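The stop arithmetic Cameron describes is easy to make concrete: each photographic stop represents a doubling of light, so compensating for five or six stops of loss means lighting the set dramatically brighter. This is illustrative arithmetic only, not Cinefade’s published specification:

```python
# Each stop doubles the light, so recovering N lost stops means
# the set must be lit 2**N times brighter. Illustrative only.
for stops in (5, 6):
    factor = 2 ** stops
    print(f"{stops} stops lost -> light the set {factor}x brighter")
```

Which helps explain why a device that stacks two polarizers is far easier to use in abundant daylight than on a stage-lit train set.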

Are there any technologies or new approaches that have you excited about what is on the horizon for cinematographers?

Dolby Vision is extremely exciting to me, but also daunting. Dolby is just about the only place doing HDR releases right now, but there are some issues for filmmakers and cinematographers like Jaume and me, who want to maintain the look of the film we shot as much as possible. We don’t want to see the look taken so far out from ordinary that it negatively impacts the filmgoing experience, just because there is pressure to use the expanded dynamic range. I’m on the board of governors at the ASC, and currently this matter of controlling what happens with HDR is a very important topic for us. Dolby has certain expectations, but cinematographers need some say on this as well. Yet we aren’t even always invited to be present for HDR transfers.

And the possibilities for higher quality levels for projected imagery?

I find 4K laser projection quite stunning in how it displays true blacks. I can’t tell you just how terrific I felt when watching Blade Runner 2049 in HDR; seeing it in standard after that, there was just no comparison: one was immersive, almost 3D, while the other felt like all the flat gray projection we’ve been looking at for the past decade. Theatrical releases really do need the highest quality projection to satisfy paying audiences; digital has let us duck mismatched reels and bad registration that would make you squirm and go nuts, and now it can take things to a whole new level.

About six months back, when we discussed your work on the Westworld pilot, you were supervising the HDR transfer for the whole first season.

Jonathan Nolan brought me in specifically to do that, and there were some surprises; people actually recoil sometimes from the image because it can be so intense. The one I remember most clearly was a shot of Evan Rachel Wood in the pilot with the sun right behind her. When you see it in HDR, the image is almost piercing; the intensity is actually more than your eye is used to dealing with in real life, because out in the world, you close your eyes a bit to help adjust to the light. There’s a bit of trickery with HDR, since it stretches the range from pure black to intense white, and this effect is measured in nits. Cinematographers are wondering if we can cap the nit level, because above a certain point it can change the impact of the image rather drastically. It is nice to let things go a bit more intense on occasion, just to expand possibilities, but you have to be careful. The other aspect is color rendition. You have to be careful with just how much chromatic impact is useful vs. hurtful.

 




James Blake: If the Car Beside You Moves Ahead

The semi-random color and shape of lights seen in the background and reflected on car hoods make a jittery visual accompaniment to James Blake’s aural atmospherics in this video directed by Alexander Brown and edited by Avner Shiloah.

 




There’s nothing flashy about today’s music video, directed by Cape Town filmmaker and photographer Adriaan Louw, beyond the glimpse of Johannesburg street culture it offers to ease you into the weekend. (By the way, in case Mabel looks a little familiar, she’s the daughter of musicians Neneh Cherry and Cameron McVey.)

 




Red Promises Summer Shipment of Hydrogen One Holographic Smartphone

Red Digital Cinema’s Hydrogen holographic smartphone project is proceeding on schedule, according to the latest Reduser.net update posted by Jim Jannard earlier this week. In the January 22 posting, Jannard said that both he and Red’s Jarred Land are using Hydrogen One as their current phones.

But like any modern phone, it’s not just a phone — it’s also a camera. And the company says it’s not just any camera, either. According to Jannard, users will be able to capture stereo 3D video in what’s called “4V” format (alongside a flat 2D version) using both front- and rear-facing cameras.

Red Hydrogen

Red Hydrogen

Red previously announced that the Hydrogen One will start at $1,195, and Jannard is now promising “unprecedented” carrier support, which presumably means you’ll be able to get service through the provider of your choice. Phones locked to specific carriers should be available sometime in the summer, Jannard wrote, while unlocked phones will ship sooner.

“If we never sell one phone, I am totally happy,” Jannard wrote. “We both have exactly what we wanted.”

Light-Field Display Tech from Leia

We’ve learned a few things about the Hydrogen project since it was first announced last July. Most significantly, Red revealed that it was investing in Menlo Park, CA-based Leia as part of an exclusive partnership bringing Leia’s light-field display technology to market as part of the Hydrogen One smartphone design.

Jannard, who now sits on Leia’s board of directors, said this week that the display will operate in a standard 2D mode as well as in the holographic 4V mode, in which the screen’s image will have the illusion of depth. “There is no way to describe this,” Jannard wrote this week, but then took a stab at it anyway. “The horizontal resolution of 2D is now split into depth layers,” he wrote. “It gives a completely different feeling. All the pixels are there… but instead of ‘looking at’ a pic, you are immersed in the image. It is quite spectacular. As a resolution guy, I absolutely prefer watching 4V over 2D.”

The three-dimensional imagery will be accompanied by what Jannard described as “multichannel spatial sound,” which is said to offer a similarly rich experience.

Slim and light are not differentiating features for the new phone. Jannard admitted that it will be heavier (about two ounces) and bigger (a few millimeters) than competing phones. It comes with a big (4,500 mAh) battery, Jannard said, to facilitate image capture.

The Hydrogen Network: All Things 4V

Red is currently showing prototype models to potential partners for something called the Hydrogen Network, Red’s attempt to become a single-source clearinghouse for all content created for 4V display. Red has had its eyes on content delivery before — back in 2013, it tried unsuccessfully to kickstart a 4K content distribution network via its Redray 4K player and never-released laser projector.

If you’re dying to see Hydrogen in action, keep your eyes peeled for a promised “Hydrogen Day,” likely to take place at Red Studios sometime in April. (It seems likely that NAB visitors will get a chance to preview the technology as well, though Jannard didn’t say one way or another.)

Red Hydrogen: www.red.com

 




Film Editor Tatiana S. Riegel, ACE, Explains How I, Tonya Came Together

As she sat down to chat with StudioDaily for the Podcasts from the Front Lines series, editor Tatiana S. Riegel, ACE, was only a few days away from an Academy Award nomination for her work cutting together Craig Gillespie’s new film, I, Tonya. Having no idea that honor was coming her way on the heels of an ACE Eddie Award nomination, she was more consumed with the realization that she may never get the chance to work on a movie as unique and complex as I, Tonya again. “I have to say, I’m a little bit worried that films I work on in the future won’t be as much fun as this one was, or as challenging,” she said.



Audio-only version:


Challenging indeed. The movie adopts an unorthodox structure and method of telling the story of the rise and fall of figure-skating sensation Tonya Harding — a story that made headlines in 1994, when Harding was implicated in an assault plot on her Olympic rival, Nancy Kerrigan. There have been documentaries and books about those events, but I, Tonya delves deeper into Harding’s psyche and her strange relationship with her mother, played in the movie by Allison Janney, and makes no attempt to state definitively whether or not Harding had advance knowledge of the plot to disable Kerrigan, for which her estranged husband and his associates were charged. In real life, Harding admitted finding out about it and not saying anything after the fact, but has always maintained her innocence in terms of planning the attack.

In fact, Riegel points out, “the story is told from the perspective of the real Tonya Harding and [ex-husband] Jeff Gillooly, both of whom were interviewed by the writer [Steven Rogers]. But they had wildly contradictory points of view about what happened. So that was the take that we took in how to present the story. That is the unique thing about this [real-life] story — we will never know [the exact truth].”

The movie features reams of live-action footage, stunt and visual effects shots to recreate famous skating scenes, archival footage, on-camera interviews and, occasionally, characters breaking the fourth wall to indicate whether what the viewer is seeing or hearing is or is not, from that character’s perspective, true.

“We had three [main] elements—on-camera interviews, voiceovers, and breaking the fourth wall [in terms of characters giving POVs],” she adds. “In the script, that dialogue was all written for on-camera. For editing, that sort of became the delicate dance of how to use [those elements] and when. So it was a question of trying different things. It became a question of planning all that out: what do you want to see versus what do you need to see, and what works best with the [corresponding] imagery.”

Sebastian Stan, Margot Robbie and Julianne Nicholson in I, Tonya
NEON/30WEST

At the end of the day, Riegel feels “this was a fascinating way of telling the story because of the characters. For me, the primary thing is always emotion and character. This is a really tragic story, filled with tremendous emotion, and I think no one will walk out of this movie feeling the same way about [Harding] as they did when they walked in. I love that part of it.”

Riegel discussed her choices and method of collaborating with Gillespie to edit the movie in this month’s Podcast from the Front Lines.

 




The Clay in Stop-Motion Animation at Aardman Studios

Aardman Animations senior modelmaker Jay Smart shares his recipes for modeling clay with Adam Savage, showing him how food-processing machines are used to help get the color and consistency of the material just right.

 
