Hip-Hop Project Brings 'The Wordz' to South-Central L.A.

Co-producers Richard Shaw and Lee Cantelon have been working all their hip-hop connections to collect material for The Wordz from the Street, a collection of original music based on the core teachings of Christianity. The project started as a book by Cantelon called The Words: Jesus of Nazareth, which collected all of the traditionally red-lettered words attributed to Jesus in the New Testament of the Holy Bible. That book inspired an album by Rickie Lee Jones, The Sermon on Exposition Boulevard, and now it has catalyzed an ambitious project putting those words in the mouths of rap artists – from established groups like Bone Thugs-n-Harmony and The Last Poets to up-and-comers like Skim, Oblivion, and Pat's Justice, three talented rappers Cantelon found performing at a weekly open mic in Venice, CA, hosted by Leila Steinberg, the former manager of Tupac Shakur.
The Wordz Project is a planned one-hour documentary about the whole process. Shaw and Cantelon started shooting in the fall of 2006 on the Canon XL H1 HDV camcorder through Shaw's Hollywood production studio, Pinlight. They hung out in South Central Los Angeles, getting to know members of the community – like Bigg Slice, who's built dozens of low riders for Snoop Dogg and works with at-risk kids in the inner city – and watching them leverage their talents against the poverty surrounding them. And they collected footage of spoken-word and song performances that took inspiration from The Words.
Eventually, Shaw struck up a conversation with Paul Long, the president of Kappa Studios in Burbank, CA. Kappa was eager to get experience with the Red camera under its belt, and Long was looking to work with filmmakers on a short film shot with the Red One. When Shaw explained The Wordz Project, Kappa offered him use of the camera – as long as they agreed that there would be an eventual film-out. Kappa's gesture had a domino effect. When Shaw and Cantelon approached Efilm about the film-out, Efilm donated its post-production services. So did Deluxe, with whom Shaw had worked as a producer of the 2003 AIDS-epidemic-themed African drama Beat the Drum. And when Dolby got word, they, too, signed up to donate audio services to the film.

So Shaw and Cantelon headed back into Watts, this time wielding a Red One, and shot the majority of a five-minute piece that will, hopefully, be shown theatrically as a sort of extended trailer for the longer documentary, which Shaw is currently editing. (Shaw was the director of photography for the XL H1 footage; Gianny Trutmann was director of photography for the Red footage.) “We’d like to see this project start a positive movement in rap,” says Shaw. “Not preachy, like you’re throwing a Bible in people’s faces. But the words – if they’re read and presented right – are powerful. They’re basically words of peace and love, and how to live your life without hurting other people. For so long, rap music has been about busting a cap in your ass, or something like that, and this is completely different.”

We asked Shaw about shooting with the Red versus the XL H1, and talked to Avid DS artist Igor Ridanovic at Kappa Studios about the post-production workflow. Watch a one-minute clip from The Wordz Project, then read the interviews.


The Wordz Project

Q&A: Co-Producer Richard Shaw

F&V: What was the shoot like?

Richard Shaw: We had to really shoot from the hip. We were going through Watts [in L.A.], and a couple of weeks before we came into town there were four murders, so it was tense. We were well received, but it wasn't a place where you could have a huge crew or lay dolly tracks or have honey wagons standing by. There wasn't the budget for that in the first place, and it would have drawn attention that we didn't want. I kind of forced our DP, Gianny Trutmann, to hand-hold the camera a lot. I didn't realize how insanely heavy the Red camera is when it's fully rigged – I think it weighed 45 pounds by the time we got it outfitted. And it's an uncomfortable camera to hand-hold. It has metal parts jutting out from all sides. It looks like a little robot. But the pictures we were getting were stunning.

Since then, I've tried a new shoulder mount from Element Technica that was very comfortable. The accessories we attached to the camera in the field when we were shooting in Watts were the monitor on the swivel, the hard drive, the battery pack, the lens, the matte box and rails, focus-puller knobs and two Sennheiser wireless receivers. The hard drive was unusually dense and heavy.

What were some other advantages and disadvantages of the Red, compared to the Canon?

We had to get used to [the Red's] 90-second boot time. If you'd turned the camera off and needed to do something quickly, you knew you had to leave a minute and a half for the camera to boot up, like a computer. On the other hand, you don't have to worry about tape. At the end of the day we took the hard drive and dumped the footage to another hard drive for a backup. It has really good sound, too – four channels of 48 kHz, 24-bit audio, which is great. You don't have to have double-system sound if you don't want to. And the little flat-panel monitor was very high resolution, crisp, and easy to see. It had a really razor-sharp grid so you could see the edges of your picture.
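
For context, four channels of 48 kHz, 24-bit audio amounts to a trivial data rate next to the picture, which is one reason recording it onboard is painless. A quick back-of-the-envelope calculation:

```python
# Back-of-the-envelope data rate for the Red One's onboard audio spec
# Shaw quotes: 4 channels of 48 kHz, 24-bit PCM.
channels = 4
sample_rate = 48_000          # samples per second
bit_depth = 24                # bits per sample

bytes_per_second = channels * sample_rate * bit_depth // 8
print(f"{bytes_per_second / 1_000:.0f} kB/s")               # 576 kB/s
print(f"{bytes_per_second * 3600 / 1e9:.2f} GB per hour")   # ~2.07 GB/hour
```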

What glass did you use?

We used the lens that is pretty much standard on the Red – the 18-50mm Red zoom from Cooke. It's f/2.8, which is fairly fast for a little zoom lens, and it worked just fine. It would be great to use other prime lenses on it and play with it like a film camera. I think it's a really good camera.

Were there any speed bumps?

I made a decision that because we were going out to film, we should shoot at 24 frames per second. That caused us real headaches later. It was a mistake on my part. I was thinking, “We’ll avoid pulldown.” But I wasn’t thinking about all the in-between processes we were going to have to go through to actually get the Red camera’s output to film. I spent all afternoon with Bruce Mazon at Dolby trying to get the audio to stay in sync with the picture. He ended up putting an extra pulldown on it – a 0.1 percent pulldown – to get it to lock. That was sent right over to NT Audio [in Santa Monica, CA], where they made our optical negative. Since all these systems prefer fractional frame rates nowadays, that’s what we’ll do next time.
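
The 0.1 percent pulldown Mazon applied is the standard 1000/1001 slowdown that reconciles true 24 fps with NTSC-derived playback rates. A minimal sketch of the arithmetic, assuming audio recorded at 48 kHz against 24 fps picture:

```python
# The 0.1% pulldown: picture shot at 24 fps is played at 24 * 1000/1001
# (23.976...) fps, so audio must be slowed by the same ratio to stay in sync.
PULLDOWN = 1000 / 1001

shoot_fps = 24.0
play_fps = shoot_fps * PULLDOWN            # 23.976...
print(f"playback rate: {play_fps:.5f} fps")

# Audio recorded at 48 kHz has to be re-clocked (or resampled) so the same
# samples stretch over the longer running time:
record_rate = 48_000
effective_rate = record_rate * PULLDOWN    # ~47952.05 Hz
print(f"effective audio rate: {effective_rate:.2f} Hz")

# Without the pulldown, the drift over a 5-minute piece would be:
duration_s = 5 * 60
drift = duration_s * (1 - PULLDOWN)        # ~0.3 s, visibly out of sync
print(f"sync drift after 5 minutes: {drift:.2f} s")
```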

Why did you shoot with the Red at 2K instead of 4K?

A lot of people are donating their time. I didn't want to do a 4K shoot and worry about the extra time they'd spend rendering all that footage. It wasn't necessary for what we were doing, and the pictures looked fine – the latitude was really good. If you shot something bright, it didn't all go white the way it does with today's HDV cameras. I think it has great potential. I'm really interested in the Scarlet, not so much because of the price but the size and weight. For the kind of stuff we find ourselves doing so often, it will be less cumbersome to take into sensitive areas. And sometimes if you're trying to interview someone and you have this gigantic, imposing camera, it freaks people out. A smaller camera is a little less imposing.
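
Shaw's concern is easy to quantify: a 4K frame carries four times the pixels of a 2K frame, so rendering, copying, and storage all scale accordingly. A rough comparison, assuming the Red One's 16:9 raster sizes:

```python
# Rough pixel-count comparison between the Red One's 4K and 2K 16:9 rasters.
res_4k = (4096, 2304)
res_2k = (2048, 1152)

px_4k = res_4k[0] * res_4k[1]
px_2k = res_2k[0] * res_2k[1]
print(f"4K frame: {px_4k:,} px")          # 9,437,184 px
print(f"2K frame: {px_2k:,} px")          # 2,359,296 px
print(f"ratio: {px_4k / px_2k:.0f}x")     # 4x the data to render and store
```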


Kappa Studios

Q&A: Avid DS artist Igor Ridanovic

F&V: Was it difficult to use an Avid system with Red footage?

Igor Ridanovic: Avid DS works with industry film formats – as a finishing format, DPX is a SMPTE standard that is used by all systems. So with Avid's extensive metadata management and the capability of Avid DS systems to support DPX, a workflow can be put together for the Red using the Red tools to output DPX files from the original .R3D files. For this particular project, we did the offline edit in Final Cut Pro, working directly with proxy files. Then, using Crimson Workflow, a piece of software written by Ian Bloom, we took the XML out of Final Cut and translated it into XML that RedCine understands. We didn't really do a one-light correction, but I wanted to make sure nothing was getting crushed or clipped. We exported those files, with handles, as DPX files, and took an EDL from Final Cut. And then we brought the EDL and the DPX files into the Avid DS and conformed it.
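
None of the tools Ridanovic names are scripted this way, but the bookkeeping behind the DPX export is easy to illustrate. Here is a minimal sketch of the handle padding, assuming 24 fps timecode and a hypothetical 12-frame handle:

```python
# Timecode bookkeeping behind a conform: each edit event's source in/out
# points are padded with handle frames before the DPX export, so the
# finishing system has spare frames for trims and dissolves.
FPS = 24  # the project's frame rate

def tc_to_frames(tc: str) -> int:
    """Convert 'HH:MM:SS:FF' timecode to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

def frames_to_tc(frames: int) -> str:
    """Convert an absolute frame count back to 'HH:MM:SS:FF'."""
    s, f = divmod(frames, FPS)
    m, s = divmod(s, 60)
    h, m = divmod(m, 60)
    return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

def pad_with_handles(src_in: str, src_out: str, handle: int = 12):
    """Return the padded source range to pull as DPX frames."""
    return (frames_to_tc(tc_to_frames(src_in) - handle),
            frames_to_tc(tc_to_frames(src_out) + handle))

print(pad_with_handles("01:02:10:08", "01:02:14:20"))
# ('01:02:09:20', '01:02:15:08')
```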

It’s very much the same as conforming scanned DPX files from a film scanner. The only difference is with the film files you would also see the feet and frame counts in the metadata, and in this case there’s no such information. But they’re conformed based on the reel name and timecode. It was a fairly smooth process. The project was shot and conformed at 2K. We did color timing in the DS, and also four or five effects shots. That’s why this workflow is interesting. You could do most of these steps in Final Cut and take it to Color to do color timing. But some of the visual effects we had to do would be difficult, if not impossible, to do in Final Cut.
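
The conform itself is essentially a lookup: each EDL event names a source reel and a timecode range, and each DPX sequence carries the same reel name and timecode in its metadata. A sketch of that matching logic, with hypothetical reel names and frame counts:

```python
# Conforming by reel name and timecode: for each EDL event, find the DPX
# sequence with the same reel name and verify its timecode range covers
# the event. All reel names and frame values here are hypothetical.
dpx_sequences = {
    # reel name -> (first, last) absolute frame of the available media
    "A004_C012": (89_516, 89_648),
    "A004_C015": (120_000, 120_480),
}

edl_events = [
    {"reel": "A004_C012", "src_in": 89_528, "src_out": 89_636},
    {"reel": "A004_C015", "src_in": 120_100, "src_out": 120_200},
]

for ev in edl_events:
    start, end = dpx_sequences[ev["reel"]]
    ok = start <= ev["src_in"] and ev["src_out"] <= end
    print(ev["reel"], "conforms" if ok else "MEDIA MISSING")
```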

What kind of effects?

They were invisible effects. In one interview, someone's hand was moving distractingly in and out of the corner of the shot. We painted that out. There were a couple of shots made with the Canon camera, which wasn't synchronized with the computer monitor behind the person being interviewed, so there was a distracting flicker. We just took a freeze-frame of a Pro Tools desktop and comped that into those shots. We tracked it, put some grain on it, and matched it pretty well. There were also some off-speed shots. The DS has a pretty good off-speed tool called Time Warp that interpolates and builds frames that were never there.

Where did it go from there?

When the whole thing was done, it was exported as 16-bit linear TIFF files and delivered to Efilm for film-outs. They did a couple of tests for us, and I was pleasantly surprised. It didn’t take a lot of tweaking on their part. It was a little less saturated than what we had on our screen, but when you understood what the difference was, it worked out pretty well.
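
Leaving aside any log-to-linear transform, the repack from 10-bit code values into a 16-bit container is simple scaling. A minimal numpy sketch, assuming 10-bit sources and with random data standing in for real pixels:

```python
import numpy as np

# Repacking 10-bit code values (0-1023) into a 16-bit container (0-65535).
# A hypothetical 2K frame of random 10-bit data stands in for real pixels.
frame_10bit = np.random.randint(0, 1024, size=(1152, 2048), dtype=np.uint16)

# Scale so that full-range 10-bit maps to full-range 16-bit; widen to
# uint32 first so the multiply cannot overflow.
frame_16bit = (frame_10bit.astype(np.uint32) * 65535 // 1023).astype(np.uint16)

print(frame_10bit.max(), "->", frame_16bit.max())   # 1023 -> 65535
```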

Why did they shoot 2K instead of 4K?

I suggested we shoot 2K to better match the Canon material. When you shoot 2K with the Red camera, you use only a discrete 2K portion of the imaging sensor. You don't use the whole sensor and then scale the picture down to 2K. It gets cropped electronically, so all the pixels outside of the 2K window are not being used. That reduces the physical size of the sensor area in use and thus increases the depth of field, so it matches the Canon material a little better and looks less like 35mm film and more like 16mm. It's kind of a feel. The whole piece is very gritty, down-and-dirty South Central rap, with a lot of hand-held footage.
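
Windowing 2K out of a 4K sensor halves the active sensor width; with the framing matched, depth of field roughly doubles. A worked example using the standard close-focus approximation, with hypothetical focal lengths and circles of confusion:

```python
# Approximate depth of field: DoF ~ 2 * N * c * s^2 / f^2
# (valid when the subject distance s is well inside the hyperfocal distance).
def dof_m(f_mm: float, N: float, c_mm: float, s_m: float) -> float:
    f, c = f_mm / 1000, c_mm / 1000    # convert to meters
    return 2 * N * c * s_m**2 / f**2

# Hypothetical numbers: same framing of a subject 3 m away at f/2.8.
# Full 4K sensor: 50 mm lens, CoC ~0.030 mm. A 2K window (half the sensor
# width) needs a 25 mm lens for the same framing, and the acceptable CoC
# halves (~0.015 mm) because the image is enlarged twice as much.
full = dof_m(50, 2.8, 0.030, 3.0)
crop = dof_m(25, 2.8, 0.015, 3.0)
print(f"4K full sensor: {full:.2f} m in focus")   # ~0.60 m
print(f"2K window:      {crop:.2f} m in focus")   # ~1.21 m, roughly double
```

The doubled depth of field is exactly the "less like 35mm, more like 16mm" feel Ridanovic describes, since 16mm's smaller frame behaves the same way.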

Was it difficult to make the footage from the two cameras match?

The main problem with the Canon stuff was that a lot of it was shot really fast and had clipped whites. You can't really repair that. All you can do is blow out your Red material to match. And I did that to some extent – I hate doing that, but we let some skies blow out completely to match better. This piece is not so much about stunning visuals as about the story and the content. Seeing the Canon HD material projected on the big screen in the context of gritty images, it doesn't look bad. Efilm is very good at getting out material that was shot in Rec. 709. Their experience goes back before HD, to people shooting Beta SP. They figured out good ways to get from the video world to a film negative.

Another interesting thing is that the project was shot at 24 frames per second instead of 23.98. That makes it difficult to get out to a standard-definition DVD or Digibeta: you have to convert the frame rate to 23.98, and for the most part it's not a quick process. Just last week we had a couple of filmmakers in pre-production for Red projects and we were discussing this. The producer had been advised by several people to steer clear of 24 and shoot 23.98 instead. It's not the most intuitive thing. If you're producing something for theatrical release, and you don't do any dailies other than QuickTime clips and things like that, it's perfectly fine to shoot at 24 frames per second. But as soon as you have to get it out into the world of regular video, that's where problems arise.
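
The conversion Ridanovic describes is a 0.1 percent retime: the frames play 1000/1001 slower, the runtime stretches, and the audio has to be resampled to remain at a standard rate. A quick sketch of the numbers for a hypothetical five-minute piece:

```python
# Converting a true-24 fps master for 23.976 video: the same frames are
# simply played 0.1% slower, so the runtime stretches and the audio must
# be resampled to stay at a standard 48 kHz.
frames = 5 * 60 * 24                        # a 5-minute piece = 7,200 frames

runtime_24   = frames / 24.0                # 300.0 s
runtime_2398 = frames / (24 * 1000 / 1001)  # 300.3 s
print(f"runtime grows by {runtime_2398 - runtime_24:.2f} s")

# The original 48 kHz audio now has to fill the longer runtime, so it is
# resampled by 1001/1000 before the layback:
samples_in  = int(runtime_24 * 48_000)
samples_out = int(samples_in * 1001 / 1000)
print(f"{samples_in:,} samples -> {samples_out:,} samples")
```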

How else did using the Avid DS system help?

The Avid DS system is different from all the other Avid products, because it’s designed as a one-stop shop for online effects, compositing and graphics, and hi-res conform, including HD-RGB and 2K/4K. It was actually developed by Softimage, which is an Avid company, and is used by post facilities and broadcasters for high-end projects. In fact, Avid just announced Avid DS version 10 this month. It’s powerful, because it combines Avid-like editing tools with a great effects toolset. The most immediate comparison would be [Autodesk] Smoke or Flame – and I guess Flame doesn’t really have that kind of editing capability. You can do pretty heavy-duty compositing on the DS. Working on feature films a lot over the last couple of years, I’ve noticed that, even if you’re in a color-timing session, DPs and directors will spot something they weren’t able to see on small screens. Someone is in the shot who isn’t supposed to be there, or any number of things. You can do those quick fixes in DS really fast. You might be working on color, but you can stop and just do a quick comp and take something out or add something in, track a logo out of the shot or replace something.

Just the other day I was working on a film where the director felt that the depth-of-field separation between the protagonist and the background was insufficient. This was something he was unable to spot on a small screen during the offline edit. We quickly created an articulate matte for the protagonist and applied some optical blur to the background. The whole process took just minutes in the DS, and the result really helped the story.
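
That fix is a classic composite: blur the whole frame, then use the matte to hold the subject from the sharp original. A minimal numpy sketch of the idea, with a crude box blur and a hypothetical rectangular matte standing in for the DS tools:

```python
import numpy as np

def box_blur(img: np.ndarray, radius: int = 4) -> np.ndarray:
    """Crude separable box blur standing in for the DS's optical blur."""
    k = 2 * radius + 1
    kernel = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, "same"), 0, out)

# Hypothetical grayscale frame and a matte that is 1.0 over the subject.
frame = np.random.rand(270, 480)
matte = np.zeros_like(frame)
matte[60:220, 160:320] = 1.0        # roughly where the protagonist stands

# The composite: the matte holds the subject from the sharp frame,
# everything else comes from the blurred copy.
result = matte * frame + (1.0 - matte) * box_blur(frame)
```

In practice the matte would be an articulated roto shape with a soft edge rather than a hard rectangle, but the composite arithmetic is the same.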

To see the complete five-minute short, visit www.pinlight.com/redwordz.htm

For more information on Kappa Studios, visit www.kappastudios.com