Perhaps I Was a Bit Ambitious

The movie I’m working on now involves two shows, each running 5 simultaneous 1080p angles. The source material is AVC-Intra shot with 3700 Varicams. When I brought the footage into FCP, I converted it to ProRes HQ because I thought our system could handle it. We bought a CalDigit HD Element just for this movie, and the machine is a year-old 8-core Mac Pro. We played through both shows, watching all 5 angles at once and playing out 1080i through the Intensity Pro, and never skipped a frame. Once I started editing, though, trouble appeared almost immediately. Every once in a while a dark green frame would suddenly appear in the Viewer or Canvas window, and FCP would usually crash immediately after. When it didn’t, I’d save and shut down the program anyway as soon as I saw the green frame.

I went through driver updates on the Intensity and the Element, tried rolling back QuickTime to 7.5.5 and FCP to 6.0.4 (which is hard to do, since Apple doesn’t let you download old update files; save those things, kids). Nothing helped. But the word on the street is that ProRes HQ is still pretty damn fancy: cutting 5 simultaneous angles of it is a bit too much for some component of the computer to handle, and you’re unlikely to see any difference between HQ and SQ anyway. So I re-transferred everything to ProRes sans HQ. No dice. Still crashing every 15 minutes, although the green frame showed up less often.

On Friday I decided to use the Media Manager to transcode everything to 720p DVCPRO HD. It was estimating 26 hours of encode time when I left. This morning I arrived at work with a fully functional project with everything properly linked up, and it plays perfectly. I edited all day without a single crash or green flash. Even better, I’m able to play out the multiclips to the HD monitor at full quality. With ProRes I could only do medium or low. Hooray for DVCPRO HD! And hooray for a fully functional Media Manager!
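For a sense of scale, here’s a rough comparison of what 5 simultaneous streams ask of the system. This is a minimal sketch in Python; the bitrates are approximate published target rates for each codec, not measurements from my setup:

```python
# Back-of-the-envelope aggregate data rates for 5 simultaneous angles.
# Bitrates are approximate published targets (Mb/s), not measured values.
CODEC_MBPS = {
    "ProRes 422 HQ (1080)": 220,
    "ProRes 422 (1080)": 147,
    "DVCPRO HD (720p)": 100,
}

ANGLES = 5

for codec, mbps in CODEC_MBPS.items():
    total_mbps = mbps * ANGLES
    total_mb_per_sec = total_mbps / 8  # megabits -> megabytes
    print(f"{codec}: {total_mbps} Mb/s total (~{total_mb_per_sec:.0f} MB/s)")
```

Five angles of HQ is more than twice the aggregate rate of five angles of DVCPRO HD, which at least makes it plausible that some component was choking.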

Spanned P2 Clips with Timecode Past Midnight

So one of those things that doesn’t come up much, but is really important, is the prohibition against letting timecode go past midnight. Once it gets past 23:59:59:23 (or 29 or 24 depending on your timebase) it goes to 00:00:00:00. If that happens, how does your timecode-based editing system know that the footage with lower numbers comes after the footage with higher numbers? It’s a timecode break. Computers aren’t good at guessing.
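A quick sketch of why the wrap is so poisonous to a computer: convert timecode to an absolute frame count and the frame right after the last frame of the day comes out lower than everything shot before it. This is a toy example at a 24-frame timebase, non-drop:

```python
FPS = 24  # frames per second of the timebase (non-drop)

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert HH:MM:SS:FF timecode to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def frames_to_tc(frames: int, fps: int = FPS) -> str:
    """Convert a frame count back to timecode, wrapping at midnight."""
    frames %= 24 * 60 * 60 * fps  # timecode rolls over at 24 hours
    f = frames % fps
    s = frames // fps
    return f"{s // 3600:02d}:{(s // 60) % 60:02d}:{s % 60:02d}:{f:02d}"

# One frame past the last frame of the day wraps back to zero:
print(frames_to_tc(tc_to_frames("23:59:59:23") + 1))  # -> 00:00:00:00
```

Any sort by timecode now puts the later footage first, which is exactly the kind of guessing computers are bad at.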

I ran into this problem recently with a multi-camera P2 shoot using time-of-day timecode on a 2-hour show that started at 11pm. The timecode started at 23:00:00:00 (approximately) and ended around 01:00:00:00. That’s no good for FCP. What we should have done was start the timecode at 11:00:00:00 instead, but the show started late, we were supposed to be done before midnight, and nobody remembered it would be a problem. The bigger problem was that since these were 2-hour AVC-Intra clips recorded on P2 cards, everything was spanned over about 17 clips on each camera. And since the timecode reset, FCP couldn’t figure out how to combine the spanned clips into the one clip I wanted.

I could have imported all the individual clips with Log and Transfer, laid them out on a timeline, and then exported that timeline as one big QT file, but that would take forever to import and export since the files are so big. What I did instead, thanks to an idea from David Wulzen at Creative Cow, was edit the start timecode in the Contents/Clip/*******.xml files for all 70 of the clips I wanted to span, and now FCP joins them up with no problem. Hooray for the Internet!
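Here’s a minimal sketch of the kind of edit involved, assuming the start timecode lives in a `StartTimecode` element. That tag name is an assumption; check one of your clip XMLs for the real element, and work on copies of the files:

```python
# Sketch: shift the hours field of each clip's start timecode so the
# spanned clips ascend monotonically instead of wrapping at midnight.
# The <StartTimecode> tag name is an assumption -- verify it against
# your own Contents/Clip XML files first, and back them up.
import xml.etree.ElementTree as ET

def rebase_hours(xml_path: str, shift: int = -12) -> None:
    """Shift the hours of every StartTimecode element, mod 24.

    A -12 shift turns 23:xx into 11:xx and 01:xx into 13:xx, so a show
    that ran 23:00 -> 01:00 becomes 11:00 -> 13:00 with no wrap.
    """
    tree = ET.parse(xml_path)
    for elem in tree.iter():
        if elem.tag.endswith("StartTimecode") and elem.text:
            hours, rest = elem.text.split(":", 1)
            elem.text = f"{(int(hours) + shift) % 24:02d}:{rest}"
    tree.write(xml_path)

# Hypothetical usage over all the clip XMLs (pattern illustrative):
# import glob
# for path in glob.glob("Contents/Clip/*.xml"):
#     rebase_hours(path)
```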

More Editing With Canon 5D Mark II

We shot 3 days with the Canon 5D last week. It looks awesome. I highly recommend it. Here was our workflow:

1. Record separate audio at 48,048 Hz, stamped as 48,000 Hz. This is possible with some audio recorders even if you’ve never noticed it before. Check your manual.

2. Copy contents of CF card to hard drive.

3. Convert h.264 QTs to ProRes HQ QTs using Compressor.

4. Use Cinema Tools to batch conform the QTs from 30 fps to 29.97.

5. Sync in FCP.

6. Edit!
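The math behind step 1, for the curious: conforming 30.0 fps video to 29.97 and playing 48,048 Hz audio as 48,000 Hz are the exact same 0.1% slowdown, which is why picture and sound stay in sync after step 4.

```python
# Conforming 30.0 fps to 29.97 slows playback by a factor of 1001/1000.
# Audio recorded at 48,048 samples/s but stamped as 48,000 is slowed by
# exactly the same factor, so the two stay locked together.
from fractions import Fraction

video_slowdown = Fraction(30) / Fraction(30000, 1001)  # 30.0 -> 29.97 fps
audio_slowdown = Fraction(48048, 48000)                # recorded vs stamped

assert video_slowdown == audio_slowdown == Fraction(1001, 1000)
print(video_slowdown)  # 1001/1000, i.e. a 0.1% slowdown
```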

Reasons to Edit in 24p

I’ve spent a lot of time on this blog writing about 24p editing because it’s so complicated and misunderstood. Last year I wrote about shooting 24p but editing 29.97, arguing that nobody is going to notice the difference. This year I want to write about the reasons to go through the trouble of shooting and editing 24p. And, as always, 24p = 23.98 fps.

1) Blah, blah, blah, film blowups. My big pet peeve about 24p discussions is the obsession with film blowups. First there was the completely false idea that shooting 24p “advanced” was somehow better than 24p “regular” for doing film blowups. I hope nobody believes that anymore. As long as you use the right workflow, there is absolutely no difference in the end product. The more pervasive rumor is that the only time it makes sense to edit in 24p is when you’re going to do a film blowup. This is also false, for reasons I’ll get into below. And who the hell is wasting their money by blowing video up to film anymore?

2) Computers. Here’s my big reason for progressive 24p editing. A lot of video is made for computer displays these days, and computers and interlacing go together like two things that don’t go together. If you’re going to show your film on the web, it’s going to look a lot better at 24p than 29.97 with pulldown in it. And considering that a lot of web video is higher quality than DVD at this point, you’ll really appreciate the boost.

3) DVDs. If you make a 23.98 QuickTime and compress it to MPEG-2, it will play perfectly on any DVD player. If your DVD player can upconvert and output 24p via HDMI, it might actually play it that way on your 24p HDTV. If you play the DVD on a computer, you won’t see any interlacing. And, since DVD encoding is generally based on average megabits per second, the fewer frames you have in a second, the more data goes to each frame.

4) Educational. Editing 24p video has taught me so much about the way video works. I worry that computers are so easy to use these days that kids who didn’t grow up having to create config.sys boot menus in order to play Doom won’t really get under the hood of their computers and learn what they’re doing. In the same way, if video just works (like it used to) then you can edit for years without really knowing what you’re doing on a technical level. I like to know how things work, and I think it’s valuable for more people to know. The proliferation of incompatible video formats may be infuriating, but it requires people to learn about technology in a really useful way. It also helps me pay my rent on time every month.
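Going back to the DVD point for a second, the bits-per-frame arithmetic is easy to check. The 6 Mb/s average bitrate here is purely illustrative:

```python
# At a fixed average bitrate, fewer frames per second means more bits
# per frame. Dropping from 29.97 to 23.976 fps gives each frame about
# 25% more data at the same average rate.
BITRATE = 6_000_000  # average bits per second (illustrative)

for fps in (29.97, 23.976):
    print(f"{fps} fps -> {BITRATE / fps / 1000:.1f} kb per frame")
```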

Post with Canon 5D Mark II

I did some tests yesterday with footage shot on Canon’s fancy DSLR, the 5D Mark II. It records 1080p30 video compressed with H.264. The look of it is incredible; using a real, expensive lens makes a big difference. There are some minor compression artifacts and some small but ugly noise in very low light, but I generally can’t fault the quality of the image. Of course, there are some major drawbacks for anyone who wants to shoot a movie with it, and not just upload pretty shots to their Vimeo account.

The basic workflow is this: Copy the H.264 mov files from your CF card, then convert them to an editable codec. If you’re mixing footage with other cameras, convert it to that format. I’m going to be working in ProRes HQ, so that’s what I converted to. I used Compressor, and it went pretty quickly.

The big problem I ran into is the frame rate. It shoots only at 30.0 frames per second, which is incompatible with every other video format I work with. If you’re going to finish in regular old NTSC 29.97, you can easily use Cinema Tools to batch conform the 30.0 files to 29.97. It’ll take no time at all. If you use the onboard audio, everything will stay in sync. But if you’re shooting double system (which I would recommend) then you’ll have to slow the audio down 0.1% before you sync it up. You can read up on that process in another post. If you’re shooting the rest of your film at 23.98 like we are, then you’ll have to do some serious frame-rate conversion. Right now I’m planning to cut with G Film Converter turned on for preview purposes, then we’ll pay to run the final cut of the effect-free 29.97 video through an Alchemist to get a sharper conversion.
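To see why double-system audio needs that 0.1% slowdown, note what the conform actually does: it keeps every frame and just plays them slower, so the picture gets longer while the real-time audio recording doesn’t. A minimal sketch:

```python
# Conforming changes playback rate, not frame count: one minute of
# 30.0 fps footage keeps all 1800 frames but plays 0.1% longer.
frames = 60 * 30               # 1800 frames shot at 30.0 fps
conformed_fps = 30000 / 1001   # 29.97 fps

# The recorder's real-time audio for that take is still 60.00 s long,
# which is why it has to be slowed 0.1% before syncing.
print(round(frames / conformed_fps, 2))  # prints 60.06
```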

This is Bolex Stereo

When I graduated from college, my dad and his wife gave me a 16mm Bolex camera from the 1950s. It was a neat gift, but the really unique thing about it was the Stereo Kit that came with it: a complete set of stereo lens, projector lens with polarizing filters, and a small silver projection screen. The system works by putting two tall, skinny images side by side on each frame of film. When it’s projected, they are offset and overlapped, with each one polarized differently, just like a fancy new 3-D movie. Rather than being widescreen, though, the image is tall and skinny. Unfortunately, in the past I haven’t had the time and money available to get the system going.

The major thing missing right now is a 16mm projector that will take the 3-D lens. I can’t quite figure out what kind of projector it even needs to be, and despite being completely obsolete, they’re not always free. In the research I’ve done over the years I’ve heard that the polarizing filters in the projection lens tend to degrade over time, and this lens definitely looks a little wonky. If that’s the case, I’m going to have to figure out how to replace the filters, and figure out what orientation they go in, since they have to match the orientation of the glasses. I have brand-new 3-D glasses provided by Coraline, which I’m pretty sure work on the same principle as the Bolex system.

And of course I’ll need to get a 100′ load of 16mm film and run it through the camera. That’s not exactly free either. Being a wind-up Bolex, sync sound isn’t an option (I also don’t have a dual-system projector lying around, or a way to sync it up in the first place) so I’m thinking a series of silent sight-gags involving things flying at the camera. To save money I’m going to shoot reversal, which I haven’t shot since way back in the year 1999. Apparently Kodak stopped making the higher speed color reversal stock, so I’m considering shooting Tri-X 200D B&W reversal. I’m not entirely sure the system will work with color film anyway. That will be an additional experiment I’m sure. A 100 foot roll costs $25. Processing will probably run another $25. Oh, and I guess I’ll need a light meter. It’s also not clear that the camera will run well without repairs. Last time I looked into it I was told it needed about $200 worth of work on it.
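For the record, here’s the back-of-the-envelope math on that roll, using the ballpark figures above and 16mm’s 40 frames per foot:

```python
# Rough runtime and cost for a 100 ft roll of 16mm at sound speed.
# Dollar figures are the ballpark estimates from the post.
FRAMES_PER_FOOT = 40  # standard for 16mm film
feet, fps = 100, 24

seconds = feet * FRAMES_PER_FOOT / fps
print(f"{int(seconds // 60)} min {seconds % 60:.0f} s of footage")  # 2 min 47 s

stock, processing, repairs = 25, 25, 200
print(f"${stock + processing + repairs} before a light meter")
```

So under three minutes of screen time for around $250 all in, before the light meter. Experiments aren’t cheap.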

Making Money With Short Films

About two years ago I wrote a post entitled Why Make Short Films? which has become one of the more popular posts on my blog. A lot has changed in those two years, and I want to write some more about what the average young filmmaker can expect when setting out to make films.

First off, unless you live in Europe, don’t expect anyone to give you money to make a short film. You and your friends will have to do this on your own. And yes, you need friends. You need talented people who will work for less than they’re worth, because you can’t afford to pay strangers the amount of money they deserve.

Keep the costs down as low as you can. Learn all you can about the camera options available. These days you can do amazing stuff with some cheap HD camcorders. Definitely shoot HD; DV is not acceptable. 720p is fine. It’s the default resolution of HD on the web. I used to be able to recommend cameras, but I just can’t keep up with it anymore. A very good Hollywood DP is planning to shoot a portion of a film I’m editing on the Canon EOS 5D Mark II, a DSLR still camera that also shoots HD. You probably can’t afford to pay your crew, but you must buy them meals. Having bagels and coffee on the set in the morning really raises morale, and lunch is essential. If you’re shooting late, order some pizza.

Edit the film yourself. It sounds strange coming from a professional editor, but anyone can edit a movie these days. The only cost should be your time. Again, do your research. If you shot 24p, learn everything about what that means for your workflow before you start shooting, and for God’s sake at least before you start editing. Cut it with whatever you feel comfortable using. I hear iMovie is incredibly full-featured these days, although I can barely make the thing work.

Once you’ve finished the movie, put it out every way you can. Don’t be a dope and hold back your premiere for fancy film festivals. Film festivals are 20th Century relics. Sundance isn’t going to show your short, and even if it does, nobody watches the shorts there unless a famous person is in one of them or was seen near the venue at the time of the screening. Apply to some local festivals and some bigger names, but applying to every festival you can will cost you way too much money. I spent about $1000 sending Kalesius and Clotho to film festivals. It got me a few awards to put on the DVD box, but never any money.

Put it on YouTube. Get yourself enrolled in their Partner Program. I’m pulling in a few bucks a day with that. Put it on Vuze. It was a strange and unique set of events, but I made over $2000 from Vuze’s pre-roll ads in a single quarter last year. Since then I’ve made about a dollar a day. Try Revver. I made a few bucks from them a year ago, but haven’t seen any since then. Blip.tv supposedly has revenue sharing, but I haven’t seen any hits or cash from them at all. Make a DVD and sell it on your website. You can burn them yourself and print full-color discs with an awesome Epson R280. Or if you want to make less money but spend less time, use Createspace to get them on Amazon. I’ve sold one DVD of my collected short films. Try merchandising. T-shirts are the true heart of our economy. I have sold exactly no t-shirts of my own logo, but other films might lend themselves to catchphrases or funny graphics that fans would like to own.

At this point I have made back the cost of producing Two Night Stand, which I shot 4.5 years ago. Most of the cast and crew didn’t get any money, and I haven’t been paid for all the time I spent writing, directing, and editing the movie. That doesn’t exactly qualify as a raging success, but it’s more than I ever hoped for. The problem I’m having is that there is an insatiable desire out there for more and more content. I could make a lot more money if I continued to put out videos. Unfortunately I just can’t keep up the pace. If you can be prolific, you are much more likely to build a steady fanbase who talk about and anticipate your new films.

Seeing in Three Dimensions


As long as I can remember I’ve had a bad left eye. With both eyes open I can see just fine, but if I close my right eye I can’t read what I’m typing here. I’ve gone to several optometrists over the years, and they all told me if they corrected the left eye then I started seeing double, so I shouldn’t worry about it too much since I can read and edit movies just fine without glasses.

In December I finally went to an optometrist who made a real effort to correct the problem, Dr. Justin Bazan of Park Slope Eye. He came up with a prescription that seemed to work for me, but he wanted to make sure so he sent me to the University Optometric Center at SUNY. I went there yesterday and was subjected to a battery of tests by a large team of optometry students and doctors. Eventually they had me wear a pair of ridiculous mad scientist glasses with the prescription they had chosen.

Sitting down, everything seemed normal: definitely sharper than before, but nothing special. Then they had me walk around, and I realized I haven’t really been seeing the world in three dimensions. I’ve been ignoring most of the input from one eye and flattening everything out. I suspect that has something to do with why I was so bad at baseball. And I don’t want to read too much into this, but I wonder if the fact that movies have apparently looked as flat as the rest of the world to me is part of what drew me to them in the first place. If they don’t look any less real than the real world, that could make a real difference in the way I connect to flat images. It’s something to think about, anyway. I’m curious to see how things change once I get my glasses (specialty lenses like mine take a little time) and can actually see in three dimensions all day. It’s very exciting.

Performance Films!

Through no fault of my own, I’m quickly getting a lot of experience cutting recordings of live performances. Last September I started small with a bunch of online videos recapping New York Fashion Week. It was all single-camera footage, with a lot of quick-cutting and jump-cutting. I think it was the first time I ever found myself using the quick-flash-to-white transition so popular with the kids today.

In October I started cutting some Jerry Seinfeld stand-up performances, which were shot with three cameras. I synced up the three cameras and used multicam editing in FCP, which turns editing into a totally different animal. Now, rather than assembling a scene shot by shot, you can kind of wade through the stream of images and go with your gut to pick the nicest angle of the ones available, then revise to your heart’s content. I had cut some stand-up before, in my very early film about Tim McIntire, but it was all montage-based, with very little spatial continuity between shots. Learning where to cut in Jerry’s movements was very interesting. He’s not a relentless pacer like Chris Rock, but on particular beats he turns his body to address different parts of the audience, and he does move back and forth a bit. It’s something he’s obviously thought a lot about, and as an editor it’s not something I wanted to get in the way of. I wanted the changes in camera angle to stand in for, and highlight, the changes in focus he’s giving to the various parts of the room. At first I wanted him to almost complete a turn before I cut, but I found that anticipating a move by a few frames could be very effective, so he turns into the new angle rather than already being there. Of course cutting in the middle of the action often works too. It all depends on the context.


Almost immediately after I started the Seinfeld project, I cut the film version of Hal Hartley’s staging of Louis Andriessen’s new opera La Commedia. In the spring I edited a 5-screen movie that was projected during the performances, and two of those performances were filmed with two cameras. So the material we had to work with was the original movie footage, and up to 4 different angles of the performance. Unfortunately, good audio recordings of the shows that were videotaped did not exist; only the premiere had a good audio mix. So I had to get very creative with the editing. I could only hold on a performer singing for a few seconds (if I was lucky) before the shot would start to drift, and I’d have to slip each shot a few frames in order to keep everything in something close to sync. There was always the question of whether to show some of the stage or some of the movie. In the theater you can have 10 different things going on at once, but in the film we just had one at a time. We considered doing split-screen for a while, but it never really seemed like the right thing to do. The whole thing is confusing enough as it is, since there are two related, but slightly different, plots going on at the same time between the movie parts and the staged parts. Eventually we worked out a method, and I think it was by far the best work I’ve done on anything.

Cheech and Chong Tour

Next up, and very exciting, is a recording of Cheech & Chong’s Light Up America tour. In March they’re going to shoot two performances with around 5 cameras each, plus some backstage action with the two gentlemen. I’ll be editing the whole thing myself. I can’t say too much more about it, but I think it will be a very cool project. Definitely the highest profile thing I’ve worked on. There will be a lot more angles to work with for the performance, and it’s all being supervised by a great DP. I expect we’ll have good, in-sync sound recordings as well.