Apple and Arrogance

[image: Apple IIc case]

I’m happy to hear that new MacBooks will have FireWire ports. It was a dumb idea to get rid of them in the first place, and Apple seems to have listened to its customers in this case. A concerned customer emailed Steve Jobs last year about the lack of FireWire in the MacBook, and Jobs responded, “Actually, all of the new HD camcorders of the past few years use USB 2,” which is patently false. What he was referring to were the consumer HD cameras that are gaining in popularity, but that leaves out HDV, which is also rather popular. It also leaves out the vast legacy of FireWire cameras still out there, not least of which is the venerable DV camcorder. And what about hard drives? Sure, USB 2.0 can hold up fine for editing DV, but if you’re planning to edit HD, even if your camcorder uses USB 2.0 for transport, you’re going to want a FireWire hard drive while you’re editing.

This argument is old now, and it’s not an issue for people who didn’t buy the sans-FireWire MacBooks, but it’s the arrogant attitude of knowing what’s best for people—and dropping legacy support—that drives me crazy. It happened to my beloved Apple II in the early ’90s (OK, that one might have been for the best), and now it’s moving into my beloved MacBook Pro.

Apparently the new MacBook Pros don’t have any expansion slots. The old G4 PowerBooks had PC Card slots, like everyone else. Those were great, and were awesome for P2 card loading. Then the MacBook Pro moved to ExpressCard, which meant I needed a slightly dodgy adapter for P2 card loading, but it still works great. And if I ever did any work with SxS cards, they would work without an adapter. But now the MacBook Pros have an… SD card slot?

The only word for this is downgrade. SD cards are media storage devices. Sure, they’re very popular, but you can get a USB SD card reader for $6 at Newegg. You know what you can’t get at Newegg? A direct connection to the PCI-Express bus. That means you can’t put in an additional Firewire bus for peripherals that require a dedicated bus, and you can’t get eSATA.

I’m sure there are lots of uses I’m not thinking of, but the point of giving direct access to the PCI-Express bus is that developers can come up with any crazy thing they want and get some serious speed. Do you know what you can do with an SD card slot? You can put SD cards in it. Sure, the 17″ still has an ExpressCard slot, but have you ever picked one of those things up? They’re monsters. They’re about as portable as my Apple IIc was (it had a handle). If I wanted something that didn’t fit in my laptop bag and weighed a ton, I’d carry around my desktop computer. Who is Apple to tell me I’ll be fine without ExpressCards? I want options!

And finally, I want to complain about those charming John Hodgman/Justin Long ads. I think they’re really well made, and Hodgman is a blast, but the idea that Windows-based computers constantly crash while Macs are impervious to lock-ups is ludicrous. On a bad day I can get Final Cut Pro to crash 10 or 15 times (that’s on a real Mac Pro, not my hackintosh; the hackintosh tends to be very stable). I’m sick of the false idea that Macs are perfect and worth the extra cost because they’re more stable, while Windows is cheap and you get what you pay for. I think this is the worst one:

Are there “Meghans” out there who are looking for “fast processors” and are disappointed by the speed of new PCs? My girlfriend just bought a netbook that’s significantly faster than her old ThinkPad, which already did everything she needed but has a bad battery and weighs a lot more. Computers these days are incredibly fast as long as you’re not an FPS-obsessed gamer (who is going to buy a PC anyway) or an FPS-obsessed HD video editor (who is going to need a Mac).

And let’s talk about Vista. I installed it recently, and it runs great. It’s a lot better than the barely-alive XP install I had after I tried to upgrade to SP3. The only problem I’ve had so far is a deadbeat peripheral-maker that hasn’t made a 64-bit driver for my FireWire audio interface. I’m fine using my sound card, though; I only really use the interface with Mac apps. And I understand it’s tough getting everyone on the 64-bit bandwagon. Has Apple released a real 64-bit operating system yet?

BAM and the Em Dash

[image: BAM ad using an em dash between dates]

Yes, I know BAM and the Em Dash sounds like a great title for a film, but this post is going to take a break from Obscure Topics in Film Editing and focus on Obscure Topics in Typography.

At least since I’ve lived in Brooklyn—and probably a lot longer—the Brooklyn Academy of Music has used an em dash (—) between dates rather than an en dash (–), which is the correct punctuation to use between dates. Every time I go there—or see one of their posters around the neighborhood—it drives me nuts. Three years ago I took the time to send a complaining e-mail to them. This was their response:

Hi Kyle,
Thanks for taking the time to study our ads in such detail.
We do intentionally use em-dashes even though we realize
that it is not technically grammatically correct.

The en dash is basically only used in ranges, so I guess they’re trying to drop it like an appendix. Apparently they think the em dash looks nicer, even though it clearly does not. It looks like a big ugly hunk of space that shouldn’t be there. Of course the bad kerning in the example above doesn’t help. Kudos to BAM for playing Eyes Wide Shut though!
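For anyone who wants to check which dash they’re actually typing (or setting), the two live at different Unicode code points. A quick check in Python:

```python
# En dash vs. em dash: different characters, different jobs.
en_dash = "\u2013"   # – : ranges, e.g. dates ("Jun 3–7")
em_dash = "\u2014"   # — : a break in a sentence—like this

print(f"en dash: {en_dash} (U+2013)")
print(f"em dash: {em_dash} (U+2014)")
print(f"a date range, set correctly: Jun 3{en_dash}7")
```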

An Update on my 3D Bolex Progress

Since I last posted about my Bolex Stereo system, I’ve made a bit of progress getting the whole thing together. I bought an old 16mm projector on eBay; it doesn’t quite have the right size lens mount, but the stereo lens fits inside, so I’m sure I can improvise something to make it a snug fit. I also bought an ancient Norwood Director light meter on Craigslist, but the sensor seems to be dead, so I’m going to try using my digital still camera as a makeshift meter.

I finally figured out how to take the polarizing filters out of the projection lens. They sit in a little metal ring pressed up against the glass; there are small tabs on the inside of the barrel that you can pull on gently, and the whole thing slides out. The filter is severely wrinkled and unlikely to work, so I’ve ordered a set of linear polarizing filters for 3D projection (and some classy glasses) from Berezin Stereo Photography.


I was under the impression that I needed a split reel smaller than 400′, but it turns out the split reels I’m used to using are a lot bigger than that (I think they’re 1200′), which makes sense, since what I’m picturing is just about the size of a 400′ roll of 16mm film. So I’m planning to get one from Motion Picture Enterprises this week. They’re a few blocks from the office I’m editing in these days.

We’re almost there. Next up is actually running some film through the camera. We’ll see how that goes.

Perhaps I Was a Bit Ambitious

The movie I’m working on now involves two shows, each running 5 simultaneous 1080p angles. The source material is AVC-Intra shot on VariCam 3700s. When I brought the footage into FCP, I converted it to ProRes HQ because I thought our system could handle it: we bought a CalDigit HD Element just for this movie, and it’s driven by a year-old 8-core Mac Pro. We played through both shows, watching all 5 angles at once and playing out 1080i through the Intensity Pro, and never skipped a frame. Once I started editing, though, trouble appeared almost immediately. Every once in a while a dark green frame would suddenly appear in the Viewer or Canvas window, and FCP would usually crash immediately after. Sometimes it wouldn’t, but eventually I learned that as soon as I saw the green frame I should just save and quit the program.

I went through driver updates on the Intensity and the Element, and tried rolling back QuickTime to version 7.5.5 and FCP to 6.0.4 (which is hard to do, since Apple doesn’t let you download old update files. Save those things, kids.) Nothing helped. But the word on the street is that ProRes HQ is still pretty damn fancy: cutting 5 simultaneous angles of it is a bit too much for some component of the computer to handle, and you’re unlikely to see any difference between HQ and SQ anyway. So I re-transferred everything to ProRes sans HQ. No dice. Still crashing every 15 minutes, although the green frame showed up less often.

On Friday I decided to use the Media Manager to transcode everything to 720p DVCPRO HD. It was estimating 26 hours of encode time when I left. This morning I arrived at work with a fully functional project with everything properly linked up, and it plays perfectly. I edited all day without a single crash or green flash. Even better, I’m able to play out the multiclips to the HD monitor at full quality. With ProRes I could only do medium or low. Hooray for DVCPRO HD! And hooray for a fully functional Media Manager!
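For a rough sense of why the transcode helped, here’s the back-of-the-envelope math using approximate published target data rates (1080/29.97 figures for ProRes, DVCPRO HD’s fixed rate). The real bottleneck in my setup may have been elsewhere, but the aggregate bandwidth drops a lot:

```python
# Rough aggregate bandwidth for 5 simultaneous angles, using approximate
# published target rates: ProRes 422 HQ ~220 Mbit/s and ProRes 422 ~147 Mbit/s
# at 1080/29.97, and DVCPRO HD 720p at a fixed 100 Mbit/s.

rates_mbit = {
    "ProRes 422 HQ (1080)": 220,
    "ProRes 422 (1080)": 147,
    "DVCPRO HD (720p)": 100,
}
angles = 5

for codec, mbit in rates_mbit.items():
    total_mbit = mbit * angles
    print(f"{codec}: {total_mbit} Mbit/s total (~{total_mbit / 8:.1f} MB/s off the drives)")

# ProRes 422 HQ (1080): 1100 Mbit/s total (~137.5 MB/s off the drives)
# ProRes 422 (1080):     735 Mbit/s total (~91.9 MB/s off the drives)
# DVCPRO HD (720p):      500 Mbit/s total (~62.5 MB/s off the drives)
```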

Spanned P2 Clips with Timecode Past Midnight

So one of those things that doesn’t come up much, but is really important, is the prohibition against letting timecode go past midnight. Once it gets past 23:59:59:23 (or :29 or :24, depending on your timebase) it rolls over to 00:00:00:00. If that happens, how does your timecode-based editing system know that the footage with lower numbers comes after the footage with higher numbers? It’s a timecode break, and computers aren’t good at guessing.
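To make the rollover problem concrete, here’s a toy sketch (assuming a simple non-drop-frame 30 fps timebase) of why footage shot just after midnight suddenly sorts before footage shot just before it:

```python
# Timecode is really just a frame count. Once it wraps at midnight,
# "later" footage gets a lower number, and sorting by timecode breaks.

FPS = 30  # non-drop-frame 30 fps, to keep the math simple

def tc_to_frames(tc, fps=FPS):
    """Convert HH:MM:SS:FF to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

just_before_midnight = tc_to_frames("23:59:59:29")
just_after_midnight = tc_to_frames("00:00:00:00")

print(just_before_midnight)                        # 2591999
print(just_after_midnight)                         # 0
print(just_after_midnight > just_before_midnight)  # False -- the later clip sorts first
```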

I ran into this problem recently with a multi-camera P2 shoot using time-of-day timecode on a 2-hour show that started at 11pm. The timecode started at roughly 23:00:00:00 and ended around 01:00:00:00. That’s no good for FCP. What we should have done was start the timecode at 11:00:00:00 instead, but the show started late, we were supposed to be done before midnight, nobody had planned to shoot past midnight, and nobody remembered it would be a problem. The big problem I ran into was that since these were 2-hour AVC-Intra clips recorded on P2 cards, everything was spanned over about 17 clips on each camera. And since the timecode reset, FCP couldn’t figure out how to combine the spanned clips into the one clip I wanted.

I could have imported all the individual clips with Log and Transfer, laid them out on a timeline, and then exported that timeline as one big QuickTime file, but that would take forever to import and export since the files are so big. What I did instead, thanks to an idea from David Wulzen at Creative Cow, was go in and edit the start timecode in the Contents/Clip/*******.xml files for all 70 of the clips I wanted to span, and now FCP joins them up with no problem. Hooray for the Internet!
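In case it helps anyone in the same spot, here’s a rough sketch of that kind of batch edit. Two assumptions on my part, not gospel: that your clip XML keeps its start timecode in an element whose tag ends in “StartTimecode” (check your own files first), and that shifting every clip back 12 hours (so 23:xx becomes 11:xx and 00:xx becomes 12:xx) is an acceptable way to keep the show from wrapping. Work on a copy of the card, never the original:

```python
# Rough sketch: rewrite the start timecode in every P2 clip XML so the
# timecode no longer wraps past midnight. Run it on a COPY of the card.
# Assumption: the start timecode lives in an element whose tag ends in
# "StartTimecode" -- verify against your own clip XML before trusting this.

import glob
import xml.etree.ElementTree as ET

SHIFT_HOURS = 12  # 23:xx -> 11:xx, 00:xx -> 12:xx

for path in glob.glob("Contents/Clip/*.xml"):
    tree = ET.parse(path)
    changed = False
    for elem in tree.iter():
        if elem.tag.endswith("StartTimecode") and elem.text:
            parts = elem.text.split(":")
            if len(parts) == 4:  # HH:MM:SS:FF
                parts[0] = f"{(int(parts[0]) - SHIFT_HOURS) % 24:02d}"
                elem.text = ":".join(parts)
                changed = True
    if changed:
        tree.write(path)
        print(f"patched {path}")
```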

More Editing With Canon 5D Mark II

We shot 3 days with the Canon 5D last week. It looks awesome. I highly recommend it. Here was our workflow:

1. Record separate audio at 48048 Hz, stamped as 48000 Hz (there’s a quick sketch of the arithmetic behind this after the list). This is possible with some audio recorders even if you’ve never noticed it before. Check your manual.

2. Copy contents of CF card to hard drive.

3. Convert the H.264 QTs to ProRes HQ QTs using Compressor.

4. Use Cinema Tools to batch conform the QTs from 30 fps to 29.97.

5. Sync in FCP.

6. Edit!
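For the curious, here’s the arithmetic behind step 1. The 5D Mark II records true 30 fps, the conform slows the picture to 29.97 (really 30000/1001), and audio recorded at 48048 Hz but stamped as 48000 Hz is slowed by exactly the same factor, so the two stay in sync. A quick sketch:

```python
# Why 48048 Hz audio stamped as 48000 Hz stays in sync after the conform.
from fractions import Fraction

shot_rate = Fraction(30, 1)              # the 5D Mark II records true 30 fps
conformed_rate = Fraction(30000, 1001)   # 29.97 fps, the Cinema Tools target

video_slowdown = shot_rate / conformed_rate     # how much the conform slows the picture
audio_slowdown = Fraction(48048, 48000)         # recorded rate / stamped rate

print(video_slowdown)   # 1001/1000 -> picture plays 0.1% slow
print(audio_slowdown)   # 1001/1000 -> audio plays 0.1% slow
assert video_slowdown == audio_slowdown         # same factor, so sync holds
```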

Reasons to Edit in 24p

I’ve spent a lot of time on this blog writing about 24p editing because it’s so complicated and misunderstood. Last year I wrote about shooting 24p but editing 29.97, arguing that nobody is going to notice the difference. This year I want to write about the reasons to go through the trouble of shooting and editing 24p. And, as always, 24p here means 23.98 fps.

1. Blah, blah, blah, film blowups. My big pet peeve about 24p discussions is the obsession with film blowups. First there was the completely false idea that shooting 24p “advanced” was somehow better than 24p “regular” for doing film blowups. I hope nobody believes that anymore. As long as you use the right workflow, there is absolutely no difference in the end product. The more pervasive rumor is that the only time it makes sense to edit in 24p is when you’re going to do a film blowup. This is also false, for reasons I’ll get into below. And who the hell is wasting their money by blowing video up to film anymore?

2. Computers. Here’s my big reason for progressive 24p editing. A lot of video is made for computer displays these days, and computers and interlacing go together like two things that don’t go together. If you’re going to show your film on the web, it’s going to look a lot better at 24p than at 29.97 with pulldown in it. And considering that a lot of web video is higher quality than DVD at this point, you’ll really appreciate the boost.
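To see why pulldown and progressive displays don’t get along, here’s a little sketch of the standard 2:3 cadence: four film frames become five interlaced video frames, and two of those five mix fields from different film frames, which is exactly the combing and judder you notice on a computer screen:

```python
# The standard 2:3 pulldown cadence: four 24p frames (A-D) spread across
# ten fields, i.e. five 29.97i video frames. Two of the five mix fields
# from different film frames.

film_frames = ["A", "B", "C", "D"]
fields = []
for frame, count in zip(film_frames, [2, 3, 2, 3]):   # the "2:3" pattern
    fields.extend([frame] * count)

video_frames = [fields[i:i + 2] for i in range(0, len(fields), 2)]
for n, (top, bottom) in enumerate(video_frames, 1):
    note = "  <- fields from two different film frames" if top != bottom else ""
    print(f"video frame {n}: {top}/{bottom}{note}")

# video frame 1: A/A
# video frame 2: B/B
# video frame 3: B/C  <- fields from two different film frames
# video frame 4: C/D  <- fields from two different film frames
# video frame 5: D/D
```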

3. DVDs. If you make a 23.98 QuickTime and compress it to MPEG-2, it will play perfectly on any DVD player; the player adds the pulldown on the fly when it outputs interlaced video. If your DVD player can upconvert and output 24p via HDMI, it might actually play it that way on your 24p HDTV. If you play the DVD on a computer, you won’t see any interlacing. And, since DVD encoding is generally based on average megabits per second, the fewer frames you have in a second, the more data goes to each frame.
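That last point is easy to put numbers on. Assuming, say, a 6 Mbit/s average bitrate (an illustrative figure, not anything specific to my discs), dropping from 29.97 to 23.976 frames per second leaves about 25% more bits for every frame:

```python
# More bits per frame at 24p, at the same average bitrate.
avg_bitrate = 6_000_000   # 6 Mbit/s average -- an illustrative figure

for fps in (29.97, 23.976):
    bits_per_frame = avg_bitrate / fps
    print(f"{fps} fps: ~{bits_per_frame / 1000:.0f} kbit per frame")

print(f"ratio: {29.97 / 23.976:.2f}x")   # 1.25x more data per frame at 23.976
```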

4. Educational. Editing 24p video has taught me so much about the way video works. I worry that computers are so easy to use these days that kids who didn’t grow up having to create config.sys boot menus in order to play Doom won’t really get under the hood of their computers and learn what they’re doing. In the same way, if video just worked (like it used to), you could edit for years without really knowing what you’re doing on a technical level. I like to know how things work, and I think it’s valuable for more people to know. The proliferation of incompatible video formats may be infuriating, but it requires people to learn about technology in a really useful way. It also helps me pay my rent on time every month.