When the new Apple Pencil came out a year ago, I integrated it into my iPad editing workflow. I can edit podcasts with the Apple Pencil at a pretty impressive rate of speed, and the precision of the Pencil means that I’m more inclined to make detailed edits on the iPad than I am when I’m editing on my Mac with Logic Pro X and a trackpad. In fact, every episode of The Incomparable that I’ve edited in the past four months has been done on my iPad Pro.
In a previous post on the Apple Pencil, Snell elaborates on why he likes it:
One of the great things about Ferrite is that it doesn’t have a preferred interface mode—you can use touch, keyboard shortcuts… or Apple Pencil. While I was trying out the new iPad Pros and the new Pencil, I decided to try to edit a podcast in Ferrite, and my mind was blown. Now I was tapping and sliding the pencil to delete extraneous audio. The latest update added support for double-tapping on the new Pencil, which I mapped to a play/pause toggle so I could edit more quickly without putting the Pencil down.
This is a great sign for the iPad; more than anything, the success of the platform hinges on its interaction model. The only interaction I personally prefer on the iPad is scrolling; for everything else I prefer the Mac. I have an Apple Pencil, but I only really use it occasionally for quick sketches. For me, the problem with the Pencil is that while you do gain precision with it, everything else gets worse: pinch to zoom, two-finger tap to undo, and of course scrolling itself all range from difficult to impossible with the Pencil in your hand. This combination of tradeoffs mirrors the trackpad on macOS. I use a Magic Trackpad 2, and I find it similarly poor for precise edits. But I still prefer it to a trackball or mouse because, with multi-touch gestures, it's a more comfortable device overall, despite its shortcomings when making precise edits. Snell says in the post that he uses a trackpad when he edits on the Mac, so my follow-up question would be how he thinks editing on the Mac with a mouse stacks up.
Snell is going to continue doing most of his podcast editing on iPad, but he still needs to use his Mac as the final step in his workflow because the audio plugins that give him the best results simply aren’t available on iOS:
Now here’s the tough one, one I don’t have a good answer for as yet. As cool as it is that I edit every episode of The Incomparable on my iPad, the fact is that all the audio files for that episode are prepped on my Mac before they get to my iPad. I sync audio tracks using a proprietary tool, then use iZotope RX to remove background noise, and finally use a compressor (currently it’s Klevgrand’s Korvpressor, but it’s the latest in a string of ones I’ve used, they’re like Spın̈al Tap drummers) to balance the volume of audio across different tracks.
Ferrite includes a compressor plug-in and a volume-leveling preprocessing feature, neither of which can I get to generate the output I desire. Korvpressor has an iOS version that I can use as a plug-in in Ferrite as I do on the Mac with Logic Pro X, but the iPad version crashes reliably, so I can’t use it. And there’s absolutely nothing I’ve found on iOS that can match the quality of noise and echo removal that iZotope provides on the desktop.
Of all the creative industries, audio is probably the one that clashes most with iOS, both in its business model and its user interface. On the business side, while all pro creative apps have cottage industries of plugins that spring up around them, audio is on another level entirely: I don't see how audio plugin companies can adapt to the App Store's race to the bottom on pricing when plugin bundles routinely run in the thousands of dollars. And on the OS side, the typical way of working is to have many different plugins, each in its own window, all running inside a host DAW, which is antithetical to how iOS works. The audio industry adapting to the iPad is going to be a difficult process.