In this article, I’ll show you two different ways to use the trackpad with AudioSwift for articulation switching in an orchestral library. For this tutorial, I’ll be working with the recently released BBC Symphony Orchestra Discover by Spitfire Audio, but you can use any other virtual instrument that has articulations.
The first method works on any DAW; it uses the Trigger Mode to send MIDI notes for key switching. The second method is by sending control change messages (CC) with the Slider Mode and works for Logic Pro X users via Articulation Sets. This way you can have two sliders for controlling expression (CC#11) and dynamics (CC#1), and a third one for switching articulations.
Triggering MIDI notes
The most common way to change articulations in a virtual instrument is by triggering a MIDI Note On message. Each articulation is assigned to a MIDI note, and these usually start in the lowest range of the keyboard, from C-2 and up. If you have a small keyboard, this range isn't easy to reach, and if the instrument has too many articulations, it becomes hard to remember which MIDI note triggers the one you want.
With AudioSwift in Trigger Mode, the trackpad can be divided into different pads, each one with its MIDI note and a label to identify the articulation. In the library I’m using, there are two to four different techniques for each group of instruments. I divided the trackpad into 8 pads, changed the type to Drums and used two different banks to fill all the articulations needed.
In Bank A, the bottom pads trigger the string articulations, which are the same for the violins, violas, cellos and double basses (except the basses don't have tremolo). The top row triggers just two articulations for the brass and woodwinds. In Bank B, I put the rest of the articulations, for the library's percussion instruments. Use the keyboard shortcuts Z, X, comma and period to change between banks. Save the AudioSwift settings in Preferences > General tab for later recall.
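To make the pad layout concrete, here is a minimal sketch of how a touch on the trackpad could resolve to one of eight pads (two rows of four) and a key-switch note. The note numbers are hypothetical placeholders, not the library's actual key switches; check your instrument's manual for the real assignments.

```python
# Sketch: mapping a normalized trackpad touch (x, y in 0.0-1.0, y=0 at the
# bottom) to one of eight pads (2 rows x 4 columns) and a key-switch note.
# The note numbers are hypothetical -- e.g. starting at C-2 (MIDI note 0).

KEYSWITCH_NOTES = list(range(0, 8))  # placeholder: C-2 and up

def touch_to_note(x: float, y: float) -> int:
    """Return the key-switch note for a touch at (x, y)."""
    col = min(int(x * 4), 3)       # 4 columns across the trackpad
    row = 0 if y < 0.5 else 1      # bottom row = 0, top row = 1
    pad = row * 4 + col
    return KEYSWITCH_NOTES[pad]

# A touch at the bottom-left corner triggers the first pad's note:
print(touch_to_note(0.1, 0.2))  # -> 0
```

The same idea extends to banks: switching from Bank A to Bank B simply swaps in a different note table.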
If your orchestral library uses control change messages instead of MIDI notes for articulation switching, you can still use the Trigger Mode if you're on Logic Pro X or Cubase. Logic Pro has Articulation Sets and Cubase has Expression Maps, two features that receive a MIDI message (a MIDI Note On in our case) and send control change messages the sample library can respond to. Check each DAW's or library's user guide to set it up.
CC messages with the Slider Mode & Logic Pro X
Orchestral libraries typically accept two or three CC messages for shaping expressiveness with sliders. With AudioSwift in Slider Mode, we can add a third or fourth slider to switch articulations right from the trackpad.
I set AudioSwift in Slider Mode to three sliders and changed the controller formats to Absolute. The first two send CC#11 and CC#1 for Expression and Dynamics. The third slider sends CC#15, and depending on where I touch vertically, it switches to a specific articulation in the plugin.
Since this library only accepts MIDI notes for key switching, we'll use the Articulation Sets feature in Logic Pro to make it work. I set up four articulation zones for the slider. Each articulation is triggered when a CC#15 value falls inside the range specified in the articulation set. For example, for all string instruments, the Long technique is triggered by any CC#15 value from 96 to 127, Spiccato from 64 to 95, and so on. Check out the video demo below.
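The zone logic is easy to picture in code. This sketch splits the 0-127 CC range into four equal zones, matching the Long and Spiccato ranges given above; the other two zone names are placeholders for whatever techniques your articulation set actually uses.

```python
# Sketch: mapping incoming CC#15 values to articulation zones, as an
# Articulation Set would. The four equal ranges match the example in the
# text (Long: 96-127, Spiccato: 64-95); the remaining two names are
# placeholders -- substitute your library's actual techniques.

ZONES = [
    (0, 31, "Pizzicato"),   # placeholder name
    (32, 63, "Tremolo"),    # placeholder name
    (64, 95, "Spiccato"),
    (96, 127, "Long"),
]

def articulation_for(cc_value: int) -> str:
    """Return the articulation whose zone contains cc_value."""
    for low, high, name in ZONES:
        if low <= cc_value <= high:
            return name
    raise ValueError("CC values must be 0-127")

print(articulation_for(100))  # -> Long
print(articulation_for(70))   # -> Spiccato
```

Because the zones tile the whole CC range, anywhere you touch on the third slider lands in exactly one articulation.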
If you have the BBC Symphony Orchestra Discover and want to try these articulation sets for yourself, you can download the files from here. There are two folders in the zip file: one using regular MIDI notes for key switching with the Trigger Mode, and the other using CC messages for the Slider Mode. To install them, open Finder, press CMD + SHIFT + G, copy/paste the location ~/Music/Audio Music Apps/Articulation Settings and click Go. Copy the folder there and restart Logic Pro.
Learn how to control Live macros with the trackpad and AudioSwift in Slider Mode.
In the last blog post, I talked about using Automation Quick Access in Logic Pro to map a single AudioSwift slider to multiple parameters on a track. Now for Ableton Live, I'll show you a way to automatically assign the eight macro knobs on a rack to virtual sliders on the trackpad. We're going to use a feature called Instant Mappings, which consists of editing a remote script text file with the control change (CC) numbers of the sliders, so Live can read it and make the connection instantly when you select the rack. You can use this for writing automation, or to quickly change the settings of an instrument or effect rack while you're looking for the right sound. Let's start…
AudioSwift in Slider Mode
Go to the AudioSwift Console and change the controller mode to Slider. For this tutorial, I'm going to use Bank A for the first four sliders, Bank B for the other four, and MIDI channel 1. I changed the labels to Encoder 1 through Encoder 8, and assigned each one a CC number from 20 to 27 (choose CC numbers that aren't used anywhere else in your project). I also changed the format of all sliders to Relative A. This is how it looks:
The UserConfiguration.txt File
We are going to find the UserConfiguration.txt file and modify the remote script. With Ableton Live closed, go to Finder and press CMD+SHIFT+G. In the Go to Folder dialog, paste the address ~/Library/Preferences/Ableton and click Go. Look for your current Live version and then open the User Remote Scripts folder. Create a new folder called AudioSwift. Copy the file UserConfiguration.txt and paste it inside the AudioSwift folder, or download the one I made for this tutorial.
Open the text file to make the changes. You're going to see a lot of lines, but don't worry, it's easy to get through. All lines that start with # are comments. The text is divided into sections, and we only need to check two of them. The first one, at the top, is [Globals]. Look for the line GlobalChannel. By default it's set to 0, which is MIDI channel 1 (values go from 0 to 15), so we're OK here, since the AudioSwift Console is also using channel 1.
The second section is [DeviceControls]. Here we type the same CC numbers used in the AudioSwift Console for the eight encoders. A few lines below, change the EncoderMapMode to LinearSignedBit. Save the file and close the window.
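After the edits, the relevant lines of the file should look roughly like this. The exact keys and surrounding comments vary between Live versions, so treat this as an illustration of the two sections described above rather than a verbatim copy of the file:

```
[Globals]
# 0 = MIDI channel 1 (values go from 0 to 15)
GlobalChannel: 0

[DeviceControls]
# Same CC numbers assigned in the AudioSwift Console
Encoder1: 20
Encoder2: 21
Encoder3: 22
Encoder4: 23
Encoder5: 24
Encoder6: 25
Encoder7: 26
Encoder8: 27
# Matches the Relative A format set in AudioSwift
EncoderMapMode: LinearSignedBit
```

Leave every other line in the file untouched; Live only needs these entries filled in to map the encoders.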
Ableton Live Configuration
Open Live and go to Preferences > Link MIDI. Make sure AudioSwift 3 input port is turned on in the MIDI Ports section. Under Control Surface, choose AudioSwift and select AudioSwift 3 as Input and Output. Close the window.
Now let's test it. Select an instrument or effect rack. You'll see a blue hand icon next to the title of the device, meaning that it's being controlled by AudioSwift. Turn on AudioSwift with a four- or five-finger tap and start moving the sliders. To change between banks A and B, use the key shortcuts Z, X, comma or period. Press Escape when you finish.
I showed you how to use AudioSwift with macro controls, but you can also select any Live device and let AudioSwift handle its first 8 parameters. Which parameters are controlled depends on the device itself (and on your memory). For example, the first two sliders control the frequency cutoff and resonance in the Auto Filter device. In Live's Compressor, the first four handle the threshold, ratio, attack and release, in that order.
Maybe you want to use one of the sliders to map something different on the fly with CMD+M in Ableton Live. You can use the sliders in Bank C of AudioSwift for this, or, since we used MIDI channel 1 for the remote script, switch to another channel in the AudioSwift Console with the Right and Left arrow keys.
If you're looking to control volume and pan, I recommend using AudioSwift in Mixer Mode, which is designed specifically for this purpose. Learn how to use it with Ableton Live in our tutorials section. Once it's configured, switch between Mixer and Slider Mode with key shortcuts 1 and 5.
We used relative MIDI for this particular configuration. If the sensitivity of the sliders isn't what you're looking for, you can switch to absolute MIDI instead. Change the EncoderMapMode in the UserConfiguration.txt file to Absolute instead of LinearSignedBit, and change each slider in AudioSwift to Regular format. Then go to Ableton > Preferences > Link MIDI and try one of the Takeover Modes: None, Pick-up or Value Scaling. Also, try a different sensitivity for the controllers in AudioSwift > Preferences > Slider & XY tab.
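For the curious, here is a sketch of how relative "signed bit" values are commonly decoded. I'm assuming the usual sign-magnitude convention (bit 6 set means a negative step, low 6 bits carry the magnitude); whether Live's LinearSignedBit mode follows exactly this layout is an assumption, so verify against the remote-script documentation if the knobs move the wrong way.

```python
# Sketch: decoding relative "signed bit" CC values under the common
# sign-magnitude convention. ASSUMPTION: bit 6 (0x40) set = negative
# step, bits 0-5 = step size. 0x01 means "one tick up", 0x41 "one down".

def decode_signed_bit(value: int) -> int:
    """Return the signed step encoded in a 7-bit relative CC value."""
    magnitude = value & 0x3F            # low 6 bits: step size
    return -magnitude if value & 0x40 else magnitude

print(decode_signed_bit(0x01))  # -> 1  (one step up)
print(decode_signed_bit(0x41))  # -> -1 (one step down)
```

Relative encodings like this are why takeover modes aren't needed: the controller sends deltas, so the knob always moves from wherever it currently is.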
For really complex automation, try assigning the same CC numbers from the Slider Mode to one or two XY pads in the XY Mode View 2, and experiment with controlling up to four macros at the same time in all directions. Use the Return to Default setting in the AudioSwift Console to return the knobs to their original positions.
Logic Pro has a feature called Automation Quick Access that allows a single MIDI slider or knob to be mapped automatically to any automation parameter of a track. In the next video, I show you how to set up this feature with AudioSwift in Slider Mode, and have a handy virtual slider on the trackpad for any automation task in Logic Pro.
The idea of using XY MIDI controllers for sound design is really interesting—multiple parameters in a virtual instrument can be assigned to one controller that works in two or three dimensions, opening up opportunities to change the sound creatively. Of course, one could argue a simple knob or slider can do something similar, but there are physical limitations to moving many controllers in different directions at the same time, especially if you want to add expression to a sound with one hand while playing notes with the other.
AudioSwift for macOS lets you use the smooth glass surface of a trackpad as an ultra-portable and multipurpose XY pad MIDI controller. With one-, two- or three-finger gestures, the sound designer controls up to three XY pads on a single trackpad. If the device has Force Touch support, a third dimension is available by pressing the surface, sending continuous aftertouch MIDI messages. In this article, I'm going to show you three examples using well-known virtual instruments with AudioSwift in XY Mode.
Soundscapes for TV, Films and Video Games
Alchemy is a powerful synthesizer bundled with Logic Pro X. It features additive, spectral and granular synthesis plus sampling, and it comes with a great selection of presets for creating evolving soundscapes. The two included XY pads at the bottom section of the plugin are useful to morph between sounds and create interest through movement. Using Logic Pro X built-in MIDI Learn Assignments (shortcut CMD + L), the XY parameters can be mapped to AudioSwift. The next video shows the first XY pad in Alchemy being controlled using a one finger gesture and the second XY pad with two fingers. Applying pressure to the surface controls the tuning of one of the synth’s sources.
Cinematic Tension with Thrill
Thrill is a Kontakt toolbox from Native Instruments for cinematic atmospheres, built around a large library of eclectic source recordings featuring orchestral sections, percussion, vocals and synthesized samples. Thrill's easy-to-use interface consists of a powerful XY control to sweep between different sounds and intensities, creating sonic tensions that build up with the controller's movements.
Mapping AudioSwift to Thrill is quite easy. Set a control change (CC) number for each axis in the plugin’s settings and then type the same numbers in the AudioSwift Console window. Here is a quick demo by film and media composer Steve Lehmann controlling Thrill and also Native Instruments Absynth with a Magic Trackpad 2.
Deep Real-Time Control in Zebra
The workhorse synth from u-he, Zebra is widely used by soundtrack composers and music producers. It’s a modular synthesizer with numerous sculpting tools to create a variety of sound textures. Some of these tools are four XY pads with a deep matrix section, where each X or Y axis can control up to eight Zebra parameters, each one with its own range control.
Zebra was recently updated to version 2.9, and one of its new features is that all factory presets now come with XY pad control assignments already patched, making it easier to play with the sounds. We just need to map each XY pad to AudioSwift using Zebra's built-in MIDI Learn function, and the trackpad becomes a performance controller.
AudioSwift can divide the trackpad into two XY pads, independent of each other. These will control the first pair of XY pads in Zebra, but we can use a different MIDI channel to assign the other pair without changing the CC numbers. To change the MIDI channel in AudioSwift, just press the right and left arrow keys.
In the following demo, one XY pad controls the filter’s cutoff and resonance, and the other the oscillator’s settings. Both controllers have the Return to Default feature enabled in AudioSwift—the XY pads return automatically to a default value set by the user when the fingers are lifted.
As you can see, a trackpad with AudioSwift opens up endless sonic possibilities that would otherwise be hard to achieve with knobs. I showed you three virtual instruments that have XY pads in their user interfaces, but the same idea can be applied to any plugin you have; just map the parameters you want to control with AudioSwift. The MIDI from the trackpad can also be rerouted out of your DAW and sent to your favorite hardware synthesizers. To learn more about setting up AudioSwift as XY pads, make sure to check out the XY Mode tutorial or download the user guide.
When AudioSwift is on, it becomes the key app on screen, freezing the mouse pointer and receiving all keyboard input. The default keyboard commands of your DAW won't work because the DAW is no longer in front. As a workaround, AudioSwift comes with a set of shortcuts to trigger basic DAW commands like record, play and the automation modes while the Console is active. These shortcuts work with all controller modes, but in order to use them, AudioSwift must first be configured in Mixer Mode within your DAW.
The Mixer Mode lets the user control faders, panning and sends when it's time to mix a project. It uses the Mackie MCU and HUI protocols to communicate with the DAW: AudioSwift sends the respective MIDI messages, which the DAW translates into commands. The steps to configure the DAW are explained on page 12 of the User Guide, which you can download in our tutorials section.
Once AudioSwift is properly set up, the following key shortcuts will work for transport control when the Console is on. If you have a MacBook Pro with a Touch Bar, the transport controls will also be displayed there.
When writing automation on a track, you can set a different mode by clicking each track's automation mode (Read, Write, Touch or Latch), or you can use the keys U, I, O and P when the Console is on. However, these shortcuts are only available in Logic Pro, Studio One and Reaper.
New in AudioSwift 2.1
The recent AudioSwift update comes with two new ways to turn off AudioSwift automatically, without needing the ESC key. One of them reads the playhead state to turn off the Console. Go to Preferences > Mixer Tab and enable Turn AudioSwift off when play head stops.
The other way is to turn off the Console when no fingers have touched the trackpad for a second. Enable this feature in Preferences > General Tab > Turn AudioSwift off automatically.
Maybe you work with a lot of external synthesizers and want to explore new sounds. Or you miss a particular ribbon controller or X/Y pad from an old synth that you don’t have anymore. Well, with AudioSwift you can be as creative as you want with your sound designs, using simple gestures over a Mac trackpad to also control your hardware synths and modules.
When AudioSwift is running on macOS, it creates three virtual MIDI ports. The AudioSwift 3 port sends Note On/Off, Control Change, Pitch Bend and Aftertouch messages via four controller modes. This virtual port can be rerouted out of the computer to an external MIDI output if we want. I'm going to show you two methods: using a standalone MIDI router app, and configuring it through a DAW so we can record and edit the performance later.
Via a Virtual MIDI Patchbay
There are a couple of apps that can do this. I chose MIDI Patchbay by Pete Yandell. It's free and easy to use, and although it was released in 2008, it still works even in Mojave 10.14. Another option is MIDIPipe by Nico Wald, which has more features to play with.
When you download the file and open it, macOS will probably show you a warning. Since it's an old app it won't open right away, so you'll need to go to System Preferences > Security & Privacy > General and click Open Anyway.
Once it's running, MIDI Patchbay is simple. At the bottom left corner, click Add Patch to create a new patch. When the patch is selected in the table, the MIDI input port appears on the right side. Select AudioSwift 3. At the bottom is our external MIDI output, which in my case is an M-Audio Midisport device connected to my MacBook Pro via USB. This MIDI output sends the messages out of the computer through a MIDI cable to an external synth.
And that's it. Select the X/Y or Slider Mode in AudioSwift, turn it on with a four- or five-finger tap, and play your synth with one hand while the other controls your CC messages on the trackpad. Now let's look at how to do this inside a DAW.
Via Logic Pro X
Since we probably want to record our performance, we'll need to configure the routing inside our DAW. Every DAW does this a bit differently, though many are similar. I'm going to use Logic Pro for this example.
First, create an External MIDI track and check the box Use External Instrument plug-in. In Output, select your external MIDI device port (Midisport 2×2 Anniv A in my case). When the track is created, open the External Instrument plug-in and you'll see the MIDI destination selected. You can then send the audio from the synth back into your DAW's mixer, trim the volume and record it. Read more about recording audio.
Now every time we hit record and start using the trackpad as a MIDI controller with AudioSwift, our performance will be recorded in the DAW and will also play our external synth. As you can see, all CC messages are recorded in the MIDI region for further editing.
The most used input device in computer-based music production is definitely the mouse. However, for Mac users a trackpad can also be a valuable tool, either the built-in trackpad on a MacBook or a Magic Trackpad added to a desktop setup. I'll be showing different ways to use a trackpad in your workflow, with some tips and tricks. We'll start with the most common macOS multi-touch gestures and then check out two apps that expand the trackpad's functionality: AudioSwift and BetterTouchTool.
Swipes, taps and the three finger drag
The Trackpad Preferences window shows all available gestures. By default, the trackpad is configured to click by pressing its surface with one finger, but it's better to go into System Preferences > Trackpad > Point & Click and enable Tap to click. When I started working with a trackpad, I was only using one finger to move the pointer and click. With this setting enabled, I use my middle finger for moving and then, without lifting it, tap with my index finger, just like working with a mouse. A two-finger tap is a regular right click.
The other feature that helped me a lot was the three-finger drag. On macOS, to select or drag an object you'd normally need to click and drag — a simple tap won't work. To change this, go to System Preferences > Accessibility > Mouse & Trackpad > Trackpad Options and enable dragging with a three-finger drag. Now we can drag objects easily.
Swipes and pinch gestures with two, three or four fingers can also be configured at System Preferences > Trackpad > More Gestures. They are helpful for things such as scrolling or showing the Desktop, Launchpad and Mission Control. I avoid three-finger swipes because they conflict with AudioSwift. (More on this later.)
Besides scrolling, a two-finger swipe while pressing ⌥ or ⌘ lets you zoom in and out, depending on the DAW. Pro Tools uses ⌥, but only for horizontal zoom. Cubase does it with the ⌘ key. Logic Pro zooms both vertically and horizontally with the ⌥ key, and Studio One with ⌘. We can also zoom in Logic Pro and Studio One with a pinch gesture after enabling it at System Preferences > Trackpad > Scroll & Zoom.
I swipe up with four fingers to open Mission Control and see all my active windows. Here I can move certain windows to a second virtual workspace at the top and switch between workspaces with a four-finger swipe to the left or right. If you work on a MacBook you'll find this useful — you could, for example, have the arrangement window in your main workspace and the mixer window in another one, saving space on your laptop screen.
Logic Pro users
Since Logic Pro is from Apple, it's no surprise that its trackpad integration is better than other DAWs'. There are more functions available via a three-finger tap or the Force Touch feature on compatible trackpads. (For other DAWs, see BetterTouchTool below.)
First, go to System Preferences > Trackpad > Point & Click and make sure that Look up & data detectors is enabled. I prefer a three-finger tap over the force click because it's easier on my hand. If you choose force click, make sure to also enable it in Logic Pro > Preferences > General > Editing > Enable Force Touch Trackpad.
With a three finger tap or force click we can do the following:
Create a MIDI region in an empty part of a MIDI track, or import an audio file in an audio track.
Create (or delete) notes in Piano Roll/Score and also markers in the time ruler.
Create two automation nodes around a region.
Toggle Zoom Focused Track on a track header; below the last header, it creates a new track.
AudioSwift — Your trackpad as a MIDI controller
With lots of music makers working on MacBooks, using the multi-touch trackpad as a MIDI control surface is an advantage. It could mean less equipment to carry for the mobile composer working at a coffee shop, or the producer tweaking a mix on a 7-hour flight.
AudioSwift is an app I developed that transforms the touches into MIDI commands, making the trackpad an expressive tool for virtual instruments and a control surface for mixing and automation. It works through five controller modes. Do you design sounds with Serum, u-he Zebra or Native Instruments Thrill? Great! Control multiple performance parameters in real time with the X/Y mode. Want to create a quick beat? Tap the trackpad as trigger pads. Need to compose a string section? Play your MIDI keyboard with one hand while the other controls CC1 and CC11 with the trackpad working as virtual sliders. Set the controllers to return to a default value after lifting the fingers.
AudioSwift also has the Scale mode for playing notes in a chosen key and the Mixer mode for quick access to two faders at the same time, panning, or a handy jog wheel for navigating through the timeline. Desktop producers can also benefit by adding a Magic Trackpad to their setup. Use one hand for editing with the mouse and the other for controlling the faders with a trackpad.
A four or five finger tap activates AudioSwift; pressing the Esc key turns it off. Press ⇧ immediately after tapping the trackpad and AudioSwift will be active until you release the ⇧ key. This way you could be editing a track or tweaking a plug-in, and then turn on AudioSwift temporarily to quickly move the fader, or put the track in solo without the need of reaching out each parameter on screen with the pointer.
With a Magic Trackpad and mouse combo (or two trackpads), AudioSwift can also be configured to activate automatically when you touch the trackpad, and to turn off when you move the mouse again. Since a Magic Trackpad is small and wireless, it can be placed anywhere on the desktop or close to the main MIDI keyboard — a great tactile setup for the home producer!
Custom shortcuts with BetterTouchTool
If your DAW lacks better integration with trackpad gestures, there's a workaround: BetterTouchTool by folivora.ai. I'm not affiliated with the developer, but I find it a fabulous app that lets you customize shortcuts for the keyboard, mouse, Touch Bar and, of course, the trackpad.
Below is the Preferences window. In the left column (1) we choose the application we want to assign shortcuts to. When we add a new trigger to the table, under the column Trigger Name (2) we add a gesture, and under Assigned Actions (3), the corresponding key command in the DAW. All assigned gestures appear in the table. They can take a modifier key like ⇧ or ⌘, so the same gesture can be used for multiple commands. The only limit is how many your memory can handle.
Music producers often need to work with two different DAWs. Assigning the same gesture to the same command in both applications (e.g. join regions) means you only have to memorize one gesture instead of two different key commands.
AudioSwift works well with BetterTouchTool as long as there isn't a gesture assigned to All Apps in the left column (1) — that could accidentally trigger a command while using AudioSwift.
Let's take a look at the list of trackpad gestures assigned in Logic Pro. I've chosen finger taps at the borders of the trackpad instead of swipes or other gestures, because they're more precise to execute and easier to remember. There are eight possible border zones: three at the top, three at the bottom, and two at the middle. I use the ⇧ key as a modifier to have more options.
With simple taps I can move a region to the playhead, join regions, close all plug-in windows, set locators, etc. Logic Pro has the option of opening and closing each plug-in insert on a selected track using a key command. So I set the corresponding key commands in Logic Pro's Key Commands window, and in BetterTouchTool I assign a gesture to them. Now I can open a specific plug-in with a tap.
These are just examples of shortcuts you can implement in your DAW. The last one in the list is special — it triggers two actions in sequence. TipTap Right (1 Finger Fix) is a gesture that consists of touching the trackpad with one finger and then immediately tapping with the next finger to the right. Since I normally use my right-hand middle finger to move and my index finger to tap, I can use this gesture for something else: setting the position of the playhead by tapping anywhere in the Arrange window in Logic Pro.
Setting the playhead position in Logic Pro with a click
In Logic Pro, we can only do this by clicking in the lower half of the ruler, or by ⇧+Click in an empty space of the Track area. We can also use the Marquee Tool: click, and play from that position. However, setting the playhead to a particular position by just clicking on a region is not available.
A trick I use as a workaround is to apply a sequence of actions to the TipTap Right gesture. The first one is the predefined action ⌘+Click. This selects the secondary tool (which in my case is always the Marquee Tool) and places a thin line in the region. The second action is the key command for Go to Selection Start in Logic Pro, ⌃↖. (NOTE: on compact keyboards, ↖ can be produced by pressing fn+←.) This moves the playhead to the position of the marquee. From there I can zoom in, edit the region or hit play. Of course, this trick only works as long as the secondary tool is the Marquee Tool. If it's changed to the Fade Tool, for example, it will trigger the wrong action.
You can download my Logic Pro BetterTouchTool preset from here.
Try it out!
I hope you’ve enjoyed this guide and picked up some good ideas to speed up your workflow. Let me know if you have questions or tips you want to share in the comments section below. You can also contact me via email, Facebook, Twitter and Instagram. If you’d like to try out AudioSwift and BetterTouchTool, there are free trial demos available at each site.
Learn how to use your trackpad as a photo editing tool on macOS with AudioSwift and MIDI2LR.
Imagine these two scenarios — you're a photographer who travels a lot, working on a MacBook and editing photos in Adobe Lightroom. Or you're at your desktop computer and need to retouch hundreds of pictures from your last session, because the client wants them today. Time is money and you want to manage it well.
What if you could speed up your workflow in Lightroom without constantly reaching for the mouse to change settings like exposure or contrast? Why not start using the MacBook's trackpad or a Magic Trackpad in a different and better way?
OK — I confess I'm not a photographer, and before I thought about writing this article, all I knew about photography was the same as any other mortal: take a picture with a phone, go to Instagram, apply a filter, and imagine I'm an artist. However, I faced a similar situation to the scenarios described above when working on music production. I needed a tool to create and improve my workflow on location and in my home studio, so I developed AudioSwift.
AudioSwift for macOS is an app that lets you use a trackpad as a MIDI controller. In music production, MIDI controllers are devices widely used to make music and change parameters on screen by sending commands known as MIDI messages. With AudioSwift, a trackpad can send these same commands as you slide your fingers over its surface, making it easier to interact with the software's graphical interface.
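If you've never looked under the hood of MIDI, a message like the ones AudioSwift sends is just a few bytes. Here is a minimal sketch of a Control Change message, the type used throughout these articles for sliders and knobs:

```python
# Sketch: a MIDI Control Change message on the wire. It is three bytes:
# a status byte (0xB0 plus the 0-based channel), the controller number,
# and the value -- the two data bytes each range from 0 to 127.

def control_change(channel: int, controller: int, value: int) -> bytes:
    """Build the raw bytes of a CC message (channel is 1-16)."""
    assert 1 <= channel <= 16 and 0 <= controller <= 127 and 0 <= value <= 127
    return bytes([0xB0 | (channel - 1), controller, value])

# CC#11 (Expression) at full value on channel 1:
print(control_change(1, 11, 127).hex())  # -> 'b00b7f'
```

Every slider movement on the trackpad boils down to a stream of messages like this, which is why any MIDI-aware app, musical or not, can respond to them.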
I was curious whether this concept could also work in applications other than music creation, like photo editing. The challenge was finding out whether a photo editing app could read MIDI messages and work with AudioSwift properly. That's when I found out about MIDI2LR.
Many photographers use MIDI controllers to tweak their photos in Lightroom. It works by installing a free plugin called MIDI2LR that translates MIDI messages into Lightroom commands. The user moves a knob or fader to control a Lightroom parameter without reaching for the mouse, making it easier and faster to edit a batch of photos.
So, why not use a trackpad instead of getting another physical device? A MacBook has one built in, and some photographers already work with Magic Trackpads at their desktop computers. I tried out MIDI2LR with AudioSwift and the results were fantastic — the trackpad can now tweak the same parameters in Lightroom that a MIDI controller does, without clicking and dragging each setting on screen.
How Does It Work?
You call up AudioSwift by tapping the trackpad with four or five fingers. This opens a console window, temporarily freezing the mouse pointer and taking control of the keyboard. AudioSwift divides the trackpad into 4 virtual sliders, each controlling a Lightroom setting; the console window shows which settings are currently mapped. Slide a finger up or down on the trackpad's surface and the corresponding parameter moves instantly. Press the period or comma keys to jump to the next bank of 4 virtual sliders. When you finish changing settings, press the Escape key to turn AudioSwift off; the mouse pointer is then released.
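The core of each virtual slider is a simple scaling step. This sketch shows the basic idea of turning a finger's vertical position into a 7-bit CC value, the kind of mapping any trackpad-to-MIDI slider performs (the exact curve and sensitivity AudioSwift uses may differ):

```python
# Sketch: scaling a normalized vertical finger position to a 7-bit MIDI
# CC value. 0.0 = bottom of the slider, 1.0 = top.

def position_to_cc(y: float) -> int:
    """Map a normalized position (0.0-1.0) to a CC value (0-127)."""
    y = min(max(y, 0.0), 1.0)   # clamp to the slider's range
    return round(y * 127)

print(position_to_cc(0.0))  # -> 0
print(position_to_cc(1.0))  # -> 127
```

MIDI2LR then maps that 0-127 value onto whatever range the assigned Lightroom setting has, such as -5.0 to +5.0 for Exposure.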
I was able to control 36 different settings in the Lightroom Develop Module using just a trackpad. In MIDI2LR you assign each virtual slider to a specific Lightroom command, and there are a lot of options to choose from. Once it's configured properly, it's very precise and intuitive.
Give It A Try!
I made a video tutorial showing, step by step, how to configure AudioSwift and MIDI2LR. It includes preset files so you can start working right away. I hope this information is useful and really improves your workflow. If you have any questions, you can contact me in the comment section below, via email, or on Twitter and Facebook.
MIDI2LR is free to download, and the developer, RSJaffe, asks for donations. I'm not affiliated, but I strongly encourage you to donate so the developer can continue supporting this great plugin in the future.