Wouldn't it be great if we could free up our screens from floating toolboxes and just concentrate on our images? This could happen.
Ask almost any professional to empty their bag and their pockets, and you'll probably find two, three or four screens. There's probably a laptop, a phone, and many people carry a tablet as well. Soon we'll have smartwatches and goodness knows what else on our person.
And of course, the moment you sit in front of your computer workstation - there’s another screen too.
In addition to all of this, you’ll have a mouse, possibly a touchpad, and even a digitising tablet.
And yet, when you're using a professional application like an NLE (non-linear editor), your toolbars take up valuable real estate on your screen. Wouldn't it be better if these control elements could be moved in a disciplined way to some of your other screens?
It can be done
Technically, this can be done quite easily. As long as the relevant devices are networked, it’s just a matter of writing an app and designing a user interface.
But first, you’ll have to make sure that your content creation applications are able to “understand” this set up. And this won’t be easy.
The problem is that even if you're, say, Adobe, and could build this into your code, there are so many auxiliary devices that it would take a whole team of developers just to keep up with all the possible combinations of controllers and screens.
In fact, every software developer that wanted to take part in a scheme like this would have to do the same thing. It would take so many human resources that in reality, it probably wouldn’t happen.
Which is a pity, because it could make life easier for so many people.
Toolkit description language
But there is a possible solution: a standardised “Toolkit description language”.
This would be a standardised way to describe a toolkit. It would be a sort of “PDF” for toolkits.
Let’s have a closer look at this idea.
Every toolkit or toolbar has a lot in common with the others. There are only a limited number of control types, and it would be easy to describe these in general terms.
For example, and this is probably the easiest one, you might want to have a set of graphical controls on a tablet. All you have to do is say “arrange these eight controls in a four by two grid and use these .png images to denote their functions”. These instructions would also state which controls in the application the remote controls map to.
If it’s a more complex control, the Toolkit Description could allocate parameters to be controlled by specialist external devices like a rotary knob or a fader/slider.
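To make this concrete, here is a minimal sketch of what such a toolkit description might look like, expressed as a Python data structure (it could just as easily be JSON or XML). Every field name here is invented for illustration; the point is simply that a grid of buttons with .png icons, each mapped to a host-application control, plus a rotary parameter, can all be described declaratively:

```python
import json

# A hypothetical toolkit description: eight buttons in a 4x2 grid,
# each mapped by id to a control in the host application, plus one
# rotary knob bound to a parameter. All names are illustrative.
toolkit = {
    "name": "edit-panel",
    "layout": {"type": "grid", "columns": 4, "rows": 2},
    "controls": [
        {"id": f"btn{i}", "type": "button",
         "icon": f"icons/btn{i}.png",       # image shown on the tablet
         "maps_to": f"app.action.{i}"}      # control in the host app
        for i in range(8)
    ],
    "parameters": [
        {"id": "jog", "type": "rotary", "maps_to": "app.timeline.scrub"}
    ],
}

def validate(desc):
    """Check that the declared grid can hold the declared controls."""
    layout = desc["layout"]
    capacity = layout["columns"] * layout["rows"]
    return len(desc["controls"]) <= capacity

print(validate(toolkit))
print(json.dumps(toolkit["layout"]))
```

A remote device would only need to parse a description like this to render the panel, and the host application would only need to read the `maps_to` fields to route incoming events, without either side knowing anything else about the other.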
Make applications aware of external toolkits
It really wouldn’t be difficult. All it would take would be for the main application to be aware of external toolkits. This type of thing has been going on for a long time, especially in music production, where it is common to map, say, mixer controls onto a physical control surface. There is an element of standardisation provided through MIDI, although much more sophisticated control protocols also exist.
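The MIDI precedent is worth sketching, because the mechanism is so simple. A MIDI Control Change message carries a controller number (0-127) and a value (0-127); the host application maps the number to one of its own parameters. The parameter names below are invented examples, but the controller conventions (CC 7 for volume, CC 10 for pan) are standard:

```python
# A minimal sketch of MIDI-style control mapping, as used on music
# control surfaces. The CC numbers follow common MIDI convention;
# the application parameter names are invented for illustration.
CC_MAP = {
    7:  "mixer.channel1.volume",   # CC 7 is conventionally volume
    10: "mixer.channel1.pan",      # CC 10 is conventionally pan
}

def handle_cc(controller, value, state):
    """Apply one Control Change message to the application state."""
    param = CC_MAP.get(controller)
    if param is None:
        return state                   # unmapped controller: ignore
    state = dict(state)
    state[param] = value / 127.0       # scale 0-127 to 0.0-1.0
    return state

state = {}
state = handle_cc(7, 127, state)   # fader pushed to the top
state = handle_cc(10, 64, state)   # pan knob near centre
```

A toolkit description language would generalise exactly this idea: a small, agreed vocabulary of control events on one side, a mapping table inside the application on the other.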
In fact, MIDI is a great example of how agreement on a standard has spawned a whole industry producing products based around the protocol. It’s boosted the revenues of software and hardware manufacturers alike, and completely changed the way musicians work.
Compared to MIDI, which first appeared in 1983, the opportunities to use smart, networked devices with content creation apps are enormous. I hope someone takes up the challenge, and persuades everyone else to follow.