<img src="https://certify.alexametrics.com/atrk.gif?account=43vOv1Y1Mn20Io" style="display:none" height="1" width="1" alt="">

The future of video effects - we talk to Boris Yamnitsky, founder of Boris FX


Image: Boris FX/RedShark

The story of Boris

Everyone knows about Boris FX. And, if you're old enough, you'll remember that in the mid-to-late 90s it suddenly became possible to create amazing effects and transitions on the slow, clunky NLEs of the time - effects that looked like they cost a million dollars to produce.

And, as we'll find out, that's because of Boris Yamnitsky's mathematical background and his attention to detail.

So, how was he able to create effects that were better than those created by expensive, dedicated hardware? And what does he think will happen in the future?

We recently spoke to Boris by email about how Boris FX started, what it's doing now, and how he sees the distant future (which, by the way, is about five years from now!).

 

David Shapton: What's your technical background? Didn't you work for Media 100 at one point?

 

Boris Yamnitsky: I'm a mathematician and a computer scientist by training. My earlier work included studies in the complexity of computation and linear programming. I got my first taste of video editing and visual effects at Media 100, working on the version 1 release in the early nineties. I came back to the NLE software market in 1995 with a software package of my own - BorisFX - the first software-based DVE solution for the emerging NLEs: Adobe Premiere and Media 100. The product won Best of Show at Macworld 1995 in Boston.

 

What drove you to start Boris FX? What was the question that Boris FX was the answer to?

 

I saw an opportunity to create a cost-effective solution for the young and vibrant non-linear editing market. Traditionally, DVEs (Digital Video Effects) such as 3D moves and page peels were done on rather expensive hardware boxes. While I loved the effects, I knew they were inflexible and hard for a user to access. A software plug-in residing right in your timeline window was much more appealing.

 

 

I remember seeing the famous "cube" effect in the late '90s, and it blew me away - it was so smooth and perfect-looking. How were you able to make your effects look so good?

 

I focused on the smoothness and render quality of the effects. I had an edge over hardware effects, which always had to compromise to work in real time. My effects required rendering, but delivered superb quality. Customers liked that.
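Boris doesn't spell out his techniques here, but one classic reason a rendered effect can out-class real-time hardware is sampling quality: a real-time DVE of that era typically had time to point-sample each output pixel once, while an offline software renderer can afford to supersample. The Python sketch below is purely our own illustration of that trade-off (the rotation, the test card and the sample counts are all hypothetical, not Boris FX code).

```python
import numpy as np

def rotate(img, angle_deg, samples=1):
    """Rotate a greyscale image about its centre.
    samples=1  -> point sampling (real-time hardware style);
    samples>1  -> samples x samples supersampling (offline software style)."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    a = np.radians(angle_deg)
    cos_a, sin_a = np.cos(a), np.sin(a)
    ys, xs = np.mgrid[0:h, 0:w]
    offsets = (np.arange(samples) + 0.5) / samples - 0.5   # sub-pixel jitter
    out = np.zeros((h, w))
    for oy in offsets:
        for ox in offsets:
            dy, dx = ys + oy - cy, xs + ox - cx
            # inverse-map each output sample back into the source image
            sx = np.round(cos_a * dx + sin_a * dy + cx).astype(int)
            sy = np.round(-sin_a * dx + cos_a * dy + cy).astype(int)
            inside = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
            out += np.where(inside, img[sy.clip(0, h - 1), sx.clip(0, w - 1)], 0.0)
    return out / samples ** 2

# A hard-edged test card makes the difference visible: jagged vs smooth lines.
card = np.zeros((64, 64))
card[::8, :] = 1.0
card[:, ::8] = 1.0
fast = rotate(card, 15, samples=1)    # point-sampled: stair-stepped edges
clean = rotate(card, 15, samples=4)   # 16 samples per pixel: smoothed edges
```

Averaging sixteen samples per pixel is far too slow for 90s real-time hardware, but for a renderer that only has to finish eventually, it's exactly the kind of quality-over-speed choice Boris describes.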

 

What are the main product lines now?

 

We have two main product lines: native filters (Boris Continuum Complete and Final Effects Complete) and custom UI plugins (RED, Graffiti and FX - the latter two are subsets of RED, one focusing on titles and the other on effects and compositing). Native filters use the host UI to set up an effect: parameter sliders, popups, overlay widgets and so on. The effect is previewed and played back in the host. Custom UI products use our own windows and setup tools to create and preview effects, but still go back to the host for playback and output. While native filters have a lower learning curve and are faster to apply, more complex compositing and graphics still require a custom UI. In terms of workflow, plugins are still superior to multiple standalone applications because you never need to worry about rendering the proper formats or versioning your edits - the host application takes care of that.

We also have workflow tools to move projects between systems and to search media for sound bites. They complement our effects solutions nicely.
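To make the split Boris describes concrete, here's a deliberately toy Python sketch of the two plugin styles. Nothing in it is a real Boris FX or host API - "NativeFilter" and "CustomUIPlugin" are invented names - it only shows where the UI lives in each model.

```python
from dataclasses import dataclass, field

@dataclass
class NativeFilter:
    """Native-filter style (BCC/FEC): declare parameters and let the HOST
    build the sliders, preview the effect and handle playback and formats."""
    name: str
    params: dict = field(default_factory=dict)   # e.g. {"blur_radius": 5.0}

    def render(self, frame):
        # Only the effect maths lives here; UI and I/O belong to the host.
        return frame   # placeholder: a real filter would transform the pixels

@dataclass
class CustomUIPlugin:
    """Custom-UI style (RED/Graffiti/FX): the plugin opens its own windows
    and setup tools, then hands finished frames back to the host timeline."""
    name: str

    def open_setup_window(self):
        print(f"{self.name}: opening our own compositing/titling UI...")

    def render(self, frame):
        return frame   # composited result goes back to the host for playback

# From the host's point of view both expose the same render() call;
# the difference is who draws the setup interface.
for plugin in (NativeFilter("Toy Blur", {"blur_radius": 5.0}),
               CustomUIPlugin("Toy Titler")):
    _ = plugin.render(frame=None)
```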

 

How easy or difficult is it to keep up with all the NLEs and their plug-in architectures?

 

As any plugin developer will attest, porting from one platform to the next is a lot of work. All host applications are moving targets, with frequent releases and updates. We keep our fingers crossed that nothing gets broken in the process, but sometimes things happen: plugins inadvertently break with new host updates and we have to hurry to fix them. Business as usual. But we are well set up to deal with each situation, and we go through a rigorous testing plan before releasing our updates.

 

Have you ever thought of making an NLE?

 

David, we actually own an NLE - Media 100 - but I'd rather not focus on it in a plugin story.

 

 

What's your approach to the new demand for color correction/grading?

 

It's actually very interesting. Customers turn to color grading for several reasons: to match multiple camera shots, to composite a key over a background, or to enhance a dull image caused by poor shooting conditions or equipment - these are the "technical" reasons. BCC has a variety of tools to address this need; they are found in the Color and Tone and Lighting categories. And then there are more artistic and creative reasons: setting a mood, emulating a certain filmic look, or creating a stylized effect - those tools can be found in our Film category. Finally, there is a need to select and compare different grades and looks. All BCC tools provide a split-screen view of the rendered image, and our soon-to-be-released stand-alone effects browser lets you try on different looks with ease, right on your timeline.
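The split-screen compare Boris mentions is a simple idea: show the ungraded frame on one side of a wipe line and the graded frame on the other. Here's a minimal Python sketch of that idea (not the BCC implementation - the frame, the toy lift/gamma/gain grade and the `split` parameter are all our own illustration).

```python
import numpy as np

def split_screen(original, graded, split=0.5):
    """Preview frame: `original` left of the wipe line, `graded` right of it.
    Both are float arrays of shape (height, width, channels), values 0..1."""
    out = original.copy()
    edge = int(original.shape[1] * split)   # wipe position in pixels
    out[:, edge:] = graded[:, edge:]
    return out

def lift_gamma_gain(frame, lift=0.0, gamma=1.0, gain=1.0):
    """A toy grade using the classic lift/gamma/gain controls."""
    graded = np.clip(frame * gain + lift, 0.0, 1.0)
    return graded ** (1.0 / gamma)

frame = np.random.rand(270, 480, 3)                            # stand-in video frame
look = lift_gamma_gain(frame, lift=0.02, gamma=1.2, gain=0.9)  # one candidate look
preview = split_screen(frame, look, split=0.5)                 # drag `split` to compare
```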

Where do you think we will be in five years' time?

 

Five years is a whole generation - or even a lifetime - in the broadcast/post market. We may be seeing a very different mix of NLE platforms. I can't make specific predictions, but I'm sure it will not be the same as it is today. In terms of technology, we will be editing hyper-HD formats using specialized GPU processors: highly parallel and ultra-fast. And you don't need to be a genius to say that.

In terms of types of effects, I predict that we'll be using more virtual reality tools - things that help you tell the story without recreating it on a sound stage or location. Think of the chromakey process of the past, but on a grander scale, in 3D and in virtual sets. We already see this trend in recent feature films and smaller projects, but we all know that in the future we'll consume even more content via various delivery methods. Making this content will have to be faster and more economical - hence emulating reality in post.
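For readers who haven't met it, the chromakey process Boris refers to is straightforward at its core: pixels close to the key colour become transparent so a background shows through. The Python sketch below is a bare-bones illustration of that idea (real keyers, and the virtual-set tools he predicts, are far more sophisticated; the colour-distance threshold and test images here are hypothetical).

```python
import numpy as np

def chroma_key(fg, bg, key_rgb=(0.0, 1.0, 0.0), threshold=0.35):
    """Composite `fg` over `bg`, keying out pixels near `key_rgb`.
    All images are float arrays of shape (height, width, 3), values 0..1."""
    dist = np.linalg.norm(fg - np.asarray(key_rgb), axis=-1)   # distance to key colour
    alpha = np.clip(dist / threshold, 0.0, 1.0)[..., None]     # 0 = fully keyed out
    return fg * alpha + bg * (1.0 - alpha)

fg = np.zeros((270, 480, 3)); fg[..., 1] = 1.0    # all-green stand-in "stage"
fg[100:170, 200:280] = (0.8, 0.6, 0.5)            # a "subject" patch on the stage
bg = np.random.rand(270, 480, 3)                  # stand-in virtual set
comp = chroma_key(fg, bg)                         # subject composited over the set
```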

We used to say "fix it in the post". But in the future we'll be saying "make it in the post".

 

Tags: Post & VFX
