<img src="https://certify.alexametrics.com/atrk.gif?account=43vOv1Y1Mn20Io" style="display:none" height="1" width="1" alt="">

In the future, we'll be renting our computers and GPUs as well as software


Dell Workstation Server. Image: Dell/RedShark

I've just spent a day at Dell's headquarters in Austin, Texas. There's some stuff going on there that, in time, could shake up the post production and animation industries.

As a European, the US always seems supersized to me, but in Texas, you just lose perspective completely. A Ford F150 looks as big as a house to us, but to a Texan, that's just an entry-level truck.

We had a great presentation from Shelby's IT Director, Richard Sparkman. They're exclusively Dell Precision users for all their engineering tasks. He revealed that they've taken Ford's top-performing F150 and "Shelbyized" it. Apparently there's nothing on those long Texas roads that can catch it. One comment summed it up: "This is a hybrid truck: it burns gas and rubber...".

On the way back to the airport, our driver (actually a Brit, but who had lived in Texas for 35 years and sounded even more Texan than his compatriots) said he viewed electric cars as "target practice".

None of which should suggest that anything about Dell's operation is unsophisticated. Never mind the abundant engineering capability at their multiple Round Rock (close to Austin) campuses; the PR people had also organized our day with (Dell) precision.

A new direction

The reason we were taken there was not to be at the launch of some stonking new consumer product, but for a new direction in media production which Dell, in close collaboration with partners like Nvidia, Wyse and Citrix, is championing. Stay with me if you don't immediately understand what I mean by Workstation Virtualization. It's important, and I'm going to tell you why, in terms that even I can understand.

Most of us still work in a traditional way, if anything to do with digital media can be described as "traditional". What I mean by that is that typically we will have a couple of monitors on our desk, and a powerful PC under it. If we're lucky, that PC will be "workstation" class.

A workstation has come to mean a computer that is not just powerful, but optimized in a no-compromise sort of way for the job it does. So if it needs powerful graphics, it will have them. If it needs a lot of storage, it will be there. And if there's some other aspect that - however expensive - is essential for the job, it will be included.

These powerful, optimized computers are not cheap, but they're massively more cost-effective than the computers of old. You can do stuff on a modern workstation that, a few years ago, would have needed either a room full of computers or, if you wanted a dedicated graphics workstation, around the sum of money you'd spend on a new Aston Martin.

That's all changed now, but one thing that hasn't changed much is that there is a limit to what you can do remotely from one of these powerful devices. That's because with video and animation, there's a lot of data to move around.

And that's a pity, because there are serious advantages to keeping content, processing and storage all in one place, the biggest of which is security. Both in content creation and in engineering (another field that makes use of powerful workstations), having designs and content on multiple machines makes the possibility of "leakage" of intellectual property much higher than if it's contained in one data centre - and there's a very good way to enforce this, as we'll see in a minute.

Virtualization

So, what is virtualization? If you've never worked in an IT department, it's quite likely that even if you've heard of it, you won't have come across it in practice.

When software is running in a virtualized environment, it is - pretty much - unaware of the hardware it's running on. That's because it's running on a "virtual machine". These days you come across the term "virtual machine" a lot, and it crops up in several different guises. If you're a programmer, you might be aware that Java programs run in a virtual machine: a type of idealized computer that sits between the Java program and the real computer. Of course, the virtual machine has to "understand" what type of computer it's being hosted on, but the program doesn't - and that's a really important thing. It means it really can run on anything that supports the virtual machine. Microsoft's .NET Framework has one, sort of, and so does Android. But these are not the type of virtual machine that we're talking about in the case of Dell's Virtual Workstations, although the concept is not totally dissimilar.

With workstation virtualization, the software runs in a virtual computer - one that looks just like the real host computer, but is abstracted from it.

This has many advantages. For a start, you can create an "image" of your virtual system, complete with all the installed software and the drivers, and then either reinstall it if the original system gets corrupted, or you can move the whole thing onto another physical computer if the original host breaks down.
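To make that concrete, here's a minimal sketch of the snapshot-and-restore idea using the open-source libvirt Python bindings. This is purely illustrative - it isn't Dell's actual tooling, and the VM name is made up:

```python
# A minimal sketch of the image/snapshot idea, using the open-source
# libvirt Python bindings. The VM name "edit-suite-01" is hypothetical;
# this is not Dell's actual stack, just an illustration.
import libvirt

conn = libvirt.open("qemu:///system")      # connect to the local hypervisor
dom = conn.lookupByName("edit-suite-01")   # an existing virtual machine

# Capture an "image" of the guest - OS, applications, drivers and all.
dom.snapshotCreateXML("""
<domainsnapshot>
  <name>known-good</name>
  <description>Clean install with all software and drivers</description>
</domainsnapshot>
""", 0)

# Later, if the guest gets corrupted, roll it back in one call.
snap = dom.snapshotLookupByName("known-good", 0)
dom.revertToSnapshot(snap, 0)

conn.close()
```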


Multiple virtual machines

You can also run multiple virtual machines on the same physical one. Through the use of time-sharing, multiple systems can share the same resources. Of course, there's a performance hit, but there are two things to say about this. First, it's not as bad as you'd think, and second, virtualization software is getting better all the time, just as machines are getting faster - so for many sorts of application (spreadsheets, word processing, databases, etc.) even a virtual machine that's sharing physical resources with many others can perform perfectly well. In most cases, the virtual machine only has to run a basic application fast enough that the user isn't affected by - or even aware of - the fact that they're using a virtual machine.
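As a rough illustration of many guests sharing one host, here's a short sketch - again using the open-source libvirt bindings rather than anything Dell-specific - that lists each virtual machine and the resources it thinks it has:

```python
# Enumerating the guests that share one physical host, using the
# open-source libvirt bindings (illustrative only - any mainstream
# hypervisor exposes similar information).
import libvirt

conn = libvirt.open("qemu:///system")
host_cpus = conn.getInfo()[2]      # number of physical CPUs on the host

total_vcpus = 0
for dom in conn.listAllDomains():
    state, max_mem, mem, vcpus, cpu_time = dom.info()
    total_vcpus += vcpus
    print(f"{dom.name()}: {vcpus} vCPUs, {mem // 1024} MB RAM")

# The guests' vCPUs can comfortably outnumber the host's real cores:
# the hypervisor simply time-slices the physical CPUs between them.
print(f"{total_vcpus} virtual CPUs sharing {host_cpus} physical ones")
conn.close()
```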

This sort of virtualization is extremely widespread and largely tried and tested. Banks and other businesses use it with barely a second thought.

But there's a catch. A big one.

The catch is that you can't virtualize something that is physical. It would be very nice if you could duplicate your network and I/O ports, but you can't. Nor can you just make replicas of your computer's GPU. And every additional virtual machine that you run spreads the computing resources of your system across yet another user.

None of this sounds very promising for typical workstation users. They like their applications to run "close to the metal", not to be shared in a non-deterministic way between multiple users.

Until now, this has been a showstopper for workstations. It simply doesn't make sense to virtualize them for all the above reasons.

It's becoming possible

But now, and it feels surprising to say this, it is beginning to be possible to virtualize even workstations. There are limits, but according to Dell we're on the edge of a new era of virtualized workstations, with tangible advantages and remarkably few disadvantages.

Before we look at how they do it, let's reiterate why this might be a good thing.

Why virtualization is a good thing

In a post production facility, for example, you will often find ten or even fifty people working on material from the same, or multiple, productions. If what they're doing is simply working on shared content - for editing, perhaps - there's not really much need for a virtual system. All media is available to all users, and it works well, because only files are being requested.

But what if you want to process the material heavily? No problem if you still have a workstation as a network client. The idea with a remote system, though, is that you should be able to use all your resources from the server (in fact, from your local "cloud", if you want to call it that).

There are huge issues here. Let's say that you want your graphics cards (your GPUs) to live remotely. That's an awful lot of data to transfer in real time - more, in fact, than just about any network could cope with. You're probably aware that the speed of the link between the computer and the GPU is one of the limiting factors in the overall speed of the system. A network connection isn't going to help here.
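A quick back-of-envelope calculation shows why. The figures below are rough approximations of my own, not Dell's numbers, but the gap speaks for itself:

```python
# Back-of-envelope arithmetic: why the GPU can't sit at the far end of
# a network link. All figures are rough, illustrative approximations.
pcie3_x16_gbps = 126          # ~15.75 GB/s usable on a PCIe 3.0 x16 slot
ten_gige_gbps = 10            # a fast office network connection

# Ignore textures and geometry entirely and count only raw 4K pixels
# at 10 bits per channel, 60 frames per second:
bits_per_frame = 3840 * 2160 * 3 * 10
stream_gbps = bits_per_frame * 60 / 1e9

print(f"Raw 4K/60 pixels alone: {stream_gbps:.1f} Gbit/s")   # ~14.9
print(f"PCIe 3.0 x16 bus:       {pcie3_x16_gbps} Gbit/s")
print(f"10 Gigabit Ethernet:    {ten_gige_gbps} Gbit/s")
```

The raw pixel stream alone would saturate a fast network link, and the traffic between a computer and its GPU is far heavier than that.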

So this is where Dell and its partners have turned the problem on its head. What they've done is kept absolutely all the content and the resources at the server side. Essentially, they send you an image of what would have been on your local workstation's screen if you had one.  

There's a very big advantage to this: it means that you never have to exchange a big file with a colleague over a slow network, because all of your material is in the same place - the data center!

One way they do this - and it's perhaps the most elegant, and certainly the most foolproof - is to put a special encoding card in the server, which "intercepts" the pixels that would have been sent to the workstation's monitor and sends them over the network using a proprietary protocol called PC over IP. At the user's side, there's a small box made by Wyse that doesn't have any operating system or anything else to set up or adjust. All you do is plug your monitor and mouse into it.
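PC over IP itself is proprietary, so I can't show you the real thing, but the general shape of the idea - compress each frame of pixels on the server and ship it, length-prefixed, to a dumb endpoint - looks something like this toy Python sketch:

```python
# A toy illustration of the remote-pixels idea - NOT the real PC over IP
# protocol, which is proprietary. The server compresses each frame of
# pixels and ships it to a thin endpoint that only has to decompress
# and display.
import socket
import struct
import zlib

def send_frame(sock: socket.socket, raw_pixels: bytes) -> None:
    """Server side: compress one frame and send it with a length prefix."""
    payload = zlib.compress(raw_pixels)
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def recv_frame(sock: socket.socket) -> bytes:
    """Client side: read one length-prefixed frame and recover the pixels."""
    (length,) = struct.unpack("!I", _read_exact(sock, 4))
    return zlib.decompress(_read_exact(sock, length))

def _read_exact(sock: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed mid-frame")
        buf += chunk
    return buf
```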

Zero Client

In case you're wondering where the computer is, there isn't one. It's called a "Zero Client" with good reason.

What comes out of the box is pixels: the very same pixels you would see if you plugged a monitor into the workstation in the server rack. You are, quite simply, running and seeing the server remotely.

As long as there isn't too much latency in the network (by latency we mean delay), you wouldn't know that you're using the workstation remotely. But what you definitely can't do is steal the material, because it just isn't there. There's no way you can copy files or do anything else to "get at" the content. It's a pretty ideal solution for any organization that's worried about their material.

And it also means that you can sit down at any of these Zero Clients in a building and all your content - as well as your resources - will be there for you instantly.

There are limits of course. For a start, the encoding into the PC over IP protocol involves compression and is not going to give you ten or twelve bit video. This will limit its use for very high precision jobs like color grading. But if you're animating or working on pre-viz, or doing a straightforward edit, it would probably work well.

If that all makes sense so far, then good - but notice that we haven't mentioned virtualization yet. So far, what we've described is really just remote control.


Virtualizing a workstation

The biggest problem with virtualizing a workstation is that the whole workstation ethos is about having access to real, raw hardware resources, whether that's compute power or GPU capability. With traditional virtualization, there is a significant loss in raw compute power, because a sizable chunk of the host's resources is used to generate the virtual PC in which the guest operates.

Most non-power users would not notice this loss, but for compute-intensive tasks it would be a deal breaker. So how does Dell get round this?

They do it in two ways.

There's the "traditional" way, which is that the computer's power is shared between users by time-slicing. This is handled by the "Hypervizor", which is a the program that manages the virtual machines. To some extent it doesn't matter whether you're working on mundane, text-based stuff, or on high-stress computer animation: time-slicing is still time-slicing, but if you are previewing your work, as long as you get an acceptable frame rate, it's probably OK. The system administrators will be able to allocate system resources appropriately for your work. If you're timesharing a GPU card then this method is called VGPU, or Virtual GPU.

But if you really do want the maximum power available for your graphics work, there's a second mode of virtualization, called "GPU Pass-Through". This is probably what you want if you're a real power user.

With GPU Pass-Through, your application is still running in a virtual machine, and there may be several of them, so there may be multiple users on a single workstation server. But the difference is that you, and each of the other users, will be allocated an entire GPU card, or a whole GPU unit on a multi-GPU card.

Which means it will feel as though you have a high-end graphics card at your immediate disposal, just like you would in a workstation under your desk.
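For the curious, in an open-source stack such as KVM/libvirt - the article doesn't say exactly what Dell uses - handing an entire physical GPU to one guest looks roughly like this; the PCI address and VM name are hypothetical:

```python
# Roughly what pass-through looks like under KVM/libvirt, an
# open-source stack (Dell's exact tooling isn't specified here).
# The PCI address and the VM name below are hypothetical.
import libvirt

GPU_XML = """
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
</hostdev>
"""

conn = libvirt.open("qemu:///system")
dom = conn.lookupByName("grading-suite-01")

# Hand the entire physical GPU to this one guest; no other VM can use
# it until it's detached again.
dom.attachDeviceFlags(GPU_XML, libvirt.VIR_DOMAIN_AFFECT_CONFIG)
conn.close()
```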

Of course, the number of users in this case is limited to the number of GPU cards in a server, or the number of GPU units on a multi-GPU card. But this is still a very efficient way to manage a larger number of users, and it still maintains absolute security, because the user doesn't have a single ounce of material on a drive on his or her desktop.

So that's it. It's a way to allow multiple users to use remote workstations, safely hidden away in a data center, without seriously impacting their ability to do high-powered computing. Potentially it's a way for larger facilities to scale their resources and to manage their users much more efficiently.

Technologies improve

 As the techniques and technologies improve, this ability to work remotely and virtually will become more dynamic, and you'll be allocated more resources as you need them. As a user, this should appear to be seamless and automatic.
 
 Dell understands that everyone will be cautious, if not outright wary, of workstation virtualization. There are so many things to get right, and so many things that could go wrong.
 
 So, they've taken this into account while designing their products and services.
 
First, they've taken responsibility for the whole software and hardware "stack". This means that they're committed to making it work - and to making sure that your own system works. They do this by carefully researching and testing not just the hardware, but the way every piece of hardware and software interacts, in all imaginable circumstances. And where there are issues that escape this process, they're committed to working with clients until the problem is solved. They're certainly working closely with software manufacturers and hardware people like Nvidia.
 
Secondly, they've built a Workstation Virtualization Center of Excellence at their worldwide headquarters in Austin, so that they can try out real-world applications in real-world use. Workstations are incredibly important to Dell, and server workstations are an increasing part of their inventory. Virtualization has become an essential way to scale up IT infrastructures, and it now seems that the time has come to apply this magic to heavy-duty content creation.
 
 And as broadband speeds improve, I would say that we can confidently expect a future where we not only rent our software, but hire the workstations to run it on, remotely, by the hour.

 
