Key considerations for implementing a desktop virtualization strategy

(Image credit: Shutterstock/bluebay)

It’s hard to believe it has only been a couple of months since IT departments worldwide worked against the clock to move hundreds of thousands of employees from traditional workplaces to virtual workspaces. It may have felt temporary to employees at first, but the IT departments supporting these moves knew immediately they would need to prepare for a new, more flexible work environment on the same or a reduced IT budget. As more time goes by, companies find themselves contemplating a more permanent shift to remote work post-Covid-19, according to a recent Gartner study.

As more businesses and employees begin to embrace what could be a more permanent virtual work environment, IT teams are working quickly behind the scenes to create a desktop virtualization solution that satisfies user experience demands while reducing business costs, streamlining management workloads and maintaining strong security.

But IT teams can easily become overwhelmed when they realize that implementing a single desktop virtualization model across the entire organization is often impossible, as different kinds of users require different resources. For this reason, it is important to work with a trusted technology partner who can help you create a desktop virtualization strategy that fits your “virtualization profile,” ensuring you select a solution that meets your goals and objectives.

For best results, IT teams should take the following steps:

Step 1: Create a “virtualization profile”

The first step is to take inventory of user behavior across the organization and compile the data into what we call a “virtualization profile.” This means identifying what is unique about your organization: the number of users, usage types, the most-used applications, and the business goals and objectives. It is also important to take time at the beginning to evaluate:

  • Network requirements – Estimate the usage of your networking infrastructure, factoring in future growth, to determine how much network bandwidth will be required. A good guideline is to start with at least 10 Gbps and move to 25 Gbps (or higher) as loads increase. Aim for the lowest latency possible for the cost, and work with your network solution providers to deliver enterprise-grade networking.
  • Compute requirements – Virtual desktop workloads are very CPU-intensive, so use server systems with as many high-performing compute cores as possible. For example, a system with 24 cores at 2.7 GHz will outperform a system with 24 cores at 2.1 GHz, because the higher clock speed pushes more data per compute cycle. A “users-per-core” planning approach can help approximate the number of servers needed: begin with user counts for task, knowledge and power users, and analyze any pre-determined business criteria to inform your overall configuration.
  • Memory requirements – Just as different types of users (task, knowledge, power) have different vCPU requirements, they also have different virtual memory (vRAM) requirements. Provision enough memory to support users without over-provisioning, much as you would when buying a new laptop for an end user: under-provisioning leads to resource contention, which degrades performance and the user experience. Persistent memory lets enterprises allocate more memory per server than similarly priced DRAM alone. Providing more memory for hot data allows enterprises to balance cost savings with added resources and achieve a better overall TCO for the solution. As virtualized desktop workloads grow, data center server partners can put more memory capacity in each server, using Intel persistent memory to offer a lower-cost edge data center solution and a better user experience.
  • Storage requirements – Storage is critical for performance. Many virtual desktop solutions are based on Hyperconverged Infrastructure (HCI), where all resources are shared between identical x86 server platforms. A rich storage subsystem ensures all workloads have the serviceability and performance needed to deliver data to end users. Smaller environments typically have very light usage and may be a better fit for local storage, while larger environments often require advanced storage. SSDs are excellent options for both caching and capacity: in an HCI architecture they deliver a high-performance NVMe caching layer, and SATA or NVMe endurance far better than HDDs, so remote workers do not experience lag while doing their work.
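As a rough illustration, the users-per-core and vRAM guidelines above can be combined into a simple sizing sketch. The per-profile numbers and server specs below are hypothetical planning assumptions for illustration only, not vendor or Login VSI figures; substitute values from your own virtualization profile:

```python
import math

# Hypothetical per-user planning assumptions -- tune these to your own
# "virtualization profile"; they are illustrative, not vendor figures.
PROFILES = {
    #             users_per_core, vram_gb_per_user
    "task":      (8, 2),
    "knowledge": (6, 4),
    "power":     (2, 8),
}

def size_cluster(user_counts, cores_per_server=24, ram_gb_per_server=384):
    """Estimate how many servers a mixed user population needs.

    Sizes for CPU and memory separately, then takes the larger answer
    so neither resource is under-provisioned.
    """
    cores_needed = sum(n / PROFILES[p][0] for p, n in user_counts.items())
    ram_needed = sum(n * PROFILES[p][1] for p, n in user_counts.items())
    servers_for_cpu = math.ceil(cores_needed / cores_per_server)
    servers_for_ram = math.ceil(ram_needed / ram_gb_per_server)
    return max(servers_for_cpu, servers_for_ram)

# Example: 400 task, 300 knowledge and 50 power users.
servers = size_cluster({"task": 400, "knowledge": 300, "power": 50})
```

In this example the memory requirement, not the core count, determines the server count, which is why sizing CPU and memory independently matters.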

There are some excellent published guidelines for building virtualization profiles. LoginVSI has published common guidelines for four types of workers: Task, Office, Knowledge and Power Workers. Details can be found here.

These virtualization profile guidelines were used in a recent project to build a VDI solution that delivers not only 20 percent more VDI users per dollar, but also a TCO per VDI user that is over 16 percent lower. More details can be found here.

Step 2: Use your “virtualization profile” to guide you

The next step is to determine the desktop virtualization delivery model, based on the type of work being done by the remote workers you are trying to support. Getting data closer to the user is a key task: reducing latency and maintaining a performant system will deliver solid results. This is where working with a proven technology partner can add real value, as they can help you evaluate different solutions for your business.

Step 3: Evaluate desktop virtualization models

There are several different models to consider; the following are just a few:

  • The virtual container model, which centrally manages locally executed virtual desktops running on virtual machines. This is a great choice for organizations operating under a “Bring Your Own Device” policy, since it allows isolation between corporate data and personal applications.
  • The application streaming model, which packages applications within a virtualized application tool. The application runs locally, allowing for user customization, better responsiveness, and a reduced burden on server-side infrastructure. Benefits can include an always-on experience and a single update of the streamed application instead of individual software updates on each PC running it locally.
  • The OS streaming model, which uses a hard disk image on the network with local processing. This can help remove some of the server load; however, this method is labor-intensive, as it offers no network mobility and typically requires IT to carefully sequence and validate the image in advance. Benefits can include improved security in cases where users log in from multiple workstations and/or multiple PCs.
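The trade-offs above can be reduced to a first-pass decision rule. This helper is purely illustrative: the criteria and their ordering are simplifications of the three delivery models described above, not a formal selection methodology:

```python
# Illustrative first-pass model selector. The criteria and their
# priority order are assumptions drawn from the trade-offs above,
# not a formal methodology -- validate with your technology partner.
def suggest_model(byod: bool, shared_workstations: bool) -> str:
    if byod:
        # BYOD favors isolation between corporate data and
        # personal applications on the same device.
        return "virtual container"
    if shared_workstations:
        # A network-hosted OS image follows users who log in
        # from multiple workstations or PCs.
        return "OS streaming"
    # Default: run centrally packaged apps locally for
    # responsiveness and single-point updates.
    return "application streaming"

# Example: a BYOD workforce points toward the container model.
model = suggest_model(byod=True, shared_workstations=False)
```

In practice most organizations land on a mix of these models across user groups, as Step 4 notes.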

Step 4: Assess total cost of ownership

When assessing the total cost of ownership, it’s important to remember that investments now will be paid back over time, as desktop virtualization greatly improves worker efficiency by offering secure access to business-critical applications while employees work remotely. The results can range from speeding care for sick patients by delivering medical records to their bedside, to changing global supply chains at a moment’s notice. Recognizing that each business has different needs, most find a combination approach works best – for example, providing employees with individual notebook PCs while business-critical applications and data are managed centrally through a mix of VDI models.
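A simple way to compare candidate solutions is cost per supported user over the expected life of the deployment. The figures below are hypothetical placeholders, not the project results cited earlier; substitute quotes from your own vendors:

```python
# Minimal TCO-per-user sketch. All cost figures are hypothetical
# placeholders for illustration -- use your own vendor quotes.
def tco_per_user(capex, annual_opex, users, years=5):
    """Total cost of ownership per supported user over the period."""
    return (capex + annual_opex * years) / users

# Compare a baseline design against a denser proposed design.
baseline = tco_per_user(capex=500_000, annual_opex=120_000, users=800)
proposed = tco_per_user(capex=550_000, annual_opex=100_000, users=1000)
savings_pct = (baseline - proposed) / baseline * 100
```

Note that the denser design wins here despite higher up-front capex, because supporting more users per server spreads both capex and opex across a larger population.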

Choosing what’s right for your business

The right solution for your business will depend greatly on your business needs, user profiles, security requirements, applications and workloads. In short, there is no “one size fits all” solution, which is why it’s important to work closely with a trusted technology partner with a broad portfolio to customize your desktop virtualization solution.

Learn more in my talk about future-proofing your IT organization & distributed workforce here.

Todd Christ, Enterprise Solutions Architect, Intel

Todd Christ is a Senior Solutions Architect at Intel. He worked in several Information Technology roles prior to joining the Intel Data Platforms Group, where he specializes in Software Defined Infrastructure models, solutions and technologies. Todd is a strong proponent of Hybrid- and Multi-Cloud strategies for Enterprise and Government markets.