
WHY SHARED RESOURCE COMPUTING: REDUCTION OF IMPLEMENTATION COSTS IN ICT4EDUCATION PROJECTS



School administrators are always looking for ways to reduce their IT costs while increasing accessibility, flexibility, and agility, but that combination rarely comes together at the same time.

The development and proliferation of virtual client solutions provide a notable exception, however, because the technology can give businesses, government organizations, and educational institutions a cheaper yet more flexible solution for their client devices. Virtual client solutions leverage the horsepower of centralized computing resources (both servers and PCs) to provide end users with a traditional desktop computing experience, but in a less expensive and more controlled manner. The combination of low-cost, energy-efficient access devices and software that creates, delivers, and lets IT centrally manage virtual user sessions is a powerful package that can transform the way organizations use and think about their end-user client computing services.

Desktop PCs may not be the sexiest technology devices, but they are unquestionably the workhorses of most institutions and continue to make up a large majority of the commercial client installed base. As a result, when organizations start to think about trying to rein in their costs or improve their overall environment, desktop PCs often are the first target.
Lately, desktop discussions have become more complicated because of the introduction of desktop virtualization technologies, which are designed to provide end users with an experience that’s roughly equivalent to a regular PC in terms of performance but offer significant cost advantages in terms of initial purchase, ongoing support costs, and management control.
The basic concept behind virtualization involves reassigning different parts of the computing process to different elements in a networked environment. Specifically, instead of having all of a PC’s software, including both the operating system (OS) and applications, reside on a hard drive in the PC and then having all the program execution occur using the CPU inside that PC, virtualization moves much of the storage and program execution to a server or other centralized computing resource.
As a result, the client access device functions much like a terminal accessing a centralized mainframe. But unlike the old command line–driven mainframe applications of the past, in this virtualized environment, end users can work in a familiar Windows operating system environment complete with normal Windows and multimedia applications. So, end users have an experience that’s akin to using a regular, standalone PC, but IT departments can enjoy a number of important benefits,
including:
- Centralized storage. With all data stored in a central location, backups are easier, and more robust network storage devices can be used.
- Centralized management. IT departments can easily ensure that all end users are running the latest operating system security patches and application versions, and that installations comply with license agreements.
- Enhanced security. By managing and storing all the organization's critical applications and data in a central location, IT departments can limit access to the data infrastructure, creating a more secure environment.
- Reduced client support costs. By controlling what end users can install and run on their client devices, IT departments can avoid issues with rogue software and other end user–generated problems.
- More efficient use of resources. By moving computing tasks to centralized servers, IT departments can eliminate wasted compute resources at client desktops, saving hardware, maintenance, and energy costs.
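The resource-efficiency argument above can be made concrete with a back-of-the-envelope calculation. All of the figures below (hardware prices, energy and support costs, a 30-seat server) are illustrative assumptions, not vendor pricing:

```python
# Hypothetical per-seat total cost of ownership: standalone desktop PCs
# versus a shared-resource (virtual client) deployment. Every number here
# is an assumption for illustration only.

def per_seat_cost(hardware, energy_per_year, support_per_year, years):
    """Total cost per seat over the device's service lifetime."""
    return hardware + years * (energy_per_year + support_per_year)

# Standalone PC: full hardware cost, higher energy draw, more support work.
standalone = per_seat_cost(hardware=500, energy_per_year=60,
                           support_per_year=120, years=4)

# Shared model: one server amortized across many seats plus a cheap access device.
seats = 30
server_share = 3000 / seats                       # server cost split across sessions
shared = per_seat_cost(hardware=server_share + 120,  # ~$120 access device
                       energy_per_year=10, support_per_year=40, years=4)

print(f"standalone: ${standalone:.0f}/seat  shared: ${shared:.0f}/seat")
```

Even with conservative assumptions, the amortized server cost per seat is small compared with a full desktop, which is where the savings claimed above come from.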
Creating an environment that leverages virtualization typically requires setting up (or leveraging existing) centralized computing resources (typically servers, but in some instances it’s possible to use regular PCs), installing and configuring the necessary software on the centralized resources, creating accounts for each user, and then
setting up access devices at each end user’s working environment. Conceptually, it isn’t far removed from traditional networked PC applications — the critical differences are the software used on the centralized computing resources and the end-user devices. One of the other benefits of creating the environment necessary to enable
virtual clients is that many of the same steps are necessary to enable cloud-based computing solutions. In fact, thin clients and virtual clients are a perfect match for companies looking to move to cloud computing.
Regardless of the type of hardware used as the centralized computing resource, a server or other multiuser operating system is installed first, and then additional software is used to create and manage each of the end-user sessions. The end-user device logs into an account that gets associated with its own session and then is presented with a typical desktop environment, complete with applications and data. In
most instances, the actual execution of applications occurs on the centralized computing resources, and the visual results are transferred over a network or other connection to the end-user device and onto its attached display. Different protocols are used both to send this visual data back to the end-user device and to relay
keyboard and mouse input and the data generated by or needed by any peripherals attached to the end-user device (e.g., printers, smart card readers) back to the centralized user session.
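The relay described above can be sketched as a toy in-process model: input events travel from the access device up to the centralized session, and rendered output travels back down. Real deployments use remote-display protocols such as RDP or ICA over a network; the class and method names here are invented purely for illustration:

```python
# Toy model of the session relay: the host executes the application and
# produces screen output; the access device only forwards input and
# displays whatever the host sends back.

class HostSession:
    """A user session running on the centralized computing resource."""
    def __init__(self, user):
        self.user = user
        self.keystrokes = []

    def handle_input(self, event):
        # Application execution happens here, on the host, not on the client.
        self.keystrokes.append(event)
        return f"screen update for {self.user}: typed {''.join(self.keystrokes)!r}"

class AccessDevice:
    """A thin/zero client: no local execution, just forward and display."""
    def __init__(self, session):
        self.session = session
        self.display = ""

    def key_press(self, ch):
        self.display = self.session.handle_input(ch)

session = HostSession("student01")
client = AccessDevice(session)
for ch in "hi":
    client.key_press(ch)
print(client.display)  # the client shows output rendered by the host
```

The point of the sketch is the division of labor: the `AccessDevice` holds no application state at all, which is exactly what makes the endpoint cheap to buy and manage.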
The real “magic” of the system derives from the software used on the centralized computing resources to enable and manage the various user sessions. The manner in which this software utilizes the hardware resources (that is, “virtualizes” the hardware) and the different levels of capability enabled by this software determine the effectiveness (and strongly influence the cost) of the overall solution.
Traditional centralized client environments virtualize applications that are accessed from either traditional PCs or thin clients that still have a local operating system and CPU that must be managed. Many organizations have deployed these kinds of solutions to centrally manage a select set of applications, often important applications that IT wants or needs to monitor closely. While these types of solutions provide reasonable cost benefits by improving application management, they can be limited by the fact that certain applications, particularly critical custom corporate applications, won't run in shared environments. In addition, these "traditional" deployments still rely on PCs or thin clients that have their own management requirements.
Newer virtual desktop infrastructure (VDI) solutions solve the application compatibility problem of older applications by running them in separate, independent instances of end-user operating systems, each of which runs as a virtual machine on a server. While this type of solution does offer benefits versus the "traditional" thin client model, it adds to the cost of the entire solution by requiring expensive license fees for each unique OS instance and reducing the client-to-server ratio to numbers that are often less than 30:1, even on a high-end server. Plus, most VDI approaches still rely on full PCs or thin clients at the user's desktop that must be managed.
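The licensing and ratio trade-off can be illustrated with rough numbers. The server cost, license fee, and session ratios below are assumptions chosen only to show the shape of the comparison, not real pricing:

```python
# Illustrative per-seat comparison of the two centralized models discussed
# above. All inputs are hypothetical.

def vdi_per_seat(server_cost, os_license, ratio):
    """VDI: each seat runs its own OS instance (its own license),
    and the client-to-server ratio is comparatively low."""
    return server_cost / ratio + os_license

def session_per_seat(server_cost, ratio):
    """Session-based model: sessions share one OS, so there is no
    per-seat OS license and the ratio can be much higher."""
    return server_cost / ratio

vdi = vdi_per_seat(server_cost=6000, os_license=100, ratio=30)    # ~30:1 on a high-end server
session_based = session_per_seat(server_cost=6000, ratio=75)      # sessions share one OS

print(f"VDI: ${vdi:.0f}/seat  session-based: ${session_based:.0f}/seat")
```

The per-instance license fee and the lower seat density compound each other, which is why the article singles both out as VDI cost drivers.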
One of the newer options for centralized computing models takes the concept even further — the entire computing experience occurs on the host server device, and very simple devices referred to as “zero clients” are placed on individual users’ desks.
These zero clients have no traditional CPU or memory — often they have just a monitor, mouse, keyboard, speaker, microphone, and USB connection — and just enough firmware to point the device to a host server, where they connect to one of the individual user sessions being run on the server device. Input data from the zero clients is forwarded along to the host session, and the screen output from the session
is passed back down to the client and shown on the connected display.
While most of the host software for these newer options is designed to run on server-class hardware, several solutions, including Microsoft's new Windows MultiPoint Server 2010 OS, open up new opportunities by bringing this capability to traditional PCs. These solutions provide for multiple licensed user sessions on a single PC, leveraging the significant horsepower found in today's desktop CPUs that would otherwise be underutilized.
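A rough sizing sketch suggests how far a single desktop PC can stretch under such a multi-session model. The per-session RAM and CPU figures are assumptions for a light classroom workload, not Windows MultiPoint Server specifications:

```python
# Hypothetical capacity estimate: how many concurrent user sessions one
# desktop PC can host, bounded by whichever resource runs out first.
# Per-session figures are assumptions for light classroom use.

def max_sessions(total_ram_gb, total_cpu_threads,
                 ram_per_session_gb=0.5, threads_per_session=0.5,
                 os_overhead_gb=2):
    by_ram = int((total_ram_gb - os_overhead_gb) / ram_per_session_gb)
    by_cpu = int(total_cpu_threads / threads_per_session)
    return min(by_ram, by_cpu)   # the tighter constraint wins

# A typical modern desktop: 8 GB RAM, quad-core CPU with 8 threads.
print(max_sessions(total_ram_gb=8, total_cpu_threads=8))
```

Under these assumptions a single classroom PC could serve around a dozen light sessions, which is the "underutilized horsepower" the paragraph above refers to; memory, not CPU, is the binding constraint in this example.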


Posted on June 14, 2011 in Uncategorized
