Open Source in the Next Computing Wave

Gordon Haff
9 January 2009
Copyright 2009 Illuminata, Inc.

Open source software has been both benefactor and beneficiary of the Internet wave of computing, during which large-scale, network-connected computing architectures built from relatively standardized hardware and software components have come into their own. The Nineties may have been the years when small workgroup systems running operating systems such as NetWare and Windows NT arrived throughout the business world in a big way. But the Noughts have made distributed systems a core part of almost every datacenter.

Open source has fitted this evolution well. Linux itself came to be, in a sense, the unified Unix that had failed to be birthed through more conventional commercial partnerships.[1] And Unix-style operating systems, historically closely tied to the rise of computer networks (including standards like TCP/IP that underpin the Internet), were a great technical match for an increasingly network-centric style of computing. At the same time, those computer networks provided the widespread connectivity that collaborative open source development needed to flourish. For users, these networks provided easy access to the open source software and a means to connect to, and engage with, a community of other users. Nor did it hurt that the large scale of many of these new computing infrastructures made cost a bigger issue than ever before, which helped to drive the proliferation of x86-based servers and open source software.

Today, a fresh set of trends and technologies is changing the way that we build computing systems and operate them. Two of the biggest are virtualization and cloud computing. Virtualization effectively decouples operating systems and their applications from server hardware, and thereby makes it easier to physically move them from one machine to another. Cloud computing is changing where applications run: from on-premise to out in the network. Business dynamics are also changing. Even if it's often just a self-interested concern about their power bill, we are starting to see a greater awareness of environmental issues among those responsible for operating datacenters. The current economic climate is also forcing more systematic thinking about costs in general, including those associated with overall complexity and with the security and resiliency of large distributed infrastructures. These trends intersect in powerful ways; a new wave of computing is gathering momentum as a result. And open source is once again playing a major role.

[1] We use Unix here in the sense of a family of modular operating systems that generally share programmatic interfaces and other conventions and approaches.

Cloud Computing's Coming of Age
We consider cloud computing first. There's certainly plenty of buzz about it. For our purposes here, we define cloud computing as accessing computing resources over a network, whether those resources take the form of a complete application (Software as a Service, or SaaS); a developer platform such as Google Apps or Microsoft Azure; or something that's more akin to a barebones operating system, storage, or a database (Amazon Web Services).[2]

As recounted by, among others, Nick Carr in his The Big Switch, cloud computing metaphorically mirrors the evolution of power generation and distribution. Industrial Revolution factories, such as those that once occupied many of the riverside brick buildings I overlook from my Nashua, New Hampshire office, built largely customized systems to run looms and other automated tools, powered by water and other sources. These power generation and distribution systems were a competitive differentiator; the more power you had, the more machines you could run, and the more you could produce for sale. Today, by contrast, power (in the form of electricity) is just a commodity for most companies, something that they pull off the grid and pay for based on how much they use.

The economic argument underpinning cloud computing has two basic parts. The first relates to how firms should generally focus their resources on those things that differentiate them and give them advantage over competitors. Computer systems, especially those devoted to mundane tasks such as email, aren't one of those differentiators for many companies.[3] The second part relates to the size and scope of computing facilities. Efficient IT operations involve a high degree of standardization, up-front design, and automated operation; applying this degree of industrialization to datacenters and their operation isn't really viable at small scale.[4]

[2] See our "To Cloud or Not to Cloud" for more discussion of the different forms that cloud computing takes.
[3] One of the earliest examples of widespread outsourcing of a computing task was payroll. This function is certainly important, but having better payroll (whatever that would mean) isn't something that advantages a company.
[4] There's an ongoing debate over how big "big" needs to be. See our "Bigness in the Cloud." But there's general agreement that the entry point is somewhere around large datacenter scale.

Open Source in the Cloud

Open source is very much part of cloud computing. The benefit of open source to cloud providers is clear at several levels.

First, there's the matter of cost. Open source isn't necessarily "free as in beer" (to use the popular expression), that is, zero cost; companies often want subscription offerings and support contracts even if the bits are nominally available for free. But it does tend to be less expensive than proprietary alternatives, even when some production-scale features are extra-cost options (as in the case of the monitoring tools in MySQL Enterprise). And this is no small consideration when you look at the size of providers like Amazon and Google, which often seem to add datacenters at a rate that many companies once added computers.

Open source software is also just a good match for this style of computing. For one thing, cloud providers, almost by definition, are technically savvy and sophisticated. Although they don't want to reinvent every wheel, they're generally ready, able, and willing to tweak software and even hardware in the interests of optimization. Open source software, and more broadly the open source communities with which they can engage, are therefore a good fit, given that providers can modify source code and otherwise participate in evolving software in a way that meets their requirements.

There are some areas of friction between open source and cloud computing. We see this in the ongoing social and community pressure on large cloud vendors such as Google to make their fair share of contributions to open source projects.[5]

[5] Most copyleft open source licenses, such as the GPL, don't require that code enhancements be contributed back to the community when the software is delivered only in the form of a service, as is typical with cloud computing.
Proprietary Web-based applications and services, such as those from Google, 37signals, Salesforce.com, and even some traditional software vendors, also tend to mirror certain open source strengths such as easy acquisition.

However, in the main, it's a largely healthy and mutually beneficial relationship. Open source is widely embraced by all manner of technology companies because they've found that, for many purposes, open source is a great way to engage with developer and user communities, and even with competitors. In other words, they've found that it's in their own interests to participate in the ongoing evolution of relevant projects rather than simply taking a version of a project private and then working on it in isolation.

Virtualization

Virtualization is the other buzziest IT topic today. Truth be told, when it comes to enterprise computing, it's actually of more immediate interest than cloud computing, given that it's a more developed set of technologies and its use cases are better understood.[6]

To better understand how server virtualization[7] plays with both cloud computing and open source, it helps to think about what virtualization really is and how it is evolving. The core component of server virtualization is a hypervisor, a layer of software that sits between a server's hardware and the operating system or systems that run on top in their isolated virtual machines (VMs). Essentially, the hypervisor presents an idealized abstraction of the server to the software above. It can also make it appear as if there are multiple such independent servers, all of which, in reality, cooperatively share the physical server's hardware under the control of the hypervisor.
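One way to see the hypervisor's role as a manager of independent virtual machines is through a management API. The following is a minimal sketch, not anything from the original report, using the libvirt Python bindings commonly used with both Xen and KVM; it assumes libvirt is installed and a local KVM/QEMU host is running, and the connection URI is illustrative.

    # A minimal sketch, assuming the libvirt Python bindings are installed and a
    # local KVM/QEMU hypervisor is running; the URI and output format are illustrative.
    import libvirt

    # "qemu:///system" targets a local KVM/QEMU host; "xen:///" would target Xen.
    conn = libvirt.open("qemu:///system")

    # Each "domain" is one isolated virtual machine managed by the hypervisor.
    for dom in conn.listAllDomains():
        state, max_mem_kib, mem_kib, vcpus, cpu_time_ns = dom.info()
        print(f"{dom.name()}: {vcpus} vCPUs, {mem_kib // 1024} MiB, state={state}")

    conn.close()

Changing only the connection URI points the same script at a Xen host; the management code talks to the hypervisor's abstraction rather than to any particular physical machine.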
This ability to share a single (often underutilized) physical server is certainly a salient trait of virtualization. In fact, it's the main reason that most companies first adopt virtualization: to reduce the number of physical servers they have to purchase to run a given number of workloads. However, looking forward, the abstraction layer that virtualization inserts between hardware and application software is at least as important, whether it's used to run multiple operating system images on a single server or not.

Historically, once an application was installed on a system, it was pretty much stuck there for life. That's because the act of installing the application, together with its associated operating system and other components, effectively bound it to the specifics of the physical hardware. Moving the application meant dealing with all manner of dependencies and, in short, breaking stuff.

Placing a hypervisor in the middle means that the software is now dealing with a relatively standardized abstraction of the hardware rather than the actual hardware. The result is greatly increased portability.

Portability, in turn, enables lots of interesting practical uses. For example, administrators can take a snapshot of an entire running system for archive purposes, or to roll back to if there's a problem with a system upgrade. VMs can be transferred from one system to another, without interrupting users, to balance loads or to perform scheduled maintenance on a server. Ultimately, virtualization enables what is often called a virtual infrastructure or a dynamic infrastructure. By whatever name, it is an infrastructure in which workloads are able to move to wherever they are most appropriately run, rather than staying wherever they happened to be installed once upon a time.
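To make those uses concrete, here is a minimal sketch, again using the libvirt Python bindings rather than anything described in the report itself; the guest name "webapp01", the host URIs, and the save-file path are placeholders. It shows the two operations just mentioned: saving a running VM's state to disk so it can be archived or rolled back, and live-migrating a VM to another physical server.

    # A minimal sketch using the libvirt Python bindings; "webapp01", the host URIs,
    # and the save-file path are placeholders, not details from the report.
    import libvirt

    src = libvirt.open("qemu:///system")

    # 1. Snapshot-style save/restore: write the running guest's state to a file
    #    (the guest stops), then resume it later exactly where it left off.
    dom = src.lookupByName("webapp01")
    dom.save("/var/lib/libvirt/save/webapp01.img")
    src.restore("/var/lib/libvirt/save/webapp01.img")

    # 2. Live migration: move the running guest to another physical server without
    #    interrupting users, e.g. to rebalance load or drain a host for maintenance.
    dst = libvirt.open("qemu+ssh://otherhost.example.com/system")
    dom = src.lookupByName("webapp01")
    dom.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)

    src.close()
    dst.close()

In practice these operations are usually driven by management consoles rather than ad hoc scripts, but libvirt-based tools ultimately make calls like these.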
[6] Although various antecedents to, and subsets of, cloud computing go back some time; think hosting providers or even timesharing.
[7] At its most conceptual, virtualization is an approach to system design and management that happens in many places and at many layers in a system. For our purposes here, virtualization refers specifically to the particular approach to server virtualization described.

Open Source in Virtualization

VMware both brought proprietary server virtualization to the mainstream and has been the vendor that has most benefited from it to date. However, a variety of virtualization options are now available, enabled in part by enhancements to processor hardware from AMD and Intel that simplify some of the more difficult aspects of virtualizing x86 hardware.

Among these options are Xen and KVM. Both open source projects are part of standard Linux distributions.[8] They are also available in the form of a standalone hypervisor, essentially a small piece of code (often embedded in flash memory) that lets a server boot directly into a virtualized state without first installing an operating system. Guest operating systems can then be installed on top in the usual manner. Xen is the more widely used and mature of the two today. But Red Hat bought Qumranet, the startup behind KVM, in early September 2008 and is focusing on KVM as its strategic virtualization technology going forward; KVM has also been incorporated into the mainline Linux kernel since version 2.6.20.[9]

Virtualization has a close relationship to cloud computing, especially cloud computing implementations that provide users with an execution environment in the form of a virtual machine.[10] Virtualization brings a lot of the properties you'd want a cloud computing environment to have. You want to be able to store snapshots of your environment to use in the future. Check. You want to be able to spin up applications dynamically and shut them down when they're no longer needed. Check. You want to insulate users from details of the physical infrastructure so that you can make transparent upgrades and other changes. Check. Virtualization isn't a universal requirement for all types of cloud computing; high performance computing in particular often uses an alternative form of virtualization that's more about distributing a single large job to a large number of servers using certain standard protocols, sometimes called a grid. However, server virtualization is certainly an ideal complement to many cloud computing implementations.

Cloud computing providers have adopted open source virtualization approaches (especially Xen) for many of the same reasons that they've widely adopted open source in general. Amazon Elastic Compute Cloud (EC2) is an illustrative and well-known example of virtualization, paired with Linux, in the cloud.[11] With EC2, you rent VMs by the hour. Each VM comes with a specific quantity of CPU, memory, and storage; currently there are five different size combinations available. Users can then build their own complete VM from scratch. More commonly, they'll start from a standard Amazon Machine Image (AMI), an archived VM pre-loaded with an operating system, middleware, and other software.

Initially, these AMIs consisted almost entirely of community-supported Linux distributions. However, one of the things that we now see happening as cloud computing evolves from a developer-centric, kick-the-tires stage to something that supports production applications and even entire businesses,[12] is that some of the same concerns that are relevant to software running in an enterprise datacenter are finding their way into software running at cloud providers.

An example of this trend is AMIs with Red Hat Enterprise Linux (RHEL) and the JBoss Enterprise Application Platform (currently in a supported public beta phase). This allows enterprises running RHEL inside their firewall to run the same operating system on Amazon Web Services (AWS). They might do this as part of migrating to the cloud, or running just new applications there, or for using the cloud to handle temporary workload spikes. Precise support policies can vary by software...

[8] Xen is also the basis for the server virtualization in Sun's OpenSolaris and xVM. See our "Virtualization Strategies: Sun Microsystems."
[9] Red Hat is doing this for both business and technical reasons. See our "Red Hat Makes Buy for KVM - But VDI Too."
[10] Providers of other types of cloud computing, such as...
[11] At the end of 2008, Amazon also added...
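As a concrete companion to the EC2 discussion above, here is a minimal sketch of renting a VM by launching an instance from an AMI. It uses today's boto3 SDK rather than the tooling available when this report was written, and the region, AMI ID, key pair, and instance type are placeholders, not values from the report.

    # A minimal sketch using the boto3 SDK; the region, AMI ID, key pair, and
    # instance type are placeholders, not values from the original report.
    import boto3

    ec2 = boto3.resource("ec2", region_name="us-east-1")

    # Launch one VM from a pre-built AMI (an archived image with OS and software).
    instances = ec2.create_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder AMI ID
        InstanceType="t3.micro",           # one of the available size options
        KeyName="my-keypair",              # placeholder SSH key pair
        MinCount=1,
        MaxCount=1,
    )

    instance = instances[0]
    instance.wait_until_running()          # billing starts once the instance is running
    instance.reload()
    print(f"Launched {instance.id} at {instance.public_dns_name}")

Terminating the instance ends the rental; the AMI itself remains available for launching further copies of the same pre-loaded software stack.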