Open Source in the Next Computing Wave

Copyright 2009 Illuminata, Inc.
Gordon Haff, 9 January 2009

Licensed to Red Hat, Inc. for web posting. Do not reproduce. All opinions and conclusions herein represent the independent perspective of Illuminata and its analysts.

Open source software has been both benefactor and beneficiary of the Internet wave of computing, during which large-scale, network-connected computing architectures built from relatively standardized hardware and software components have come into their own. The Nineties may have been the years when small workgroup systems running operating systems such as NetWare and Windows NT arrived throughout the business world in a big way. But the Noughts have made distributed systems a core part of almost every datacenter.

Open source has fitted this evolution well. Linux itself came to be, in a sense, the unified Unix that had failed to be birthed through more conventional commercial partnerships.1 And Unix-style operating systems, historically closely tied to the rise of computer networks (including standards like TCP/IP that underpin the Internet), were a great technical match for an increasingly network-centric style of computing. At the same time, those computer networks provided the widespread connectivity that collaborative open source development needed to flourish. For users, these networks provided easy access to the open source software and a means to connect to, and engage with, a community of other users. Nor did it hurt that the large scale of many of these new computing infrastructures made cost a bigger issue than ever before, which helped to drive the proliferation of x86-based servers and open source software.

Today, a fresh set of trends and technologies is changing the way that we build and operate computing systems. Two of the biggest are virtualization and cloud computing. Virtualization effectively decouples operating systems and their applications from server hardware, and thereby makes it easier to physically move them from one machine to another. Cloud computing is changing where applications run: from on-premise to out-in-the-network.

Business dynamics are also changing. Even if it's often just a self-interested concern about their power bill, we are starting to see a greater awareness of environmental issues among those responsible for operating datacenters. The current economic climate is also forcing more systematic thinking about costs in general, including those associated with overall complexity and with the security and resiliency of large distributed infrastructures. These trends intersect in powerful ways; a new wave of computing is gathering momentum as a result. And open source is once again playing a major role.

1. We use "Unix" here in the sense of a family of modular operating systems that generally share programmatic interfaces and other conventions and approaches.

Illuminata, Inc. • 4 Water Street • Nashua, NH 03060 • 603.598.0099 • 603.598.0198

Cloud Computing's Coming of Age

We consider cloud computing first. There's certainly plenty of buzz about it. For our purposes here, we define cloud computing as accessing computing resources over a network, whether those resources take the form of a complete application (Software as a Service, or SaaS); a developer platform such as Google Apps or Microsoft Azure; or something that's more akin to a barebones operating system, storage, or a database (Amazon Web Services).2

As recounted by, among others, Nick Carr in his The Big Switch, cloud computing metaphorically mirrors the evolution of power generation and distribution. Industrial Revolution factories, such as those that once occupied many of the riverside brick buildings I overlook from my Nashua, New Hampshire office, built largely customized systems to run looms and other automated tools, powered by water and other sources. These power generation and distribution systems were a competitive differentiator; the more power you had, the more machines you could run, and the more you could produce for sale. Today, by contrast, power (in the form of electricity) is just a commodity for most companies: something that they pull off the grid and pay for based on how much they use.

The economic argument underpinning cloud computing has two basic parts. The first relates to how firms should generally focus their resources on those things that differentiate them and give them advantage over competitors. Computer systems, especially those devoted to mundane tasks such as email, aren't one of those differentiators for many companies.3 The second part relates to the size and scope of computing. Efficient IT operations involve a high degree of standardization, up-front design, and automated operation; applying this degree of industrialization to datacenters and their operation isn't really viable at small scale.4

Open Source in the Cloud

Open source is very much part of cloud computing. The benefit of open source to the cloud providers is clear at several levels.

First, there's the matter of cost. Open source isn't necessarily "free as in beer" (to use the popular expression), that is, zero cost; companies often want subscription offerings and support contracts even if the bits are nominally available for free. But it does tend to be less expensive than proprietary alternatives, even when some production-scale features are extra-cost options (as in the case of the monitoring tools in MySQL Enterprise). And this is no small consideration when you look at the size of providers like Amazon and Google, which often seem to add datacenters at a rate that many companies once added computers.

Open source software is also just a good match for this style of computing. For one thing, cloud providers, almost by definition, are technically savvy and sophisticated. Although they don't want to reinvent every wheel, they're generally ready, able, and willing to tweak software and even hardware in the interests of optimization. Open source software and, more broadly, the open source communities with which they can engage are therefore a good fit, given that providers can modify source code and otherwise participate in evolving software in a way that meets their requirements.

There are some areas of friction between open source and cloud computing. We see this in the ongoing social and community pressure on large vendors such as Google to make their "fair share" of contributions to open source projects.5 Proprietary Web-based applications and services, such as those from Google, 37signals, and even some traditional software vendors, also tend to mirror certain open source strengths such as easy acquisition.

However, in the main, it's a largely healthy and mutually beneficial relationship. Open source is widely embraced by all manner of technology companies because they've found that, for many purposes, open source is a great way to engage with developer and user communities, and even with competitors. In other words, they've found that it's in their own interests to participate in the ongoing evolution of relevant projects rather than simply taking a version of a project private and then working on it in isolation.

2. See our To Cloud or Not to Cloud for more discussion of the different forms that cloud computing takes.
3. One of the earliest examples of widespread outsourcing of a computing task was payroll. This function is certainly important, but having better payroll (whatever that would mean) isn't something that advantages a company.
4. There's an ongoing debate over how big "big" needs to be. See our Bigness in the Cloud. But there's general agreement that the entry point is somewhere around large datacenter scale.
5. Most copyleft open source licenses, such as the GPL, don't require that code enhancements be contributed back to the community when the software is run as a service rather than distributed.

Virtualization

Virtualization is the other buzziest IT topic today. Truth be told, when it comes to enterprise computing, it's actually of more immediate interest than cloud computing, given that it's a more developed set of technologies and its use cases are better understood.6

To better understand how server virtualization7 plays with both cloud computing and open source, it helps to think about what virtualization really is and how it is evolving. The core component of server virtualization is a hypervisor, a layer of software that sits between a server's hardware and the operating system or systems that run on top in their isolated virtual machines (VMs). Essentially, the hypervisor presents each virtual machine with an idealized abstraction of the physical server's hardware (the actual hardware remaining under the control of the hypervisor), allowing multiple operating systems to share a single server.

This ability to share a single (often underutilized) physical server is certainly a salient trait of virtualization. In fact, it's the main reason that most companies first adopt virtualization: to reduce the number of physical servers they have to purchase to run a given number of workloads. However, looking forward, the abstraction layer that virtualization inserts between hardware and application software is at least as important, whether it's used to run multiple operating system images on a single server or not.

Historically, once an application was installed on a system, it was pretty much stuck there for life. That's because the act of installing the application, together with its associated operating system and other components, effectively bound it to the specifics of the physical hardware. Moving the application meant dealing with all manner of dependencies and, in short, breaking stuff. Placing a hypervisor in the middle means that the software is now dealing with a relatively standardized abstraction of the hardware rather than the actual hardware. The result is greatly increased portability.

Portability, in turn, enables lots of interesting practical uses. For example, administrators can take a snapshot of an entire running system for archive purposes, or to roll back to if there's a problem with a system upgrade. VMs can be transferred from one system to another, without interrupting users, to balance load.
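
The snapshot and live-migration uses described above can be sketched with the open source libvirt toolchain that manages hypervisors such as KVM and Xen. This is an illustrative sketch only, assuming a host running libvirtd; the guest name "webapp" and the target host "host2.example.com" are hypothetical placeholders, not values from this report.

```shell
# Take a snapshot of the running guest before a risky system upgrade...
virsh snapshot-create-as webapp pre-upgrade --description "before OS patch"

# ...and roll the guest back to that snapshot if the upgrade goes wrong.
virsh snapshot-revert webapp pre-upgrade

# Transfer the running guest to another physical server without
# interrupting users (e.g., to balance load or drain a host).
virsh migrate --live webapp qemu+ssh://host2.example.com/system
```

Because the hypervisor presents the same standardized hardware abstraction on both machines, the migrated guest need not care that the underlying physical servers differ.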