Page 1: Netherlands eON Winter

To protect and serve

How the cloud is redefining the approach to data protection

Safe and sound
Securing data in the cloud

Taking control
The importance and role of good governance

Life in Information
Winter 2010


Contents

It’s the CIO’s job to watch over the services that his department provides to the rest of the business: safeguarding critical (and the not so critical) information from harm. But as technology, and subsequent infrastructures, continue to evolve, how do you ensure that you protect enough without being overprotective?

In this issue of ON Magazine, we explore the challenges faced by CIOs responsible for information security and data protection within the cloud. On page 4, author David G Hill looks at data protection strategies and how to avoid making the same mistakes twice. We also interview Tom Heiser, chief operating officer of RSA, EMC’s security division, about the evolution of security needs within different types of cloud architectures, and The Leadership Council of Information Advantage lends us their formidable experience to advise on the potential pitfalls when developing good governance policies.

We also feature our second article from one of Europe’s leading thinkers on the impact of technology on our society, Peter Hinssen. In this piece, Hinssen argues why he believes that the role of the architect is the most important job in IT today. Finally, we also highlight two local companies who are using EMC’s solutions to ensure good governance and data security.

If you’re looking for an opportunity to meet and network with your peers while learning from industry leaders, don’t forget to join us at EMC Forum on 17 February 2011, at Van Nelle Ontwerpfabriek, Rotterdam. Choose to join one of the informative sessions or visit the accompanying expo to discover innovative solutions that can help your business.

Whatever your job title or position, we think that you’ll find something of interest in this issue of ON Magazine – whether you’re about to start your journey to the cloud, or whether you’ve already established yourself there.

Chris de Jongh
Country Manager, EMC Netherlands

Prevention is better than cure

Welcome

4 Don’t take the cow path – David Hill: data protection strategies must avoid the ‘cow paths’ created by legacy solutions.

6 The private cloud has a silver lining – Sanjay Mirchandani: new roles and opportunities beckon to IT professionals.

8 IT infrastructure – Peter Hinssen explains how architects are evolving from ‘build to last’ to ‘designed to change’.

10 Shelter from the storm – cloud security challenges and solutions from RSA.

12 Well connected – CIOnet on the importance of social networks for CIOs and IT managers.

14 Cloud deployments: up, up and away – we hear from the Leadership Council of Information Advantage on good governance in the cloud.

16 Verkerk Group – how this Netherlands company is using EMC solutions to improve processes.

18 Centrel adapts storage infrastructure to meet business needs.

©2010 EMC Corporation. All rights reserved. The views expressed in this magazine are those of the contributors and EMC takes no responsibility for their validity.

“As technology, and subsequent infrastructures, continue to evolve, how do you ensure that you protect enough without being overprotective?”


Data protection

Don’t take the cow path
By David G. Hill

Legacy data protection solutions need to accommodate ascendant cloud computing and virtualisation technologies - not the other way around

Illustration by Adam McCauley

Although modern historians consider the story a myth, Boston’s dysfunctional roadways were supposedly the result of cows creating paths that later turned into streets. For the sake of argument, assume the story is true. Was this process initially a mistake?

Not really.

Although a particular cow path may not have been the shortest distance between two end points, it was probably the easiest route for the cows to follow. And it saved the owner the effort of creating a path. Unfortunately, these individually optimal cow paths eventually led to a suboptimal solution for Boston’s roadway system as a whole.

A similar situation exists today in data protection, where enterprises - both private and public - have inherited a number of cow paths in the form of legacy data protection solutions. Enterprises need to understand that these existing solutions have to be coordinated effectively in order to build a comprehensive, near-optimal data protection solution for today’s evolving needs.

These new needs often include implementing server virtualisation and a cloud, either private or hybrid. The path of least resistance in these situations is to leave legacy data protection solutions as is and create new, independent ‘cow path’ solutions for the new hardware and software in the architecture. This, however, is the last thing any enterprise should do. Instead, an effective enterprise will define an ideal data protection architecture and fit legacy and new solutions into that architecture.

Create a comprehensive framework
The first step in constructing and implementing a comprehensive data protection scheme is to define the right model, one that integrates data protection’s disparate functions. These days, data protection is an important component of business continuity, disaster recovery, data security, regulatory compliance, eDiscovery for civil litigation, and other functions. What model or rules of thumb do we use to ensure that all these facets of data protection can be integrated effectively into a comprehensive solution?

One key principle is that all the high-level functions of data protection (such as backup, access control, and encryption) can be applied to any and all data types and storage locations. These data protection functions comprise the first tier of our model.

Within this context, the need to balance privacy and security may lead to different access controls and archiving policies being applied to different types of information. For example, protection would be more stringent for employee social security numbers or corporate financial results that have not yet been released than for online repair manuals for manufacturing equipment. Similarly, one set of best practices should be defined for site mirroring, disaster preparedness, and data recovery, although data warehouses may have a greater need for incremental backup than, say, email repositories. In other words, the model should provide a common, information-centric approach to data protection at the top and fit legacy and cloud data protection sub-functions, segmented by the type and sensitivity of data, into that overall approach.

This information-centric approach makes it much easier to ensure that the defined data protection objectives - which may include preservation, availability, responsiveness, confidentiality, and/or auditability - have been met for different categories of information.

Who’s using the data? And why?
This approach does not take care of cases where different constituencies are accessing the same data type. Therefore, the third tier of our model drives decisions about how to reconcile the sometimes conflicting demands of different end users and applications. For example, business continuity requires preservation of data for accuracy and completeness, whereas eDiscovery requires preservation to ensure that the data is tamperproof and so is usable in civil litigation.

Based on end-user and application access needs, data protection schemes must also define the most effective point in time to move different types of information to less costly long-term storage as part of information lifecycle management. Developing third-tier rules to handle these needs and conflicts results in an integrated data protection solution that is more cost-effective, resilient, scalable, and easy to use. In turn, this helps organisations avoid an ad hoc and piecemeal approach to each new data protection challenge.
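The three-tier, information-centric model described above can be sketched in a few lines of code. This is a minimal illustration only: the categories, objectives and age thresholds below are hypothetical, invented for this sketch, not taken from Hill’s article.

```python
from dataclasses import dataclass

# Hypothetical tier-two policy record: protection objectives and lifecycle
# rules attached to a category of information rather than to a system.
@dataclass
class ProtectionPolicy:
    objectives: set           # e.g. {"confidentiality", "auditability"}
    access: str               # access-control level for this category
    archive_after_days: int   # when to move data to cheaper long-term storage

# Tier two: policies segmented by the type and sensitivity of data.
POLICIES = {
    "employee_pii":      ProtectionPolicy({"confidentiality", "auditability"}, "restricted", 365),
    "financial_results": ProtectionPolicy({"confidentiality", "preservation"}, "restricted", 90),
    "repair_manuals":    ProtectionPolicy({"availability"}, "public", 730),
}

def policy_for(category: str) -> ProtectionPolicy:
    """Tier-one functions (backup, access control, encryption) consult one
    information-centric policy table instead of per-silo rules."""
    return POLICIES[category]

def should_archive(category: str, age_days: int) -> bool:
    """Tier-three lifecycle decision: is this data old enough to tier down?"""
    return age_days >= policy_for(category).archive_after_days
```

Under these made-up thresholds, `should_archive("financial_results", 120)` returns True, while a 120-day-old repair manual stays on primary storage - the point being that the decision is driven by the information category, not by whichever legacy system happens to hold the data.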

Note that this model of data protection fits nicely into a governance, risk management, and compliance (GRC) framework. The three pillars of the GRC framework - which represent three of the primary responsibilities of any enterprise - provide a way to analyse and understand all the facets of data protection in a comprehensive manner.

The next step: reality testing
While our ideal model allows us initially to operate at a high level and consider principles and objectives, eventually our cow-path-avoiding data protection planning has to get down to the level of reality-testing individual technologies. Let’s consider the example of adding active archiving to the data protection environment.

Active archiving is more than the automatic tiering of infrequently or never-accessed data to relatively higher capacity/lower performance disk drives. While hierarchical storage management uses disk storage more cost-efficiently, it does not (by itself) give users a full spectrum of data protection capabilities.

In contrast, active archiving provides a controlled environment where all the facets of data protection can be managed in a comprehensive and consistent manner. For example, information that is moved into an archive was previously application controlled: the application managed the creation, reading, updating, and deleting of data. Once data enters the active archive, the original application can only take actions allowed by the active archive management software, based on policy. That is an important and necessary restriction. For instance, the policy manager can automatically enforce retention policies, such as litigation holds, that could be accidentally or intentionally overridden in a production environment, putting the organisation at risk.

As a side benefit of active archiving, the amount of active production data is reduced (because rarely used data is automatically moved to secondary storage), which means faster backups and faster restores.
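The enforcement behaviour described above - the archive’s policy manager, not the original application, deciding what may be deleted - can be sketched as follows. This is a hedged illustration, not EMC’s implementation; the class and method names are invented.

```python
from datetime import date

class ActiveArchive:
    """Minimal sketch: once data enters the archive, the owning application
    can only take actions the archive's policy allows."""

    def __init__(self):
        self._items = {}          # item id -> content
        self._holds = set()       # item ids under litigation hold
        self._retain_until = {}   # item id -> earliest permitted deletion date

    def ingest(self, item_id, content, retain_until):
        self._items[item_id] = content
        self._retain_until[item_id] = retain_until

    def place_hold(self, item_id):
        self._holds.add(item_id)

    def delete(self, item_id, today):
        # Policy checks the production application can no longer bypass:
        # litigation holds and retention periods are enforced centrally.
        if item_id in self._holds:
            raise PermissionError("item is under litigation hold")
        if today < self._retain_until[item_id]:
            raise PermissionError("retention period has not expired")
        del self._items[item_id]
```

A delete request against an item under hold raises an error regardless of who issues it, which is exactly the restriction the article argues a production environment cannot reliably provide on its own.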

Technologies such as active archiving cannot be put in place in isolation. Enterprises must define when to move data to the active archive and how business continuity needs and security restrictions will change as data ages. To make this decision process as easy as possible, enterprises need a formal data governance programme and an overall three-tier, information-centric data protection model in place. This will not only speed the implementation of active archiving, but also minimise implementation costs, reduce the impact on production systems, and help avoid data protection coverage gaps.

The final step: stay true to your model
Once a data protection model has been created and implemented, there remains one final task: maintaining its comprehensive, information-centric approach as part of overall IT governance.

What this really means is that the modern equivalent of cow paths should not be allowed to occur. The pressures of cost constraints and speed-to-market may make it difficult for IT to resist quick implementations that create independent data protection schemes, but, in the long run, proliferation of independent solutions will create many of the same problems that today’s legacy solutions are causing. Boston’s infamous Big Dig was partly a reaction to the traffic-flow problems created by design violations of a previous comprehensive traffic-management scheme. The high cost of the Big Dig suggests that failure to maintain a comprehensive scheme can be almost as expensive as failure to implement one in the first place.

David Hill is principal of Mesabi Group LLC and the author of the recently published book Data Protection: Governance, Risk Management, and Compliance.

“Developing third-tier rules to handle these needs and conflicts results in an integrated data protection solution that is more cost-effective, resilient, scalable, and easy to use.”


Illustration by John S. Dykes

CIO corner

The private cloud has a silver lining
By Sanjay Mirchandani

The rise of private cloud environments is creating new roles and new opportunities for IT professionals

Is anyone wondering if ‘this cloud thing’ is good for the careers of IT professionals? Let’s get any doubts out of the way: it’s good.

Recently, it seems that when I visit CIOs or other IT executives, they mostly want to chat about what a cloud-based organisation looks like. They want to talk about what roles their IT organisations should consider adding during the effort to plan, build, and run a private cloud environment. It’s a smart way to approach it, because everything starts with people - with their skills, their ability to prepare, and their overall understanding of how much good IT can do to advance a business’s competitiveness.

In the years I’ve been working in IT and the technology industry, I have seen waves of technology come and go. But the private cloud isn’t just another wave. It is a game-changing opportunity for IT professionals. It is the answer to that age-old question, “Is IT aligned with the business?” A private cloud lets us deliver a catalogue - a set of offerings - and it removes the need for custom code, custom craft and overly complicated solutions.

When one builds and runs a private cloud, life changes. The conversations change. And the job changes, bringing opportunities to grow professionally.

Leaving your fingerprints on the cloud
We are still in the early days of this journey and we have the opportunity to add our own fingerprints as the private cloud unfolds in our shops. How is it fundamentally going to evolve?

Well, the intent is to offer IT as a service. Our teams must build the environment differently from before and govern it differently. As an infrastructure evolves into a private cloud, our competencies as IT professionals will evolve and grow, too.

For example, the whole concept of ‘geography’ changes. That fact forces us to look anew at our physical boundaries, our federation approach, and our tactics for working with external providers to extend into public clouds.

A private cloud also injects automation into data centre operations. Simply put, our people won’t need to babysit the infrastructure or its users as much. For example, in a private cloud, internal customers can help themselves by provisioning their own storage through a web form. It’s freeing people in our organisations to think about what higher-value IT activities they can pursue; what skills they can develop; what new competencies they can gain.

We’re already seeing cloudy roles start to emerge. They bear titles such as cloud architect, cloud capacity planner, cloud service manager, cloud solution consultant, cloud governance-risk-compliance manager, cloud security architect, and so on.

People in my organisation tell me they are eager to see what the private-cloud build-out will mean for them. It’s obvious that they’ve been putting serious thought into how to take advantage of this opportunity to become more functionally competent and increase their professional worth.

Our VP of infrastructure, Jon Peirce, has run just about every aspect of IT at EMC. Today, his group manages our journey to the private cloud. He and I have been thinking about what this means for our people. Let’s start with how a private cloud is built. The old world of IT was all about unique point solutions. We’d design something for an internal customer’s application and optimise our gear for that application. With subsequent projects, we’d do that same thing, over and over and over again. Our efforts were not marked by reusability.

New paradigm, new roles
In the new world of the cloud, we create an enterprise-hosting platform with standardised components, built to scale. We build it once, and then we provision it again and again. What does this mean to an IT person looking for a career boost? Those of us who have made a profession of IT are accustomed to reinventing ourselves as new technologies arise.

If a guy with storage expertise begins to embrace systems, networks, and IT security as well, then his professional worth grows dramatically. He’s on his way to becoming a cloud architect, able to link into everything from an initial application thought process all the way to the steady-state running of a cloud infrastructure. Some of my people are pursuing just such a professional path now.

Career opportunities also accompany the ‘pay-as-you-go’ aspect of cloud-based consumption. Self-service provisioning and the multi-tenant nature of a private cloud mean we’ll need skilled capacity planners. In fact, the cloud capacity planner rises to an unprecedented level of importance. In the old world, capacity planning might have been someone’s side job. In the new world, capacity planning will be a full-time role that encompasses all the cornerstone technologies of storage, system, and network.

Another cloudy job: the cloud service manager, an in-house consultant who thinks through how a business can capitalise on cloud services and consume them across the stack. And think of the careers arising from the day-to-day running of a private cloud. For instance, to provision cloud services, we use a ‘single pane of glass’. But we cannot have a storage admin, systems admin, and network admin all fighting for control of the mouse on that console, figuratively speaking. I think we’ll see some of these people becoming cloud infrastructure administrators charged with provisioning the entire stack. Our own IT folks dreamed up and built the EMC IT Global Operations Command Centre - our window to our private cloud - about a year ago. Inside its walls, experts in storage, systems, networks, and security are transferring their knowledge and broadening their skills.

The final piece of the career-opportunity picture relates to private cloud governance. If you’re enabling end users to provision storage by themselves, you need someone ensuring users aren’t also doing things they shouldn’t be. A cloud governance manager maps the service catalogue and tiers services to the business’s needs, making sure the right people have access to the right things.

These job titles may not be the final ones, but they describe the competencies we need. It’s an emerging world and a great time to be an IT professional. The organisations we run will evolve. Why shouldn’t our folks situate themselves right in the engine of this train? Getting on board isn’t complicated. Enjoy the journey. It starts now.

Sanjay Mirchandani is senior vice president and chief information officer at EMC

“It’s freeing people in our organisations to think about what higher-value IT activities they can pursue; what skills they can develop; what new competencies they can gain.”

The growing demand for cloud expertise
The EMC Education Services organisation conducts a worldwide annual survey of 1,500 IT and storage managers, titled ‘Managing Information Storage: Trends, Challenges, and Options’.

The survey asks how these managers have been coping with organisational challenges arising from the explosion of data, the increasing importance of digitised information, and the introduction of new technologies.

In the most recent update, EMC Education Services found managers rating only 30 percent of their storage professionals as ‘strong’ or well skilled. Alarmingly, in the past two years, the survey has revealed a decline in the percentage of storage pros rated ‘strong’. When asked about emerging technologies, the respondents showed an increasing interest in storage virtualisation and cloud technologies; 41 percent of the surveyed organisations are in different stages of storage virtualisation implementations, and an impressive 13 percent are planning or deploying private or public cloud technologies.


Thought leadership

The new mechanics of IT architecture

Peter Hinssen is one of Europe’s leading thinkers on the impact of technology on our society. In this article, the second in a series of three, Peter explores the impact of infrastructure in the rapidly transforming IT environment.

Architecture is undoubtedly the most important discipline in IT. Architecture defines the past, present and future of technology. Architecture is the set of guiding principles that guides our development, defines our infrastructure and shapes the future of IT.

But architecture is also perceived as the most boring subject in the universe. ‘Architecture’ is probably the best way to get an executive board meeting to go to sleep in an instant. Better than Prozac, probably. If you really want to turn them into mindless zombies, use the lethal combination of ‘Architecture Governance’. Puts them in an on-the-spot coma.

I believe that the most important job in IT today is the role of the architect. No-one else has the opportunity to change, shape and alter as the architects do. No-one else has the opportunity to transform not only how we work, but to have a direct impact on our role as IT.

Architecture is due for a major transformation, certainly as we’re going through the evolution in IT from ‘Build to Buy to Compose’. In the old days of IT we would focus primarily on BUILDing applications. We still have those legacy systems in our datacentres that have been running for 20, 25, 30 years or more. We used to call them silos, but now ‘silo’ has become a dirty word in IT. Legacy systems refuse to die, and there’s a wonderful IT joke about a stretch limousine pulling up to a retirement home: they’re picking up a Cobol programmer.

But we’ve evolved from building our own systems more than 20 years ago to primarily BUYing applications, from vendors such as SAP or Oracle, and now we’re moving into the COMPOSE phase. Instead of building systems or buying systems, we’re ‘assembling’ systems and applications based on services. These services can be ones that we have internally or, increasingly, services that reside outside our company’s perimeter. Build, to Buy, to Compose in about 30 years’ time. And that’s just the beginning.

Fundamentally, what it means is that we have to make a mental shift, from the old thinking in IT that we had to build systems that were ‘built to last’ towards designing solutions that are ‘designed to change’.

As we move further into the New Normal, the need to think in terms of architecture as ‘designed to change’ will only increase. And of course, the cloud will be a fabulous instrument to help us build the next generation of IT that will be designed to change.

The New Territory
I believe the cloud is the most exciting ‘new’ territory in our digital exploration. It’s new territory because it will force a lot of IT professionals outside the walls and perimeter of their organisation, boundaries that have served them well for a very long time.


But it’s time to break those boundaries. The cloud offers opportunities to use new functionalities, compose new applications, and develop new opportunities faster and more flexibly than ever before. But we still feel a bit awkward and naked outside the walls of our companies. We feel a bit strange outside the comfort zone of our firewalls, and we honestly look a little silly in the vast new territory called the cloud.

If you look at the two aspects that have been shaping the ‘cloud’ phenomenon, it’s the power of virtualisation, and the prospect of outsourcing.

The ‘old IT’ was all about the ‘one application, one server’ paradigm, where IT silos typically occupied our technology landscape. We firmly rooted our own servers on our own premises, and had a tendency to buy new infrastructure as we developed more applications.

The big trend that has been gaining steam is to ‘virtualise’ applications and infrastructure, and therefore move UP the vertical axis by cleverly sharing resources. At the same time, we’ve been ‘outsourcing’ more and more functionality outside our own company perimeter, and using off-premise resources. If you combine those two, you get the combination of sharing and outsourcing that we’ve come to see as ‘on demand computing’, where you use shared applications in the network, and arrive at the concept of cloud as we know it: Cloud 1.0.

The power of virtualisation and the prospect of outsourcing are shaping the cloud: a simple way to describe this is to put the ‘sharing of resources’ on the vertical axis, and the ‘location of resources’ on the horizontal axis.

But that is just a transition phase. So what if we take it up a notch?

Cloud 2.0
If we keep moving the ‘old’ applications into the cloud zone, we’re just not pushing hard enough. But if we ‘break the barriers’, then we can probably transform the way we think about technology.

If we look at the first barrier: if we only virtualise the ‘old’ applications, we’re never going to be able to take advantage of new functionalities. We’re basically only OPTIMISING existing applications, running them more efficiently. But if we ‘re-architect’ the applications, and really think about services instead of applications, we can build a whole new level of flexibility into cloud-based applications and fully deliver on the Build > Buy > Compose promise. But it means we have to re-think, re-build, re-architect to take advantage of the cloud.

At the same time, the second barrier is the ‘network thinking’ barrier. If we only push applications out of our company comfort zone, and run them on a server in the network, we’re not really adding value for the end-users. But if we can re-architect the applications to take advantage of the fact that they are network-centric, we can transform applications to act as ‘network applications’. It’s like the transition from your company’s ‘who’s who’ on your own intranet to using a LinkedIn platform. That’s network thinking. But it means we have to re-think, re-build, re-architect to take advantage of the cloud.

So, if we really want to leverage the power of the cloud, we have to break the barriers of the existing applications, and think about services and network centricity. THEN we’ll unleash the real power of the cloud.

Are we there yet?
Some may be moving from Cloud 1.0 to Cloud 2.0, but there are still a lot of CIOs out there who have not even made it to Cloud 1.0. As Marc Benioff, founder and CEO of Salesforce.com, put it: “We all need to go faster. Unfortunately, some CIOs would rather retire than go faster.”

If we move from Cloud 1.0 to Cloud 2.0, we will see the drivers for using the cloud begin to shift.

Cloud 1.0 was all about cost efficiency, scalability and flexibility. All are excellent drivers to start implementing this exciting new technology, and to get to know the ‘new territory’ as IT professionals.

But Cloud 2.0 has different drivers. Here we will see the true potential of the cloud, with convenience, agility and network centricity as the real drivers. We will be able not just to MOVE applications into the cloud, but to start using the cloud as a dynamic resource in the CREATION of new functionalities and applications.

Peter Hinssen is an advisor, lecturer and author, and one of Europe’s leading entrepreneurs

“Instead of building systems or buying systems, we’re ‘assembling’ systems and applications based on services.”


Shelter from the storm

With security the number one concern amongst CIOs, we talk to Tom Heiser, chief operating officer of RSA, EMC’s security division, about challenges, evolution and how to keep the private cloud private.

What are the barriers to introducing private clouds?
The main barriers preventing companies from using private clouds more pervasively include security and compliance challenges of virtualisation – essentially the foundation of cloud computing. A recent Forbes Insights report, which surveyed 235 CIOs and IT executives, found that 43% of the survey respondents identified security as their top concern. Adoption of virtualisation for IT test and development environments is growing rapidly, but as customers look to virtualise mission-critical applications, new security and compliance concerns emerge.

How do you keep the private cloud private and secure?
The process of managing security and proving compliance is relatively similar for both physical and virtualised IT, but virtualisation does present some unique challenges. Among them is the rapid rate of change in the virtual infrastructure, with virtual machines brought up and down or moved from one server to another on a frequent basis. It is important that security and compliance teams are included in the planning stages of virtualisation projects, or they will find themselves lacking the same visibility and control in the virtualised IT environment that they have in the physical infrastructure.

How do you see the governance, risk and compliance issues?
For the most part, regulations do not differentiate between physical and virtual IT infrastructure, although some, such as the Payment Card Industry (PCI) Data Security Standard, are being revised to include guidelines for virtualised systems. However, whether the infrastructure is physical, virtual or hybrid, organisations and cloud service providers must toughen their environment; evaluate the performance of their control framework; resolve deficiencies; and report compliance both internally and externally.

RSA became the first vendor to address governance, risk and compliance in a virtual server environment with the launch of its RSA Solution for Cloud Security and Compliance. This solution, built around RSA’s Archer Platform, comprises policy management and implementation, security and compliance measurement, issue remediation, and reporting - all integrated within a single management system for both physical and virtual infrastructures. Archer integrates with RSA enVision® log management to collect and correlate security and compliance events from a variety of sources, including the RSA Data Loss Prevention suite, VMware vShield and VMware Cloud Director. The solution enables organisations to meet their security and compliance requirements.

How do you see security and data protection evolving within the public cloud?
Cloud and virtualisation are opportunities to implement better-than-physical security controls. The need for information security doesn’t change with the introduction of the cloud - you still need your existing controls, but in a private or public cloud environment you have to be able to mitigate new risks in the virtual environment. Security vendors have to offer solutions that run at the hypervisor layer, and/or adapt products that were historically appliance-based to run on a virtual platform.

How do RSA and VMware work together?
RSA has been collaborating closely with VMware to offer security, manageability and compliance solutions that are even better than those we have in physical environments.

RSA’s collaboration with VMware is designed to help customers deploy cloud environments that provide comprehensive security, up and down the virtual stack. RSA’s solutions tie security controls to higher order compliance objectives, including collecting and correlating security and compliance events across the cloud infrastructure and key security services delivered through VMware’s vShield virtual firewall.

RSA, The Security Division of EMC, is the premier provider of security solutions for business acceleration.


Interview

Well connected

CIOnet has been networking CIOs across the world since 2005. As a business partner and key sponsor of the network throughout Europe, we invited Hendrik Deckers, managing director of CIOnet, to tell us more about the value of social networks and how CIOs can get better connected in an increasingly linked world.

What is CIOnet?
In its simplest form, CIOnet is an independent, private social network for CIOs. But it’s more than that. CIOnet is the first international online and offline network that empowers CIOs and IT managers to network more efficiently and effectively for business. We aim to create as much value as possible by providing an online and offline community for CIOs to share knowledge, resolve industry challenges, exchange ideas and learn, as well as build relationships and make friends.

How does it work?
When a CIO becomes a member, they build up their own profile page. This can include as much or as little information as they want. Once established, the member can link to other CIOs of interest to them, or can recommend other members to link to and share information with. They are also able to upload articles to the site or join in discussion forums.

The online side of the community is supported by offline activity, which typically includes six or seven local and international events and conferences a year, regular surveys, a twice-yearly magazine, specific interest groups, and so on. Since we’re the only network to offer this 360-degree connection of both online and offline activity, we’re able to better connect members. For example, members can look at an event, see the agenda, see who has signed up to attend and then contact those individuals. This provides an opportunity to meet up with other CIOs face-to-face.

Who sets the agenda?
CIOs do. CIOnet is merely the platform to facilitate knowledge and value sharing. It’s the members, and specifically the advisory board in each country, that set the agenda and programme for the forthcoming year. The members are best placed to identify what topics are important to them, what technologies they want to hear about and what solutions will work best for them. We do provide additional topics and suggestions for special interest groups, but it’s the members that drive the path of discussions.

Why is being connected so important for CIOs?
CIOs are in the enviable, and unenviable, position of understanding both the industry/business that they work within and the technology that runs that business. No one else in the company has insight into both areas.

Typically a CIO will report to someone who knows the business well but doesn’t understand what the IT department does or the issues it faces in delivering services to the business. This means that CIOs tend to learn most from their peers. By being connected to their peers they can gain knowledge and further their own understanding of related solutions. Hearing from peers who also operate in the finance industry, for example, can be very beneficial in solving their own problems.

How are you different from all the other CIO networks?
For a start, we’re the largest international community of CIOs. We have over 1,750 members worldwide and expect that number to increase to 2,000 by the end of the year. The more high-level CIOs we have as members, the more value each individual derives from the network. We’re also the only community that provides both online and offline activity to actively encourage members to share and add value to others. We also proactively link and build strong relationships with other communities, such as business schools, industry institutes and media partners. This enables members to increase the value of their own connections.

So how do I join?
Membership is free but by invitation only. The CIOnet team works hard to identify the right CIOs worldwide who would benefit from and bring value to the network. Of course, other members can make recommendations and CIOs can simply apply for membership, but we have strict criteria that have to be fulfilled before a member is able to join. We also allow academics to join, since we find they are able to add value to the content that is on offer to members.

Alternatively, we have business partners, such as EMC, who represent their organisation to the members. CIOnet provides partners with access to content that helps to increase their understanding of challenges faced by their customers.

What levels of membership are available to me?
Most CIOs join as normal members, but there are lots of opportunities to maximise the value they get from CIOnet. Each country has its own advisory board made up of normal members. The site also provides opportunities to be part of special interest groups (SIGs) where members can discuss the topics most relevant to them. SIGs are actively encouraged to host an event on their topic, draft a report and share the experience with the membership as a whole.

This provides a great opportunity to get in contact with like-minded peers. Currently we have SIGs covering security, cloud computing, governance, IT infrastructure, and many more. o

CIOnet has offices in the UK, Belgium, Italy, Spain and France. Visit www.cionet.com for more information.

"CIOnet is the first international online and offline network that empowers CIOs and IT managers to network more efficiently and effectively for business."


Thought leadership

Cloud deployments: up, up and away?

A recent CIO Market Pulse survey, sponsored by EMC’s Information Intelligence Group, has found that more than three-quarters of the IT organisations interviewed are currently running, actively researching or planning to deploy applications in private, hybrid or public cloud environments over the next 12 months.

As CIOs overcome initial concerns about the viability of cloud-based solutions, especially those reliant on external service providers, another set of challenges looms. Although adoption of cloud services is increasing, information governance models are lagging. Only 34% of the more than 200 respondents to the CIO Market Pulse survey said that they have a governance policy in place, despite 86% citing information governance as a top concern.

In the first of a series of two articles looking at cloud computing and governance, The Leadership Council for Information Advantage takes a look at the key challenges faced by CIOs in the deployment of cloud computing and explains why it is so important to get governance right from the beginning.

Without clearly defined governance, deployment of cloud computing services may actually exacerbate longstanding IT challenges.

As CIOs evaluate their companies’ cloud strategies, they will need to come to grips with how data governance policies – and indeed, the very mindset of IT itself – need to evolve to support diverse models for cloud computing. Inadequate or lagging policies to manage the data stored in these different environments will dramatically increase the risks that IT organisations will lose control of the information they are expected to protect.

Rising clouds: a perfect storm for information?
The cloud promises to improve and speed our ability to access, process, and share information. Yet a ‘perfect storm’ of conditions is emerging that threatens to impede the flow and value of corporate information:

1. Growth and proliferation of incompatible cloud platforms and services - The ease and convenience of adopting public cloud infrastructure and software services also makes it increasingly likely that smaller groups within organisations will independently sign up for such services without considering how they may fit into the enterprise’s IT infrastructure as a whole. Sometimes this results in different parts of the same organisation using incompatible business applications for similar functions, which not only detracts from potential economies of scale, but also increases the number of systems and services that internal IT needs to integrate and support.

2. Data Silos 2.0 - Without planning and integration, a cloud’s information repositories aren’t readily accessible to the organisation’s other IT systems, business processes or groups. This can lead to cloud-specific silos of information that aren’t backed up, particularly in public clouds.

The problem is exacerbated by the increasingly common practice of business users provisioning new cloud services with only their ad hoc requirements and a corporate credit card. Information access and portability will be major concerns as organisations move their IT services to the cloud. If cloud-based information silos are allowed to take root, integrating them later may require efforts of an epic scale.

“People choose a cloud service because it’s faster and more pragmatic and easily available and cheap. And I think all the structures and planning have the potential to go out of the window because of that.”Deirdre Woods, Associate Dean and CIO of The Wharton School


3. Escalating potential for vendor lock-in - Cloud providers offer custom APIs for porting services and data to their platforms. While a few, such as Salesforce.com, have achieved sufficient market success to become de facto standards within their segments, the overall market for cloud services is still highly fragmented. Selecting a cloud platform can mean, for all practical purposes, a long-term vendor commitment. Although switching to another cloud platform may not be as costly and time-consuming as shifting from IBM UNIX to Wintel servers, the potential for lock-in is remarkably similar to what we saw with IT hardware 15 years ago.

4. Complex chains of custody for information management and security - As organisations shift different IT services to various clouds, they complicate the chain of custody for information, which may now reside both in-house and in external private or public clouds. Cloud vendors throughout the service delivery stack must be monitored to ensure they’re managing enterprise information appropriately and enforcing policies regarding information security, privacy, e-discovery, archiving, and backup.

Governance models lag
Even though CIOs are increasingly aware of the information management and compliance challenges presented by different cloud models, they have been slow to create governance policies for cloud services.

Nearly 4 in 10 respondents (38 percent) said they are planning to develop a policy to cover cloud-based information. But more than one-fifth of the CIOs (22 percent) said they had no plans to develop such a policy – even though 86 percent expressed concerns about information management and governance risks posed by public clouds.

The information management infrastructure of the very near future will extend from the enterprise and private cloud to a mix of community, hybrid, and public cloud deployments.

“With the cloud’s scale, mobility, and agility comes complexity that needs to be managed,” EMC CIO Sanjay Mirchandani commented in the LCIA report. “The way to manage that complexity is through policy and governance. Across your IT environment, information policy governance and compliance becomes even more important than it was in the past.” o

Look out for the second article in this series, where the Council sets out the road map for good governance in a cloudy world.

EMC invites you to join this important conversation. Please visit http://www.councilforinformationadvantage.com/ to download Council reports and to contribute ideas for future reports.

About The Council for Information Advantage
EMC has convened the Leadership Council for Information Advantage, an advisory group made up of global information leaders from ‘information-advantaged’ enterprises - organisations from a variety of industries that are successfully using information to revolutionise how they compete and do business.


Case study

Verkerk Groep takes care of retention policies with SourceOne
Verkerk Groep is an electronic installation company based in Zwijndrecht. In addition to its main site, the company also has a second office in Hoevelaken. The company, which employs 400 staff, operates in the utility construction, control engineering, panel construction and health care industries.

Via its Verkerk Service Systemen (VSS) production unit, the company also provides customers in the hospital and health centre sector with IP-based solutions for the management of company buildings. Under the VSS label, Verkerk offers care group systems with various electronic aids such as temperature control, air conditioning, sun blinds and lighting, as well as camera surveillance, access control and fire and burglar alarms. The VSS services are available for sale or for rent in the form of a subscription.

Current laws and regulations require companies to archive their documents, often for many years. Until relatively recently, these retention policies applied only to documents that were directly related to the organisation’s annual accounts. However, shareholders are no longer the only people interested in the ins and outs of a company. Stakeholders such as tax authorities, consumer organisations and bodies in charge of safety, environmental, and health and safety legislation impose strict documentation requirements on companies.

This has coincided with explosive growth in data, which is becoming more and more digitised. This places high demands on a company’s infrastructure, which must be able to store, structure and retrieve data at a moment’s notice. Unstructured data such as Word documents, images, emails and especially email attachments are the most prominent cause of this explosive increase. Verkerk Groep has been keeping a watchful eye on the growth of its digital data, according to Erwin Hollaar, the company’s IT manager. This is why, last year, Verkerk Groep chose EMC SourceOne for its information security, accessibility, management, backup, archiving and reporting.

Archiving emails
“Archiving emails had been an item on the list at Verkerk for a while,” says Hollaar. “Our accountants were urging us to improve the way we saved our emails and attachments, because of the retention policies. Additionally, we as a company are required to provide information to the Tax Authority and other government organisations. We are required to hold onto our documents for at least seven years. We have now started the automation of this process. Before we introduced EMC SourceOne we were linking relevant emails about offers and contracts to our ERP system, Microsoft Navision, but we had to do it by hand.”

The retention and disposition policies are not the only reason why Verkerk needed a modern solution for mail archiving. “The email traffic between our installation company and our clients has increased enormously. These emails often include attached drawings, schematics, PDFs and bitmaps. This can quickly lead to full mailboxes in our Exchange 2007 mail system. We manage 250 mailboxes, which together take up 45 GB of storage space, including attachments. Project-based emails are also kept separately by our project managers, which led to additional pressure on the storage system. Until recently, we didn’t de-duplicate the attachments, you see.”

Partner 2e2
“In mid-2009 we realised that we needed to take action,” says Hollaar. “The initiative came partly from management and partly from the IT department. The problem was that in 2009, because of the recession, investment in this sort of project had been halted.” EMC partner 2e2 Data Management from Rosmalen contacted Verkerk around this time and gave a demonstration of SourceOne. “It looked good, and quite soon the ball began rolling. We were given permission by management and we implemented the solution before the holiday period in 2010. We are currently working on the rollout.”

SourceOne Email Archiving runs on a separate server with an SQL database. The archive server and the retrieval server run on a single system that ‘looks’ at the Exchange server. On the Exchange server, policies determine which emails are archived and which are not. “We currently have a schedule that archives mail that is more than 90 days old, but we will eventually match that period to the retention period in Exchange, which is 30 days. Archived mail is fully text-indexed - header, body text and attachments. The user can search the archive in a web browser, using both Boolean queries and search fields.”
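The age-based rule Hollaar describes can be sketched in a few lines of Python. This is only an illustration of the policy logic, not SourceOne’s actual engine; the message structure and function name are hypothetical:

```python
from datetime import datetime, timedelta

ARCHIVE_AGE_DAYS = 90  # Verkerk's current threshold, to be aligned with 30 later


def select_for_archive(messages, now, age_days=ARCHIVE_AGE_DAYS):
    """Return messages older than the cutoff; the archiver would then
    move each one to the archive and leave a stub link in the mailbox."""
    cutoff = now - timedelta(days=age_days)
    return [m for m in messages if m["received"] < cutoff]


now = datetime(2010, 12, 1)
mailbox = [
    {"id": 1, "received": datetime(2010, 7, 1)},    # older than 90 days
    {"id": 2, "received": datetime(2010, 11, 20)},  # recent, stays in Exchange
]
to_archive = select_for_archive(mailbox, now)  # only message 1 qualifies
```

Tightening the threshold from 90 to 30 days, as Verkerk plans, is then a one-parameter change rather than a new process.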

The user doesn’t notice much of the archiving, explains Hollaar. “Archived mail does disappear from the Exchange server (including attachments), but a link remains in the user’s Exchange mailbox. The user accessing email from a desktop or laptop won’t notice the difference when opening an email on SourceOne via the link. For offline use of the SourceOne archive, the mail is cached to the client. An attachment is then opened from the user’s own PC.”

De-duplication
Users can retrieve accidentally deleted email from SourceOne. Only administrators can delete from the archive. Within the archive, emails and attachments are de-duplicated. “We have always had quite a strict policy on storing emails. Mailboxes in Exchange are limited to 300 MB each. At many companies they have more. We have now found a good solution for this. Users who receive a lot of emails can move them, including attachments, to their personal archive in Outlook and Exchange. That folder is archived with a shortcut every night. This means that a user with a lot of email traffic doesn’t have to throw any emails away and still doesn’t have an overflowing inbox.”
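The saving from de-duplication comes from storing identical content once, however many mailboxes it appears in. A minimal content-addressed sketch in Python illustrates the idea (the class and names are hypothetical; SourceOne’s actual mechanism is not described here):

```python
import hashlib


class AttachmentStore:
    """Toy content-addressed store: identical attachments are kept once."""

    def __init__(self):
        self._blobs = {}  # content digest -> stored bytes
        self.refs = 0     # number of attachment references archived

    def add(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        # Store the bytes only if this content has not been seen before;
        # later references just point at the existing blob.
        self._blobs.setdefault(digest, data)
        self.refs += 1
        return digest

    def unique_bytes(self) -> int:
        return sum(len(b) for b in self._blobs.values())


store = AttachmentStore()
drawing = b"<stand-in for a CAD drawing of a few MB>"
# The same drawing mailed to five project members is stored only once.
for _ in range(5):
    store.add(drawing)
```

With a scheme like this, the five copies of a CAD drawing circulating among project managers consume the space of one, which is why the 45 GB of mailbox data shrinks noticeably once attachments are de-duplicated.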

Hollaar is very enthusiastic about SourceOne. “EMC SourceOne is extremely easy to set up. We had it up and running within two days. It is good to have an expert there when setting up the policies though. The product’s strength is its simplicity. We don’t need more than five policies for now.” o

“Another important challenge was the pressure that the large number of emails and attached CAD drawings put on our Exchange system.”


Business overview
Cetrel SA, established in 1985, operates an efficient e-money hub that processes payments quickly and securely, covering all points of the transaction for financial institutions across Europe. Cetrel technology manages card payment processes across 430 ATMs and over 12,000 point-of-sale devices, making electronic payments safe and reliable for banks, merchants and, ultimately, consumers. On peak days, the company authorises more than 360,000 card transactions.

In January 2008, Cetrel changed from a co-operative to a commercial company with a turnover of over 35 million euro in 2007 and 180 employees.

In 2009, the Swiss service provider SIX Group took a 50% stake in Cetrel. This agreement gave Cetrel access to the 23 markets in which SIX Group is represented.

Challenges
A reliable, always-available infrastructure is essential for Cetrel’s activities, and it wants to preserve its image as a reliable partner for the financial industry. Cetrel is a long-standing EMC customer. Its core business processes were running on EMC Symmetrix DMX, but the infrastructure was reaching the end of its life cycle. An analysis run by EMC showed that undertaking a migration now to a more appropriate solution wouldn’t prove any more expensive than keeping the old infrastructure in place for a few more years. As part of the SIX Group, Cetrel is planning for further international growth, and in line with its policy of investing in technology to keep the business ahead of the market, Cetrel decided that there was a strong argument for a new, more flexible and powerful infrastructure.

EMC solution
Cetrel has been an EMC Symmetrix customer since 2000. In 2010 Cetrel’s employees made the transition from EMC Symmetrix DMX to EMC Symmetrix VMAX. Cetrel didn’t consider another technology, since the former Symmetrix system had always met the company’s high expectations. The migration went smoothly and was carried out by a mixed project team of Cetrel’s IT people alongside IT specialists from EMC.

Results
While the Symmetrix DMX system was still performing satisfactorily, the new VMAX system offers the same high response times and performance for a higher number of applications than before. “We will be able to guarantee high storage performance for our growing number of hosts, clients and applications,” declares Manuel Fischer, CIO at Cetrel. “This means that our new storage infrastructure will host more applications than before without slowing down the performance. It also makes a positive contribution to Green IT. By using a more powerful storage system, we need less data centre space, and we cut down our energy consumption for cooling and electricity.”

The new EMC Symmetrix VMAX system is not only high performing but also more flexible and easier to manage than the former system at Cetrel. “The procedures for storage provisioning are much more straightforward with VMAX,” explains Manuel Fischer. “Also, with new concepts for storage virtualisation such as Virtual Provisioning and Virtual LUN Migration, Cetrel’s storage infrastructure is more flexible and the provisioning of disk space is much quicker than before, with a reduced risk of human errors. This results in a storage infrastructure that can respond more quickly to the business needs of our organisation.”

“Our new storage infrastructure is able to host more applications than before without slowing down the performance.”Manuel Fischer, CIO at Cetrel

Cetrel adapts storage infrastructure to meet business needs
