Sujith Kumar, EMC Corporation
Krishna Prasad, EMC Corporation
KEY PRACTICES FOR A HEALTHY NAS ON CLOUD
2015 EMC Proven Professional Knowledge Sharing
Table of Contents
Why is it important for IT shops to manage File Storage effectively?
How do Companies look at their NAS Storage today?
Administrators' views of managing today's NAS infrastructure
Practical approaches to efficiently manage your NAS
Step 1: Plan ahead; know your current and future needs
Step 2: Know your Data
Step 3: Know your Applications
Step 4: Keep the needed, move the Rest
Step 5: Secure your Data wherever Possible
Step 6: Know who Owns the Data
Step 7: Performance Matters
Step 8: Network, the Main Backbone of NAS
Step 9: Protecting your NAS
Step 10: Regular DR Drills
Step 11: Deal with your Data
Step 12: Automation Makes Life Easier
How do you see NAS on Cloud?
Conclusion
Disclaimer: The views, processes or methodologies published in this article are those of the
authors. They do not necessarily reflect EMC Corporation’s views, processes or methodologies.
Why is it important for IT shops to manage File Storage effectively?
In a data-packed era where information is the key that drives enterprise business, managing that information has gained utmost importance. Almost everything enterprises produce these days adds data at an exponential rate. A recent survey from IDC states, "The volume of digital data will grow 40% to 50% per year. By 2020, the number will have reached 40,000 EB, or 40 Zettabytes (ZB). The world's information is doubling every two years. By 2020 the world will generate 50 times the amount of information and 75 times the number of information containers."4
As unstructured data grows daily, companies are moving from legacy storage systems to the latest file-based storage solutions to improve efficiency and reduce costs. Organizations that deploy new NAS solutions receive greater value from their storage investments, which translates into business success.
While the shift to enhanced NAS technologies can provide substantial benefits, moving file
storage data from legacy systems to these systems can be an expensive, time-consuming,
labor-intensive, and risky process. There are a multitude of manual steps, and migration teams
must continually make trade-offs in terms of migration timelines, risks, and business access to
data. Cutting access to data on sources while the migration is in process is disruptive to users
and the business and often results in productivity loss. However, maintaining user access to
sources during migrations can lead to problems with locked files and missing new or recently
updated files. As a result, cutovers to new storage platforms rarely go as smoothly as planned.
How do Companies look at their NAS Storage today?
NAS storage is a vital part of managing storage in any industry. Companies widely use NAS to share data with Windows and Unix hosts, and they tend to view NAS storage as a data container rather than looking closely into each of its aspects.
Companies protect their data by backing it up with Snapshots. Most of the time, when customers lose data, it is successfully restored to its original path by NAS administrators, who take Daily, Weekly, and Monthly snapshots based on requirements.
Companies consider the deduplication feature one of the greatest assets of NAS technology. By running it against duplicate data they save terabytes of space and reclaim a lot of room for their other file systems.
Typically, numerous DR tests are conducted to check data robustness and to prove that access to file share data can be restored successfully in a real DR scenario.
Companies move their unused NAS data to secondary storage saving significant space in the
existing file systems. They value this feature which helps utilize the file system space in an
efficient way by moving the unused data to a secondary storage device.
Administrators' views of managing today's NAS infrastructure
Administrators who manage file storage face a number of difficulties on a daily basis. A NAS box can serve millions of Shares and Exports, often with no record of which share or export hosts which critical application. When an issue arises on a particular host that accesses the NAS box, administrators find it difficult to track down the share owners, users, and applications. Data load also needs to be evenly distributed among the NAS boxes to mitigate performance issues.
File system migration is another important hurdle that IT shops face in today’s world. File
migrations are seldom smoothly driven towards completion. Many issues are noticed post
migration by end users searching for their data and their Home drives that are on the NAS
shares. The biggest challenge is in coordinating with the Windows and Unix users in moving the
data from one NAS box to another.
Managing data security is an ongoing concern for administrators. When security solutions are
not deployed considering the workload, managing the data security becomes a challenge.
Storage reclamation is another painful area. Many file shares and user home drives could be removed from the environment after an employee has left or a server is decommissioned. But how focused are you on performing end-to-end storage reclamation?
Many administrators also do not put the alerting and monitoring system to its best use, so they face difficulties managing the NAS day in and day out.
Practical approaches to efficiently manage your NAS
While all of today's NAS systems are easy to deploy and can be brought into production within minutes, efficiently managing file storage continues to be a nightmare for administrators. There are key practices that must be followed to avoid obstacles when managing your NAS storage.
1. Plan ahead; know your current and future needs
File storage has a habit of growing like a thick forest, and the denser it is, the bigger your headache. Continually buying storage is not always the right way to meet your requirements. You must also pause to perform a systematic analysis of your data regularly; this is a key metric for knowing whether your current storage is being utilized efficiently. It not only lets you understand your current state but also shows where your organization's data trend is heading. Administrators should also make effective use of policies and limits like the Quota feature, which forces users to remove unwanted data when storage is nearing its limit.
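The regular data analysis recommended above can be sketched as a simple capacity-trend projection. This is a minimal illustration; the usage figures, the 10 TB capacity, and the linear-growth assumption are all invented for the example:

```python
# Sketch: project when a file system will hit its capacity from monthly
# usage samples, assuming roughly linear growth. All numbers are
# illustrative, not drawn from any real array.

def months_until_full(samples_gb, capacity_gb):
    """Estimate months of headroom left, using the average growth
    between the first and last samples."""
    if len(samples_gb) < 2:
        return None
    growth_per_month = (samples_gb[-1] - samples_gb[0]) / (len(samples_gb) - 1)
    if growth_per_month <= 0:
        return None  # flat or shrinking usage: no projected fill date
    remaining = capacity_gb - samples_gb[-1]
    return remaining / growth_per_month

# Six monthly usage readings (GB) for a hypothetical 10 TB file system
usage = [4200, 4550, 4900, 5300, 5650, 6000]
print(round(months_until_full(usage, 10240), 1))  # 11.8 months of headroom
```

A report like this, run monthly, tells you both whether storage is used efficiently and where the data trend is heading.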
2. Know your Data
How well do you know your data? This is a question every administrator should ask themselves. Classify your data based on age, size, type, etc., so you know your data's characteristics better and invest in the right NAS solution for your environment. This classification also lets mission-critical data be placed on high-performance NAS filers while less utilized information goes on other file servers. Plan your file system containers based on this classification so you know where your data resides according to its profile.
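An age-based classification of the kind described above can be sketched as follows; the tier names and day thresholds are illustrative assumptions, not a standard:

```python
# Sketch: classify files by age so hot data stays on high-performance
# filers and cold data becomes an archive candidate. The 30-day and
# 365-day thresholds are invented for illustration.

def classify(age_days):
    if age_days <= 30:
        return "hot"    # keep on high-performance NAS
    if age_days <= 365:
        return "warm"   # standard file servers
    return "cold"       # candidate for secondary storage

# Hypothetical (name, age-in-days) pairs
files = [("report.docx", 12), ("build.log", 200), ("backup2019.tar", 900)]
tiers = {name: classify(age) for name, age in files}
print(tiers)  # {'report.docx': 'hot', 'build.log': 'warm', 'backup2019.tar': 'cold'}
```

The same idea extends to size and type; what matters is that placement decisions follow the classification rather than guesswork.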
3. Know your applications
Do you know how your data is being used by your applications? Some applications demand high availability and zero downtime; how are you going to handle this? Know whether your application's workload is random or sequential in nature, and provision the storage with appropriate redundancy and data protection.
4. Keep the needed, move the rest
No system has 100% active data. Knowing what percentage of your total data is active will help you plan your storage better. Plan to park your inactive data on a secondary storage device. Most NAS systems today have automatic archival solutions which move data in and out based on usage. Make sure you have a 24/7 available archival solution attached to your file storage so your unused data is at rest. The other big problem that file storage often sees is duplication of data. How effectively are you utilizing the deduplication feature of your NAS to overcome this? At a minimum, all your hot file systems should have deduplication turned on to ensure that total file system space is used effectively.
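The core idea behind deduplication is detecting identical content by hashing. Real NAS arrays deduplicate at the block level; this file-level sketch over in-memory byte strings just illustrates the principle:

```python
import hashlib

# Sketch: group files with identical content by comparing content hashes.
# A real array dedupes at block level; here, file contents are simple
# in-memory byte strings for illustration.

def find_duplicates(files):
    """files: dict of name -> bytes. Returns groups of names whose
    contents are byte-identical."""
    by_digest = {}
    for name, data in files.items():
        digest = hashlib.sha256(data).hexdigest()
        by_digest.setdefault(digest, []).append(name)
    return [group for group in by_digest.values() if len(group) > 1]

files = {"a.txt": b"quarterly numbers",
         "b.txt": b"quarterly numbers",
         "c.txt": b"unique content"}
print(find_duplicates(files))  # [['a.txt', 'b.txt']]
```

Every group beyond its first member is reclaimable space; that is where the terabytes of savings mentioned earlier come from.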
5. Secure your data wherever possible
Information security is of utmost importance these days, and your NAS storage becomes a soft target if not taken seriously. Hacking into a NAS system can expose potentially sensitive information stored on it, and an attacker can also use it to hijack information from other devices in your environment using multiple techniques. This is why securing your NAS data wherever possible is critical. Practices you can follow to secure your NAS include:
Segregate your NAS traffic within a highly secured VLAN, away from all your other networks.
Deploy file data encryption whenever possible, and make sure your NAS appliance supports it when choosing the solution.
Make sure you have a powerful anti-virus solution supporting your environment, so that no critical piece of data is exposed to the outside world without being scanned.
6. Know who owns the data
In a vast file storage environment, administrators deal with thousands of requests to create and manage file shares on a daily basis. There are many issues regarding share access, share resizing, and so on. The task becomes even more difficult when your company must adhere to complex processes and procedures that prevent you from acting on a request without approval from the business or the share owners. The fact that administrators normally do not keep track of share owner details causes a lot of administrative overhead when it comes to managing the shares.
To avoid this situation, a best practice is to record the share owner's name or department during the share creation process. This in turn helps a NAS administrator easily track shares and users' home drives among hundreds of shares.
This helps most when you have to deal with a file storage data migration: you save the time normally spent figuring out share ownership details because you already have them handy, and you know whom to contact and can get approval with minimal effort.
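A minimal ownership registry of the kind described above might look like the following sketch; the field names, owner, and department values are invented for illustration:

```python
# Sketch: record the share owner and department at share-creation time so
# migrations and access requests can find the owner instantly. Names and
# fields are illustrative placeholders.

shares = {}

def create_share(name, owner, department):
    shares[name] = {"owner": owner, "department": department}

def owner_of(name):
    """Look up a share's owner; 'unknown' flags exactly the untracked
    shares that cause administrative overhead."""
    return shares.get(name, {}).get("owner", "unknown")

create_share("finance_q3", "a.kumar", "Finance")
print(owner_of("finance_q3"))    # a.kumar
print(owner_of("orphan_share"))  # unknown
```

In practice this lives in a CMDB or a spreadsheet rather than a script; what matters is that ownership is captured at creation, not reconstructed at migration time.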
7. Performance matters
Are we aware of the life cycle of our data residing in NAS? Have we thought about whether one file or a set of data will be accessed simultaneously, and if so, by how many users and applications? How many login sessions are created to access this information, who are these customers, and what is the nature of their work? Are they a single customer or a community?
We should also learn the I/O pattern of our applications: is it random or sequential? If your access pattern is random, the best practice is to adopt FC drives for those applications; if it is highly sequential, ATA drives suit the need. Measuring NAS performance is also much more complex than in the SAN space, as NAS encompasses many entities: host, network, and disk drives. One needs to analyze performance on each device to find the source of the bottleneck. We need to set performance expectations with the client and understand the capabilities of the existing environment to see whether those expectations can be met. If not, a complete re-design is required.
A best practice in a NAS environment is to measure performance through various tools. Pull reports that show the NFS and CIFS IOPS, the top talkers, and the number of CIFS sessions open at any point in time.
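Deriving "top talkers" from per-client operation counts can be sketched as below; the client addresses and IOPS figures are invented sample records, not output of any real reporting tool:

```python
from collections import Counter

# Sketch: aggregate per-client operation counts into a "top talkers"
# report, the kind the text recommends pulling alongside NFS/CIFS IOPS.
# The sample (client, iops) records are invented for illustration.

ops = [("10.0.0.5", 1200), ("10.0.0.9", 300),
       ("10.0.0.5", 800), ("10.0.0.7", 950)]

totals = Counter()
for client, iops in ops:
    totals[client] += iops

# The two busiest clients, highest aggregate load first
print(totals.most_common(2))  # [('10.0.0.5', 2000), ('10.0.0.7', 950)]
```

With such a report, a sudden new top talker is visible immediately instead of surfacing as user complaints.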
Another best practice: when you place NAS into an environment that already has SAN, ensure the SAN is occupied no more than 70%. SAN and NAS workloads should always sit on separate spindles at the back end to gain the most performance. Also check that retransmissions of NAS packets remain low; a high number of retransmissions clearly indicates issues on the customer's network.
8. Network: the main backbone of NAS
Network is the backbone of any NAS environment. A bad network can lead to troubles with data
replication, file share slowness, user complaints about NAS getting timed out, etc.
A best practice is to use a file server’s network ports wisely. Bandwidth allocated to NAS should
be cleverly used. At any point in time, segregating the data network and replication network will
help utilize the given bandwidth effectively.
There are many techniques to make data flow equally through all the NAS ports without over-utilizing a single port; this provides better results and avoids network congestion on the NAS end. Also make sure no duplex mismatch exists in the environment. Your network administrator holds the key here, ensuring that the health of the switch ports is checked regularly and the duplex settings comply with recommended best practices.
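The low-retransmission check recommended in the performance discussion can be sketched as a simple ratio test; the counter values and the 1% threshold are illustrative assumptions, not a vendor specification:

```python
# Sketch: flag high packet retransmission rates, which the text calls a
# clear sign of network trouble. Counters and the 1% threshold are
# illustrative; a real check would read interface/TCP statistics.

def retransmit_ratio(segments_sent, segments_retransmitted):
    return segments_retransmitted / segments_sent

def network_healthy(sent, retrans, threshold=0.01):
    """True if retransmissions stay below the chosen threshold."""
    return retransmit_ratio(sent, retrans) < threshold

print(network_healthy(500000, 1200))  # 0.24% -> True
print(network_healthy(500000, 9000))  # 1.8%  -> False
```

Run periodically against switch or filer counters, a check like this turns "NAS feels slow" complaints into an objective network-health signal.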
9. Protecting your NAS
Any NAS box in a customer environment should have an efficient event monitoring and alerting setup. If any NAS functionality has a problem, it should alert a NAS administrator to triage and fix it. We can configure which events on the NAS need the alert facility: the customer decides which events warrant an alert, and once those are configured, the NAS box should be tested to confirm that a test alert reaches the users configured in the alert facility.
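A minimal event-routing filter of the kind described above might look like this; the severity names, the opt-in set, and the recipient address are invented placeholders:

```python
# Sketch: route only customer-selected event severities to alert
# recipients, as the text describes. Severity names and the recipient
# address are illustrative placeholders.

ALERT_ON = {"critical", "error"}          # severities the customer opted into
RECIPIENTS = ["nas-admins@example.com"]   # placeholder address

def route_event(severity, message):
    """Return the alert messages that would be sent, if any."""
    if severity not in ALERT_ON:
        return []
    return [f"To {r}: [{severity.upper()}] {message}" for r in RECIPIENTS]

print(route_event("critical", "File system /fs01 is 95% full"))
print(route_event("info", "Snapshot schedule completed"))  # filtered out: []
```

Sending a synthetic "critical" event through this path is exactly the test-alert verification the text recommends.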
10. Regular DR Drills
When business continuity is one of the most critical needs of your business, you need an
uncompromised disaster recovery solution. How confident are you that you have a 100%
consistent data copy at your target? Are you closely monitoring your data replications?
There have been several instances where companies faced data loss in a real disaster simply because they didn't know what their replications were doing. While the business always demands solutions that can recover data in no time, remember that it's not just about how powerful your DR solution is; it's also about how closely you monitor it.
The key is to always keep an eye on your replications and make sure you have an alerting
mechanism when the replication is broken or out of sync. Conduct regular DR drills at least
once a year so you know your data is not at risk.
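The out-of-sync alerting described above can be sketched as a lag check over replication sessions; the session names, lag values, and the 60-minute threshold are illustrative assumptions:

```python
# Sketch: flag replication sessions whose lag exceeds a threshold, the
# broken/out-of-sync alerting the text recommends. Session data and the
# 60-minute threshold are invented for illustration.

def out_of_sync(sessions, max_lag_minutes=60):
    """sessions: dict of name -> lag in minutes.
    Returns the names lagging past the threshold."""
    return [name for name, lag in sessions.items() if lag > max_lag_minutes]

sessions = {"fs01->dr": 5, "fs02->dr": 240, "fs03->dr": 30}
print(out_of_sync(sessions))  # ['fs02->dr']
```

A scheduled run of a check like this, wired to the alert facility from practice 9, is what separates "we have replication" from "we know replication is working".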
11. Deal with your Data
Because your NAS data is often scattered across thousands of Windows and Unix servers, even IT shops where file data is closely managed have difficulties when it comes to migrating file storage or updating systems to take advantage of newer technology. File storage migration is a lengthy, time-consuming, people-intensive, and highly expensive process.
To break down these barriers:
Start planning your migrations beforehand to avoid a last-minute rush.
Conduct weekly meetings with all stakeholders.
Ensure you have defined a proper pre- and post-migration data functionality test plan.
Have an established cadence sheet with all data and tasks defined and timelines specified.
Know the individuals who own the tasks, and conduct regular technical meetings with all the resources.
Have regular and effective communication via e-mail to all stakeholders.
Make sure you address all concerns and questions raised by change management.
Ensure proper communication to end users post-migration.
Have a systematic user acceptance test and environment functionality test, and open an infant-care call bridge to address any user issues post-migration.
12. Automation makes life easier
Imagine an environment where the administrator receives alerts at his doorstep. Have you ever thought about how scripting can help you manage your NAS environment?
It can work wonders when used effectively. You can automate almost all of your file storage management tasks with a few simple scripts. Regular health checks of the NAS arrays, file system creation and expansion, Quota Management, and DR replications are just a few of those tasks. Well-crafted scripting can also deliver daily status reports of your file storage via e-mail.
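The daily status report could be assembled from the individual checks discussed in the earlier practices. In this sketch the check results are hard-coded placeholders; a real script would query each array and then e-mail the assembled text:

```python
# Sketch: build a daily NAS health report from individual check results
# (capacity, replication, events). Check names and results below are
# placeholders; a real script would query each array, then e-mail this.

def build_report(checks):
    """checks: list of (name, passed) pairs. Returns the report text."""
    lines = ["Daily NAS health report"]
    for name, ok in checks:
        lines.append(f"  [{'OK' if ok else 'FAIL'}] {name}")
    failures = sum(1 for _, ok in checks if not ok)
    lines.append(f"{failures} check(s) failing")
    return "\n".join(lines)

checks = [("capacity under 80%", True),
          ("replication in sync", False),
          ("no critical events", True)]
print(build_report(checks))
```

Each line of the report corresponds to one of the automatable tasks named above, so a single morning e-mail summarizes the whole estate.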
How do you see NAS on Cloud?
The non-stop growth in the amount of information produced today in your file storage
environment poses a significant challenge in storing and managing the data. Shifting your
current traditional NAS environment over to a cloud-based environment might address a lot of
challenges.
A cloud NAS shouldn't be just remote NAS storage accessed over the internet to store your files. It should be a much more intelligent solution, capable of providing many more services than just storing files.
What more can a cloud NAS do to reduce the administrator's overhead? Today's traditional NAS relies on manual management, which increases administrator overhead considerably. So let's think about a solution that automates many of the administrator's tasks. How easy would it be for a user to get all his requests done without the help of an administrator?
The idea behind this discussion is to automate as many NAS management tasks as possible.
Idea at a glance
A unified Cloud NAS management interface that is capable of managing multi-NAS vendor
systems, physical arrays, and cloud NAS services can provide administrators the ability to
provision and configure file storage and network configurations while drastically reducing
overhead. It would also enable administrators to automate several file storage management
tasks such as File System and share creation, storage expansion, resizing, quota management,
replication setup, and so forth on a multi-vendor cloud NAS infrastructure.
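One way such a unified interface might dispatch requests is policy-based provisioning, sketched below. The service-level names, policy fields, and values are purely illustrative; a real implementation would call each vendor's management API:

```python
# Sketch: policy-based provisioning for a unified multi-vendor cloud NAS
# interface. Service-level names, fields, and values are invented; a real
# system would invoke the specific vendor's API where noted.

POLICIES = {
    "gold":   {"replication": True,  "quota_gb": 500},
    "bronze": {"replication": False, "quota_gb": 100},
}

def provision_share(name, service_level):
    """Resolve a request against organizational policy and return the
    resulting share configuration."""
    policy = POLICIES[service_level]
    # A real implementation would call the vendor's API here.
    return {"share": name, **policy}

print(provision_share("eng_builds", "gold"))
# {'share': 'eng_builds', 'replication': True, 'quota_gb': 500}
```

Because the policy, not the administrator, decides replication and quota, a user request can be fulfilled without manual intervention while still honoring organizational standards.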
How it works
Figure 1: Possible NAS Management Lifecycle Workflow
Using policy-based governance, we can ensure that the multi-vendor NAS environment gains the ability to automate many of its management tasks and customer services. Administrators can apply their own way of managing the business without changing organizational standard policies. It also gives organizations the flexibility to have a wide variety of service levels, policies, and automation processes as needed for their business standards. This helps streamline your file storage management and deployment process and eliminates duplicated work through reusable components, which in turn reduces the administrator's burden.
Conclusion
Many of the challenges faced by today's IT organizations come from the disruption incurred while moving data. All major NAS activities, such as data migrations, consolidations, and upgrades, take a long time and are highly people-intensive. The difficulty of separating active from inactive data compromises storage efficiency. Administrators end up spending hours finding the data and the appropriate resources, and coordinating all these activities. This article is an overview of some of those day-to-day challenges and a high-level discussion of important practical approaches that would ease the NAS administrator's job. It also discusses a possible approach for automating many NAS management jobs by introducing NAS into the cloud.
Appendix
1. http://www.esg-global.com/briefs/data-dynamics-breaking-file-efficiency-barriers/
2. http://www.snia.org/sites/default/education/tutorials/2011/spring/file/Anjan_Dave_Storage_Mgmt_Spring_2011.pdf
3. http://searchsmbstorage.techtarget.com/feature/Cloud-network-attached-storage-NAS-emerges-to-address-secondary-file-storage-needs
4. http://www.datadynamicsinc.com/rising-data-volumes-demand-for-network-attached-storage-driving-need-to-automate-storage-deployments/
5. EMC NAS Performance Workshop Student Document
EMC believes the information in this publication is accurate as of its publication date. The
information is subject to change without notice.
THE INFORMATION IN THIS PUBLICATION IS PROVIDED “AS IS.” EMC CORPORATION
MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY KIND WITH RESPECT TO
THE INFORMATION IN THIS PUBLICATION, AND SPECIFICALLY DISCLAIMS IMPLIED
WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE.
Use, copying, and distribution of any EMC software described in this publication requires an
applicable software license.