Software Defined Labs (SDL)
Revolutionizing the way lab resources are managed and making test automation incredibly simple.
Table of Contents
I. Executive Summary
II. The Legacy IT Lab
III. Definition of a Software Defined Lab
IV. Benefits
V. The Case for a Software Defined Lab Ecosystem
tokalabs.com
Executive Summary

IT test labs have not kept pace with the technology they are intended to support. As the market moves toward more software defined fill-in-the-blank solutions and scale expands with IoT and cloud everything, test teams across vendors and enterprises are struggling to keep up. By deploying a Software Defined Lab (SDL), these teams can leapfrog forward, reducing test cycle times and lab costs by over 50%.
Table 1: Legacy Lab vs SDL Man Hour Comparison

Testbed Definition:
  DUT1: Firewall Model A
  DUT2: Firewall Model B
  LoadGen: Ixia/BreakingPoint

Test Suite Definition:
  General Description: SSL Performance Comparison Test
  Number of Test Runs: 42
  Planned Time of Test Execution: 6 hours

Resources Utilized:                              Legacy Lab   SDL
  # People Required                                    3        1
  Hours Spent Idle (waiting for others)                4        0
  Hours Spent Configuring Physical Environment         1        0
  Hours Spent Configuring Testbeds                     2        0
  Hours Spent Automating Tests/Refactoring             6        0.05
  Hours Spent Testing                                  6        6
  Hours Spent Restoring to Default                     1        0.05
  Hours Spent Collecting Data                          1        0.05
  Total # of Man Hours                                21        6.15

A 70% increase in productivity with SDL!
The Legacy IT Lab
What it is

There are many different types of labs in the IT space, each geared toward different objectives for its organization. They range from end-user IT teams attempting to determine their next best server deployment to the largest vendor in the world trying to maintain the highest quality while still cranking out thousands of new devices a year. Using the large scale manufacturer as an example, to take a product from concept to delivery, they follow a workflow that traverses a number of labs, each with its own purpose, which may include R&D, DVT, EVT, Sustaining, QA, SQA, Manufacturing, Performance, TME, Interoperability, POC, Support, etc. Once a product finally hits the market, it is then likely to be put through comparative testing in the labs of potential end-user customers.
Regardless of whether the lab belongs to a vendor, customer, analyst, university, benchmarker, etc., or the objective is to determine performance, quality, best fit, etc., most share a certain commonality: they are all overly dependent on human interaction. While many labs have implemented automation frameworks and orchestration processes, even the most efficient of the bunch still require a heavy amount of manual human labor.
The first issue this presents is that legacy labs are bound by the repetitive, manual work of racking, cabling, and configuring before testing can even begin. Then, after the tests are executed, there is a necessary teardown, and the cycle repeats for every new test bed put through it (Figure 1).
Figure 1: Legacy Lab
The Problem with Scale

To put this in perspective, consider a switch, and the large number of device and software combinations that must be tested before a manufacturer has the confidence to begin selling it in the open market. To adequately test just one switch, hundreds if not thousands of tests may be run across a dizzying number of testbeds. This is why large manufacturers employ thousands of engineers in quality assurance roles alone. In addition to the lost time associated with all of this manual work, there is a tremendous amount of waste in the utilization of valuable resources. Often, costly infrastructure such as load generators and other test equipment sits idle while these physical configurations and reconfigurations take place. This occurs because each lab is constrained by access to its own resources and the sharing of these devices. For example, when test engineer A takes longer than anticipated to configure their test bed, test engineer B may be delayed in their completion. These delays cascade downstream to every lab in the workflow and negatively impact the entire product delivery schedule.
Barriers to Automation

The next major issue found with legacy labs is the tests themselves and what is required to build and execute the best possible tests. In some instances, one person can do it all: rack, stack, cable, configure, automate, test, etc. In most cases, these tasks require differing skillsets, which means that larger labs separate the work into three primary functions: lab management (cabling/racking/network configuration), testing, and automation. Building and executing tests begins with creating a series of scripts that grows into a library over time. It requires someone with the ability to program in a particular language and with some common knowledge of the underlying product and technology. It is common for new test engineers to have one but not the other, so it takes time for ramp up, training, and acclimation before they are able to produce complex tests with multiple steps, if/thens, loops, and pass/fails. As time goes on, these test libraries grow and become a source of strength, but also a potential weakness. As new engineers join and previous ones leave, protocols change, and programming languages fall in and out of favor, test libraries can easily become cumbersome and fragmented.
If the lab is capable (has the time, expertise, etc.), it may begin to implement more automation to combat these issues and create more efficiencies. This in turn manifests the next set of issues: automation usually requires a new set of skills and a similar cycle of ramp up followed by diminishing value over time.
Organizational Complexities

From a personnel standpoint, the end result is three distinct roles that don't always align. In very large organizations, there may be a person responsible for just cabling and another for network configuration, both of whom might operate behind a helpdesk ticketing system that the test engineer must use to request changes. This results in delays and bureaucracy that become the source of frustrations and running jokes. These problems have existed since the beginning of information technology, but there are many forces at play that are pushing labs to the breaking point today. While "the cloud" might seem to make things easier, it is just another set of variables (AWS vs Azure vs Google vs…) that further exacerbates the problems. Similarly, the Internet of Things (IoT) brings an innumerable amount of additional devices that need to be tested. Combined with the sheer speed of technology shifts and the competition just to stay current, labs of all sizes and purposes will need to modernize or face potential obsolescence.
Definition of a Software Defined Lab

Building on the concepts of Software Defined Networking (SDN), a Software Defined Lab (SDL) is one that mitigates the constraints of fixed associations and, optionally, physical connections. It allows for maximum flexibility and longevity with regard to differing knowledge, skillsets, and technology.
In an SDL, all tests and the underlying automation are built to reference resources by their assigned abstractions, as opposed to specific device names, IP addresses, etc. These testbeds can be as small as a single device or scale infinitely, and they should be completely shareable amongst team members with an ability to reserve on demand.
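As a hypothetical illustration of this abstraction (the names, data shapes, and resolver below are invented for this sketch and are not Tokalabs' actual interface), a testbed might declare abstract roles that are bound to concrete lab resources only at run time:

```python
# Hypothetical sketch: a testbed described by abstract roles rather than
# fixed device names or IP addresses. A resolver maps each role to whatever
# concrete resource (physical, virtual, or cloud) currently matches.

testbed = {
    "dut1":    {"role": "firewall", "model": "A"},
    "dut2":    {"role": "firewall", "model": "B"},
    "loadgen": {"role": "traffic-generator"},
}

# Example inventory of concrete lab resources (normally live data).
inventory = [
    {"name": "fw-rack3-02", "role": "firewall", "model": "A", "addr": "10.0.1.12"},
    {"name": "fw-vm-17",    "role": "firewall", "model": "B", "addr": "10.0.2.40"},
    {"name": "ixia-port-5", "role": "traffic-generator",      "addr": "10.0.9.3"},
]

def resolve(testbed, inventory):
    """Bind each abstract role to the first matching available resource."""
    bound = {}
    for alias, want in testbed.items():
        for res in inventory:
            if all(res.get(k) == v for k, v in want.items()):
                bound[alias] = res
                break
        else:
            raise LookupError(f"no resource satisfies {alias}: {want}")
    return bound

bound = resolve(testbed, inventory)
# Tests address bound["dut1"]["addr"] and so on, so the same test runs
# unchanged when the underlying devices are swapped out.
```

Because the test only ever sees the bound mapping, replacing a physical firewall with a virtual one requires no change to the test itself, only to the inventory.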
An SDL may have a centralized network fabric into which all physical devices are cabled via Layer 1 or Hybrid L1 (Figure 2). This is done just once, during the initial deployment and configuration of the SDL or, later, as new physical devices are added to the lab. Virtual devices and cloud instances can be added to or removed from this mesh dynamically. All three (physical, virtual, cloud) are thus abstracted and can be linked in any combination. In addition, these links themselves are abstractions.
Figure 2: Layer 1 or Hybrid L1 Connected Resources
Benefits of a Software Defined Lab

By building tests and automation based on abstracted resources, the lab gains the ability to easily clone and repurpose tests and test beds. Every combination of variables can easily be created and saved in an expanding "virtual" library that can be pulled from in the future. Since they are not dependent on device names or specific addressing, it is simple to apply any number of tests to any number of topologies. This flexibility in repurposing existing tests dramatically reduces the need for creating new tests or test-specific automation policies.
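The repurposing idea can be sketched in a few lines (again hypothetical; the test body and topology names are invented): because the test never hardcodes a device identity, the same test applies to every saved topology.

```python
# Hypothetical sketch: one abstract test applied to several topologies.
# A "topology" here is just a mapping of abstract roles to concrete
# addresses; the test body never mentions a specific device.

def ssl_performance_test(bound):
    """Toy stand-in for a real test: report which endpoints it would use."""
    return f"drive {bound['loadgen']} against {bound['dut']}"

topologies = {
    "firewall-A": {"dut": "10.0.1.12", "loadgen": "10.0.9.3"},
    "firewall-B": {"dut": "10.0.2.40", "loadgen": "10.0.9.3"},
}

# The same test runs unchanged against every saved topology.
results = {name: ssl_performance_test(t) for name, t in topologies.items()}
```

Adding a third topology to the library means adding one mapping, not writing a new test.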
With an SDL, the vast majority of physical device connectivity and configuration is done one time only. After that initial setup, all devices are abstracted and visualized in a central GUI so that multiple users can simultaneously view real-time availability, create test beds on demand, reserve dynamically, and begin testing immediately (Figure 3).
This eliminates legacy lab issues such as having to physically locate a device (as in "who borrowed my switch?"), asking for or requiring a cable change, network configuration, moving equipment, or the myriad other related issues. The result is a significant savings in man hours and improved utilization of valuable resources such as load generators and related infrastructure. For instance, a load generator can now be shared down to a single port for a few minutes, as opposed to tying up an entire appliance for a day.
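Port-granular sharing amounts to reserving time windows per port rather than per appliance. A minimal sketch, with invented device names and an in-memory scheduler standing in for the lab controller:

```python
# Hypothetical sketch: reserving a shared load generator at port
# granularity, so short jobs don't tie up the whole appliance.
# Times are minutes from an arbitrary epoch; all names are invented.

class PortScheduler:
    def __init__(self):
        self.bookings = {}  # (device, port) -> list of (start, end, owner)

    def reserve(self, device, port, start, end, owner):
        """Book [start, end) on one port; reject overlapping requests."""
        slots = self.bookings.setdefault((device, port), [])
        for s, e, _ in slots:
            if start < e and s < end:  # intervals overlap
                return False
        slots.append((start, end, owner))
        return True

sched = PortScheduler()
sched.reserve("ixia-1", 5, 0, 15, "alice")        # Alice: 15 minutes on port 5
sched.reserve("ixia-1", 6, 0, 240, "bob")         # Bob: 4 hours on port 6
ok = sched.reserve("ixia-1", 5, 10, 20, "carol")  # rejected: clashes with Alice
```

Alice and Bob share one appliance concurrently; in a legacy lab, Bob's four-hour run would have blocked the entire device.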
By allowing the dynamic linkage of any combination of physical and virtual devices along with cloud instances, test engineers are free to move at their own speed, unencumbered by peers or red tape. Also, during testing, and especially in debugging scenarios, engineers often need access to third-party tools on the fly. An SDL allows the immediate spin up of virtual machines or third-party applications and tools in the environment they are already working in (e.g. IxLoad, Jenkins, etc.).
Figure 3: Abstraction in a Software Defined Lab
The net result is a reduction in wasted man hours and wasted resource utilization. Further, a significant simplification is achieved by implementing a single-pane-of-glass GUI that standardizes test creation and execution, automation, and orchestration. This combination of doing more with less makes it easier for smaller organizations and teams to establish and run labs of their own. For existing labs, modernizing to an SDL will also enable a significant improvement in both CAPEX and OPEX.
The Case for a Software Defined Lab Ecosystem

While an individual lab is certainly capable of reaping the benefits of being software defined, there is even greater potential if SDL is implemented across the entire IT ecosystem, from vendor to end-user (Figure 5).
Figure 4: SDL Stack
Figure 5: SDL across the IT Ecosystem
[Figure 4 diagram labels: Software Defined Lab; Web GUI / REST API; Orchestration; Automation; Inventory Management; Topology Management; Test Builder; Scheduling; Reporting/Analytics; Test Libraries; User Management; Lab Infrastructure (Servers, Storage, Network Devices, Private Cloud, Public Cloud)]
[Figure 5 diagram labels — ecosystem roles and benefits: R&D: Release Faster; QA: Test Smarter; TME: Beat the Competition; Support: Reduce Mean Time to Resolution; DevOps: Capture Configs, Roll Out Confidently; Sales: Demo/POC Faster; PS: Deliver on Time; Training: Scale Smart; Escalations: Replicate Accurately]
While the vendor-to-end-user relationship has typically been viewed as seller and buyer, both organizations stand to mutually benefit from a more collaborative partnership. As illustrated in Figure 5, utilizing SDL across all functions of the product delivery life-cycle, in conjunction with the customer, provides efficiencies throughout. With the move toward software defined everything, collaboration between customer and vendor is more important than ever. In the legacy big-iron model, the vendor held sole responsibility for testing from cradle to grave. Now, as customers implement more solutions based on reference architectures and build-your-own designs, collaboration will not be a nice-to-have but a requirement. This is due to the number of vendors that can be involved in delivering a final production-ready software defined solution. There may be a primary software vendor, some open source, a chassis vendor, and then all of the components within it, down to the cables. The end-user will have the ultimate responsibility to QA the final build and any revisions that come along over time, but being able to share tests, topologies, versions, etc. across the ecosystem will greatly hasten this process and allow for quick resolution when bugs and issues arise.
By cooperating across the ecosystem, labs can easily collaborate and avoid the struggle of issue recreation. This cuts down on finger pointing, resulting in a better customer experience and faster adoption of software defined solutions throughout.
To learn more about Tokalabs and how to upgrade to a Software Defined Lab, contact us at [email protected]
About Tokalabs
Tokalabs enables customers to create Software Defined Labs (SDL) - a simple push-button approach to building and sharing lab resources, network topologies, and automated test beds. Through Tokalabs, Development, QA test, and Support teams can design, create, manage, and automate functions across any network device regardless of vendor or supported management protocol. Whether it's automating simple device functions or full-scale network orchestration, Tokalabs takes the pain and complexity out of the process.