A DATICAL WHITE PAPER
The DevOps Database
Contents

The DevOps Database - Part 1: Applying "The Three Ways" of DevOps to Database Change Management
The DevOps Database - Part 2: Applying Systems Thinking to Database Change Management
    Start With Reality
    Don't Script! Model!
    No Need for SQL
    Reliable Change Design
    Forecast: Impact Assessment on a Representative Model
    Traceable History of the Evolution of Your Schema
    Easy Comparison of Models
    Clear View of Completeness
    Version Twice, Deploy Once
    Unify Your Modes of Delivery
The DevOps Database - Part 3: Applying Feedback Loops to Database Change Management
    Always Remember Where You Came From
    Know Where You Are
    Know Where You're Headed
The DevOps Database - Part 4: A Culture of Continual Experimentation and Learning
    A Rollback Designed With Every Change
    Richness of Model-Based Comparison and Remediation
    Flexible Quick Rollback
Conclusion
About Pete Pickerill
About Datical
The DevOps Database - Part 1
Applying “The Three Ways” of DevOps to Database Change Management
In framing the discussion for this paper, I relied heavily on “The Three Ways” of DevOps. The Three Ways
are principles which underpin the DevOps patterns that Gene Kim discusses in detail in his novel
The Phoenix Project and in The DevOps Cookbook, written by John Allspaw, Patrick
Debois, Damon Edwards, Jez Humble, Kim, Mike Orzen, and John Willis. Here’s a
quick summary of The Three Ways:
The First Way: Systems Thinking – This Way stresses the performance of the
entire system of value delivery. Instead of becoming laser focused on the part of
the process for which an individual or team is responsible, the individual or team
works to understand the entire process from requirements generation to
customer delivery. The goal is to eliminate the delivery impediments that arise
when a project transitions from one isolated silo to another. Understanding the
entire system allows business, development, and operations to work towards a
common goal in a consistent manner.
The Second Way: Amplify Feedback Loops – This Way deals primarily with facilitating easier and faster communication between all individuals in a DevOps organization. The goals are to foster better understanding of all internal and external customers in the process and to develop an accessible body of knowledge to replace dependence on "tribal knowledge."
The Third Way: Culture of Continual Experimentation and Learning – This Way emphasizes the benefits that can be realized through embracing experimentation, risk-taking, and learning from failure. By adopting this kind of attitude, experimentation and risk-taking lead to innovation and improvement, while embracing failure allows the organization to produce more resilient products and sharpen the skills that allow teams to recover more quickly from unexpected failure when it does occur.
Database change management is a unique challenge when adopting an agile development practice
or implementing DevOps patterns.
It really straddles two groups: the Developers and the DBAs. Developers design and author application
schema changes based on the needs of the business. DBAs are on the hook for providing a secure and high
performing data platform and protecting the integrity of the organization’s priceless data. In companies
where these two groups are isolated, the goals of each can soon become opposed and forward progress
can grind to a halt.
Because of this unique division and its impact on the Application Lifecycle, implementing DevOps patterns to nurture understanding and collaboration between these two groups is crucial for companies trying to adapt to a market that expects better value of services through more frequent, high-quality releases.
In the following pages we dive more deeply into each of the Ways as they pertain to Database Change Management, referencing Gene Kim's writings on The Three Ways.
The DevOps Database - Part 2
Applying Systems Thinking to Database Change Management
The First Way: Systems Thinking – This Way stresses the performance of the entire system of value
delivery. Instead of becoming laser focused on the part of the process for which an individual or team is
responsible, the individual or team works to understand the entire process from requirements generation
to customer delivery. The goal is to eliminate the delivery impediments that arise when a project transitions
from one isolated silo to another. Understanding the entire system allows business, development, and
operations to work towards a common goal in a consistent manner.
Experimentation and risk-
taking lead to innovation
and improvement.
Embracing failure allows
the organization to produce
more resilient products and
sharpen skills that allow
teams to recover more
quickly from unexpected
failure when it does occur.
When we founded Datical, our first step was to perform extensive market validation. We quickly learned that database schema management was going to be a tough nut to crack. In the scores of conversations we
had with people throughout the ALM spectrum, we learned that the process for managing and updating
the database schema that supports an application was at best murky and at worst a black box. So how do
we elevate a process owned and understood by a few people in an organization to the level of visibility
required to understand that process as part of a larger system?
What follows is a list of concepts pertaining to Systems Thinking that we've rallied around in providing our
solution Datical DB. Instead of the black box, Database Change Management can become a transparent
and flexible part of your value delivery system.
Start With Reality
When beginning a new project based on previous development, don't rely on a stack of SQL scripts on a file server or even in a source code control system. As you know, databases evolve over time, and out-of-process changes happen all too often.
When a database schema is modified to resolve an error condition or performance degradation, these alterations are usually handled in a support ticketing system. You can never be certain that they were added to the stack of scripts used to build out a fresh environment. In light of this, any database change management solution should start by generating a baseline from the working system: your production schema.
This ensures that design and development activity are taking into account not only those schema objects
generated in Dev, but also those which originated in other stops in the system.
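The baseline-first idea can be sketched with a few lines of introspection code. This is an illustrative sketch only, not Datical DB's actual baseline mechanism; it uses SQLite's system catalog as a stand-in for a production database, and the table names are invented:

```python
import sqlite3

def snapshot_schema(conn):
    """Capture the live schema as the baseline, rather than trusting
    a stack of scripts that may have drifted out of process."""
    rows = conn.execute(
        "SELECT name, sql FROM sqlite_master "
        "WHERE type = 'table' AND name NOT LIKE 'sqlite_%' ORDER BY name"
    ).fetchall()
    return {name: sql for name, sql in rows}

# "Production" contains a hotfix table that the script stack never
# recorded; baselining from reality still sees it.
prod = sqlite3.connect(":memory:")
prod.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)")
prod.execute("CREATE TABLE hotfix_audit (id INTEGER PRIMARY KEY, note TEXT)")

baseline = snapshot_schema(prod)
print(sorted(baseline))  # ['customers', 'hotfix_audit']
```

The point is the source of truth: the snapshot is taken from the running system, so objects that originated outside Dev are captured too.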
Don’t Script! Model!The vast majority of schema management today is handled through the generation, review, and execution
of SQL scripts. These scripts can be tiny or huge; they can encapsulate the creation and relationships of
several objects or they can describe a one-time alteration to a single object. Once executed they generally
leave no history of their passing other than the presence of the pieces they create, delete or modify; you
can be dependent on hundreds of small scripts or on one giant script to build out new environments or
evaluate existing ones. You’re left with a schema that is a massive collection of individual parts applied in
an order you can’t reproduce.
Assessing how compatible the schema you develop against is with what's in production is a time-consuming and error-prone process. When you craft new changes, ferreting out the possible impact on other objects in your schema is not intuitive. The specific business needs associated with the individual
parts of your schema get murky over time and it's harder for you to design to your future instead of accommodating your past. There is no traceable history of who did what and why from environment to environment. Application issues caused by database errors become hard to troubleshoot because there is no easily digestible standard to use as a measuring stick in evaluating a malfunctioning environment.
At Datical, we talk a lot about the model-based approach we take to database schema management and how it's superior to the scenario described above. When you use Datical DB to employ a model of your database schema, everyone in your organization is working with a transparent and comprehensive representation of your entire schema. There's no more wandering off into the weeds of individual changes with no sense of history or relationship when trying to make sense of your schema. The model enables a faster, safer, and more transparent method of managing your schema. Here are a few of the perks:
No Need for SQL
Instead of writing SQL, the model powers a series of graphical, form-based wizards that take the guesswork out of change authoring. That said, you can leverage your existing scripts with Datical DB and automate at your own pace, a key requirement for any database change solution.
Reliable Change Design
Reliable change design is based on what your database is, not what you think it should be.
When you need to update the schema, relationships between objects are clearly mapped out in the model
and their purposes and history are fully documented and easily accessible.
Forecast: Impact Assessment on a Representative Model
Prior to execution, the model can be used to simulate proposed changes in memory, without touching your database, allowing you to deploy with confidence in sensitive environments.
Traceable History of the Evolution of Your Schema
Managing schema change with the model becomes an exercise in incrementally updating a single historical document. Changes are described in a simple, readable format. The application features and business initiatives the changes support are tied to the changes themselves, giving data architects and developers insight into why a change was made. The reliance on tribal knowledge disappears because everything you need to know about the "why" of a database change is tied to the change itself.
Easy Comparison of Models
Every database instance is now an instance of the same model. The model provides the structure you need to quickly ascertain what's missing, what's wrong, and what needs to be done to bring your disparate environments into synchronization.
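The kind of comparison a model enables can be sketched as a dictionary diff. This is a deliberate simplification of what a real model comparison does, and the object names and definitions are invented:

```python
def diff_models(expected, actual):
    """Compare two schema models (object name -> definition) and report
    what's missing, what's unexpected, and what differs."""
    missing = sorted(set(expected) - set(actual))
    extra = sorted(set(actual) - set(expected))
    changed = sorted(
        name for name in set(expected) & set(actual)
        if expected[name] != actual[name]
    )
    return {"missing": missing, "extra": extra, "changed": changed}

# The gold-standard model versus a drifted test environment.
gold = {"customers": "id INT, email TEXT", "orders": "id INT, total REAL"}
test_env = {"customers": "id INT", "invoices": "id INT"}

report = diff_models(gold, test_env)
print(report)
```

Because every instance is described in the same structure, the answer to "what needs to be done to synchronize?" falls out mechanically instead of through manual review.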
Clear View of Completeness
The operations personnel that deploy and monitor your applications can easily establish that the database schema is everything it should be. They can detect drift more easily than they could previously by eyeballing diagrams or reviewing batches of deployment scripts. If reality doesn't match the model, the path to remediation is clear.
Version Twice, Deploy Once
The first level of versioning any database change management tool should provide is versioning of the gold standard.
Versioning the scripts or model that you use to build a fresh database instance for your application provides a huge number of benefits. You can track how your schema has evolved over time and across releases, and you can tightly couple a schema definition to the version of the application it supports using the same branch/tag/merge workflow that you use for your application code. You can also quickly stand up a new database instance that you know is correct for any released version or experimental branch you are working in, and more.
The second level of versioning takes place in each database instance. Tracking the version of the schema
deployed on a specific instance makes deployment and troubleshooting much easier. Will this application
build work with the schema on this instance? What changes need to be applied to this database to catch it
up to the latest version? Were the right changes applied to this test database to validate a closed defect?
If you are tracking the individual changes applied in a database and the impetus for those changes, these
questions can be answered very quickly and easily.
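The second, per-instance level of versioning can be sketched with a tracking table in the database itself, in the spirit of the Liquibase project that Datical DB is built on. The table and change names here are invented for the example, not the real schema:

```python
import sqlite3

# Changes for the release, in order: (change_id, reason, ddl).
CHANGELOG = [
    ("1-create-customers", "ticket 101: customer signup",
     "CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)"),
    ("2-add-orders", "ticket 115: order history",
     "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER)"),
]

def deploy(conn):
    """Apply only the changes this instance hasn't seen yet, recording
    each one so the instance always knows its own schema version."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_changes "
                 "(change_id TEXT PRIMARY KEY, reason TEXT)")
    applied = {row[0] for row in
               conn.execute("SELECT change_id FROM schema_changes")}
    newly_applied = []
    for change_id, reason, ddl in CHANGELOG:
        if change_id in applied:
            continue  # already at or past this version
        conn.execute(ddl)
        conn.execute("INSERT INTO schema_changes VALUES (?, ?)",
                     (change_id, reason))
        newly_applied.append(change_id)
    return newly_applied

db = sqlite3.connect(":memory:")
first = deploy(db)   # fresh instance: both changes applied
second = deploy(db)  # already current: nothing to do
print(first, second)
```

Because each instance records what was applied and why, "what does this database need to catch up?" is a query, not an investigation.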
Unify Your Modes of DeliveryWe’ve found that a lot of uncertainty happens when database updates are handled by several individuals
using their own tools and methods to affect the required changes. To remove this uncertainty, all database
change must be executed using the same tools and processes across departments and individuals. The
tricky part: a continuous integration system has different requirements than a developer working iteratively
or a DBA processing a batch of changes in a headless environment during a maintenance window. In order
to unify your database change process, the solution you use should be accessible to all of these individu-
als while maintaining the consistency of deployment activities. That’s why Datical DB provides a rich GUI
experience for developers that helps them craft and deploy changes in development environments; tight
integrations with popular build and release automation frameworks that preserve the frameworks’ work-
flows while providing Datical DB functionality; and a command line interface that allows users in headless
environments to deploy changes in the exact same manner that the GUI or a third-party integration would.
By unifying the modes of delivery, you are constantly testing your release practices. By the time the produc-
tion push rolls around, you can have confidence that your system works.
The DevOps Database - Part 3
Applying Feedback Loops to Database Change Management
This Way deals primarily with facilitating easier and faster communication between all individuals in a
DevOps organization. The goals of this step are to foster better understanding of all internal and external
customers in the process and to develop an accessible body of knowledge to replace the dependence on
expertise scattered across individuals.
I’ve stated before that Database Change Management poses a unique challenge when your organization
is shifting to an agile development methodology and implementing DevOps patterns. Unlike other areas
of your application stack, responsibility for managing application schema straddles two groups operating
under somewhat opposed expectations.
The Development group is on the hook for producing more and more business-critical features and releases at an ever-increasing rate. DBAs are tasked with providing a secure, highly available data platform and protecting the integrity of the organization's priceless data. The rate of schema change required by Development to satisfy expectations can run headlong into a database change process that is deliberate and metered by necessity to avoid downtime and data loss. In organizations where these two groups are isolated from each other, you have the makings of a bottleneck in your release process, undermining the promise of DevOps.
The solution to this problem is embodied by The Second Way of DevOps.
Communicate early, communicate often, communicate broadly, and prepare for what's ahead. The tricky part is implementing the solution in a way that's meaningful to every stakeholder in an organization's application group. At Datical, we've spent just as much time on how we organize and present the data associated with application schema changes as we have on automating the deployment of those changes.
We've rallied around the following key concepts to bring The Second Way of DevOps to Database Change Management.
Proactive, Predictive Change Analysis
In an organization where development works independently of the database group, truly understanding the impact a stack of SQL scripts will have on downstream environments is a tedious and time-consuming task. Before these changes can be promoted, target environments must be meticulously evaluated for conflicts and dependencies that will impact the deployment process. This often involves manual reviews and comparisons of diagrams and database dumps of complex environments.
Achieving a high degree of confidence in the success of the proposed updates is difficult because it is so
easy to overlook something.
Datical has developed a patent-pending simulation feature called Forecast that automates this process.
The Forecast feature builds an in memory model of the target environment, simulates proposed changes
on top of that model, and warns of potential error conditions, data loss and performance issues without
touching the target database. Because there is no impact to target environments, database administrators
can Forecast changes several times during the development cycle to get ahead of issues that would normally be discovered much later in a pre-release review. Development gets regular feedback on the changes
they are proposing and can address issues that arise during the initial development phase when it is easier
and safer to resolve them. The two teams are working in unison to ensure a safe database deployment that
works the first time without surprises.
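The shape of such a simulation can be sketched as follows. This is a toy, not the actual Forecast implementation; the rule set shown (duplicate creates, missing drop targets, data-loss warnings) and the table names are purely illustrative:

```python
import copy

def forecast(model, changes):
    """Simulate proposed changes on an in-memory copy of the target's
    model, collecting warnings without touching the real database."""
    sim = copy.deepcopy(model)  # the target database is never modified
    warnings = []
    for op, table in changes:
        if op == "create":
            if table in sim:
                warnings.append(f"create {table}: already exists")
            else:
                sim[table] = {}
        elif op == "drop":
            if table not in sim:
                warnings.append(f"drop {table}: does not exist")
            else:
                warnings.append(f"drop {table}: potential data loss")
                del sim[table]
    return sim, warnings

target = {"customers": {}}
proposed = [("create", "orders"), ("create", "customers"), ("drop", "archive")]

simulated, warnings = forecast(target, proposed)
print(warnings)
print(target)  # still {'customers': {}}: the target was never touched
```

Because the simulation is side-effect free, it can be rerun on every commit, which is exactly what makes it useful as a feedback loop rather than a one-time pre-release gate.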
Always Remember Where You Came From
Database changes are usually designed to address the immediate goals of an organization. Once one set of requirements has been satisfied by a release, the motivations for the design decisions made for that release generally fade away as new requirements come along and new business initiatives take center
stage. Comments in SQL scripts and on the database objects themselves can be helpful in determining
why things are the way they are, but these traces of the past are scattered everywhere. Making sense of the
whole is an exercise in archaeology. This was one of the driving forces behind our model-based approach
to Database Change Management. Our model is architected to provide a living history of your application
schema. Individual changes are tied to the specific requirement and release that necessitated them. This
data lives in the model so the information you need to make intelligent design decisions is right in front of
you when you need it.
Know Where You Are
Because the business reason behind each schema change is captured in the model, this information can be tracked in each database instance as it's updated and included in Forecast, Deploy, and historical reports. Tracking the changes in each instance and providing detailed reports allows you to easily disseminate information, effectively gate deployment steps, and quickly satisfy audit requirements. When everyone in your organization has access to thorough accounts of the Who, What, Where, When, and Why of any single database change in any environment, everyone is operating on the same level and can more effectively work towards a common goal.
Know Where You’re HeadedThe model also facilitates concurrent development on multiple releases of a project. By tracking changes
made for several different releases in a single model, the development teams working on these releases
are able to collaborate and stay ahead of changes made by other teams that may impact future releases.
Developers are able to unify redundant changes and eliminate conflicting changes as they implement in-
stead of spending time on redesign later in the process when time is scarce and the cost of change is high.
The DevOps Database - Part 4
A Culture of Continual Experimentation and Learning
This Way emphasizes the benefits that can be realized through embracing experimentation, risk-taking, and
learning from failure. By adopting this kind of attitude, experimentation and risk-taking lead to innovation
and improvement while embracing failure allows the organization to produce more resilient products and
sharpen skills that allow teams to recover more quickly from unexpected failure when it does occur.
The Third Way is by far the most intriguing of The Three Ways to me. I've spent the lion's share of my career in early-stage start-ups where cycles of experimentation, learning, and failure are the norm. When bringing a new product or service to market, your latest release is never your last. It may not even be the last
release this week. You are constantly experimenting with new workflows and technology, learning about
your target market, and getting more valuable information from your early failures than from your early
successes. The Third Way is crucial to the success of an early stage company.
While the benefits of the Third Way still apply to more established companies and product lines, practicing it becomes more difficult. The potential negatives of experimentation and risk-taking are much harder to stomach when you have a large base of paying customers with SLAs. This aversion to risk is most acute when you're talking about your data platform, where outages, performance problems, and data loss are not an option. Complicating matters further is how difficult it can be to unwind the database changes that were made to support a specific version of your app. Application code can usually be uninstalled and replaced with the previous working version fairly simply should problems arise. Reverting the database changes that support that version of the application is more akin to defusing a bomb. Database changes must be reverted delicately and meticulously to avoid errors and omissions that could negatively impact your data platform.
What DBAs and Release Managers need to facilitate experimentation and risk taking on the data platform is
a special combination of tools and process. This combination should make it easy to identify the root cause
of issues, quickly remediate problems caused by application schema structure, and revert to a previous
version of the schema safely. When we founded Datical, we spent many hours in conversation and at whiteboards exploring these unique needs, hammering out a path to usher the Third Way into regular database management.
A Rollback Designed With Every Change
The biggest problem with experimentation in the data platform is how difficult it is to move backward and forward through your schema's version history. We feel the best way to bring more flexibility to the process of upgrading and reverting schema is an attitude shift: your rollback strategy for each database change must become as important as the change itself. The best time to craft your rollback strategy is when the change itself is being designed. When the motivation for the change is fresh in your mind and the dependencies of the object being created, dropped, or altered are clearly mapped out, a developer or DBA can better craft a rollback strategy for which every contingency has been considered. This leads to a stronger safety net and makes your application schema as agile and easily managed as your application code. The database no longer prevents you from being bold; instead, you can quickly and safely move between versions to accommodate experimentation.
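Pairing each change with its rollback at design time can be sketched like this. The change names and DDL strings are invented for illustration; the point is the data shape, where no change exists without its inverse:

```python
# Each change is authored together with its rollback, so reverting a
# release becomes a mechanical replay rather than bomb defusal.
RELEASE = [
    {"id": "add-loyalty-table",
     "apply": "CREATE TABLE loyalty (id INTEGER PRIMARY KEY)",
     "rollback": "DROP TABLE loyalty"},
    {"id": "add-points-column",
     "apply": "ALTER TABLE loyalty ADD COLUMN points INTEGER",
     "rollback": "ALTER TABLE loyalty DROP COLUMN points"},
]

def upgrade(release):
    """Forward path: apply statements in authored order."""
    return [change["apply"] for change in release]

def rollback(release):
    """Reverse path: undo in reverse order so dependencies unwind
    cleanly (the column is dropped before its table)."""
    return [change["rollback"] for change in reversed(release)]

print(rollback(RELEASE))
```

Writing the rollback while the change's dependencies are still fresh is what makes the reverse path trustworthy later, when the original author may be long gone.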
Richness of Model-Based Comparison and Remediation
I'm a native Texan, born and raised in Austin. I know the jokes about how proud and vocal Texans can be
about their home state. That being said, I spend more time telling people about the model-based approach
Datical has applied to Database Change Management than I do telling them how wonderful it is that I was
fortunate enough to be born in the best state in the country. The advantages of the model based approach
really shine through when it comes to experimentation and troubleshooting.
The model allows you to annotate all objects and modifications to your application’s schema with the
business reason that prompted them. This detailed history is invaluable when designing new changes or
refactoring your schema as part of an experimental exercise. You immediately know what objects are most
crucial, what your dependencies are, what areas you need to tread lightly in, and what areas are ripe for
experimentation due to the changing needs of your business. Designing intelligently eliminates risk.
Troubleshooting with models is also dramatically faster and more reliable than other methods. Programmatically comparing models allows you to determine the differences between two databases much more quickly than manually comparing diagrams or SQL scripts. You know with certainty exactly what has changed and what is missing in a fraction of the time that human review takes.
Once you have identified the differences, remediation is as simple as plugging one, some, or all of the determined differences into the model and deploying those changes to the non-compliant instance.
Flexible Quick Rollback
If you've taken our advice to implement your rollback strategy when you implement a change, recovering
from disaster becomes testable, fast, and simple. Before going to production or a sensitive environment,
you should always test your rollback steps in dev, test and staging. This will allow you to make any tweaks
or changes to your rollback strategy before you are in a pinch. Think of it like testing your smoke alarm.
Hopefully you’ll never need it, but it’s nice to know that it’ll work if you do.
Let’s say the worst happens. You deploy a new set of database changes and the application performance
degrades or errors are logged. The decision is made to revert the entire installation to the previous version.
Because you have carefully designed, tested and refined your rollback strategy, rolling back the database
changes becomes a push button operation or a single invocation of a command line tool.
No more running disparate SQL scripts or undoing changes on the fly. You can be confident that your database has been returned to the same state it was in before the upgrade.
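The smoke-alarm rehearsal can be sketched as a round-trip test in a throwaway environment. SQLite here stands in for a staging database, and the schema is invented; the technique is simply upgrade, roll back, then verify the schema matches its pre-upgrade state:

```python
import sqlite3

def schema_of(conn):
    """All CREATE statements in the database, in a comparable order."""
    return sorted(row[0] for row in conn.execute(
        "SELECT sql FROM sqlite_master WHERE sql IS NOT NULL"))

staging = sqlite3.connect(":memory:")
staging.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY)")
before = schema_of(staging)  # snapshot taken before the upgrade

staging.execute("CREATE TABLE loyalty (id INTEGER PRIMARY KEY)")  # upgrade
staging.execute("DROP TABLE loyalty")                             # rollback

# The rehearsal passes only if rollback restores the exact prior state.
assert schema_of(staging) == before
print("rollback rehearsal passed")
```

Running this rehearsal in dev, test, and staging before production is what turns the rollback from a hopeful plan into a tested one.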
Conclusion
The database has long been handled with kid gloves, and for good reason. Data is a precious resource for
consumers and businesses. Consumers provide data to businesses and trust that it will be kept safe and
used to offer them better products and services. Businesses rely on data to strategize, grow, and become
more efficient and profitable. It is the lifeblood of our economy.
As our ability to collect and process data becomes greater and greater, the rate at which an enterprise must act on what is learned from that data grows just as fast. Data must be kept safe, but the
database must become more agile to accommodate the growing pressure for faster value realization of
business intelligence initiatives.
DevOps patterns hold the key to this necessary agility while maintaining or improving the security and
integrity of data stores. The database and DBAs need to be brought into the application design process
earlier and must be treated as the first-class stakeholder and commodity that they are.
Companies that acknowledge this, adopt DevOps patterns, and include their database teams will be at a distinct competitive advantage over those that don't. Don't miss the boat!
Copyright ©2015 Datical, Inc. Datical, all products prefaced by the word Datical and the Datical logos are either registered trademarks or trademarks of Datical, Inc. in the United States and/or other countries. All other products mentioned are either registered trademarks or trademarks of their respective corporations. 201511
About Pete Pickerill
Pete Pickerill is Vice President of Products and Co-founder of Datical. Pete is a software
industry veteran who has built his career in Austin’s technology sector. Prior to
co-founding Datical, he was employee number one at Phurnace Software and helped
lead the company to an acquisition by BMC Software, Inc. Pete has spent the majority of his career in successful Austin startups and the companies that acquired them
including Loop One (acquired by NeoPost Solutions), WholeSecurity (acquired by
Symantec, Inc.) and Phurnace Software. Pete is a frequent contributor to Datical’s Blog
Primary Keys and speaks often regarding DevOps and Agile Processes at industry events.
About DaticalDatical’s flagship product, Datical DB, automates the deployment of database schema updates for large
enterprises that struggle with the constant state of database application change across complex environments. Based upon the Liquibase open source project, Datical DB manages the database application
lifecycle for people who develop, deploy, and maintain business critical applications.
Datical was founded by the same experienced management team that founded Phurnace Software,
acquired by BMC Software in 2006. Datical is privately held with headquarters in Austin, Texas.
Datical, Inc.
www.datical.com
Tel: [email protected]