By Pete Pickerill
February 11, 2014 10:00 AM EST
In the third post in this series, I’d like to talk about the Second Way of DevOps: Amplifying Feedback Loops. Here’s a refresher on The Second Way from my introductory post in this series:
The Second Way: Amplify Feedback Loops – This Way deals primarily with facilitating easier and faster communication between all individuals in a DevOps organization. The goals of this step are to foster better understanding of all internal and external customers in the process and to develop an accessible body of knowledge to replace the dependence on expertise scattered across individuals.
I’ve stated before in this series that Database Change Management poses a unique challenge when your organization is shifting to an agile development methodology and implementing DevOps patterns. Unlike other areas of your application stack, responsibility for managing application schema straddles two groups operating under somewhat opposed expectations. The development group is on the hook for producing business-critical features and releases at an ever-increasing rate. DBAs are tasked with providing a secure, highly available data platform and protecting the integrity of the organization’s priceless data. The rate of schema change required by development to satisfy those expectations can run headlong into a database change process that is deliberate and metered by necessity to avoid downtime and data loss. In organizations where these two groups are isolated from each other, you have the makings of a bottleneck in your release process.
The solution to this problem is embodied by The Second Way of DevOps. Communicate early, communicate often, communicate broadly, and prepare for what’s ahead. The tricky part is implementing the solution in a way that’s meaningful to every stakeholder in an organization’s application group. At Datical, we’ve spent just as much time on how we organize and present the data associated with application schema changes as we have on automating the deployment of those changes. We’ve rallied around the following key concepts to bring The Second Way of DevOps to Database Change Management.
Proactive, Predictive Change Analysis
In an organization where development works independently of the database group, truly understanding the impact a stack of SQL scripts will have on downstream environments is a tedious and time-consuming task. Before these changes can be promoted, target environments must be meticulously evaluated for conflicts and dependencies that will impact the deployment process. This often involves manual reviews and comparisons of diagrams and database dumps of complex environments. Achieving a high degree of confidence in the success of the proposed updates is difficult because it is so easy to overlook something. Datical has developed a patent-pending simulation feature called Forecast that automates this process. The Forecast feature builds an in-memory model of the target environment, simulates proposed changes on top of that model, and warns of potential error conditions, data loss, and performance issues without touching the target database. Because there is no impact to target environments, database administrators can Forecast changes several times during the development cycle to get ahead of issues that would normally be discovered much later in a pre-release review. Development gets regular feedback on the changes they are proposing and can address issues that arise during the initial development phase, when it is easier and safer to resolve them. The two teams work in unison to ensure a safe database deployment that works the first time, without surprises.
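The core idea behind this kind of simulation can be sketched in a few lines. The following is a minimal, hypothetical illustration of forecasting schema changes against an in-memory model, not Datical's actual implementation; all names (`forecast`, the change-tuple shape) are assumptions made for the example:

```python
# Hypothetical sketch: apply proposed schema changes to an in-memory
# copy of the target environment and collect warnings, without ever
# touching the real database.

def forecast(tables, changes):
    """tables: {table_name: set of column names} describing the target.
    changes: list of (operation, table, column) tuples to simulate.
    Returns a list of warning strings."""
    model = {t: set(cols) for t, cols in tables.items()}  # work on a copy
    warnings = []
    for op, table, column in changes:
        if op == "add_column":
            if table not in model:
                warnings.append(f"{table}: table does not exist")
            elif column in model[table]:
                warnings.append(f"{table}.{column}: column already exists")
            else:
                model[table].add(column)
        elif op == "drop_column":
            if column not in model.get(table, set()):
                warnings.append(f"{table}.{column}: cannot drop missing column")
            else:
                warnings.append(f"{table}.{column}: drop may cause data loss")
                model[table].discard(column)
    return warnings

# The target environment as currently deployed:
tables = {"customer": {"id", "name", "email"}}
# Two proposed changes, one of which conflicts with the live schema:
changes = [("add_column", "customer", "email"),
           ("drop_column", "customer", "name")]
print(forecast(tables, changes))
```

Because the simulation only mutates the copy, it can be rerun on every check-in, which is what makes the feedback loop fast enough to matter.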
Always Remember Where You Came From
Database changes are usually designed to address the immediate goals of an organization. Once one set of requirements has been satisfied by a release, the motivations for the design decisions made in that release generally fade away as new requirements come along and new business initiatives take center stage. Comments in SQL scripts and on the database objects themselves can be helpful in determining why things are the way they are, but these traces of the past are scattered everywhere. Making sense of the whole is an exercise in archaeology. This was one of the driving forces behind our model-based approach to database change management. Our model is architected to provide a living history of your application schema. Individual changes are tied to the specific requirement and release that necessitated them. This data lives in the model, so the information you need to make intelligent design decisions is right in front of you when you need it.
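The shape of such a model-based record can be illustrated simply. This is a hedged sketch under the assumption that each change carries its motivating requirement and release; the `SchemaChange` structure and field names are invented for the example, not Datical's schema:

```python
# Illustrative sketch: each schema change carries the requirement and
# release that motivated it, so the "why" travels with the change
# instead of being scattered across script comments.
from dataclasses import dataclass

@dataclass
class SchemaChange:
    change_id: str
    sql: str
    requirement: str  # e.g. a ticket ID and summary explaining the motivation
    release: str
    author: str

history = [
    SchemaChange("001", "ALTER TABLE customer ADD COLUMN email VARCHAR(255)",
                 "CRM-412: capture customer email for notifications",
                 "2.1", "jdoe"),
    SchemaChange("002", "CREATE INDEX idx_customer_email ON customer(email)",
                 "CRM-430: speed up email lookups",
                 "2.2", "asmith"),
]

# "Why does this column exist?" becomes a lookup, not archaeology:
why = [c.requirement for c in history if "email" in c.sql]
```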
Know Where You Are
Because the business reasons behind each schema change are captured in the model, this information can be tracked in each database instance as it’s updated and included in Forecast, Deploy, and historical reports. Tracking the changes in each instance and providing detailed reports allows you to easily disseminate information, effectively gate deployment steps, and quickly satisfy audit requirements. When everyone in your organization has access to thorough accounts of the Who, What, Where, When, and Why of any single database change in any environment, everyone is operating on the same level and can more effectively work toward a common goal.
Know Where You’re Headed
The model also facilitates concurrent development on multiple releases of a project. By tracking changes made for several different releases in a single model, the development teams working on these releases are able to collaborate and stay ahead of changes made by other teams that may impact future releases. Developers are able to unify redundant changes and eliminate conflicting changes as they implement instead of spending time on redesign later in the process when time is scarce and the cost of change is high.
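A single model that tracks changes across releases also makes conflict detection mechanical. The following is a minimal sketch of that idea, assuming changes are recorded as (release, object, operation) entries; the structure is illustrative only:

```python
# Hedged sketch: with proposed changes for multiple releases tracked in
# one model, edits that touch the same database object can be flagged
# as they are made, rather than discovered during a late merge.
from collections import defaultdict

# (release, object, operation) entries proposed by two teams:
proposed = [
    ("2.1", "customer.status", "add_column"),
    ("2.2", "customer.status", "drop_column"),  # collides with 2.1's change
    ("2.2", "customer.region", "add_column"),
]

by_object = defaultdict(set)
for release, obj, op in proposed:
    by_object[obj].add((release, op))

# Any object touched by more than one release/operation pair is a
# candidate for early redesign, while the cost of change is still low:
conflicts = {obj: ops for obj, ops in by_object.items() if len(ops) > 1}
```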