
Memory Monitoring and Limiting with LXC

Original article can be found here: Memory Monitoring with LXC

For a long time we didn't limit the amount of memory a build could use on Codeship. That left open the possibility of a bad build eating up all of our memory.

A few weeks ago such a bad build happened: it used up so much memory that it degraded performance and eventually killed the test server. Even though we measure the memory usage of the whole test server, we didn't have the data to figure out exactly which build caused the trouble.

Figure: Combined maximum and minimum memory usage of Amazon EC2 instances.

How to avoid builds eating up all of your memory
We couldn't risk this problem happening again and threatening other builds on that test server. We didn't have enough data about memory limits at this point, so we had to take an educated guess. My first assumption was the most conservative one: each test server has 60GB of memory and we run 22 builds on each instance, so each build can get a maximum of 2.5GB of memory. As LXC manages memory with cgroups, it is simple to set a memory limit for each container.

Setting lxc.cgroup.memory.limit_in_bytes in container config:

lxc.cgroup.memory.limit_in_bytes = 2560M
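If you need to adjust the limit of a container that is already running, you can also write the new value into its cgroup on the LXC host. A minimal Ruby sketch, assuming it runs as root on the host and that the container name below is just a placeholder:

# Illustrative only: change the memory limit of a running container by
# writing to its cgroup on the host; the kernel accepts suffixes like M and G here.
container = "name_of_running_container"   # placeholder name
File.write("/sys/fs/cgroup/memory/lxc/#{container}/memory.limit_in_bytes", "2560M")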

This worked, and the bad build was no longer a threat.

We got a few support requests from people asking about their builds being stuck. While 2.5GB was enough for 95% of the builds, the other 5% were hitting the limit. We needed to increase the memory limit to ensure that all builds run fine. After two incremental steps, The Codeship is sailing smoothly with a 10GB memory limit for each build.

Monitoring in the Future
We wanted more data to improve the memory limit in the future, so we started measuring the maximum memory usage for each build. We export it to Librato Metrics and save it as metadata of the build in our database. This allows us to inspect the memory usage easily. We plan to show the memory usage to our users in the future.

LXC tracks the memory usage of each container and exposes it, along with many other values, in the cgroup. Right before we shut down the build, we read the memory usage from the cgroup on the LXC host:

/sys/fs/cgroup/memory/lxc/name_of_running_container/memory.max_usage_in_bytes

and send the data to Librato Metrics.

Metriks.timer('build.memory_usage').update(metrics[:max_memory])
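
Putting the two steps together, here is a minimal sketch of how reading the cgroup value and reporting it could look. It is illustrative only: the report_max_memory helper and the container name are placeholders rather than our exact code, and it assumes a Metriks reporter for Librato is configured elsewhere.

require 'metriks'

# Illustrative sketch: read the peak memory usage of a build's container
# from its cgroup on the LXC host and report it right before shutdown.
def report_max_memory(container_name)
  path = "/sys/fs/cgroup/memory/lxc/#{container_name}/memory.max_usage_in_bytes"
  max_memory = File.read(path).to_i   # peak usage in bytes

  # The configured Metriks reporter forwards this value to Librato Metrics.
  Metriks.timer('build.memory_usage').update(max_memory)
  max_memory
end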

Building a Metrics Infrastructure
It is important to back your actions with data; data is your best argument. If you don't have any useful data to solve a problem, it's important that you can easily add more metrics to your infrastructure. Sit down with your coworkers and think about good metrics for your product. Sometimes they're not easy to spot at first glance. We often talk about our metrics.

We discuss new metrics, but also talk about removing redundant ones. It is very exciting to talk about data and what your coworkers conclude from it. We are able to add new metrics to our infrastructure in minutes. By building this kind of metrics infrastructure you can handle new challenges by quickly adding measurements that help you decide on the next steps. This infrastructure can be the difference between life and death for your service, so make sure you have it in place.

Do you monitor? Which tools and services do you use for it? Let us know in the comments!

Download Efficiency in Development Workflows: A free eBook for Software Developers. This book will save you a lot of time and make you and your development team happy.

Go ahead and try Codeship for Continuous Integration and Continuous Deployment! Setting up your GitHub and Bitbucket projects takes only 3 minutes. It's free!

More Stories By Manuel Weiss

I am the cofounder of Codeship – a hosted Continuous Integration and Deployment platform for web applications. On the Codeship blog we love to write about Software Testing, Continuous Integration and Deployment. Also check out our weekly screencast series 'Testing Tuesday'!
