
Struggling to scale Agile?

Small teams are more effective. The general agreement is that anything from 5 to 12 people is the 'right' small. But of course, small teams also have 'small' throughput, relatively speaking. So if your demand is X and the throughput of a small team is X/10, you probably need 10 teams to meet that demand. But more teams also mean more effort to coordinate them and keep them pulling in the same direction. The challenge, then, is how to harness the power of small teams and yet orchestrate several of them to get higher throughput.

In the context of enterprise Agile, this challenge is critical.

From slow and rigid to fast and flexible. That was the promise of Agile to IT. What it meant was that ideas coming from the business would be converted into working software rapidly. It meant that IT would be more responsive to what the business wants and needs and would not spend an eternity analysing requirements, writing detailed specifications and then sending them back to the business for review. Some of that has happened. Product backlogs are more informal and easier to maintain and update than formal specifications. IT pushes the business to prioritize and tries to break requirements down so that they are both manageable and valuable and can be delivered in short sprints of 2-3 weeks. So far so good. However, what happens after that is a bit more complicated.


In the recent past, as I observe agile projects in organization after organization, sometimes as a consultant, sometimes as a Scrum Master and sometimes just as an invisible fly on the wall, I notice organizations struggling to scale. There is a high degree of awareness of, even obsession with, the mechanics of agile, but without an equivalent grounding in the fundamentals. Yes, there are SAFe, LeSS and other frameworks, but frameworks are good when you know how to apply them in your context. Otherwise, you drown under new rules and terminology. Here is a summary of my recommendations for organizations trying to scale Agile:

1. Architecture - Agile discourages grand upfront design. The recommendation is to do a design that is sufficient for the prioritized user stories and then iteratively improve it so that it can accommodate newer stories. The overall architecture thus evolves and emerges. This approach is fine when you have a single Scrum team working on a set of stories. The moment more Scrum teams come into the picture, the concept of emergent design becomes fuzzy. Two or more Scrum teams working on a set of user stories without a bare-minimum architectural skeleton as a reference will face huge integration and collaboration challenges. In these scenarios, it is wise to invest some upfront effort in creating an architectural framework which all the teams agree to and will collectively evolve as the software grows. I see this explicit guideline / practice missing in many instances.
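
To make the idea of an agreed architectural skeleton concrete, here is a minimal, purely illustrative sketch (not from the original post) of what such a shared contract could look like in Java: a handful of interfaces that every Scrum team codes against, while each team stays free to evolve the implementation behind its own interface. All the names below (SharedContracts, PaymentGateway, InventoryService, OrderService) are hypothetical.

// SharedContracts.java - a hypothetical, minimal architectural skeleton that
// all Scrum teams agree on up front and evolve together. Each team implements
// its own module behind these interfaces; the contracts themselves change
// only by cross-team agreement.
public final class SharedContracts {

    /** Owned collectively; one team provides the implementation. */
    public interface PaymentGateway {
        // Charges the amount in cents and returns a transaction id.
        String charge(String customerId, long amountInCents);
    }

    /** Owned collectively; another team provides the implementation. */
    public interface InventoryService {
        // Reserves stock for an order; returns false if unavailable.
        boolean reserve(String sku, int quantity);
    }

    /** A third team's ordering module depends only on the contracts above. */
    public static final class OrderService {
        private final PaymentGateway payments;
        private final InventoryService inventory;

        public OrderService(PaymentGateway payments, InventoryService inventory) {
            this.payments = payments;
            this.inventory = inventory;
        }

        public String placeOrder(String customerId, String sku, int qty, long unitPriceInCents) {
            if (!inventory.reserve(sku, qty)) {
                throw new IllegalStateException("Out of stock: " + sku);
            }
            return payments.charge(customerId, unitPriceInCents * qty);
        }
    }

    private SharedContracts() { }
}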

2. Work Allocation - One of the basic premises of Agile was to break functional silos and align people to what the customer wants. By advocating cross-functional Scrum teams, Agile has solved this problem. But in many organizations I've seen functional silos give way to scope silos - individual Scrum teams focused only on their own features, themes or scope of work at the cost of the overall system. Scrum Masters need to be wary of this. It goes back to how work is allocated to, or pulled by, individual Scrum teams. If Scrum teams pull items from the product backlog strictly by the features assigned to them, the risk of scope silos is higher. There are no easy answers. Much depends on the overall architecture of the product (if there is one, and in a multi-Scrum-team scenario there should be at least a skeletal version) and on the extent of dependencies and modularity. In my view, to the extent possible, teams should have the freedom to pull items from the entire product backlog and not be restricted to selected features, themes or epics.

3. Ownership - The Product Owner owns the system and is responsible for ensuring that the right product is built. In my view, he/she owns the behaviour of the system and controls its destiny. But who owns the architecture, design and other internals of the system? You can alter the internals and still get the same behaviour - so who controls what can be changed and what cannot? In a multi-Scrum-team structure it is very important to fix that ownership clearly. Related to the issue of work allocation explained above, there has to be a team that is responsible for the overall system internals and not just a set of themes or features. Projects in real life have a finite end date, and products have a future state after which only incremental changes are made to them. This means development effort over the life of the product will peak and then plateau. Extra capacity has to be released for other productive work, and hence fixing system ownership early on is very important.

4. Communication and Noise - Agile gives importance to individuals and the interactions between them. Popular literature recommends an optimal team size of around 7, and so does Scrum. At the same time, multiple Scrum teams are a reality because they can provide higher throughput. To be effective, a team should be shielded from external interference and allowed to focus on the job at hand. I've seen instances where two Scrum teams have so many dependencies between them that they are constantly talking to each other; in effect, they are not really separate teams. The benefit of higher throughput should be weighed against the cost of the interference and communication overhead between multiple teams. In my view, while interactions within a team should be encouraged, those between teams should be controlled lest they lead to chaos.

5. Nurturing Excellence - Today, the word 'waterfall' has almost taken on evil connotations. We're so paranoid about the term 'functional silos' that we assume that any 'functional' team is necessarily a 'silo'. That is far from true. Technical excellence is necessary to live up to the promise of Agile. How do you nurture it? Smart developers learn from smart developers and challenge each other to reach higher levels of expertise. The same is true for architects, testers and other members of a product team. Organizations should provide forums or structures where professionals practising the same craft can learn from and interact with each other. It is perfectly normal for a developer to have strong affiliations towards his or her Scrum team as well as towards the developer community inside or outside the organization. Organizations should provide the opportunity to maintain and nourish these multiple identities. They strengthen our collaborative capabilities in a cross-functional forum, not diminish them.

6. Integration - This is one of the basic best practices that is not adhered to very consistently. In a multi-Scrum-team scenario, continuous (preferably automated and daily) integration needs to happen not just within but between Scrum teams. Many organizations have the concept of an integration sprint at the end of every 2-3 sprints, which is basically a recipe for disaster and very waterfall-ish in approach: it delays the discovery of the inevitable bugs and thereby increases the cost of resolving them. It is better that everybody stops working and resolves an integration problem when it happens, rather than keeping it in a backlog and allowing technical debt to accumulate. I agree that in some contexts there will be technical limitations to integration, but let those not be by design. Multiple teams feeding from a single gold version and collectively merging their changes might seem messy, but if done regularly and with discipline, it makes integration a non-event.
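
As a purely illustrative follow-on to the sketch above (again, not from the original post), here is what a small cross-team integration check could look like, run automatically every day against the merged mainline or 'gold version'. It assumes JUnit 5 and the hypothetical SharedContracts classes from the earlier sketch; in a real build, the teams' actual implementations would be wired in instead of the in-line fakes.

// CrossTeamIntegrationTest.java - hypothetical nightly check run against the
// merged mainline. It exercises a path that crosses module boundaries owned
// by different Scrum teams, so contract drift between teams surfaces within
// a day instead of at an "integration sprint".
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertTrue;

class CrossTeamIntegrationTest {

    @Test
    void orderFlowCrossesTeamBoundariesSuccessfully() {
        // Real builds would wire in each team's actual implementation here;
        // simple fakes keep this sketch self-contained.
        SharedContracts.PaymentGateway payments = (customerId, cents) -> "txn-" + cents;
        SharedContracts.InventoryService inventory = (sku, qty) -> qty <= 10;

        SharedContracts.OrderService orders =
                new SharedContracts.OrderService(payments, inventory);

        String txnId = orders.placeOrder("cust-42", "SKU-1", 2, 1999L);

        assertNotNull(txnId);
        assertTrue(txnId.startsWith("txn-"));
    }
}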


More Stories By Sujoy Sen

Sujoy is a TOGAF Certified Enterprise Architect, a Certified Six Sigma Black Belt and Manager of Organizational Excellence from the American Society for Quality, a PMP, a CISA, an Agile Coach, a DevOps Evangelist and, lately, a Digital enthusiast. With over 20 years of professional experience, he has led multiple consulting engagements with Fortune 500 customers across the globe. He has a Master's Degree in Quality Management and a Bachelor's in Electrical Engineering. He is based out of New Jersey.
