Conference Presentations

Performance Appraisals for Agile Teams

Traditional performance evaluations, which focus solely on individual performance, create a “chasm of disconnect” for agile team members. Because agile is all about team performance and trust, the typical HR performance evaluation system is not congruent with agile development. Based on his practical experience leading agile teams, Michael Hall explores how measurements drive behavior, why team measurement is important, what to measure, and what not to measure. Michael introduces tangible techniques for measuring agile team performance: end-of-sprint retrospectives, sprint and project report cards, peer reviews, and annual team performance reviews. To demonstrate what he’s describing, Michael uses role plays to contrast traditional, dysfunctional annual reviews with agile-focused performance reviews.

Michael Hall, WorldLink, Inc.
Lessons from a DevOps Journey

In large financial institutions, treasury departments (specialized teams of traders and experts in liquidity, risk, accounting, financial forecasting, and quantitative analysis) manage the organization’s wealth and financial risk. These departments require large, complex, third-party software products that must change often to support the treasury’s complicated business processes. Matt Callanan describes how a team of developers and operations staff, the DevOps team, applied agile principles to the “last mile” and reduced software deployment from one week to one day. He discusses how their DevOps team collaborated to develop automation solutions to support ongoing deployment activities and solve many issues in the operational environment.

Matt Callanan, Independent
Ready, Really Ready, and Really Really Ready Stories

Product owners create stories they believe are ready for development. Developers accept and then estimate stories that are not really ready to be started. This disconnect between being “ready” and “really ready” results in miscommunication and frustration. For example, story development can take much longer than original estimates because of the details and “sad paths” that were not expressed in the story. Ken Pugh describes how to turn vague acceptance criteria into specific acceptance tests. He explains how levels of detail in acceptance tests can help to more closely estimate the effort required by stories and shows how acceptance tests determine when stories are complete. With Ken, you’ll go through creating a “really really ready” story and examine when it should be created and who should participate.
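
As a rough sketch of the gap between “ready” and “really really ready” (illustrative only, not material from Ken’s session), the Python below turns a vague criterion such as “the user can withdraw money” into specific acceptance tests, including a sad path the original story never spelled out. The apply_withdrawal function and its numbers are hypothetical.

import pytest

def apply_withdrawal(balance, amount):
    """Hypothetical rule under test: return the new balance, rejecting overdrafts."""
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

def test_withdrawal_happy_path():
    # Specific: starting balance, amount, and expected result are all stated.
    assert apply_withdrawal(balance=100, amount=30) == 70

def test_withdrawal_sad_path_overdraft():
    # The sad path a vague story leaves unstated: overdrafts are rejected.
    with pytest.raises(ValueError):
        apply_withdrawal(balance=100, amount=130)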

Ken Pugh, Net Objectives
Patterns of "Big" Scrum

Software development organizations adopting Scrum have struggled to apply it to big projects with multiple teams. Dan Rawsthorne is frequently asked, “What does ‘big’ Scrum look like?” Because no two organizations are alike, this simple question does not have a simple answer. However, Dan has discovered patterns that are common in organizations that successfully implement “big” Scrum. The first pattern he explores, the Product Owner Team, allows the organization to handle agility up and down the hierarchy. Dan also discusses the Cross-cutting Teams pattern, which handles issues (architecture, usability, integration, performance, and evaluation) that the formal hierarchy can’t resolve. Finally, Dan describes the BuddyUp pattern, the best way to work with subject matter experts from dispersed parts of the organization.

Dan Rawsthorne, Consultant
Enterprise Agile: From the Top Down

Now that agile has gone mainstream, team-level development is not the only way organizations are implementing agile. Some senior management teams are trying to understand how they can implement agile and lean principles and practices from the top down. Jon Stahl demonstrates agile and lean techniques applied in a new way and under different constraints. With these techniques, your organization can begin its journey toward becoming an agile enterprise. However, before beginning, it is important that management “see the whole” (customers, projects, applications, people, leadership, financials, and standard work products) and start implementing and practicing the culture they wish to create. To help PMOs support this journey, Jon shares guiding principles that can be applied to both agile and waterfall approaches.

Jon Stahl, LeanDog Software, Inc.
Implementing Agile in the Cloud with a Large Distributed Team

Jeremy Leach shares Pitney Bowes’ agile development experience implementing a cloud-based application with a large, globally distributed team. Jeremy’s story recounts the challenges of working with the very specific delivery cycles required by third-party contractors and hardware vendors. He describes the interactions and complexities a global engineering team faces when multiple projects and products must come together into a single release. Learn how outside elements can stress the development rhythm a team needs, how to mitigate these challenges, and how Pitney Bowes eventually came to embrace them. Jeremy explores how their management approach evolved and how the focus of their communication structure changed from key individuals to group collaboration. In conclusion, Jeremy shares lessons learned and how Pitney Bowes is structuring similar projects for the future.

Jeremy Leach, Pitney Bowes
Specification by Example: Building Executable Requirements

Specification by Example is a collaborative approach for constructing executable requirements. Examples demonstrate how the system should operate through the eyes of its users and show a shared understanding of the application’s functions. Michael Connolly demonstrates the practical, easy-to-implement Specification by Example method he uses to write user stories and acceptance criteria. This direct approach, in which requirements are elaborated via executable code, creates a solid communication bridge between non-technical and technical staff and managers within the organization. Eventually, these executable requirements become the basis for the system’s acceptance test suite. As a takeaway, Michael provides participants with a lightweight requirements document format and an acceptance criteria framework to help them translate written specifications into automation.
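
To make the idea of an executable requirement concrete, here is a minimal sketch (the shipping rule, the figures, and the shipping_cost function are hypothetical, not from Michael’s session). Each example row states the requirement in the users’ terms and doubles as an acceptance test.

import pytest

# Requirement expressed as examples: (order_total, expected_shipping)
SHIPPING_EXAMPLES = [
    (25.00, 5.00),   # orders under $50 pay flat-rate shipping
    (50.00, 0.00),   # orders of $50 or more ship free
    (120.00, 0.00),
]

def shipping_cost(order_total):
    """Illustrative implementation of the rule the examples describe."""
    return 0.00 if order_total >= 50.00 else 5.00

@pytest.mark.parametrize("order_total,expected", SHIPPING_EXAMPLES)
def test_shipping_examples(order_total, expected):
    assert shipping_cost(order_total) == expected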

Michael Connolly, OPOWER
Acceptance Test-driven Development: Tests with the Future in Mind

Acceptance Test-driven Development (ATDD) is a popular topic these days: everyone’s excited about the idea of writing tests prior to development. Yet many teams run into difficulties as they attempt to implement this practice. It’s all too easy to fall into the trap of writing acceptance tests that mostly specify keystrokes and button clicks. Join "Cheezy" Morgan as he offers an overview of ATDD while sharing the experiences and insights he has gained working with numerous teams implementing ATDD. "Cheezy" takes you on a journey of discovery, demonstrating practical techniques for writing ATDD tests that describe the essence of what they are specifying while hiding the unnecessary details that obfuscate their meaning. Because ease of maintenance is key to ATDD’s long-term ROI, "Cheezy" shows how to structure and layer test code to reduce brittleness so your ATDD test suite will retain its usefulness well into the future.
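
As a minimal sketch of the layering idea (the FakeBrowser stub and LoginPage class are illustrative, not code from Cheezy’s session or his libraries), the test below states the requirement’s essence while the page object hides the clicks and field names that make keystroke-level tests brittle.

class FakeBrowser:
    """Stand-in for a real UI driver so the sketch is self-contained."""
    def __init__(self):
        self.fields = {}
        self.banner = ""

    def fill(self, name, value):
        self.fields[name] = value

    def click(self, _button):
        # Pretend the app greets the user after a successful login.
        self.banner = "Welcome, " + self.fields.get("username", "").title()

    def text_of(self, _element):
        return self.banner

class LoginPage:
    """UI layer: only this class knows about field names and buttons."""
    def __init__(self, browser):
        self.browser = browser

    def log_in_as(self, username, password):
        self.browser.fill("username", username)
        self.browser.fill("password", password)
        self.browser.click("log-in")
        return self.browser.text_of("banner")

def test_registered_user_can_log_in():
    # The acceptance test reads as intent, not keystrokes.
    banner = LoginPage(FakeBrowser()).log_in_as("pat", "s3cret")
    assert "Welcome, Pat" in banner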

Jeff Morgan, LeanDog
Signs Your Agile Adoption Is Off Track and How to Fix It

Adopting agile is often a difficult proposition with many variables and sometimes uneven results. Recognizing when your adoption isn't working well and taking proactive action to put it back on track are essential. So, how do you know whether your adoption is proceeding through rough but expected waters or running the risk of failing? Thomas Stiehm describes the signs of serious adoption problems and the steps you can take to fix them. Leveraging ten years of experience helping teams adopt agile, Tom walks through the many successes and failures he’s seen and, more importantly, the mistakes companies and people made that led to those failures. Learn the remediation steps you can take to re-energize and re-center your adoption efforts. Don’t let small missteps cascade into failure. Instead, join in and take back an action plan that will increase the odds of making your agile adoption a win for you, your teams, and your company.

Thomas Stiehm, Coveros, Inc.
Testing Traps to Avoid on Agile Teams

Why do many agile teams fail at testing? Iterations turn into mini-waterfalls with testing at the end; stories never become “done” and carry into the next iteration with unresolved bugs; testers worry they’re losing control or being set up to fail; customers keep changing their minds after all the tests have passed. However, some teams do succeed with testing on agile projects. What do they do differently? Janet Gregory shares the lessons she’s learned that help teams, and especially testers, get agile right. With examples from her real-world experiences, Janet describes common testing traps and the practice or process that fixes each one. One example is “forgetting the big picture,” which is easy to do when you are testing small, granular stories. A practice that avoids this trap is implementing feature acceptance tests to supplement story acceptance tests.

Janet Gregory, DragonFire, Inc.
