After chatting with Troy over Skype yesterday, I decided to add an article about his (or rather their, since two more coaches are involved in the project) wonderful assessment website.
Along with Mike McCalla and James Gifford, I recently created the industry’s first fully customisable Agile self-assessment tool, Lean Agile Intelligence. I want to share some examples of how we are using it and discuss the pitfalls to avoid when it comes to assessments.
“Assessment” has become a dirty word in the Agile community, and I can understand why. The market has been littered with an abundance of assessments, bringing their credibility and value into question. On top of that, organisations are abusing them by using them as team audits. We thought long and hard about whether or not to adopt assessments, but concluded that, given a safe environment, they could be used as a tool for learning Agile practices, identifying team coaching needs, and recognising organisational behavioural patterns.
We recognised that assessments are not one-size-fits-all! Rather, they should be tailored to the team’s challenges and objectives. All of our Agile coaches collaborated on a catalog of over 100 tried-and-true Agile and Lean practices compiled directly from the relevant framework literature. Each practice was mapped to one or more general business outcomes, such as time to market, employee satisfaction, customer satisfaction, innovation, reliability, responsiveness, and predictability. The catalog provided guidance to leaders and teams at all levels of the organisation. It was used to identify when to experiment with a practice, and how to properly apply it and measure its effectiveness. Each practice consisted of a set of criteria for each of the five agility stages: blocked, developing, emerging, adapting, and optimising.
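To make the catalog idea concrete, here is a minimal sketch in Python of how a practice entry might be modelled. The class names, fields, and example criteria are purely illustrative assumptions on my part, not the actual Lean Agile Intelligence schema.

```python
from dataclasses import dataclass, field
from enum import Enum

# The five agility stages described above.
class Stage(Enum):
    BLOCKED = 1
    DEVELOPING = 2
    EMERGING = 3
    ADAPTING = 4
    OPTIMISING = 5

# Hypothetical catalog entry: a practice, the business outcomes it
# supports, and per-stage criteria for assessing fluency.
@dataclass
class Practice:
    name: str
    outcomes: list                                # outcomes this practice supports
    criteria: dict = field(default_factory=dict)  # Stage -> criterion text

catalog = [
    Practice(
        name="Test-Driven Development",
        outcomes=["reliability", "time to market"],
        criteria={
            Stage.DEVELOPING: "Some tests are written, after the code",
            Stage.ADAPTING: "Tests are routinely written before the code",
        },
    ),
]

# Look up the practices that contribute to a desired business outcome.
def practices_for(outcome, practices):
    return [p.name for p in practices if outcome in p.outcomes]

print(practices_for("reliability", catalog))  # ['Test-Driven Development']
```

A structure like this is what lets a team assemble an assessment from only the practices relevant to its own objectives, rather than answering a fixed questionnaire.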
The team was empowered to assess themselves. After all, the value is in the conversation! We worked with teams on facilitation and consensus-voting techniques, and helped conduct assessments in which team members felt comfortable expressing their opinions. There was no scoring system, just a set of criteria for each agility stage that enabled teams to assess their fluency in each practice and set goals for improvement.
Finally, we were able to create dashboards that consolidated cross-assessment and cross-team results in a consumable and actionable format so behavioural patterns, organisational constraints, coaching needs, and continuous improvement opportunities could be identified.
Aggregated results were created from the teams within a specific business unit or program. This approach provided a holistic view of the group’s performance on the practices that contributed to each general business outcome. Each unit or program had slightly different objectives and would assess itself on the practices that led to its desired business outcome.
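As a rough illustration of the aggregation step, the sketch below averages per-practice scores across the teams in a business unit. The tuple layout and function name are my own assumptions for the example, not the tool’s implementation.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical assessment results: (business_unit, practice, stage_score)
# tuples collected from individual team assessments.
def unit_averages(results):
    """Average each practice's stage scores within each business unit."""
    buckets = defaultdict(list)
    for unit, practice, score in results:
        buckets[(unit, practice)].append(score)
    return {key: mean(scores) for key, scores in buckets.items()}

results = [
    ("payments", "tdd", 2),
    ("payments", "tdd", 3),
    ("payments", "ci", 4),
]
print(unit_averages(results))
# {('payments', 'tdd'): 2.5, ('payments', 'ci'): 4}
```

A roll-up like this is what makes a dashboard actionable: a low unit-wide average on one practice points at a shared coaching need rather than a single team’s struggle.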
For example, we worked closely with the business unit responsible for supporting the organisation’s payments application. Given that this was a mission-critical application, the team’s focus was mainly on reliability and quality. After two assessments, it became obvious that the team felt they lacked knowledge of the key technical engineering practices that lead to high quality and reliability. Here, our customised self-assessment framework enabled us to identify a coaching need for a team supporting an application vital to the organisation’s success. The team was assigned a technical coach who focused on the adoption of Agile engineering practices such as test-driven development, continuous integration, and test automation.
The overall agility stage of each team and group was also tracked. A team’s overall agility stage was determined by averaging the stage scores across all questions answered in a given assessment. The motive behind this data point was to ensure that the team’s overall Agile fluency continued to improve.
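A minimal sketch of that calculation, assuming each answer is recorded as a stage score from 1 (blocked) to 5 (optimising); the function and the rounding rule are illustrative, not the tool’s actual code.

```python
# The five agility stages, ordered from lowest to highest.
STAGES = ["blocked", "developing", "emerging", "adapting", "optimising"]

def overall_stage(answers):
    """answers: dict mapping question id -> stage score (1..5).

    Returns the nearest stage name and the raw average."""
    scores = list(answers.values())
    avg = sum(scores) / len(scores)
    return STAGES[round(avg) - 1], avg

stage, avg = overall_stage({"tdd": 2, "ci": 3, "retrospectives": 4})
print(stage, avg)  # emerging 3.0
```

Tracking the raw average alongside the rounded stage is useful: a team can improve for several quarters without crossing a stage boundary, and the average makes that progress visible.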
In one scenario, we identified a team whose overall agility stage had not improved after six quarters. After engaging the team, we found that they lacked the skill sets to support the product’s technology stack and were forced to hire contractors to fill the void. The organisation did not retain a contractor for more than six months, so the team received new members on the same cadence. It was obvious that this churn prevented the team from forming norms and forced them to essentially start over every six months. After further discussion, it turned out that the group’s management did not quite understand the product’s technology, or the level of effort and knowledge needed to best support customers. The conclusion was that the benefit of hiring a full-time employee in this role far exceeded the cost.
To check out Lean Agile Intelligence, go to www.leanagileintelligence.com. The first four months are absolutely free!
Experienced Agile practitioner and creator of multiple original software products, with a focus on team and personal continuous improvement. Extensive hands-on experience as a Scrum Master, Agile Coach, Product Owner, and Agile/Scrum Trainer.
Troy is currently a full-time Lean/Agile practitioner, coaching teams in self-organisation, cross-functionality, and technical excellence using SOLID principles and Extreme Programming (XP) practices such as Test-Driven Development (TDD), Continuous Integration/Delivery (CI/CD), Pair Programming, Mob Programming, Lean UX / customer focus, and continuous improvement.
Troy holds certifications from Scrum Alliance (CSP, CSM, CSPO), Scrum.org (PSM II, SPS), PMI (PMI-ACP).