I wanted to share a quick blog post detailing some of the new ceremonies we have added to a couple of our sprints in the Confluence team and how they have helped to mitigate risk.
So, as a quick recap, these are some of the ceremonies that have been used at Atlassian in the past.
What we do already
Dogfooding
Here at Adaptavist, the only one of these ceremonies that we have carried out regularly is Dogfooding. This is where we install a release candidate on our own internal Confluence site and have a good play around with it for a few days. The benefit of dogfooding is that issues can be found not only by the tester but by anyone using the system.
This is valuable because it gives the feature or bug fix time to settle in, so we can see how it performs during everyday use.
Testing Notes/Test Cases
In the Confluence team, we aim to bring testing in right at the start of development, and for the last few features this has worked really well. It helps to align the whole team on what will be tested once the feature has been developed, and it also gives the devs, PM, and tech writers a chance to chip in with any scenarios they think the QA has missed. This was only added to our ways of working fairly recently, and I have found it extremely beneficial, so if your team is not currently doing something like this, I would strongly advise giving it a try.
What ceremonies did we add?
QA Demo
I found this to be arguably the most beneficial of the new ceremonies, as it makes communicating testing findings to a number of stakeholders so much easier. Now that our teams are remote, this takes on even greater importance.
Once a feature has been fully tested (all test cases executed and exploratory testing complete), a meeting is set up, typically between the QA, the PM, a support engineer, and a tech writer. Devs can also be added as optional attendees, as there may be operational queries that only they can answer.
There may also be issues that need further clarification, or aspects that work slightly differently from what was intended. This ceremony resolves everything in one go, so everyone comes out with a crystal-clear idea of exactly where we are with a feature.
Blitz Test
This ceremony involves the whole team and requires only a small amount of planning. For more information on running one of these in your own team, check out this page.
We have carried out two of these so far on CFMC, and a total of 10 issues were found and fixed, leaving us with a product we are confident about. The first Blitz session was scheduled for two hours: it was the first one we had done on CFMC, and since the app had not yet been added to the bug bounty program, we wanted to catch any vulnerabilities before it went in.
The second Blitz was time-boxed to just an hour, as it was only one new macro we wanted to test. I feel that the input from different roles in the team is invaluable, as everyone uses our features in a different way. So if future teams wish to try this out, I would recommend getting at least two people from roles such as tech writer, marketer, or PM involved to uncover different types of issues.
To conclude, I think the two ceremonies, the QA Demo and the Blitz Test, would greatly help our teams here at Adaptavist, but they do not have to be added to every sprint or done religiously, as that could create a bottleneck. The Blitz Test can be run without a QA with help from the guidelines page mentioned in the previous paragraph, and the QA Demo can be run by whoever has just tested the feature, so alignment can still happen.
If test cases are added to each story, this improves things further, as any member of the team can then test the feature in the way a QA would.
So, depending on how big a new feature is, and as we gain experience, we can further increase our confidence in future releases, with or without a QA, simply by strategically adding some of these ceremonies to our future sprints, thus removing (or minimising) the chance of QA becoming a bottleneck.