A beta is where you choose your best solution from alpha and start to build it.
The beta phase comprises two parts:
- A private beta (usually 16 weeks), available to a limited set of users, to gather real-world feedback and make improvements.
- A public beta (usually 16 weeks) available to any user, run like a live service. The private beta must reach a launch-ready standard before it can move into public beta.
I recently delivered a successful beta and wanted to capture my key learnings here.
What key questions should you consider in a beta?
- How is your service performing in relation to your Key Performance Indicators (KPIs)?
- How will you ensure the ongoing security of your service?
- Is your service accessible to users with access needs?
- How will you manage the transition to live? What team will you need?
- What offline and online support systems do you have in place to help users?
What did my beta project look like?
Due to the high-profile nature of my project, I ran a compressed beta phase, combining the private and public betas into just 12 weeks!
Here’s how I structured my project:
- Sprint 1 was about understanding the problem space and setting the scope. In alpha we had prototyped our main journey; in beta, we needed to make sure we accounted for other pathways. We also started to set up our infrastructure (e.g. hosting).
- In Sprints 2-3, we continued with the infrastructure set-up. We also started to design the journeys we had not accounted for in alpha, testing them with users and iterating.
- In Sprint 4, we ran security audits (load and penetration testing) and an accessibility audit (see the load-test sketch after this list).
- In Sprint 5, we actioned any changes from the audits, and released the service to a small group of users.
- In Sprint 6, we iterated on the service based on user feedback and data from the service.
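An aside on the load testing: tools and approaches vary, but a minimal script gets you surprisingly far. Here's an illustrative sketch using Locust, a Python load-testing tool. The routes, weights and payload below are hypothetical placeholders, not our actual service:

```python
# Minimal Locust load-test sketch. The paths and form data are
# hypothetical placeholders, not a real service's routes.
from locust import HttpUser, task, between

class ServiceUser(HttpUser):
    # Simulated users pause 1-5 seconds between actions.
    wait_time = between(1, 5)

    @task(3)
    def view_start_page(self):
        # Most traffic hits the start page.
        self.client.get("/")

    @task(1)
    def submit_form(self):
        # A smaller share of users submit the main form.
        self.client.post("/apply", data={"postcode": "SW1A 1AA"})
```

You'd run this with `locust -f loadtest.py --host https://your-staging-url`, ramp up the number of simulated users from Locust's web UI, and watch response times and error rates as the load climbs.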
What does a typical beta team look like?
On our beta we had the following roles:
- Product Manager / Delivery Manager (me)
Led the team and helped to set the project's strategic direction. The client usually provides the product role, but they are often inexperienced, so this is a role I usually take on too.
- Content designer x 2
Responsible for the content in your service.
- Developers x 2
Build software with a focus on what users need from your service and how they’ll use it.
- Technical lead
Ensures the technical architecture is robust, scalable, open and secure.
- Service Designer
Helps the team to build up an understanding of the end-to-end process or service.
- UI designer
Creates the visuals for the service.
- User researcher
Plans and facilitates the user research. They also take the lead in facilitating synthesis sessions and work with the wider team to deliver high-impact outputs.
- Accessibility expert
Ran our accessibility audit and helped to ensure our product met the latest accessibility standards.
Reflecting on the experience
- Running a compressed beta phase was challenging: it forced us to build a lean MVP and say no to any new features. In some respects this was good, because it kept us focused.
- Testing a working version of the service is far more powerful than testing a prototype. In our research rounds, we noticed users doing things we hadn't quite expected, e.g. entering a value we had not accounted for. Based on this feedback, we were able to improve our service.
- I led the alpha so we were able to move fast at the beginning of the beta. We didn’t need to spend the first couple of weeks rehashing old ground.
- Presenting our service to senior government officials was exhilarating but also stressful. It's hard to avoid 'design by committee'. Sometimes senior officials are privy to conversations you are not, and your job as a team is to recognise this and work out how to validate their feedback. Other times you will have no choice but to make a change; when this happened to me, I tried to push back with data from our research, but to no avail.
- Design and develop with accessibility in mind from the beginning. When it came time for our accessibility audit, we had only minor changes to make, because we'd spent time upfront doing the necessary work. Remember: it is critical to ensure your service is accessible to all.
- We used a design prioritisation matrix to help us identify where to focus, based on user research and data (a toy sketch follows this list). This worked well early in the project; we used it less towards the end, when we knew what needed to be done and simply got on with the job.
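If you've not used a prioritisation matrix before, the core idea is simple: score each candidate change on user value and effort, then tackle high-value, low-effort items first. A toy sketch of the scoring, where the items and numbers are invented for illustration, not our real backlog:

```python
# Toy design prioritisation matrix: rank candidate changes by
# user value relative to effort. Items and scores are invented.
changes = [
    # (change, user_value 1-5, effort 1-5)
    ("Clarify error message on date field", 4, 1),
    ("Redesign confirmation page", 3, 3),
    ("Add save-and-return", 5, 5),
]

# Higher value and lower effort float to the top.
ranked = sorted(changes, key=lambda c: c[1] / c[2], reverse=True)

for name, value, effort in ranked:
    print(f"{name}: value={value}, effort={effort}, score={value / effort:.1f}")
```

In practice this is often a whiteboard exercise rather than code, but the scoring logic is the same either way.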
Dual track agile
Image credit: https://scrumandkanban.co.uk/dual-track-agile/
- In dual track agile, design and development split into two separate work streams. The design stream, where you test and learn, runs ahead of the development stream; its goal is to feed validated work into the development cycle.
- The benefit is that you maintain a steady pipeline of development work while continuously testing your hypotheses and assumptions, which means you build a better product.
- I ran two separate planning sessions: one for development, the other for design. This worked well. In the dev planning session, we spent time estimating and understanding tickets; in the design planning session, we talked through user feedback and data to prioritise where to focus first.
Other things to consider in your beta
- Test with users with accessibility needs
- Test the offline journey with users
- Remain focused on the MVP, and avoid feature creep
- Ensure the design team is working alongside the technical team
- Measure the success of your service (a toy sketch follows this list)
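On that last point, measurement doesn't need to be elaborate to be useful. One of the simplest metrics is completion rate: the share of users who start your journey and go on to finish it. A toy sketch, using an invented event log rather than real service data:

```python
# Toy sketch: completion rate from a hypothetical journey event log.
events = [
    {"session": "a1", "event": "journey_started"},
    {"session": "a1", "event": "journey_completed"},
    {"session": "b2", "event": "journey_started"},
    {"session": "c3", "event": "journey_started"},
    {"session": "c3", "event": "journey_completed"},
]

# Sessions that started the journey, and those that finished it.
started = {e["session"] for e in events if e["event"] == "journey_started"}
completed = {e["session"] for e in events if e["event"] == "journey_completed"}

completion_rate = len(completed & started) / len(started)
print(f"Completion rate: {completion_rate:.0%}")  # 67%
```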
At the end of a beta, you should have…
- A tested end-to-end service
- Accessibility and security testing completed on your service
- Assisted digital support in place where needed (e.g. phone and/or email support)
- A plan for continuous service improvement based on user testing, including team, metrics/KPIs, activities, time and cost
- A full understanding of the cost of running the service, with budget and people in place to do this
- Documentation to hand over to the live service team
I hope this helps when you embark on a beta.