Reading time: 4.5 minutes
My team designs with the support of several tools. Our design process has several feedback cycles built into it.
This design process and its supporting tools are tailored to help us generate designs of processes and tools for the problems we solve: helping customers with tens or hundreds of engineers migrate to AWS securely and adopt continuous delivery of applications and infrastructure.
The goal of the design process is to help us make good (but not perfect!) decisions on matters that will be difficult to change later. In particular, we want to gather information about whether customers want to use the system, the system’s demands in terms of people and compute resources, and the limits of the system.
We propose a problem and solution in our design, gather feedback, and update the proposed design.
First, we seed a design by defining the problem and sketching a solution.
Second, we draft a design using the QM design template. Completing the first draft may take an afternoon or a couple of days of uninterrupted focus. I prefer a ‘breadth-first’ approach: cover each area of the design doc quickly and note which topics need deeper investigation. It’s best not to spend a lot of time going deep on those topics before gathering feedback from stakeholders to confirm they actually matter. For example, it’s fine to simply note that enforcing encrypted connections to Aurora Postgres can be done with a cluster parameter, while Aurora MySQL requires user grants or system queries.
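One way to capture a ‘noted, not yet investigated’ detail like that is as a stub in the design or prototype code. The sketch below records the difference between the two engines; the parameter and grant shown reflect the note above and should be verified against current AWS documentation before relying on them:

```python
# Sketch of a design note captured as code: how SSL-only connections are
# enforced per Aurora engine. Values reflect our notes, not verified docs.
def ssl_enforcement_note(engine: str) -> str:
    if engine == "aurora-postgresql":
        # Enforced centrally with a cluster parameter.
        return "set cluster parameter rds.force_ssl = 1"
    if engine == "aurora-mysql":
        # Enforced per database user via grants.
        return "GRANT USAGE ON *.* TO 'app'@'%' REQUIRE SSL"
    raise ValueError(f"no note for engine: {engine}")

print(ssl_enforcement_note("aurora-postgresql"))
```

The point is not the code itself but that the open question is written down where the next round of feedback can confirm or discard it.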
At this point in the process, the design may have (lots of) errors in the:
- Problem statement
- Benefits a solution will provide to customers
- Key use cases supported by the system
- Architecture of the solution
- How the solution satisfies ‘non-functional’ requirements
Now incorporate a round of feedback from the design’s author: you. Put the design aside for a day so it’s easier to approach with a fresh mind. Print the design on paper. Go someplace without distractions. Read it top to bottom and mark it up with questions and suggested improvements. I find this review cycle uncovers a lot of ambiguities and opportunities to improve. Address all of the issues you feel are important before your next round of feedback: design review by your peers.
Next, we’ll use a couple of tools that help us detect design defects: design review and prototyping.
According to a review of quality analysis methods described in Code Complete, 2nd ed., the modal defect detection rates of these practices are:
- Formal Design Review: 55% (‘Informal’ is 35%)
- Prototyping: 65%
These practices should detect more than 50% of a design’s defects, especially when used in combination. This is great, because it helps us conserve valuable development effort and money.
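To see why combining the practices helps, here is the back-of-envelope arithmetic under the (strong, simplifying) assumption that the two practices catch defects independently:

```python
# Back-of-envelope: combined defect detection rate, assuming (optimistically)
# that formal design review and prototyping catch defects independently.
# Rates are the modal values quoted from Code Complete, 2nd ed.
review_rate = 0.55      # formal design review
prototype_rate = 0.65   # prototyping

# A defect escapes only if both practices miss it.
combined = 1 - (1 - review_rate) * (1 - prototype_rate)
print(f"combined detection rate: {combined:.0%}")  # → 84%
```

Real defects are not independent draws, so treat this as an upper bound on what the combination can achieve, not a promise.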
The third major step in our design process is to conduct a design review.
Our design review is of the ‘formal’ variety: it includes multiple reviewers supported by a ‘checklist’ of questions and follows a particular format. The reviewers include stakeholders from outside the team who are versed in the problem space, ideally several people who will use or operate the system built from the design. Naturally, engineers from the delivery team implementing the system are included, along with consulting architects and senior engineers from the organization.
The actual format of our design review meeting comes to us from Amazon (supposedly; I never worked there). This format is designed for busy people. I’ll share more on the mechanics of that review process next.
Conduct the review, gather up the feedback, and incorporate it into the design. There’s a good chance you just found half of the design’s defects and can handle them in some way. You may also have made decisions that can now be recorded for posterity.
When we want to identify more than just half of the design’s defects, we use prototyping.
Prototyping is a tool teams can use to explore the problem and solution space and gather feedback from the real world. One of the ways I think about prototypes is as a test of the design.
At this point in the (QM) design process, the design is built on educated guesses by the engineers and other stakeholders involved with the design. They might be good guesses, but we’re still operating in our minds.
In a prototype, you build a form of the solution that approximates what you’ve designed. We generally try to go end-to-end from a customer request through the system and back to a legitimate-looking response for the primary use case. We try to connect all the major components in the system with the simplest thing that could possibly work. By default, we consider this throwaway code: bash, python, aws-cli, jq, even some manual steps for handoffs between processes are all fine and sometimes preferable.
We’re doing this to learn and (in)validate the design, not implement it. Re-use is gravy.
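As an illustration, a simplest-thing-that-could-work skeleton of an end-to-end prototype might look like the following. The component names and the ‘migration status’ use case are invented for this sketch, not taken from a real system:

```python
# Throwaway end-to-end prototype sketch: wire the major components together
# with the simplest thing that could possibly work. Every function is a stub
# standing in for a real component; the use case is hypothetical.

def receive_request(raw: str) -> dict:
    # Stand-in for the API front door; no validation, no auth.
    return {"customer_id": raw.strip()}

def fetch_account_state(request: dict) -> dict:
    # Stand-in for a call to someone else's API; hard-coded response.
    return {"customer_id": request["customer_id"], "accounts": ["dev", "prod"]}

def render_response(state: dict) -> str:
    # Stand-in for the response path; just a legitimate-looking answer.
    return f"{state['customer_id']}: migrating {len(state['accounts'])} accounts"

def handle(raw_request: str) -> str:
    # The whole 'system', end to end, for the primary use case.
    return render_response(fetch_account_state(receive_request(raw_request)))

print(handle("acme-corp"))  # → acme-corp: migrating 2 accounts
```

Each stub can then be swapped for a real call, one at a time, as you decide which parts of the design most need real-world feedback.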
This feedback from the real world can help you:
- validate proposed communication patterns and runtime complexity of core algorithms
- collect real-world data on request processing time and rate limits
- understand what actually comes back from someone else’s API
- identify the common runtime errors that must be accounted for
- estimate storage and compute costs
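For that last item, even a crude calculation beats a guess. A hypothetical back-of-envelope storage-cost estimate, where the request rate, payload size, and unit price are all made-up inputs rather than measured values or real AWS pricing:

```python
# Hypothetical back-of-envelope storage cost estimate from prototype data.
# All inputs are illustrative assumptions, not measurements or real prices.
requests_per_day = 50_000     # e.g., observed from prototype logs (assumed)
bytes_per_request = 4_096     # average stored payload (assumed)
price_per_gb_month = 0.023    # illustrative object-storage price (assumed)

gb_per_month = requests_per_day * bytes_per_request * 30 / 1e9
monthly_cost = gb_per_month * price_per_gb_month
print(f"~{gb_per_month:.1f} GB/month, ~${monthly_cost:.2f}/month")  # → ~6.1 GB/month, ~$0.14/month
```

Plugging in numbers actually observed during prototyping turns a design-review argument about cost into a short arithmetic check.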
New, non-trivial systems go through a significant learning curve, and prototyping is a way to label that process appropriately. There are a lot of ‘MVPs’ out in the world that are actually prototypes.
OK, now feed that learning back into the design. We estimate that prototypes help us discover half of the design defects that remained after design review, resulting in approximately a 75% reduction in design defects overall.
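The arithmetic behind that 75% estimate is simple: design review removes roughly half of the defects, and prototyping removes roughly half of what remains.

```python
# Worked arithmetic for the quoted estimate (both rates are rough estimates).
after_review = 1 - 0.50                       # half the defects survive review
after_prototype = after_review * (1 - 0.50)   # prototyping halves the remainder
overall_reduction = 1 - after_prototype
print(f"overall reduction: {overall_reduction:.0%}")  # → 75%
```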
Learning from design feedback
Once design feedback is collected, the delivery team and product manager know much more about whether to continue, how to approach the solution, and how to break up and sequence the work. Additionally, you can communicate to collaborating teams with the confidence that you’ve probably identified half or more of the design’s defects.
Receive #NoDrama articles in your inbox whenever they are published. Reply to Stephen and the QualiMente team when you want to dig deeper into a topic.