Design Systems metrics — what I have learned leading the DS of the biggest edtech in Latin America.

André Rolla
9 min read · Nov 24, 2023

Without a doubt, one of the main discussions about Design Systems in organizations is how to measure and demonstrate that the investment pays off. There is already a lot of material on the subject out there, so in this post, I will focus on my practical experience, leading the Hotmart Design System for 5 years, and the interactions I had with many other companies throughout this journey.

First of all, it’s important to understand the level of maturity you already have; the set of metrics you choose will depend on the team’s maturity level, available resources, the type of organization, and the system architecture.

To illustrate these maturity levels, I will use some hypothetical and very simplistic examples:

Level 1 — You are part of a design team of 5 to 10 people, and the Design System will serve 1 to 3 products or tribes. You are in the construction phase of the DS, developing base components and beginning implementation.

Level 2 — You are part of a design team of 11 to 50 people, the Design System will serve 4 to 8 products or tribes. You already have the majority of components built, both in the design library and in code. You have complex components, implement accessibility best practices in components, and run iterative processes to evolve the DS by talking to internal customers. You have a 40% coverage of products using the ideal version.

Level 3 — You are part of a design team with more than 50 people, the Design System will serve more than 10 products or tribes, and may even serve more than one company. You already have all components built and implemented, have a data structure around the DS, you know the satisfaction indices of internal customers, you test components with external users. You have a coverage of more than 70% of products using the ideal version of the DS.

The important part is to find the few metrics that demonstrate genuine progress and value to the project at the moment you find yourself in. Filling a dashboard with metrics that are not actionable will only increase complexity and undermine the success of the project.

Another important point is to understand that a Design System is a social endeavor. There is the entire complex technical universe behind the development of a DS, but when it comes to implementation throughout the entire product, we are primarily talking about the art of negotiating and managing stakeholders. Therefore, it is of great importance to work with both quantitative and qualitative metrics.

Quantitative metrics will highlight progress in numerical and practical terms, measuring the KPIs of objectives.

Qualitative metrics will help you understand the subjective aspect of the Design System — for example, understanding why the satisfaction level of front-ends is low in relation to token governance, which reduces product adoption.

After this brief introduction, let’s move on to more practical matters. In the development of a Design System, it is important to focus on 4 topics: Development, Adoption/Coverage, Team Operation, and Shared Responsibility.

The most effective way we found to coordinate the work was by using the OKR framework directed at each of these themes and developing KPIs to monitor each one (learned from a post by Nathan Curtis).

See an example with fictional data of the structure we applied.

→ Since you are dedicating your time to this article, I'll give you free access to this table.

Let’s delve a bit deeper into the concepts:

Development:

It’s quite obvious, but many teams falter when it comes to communicating with upper management. It is extremely important to demonstrate to the company that the DS team has a complete understanding of the project’s size, control over the progress, and clarity in showing what still needs to be done, what can be done, in how much time, with the available resources (this also extends to coverage metrics; the point is that it cannot be ignored in this phase). When senior management lacks clarity on these aspects, it creates a sense of insecurity about the project.

In essence: know how to demonstrate that you have control over the project’s development. Here are some examples of how this can evolve:

  • Level 1 — Have a reliable list of all necessary components, have a map of all necessary steps (design + dev) for the completion of a component, create a calculation to understand the % completion, set goals, monitor, and share.
  • Level 2 — Adding another layer, you can begin to measure the time invested in each component, adding up the design and development stages. These metrics will provide a basis for mapping bottlenecks and starting to think about improving team processes, prioritizing better, and increasing the speed of DS construction.
  • Level 3 — Once you have a view of the time spent building the components, you can add the financial variable, calculating the cost of each component based on the volume of hours multiplied by the average hourly rate of the team. In the future, this data can be related to the time saved within squads or the number of times that a component that cost “US$X” was reused in the product.
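The three layers above can be sketched in a few lines of code. This is a minimal illustration with fictional data: the step list, component names, hours, and hourly rate are all assumptions, not Hotmart's actual checklist or figures.

```python
from dataclasses import dataclass, field

# Hypothetical steps needed to finish one component (design + dev).
STEPS = ["design_spec", "figma_build", "code", "docs", "review"]

@dataclass
class Component:
    name: str
    done_steps: set = field(default_factory=set)  # subset of STEPS finished
    hours_spent: float = 0.0

def completion_pct(components):
    """Level 1: % of all (component x step) pairs already done."""
    total = len(components) * len(STEPS)
    done = sum(len(c.done_steps) for c in components)
    return 100 * done / total

def component_cost(component, hourly_rate):
    """Level 3: hours invested times the team's average hourly rate."""
    return component.hours_spent * hourly_rate

# Fictional backlog: Button is finished, DatePicker is halfway.
backlog = [
    Component("Button", set(STEPS), hours_spent=24),
    Component("DatePicker", {"design_spec", "figma_build"}, hours_spent=10),
]
print(f"{completion_pct(backlog):.1f}% complete")   # 70.0% complete
print(f"Button cost: US${component_cost(backlog[0], 50):.0f}")  # US$1200
```

The same structure extends to Level 2 by recording `hours_spent` per step instead of per component, which is where bottlenecks become visible.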

In the example above, I used the logic of the 3 levels to illustrate the rationale in layers. With the following suggestions, it is up to each team to understand what is worth measuring or not. Always think through the rationale of ‘what is the main goal that I need to solve NOW’ and build your metrics around this premise.

Adoption / Coverage:

Adoption = code in production. There must be trust from the engineering, design, and product teams in using the Design System as part of their daily process because the value of the DS is realized when products deliver features that use the system in the hands of customers.

Polished and shiny documentation is important when the team is large, but this is a secondary goal. Adoption metrics are easy to capture as they can be extracted directly from the repository. Some examples include:

  • Count of CSS classes, component names, imports in application codes. The higher, the better. This metric can even be separated by tribe, for example, to have a comparative view of where in the company adoption is higher or lower so that a qualitative investigation can be conducted later.
  • Count of hard-coded CSS properties in application code (overrides written outside the DS). The lower, the better. A low count indicates that developers are finding everything they need in the code structure.
  • Count of components that receive “detach” within Figma. The lower, the better. If a component is dismantled many times, it means it is not in the ideal format for its purpose and needs to be reviewed.
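The first metric in the list, counting DS imports in application code, can be captured with a simple repository scan. A minimal sketch follows; the package name `@acme/design-system` and the folder-per-tribe layout are assumptions you would replace with your own.

```python
import re
from pathlib import Path

# Hypothetical package name -- swap in your DS's real NPM package.
DS_IMPORT = re.compile(r"""from\s+['"]@acme/design-system""")

def count_imports(source: str) -> int:
    """Number of import statements pulling from the DS package."""
    return len(DS_IMPORT.findall(source))

def scan_repo(root):
    """Aggregate DS import counts per top-level folder
    (used here as a stand-in for tribe or application)."""
    counts = {}
    for path in Path(root).rglob("*.ts*"):  # .ts and .tsx files
        app = path.relative_to(root).parts[0]
        counts[app] = counts.get(app, 0) + count_imports(
            path.read_text(errors="ignore"))
    return counts

sample = ("import { Button, Tag } from '@acme/design-system';\n"
          "import React from 'react';\n")
print(count_imports(sample))  # 1
```

Running `scan_repo` per tribe directory gives the comparative view described above, which you can then follow up with qualitative investigation.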

When talking about coverage, we can discuss 2 types: the scope of components covered within the Design System and the total number of applications in the company using the most up-to-date version of the DS. Here are some examples:

  • Count of total components built in relation to mapped components.
  • Count of applications using the updated NPM version of the DS.
  • Ratio of the number of applications using the correct version of the DS to the total number of applications.

To control coverage, I created this spreadsheet (shared above) where we had a view of the total number of applications and measured which ones were outdated and which ones were updated with the correct version of the DS in each tribe.

Consolidation tab for DS coverage

For each tribe, there is a tab like the one below, where in column “D” (Updated?), “zero” means that a specific application is outdated, and “one” means it is on the correct version, which adds up in the consolidation tab. It’s as simple as that.
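The spreadsheet logic is simple enough to express in a few lines. This sketch mirrors it with fictional tribe and application names: each application gets a 1 if it runs the correct DS version and a 0 if it is outdated, and the consolidation is just the sum over the total.

```python
# Fictional data: per tribe, app name -> 1 (updated) or 0 (outdated),
# exactly like column "D" in the spreadsheet.
tribes = {
    "Payments": {"checkout-web": 1, "invoices-web": 0, "refunds-web": 1},
    "Creators": {"studio-web": 1, "uploader-web": 1},
}

def coverage(apps):
    """Share of applications on the correct DS version."""
    return 100 * sum(apps.values()) / len(apps)

for tribe, apps in tribes.items():
    print(f"{tribe}: {coverage(apps):.0f}%")      # Payments: 67%, Creators: 100%

# Company-wide consolidation across all tribes.
all_apps = {name: ok for apps in tribes.values() for name, ok in apps.items()}
print(f"Company-wide: {coverage(all_apps):.0f}%")  # Company-wide: 80%
```

Whether this lives in a spreadsheet or a script, the quarterly ritual with tribe leaders is what keeps the 0/1 flags honest.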

Tab of a specific tribe

Our team tried to automate this process twice, extracting data directly from GitHub into some data consolidation tools. However, manual control allowed us to group applications in ways that were easier to manage. To keep it updated, the DS team performed a quarterly ritual with the leaders of each tribe, which was excellent for understanding the reality within the teams and promoting DS advancements to leadership.

Team Operation:

Controlling the operation of the DS team and indirect collaborators is crucial to achieving goals. You need dedicated people, as some teams struggle to maintain commitment and keep leaders engaged. It is essential to have a stable team with allocated resources and sponsor participation, and this can be controlled by:

  • Measuring sponsor engagement in key rituals like release plans.
  • Participation of key members in DS decision-making rituals.
  • Establishing predictability of releases and meeting deadlines.
  • Productivity within the DS team.

Shared Responsibility: Some design and engineering leaders see the system as a lever to change culture and strengthen collaborative practices. Creating a community will help promote and engage the DS. To assess success, you can track some numbers, such as:

  • Number of people outside the DS team participating in critiques.
  • Number of Pull Requests made by devs outside the DS team.
  • Number of active participants in official DS channels.

Regarding Qualitative Metrics:

The most effective way we found to analyze the subjective aspects of the DS was through surveys we conducted semi-annually with internal teams. The format involved an initial satisfaction questionnaire with open-ended response fields, one version for designers and another for developers, seeking to understand specific aspects of DS usage, such as:

  • Quality of component structure in Figma.
  • Level of satisfaction regarding token structure.
  • Dev satisfaction level regarding library usage.
  • Satisfaction level with the clarity of documentation.

After receiving the questionnaire responses, there was a round of interviews selecting people based on the answers we received.

And when we talk about task completeness/execution speed within product squads?

These metrics are delicate because they require a rigorous and rigid tracking standard that is unlikely to reflect reality. We all know that factors influencing task completeness are infinite, and from personal experience, I’ve found that creating a micro-task management control for a large design team is ineffective. Some teams are more disciplined than others in this type of control, and inconsistencies in the data often occur.

At this point, it’s better to look at specific activities that the Design System influences, such as the time from a closed idea to a prototype or the number of story points related to the interface that frontend teams can deliver in a sprint. If the purpose is to highlight the effectiveness of the DS in teams, you can measure by sampling, working only with the team most precise in task tracking.

An experiment that can be done as a proof of value is to select two developers of the same seniority and similar performance, working under equivalent conditions. Developer “A” builds a screen without using the DS, while developer “B” builds the same screen using the DS, and then you compare the time for each. At Hotmart, we conducted this experiment, and the use of the DS reduced the time for a screen from 26.4 hours to 8.4 hours. We were more than 3 times faster with the DS. The result can be presented by multiplying the number of hours by the average hourly rate of a developer, giving you a comparison of how much one screen costs compared to the other.
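The arithmetic behind the presentation is worth making explicit. The hours below are the ones reported in the experiment; the hourly rate is a made-up illustration you would replace with your team's real average.

```python
# Hours reported in the article's experiment.
HOURS_WITHOUT_DS = 26.4
HOURS_WITH_DS = 8.4
HOURLY_RATE = 60.0  # hypothetical average developer rate in US$

speedup = HOURS_WITHOUT_DS / HOURS_WITH_DS
cost_without = HOURS_WITHOUT_DS * HOURLY_RATE
cost_with = HOURS_WITH_DS * HOURLY_RATE

print(f"{speedup:.1f}x faster with the DS")            # 3.1x faster with the DS
print(f"US${cost_without:.0f} vs US${cost_with:.0f}")  # US$1584 vs US$504
```

Framing the same result in currency rather than hours is usually what lands with upper management.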

It’s essential to be very careful when comparing like things. For example, it would be a mistake to conduct this experiment using a junior and a senior developer. It’s common to fall into the trap of comparing metrics that have a dubious and confusing relationship, leading to a distorted perception of reality.

My conclusion is that there are many possibilities to measure different aspects of a Design System, and the points to consider are, first of all, to create metrics around clear objectives, understand what your company values, and build a coherent narrative around it using data. Demonstrating mastery and control over the project is always a positive point.

I hope that this small part of my experience can contribute to how you and your team think about managing and promoting the Design System within your company. If you liked the content, give it a 👏🏼, and if you want to chat, feel free to schedule a conversation with me on ADPlist.

I wish you the best. 🙂
