Design is driven by data, insight, and creativity.
Each of the three could drive the design practice alone, but it is usually a dynamic synthesis of all three that leads to real change and business success. The relative strength of each factor in that synthesis reflects the design culture, and in a broader sense the work culture, of an organization.
“…organizations which design systems … are constrained to produce designs which are copies of the communication structures of these organizations.”
— M. Conway
In the spirit of Conway’s Law, the tri-factor model of data, insight, and creativity is a useful lens for assessing the design practice in your organization. Each factor has its own level of maturity in the design practice, and that maturity in turn reflects the work culture that accommodates it.
By examining the strengths and potential weaknesses of your org’s design practice, you get a clearer picture of both the strategic and the tactical sides of your design activities (which are usually embedded in operations), and you can therefore play to the strengths and improve, or at least shield, the weak points.
What does that mean in practice? It means you need a cultural profile of your org’s design practice.
A Cultural Profile for Design Practice
Modelling is a tricky thing. The good news is that we don’t have to get it right on the first try.
In fact, the beauty of modelling is that we can always start with something very simple, with the commitment to iterate on, refine, and enhance it along the way. Human judgement is still critical in the process, and a collaborative culture will find it easier to avoid risky biases.
The Maturity of Design Practice
Let’s start, again, from the magic number three.
Each of data, insight, and creativity can be at one of three stages of maturity (naive, mature, or creative), and the same stages describe the design culture as a whole:
- Naive design culture: designing is usually practiced in an ad-hoc manner, on individual projects or products, without strategic planning from either a managerial or an operational viewpoint. In a naive design culture, designing is all about tactics, such as making the user interface “beautiful”, implementing fluid interactions, or avoiding harmful usability issues. The bigger (often strategic) context of the design practice is typically missing or lost in the org’s work culture.
- Mature design culture: design activities and designer roles are directly included in managerial and operational planning. Management usually takes an active role in ensuring design is done in a measurable and standardized manner, and often welcomes design leaders to take a more active role in higher-level decision making (“a seat at the table”? Maybe). Grassroots evangelization of design is often granted and even embraced. A mature design practice affords consistency and coherence in the things designed, and contributes to (if not raises) the bottom line set for business survival or success.
- Creative design culture: strategic design is planned top-down and curated by leading stakeholders, while best practices thrive in a bottom-up movement that evolves the whole design practice. Design activities are aligned with all other operational facets such as management, marketing/branding, and the creation of social/economic value. Design leaders may not “have a seat at the table”, because they’re often some of the table legs.
The three stages are dynamic and they sometimes overlap. That’s fine, because the whole point of the modelling is not to be precise, but to be less uncertain. Having a rough idea that your org is somewhere between Naive and Mature is so much better than knowing nothing at all except an intangible feeling.
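To make this less abstract, here is a minimal sketch, in Python with hypothetical names and made-up numbers, of how such a tri-factor profile could be represented. Each factor sits on a continuous 0-to-2 spectrum so that “somewhere between Naive and Mature” is expressible; the stage labels are only there to keep the conversation grounded.

```python
from dataclasses import dataclass

# Hypothetical sketch: each factor sits on a continuous 0.0-2.0 spectrum,
# so a profile can express "somewhere between naive and mature" (e.g. 0.6)
# instead of forcing every factor into a single box.
STAGES = ["naive", "mature", "creative"]


def nearest_stage(score: float) -> str:
    """Map a continuous score to the nearest named stage."""
    return STAGES[max(0, min(2, round(score)))]


@dataclass
class CulturalProfile:
    data: float        # 0.0 = naive, 1.0 = mature, 2.0 = creative
    insight: float
    creativity: float

    def describe(self) -> str:
        return (f"data: {nearest_stage(self.data)}, "
                f"insight: {nearest_stage(self.insight)}, "
                f"creativity: {nearest_stage(self.creativity)}")


# A team that is between naive and mature on data, mature on insight,
# and still mostly naive on creativity.
profile = CulturalProfile(data=0.6, insight=1.2, creativity=0.3)
print(profile.describe())
# data: mature, insight: mature, creativity: naive
```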
Let’s see how that works.
Measuring the Data Factor
The first step is to guesstimate your analytics capability and capacity.
Here are some questions to ask and elaborate on:
- How automated and accessible is your analytics?
- How often does your org rely on analytics to make design decisions?
- How many design decisions are backed by data evidence? To what extent?
- Who in your org can access analytics, and how many people can?
- What happens when the design stakeholders disagree with the implications from analytics?
Measuring the Insight Factor
Start by guesstimating your org’s ability to understand problems, solutions, and people through empathy.
Initial probing questions include:
- How human-centred do you think your org is? Why?
- How much of your research time is spent on understanding problem / solution / context / people?
- How many of your employees can paraphrase your org’s vision and mission in detail?
- What processes, techniques, and tools does your org use to gain insights?
- When and where do you usually get them?
Measuring the Creativity Factor
Probing questions to ask:
- Do your employees like to propose new or different ideas?
- How are those ideas received and/or addressed? What’s the bounce rate?
- What’s the quality of those ideas? How many of them sound good?
- How often does your org find alternative or new ways to approach problems/solutions?
- How much does your org use visual thinking?
How Exactly Does It Work?
Well, that’s super vague, and I intended it to be, because it’s always up to you to find the proper vantage point of measurement for your org. Yes, there are some common themes across orgs, but it’s often more rewarding to pave the way yourself. The means is the end: it’s likely that the exploration, rather than the eventual findings, will benefit your org more.
The one obvious thing to keep in mind is that such a measurement of data, insight, and creativity is inevitably very subjective. Each org can find its own way to measure them, according to its own interpretation.
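As one purely hypothetical interpretation (yours will and should differ), an org could turn its answers to the probing questions above into rough 0-to-2 guesstimates and average them per factor. A sketch, with illustrative questions and numbers:

```python
# Hypothetical scoring sheet. Every probing question gets a rough guesstimate:
# 0 (naive), 1 (mature), or 2 (creative). The questions and numbers below are
# illustrative only, not a prescription.
answers = {
    "data": {
        "How automated and accessible is your analytics?": 1,
        "How often do analytics back design decisions?": 0,
        "Who can access analytics, and how many people?": 1,
    },
    "insight": {
        "How human-centred is the org?": 1,
        "How much research time goes to problem/solution/context/people?": 2,
    },
    "creativity": {
        "Do employees like to propose new or different ideas?": 1,
        "How often are alternative approaches found?": 0,
    },
}


def factor_scores(sheet: dict) -> dict:
    """Average the per-question guesstimates into one rough score per factor."""
    return {factor: round(sum(qs.values()) / len(qs), 2)
            for factor, qs in sheet.items()}


print(factor_scores(answers))
# {'data': 0.67, 'insight': 1.5, 'creativity': 0.5}
```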
The goal of such a measurement is not to compare your org with other orgs. Sometimes, due to the subjective nature of the matter, it’s not even comparable.
The goal is to compare internally: to compare where you are with where you were (and where you want to be), and to compare across your teams.
The cultural profile of your org’s design practice is introspective.
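In that spirit, here is a minimal sketch of the introspective comparison, again with made-up numbers: keep snapshots of the same team’s guesstimates over time and look at the direction of the change, not the absolute values.

```python
# Hypothetical snapshots of the same team's profile, a couple of quarters apart.
# What matters is the direction of each delta, not the absolute numbers.
q1 = {"data": 0.6, "insight": 1.2, "creativity": 0.3}
q3 = {"data": 1.1, "insight": 1.2, "creativity": 0.5}


def compare(before: dict, after: dict) -> None:
    """Print how each factor moved between two internal snapshots."""
    for factor in before:
        delta = after[factor] - before[factor]
        trend = "up" if delta > 0 else ("flat" if delta == 0 else "down")
        print(f"{factor:<11}{before[factor]:.1f} -> {after[factor]:.1f} ({trend})")


compare(q1, q3)
# data       0.6 -> 1.1 (up)
# insight    1.2 -> 1.2 (flat)
# creativity 0.3 -> 0.5 (up)
```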
What’s the Cultural Profile of Your Design Practice?
Here are some wild guesses on well-known organizations:
But as mentioned before, what their profiles look like is irrelevant.
Here’s the only relevant thing: what is your org’s profile?
Additional Notes:
Some may instantly argue: isn’t data also supposed to provide insights? Yes and no. When we talk about data, we commonly think of analytics.
The tricky thing about analytics is that it doesn’t really generate insights, and insights are largely irrelevant to the goal of analytics.
Insight is a human dish, cooked with human judgement.
The goal of analytics is to support decision making. If the analytics doesn’t support decision making, then it doesn’t need to exist.
When the decision maker is a human, she derives insights from analytics by creating models, algorithms, and, most importantly, points of view. The only reason she needs insights is that otherwise she couldn’t synthesize what she knows and make decisions the way any human does. A human’s ability to synthesize goes beyond logic, while analytics offers only logic, because it is a human invention built on logic and logic alone.
Humans need insights to understand and fathom; machines don’t.
When the decision maker is an artificially intelligent machine, insights are irrelevant as long as it can yield the right decisions in a probabilistic manner. Insights only become relevant to analytics when humans intervene and curate the machines. Machines rely on patterns and rules that may be counterintuitive to us, and they stick to a probabilistic worldview, as humans designed them to.
The Turing Test for analytics is about whether and how much it helps with decision making, not about how many insights it generates. In fact, it doesn’t generate insights at all; humans do.