
AI and good design

#61 - May 2023

Making a product "understandable" is one of the principles of good design. An understandable product is self-explanatory. It is predictable.

We achieve this by aligning user expectations with the intended behavior of our product. It means ensuring we cover common affordances throughout the product experience. But with the advent of AI-driven products, how can we ensure we apply this principle?

One of the common claims about AI is that it is hard to understand. Part of why people are amazed by AI progress is precisely that it surprises us with unexpected results, beyond our initial expectations. Take AI-generative art tools or ChatGPT: there is a "wow" effect mixed with "How does it know this?".

While this excitement enhances the experience, it is also becoming a design challenge. AI encompasses prediction algorithms and concepts that most people don't understand today. And while you might argue that they shouldn't need to understand this complexity, predicting how a product will behave is a proxy for customer trust. In simple terms: if customers do not trust your product, they will not use it.

I see three foundational areas to consider in this new technology iteration:

Education - informing people about what AI is, its basic concepts, and the potential of this technology.
Context - improving experiences by giving visibility into the key factors used to make a decision.
Transparency - explaining how data is handled, how bias is prevented, and other ethical considerations.

Trust will continue to be the backbone of good product experiences. Design for trust.

💡
For more ideas like this one, I encourage you to subscribe here or follow me on Twitter and stay up-to-date with the latest posts. If you’re already a subscriber, thanks for being part of this journey!