People ask me about this constantly, so let me just write it down properly. No theory, no positioning, just the actual process as it currently exists.
I've been using Weavy as a thinking tool in branding projects for a while now. Not to generate final assets — I want to be clear about that upfront, because the conversation around AI in design gets muddled quickly. Weavy doesn't design anything. It gives me a way to explore visual territory at a speed that wasn't possible before, which changes the early phases of a project significantly.
Here's the workflow from the beginning.
Before the project starts: the brief
Before I open any design tool, I spend time with the brief. Sometimes a client sends something written. Often it's a conversation. Either way, I'm trying to extract three things: what the brand needs to communicate, who it needs to communicate it to, and what feeling it needs to leave.
That last one — the feeling — is the hardest to articulate and the most important to get right. A brand for a fintech app aimed at first-generation immigrants has to feel different from a brand for a premium B2B SaaS tool, even if the visual styles seem superficially similar. The feeling is what makes them distinct.
Day 1: Finding the direction with Weavy
Before I touch Figma, I spend two to three hours in Weavy. The prompts at this stage are abstract and emotional, not literal. I'm not prompting for "logo for a CRM company." I'm prompting for the feeling of the brand — textures, light, mood, atmosphere.
Something like: "matte surface, warm afternoon light, objects arranged with intention, quiet competence." Or: "dark water, the weight of something important happening, controlled tension." These prompts don't describe the brand; they describe how the brand should make someone feel.
I generate a lot. Forty, fifty images in a session. I move quickly, sorting into yes and no without overthinking. The yes pile doesn't give me answers, but it shows me a direction. A palette starting to emerge. A texture that keeps appearing. A sense of weight, or lightness, or energy that I now know I'm working toward.
Day 1–2: Building the reference
I pull the Weavy images I want into Figma alongside real-world references — typeface specimens from foundries I'm considering, color systems from brands I respect, photography that lives in the same emotional territory. The mood board starts to have its own logic.
This stage is about committing. By the end of day two, I know what direction I'm going in, and I know why. That clarity is the foundation for everything that follows.
Day 3 onwards: Actual design
Figma takes over completely. The logo, the color system, the type hierarchy, the components — everything from here is made by hand. The AI served its purpose in the thinking phase. The design is mine.
I do use Claude during this phase, but for different things: drafting brand voice guidelines, writing positioning statements, helping a client who doesn't have a copywriter get their first website copy to a usable state. It's a different kind of support.
What AI genuinely cannot do
It doesn't know your client's competitors. It doesn't understand the cultural nuance of their market. It can't make the call about which direction is brave enough to stand out but grounded enough to sell internally. It can't tell you when a client's instinct is wrong and how to redirect them diplomatically.
All of that is still the job. AI just changed how I get to the starting line.
