Building a Wellness App with AI: Part 2
Getting Figma and Claude Code to talk (and what I learned the hard way)
Summary: After discovering AI could turn Figma designs into working code, this designer learned the hard way that while setup is fairly simple, getting consistent results requires completely rethinking how you organize and name design elements in Figma. It turns out your design teacher yelling at you to name your layers was right. "Frame 124567" is useless to AI, but "breathing-guide-svg" works like magic.
This is part 2 of my series documenting how I'm building a wellness app using AI-assisted development. If you missed the backstory about why I'm doing this, check out part 1.
Okay, so after that first test where Claude Code somehow turned my Figma design into working code with really good accuracy, I was obviously super intrigued. But you know how it is with shiny new tools: there's always that moment where you have to figure out whether it was just beginner's luck or whether this thing actually works consistently.
Spoiler alert: it works, but there are definitely some things I wish I'd known before diving in headfirst.
The Setup
Getting this whole thing running is way simpler than I expected. The tricky part is learning how to design in a way that doesn't confuse the AI.
*If you aren’t comfortable with the terminal, I don’t recommend using my setup as a guide. Instead, consider something like Cursor or VS Code with Copilot; Figma’s instructions for setting up MCP with those can be found here. I decided to use Claude Code with the Claude Code plugin for VS Code (that is a lot of Codes…), but Figma MCP also works with Cursor, VS Code/GitHub Copilot, and Windsurf.
Setting Up Claude Code
First things first, you need Node.js and npm installed. Then you can install Claude Code following the instructions from Anthropic. It's pretty standard command-line stuff, nothing too scary.
From there, create a folder where your project will live. In the terminal, cd into the project folder and start Claude Code:
cd project_folder
claude
Then tell Claude Code to initialize the project with the /init command.
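Putting those steps together, the whole terminal session looks roughly like this (this assumes the npm global install that Anthropic's instructions describe, and "wellness_app" is just my placeholder folder name):

```shell
# Install Claude Code globally (requires Node.js and npm)
npm install -g @anthropic-ai/claude-code

# Create a project folder and start Claude Code inside it
mkdir wellness_app && cd wellness_app
claude

# Then, inside the Claude Code session itself, run:
# > /init
```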
Once you are past this point, it’s time to set up Figma and tell Claude Code to use it.
In Figma, open your design file and go to Figma Menu ➡️ Preferences, then look for Enable Dev Mode MCP Server. Checking this starts the MCP server.
Finally go back to Claude Code and add the MCP server with
claude mcp add --transport sse figma-dev-mode-mcp-server http://127.0.0.1:3845/sse
and that is it.
Well, almost. For some reason I needed to go back into Figma and restart the MCP server before everything connected.
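If the connection seems off, it's worth checking whether Claude Code actually sees the server before restarting things in Figma. Claude Code's CLI has mcp list and mcp remove subcommands for exactly this (the URL below is the same local SSE endpoint Figma's desktop app exposes):

```shell
# List configured MCP servers and their connection status
claude mcp list

# If the Figma server is missing or misconfigured, remove and re-add it
claude mcp remove figma-dev-mode-mcp-server
claude mcp add --transport sse figma-dev-mode-mcp-server http://127.0.0.1:3845/sse
```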
From here I asked Claude Code to create a basic Hello World Flutter app so I had something to work with.
How Figma MCP Actually Works
Here's something I didn't realize at first: Figma's MCP server doesn't just screenshot your designs and hope for the best. It actually reads the structure, understands relationships between elements, and picks up on some of your design decisions. Pretty cool.
But… it reads your Figma files like a developer would, not like a designer. And that changes everything about how you need to organize your work.
Designing Files That AI Can Actually Use
Remember how I mentioned my auto-layout wasn't quite right in that first test? That wasn't just a minor detail, it's actually crucial to how well the AI interprets your designs.
Frame Structure Matters
I discovered that Claude Code works best (at least in my situation) when my Figma files follow a clear hierarchy:
Pages in Figma should represent different sections or flows of your app. Frames should be actual screens or major components. And EVERYTHING needs descriptive names (more on this below).
For my wellness app, I structured it like this:
Design System page (colors, typography, components)
User Flow page (main app screens)
Naming Is Everything
Here's something that took me a few failures to learn: generic names like "Frame 18555675" or "Rectangle 124567" are useless to the AI. But descriptive names like "breathing-exercise-card" or "navigation-tab-active" give Claude Code the context it needs to generate accurate code.
I started going through my designs and renaming everything with intention:
primary-button instead of Button
meditation-timer-display instead of Group 47
breathing-guide-circle instead of Ellipse 3
The difference in AI interpretation was dramatic.
Auto-Layout: Your New Best Friend
If you've been avoiding Figma's auto-layout feature, now's the time to embrace it. Claude Code understands auto-layout constraints and translates them into mostly correct layout code.
I spent a bit of time converting all my manually-positioned elements to auto-layout, and the payoff was immediate. The AI went from generating code that kind of worked to code that actually handled different screen sizes properly.
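To give a concrete sense of the mapping (this is my sketch, not actual Claude Code output): a Figma frame with vertical auto-layout, a 16px gap, and 24px padding corresponds roughly to a Flutter Column like this. BreathingGuideCircle and PrimaryButton are hypothetical widgets named after their Figma layers.

```dart
// Rough Flutter equivalent of a vertical auto-layout frame in Figma:
// direction = vertical, gap = 16, padding = 24, children stretch to fill.
Padding(
  padding: const EdgeInsets.all(24),
  child: Column(
    crossAxisAlignment: CrossAxisAlignment.stretch,
    children: const [
      Text('Breathing Exercise'),
      SizedBox(height: 16), // the auto-layout gap between items
      BreathingGuideCircle(), // hypothetical widget, named after its Figma layer
      SizedBox(height: 16),
      PrimaryButton(label: 'Start'), // hypothetical component
    ],
  ),
)
```

Manually-positioned elements, by contrast, tend to come back as absolutely-positioned Stack children, which is exactly the code that breaks on different screen sizes.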
My First Real Test: The Breathing Exercise Screen
With my setup dialed in, I decided to build the core interface of my app: a guided breathing exercise screen.
I quickly designed the screen in Figma with proper naming and auto-layout. When the design was complete, I made sure it was selected, returned to Claude Code, and fed it this prompt:
"Create a breathing exercise screen based on the design selected in Figma. Be exact."
The Results
The good news: Claude Code nailed the layout and styling. The visual design was nearly pixel-perfect for a simple screen.
The not-so-good news: Claude Code decided to recreate an SVG I had used in the design file. It actually nailed it, but I wanted it to use my SVG. It also started to implement some of the interaction and functionality, something I didn’t want it to do just yet.
This taught me something important: AI is incredible at implementation, but it needs human insight for code organization (and experience design, but more on that in Part 3). The code worked perfectly, although it was disorganized and could use A LOT of refactoring.
Learning to Guide, Not Just Generate
From here I added the SVG to the correct folder in my project and prompted Claude to use it instead of the one it had created. I also asked it to create an app theme, because it had hardcoded various design choices, like color and typography, into the code. I didn’t want to deal with that mess later on.
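For anyone wondering what "create an app theme" means in Flutter terms, it's roughly a sketch like this — one ThemeData that screens read from instead of hardcoding values. The colors and type sizes here are placeholders, not my actual palette:

```dart
import 'package:flutter/material.dart';

// Centralized theme so individual screens never hardcode colors or typography.
// All values below are placeholders, not the app's real design tokens.
final ThemeData appTheme = ThemeData(
  colorScheme: ColorScheme.fromSeed(seedColor: const Color(0xFF4A90D9)),
  textTheme: const TextTheme(
    headlineMedium: TextStyle(fontSize: 28, fontWeight: FontWeight.w600),
    bodyMedium: TextStyle(fontSize: 16, height: 1.5),
  ),
);

// Wired up once at the app root; widgets then use Theme.of(context):
// MaterialApp(theme: appTheme, home: BreathingExerciseScreen());
```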
It did all of these things without a hiccup.
What I'm Learning About This Workflow
A few days into this experiment, here's what's becoming clear:
Design skills matter more, not less. The AI can implement anything you can design well, which means the quality of your thinking becomes the bottleneck.
System thinking pays off. Building a proper design system in Figma translates to better, more consistent code output.
Iteration is faster than ever. Instead of waiting days for developer feedback, I can test design changes in minutes.
Although it can create working code, it may not be perfect code. I think you need to have some level of development knowledge to understand where it goes wrong or how to prompt it to do better.
Up Next: Adding Interaction and Functionality
In the next post, I’ll dive into my experiences with implementing the rest of the screens, animations, and other functionality.
What questions do you have about AI-assisted development? Drop them in the comments. I'm documenting everything and happy to explore specific challenges in future posts.