v0.1.0
MothTaste is a pattern. Patterns can be learned.
You say: "experimental layout, harmonious color, newjack type" → [ DESIGN AGENT ] knows what those terms mean to you → Output: coherent design you couldn't have imagined, yet recognizably yours
The agent starts as a blank slate. You define the terms. You feed it knowledge. You show it images. You teach it what your eye sees. Then it creates like you, to your vision.
You teach it your references. You approve or reject its output. Over time, it learns to work in your shadow: a junior design agent trained on your vision.
The agent stores session history and user feedback. Confidence scores adjust based on approvals and rejections. Relevant learnings are retrieved for new briefs.
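The confidence mechanism can be sketched as follows. This is a minimal illustration of the approve/reject adjustment described above; the class and method names are hypothetical, not the actual agent API.

```python
from dataclasses import dataclass

@dataclass
class Learning:
    """One stored pattern with a confidence score (illustrative shape)."""
    pattern: str
    confidence: float = 0.5  # starts neutral
    approvals: int = 0
    rejections: int = 0

    def record_feedback(self, approved: bool, step: float = 0.1) -> None:
        # Nudge confidence toward 1.0 on approval, toward 0.0 on rejection,
        # clamped so it stays a valid score.
        if approved:
            self.approvals += 1
            self.confidence = min(1.0, self.confidence + step)
        else:
            self.rejections += 1
            self.confidence = max(0.0, self.confidence - step)

def retrieve(learnings: list[Learning], min_confidence: float = 0.6) -> list[Learning]:
    """Return only learnings trusted enough to apply to a new brief."""
    return [l for l in learnings if l.confidence >= min_confidence]
```

Retrieval then becomes a simple filter: only patterns that have survived enough approvals are surfaced for new briefs.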
[ KNOWLEDGE LAYER ]
  YOUR TERMS
    "experimental layout" → what it means to you + techniques + anti-patterns
    "harmonious color"    → how you use it + references + compatibilities
    "newjack typeface"    → foundries you like + characteristics + pairings
  RULE-BREAKING
    Müller-Brockmann rule → when to break → how to break well
    Every violation is intentional, documented, anchored
  BOOK KNOWLEDGE (current)
    Müller-Brockmann (Grid Systems) → 88 layout principles
    Emil Ruder (Typography)         → 137 type principles
    Alina Wheeler (Branding)        → 134 branding principles
    + add more via training pipeline
  STYLE PROFILES (current)
    swiss_modernism, retro_70s, studio_profile_template
    + define your own
  REFERENCE BANK (current)
    Eagle + Are.na integration ready
    + tag images with your terms to build
        ↓
[ AgentCore™ ]
  IDEATION (generate concepts) → EXECUTION (produce Figma JSON) → CRITIQUE (evaluate) → ↺ LEARNING
  Stores successful patterns, failed attempts, feedback
  Adjusts based on approvals and rejections
        ↓
[ OUTPUT LAYER ]
  Figma Plugin | Moodboards from Eagle | HTML/CSS Prototypes
The agent takes briefs, pulls relevant knowledge, generates work, critiques itself, learns from corrections, and repeats until the output meets the criteria.
[ DESIGN BRIEF ]
        ↓
[ KNOWLEDGE RETRIEVAL ]
  Foundation principles (always)
  Style profile (if specified)
  Relevant past learnings
        ↓
[ DESIGN REASONING ]
  Analyze brief requirements
  Plan approach using skills
  Generate initial design
        ↓
[ REFLECTION LOOP ]
  Self-critique against principles
  Identify weaknesses
  Generate improved version
  Repeat until satisfactory
        ↓
[ OUTPUT & LEARNING ]
  Generate final output (Figma JSON / Image)
  Store learnings in memory
  Update skill proficiency
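The flow above can be sketched as a single loop. This is a hedged outline only: the `agent` object and its `retrieve`/`reason`/`critique`/`refine`/`learn` methods are hypothetical names standing in for the real components, not the actual API.

```python
def run_brief(brief: str, agent, max_iterations: int = 3) -> dict:
    """Brief -> retrieval -> reasoning -> reflection loop -> output & learning."""
    knowledge = agent.retrieve(brief)           # foundations + style + past learnings
    design = agent.reason(brief, knowledge)     # analyze, plan, generate initial design

    for _ in range(max_iterations):             # reflection loop
        critique = agent.critique(design, knowledge)
        if critique["satisfactory"]:
            break
        design = agent.refine(design, critique)  # apply fixes, try again

    agent.learn(brief, design)                  # store learnings, update proficiency
    return design
```

The `max_iterations` cap is a design choice: reflection loops need a stopping condition even when the critique never reaches "satisfactory."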
All extracted principles are stored in the skills/ and knowledge/ directories, organized by domain and source.
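A stored principle entry might look like the fragment below. The exact schema is not documented here, so every field name is illustrative only.

```json
{
  "id": "layout_012",
  "source": "Müller-Brockmann, Grid Systems",
  "domain": "layout",
  "principle": "Keep columns narrow enough for comfortable line lengths.",
  "tags": ["grid", "readability"],
  "proficiency": 0.5
}
```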
--style swissmodernism, --style Lubalin, --style PaulRand, --style brianokarskistudio

The vision engine learns from your curated images (pulling from Are.na and Eagle), using Gemini and visual LLMs for image-coherence analysis, style interpretation, and content classification.
The training pipeline is Python-based and handles how knowledge enters the system: book extraction, image analysis, pattern recognition. It uses Gemini for vision tasks. This is one reason to keep Python in the stack even if the main agent moves to TypeScript.
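The staged flow of ingestion can be sketched as below. The stage functions are toy stand-ins to show the shape of a composable pipeline; they are not the real extraction or pattern-recognition logic (which uses Gemini).

```python
from typing import Callable

# Hypothetical stage signature: each stage takes and returns a training record.
Stage = Callable[[dict], dict]

def run_pipeline(record: dict, stages: list[Stage]) -> dict:
    """Pass a raw input (book page, image metadata) through each ingestion stage."""
    for stage in stages:
        record = stage(record)
    return record

def extract_text(record: dict) -> dict:
    # Stand-in for book extraction: normalize raw text.
    record["text"] = record.get("raw", "").strip().lower()
    return record

def find_patterns(record: dict) -> dict:
    # Toy pattern recognition: flag design keywords found in the text.
    keywords = ("grid", "contrast", "hierarchy")
    record["patterns"] = [k for k in keywords if k in record.get("text", "")]
    return record
```

Structuring ingestion as an ordered list of stages keeps each step independently testable and makes it easy to add new sources later.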
eagle_extractor.py, pattern_analyzer.py, style_synthesizer.py, book_processor.py

The reasoning engine analyzes briefs, selects relevant principles, plans the approach, and generates the work in a structured four-step process.
The reflection engine critiques output against principles, scoring layout, type, color, and concept before identifying problems and applying fixes.
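A minimal sketch of that scoring step, assuming 0-10 scores per dimension and a fixed pass threshold (both assumptions; the real engine's scale and criteria are not shown here):

```python
def critique(scores: dict[str, float], threshold: float = 7.0) -> dict:
    """Score a design per dimension (layout, type, color, concept) and
    flag any dimension below the threshold as a weakness to fix."""
    weaknesses = [dim for dim, score in scores.items() if score < threshold]
    overall = sum(scores.values()) / len(scores)
    return {
        "overall": overall,
        "weaknesses": weaknesses,
        "satisfactory": not weaknesses,
    }
```

The reflection loop then targets only the flagged dimensions rather than regenerating the whole design.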
The memory palace stores sessions, extracts learnings, and tracks approvals, rejections, and adjustments while updating confidence scores over time.
The skill registry maps deliverables to skills (logo work requires typography and branding; layout requires grid and composition), with book-extracted principles attached to each.
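That mapping can be illustrated with plain dictionaries. The deliverable and skill names below follow the examples above; the data structure itself is a hypothetical sketch, not the registry's actual schema.

```python
# Deliverable -> required skills (examples from the text above).
SKILL_MAP = {
    "logo": ["typography", "branding"],
    "layout": ["grid", "composition"],
}

# Skill -> book-extracted principle sources attached to it.
PRINCIPLES = {
    "grid": ["Müller-Brockmann: Grid Systems"],
    "typography": ["Emil Ruder: Typography"],
    "branding": ["Alina Wheeler: Designing Brand Identity"],
}

def principles_for(deliverable: str) -> list[str]:
    """Collect every principle source needed for a given deliverable."""
    sources: list[str] = []
    for skill in SKILL_MAP.get(deliverable, []):
        sources.extend(PRINCIPLES.get(skill, []))
    return sources
```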
Moth is the TypeScript AI reasoning layer: a standalone Node.js application (not a Figma plugin) that handles ideation, execution, and critique. It uses GPT-4 and outputs JSON specs that can be executed by the Figma plugin or other output channels. It is currently a parallel implementation to the Python agent; an architecture decision is pending on whether to merge the two or keep both with shared knowledge.
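A JSON spec of the kind the agent emits might look like the fragment below. The schema is not documented here, so every field name is an illustrative guess at the shape, not the plugin's actual contract.

```json
{
  "type": "frame",
  "name": "poster-concept-01",
  "layout": { "grid": "3-column", "margin": 64 },
  "children": [
    {
      "type": "text",
      "content": "MOTH",
      "style": { "font": "Helvetica Now", "size": 96 }
    }
  ]
}
```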
The Figma plugin serves as the output layer, executing designs directly in your canvas.
Work in progress; these are example capabilities.
You define terms. The agent learns them based on what you teach it. When you say "curious color" in a brief, it knows exactly what you mean: your definition, your references, your Pantone swatches, whatever.
These are examples; you define your own.
--experimental, --strict_grid, --editorial, --harmonious, --curious, --restrained, --classic, --newjack, --expressive
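A term definition in vocabulary.yaml might look like the entry below. This is an illustrative sketch only; the real file's schema is not shown in this document.

```yaml
# vocabulary.yaml (illustrative entry; actual schema may differ)
curious:
  meaning: "unexpected but deliberate color pairings; never muddy"
  techniques:
    - "one near-clash accent against a quiet base palette"
  references:
    - eagle_tag: "curious-color"
  anti_patterns:
    - "rainbow gradients"
```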
agent/ - Python agent (prototype)
├── core.py            Main agent orchestration
├── reasoning.py       Design thinking layer
├── reflection.py      Self-critique system
├── knowledge.py       Knowledge management
├── memory.py          Long-term learning
├── skills.py          Skill registry
└── tools/
    ├── figma_mcp.py   Figma integration
    └── image_gen.py   Image generation

design-agent/ - TypeScript agent + Figma plugin
├── figma-plugin/      Plugin source
├── src/agent/         MothAgent (ideate, execute, critique)
├── src/tokens/        Three-layer token system
├── src/motion/        Animation system
└── guidelines/        Design knowledge (MD format)

knowledge/
├── foundations/       Universal principles (always loaded)
├── studio/            YOUR design language
│   ├── vocabulary.yaml     Terms you use in briefs
│   └── rule_breaking.yaml  Intentional violations
├── rules/             Evaluation criteria
└── books/             Extracted book knowledge (JSON)

skills/
├── layout/            88 principles (Müller-Brockmann)
├── typography/        137 principles (Emil Ruder)
└── branding/          134 principles (Wheeler)

styles/
├── studio_profile_template/
├── swiss_modernism/
└── retro_70s/

training/ - Knowledge ingestion pipeline
├── book_processor.py          PDF → principles (text PDFs)
├── scanned_book_processor.py  PDF → principles (image PDFs)
├── eagle_extractor.py         Eagle → training data
├── pattern_analyzer.py        Find patterns in clusters
├── style_synthesizer.py       Auto-generate style profiles
├── moodboard.py               Create moodboards from refs
└── pipeline.py                Orchestrate full training

arena x eagle/ - Visual reference management
├── main.py            Interactive menu
├── ai_vision.py       Image analysis (Gemini)
└── config.py          API configuration
Major revision. Added Three Products (Studio Junior Designer, White-label Brand Agent, Effects Plugin). Added VisionEngine™, AgentCore™, Memory Palace, Training Pipeline sections. Clarified Moth Agent vs Moth Figma Plugin architecture. Added Style Codes, inline monospace for commands. Linked Are.na + Eagle. Reorganized Current Status with architecture decision. New tagline. Mobile responsive. Refined copy throughout.
Initial documentation website with project overview, architecture diagrams, component descriptions, and project status.
Peace