Mastering Visual Consistency in AI

AI Art Director · Synthetography Specialist
One of the biggest hurdles in generative AI has always been character consistency: you generate a stunning character, but in the next prompt they look like a stranger. The introduction of --cref (Character Reference) in Midjourney v6 changed everything for storytelling and branding.
1. The Workflow: Anchor & Iterate
The professional workflow starts with an "Anchor Image": your definitive reference. Do not use an image from a long chain of variations; use the original upscale. Copy its URL and treat it as the source of truth for all future generations.
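The anchor-and-iterate idea can be sketched as a tiny helper that appends the same reference URL to every scene description. This is a minimal sketch: `ANCHOR_URL` and `build_prompt` are illustrative names, the URL is a placeholder, and the output is a prompt string you would paste into Midjourney, not a call to any official API.

```python
# Placeholder anchor URL -- replace with the URL of your original upscale.
ANCHOR_URL = "https://example.com/anchor-upscale.png"

def build_prompt(scene: str, cref_url: str = ANCHOR_URL) -> str:
    """Append the character reference to a scene description."""
    return f"{scene} --cref {cref_url}"

print(build_prompt("knight walking through a rainy neon city"))
```

Because the URL is fixed in one place, every prompt in a project pulls identity from the same anchor instead of re-describing the character.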
❌ Bad Practice
Describing the character again in text: "Blue eyes, scar on cheek, blonde hair..." (the AI interprets this differently every time).
✔️ Good Practice
Using the reference: --cref https://url.... The AI "sees" the exact features instead of guessing from text.
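The good practice scales naturally to a whole storyboard: reuse one reference URL across many scenes so the identity stays fixed while only the text changes. A minimal sketch, with a placeholder URL and no real Midjourney API involved:

```python
# One anchor, many scenes: identity comes from --cref, setting from text.
cref = "https://example.com/anchor-upscale.png"
scenes = [
    "portrait in a medieval tavern, candlelight",
    "running across a desert at noon, wide shot",
]
prompts = [f"{scene} --cref {cref}" for scene in scenes]
for prompt in prompts:
    print(prompt)
```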
2. Managing Influence with --cw
The Character Weight (--cw) parameter ranges from 0 to 100 and determines how much of the reference character is carried over into the new image.
- --cw 100 (Default): Copies face, hair, and outfit.
- --cw 0: Copies ONLY the face. Allows you to change outfits and hairstyles completely.
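A quick way to see the difference is a --cw sweep: generate the same scene at several weights and compare how strongly the outfit and hairstyle follow the reference. Again a sketch with a placeholder URL; these are prompt strings, not API calls.

```python
# Sweep --cw from face-only (0) to full copy (100) for one scene.
cref = "https://example.com/anchor-upscale.png"
scene = "astronaut suit, standing on a lunar base"
prompts = [f"{scene} --cref {cref} --cw {cw}" for cw in (0, 50, 100)]
for prompt in prompts:
    print(prompt)
```

At --cw 0 expect the outfit to follow the text ("astronaut suit") while the face stays consistent; at --cw 100 expect the reference outfit to compete with the text.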
Key Takeaway: Text prompts control the *scene*, while --cref controls the *identity*.