Foreword
The topic of this article is similar to advanced Stable Diffusion dressing and outfit-changing techniques, but here we use a more stable method to change the clothes into exactly what you want.
When making consistent pictures and comics, there are two big problems. One is face consistency, which can be handled with a LoRA; the other is clothing consistency, for which there is currently no general solution. In the past it could only be approached with prompts, but what comes out of a prompt is not necessarily what we want. There is, however, a method that lets us strongly require the AI to draw clothes according to our instructions. It cannot reproduce every pattern on a garment 100%, but in many cases, such as comic strips or novel illustrations, it is good enough.
Problem
We have a picture in which the character wears a white pleated skirt:
Gorgeous Hana
But I want her to change into a red pleated skirt:
Method
At this point we use a method called photo bashing to replace the clothes. First, open an image editor such as Photoshop or GIMP, paste the red pleated skirt onto the character image, and cut away the excess so that it roughly follows the original character's body curve:
Photo bashing: paste the red skirt onto the original picture
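If you prefer to do this compositing step in code rather than in Photoshop or GIMP, here is a minimal sketch using Pillow. The file names (`original.png`, `red_skirt.png`), the resize dimensions, and the paste position are all hypothetical placeholders; you would still normally cut out the skirt layer by hand and tune the placement by eye.

```python
from PIL import Image

# Hypothetical inputs: the original character render, and a cut-out red
# pleated skirt saved with a transparent background (RGBA).
base = Image.open("original.png").convert("RGBA")
skirt = Image.open("red_skirt.png").convert("RGBA")

# Roughly scale and position the skirt over the character's hips;
# these numbers are placeholders to be tuned by eye.
skirt = skirt.resize((300, 220))
position = (260, 540)

# Paste using the skirt's own alpha channel as the mask, so only the
# cut-out skirt pixels cover the original image.
base.paste(skirt, position, mask=skirt)
base.convert("RGB").save("bashed.png")
```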
Then send this composite picture to img2img or Inpaint. In my case, I send the picture to Inpaint and mask the skirt.
In the prompt field, img2img needs the full prompt, while Inpaint only needs to focus on the prompt for the clothing being repainted. In my example that is red short pleated skirt:
(masterpiece, top quality, best quality, official art, beautiful and aesthetic:1.3), extreme detailed, Hana, red short pleated skirt, fantasy &lt;lora:Hana25:0.4&gt;
And use a medium Denoising strength of 0.5:
Next, enable ControlNet, select normal (normal map), and feed it the unmodified original image as the source, so that when repainting, the clothing stays close to the original clothing's shape on the character:
Use the unmodified image as the normal map source
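If you drive the WebUI through its API instead of the browser, the same Inpaint and ControlNet settings can be expressed roughly like this. This is only a sketch: it assumes the AUTOMATIC1111 `/sdapi/v1/img2img` endpoint with the ControlNet extension installed, and the exact field names, preprocessor name, and model name (`control_v11p_sd15_normalbae`) may differ between versions. `bashed.png`, `skirt_mask.png`, and `original.png` are hypothetical file names.

```python
import base64
import requests

def b64(path: str) -> str:
    # Encode an image file as base64 for the WebUI API.
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")

# Rough sketch of an Inpaint request. "bashed.png" is the photo-bashed
# image, "skirt_mask.png" is a white-on-black mask of the skirt, and
# "original.png" is the untouched source used for the normal map.
payload = {
    "init_images": [b64("bashed.png")],
    "mask": b64("skirt_mask.png"),
    "prompt": "(masterpiece, top quality, best quality, official art, "
              "beautiful and aesthetic:1.3), extreme detailed, Hana, "
              "red short pleated skirt, fantasy <lora:Hana25:0.4>",
    "denoising_strength": 0.5,
    "inpainting_fill": 1,        # fill the masked area from the original
    "inpaint_full_res": True,    # repaint only the masked region
    "alwayson_scripts": {
        "controlnet": {
            "args": [{
                "input_image": b64("original.png"),  # unmodified original
                "module": "normal_bae",
                "model": "control_v11p_sd15_normalbae",
                "weight": 1.0,
            }]
        }
    },
}

resp = requests.post("http://127.0.0.1:7860/sdapi/v1/img2img", json=payload)
resp.raise_for_status()
```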
Finally, use Loopback to feed the result back in and run the image repeatedly, letting the AI redraw the same area over and over. Use a high Final denoising strength of 0.75 so the AI has enough freedom to repaint the image, while the ControlNet normal map constrains the shape of the redrawn item:
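Loopback in the UI simply feeds each result back in as the next input. If you are scripting it yourself, a manual equivalent looks roughly like this, continuing from the `payload` sketched above. Stepping the denoising strength from 0.5 toward a final 0.75 over four rounds is only an approximation of what the Loopback script's "Final denoising strength" setting does.

```python
import base64
import requests

def run_img2img(payload: dict) -> str:
    # POST the payload to the WebUI's img2img endpoint (assumed local URL)
    # and return the first generated image as a base64 string.
    resp = requests.post("http://127.0.0.1:7860/sdapi/v1/img2img", json=payload)
    resp.raise_for_status()
    return resp.json()["images"][0]

# Manual loopback sketch: each round's output becomes the next round's input.
loops = 4
start, final = 0.5, 0.75
current = payload["init_images"][0]   # start from the photo-bashed image

for i in range(loops):
    payload["init_images"] = [current]
    payload["denoising_strength"] = start + (final - start) * i / (loops - 1)
    current = run_img2img(payload)

with open("result.png", "wb") as f:
    f.write(base64.b64decode(current))
```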
Then you can start generating! Among the resulting images, we can pick whichever result we want:
You can see that in the first two loops the pasted clothes still lack natural fabric texture and shading, but by the third and fourth loops the clothing is already very stable!
This method is best suited to cases where the new clothes are similar in shape to the original but differ in color. If the new clothes are very different from the original, for example changing a tight skirt into a pleated skirt, you can try lowering the Control Weight of the ControlNet normal unit while increasing the number of Loopback rounds, then picking a satisfactory result.
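In the API sketches above, that adjustment is just a couple of changed values; the numbers here are only a starting point to experiment with, not recommended settings.

```python
# Looser shape constraint, more refinement rounds (tune against your images).
payload["alwayson_scripts"]["controlnet"]["args"][0]["weight"] = 0.6
loops = 8
```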
I wish you all happy AI generating!