Prompt

Midjourney: a cozy little cabin with a colorful roof at the top of a cloudy peak, a long twisting slide leads down the side of the peak, a small garden is planted around the house --ar 4:3

Theme

This week's theme is your dream home. Where would you want to live if your imagination is the only limit? A cozy cabin in the middle of nowhere? The gigantic pillow fortress you dreamed of when you were little? Or maybe a majestic tower on an asteroid drifting through space?
Go with whatever you, or your inner child, would love to live in :)

Rules:

  • Follow the community’s rules above all else
  • One comment and image per user
  • Embed image directly in the post (no external link)
  • Workflow/Prompt sharing encouraged (we’re all here for fun and learning)
  • Posts that are tied will both get the points
  • The challenge runs for 7 days from now
  • Downvotes will not be counted

Scores

At the end of the challenge each post will be scored:

| Prize | Points |
| --- | --- |
| Most upvoted | +3 points |
| Second most upvoted | +2 points |
| Third most upvoted | +1 point |
| OP’s favorite | +1 point |
| Most original | +1 point |
| Last two entries (to compensate for less time to vote) | +1 point |
| Prompt and workflow included | +1 point |

The winner gets to pick next theme! Have fun everyone!

Previous entries

  • Itrytoblenderrender@lemmy.world · 7 months ago

    My Little hobbit home.

    Made with ComfyUI. The workflow is embedded in this picture (Catbox).

    Details

    Base Model (first image) : fenrisxl_V16fp16 (CivitAI)

    Outpainting Model: juggerxllnpaint_juggerinpaintV8 (CivitAI)

    Like in The Hobbit, it all starts with a door:

    Prompt:

    frontal closeup Photo of A beautiful round wooden entrance door to a cozy hobbit burrow. A rune is carved at the bottom of the door. The door is painted blue.

    Sampler: dpmpp_3m_sde

    Scheduler: karras

    Steps: 60

    CFG: 7
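
    For anyone who wants to try these settings outside ComfyUI, here is a rough diffusers sketch of the door generation. The checkpoint path is a placeholder, and diffusers has no exact dpmpp_3m_sde equivalent, so the scheduler below is only a close stand-in:

    ```python
    # Rough diffusers equivalent of the initial door generation (a sketch, not
    # the exact ComfyUI workflow). The checkpoint path is a placeholder for the
    # fenrisxl_V16fp16 file downloaded from CivitAI.
    import torch
    from diffusers import StableDiffusionXLPipeline, DPMSolverMultistepScheduler

    pipe = StableDiffusionXLPipeline.from_single_file(
        "fenrisxl_V16fp16.safetensors",  # placeholder local path
        torch_dtype=torch.float16,
    ).to("cuda")

    # Closest stock stand-in for ComfyUI's dpmpp_3m_sde + karras:
    # DPM++ SDE multistep with Karras sigmas (results will differ slightly).
    pipe.scheduler = DPMSolverMultistepScheduler.from_config(
        pipe.scheduler.config,
        algorithm_type="sde-dpmsolver++",
        use_karras_sigmas=True,
    )

    image = pipe(
        prompt=(
            "frontal closeup Photo of A beautiful round wooden entrance door "
            "to a cozy hobbit burrow. A rune is carved at the bottom of the door. "
            "The door is painted blue."
        ),
        num_inference_steps=60,
        guidance_scale=7.0,
    ).images[0]
    image.save("door.png")
    ```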

    But what is a door without a home? So we build the home around the door via outpainting.

    Prompt:

    frontal closeup Photo of A cozy hobbit burrow.
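
    If you want to reproduce an outpainting step like this without ComfyUI, the sketch below pads the canvas, masks only the new border, and runs an SDXL inpainting checkpoint over it. The file names, padding size, and step count are placeholders, not the settings used here:

    ```python
    # One outpainting pass, sketched with diffusers: extend the canvas and
    # inpaint only the new border. Paths and numbers are placeholders.
    import torch
    from PIL import Image, ImageOps
    from diffusers import StableDiffusionXLInpaintPipeline

    pipe = StableDiffusionXLInpaintPipeline.from_single_file(
        "juggerxllnpaint_juggerinpaintV8.safetensors",  # placeholder local path
        torch_dtype=torch.float16,
    ).to("cuda")

    door = Image.open("door.png").convert("RGB")
    pad = 256  # how far to extend on each side (placeholder)

    # Padded canvas: the original image centered, neutral gray in the new area.
    canvas = ImageOps.expand(door, border=pad, fill=(127, 127, 127))

    # Mask: white (regenerate) on the border, black (keep) over the original image.
    mask = Image.new("L", canvas.size, 255)
    mask.paste(0, (pad, pad, pad + door.width, pad + door.height))

    outpainted = pipe(
        prompt="frontal closeup Photo of A cozy hobbit burrow.",
        image=canvas,
        mask_image=mask,
        num_inference_steps=30,
        guidance_scale=7.0,
        width=canvas.width,
        height=canvas.height,
    ).images[0]
    outpainted.save("burrow.png")
    ```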

    A nice, cozy home needs some flowers. So let's give the home a nice flower garden:

    Prompt:

    beautiful flower Garden

    But where should our hobbit home be located? I think a forest would be nice.

    Prompt:

    Beautiful forest

    To reduce the strain on my graphics card, I am scaling the image back down to 1024x1024 for the next steps.
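
    The downscale itself is just a plain image resize, e.g. with PIL (file names below are placeholders):

    ```python
    # Downscale the intermediate result to 1024x1024 before the next outpaint.
    from PIL import Image

    img = Image.open("outpaint_intermediate.png")  # placeholder file name
    img.resize((1024, 1024), Image.LANCZOS).save("outpaint_1024.png")
    ```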

    A home needs a nice fence, so let's add a nice picket fence.

    Prompt:

    Intricate detailed beautiful picket fence.

    The finishing touch should be a nice sky with clouds. This brings us back to the final image:

    • theUnlikely@sopuli.xyzM · 7 months ago (edited)

      Okay soooooo, that took a lot longer than I anticipated, but I think I got it. It seems to be a problem with the VAE encoding process, and it can be handled with the ImageCompositeMasked node, which combines the padded image with the new outpainted area so that the pre-outpainted area isn’t affected by the VAE. I learned this here: https://youtu.be/ufzN6dSEfrw?si=4w4vjQTfbSozFC6F&t=498. The whole video is quite useful, but the part I linked to is where he talks about that problem.
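
      Outside of ComfyUI, the same idea can be sketched in a few lines of PIL: paste the untouched source pixels back over the decoded result so only the newly outpainted border comes from the VAE round trip (file names and the padding offset are placeholders):

      ```python
      # Rough equivalent of the ImageCompositeMasked step, sketched with PIL.
      # ComfyUI composites with an arbitrary mask; for a rectangular outpaint
      # border, a simple paste at the known offset does the same job.
      from PIL import Image

      pad = 256  # must match the padding used for this outpaint pass (placeholder)
      decoded = Image.open("outpaint_decoded.png")  # full decoded outpaint result
      original = Image.open("previous_step.png")    # image before this outpaint pass

      result = decoded.copy()
      result.paste(original, (pad, pad))  # restore the pre-outpainted area untouched
      result.save("composited.png")
      ```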

      The next problem I ran into is that at around the fourth-from-last outpainting, ComfyUI would stop; it just wouldn’t go any further. The system I’m using has 24 GB of VRAM and 42 GB of RAM, so I didn’t think that was the problem, but just in case I tried it on a beastly RunPod machine with 48 GB of VRAM and 58 GB of RAM. It had the exact same problem.

      To work around this, I first bypassed everything except the original gen and the first outpaint. Then I enabled each outpaint one by one until I got to the fourth from the last. At that point I saved the output image, bypassed everything except the original gen and the first outpaint, enabled the last four outpaints, and loaded the saved image manually.

      I used DreamShaper XL Lightning because there was no way I was going to wait for 60 steps each time with FenrisXL 😂 I tried two different ways of using the same model for inpainting. The first was using the Fooocus Inpaint node and the Differential Diffusion node. This worked well, but when Comfy stopped working I thought maybe that was the problem, so I switched all of those out for some model merging. Basically, it subtracts the base SDXL model from the SDXL inpainting model and adds the DreamShaper XL Lightning model to that. This creates a “DreamShaper XL Lightning inpainting model”. The SDXL inpainting model can be found here.
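
      In plain code, this merge is an add-difference over the checkpoint tensors. Here is a rough standalone sketch with safetensors; the file names are placeholders, and keys whose shapes don’t line up (like the inpainting UNet’s extra input channels) are simply carried over from the inpainting model:

      ```python
      # Add-difference merge sketch:
      #   merged = SDXL_inpaint + (DreamShaperXL_Lightning - SDXL_base)
      # which is the same as DreamShaper + (inpaint - base).
      # File names are placeholders.
      import torch
      from safetensors.torch import load_file, save_file

      base = load_file("sd_xl_base_1.0.safetensors")
      inpaint = load_file("sd_xl_inpainting_0.1.safetensors")
      lightning = load_file("dreamshaperXL_lightning.safetensors")

      merged = {}
      for key, w_inp in inpaint.items():
          if (
              key in base
              and key in lightning
              and base[key].shape == lightning[key].shape == w_inp.shape
          ):
              diff = lightning[key].float() - base[key].float()
              merged[key] = (w_inp.float() + diff).to(w_inp.dtype)
          else:
              # e.g. the inpainting UNet's 9-channel conv_in has no counterpart
              # in the other checkpoints, so keep the inpainting weights as-is.
              merged[key] = w_inp

      save_file(merged, "dreamshaperXL_lightning_inpainting.safetensors")
      ```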

      You should be able to use this workflow with FenrisXL the whole time if you want. You’ll just need to change the steps, CFG, and maybe the sampler at each KSampler.

      Image with ImageCompositeMasked: https://files.catbox.moe/my4u7r.png

      Image without ImageCompositeMasked: https://files.catbox.moe/h8yiut.png

      • Itrytoblenderrender@lemmy.world · 7 months ago

        Wow! Thank you for the effort and time you put into this! I will definitely look into the workflow. Model merging sounds very interesting!

    • theUnlikely@sopuli.xyzM · 7 months ago

      Very cool idea to use outpainting like that! I’m wondering if something happened to the image along the way. A lot of the details look burnt out by the final outpainting. Looking at the workflow, I counted 12 VAE decode/encode pairs. I know that changing between latent and pixel space is not a lossless process, so that might be it, but I’m not sure. I’m going to see if I can get a workflow going that maintains the original quality.
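
      One way to test that theory is to push an image through the SDXL VAE a dozen times and watch the error grow. A rough sketch (the VAE repo id is the standard fp16-fix SDXL VAE, and the input file name is a placeholder):

      ```python
      # Measure degradation from repeated VAE encode/decode round trips.
      import torch
      from PIL import Image
      from diffusers import AutoencoderKL
      from diffusers.image_processor import VaeImageProcessor

      vae = AutoencoderKL.from_pretrained(
          "madebyollin/sdxl-vae-fp16-fix", torch_dtype=torch.float16
      ).to("cuda")
      proc = VaeImageProcessor(vae_scale_factor=8)

      img = Image.open("input.png").convert("RGB")  # placeholder file name
      x0 = proc.preprocess(img).to("cuda", torch.float16)

      x = x0
      for i in range(12):  # same count as the decode/encode pairs in the workflow
          with torch.no_grad():
              latents = vae.encode(x).latent_dist.mean
              x = vae.decode(latents).sample.clamp(-1, 1)
          print(f"round trip {i + 1}: mean abs error = {(x - x0).abs().mean().item():.4f}")
      ```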

      Comparison

    • Thelsim@sh.itjust.worksOPM · 7 months ago

      Total upvotes: 6

      | Prize | Points |
      | --- | --- |
      | Workflow included | +1 |
      | Total | 1 |

      That is a very cozy little home indeed!
      And thanks for including the detailed workflow, so nice to see others helping out as well :)