This Will Change EVERYTHING in Architectural Visualization FOREVER!

  • Published 29. 06. 2024
  • You can use this AI Render Engine to create beautiful renders of your projects in just seconds!
    Save a view, describe it, and hit generate. That's it!
    What do you think about it? Do you think you will try to use it in your next project?
    You can find all the resources here: designinputstudio.com/this-wi...
    Use Stable Diffusion & ControlNet in 6 Clicks For FREE: • Use Stable Diffusion &...
    My previous video about Sketch to Render: • Create Realistic Rende...
    Mood Board to Render: • Turn Your Mood Boards ...
    ControlNet and Stable Diffusion Local Step-by-Step Installation Guide: • ControlNet and Stable ...
    Don't forget to subscribe if you liked the video :)
    Timeline:
    0:00 - Intro
    0:20 - Set Up
    2:50 - Text-to-Image
    5:02 - Image-to-Image
    6:15 - Parametric Pavilion Modeling
    6:54 - Rhino Viewport to Render
    9:19 - Final Results
    Let me know what you think about this workflow. Are you planning to use it for your next project?
    Join my Patreon: / designinput
    Free PNG Packs: designinputstudio.gumroad.com
    Freebies: designinputstudio.com/freebies/
    Instagram: / design.input
    Subscribe: / @designinput
    Website: designinputstudio.com
    Newsletter: newsletter.designinputstudio....
    Tools I Use
    My Computer: amzn.to/3mwVZr3
    My Mouse: amzn.to/3zPfyxS
    Free Notion Template: designinputstudio.com/freebies/
    Website Hosting: bluehost.sjv.io/rQgeVQ
    Music from Uppbeat (free for Creators!): uppbeat.io/t/mountaineer/1st-...
    License code: CEEISWL2HP8G1WFM
    Music track: City Lights by Aylex
    Source: freetouse.com/music
    Music for Videos (Free Download)
    Music track: Magnificent by Pufino
    Source: freetouse.com/music
    Free Music Without Copyright (Safe)
    Music track: Días Mejores Vendrán by Popoi
    Source: freetouse.com/music
    Royalty Free Music for Video (Safe)
    Music track: Animal Friends by Lukrembo
    Source: freetouse.com/music
    Free Background Music (No Copyright)
    *The description may contain affiliate links; at no additional cost to you, I may earn a small commission.

Comments • 162

  • @ilaydakaratas1957
    @ilaydakaratas1957 a year ago +3

    Such useful tools!! I will definitely try it out! Thank you for the video!! Also, that was an interesting pavilion model

    • @designinput
      @designinput a year ago

      Hey there, thanks for your support and lovely comment ❤❤ I hope you liked the pavilion :)

  • @peterpanic7019
    @peterpanic7019 a year ago +3

    Thanks for your great quality videos, I just watched the latest ones about AI and image generation, can't wait to try them out. Hope your channel grows :)

    • @designinput
      @designinput a year ago

      Hey, thank you so much for your support ❤ Please let us know what you think after you try it out :)

    • @mockingbird1128
      @mockingbird1128 a year ago

      What did you watch? I'm new to this

    • @armannasr3681
      @armannasr3681 a year ago

      @@mockingbird1128 try stable diffusion + controlnet

  • @phgduo3256
    @phgduo3256 a year ago +1

    I have become a fan of your work. I will spend the next holidays of this month on this AI series. Thanks

    • @designinput
      @designinput a year ago

      Hi, thanks a lot for your lovely comment and feedback

  • @nickp8094
    @nickp8094 a year ago +5

    I think it's really good, and I can see the evolution of it in my head as a visualiser. It feels like one day you will really be able to load in custom pre-written scripts that perform very specific functions, to bring it even closer to the experience of working for a client. Basically, visualisation will become a bit like computer programming, not necessarily quicker or easier.

  • @mkemaladro5942
    @mkemaladro5942 2 months ago

    Very nice work, I'm a student trying to learn this and just stumbled on your video. It's constructive and informative, keep up the good work sir!!!

  • @reflections191
    @reflections191 a year ago +1

    Very well explained, Thanks for the great video!

  • @pranayyalamuri3127
    @pranayyalamuri3127 a year ago +1

    Thanks for the content ❤

    • @designinput
      @designinput a year ago

      Hey, thanks a lot for your great comment and support!

  • @B-water
    @B-water a year ago +1

    A gift from heaven...a million thanks 😃😃😃

  • @ilhan1936
    @ilhan1936 a year ago

    That's really great, thanks for the video! Eline sağlık arkadaşım (well done, my friend) :)

    • @designinput
      @designinput a year ago

      Hi Ilhan, thanks a lot for your lovely comment :)) ❤❤

  • @HannesGrebin
    @HannesGrebin a year ago

    Wizard! Thank you so much for your concise introduction and other videos. I just came along from the Parametric Architecture course of Arturo Tedeschi, whom you might know (the Grasshopper guy)

    • @designinput
      @designinput a year ago

      Hi, thanks a lot for your lovely comment and feedback

  • @sherifamr4160
    @sherifamr4160 a year ago +6

    Love the way you explained it, to the point and easy to follow. I do have a question, hopefully you will read my comment: if you already have materials on your pavilion, would that somehow steer the rendering process toward what we want, acting as additional parameters? ... I hope I am making sense. Again, thank you so much, I love that you are sharing your knowledge with us; it shows how amazing you are as a person.

    • @designinput
      @designinput a year ago

      Hey, thanks a lot for your lovely comment! Unfortunately, it is not possible to use materials as a parameter at the moment, but I am sure soon we will be able to have more control over this workflow.
      Thanks a lot for your kind words

  • @dkn822
    @dkn822 a year ago +1

    Thank you for all this amazing information and resources, I will definitely use this for my projects.
    Subscribed and eager to watch your upcoming videos! Keep it up!

    • @designinput
      @designinput a year ago +1

      Hey, thanks a lot for your lovely comment and support! I am happy to hear that you liked it! Please share your experiences with me once you try it out!

  • @niirceollae2
    @niirceollae2 a year ago

    Wow... that is insane. I have to try it now

    • @designinput
      @designinput a year ago

      Hey, thanks for your lovely comment! Please share your experiences after you try it out, and feel free to ask if you have any problems.

  • @tatianagavrilova2252
    @tatianagavrilova2252 a year ago

    It looks like magic! Thanks a lot

    • @designinput
      @designinput a year ago

      Hi, thanks a lot, glad you liked it! You are very welcome!

  • @Masoud.Ansari
    @Masoud.Ansari a year ago

    Thank you for sharing, this is awesome 👌

    • @designinput
      @designinput a year ago +1

      Hey, thanks a lot! Glad to hear that you liked it :)

    • @Masoud.Ansari
      @Masoud.Ansari a year ago

      @Design Input you're welcome bro

  • @emekachime1089
    @emekachime1089 a year ago

    Looking forward to your next video on CLASSICAL RENDER VS AI RENDER. 👍

    • @designinput
      @designinput a year ago +1

      Hey, thanks a lot for your lovely comment! It will be out soon :)

  • @hopperblue934
    @hopperblue934 a year ago

    great bro💖💖💖

    • @designinput
      @designinput a year ago

      Hi, thanks a lot for the lovely feedback

  • @MDLEUA
    @MDLEUA a year ago

    Great tutorial, I followed Ambrosini's videos but I like this format more!

    • @designinput
      @designinput a year ago

      Hey, thank you! Glad to hear that you liked it! Did you have a chance to try it?

  • @infographie
    @infographie a year ago

    Excellent

  • @shinndin
    @shinndin a year ago +1

    Amazing

    • @designinput
      @designinput a year ago

      Hi Dina, thanks a lot for your excellent feedback ❤❤

  • @william0916
    @william0916 11 months ago

    Thank you for sharing this fabulous workflow!! I am about to try it out, and I'm wondering if there are any newer extensions or developments you would suggest we use (since this video is from April, not sure if there's anything new in these 3 months!)
    Thank you in advance and have a nice day :)

    • @designinput
      @designinput 11 months ago +1

      Hey, thanks a lot for the feedback! Of course, there are lots of new developments happening every day, I am trying to stay updated as much as I can and share what I learn. But in terms of this specific workflow, there are new major updates for both Stable Diffusion and Grasshopper extensions. But both should still work fine!

  • @user-ae5pa
    @user-ae5pa a year ago +1

    soooooo good

    • @designinput
      @designinput a year ago

      Hi, thanks a lot for your great comment! ❤

  • @moaazaldahan1175
    @moaazaldahan1175 a year ago

    thank you very much

  • @NicoChin
    @NicoChin a year ago +8

    If you tell the client that the last picture is man-made, and then tell them the same picture was created by AI, and the client's attitude does not change, then AI will really change the world.

  • @Albert_Sierra
    @Albert_Sierra a year ago

    Awesome! I like it, thanks. Please make a tutorial using Blender, if possible

  • @firatgunesbalci2743
    @firatgunesbalci2743 a year ago +5

    Great videos 👍🏻👍🏻👍🏻 Can you explain the SketchUp workflow as well?

  • @alexanderaggersbjerg5187

    Thanks for the great explanation! Got everything up and running :) One quick question: I am having issues working with the depth ControlNet. I have downloaded the previous ControlNet versions (aside from the new ControlNet v1.1 versions), but the depth and canny masks are very bad quality. This is only an issue for me when I use ControlNets in Grasshopper. Any idea what the problem may be?

    • @simongobel2709
      @simongobel2709 a year ago

      I have the same problem unfortunately... any answer yet?

  • @motivizer5395
    @motivizer5395 a year ago

    Amazing video. Can you make a video about this process for SketchUp as well?

    • @designinput
      @designinput a year ago

      Hi, thanks for your comment and suggestion! I will definitely try it out and share the results!

  • @JJSnel-uh3by
    @JJSnel-uh3by a year ago

    I love the setup but the voice is just too funny xD

  • @arv3ryn
    @arv3ryn a year ago

    Great video. Also, what are your computer specs? I have a basic laptop and I'm wondering whether I can run this.

    • @designinput
      @designinput a year ago +1

      Hey, thanks a lot for your lovely feedback! I am using a laptop with an RTX 3060 (6GB VRAM) and a 12th Gen Intel(R) Core(TM) i7-12700H CPU. Of course, for this process, the most important component is the GPU. I will share another workflow for using Stable Diffusion without your own computer in a couple of days.

  • @amazingsound63
    @amazingsound63 a year ago

    Scary for future job opportunities.

  • @dianaallaham2801
    @dianaallaham2801 6 months ago

    Since your video there has been an update to the Ambrosinus toolkit, and for some reason I cannot get the port to be available. Do you happen to know what inputs should go into the LaunchSD component, as it has many more inputs now?

  • @mukondeleliratshilavhi5634

    I think it's a great tool for rapid prototyping with fewer images. It unlocks more possibilities and gives us and the client more variety with less time and energy. The biggest hope is that we arrive at a final image we might not even have thought possible before.
    But for a final image I think the old method is still king. Who knows, this time next year it might be a different story.
    Will I use it for my next project? Oh yes, but the Blender version; it's always best to get in early with new technology.

    • @designinput
      @designinput a year ago

      Hey, thanks for your comment; I totally agree! Hmm, that's interesting; why do you prefer Blender specifically?

    • @mukondeleliratshilavhi5634
      @mukondeleliratshilavhi5634 a year ago

      @@designinput There are a few reasons.
      1) Being open source, it is easy to access without restrictions and to invest time and resources in. I'm a freelancer/business owner, so it is important that I run as lean as possible.
      2) Rapid development: it can do a lot of things and it's ever expanding its reach. I'm able to complete a project in one piece of software without having to hop to another. Yes, it's not as strong as Rhino or Max, but it gives great quality.
      3) The community: they drive the development and education of the software, so it's sort of owned by us. The amount of tutorials, add-ons, and stores available.
      There is more, but let me park here.

  • @cgimadesimple
    @cgimadesimple a year ago

    cool :)

  • @wido.daniel
    @wido.daniel a year ago

    Thank you man, this is SO good! To your knowledge, would it be possible to use this in Revit through Dynamo?

    • @designinput
      @designinput a year ago +1

      Hey, thanks a lot for the feedback ❤ Hmm, I am not super sure, but I believe there is no extension for that yet. But I am experimenting with connecting Revit to this same workflow with Rhino.Inside.Revit. I will share it as soon as it's ready :)

    • @wido.daniel
      @wido.daniel a year ago

      @@designinput that would be awesome!

  • @youssefdaadoush8755
    @youssefdaadoush8755 a year ago

    Thanks a lot for the video, it's really incredible. I just have a question: I did everything exactly the same, but the generation produces results regardless of my base image. What could be the problem? Otherwise it works directly in the Stable Diffusion web window.

    • @designinput
      @designinput a year ago

      Hey Youssef, thanks for your great comment! It looks like there is a problem with the ControlNet. Did you enable it?

  • @soitalwaysgoes
    @soitalwaysgoes a year ago +1

    Hello! I checked out your Instagram and I would die for a tutorial on how to do those veil textures you did!

    • @designinput
      @designinput a year ago

      Hi, oh, thank you for your lovely feedback. Happy that you liked them ❤
      I created them with Midjourney v5. Sure, I will do a video about it soon!

  • @adel.419
    @adel.419 a year ago

    I have followed everything in the video, but when I tried my own model and hit the generate button, the AleNG-Ioc battery turned red and doesn't generate anything, and the panel connected to the info says "No data was collected", even though the viewport appears in the LB image viewer

  • @sirousghaffari9556
    @sirousghaffari9556 a year ago

    Hello, thank you very much for your good lessons. In the 3rd minute of the tutorial, you say that you put the Grasshopper files in the description section, but unfortunately I can't find them. Could you guide me?

    • @designinput
      @designinput a year ago +1

      Hi, thanks for the feedback! You can find all the resources mentioned in the video here: designinputstudio.com/this-will-change-everything-in-architectural-visualization-forever/
      And you can download the file here: www.notion.so/designinputs/AI-Render-Engine-Template-File-02d34b595f824ca6a9f1339470fb1387?pvs=4

  • @danr9277
    @danr9277 a year ago

    This is great. How is the speed of the rendering? It seems very fast.

    • @designinput
      @designinput a year ago +1

      Hey, thanks for your comment! It mostly depends on your GPU; I am using an RTX 3060 with 6GB VRAM, and I can generate a 1024x1024 image in 1-2 minutes.

  • @lawrencenathan351
    @lawrencenathan351 a year ago

    Quick question: do I just add this on top of SketchUp? Or is there any simple tutorial I can follow on combining AI with SketchUp? Thanks

    • @designinput
      @designinput a year ago

      Hi, this workflow doesn't work with SketchUp at the moment, but you can try platforms like VerasAI. Thanks for your comment!

  • @DannoHung
    @DannoHung a year ago

    Backing the rendered image out to a textured and lit scene is the next step probably, hah!

  • @zafiriszafiropoulos5346
    @zafiriszafiropoulos5346 a year ago +1

    Hi there. I only have Rhino 6, and the Ambrosinus tool is only available for Rhino 7. Is there another way?

  • @RodriguezRenderings
    @RodriguezRenderings a year ago

    Can Stable Diffusion further elaborate the model so that in different views you can maintain the same materials and facades?

    • @designinput
      @designinput a year ago

      Hey, thanks for your feedback ❤ You can keep the same seed number for the different views to have similar results. But still, it is not so easy to generate precisely the same materials and textures all the time.
      But I am sure we will see some developments about this very soon!
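      For anyone scripting this outside Grasshopper, here is a minimal sketch of what "keeping the seed" means against a locally running AUTOMATIC1111 WebUI started with --api. The endpoint and JSON field names are assumed from that API; the URL, prompt, and seed value are placeholders, not taken from the video.

        import base64, json, urllib.request

        # Assumes the Stable Diffusion WebUI is running locally with the --api flag.
        URL = "http://127.0.0.1:7860/sdapi/v1/txt2img"

        payload = {
            "prompt": "timber pavilion in a park, photorealistic",  # placeholder prompt
            "negative_prompt": "blurry, low quality",
            "seed": 1234567890,  # fixed seed -> similar materials/colors across views
            "steps": 25,
            "cfg_scale": 7,
            "width": 768,
            "height": 512,
        }

        req = urllib.request.Request(URL, data=json.dumps(payload).encode("utf-8"),
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            result = json.load(resp)

        # The API returns base64-encoded images; save the first one.
        with open("render_view_01.png", "wb") as f:
            f.write(base64.b64decode(result["images"][0]))

      Rerunning the request with the same seed and a slightly different prompt or view keeps the overall look much closer than a random seed would.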

  •  a year ago

    Hi, thanks for the video. I checked out other videos and got somewhere until I got stuck with the webui part. My webui-user file looks different from yours; there are "--xformers" and "git pull" lines in yours, but I don't have them. Unfortunately, just copying yours doesn't work :). I don't know what is missing, but I can say that it is a pretty overwhelming setup for sure.

    • @designinput
      @designinput a year ago

      Hey Cankat,
      Thanks for your comment. "--xformers" is an additional option that you can use if you have an RTX 30 or 40-series GPU; it will speed up the generation process. And the "git pull" command automatically checks for new updates when you run SD. So you don't have to have them to use it; the only must is "--api", which gives access directly from inside the Grasshopper file.
      Since it is an early experimental workflow, you are right that it is not so user-friendly. But it will surely develop, and I will share the newer versions very soon.
      Thank you!
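      For reference, a webui-user.bat set up along those lines might look roughly like this (a sketch of the stock AUTOMATIC1111 launcher file, not the exact file from the video; the --xformers flag and the git pull line are optional, as described above):

        @echo off

        set PYTHON=
        set GIT=
        set VENV_DIR=
        rem --api is the only required flag for the Grasshopper connection;
        rem --xformers is an optional speed-up for RTX 30/40-series GPUs.
        set COMMANDLINE_ARGS=--api --xformers

        rem Optional: pull the latest WebUI updates on every start.
        git pull

        call webui.bat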

  • @kedarundale972
    @kedarundale972 a year ago

    Thank you for the wonderful video.
    I had one question: everything in the script works perfectly on my computer, but when I connect a value list to Mode, I get an error. Do you know why this could be?
    Basically, Mode doesn't take any input apart from 0, which is the T2I Basic. In my Stable Diffusion I do see the other models, but I am not sure what the error is. The same thing is happening with SAMPLER MODEL; it does not take any input apart from Euler A. Any suggestions will be helpful. Thank you.

    • @designinput
      @designinput a year ago

      Hey, thanks for your comment. I am not sure why you can't see the other modes. There has been a new update to the Ambrosinus-toolkit plugin since I published the video; maybe you need to update it for it to work. I will check the file and upload an updated version soon. Let me know if you are still having problems with it. Thank you!

  • @Peter-hn9yv
    @Peter-hn9yv a year ago

    I got an error in Grasshopper saying the index was out of range; have you encountered this issue before?

  • @diegovazquezdesantos4667
    @diegovazquezdesantos4667 11 months ago

    Thank you so much for the clear explanation. I tried to follow this video with the new update of Ambrosinus, but I was not able to. When I installed v1.1.9 I was able to use your code, although at the SeeOut output (LA_SeeOut) an error occurs: "index was out of range". Any ideas on how to fix this error?

    • @designinput
      @designinput 11 months ago

      Hey, thanks a lot! I think you just need to generate an image first; after that you will be able to see it and the error will disappear.

  • @firatgunesbalci2743
    @firatgunesbalci2743 a year ago +1

    When I first saw the teaser, I thought that you used ArkoAi

    • @designinput
      @designinput a year ago +1

      Hey, haha, yes, that's the most "popular" one nowadays, but I feel like you don't have much control over it.
      I will share a video soon to compare different AI Render alternatives. Thanks for your comments!

  • @borchzhang2211
    @borchzhang2211 a year ago +1

    How to handle parameter settings indoors to better align with the model?

    • @designinput
      @designinput a year ago

      Hey, thanks for your comment! For indoor views, you can try the Depth Model too. Is there any specific parameter you want to ask? Maybe I can help better with that one :)

  • @Macora3251
    @Macora3251 a year ago

    Can you get the same results twice if the client wants the exact same render but with just the column material changed, for example?

    • @designinput
      @designinput a year ago

      Hey, thanks for your comment! Generating exactly the same image twice can be challenging. But if you want to change a part of it, you can use inpainting to edit it.
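      As a rough illustration of that inpainting idea against the same local --api interface (a sketch only; the img2img endpoint and field names are assumed from the AUTOMATIC1111 WebUI API, and the file names, prompt, and mask are placeholders), following the same request pattern as the txt2img example earlier in the thread:

        import base64, json, urllib.request

        def b64(path):
            # Read an image file and return it base64-encoded, as the API expects.
            with open(path, "rb") as f:
                return base64.b64encode(f.read()).decode("ascii")

        payload = {
            "init_images": [b64("render_view_01.png")],  # the render the client approved
            "mask": b64("column_mask.png"),              # white where the columns are
            "prompt": "marble columns, photorealistic",  # describe only the new material
            "denoising_strength": 0.6,                   # how much the masked area may change
            "inpainting_fill": 1,                        # 1 = start from the original content
            "seed": 1234567890,                          # reuse the seed of the first render
            "steps": 25,
        }

        req = urllib.request.Request("http://127.0.0.1:7860/sdapi/v1/img2img",
                                     data=json.dumps(payload).encode("utf-8"),
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            result = json.load(resp)

        with open("render_view_01_marble.png", "wb") as f:
            f.write(base64.b64decode(result["images"][0]))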

  • @user-bw9ft4lh6z
    @user-bw9ft4lh6z a year ago

    Thank you so much!! I'm just having issues with the resolution of the "depth image" that it creates; it's really low, and because of it I can't use my models. Can I increase it? Thank you anyway, this tool is amazing 👍

    • @user-bw9ft4lh6z
      @user-bw9ft4lh6z a year ago

      To be more precise, I probably have problems with the preprocessor; I can't change it, so it doesn't generate the correct depth image.

    • @designinput
      @designinput a year ago

      Hey, thanks for the comment! If the image resolution from the viewport is low, you can try printing a view from Rhino with a custom resolution and using it in Stable Diffusion directly. It may help, but don't go larger than 1024x1024, as it will slow down the process dramatically; once you like one of the views, you can upscale the image later. Hope I understood your question correctly. Let me know if you have any other issues.
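      One way to capture a Rhino view at an explicit resolution is a small RhinoCommon script run from the Rhino Python editor, for example (a sketch assuming Rhino 7's ViewCapture API; the output path is a placeholder):

        import Rhino

        # Capture the active viewport at a fixed resolution instead of the on-screen size.
        view = Rhino.RhinoDoc.ActiveDoc.Views.ActiveView
        capture = Rhino.Display.ViewCapture()
        capture.Width = 1024
        capture.Height = 1024
        capture.ScaleScreenItems = False
        capture.DrawGrid = False
        capture.DrawAxes = False
        capture.TransparentBackground = False

        bitmap = capture.CaptureToBitmap(view)
        if bitmap:
            # Placeholder path; point this at the folder Stable Diffusion reads from.
            bitmap.Save(r"C:\temp\viewport_1024.png")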

  • @shiryu7101
    @shiryu7101 a year ago

    Hi! Could you tell me why it says "Input image doesn't exist or is not supported format" even though I put in a PNG file? Thank you!

  • @ezzathakimi2201
    @ezzathakimi2201 a year ago

    Please make a video on how to use it in 3ds Max + Corona

  • @jelisperez7968
    @jelisperez7968 11 months ago

    Thank you for sharing this amazing tutorial. Is it still working? I am having this issue with the ControlNet updates: "ControlNet warning: Guess Mode is removed since 1.1.136. Please use Control Mode instead." If I choose the CN v1.1.X option in the Ambrosinus tool, the result image differs completely from the original image. I also changed the directory to point directly to the CNet path.
    Any hint?
    Is there a way to choose the SD model?
    Best

    • @jelisperez7968
      @jelisperez7968 11 months ago

      I figured out that with the update, CN Depth modes are working as expected, but not Canny mode. I've posted the bug on food4Rhino. Many thanks again

    • @designinput
      @designinput 11 months ago

      Hey, good to hear that it's working :) For me, it was working without any issues. Thanks for your comment!

  • @mrezaforoozandeh520
    @mrezaforoozandeh520 6 months ago

    Thanks, but when clicking the start button, webui-user.bat won't run with --api. I edit the .bat file, but after clicking start it isn't able to run it that way and changes the .bat file back to the original.

  • @user-yq7zn7do8p
    @user-yq7zn7do8p a year ago

    Hi. What's your Rhino version and Ladybug version? Ladybug is not working in my Rhino.

    • @designinput
      @designinput a year ago

      Hey, I was using version 1.6; you can download it here: www.food4rhino.com/en/app/ladybug-tools
      But even if Ladybug doesn't work, you can still use this workflow; you just won't be able to see the images directly inside Grasshopper.

  • @azimbekibraev1249
    @azimbekibraev1249 2 months ago

    Selam aleykum Omer! Ambrosinus has been updated and your sample GH file no longer works. Could you please share the updated version, if this workflow is still relevant? Thank you in advance.

  • @user-qr1hs3mn9x
    @user-qr1hs3mn9x 4 months ago

    I have a "no data" problem when I connect the 2.24 LaunchSD component to the panel; how can I solve it?

  • @Peter-hn9yv
    @Peter-hn9yv a year ago +1

    Does this workflow save the viewport and the dimensions of the image?

    • @designinput
      @designinput a year ago

      Hey, yes, it saves the image exactly in the viewport size and uses the same aspect ratio for the new image.
      Thanks for your comment!

  • @firatgunesbalci2743
    @firatgunesbalci2743 a year ago +1

    Hi, what is your computer hardware configuration?

    • @designinput
      @designinput a year ago

      Hey Fırat, I am using a laptop with an RTX 3060 (6GB VRAM) and a 12th Gen Intel(R) Core(TM) i7-12700H CPU.

  • @pedorthicart1201
    @pedorthicart1201 a year ago

    I feel it is great and will help me with the visualization of orthopedic footwear designed through #Pedorthic Information Modeling! Waiting to have time to explore it! Thank you

    • @designinput
      @designinput a year ago +1

      Hey, thanks for your comment! I will share a video specifically about product photography and how to use AI.
      Thank you!

    • @pedorthicart1201
      @pedorthicart1201 a year ago

      @@designinput Waiting for it! Thanks!

  • @darkrider897
    @darkrider897 a year ago

    Hi sir, I was stuck at 2:28 when you clicked on the administrator window. I tried to do it by right-clicking webui-user.bat, then clicking Run as administrator. However, it just flashes and nothing happens. How do I solve the problem?

    • @designinput
      @designinput a year ago

      Hey, you don't need to run the webui-user.bat file as administrator; you need to run Rhino as administrator. And make sure to add the --api parameter to the .bat file.
      If you can't start Stable Diffusion inside Grasshopper, you can just run it manually, and if you have the --api command, it should automatically connect to the Grasshopper plugin.

  • @user-un3jv4lo3r
    @user-un3jv4lo3r a year ago

    I need a plugin that can give a million likes to this video👍👍👍

    • @designinput
      @designinput a year ago

      Hey, thanks a lot for your lovely comment!

  • @user-ee7ko1yb9s
    @user-ee7ko1yb9s a year ago

    Hi, it looks amazing, thank you for that. But I tried it and also used the same parameters, and unfortunately it generates a different image, not the image of the pavilion; it changes it completely. I don't know what I did wrong; if you could help me, thank you again.

    • @designinput
      @designinput a year ago

      Hey, thanks for your comment! Probably there was a problem with the ControlNet. Do you have the ControlNet models installed locally?

    • @user-ee7ko1yb9s
      @user-ee7ko1yb9s a year ago

      @@designinput Hi, thank you for replying. Yes, I already downloaded it, but ControlNet doesn't work in Rhino; it just works in the browser, no idea why.

  • @METTI1986LA
    @METTI1986LA a year ago

    It's actually good, but I'd rather have control over the textures and put them where I want them - it's really not that hard... Of course it takes a bit more time, but why would you need 1000 renders just to get overwhelmed by the choices you have?

  • @sirousghaffari9556
    @sirousghaffari9556 a year ago

    In the 4th minute, when you press the start button, it renders without any problem, but it is a problem for me because the SEE OUT component is red and it gives this error: (Solution exception: Index was out of range. Must be non-negative and less than the size of the collection. Parameter name: index)
    Can you help?

    • @11Bashar
      @11Bashar a year ago

      Have you found a solution yet?

    • @sirousghaffari9556
      @sirousghaffari9556 a year ago

      @@11Bashar Unfortunately, I gave up on connecting it to Grasshopper because I can't make sense of its errors and there is no explanation about them anywhere.

  • @borchzhang2211
    @borchzhang2211 a year ago

    Success 成功了 (it worked)

  • @sossiopalmiero3582
    @sossiopalmiero3582 a year ago

    Where can I find the Grasshopper file?

    • @designinput
      @designinput a year ago +1

      Hey, you can find all the resources here: designinputstudio.com/this-will-change-everything-in-architectural-visualization-forever/

  • @user-fw1ui9oy5o
    @user-fw1ui9oy5o a year ago

    My Rhino 7 cannot install the Ambrosinus-toolkit; which version of it should I download?

    • @designinput
      @designinput a year ago

      Hey, I am also using Rhino 7 and was able to use it without any issues with the latest version of the Ambrosinus-toolkit; if you are still having issues, you may contact the developer.

  • @ABCDEFGH-bi5tk
    @ABCDEFGH-bi5tk 11 months ago

    Does this work with 3ds Max as well?

    • @designinput
      @designinput 11 months ago

      Hey, not with this exact workflow, but it may be possible to use it with an extension. I am not using 3ds Max myself, which is why I haven't experimented with that one. Let me know if you try it :)

  • @mockingbird1128
    @mockingbird1128 a year ago

    Would this work with Revit too?

    • @designinput
      @designinput a year ago

      Hey, maybe it could work with Rhino.Inside.Revit, but I haven't tested it. But you can always take a screenshot and use SD + ControlNet on it separately.

  • @riccia888
    @riccia888 a year ago

    This is the most confusing software ever

  • @remyleblanc8778
    @remyleblanc8778 a year ago

    Nice! Wish it was 1000 times simpler

  • @bixp2k3
    @bixp2k3 a year ago

    How much does it cost?

    • @designinput
      @designinput a year ago

      Hey, it doesn't cost anything if you already have Rhino, because Stable Diffusion is running locally on your computer.

  • @sabaahmed1261
    @sabaahmed1261 a year ago

    Does it work with Revit?

    • @ArchiProcess
      @ArchiProcess a year ago

      I know Revit currently has one called Veras

    • @designinput
      @designinput a year ago +1

      Hi, I am currently experimenting with implementing this workflow in Revit. I will share a video about it soon :)
      Thanks for the comment!

  • @abdulmelikyetkin9721
    @abdulmelikyetkin9721 a year ago

    #DesignInput Can you do this with SketchUp?

    • @designinput
      @designinput a year ago

      Hey, thanks for your comment! Technically yes; I had some issues creating this custom workflow for SketchUp, and when I figure it out, I will share it :)
      Meanwhile, you can try extensions like VerasAI and ArkoAI.

  • @user-fw1ui9oy5o
    @user-fw1ui9oy5o a year ago

    2023-07-01 22:55:51,129 - ControlNet - WARNING - Guess Mode is removed since 1.1.136. Please use Control Mode instead.
    What should I do?

    • @designinput
      @designinput 11 months ago +1

      Hello, I think it should still work, but if it doesn't, update your ControlNet extension and that should solve this issue. Thank you!

  • @iaspace6737
    @iaspace6737 10 months ago

    I NEED SD+SU

  • @motassem85
    @motassem85 a year ago

    Looks too complicated for me, I still prefer 3ds Max, V-Ray, or Lumion 😂

    • @designinput
      @designinput a year ago

      Haha, totally understand that :) But we will see much easier user interfaces soon, surely!

  • @oof1498
    @oof1498 a year ago

    Great! How about if I want to use the same material on the same place but in different perspective?

    • @designinput
      @designinput a year ago

      Hey, thanks for your feedback ❤ You can keep the same seed number for the different views to have similar results. But still, it is not so easy to generate precisely the same materials and textures all the time. If I figure out something for more consistent results, I will share it :)

    • @oof1498
      @oof1498 a year ago

      @@designinput Thanks bro, appreciate your effort :)