Shaun Comly
Creating a Custom Burnin in Movie Render Queue with Blueprints in Unreal Engine
dev.epicgames.com/community/learning/courses/0y6/unreal-engine-virtual-production-for-visualization-guide-chapter-3-shoot-day-and-delivery/rKDx/unreal-engine-creating-a-custom-burnin-in-mrq-with-blueprints
In this video, we’ll get into some Sequencer scripting and show you how to pass information from your level sequence back and forth to Movie Render Queue so it can be printed to our own custom heads-up display, or “Burnin”. Knowing how to print information into a render’s Burnin can be incredibly helpful for knowing exactly what you’re seeing in the render without having to go digging around in Unreal Editor. In this particular case, we are going to show exactly which assets and performances are being used in editorial, so we know what we need to pull out into proper shots or what to send for mocap processing. But really, you can use this technique for anything you want to convey at a glance.
0:00 - Intro
2:29 - Editor Utility Actors
5:45 - Sequencer Scripting
28:25 - Making the MRQ HUD
34:50 - Firing a Custom Event from MRQ
39:15 - Outro
350 views

Video

How to render layers using the new Unreal Engine Movie Render Graph
4.4K views • 3 months ago
dev.epicgames.com/community/learning/tutorials/LpaB/unreal-engine-using-the-movie-render-graph-to-render-in-layers In this video, we will go over the basics of the new Movie Render Graph in Unreal Engine. I will show you how to start rendering in layers out of Movie Render Queue. 0:00 - Intro 0:58 - Graph first look 4:00 - Our first render layer 6:25 - Collections and Graph Eval 17:24 - Modifie...
Part 1 - Driving a Crane with an XBOX controller in Unreal Engine - Making the Rail
347 views • 1 year ago
dev.epicgames.com/community/learning/paths/XRe/unreal-engine-virtual-production-visualization-guide In Part 1 we'll make the rail, create our Virtual Live Link Subject, and show how you can use Enhanced Input to map the controller. Part 2 - czcams.com/video/UxD5hA9JcBM/video.html Part 3 - czcams.com/video/L388tSp2ShY/video.html
Part 2 - Driving a Crane with an Xbox Controller in Unreal Engine - Adding the Crane to the Rail
189 views • 1 year ago
dev.epicgames.com/community/learning/paths/XRe/unreal-engine-virtual-production-visualization-guide Now that we have our Rig Rail working let's add a Camera Crane! Part 1 - czcams.com/video/Ktl5L9hEZQo/video.html Part 3 - czcams.com/video/L388tSp2ShY/video.html
Part 3 - Driving a Crane with an Xbox Controller in Unreal Engine - Rebroadcasting over MultiUser
90 views • 1 year ago
dev.epicgames.com/community/learning/paths/XRe/unreal-engine-virtual-production-visualization-guide In Part 3 we'll show how to rebroadcast the Live Link Virtual Subject over a MultiUser session. Part 1 - czcams.com/video/Ktl5L9hEZQo/video.html Part 2 - czcams.com/video/UxD5hA9JcBM/video.html
Part 3 - Midi Controlled Rig Rail and Crane using Blueprints in UE - Triggering Sequences
89 views • 1 year ago
dev.epicgames.com/community/learning/paths/XRe/unreal-engine-virtual-production-visualization-guide Now let's learn how we can trigger level sequences in Unreal Engine with physical buttons on our MIDI controller. This is really handy on set when you need the performance to react to something in the virtual world. Part 1 - czcams.com/video/JVwUxdx79f0/video.html Part 2 - czcams.com/video/JKWNv2F...
Part 2 - Midi Controlled Rig Rail and Crane using Blueprints in UE - Interacting With Actors
201 views • 1 year ago
dev.epicgames.com/community/learning/paths/XRe/unreal-engine-virtual-production-visualization-guide Controlling a Crane and Rig Rail Arm with the MIDI Controller in Unreal Engine Part 1 - czcams.com/video/JVwUxdx79f0/video.html Part 3 - czcams.com/video/FPXw3lz-hUY/video.html
Part 1 - Midi Controlled Rig Rail and Crane using Blueprints in UE - Creating the Midi Controller
237 views • 1 year ago
dev.epicgames.com/community/learning/paths/XRe/unreal-engine-virtual-production-visualization-guide Creating your first Midi controller in Unreal Engine using the Midi Blueprint Library Part 2 - czcams.com/video/JKWNv2F7S_w/video.html Part 3 - czcams.com/video/FPXw3lz-hUY/video.html
Part 4 - Making a "4 Up" display with Editor Utility Widgets in Unreal Engine - Using Functions
113 views • 1 year ago
dev.epicgames.com/community/learning/paths/XRe/unreal-engine-virtual-production-visualization-guide Let's move our Blueprint code into functions so that we can reuse it without having to copy a bunch of nodes everywhere. Part 1 - czcams.com/video/Hvb_EJwHaQY/video.html Part 2 - czcams.com/video/6ipe8NH4nNs/video.html Part 3 - czcams.com/video/tlxSl9Ha014/video.html Part 5 - czcams.com/video/uKbB7L...
Making a Procedural Follow Camera in Unreal Engine using Blueprints for Virtual Production
248 views • 1 year ago
dev.epicgames.com/community/learning/paths/XRe/unreal-engine-virtual-production-visualization-guide After talking with the Fortnite Cinematics team, one of the things we took away from that conversation was the team's heavy use of procedural cameras during a performance capture session. Even though they might have every intention of going back and recording virtual cameras or animating the came...
Controlling Virtual Camera Attributes with Remote Control Preset for Virtual Production
273 views • 1 year ago
dev.epicgames.com/community/learning/paths/XRe/unreal-engine-virtual-production-visualization-guide The Remote Control plugin for Unreal Engine lets you collect and organize any Actor parameter or function in a single panel. You can also quickly and easily bind these parameters to a MIDI, OSC, or DMX device right from the same interface without having to do any complicated Blueprint work. The R...
Part 6 - Making a "4 Up" display with Editor Utility Widgets in Unreal Engine - Max View Function
122 views • 1 year ago
dev.epicgames.com/community/learning/paths/XRe/unreal-engine-virtual-production-visualization-guide Finishing Up With Some Maximize View Functionality. Let's add the ability to fullscreen any of the views to get a better look. Part 1 - czcams.com/video/Hvb_EJwHaQY/video.html Part 2 - czcams.com/video/6ipe8NH4nNs/video.html Part 3 - czcams.com/video/tlxSl9Ha014/video.html Part 4 - czcams.com/vid...
Part 5 - Making a "4 Up" display with Editor Utility Widgets in Unreal Engine - The Quad View
158 views • 1 year ago
dev.epicgames.com/community/learning/paths/XRe/unreal-engine-virtual-production-visualization-guide Now let's make the 4 views and learn how to fit different cameras into each view. Part 1 - czcams.com/video/Hvb_EJwHaQY/video.html Part 2 - czcams.com/video/6ipe8NH4nNs/video.html Part 3 - czcams.com/video/tlxSl9Ha014/video.html Part 4 - czcams.com/video/Et2fw3gjpko/video.html Part 6 - czcams.com/...
Part 3 - Making a "4 Up" display with Editor Utility Widgets in Unreal Engine - Adding a Config
113 views • 1 year ago
dev.epicgames.com/community/learning/paths/XRe/unreal-engine-virtual-production-visualization-guide Let's add a switch to get to a "Config" window where we can set the resolution of our Scene Captures. Part 1 - czcams.com/video/Hvb_EJwHaQY/video.html Part 2 - czcams.com/video/6ipe8NH4nNs/video.html Part 4 - czcams.com/video/Et2fw3gjpko/video.html Part 5 - czcams.com/video/uKbB7LT_tOk/video.html...
Part 2 - Making a "4 Up" display with Editor Utility Widgets in Unreal Engine - Creating the HUD
152 views • 1 year ago
dev.epicgames.com/community/learning/paths/XRe/unreal-engine-virtual-production-visualization-guide Let's make the Heads Up Display for our "4 Up" display tool. Part 1 - czcams.com/video/Hvb_EJwHaQY/video.html Part 3 - czcams.com/video/tlxSl9Ha014/video.html Part 4 - czcams.com/video/Et2fw3gjpko/video.html Part 5 - czcams.com/video/uKbB7LT_tOk/video.html Part 6 - czcams.com/video/1-SWtg2BwF4/vi...
Part 1 - Making a "4 Up" display with Editor Utility Widgets in Unreal Engine - Making the 1st Panel
644 views • 1 year ago
How to trigger a Take Record on multiple Unreal Editors machines at once using Switchboard
166 views • 1 year ago
Launching multiple Unreal Engines into a MultiUser Session using Switchboard for Virtual Production
4.3K views • 1 year ago
A talk about Textures, MIPS, Compression, Streaming, and Memory in Unreal Engine Virtual Production
2K views • 1 year ago
Automatically create and/or apply materials from Maya to alembic meshes on import to Unreal Engine
2.9K views • 2 years ago
Using ffmpeg with Unreal Engine's Movie Render Queue
3.2K views • 2 years ago

Comments

  • @zacharydembinski7486

    Thanks so much Shaun! Is there any way to call out the {version} number of the output directory ({project_dir}/Saved/MovieRenders/{version})?

    • @shauncomly (3 days ago)

      Actually no, it doesn't look like it. We'll get that added.

  • @iamhere56 (6 days ago)

    I am using time dilation in a level sequence and I render the shot with Movie Render Graph, but it is not rendering the slow-motion effect. How do I fix that? 🙏

    • @shauncomly (6 days ago)

      It’s not supported yet in MRG. It’s on the list but behind some other more pressing functionality.

    • @iamhere56 (5 days ago)

      @@shauncomly Ok, thank you for the reply 🙏

  • @user-tg1nn8xn1f (7 days ago)

    May I ask how to render the alpha channel of the bloom?

    • @shauncomly (7 days ago)

      Bloom does not show up in the alpha.

  • @NYH_ (10 days ago)

    Is there a Discord server or somewhere I can contact you for help? I have a team and I'm trying to set this up, but it just won't work for me. If you can help me out it'd be really appreciated, thanks.

    • @shauncomly (10 days ago)

      No. You can post on the forums and people will help you. Some Epic folks, myself included, go in there from time to time to answer questions. Or if you have UDN access that is a way to get directly to Epic support.

  • @LeoLiesvirta (15 days ago)

    Hi Shaun, great video. I have been trying to find ways to edit or add custom tokens. When exporting a master sequence, the "{camera_name}" token returns "Unsupported". And while the global variable camera_name works, plugging it into "FileNameFormat" makes it impossible to add a frame number to the exported sequences, which results in the export overwriting the sequence each frame. What I really want is a way to grab parts of the rendered camera name to use as the name and folder structure, since in our system we name the cameras "scene_shot_take_version". For the burn-ins I could simply "parse into array" the current camera display name and grab the info like that. We could start naming our master sequences by "scene" and the sub-sequences by "shot" and get those strings that way. However, getting the "take" and "version" (since we overwrite until the version is acceptable, we can't use the current {version} token) is not possible without some customization. Any advice that could help? We have folks who can, if needed, dabble in C++, but if I can point them in the right direction, that would save a lot of time and headache! Thank you!

    • @shauncomly (14 days ago)

      The only way to add a custom token is from C++. We are going to add ways to do that in MRG shortly with string variables. But {camera_name} should be supported. I just tried {camera_name}.{frame_number} and it worked fine.

    • @LeoLiesvirta (1 day ago)

      @@shauncomly Thank you! We managed to use python to override a custom user serialized_string variable to split camera_name into 4 parts (scene/shot/take/version) and use them as directories in the path (and the name). However we did notice that when getting the value of a user variable, it only ever returned the default value of it, never the value the user has overridden in the movie render queue-side. Overriding those values worked in python tho. So we got what we wanted. However I would have wanted to code in a "check if user has overridden a value, if so, use their value instead" into the python code.

    • @shauncomly (1 day ago)

      @@LeoLiesvirta When did you override it (at edit time? Or through an Execute Script node at runtime?) What are you overriding, the Job's version of the variable, or the Graph's default value for that variable? Are you using the variable as a node plugged into another node or overriding a final property directly? Are you able to share a small repro?

    • @LeoLiesvirta (15 hours ago)

      @@shauncomly We override in "on_shot_start" and then loop through the variables: "graph = job.get_graph_preset() # job is the "in_shot_copy" copy from on_shot_start. variables = graph.get_variables(True) for variable in variables: member_name = variable.get_member_name() if member_name == "OverrideOutputDirectory":" ... to find our variable, and then we set the variable to override it. To clarify, the overriding itself works fine. However, if we print the value of the variable it always returns its default value, never the value the user set in the Movie Render Queue window where all the user variables are exposed. Is there another point in the render process, or perhaps another command, to get the user-set value of the variables?
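For readability, here is the fragment quoted above laid out as Python. The function names (get_graph_preset, get_variables, get_member_name) are exactly as the commenter wrote them and are not verified here; the override call itself is omitted because the comment does not show it.

```python
# Inside the commenter's "on_shot_start" hook; "job" is the in_shot_copy they describe.
graph = job.get_graph_preset()
variables = graph.get_variables(True)
for variable in variables:
    member_name = variable.get_member_name()
    if member_name == "OverrideOutputDirectory":
        # ...override the variable's value here. The commenter reports the override works,
        # but reading the value back only ever returns the variable's default.
        pass
```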

  • @LeoLiesvirta (21 days ago)

    Amazing tutorial, thank you so much! I am using a master sequence and trying to pull data from the sub-sequences. Couldn't quite figure it out, since "Get Bindings" pulls zero instances from the master sequence... You wouldn't happen to know how?

    • @shauncomly (20 days ago)

      I wish I could post a pic here, but I'll try to describe it. The subsequence will show up as a "track". If you list your tracks in the master/main sequence you should see a "MovieSceneSubsection" track show up. Anyway: "Get Tracks" on your main level sequence, then loop through that array and "Get Sections". Now loop through your sections array and "Cast To MovieSceneSubSection"; this will filter out your subscenes. From the return of "As MovieSceneSubSection", connect a "Get Sequence" node and plug that return value into a "Get Bindings". That should return an array of the bindings in your subscene track.

    • @shauncomly (20 days ago)

      In hindsight, it would probably have been easier to just post some Python; there's a sketch of it after this thread.

    • @LeoLiesvirta (16 days ago)

      @@shauncomly Wonderful, thank you! Finally got the array of the subsequences. I don't think I could have figured this out on my own.. xD However, now I am trying to return just the one that is currently being rendered. I suppose I should do something similar to the 17:30 point in the tutorial, where you get the keys to compare whether the frame is the same as the currently played/rendered frame. However, I'm not sure how to pull out the current "key" of the subsequence being played.. x/ Please halp!

    • @shauncomly (14 days ago)

      @@LeoLiesvirta Is the problem that the subsequence is offset from the main sequence so the frame number in the subsequence does not match the main? If so you can get the section range start frame and just offset the render frame.

    • @LeoLiesvirta (10 days ago)

      @@shauncomly Ah, right! And I just compare the current frame of the master sequence to the offset subsequence frames to see which one is being rendered. I can probably assume that the array is in chronological order based on how they are ordered in the master sequence, so I can use an integer to keep track of the ones that have been "played" already and just add the offsets that way. Is that what you meant?
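A minimal Python sketch of the node chain described in this thread (Get Tracks → Get Sections → cast to Sub Section → Get Sequence → Get Bindings), including the section start frame Shaun mentions for offsetting the rendered frame. The sequence path is a placeholder; adapt it to your own assets.

```python
import unreal

# Placeholder path to the main/master level sequence.
main_seq = unreal.load_asset("/Game/Cinematics/MainSeq")

for track in main_seq.get_tracks():
    # Only subsequence (and shot) tracks carry sub sections.
    if not isinstance(track, unreal.MovieSceneSubTrack):
        continue
    for section in track.get_sections():
        if not isinstance(section, unreal.MovieSceneSubSection):
            continue
        sub_seq = section.get_sequence()
        if sub_seq is None:
            continue
        # Frame at which this subsequence starts inside the main sequence,
        # useful for offsetting the frame currently being rendered.
        start_frame = section.get_start_frame()
        for binding in sub_seq.get_bindings():
            unreal.log(f"{sub_seq.get_name()} (starts at {start_frame}): {binding.get_name()}")
```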

  • @mathieulanglois3259 (24 days ago)

    I have some differences between what I see in the viewport, color managed with ACES OCIO in sRGB SDR, and an output from MRGraph using the disable tonecurve option and OCIO in ACES, when I look at it in Nuke with the same viewport settings. Especially with pixels that go above 1, so remapped in SDR. They are identical when I render with traditional MRQ. Is this a known issue?

    • @shauncomly (20 days ago)

      Yes. There are likely a couple things at play here. The MRQ path employs the GPU OCIO path, unlike MRG which is all done CPU-side now. Our OCIO shader clamps negative colors (this could be argued to be undesirable but it does currently). This clamping is not present with MRG/CPU OCIO. I believe part of the difference here could therefore be explained by this gamut expansion + clamping happening just before OCIO. You can try and disable Expand Gamut and Blue Correction in the PPV or use the VPPassThroughPostProcessVolume (in VPUtilities plugin) which has all the required settings disabled by default (including local exposure).

    • @shauncomly (20 days ago)

      Also note, we just submitted a fix to allow negative values through the OCIO shader @ CL 34647810 in UE5 main.

    • @mathieulanglois3259 (17 days ago)

      @@shauncomly Hi Shaun, disabling the Expand Gamut and the Blue Correction in the PPV did the trick and I now have an identical image in Unreal and Nuke's viewports (OCIO sRGB Display SDR) when I render using MR Graph. Thanks for the help, problem solved! :) For the record, I just tested rendering with the Movie Render Graph with a build of UE5 Main and it seems to work without having to turn off the Expand Gamut and Blue Correction (similar to the way MRQ works when using OCIO). My build dates from July 25th so I'm not certain the fix you mention was already in there, as I'm not really sure how to find a CL #.

    • @shauncomly (17 days ago)

      @@mathieulanglois3259 Yeah you probably got the fix then.

  • @romeoinferno (1 month ago)

    Thanks for the tutorial. For some reason my hold out passes are producing opaque render layers. All the objects have a slight opacity to them. Also seems like the sky is completely black and hasn't been taken into account during the hold out. Any suggestions? Thank you!

    • @shauncomly (1 month ago)

      Check that all the components on the UDS actually have a holdout parm. I have not gone through the UDS and checked it. You may have to go mess with the individual components to get it working.

  • @SuperLordee (1 month ago)

    Thank you for this video, that was exactly what I was looking for!

  • @andrewgreenlee6132 (1 month ago)

    It freakin worked, and I have some big caches that I can't un "flatten" due to a GPU crash, you're the man dude thanks so much!

  • @benkatz3854 (1 month ago)

    The problem using this method for object mattes is that when you, for example, do the holdout for a floor to separate a cube from the floor, the separated cube loses the bounced/reflected light from the floor, so it changes the final result. Is there any way to do this without losing the reflected light from the floor?

    • @shauncomly (1 month ago)

      I go over it in the video. Enable Visible in Indirect While Hidden and make sure you don't have Screen Traces on.

  • @ToastNz (1 month ago)

    Hey there I was wondering if there was a way to isolate or tag individual plants or types of plants within the foliage actor, i tried to tag the instanced static mesh components within it but it sadly didn't give the desired result. thanks heaps for the video!

    • @ToastNz (1 month ago)

      Update for anyone else wondering: I managed to get it going by manually ticking visibility boxes within the components and separating out different MRQ/levels for FG, MG, and BG.

    • @shauncomly (1 month ago)

      @@ToastNz Hey! Sorry, I was away for a bit. Yes, this is the only way atm.

  • @AndyBean-y6j (1 month ago)

    Just stumbled upon this. This was great Shaun, really helpful - thnx!

  • @FilmByteProductions (1 month ago)

    hello, is it possible to render object IDs in movie render graphs?

    • @shauncomly (1 month ago)

      It's not supported yet, unfortunately. How much do you use them? Does the lack of post-effects support make them less effective?

  • @iamhere56 (1 month ago)

    How do I enter a console variable?

    • @shauncomly (1 month ago)

      There is a set console variable node

    • @iamhere56 (1 month ago)

      @@shauncomly ok thanks🙏🥰

  • @Daniel_Crane (2 months ago)

    Awesome tutorial, thank you! If I had a blueprint with two static meshes inside it, and wanted to set only one as a holdout, how would I do it? I tried using component tags within the collection node condition group and I can't get it to work.

    • @shauncomly (2 months ago)

      Component tags are the only way. It should work. If it's a spawnable, make sure you add it to the asset. When it respawns it wipes the tags from components unless you save them to the asset. We are working on better support for spawnables in 5.5.

  • @ahmedadel3073 (2 months ago)

    Hi Shaun, how can I do render passes for reflection, refraction, and ambient occlusion in the UE 5.4 Render Graph? Thank you.

    • @shauncomly (2 months ago)

      You can get to some of the buffers under the viewmode index on the renderer node, or use the lighting components under the show flags. Otherwise, you would make a PPMaterial and grab the buffer like normal. We are going to add the path tracer lighting components in the PPV to the pathtracer node in 5.5.

  • @ak-gi3eu (3 months ago)

    Does this affect render time?

  • @zacharydembinski7486 (3 months ago)

    Thank you so much Shaun for this breakdown. Really helpful! In the video you said we can download the more complex `MRG_MainSeq` example referenced at 3:40. I've downloaded the latest Meerkat Demo on the `Samples` page on Epic Launcher and it does not have this MRG in there. Is there another place we can grab it from. Thanks!

    • @shauncomly (3 months ago)

      It's supposed to be, lol. Let me see if I can see why it's not updated.

  • @sergibro (3 months ago)

    @shauncomly would this method work for connecting a follow cam on mesh component actors? Or must the actors be a rig or have bones in order to constrain camera? Your vids are great, thanks for taking the time creating these types of videos.

    • @shauncomly (3 months ago)

      No, it would work with anything that has active transforms. You would just have to change the BP to be able to get at the component you want.

  • @Almaironn (3 months ago)

    Hi Shaun, thanks for this feature, very useful! I was wondering if you ever tried rendering ExponentialHeightFog/AtmosphericFog on a separate layer, while holding out the rest of the scene. I am trying this and it's giving me some artifacts around the edges and especially in the alpha channel. Am I missing something?

    • @shauncomly (3 months ago)

      Yeah, we tested this and it worked. Can you post on the forums and include an example image?

  • @ryanjstever (3 months ago)

    This was great, thanks so much! Can we now render out multiple passes all at the same time, instead of rendering each pass separately? Emissive, specular, diffuse, etc.

    • @shauncomly (3 months ago)

      You can render out the viewmode index. But honestly in the deferred render path those passes are not going to give you something where you can build your frame back like you can in VRay/Arnold. You might be able to get a little closer with Pathtracer.

    • @ryanjstever (3 months ago)

      @@shauncomly No no, myself and most likely others that need this are probably using the path tracer. Thanks for the response. Can't wait to give this a try!

  • @iPEMiC. (3 months ago)

    How to render Niagara particles separately in a scene?

    • @shauncomly (3 months ago)

      Niagara still has limited support, but you would just hold everything else out.

  • @HomAxz (3 months ago)

    Your contribution to the unreal learning hub is invaluable. Thank you.

  • @EMNM66 (3 months ago)

    Amazing tutorial! Is there a way to save render layers separately in different folders? Also, can you make a tutorial on render passes? Pleaseee

    • @shauncomly (3 months ago)

      Yes you can add the {layer_name} token to the output path.

    • @EMNM66 (3 months ago)

      Thank you @@shauncomly, will it give me layers in separate folders?

    • @shauncomly (3 months ago)

      @@EMNM66 yep

    • @EMNM66 (3 months ago)

      @@shauncomly Hey, how do I get only the emissive pass in UE? I don't see it anywhere.

    • @shauncomly (3 months ago)

      @@EMNM66 There is no one-button solution. You would need to setup the pass by holding out everything else.

  • @John_atCTS (3 months ago)

    Thanks for the video Shaun. A MRG question - have all the MRQ settings been implemented into nodes in 5.4? I'm trying to find the "Render all Cameras" setting/node and it's not listed in the Camera Settings node.

    • @shauncomly (3 months ago)

      Nope not all. They are still working on getting everything in. That one is not in there yet.

    • @John_atCTS (3 months ago)

      @@shauncomly - Thanks for the info! Can't wait for the next set of updates. Keep the MRG videos coming!

  • @behrampatel4872 (3 months ago)

    super informative video. Thank you b

  • @davidratajczak8408 (3 months ago)

    Is there a way, then, to render out an environment with atmospheric effects but without the sky? Like having a rainforest scene with some fog in it, but being able to export everything but the sky into Nuke so we could place our own sky instead?

    • @shauncomly (3 months ago)

      Yes all that is supported with the new render attributes. You can just hold out the sky or make it visible in indirect illumination.

  • @simonflash_music (3 months ago)

    I remember using UE4 for rendering, even before MRQ was a thing. Finding info on making cinematics was tough back then. With the tools and info you guys are working on, it's a whole new world of possibilities. Big thanks to your team for that! Really appreciate it.

  • @YuuJer (3 months ago)

    This is awesome. Could you potentially just stack post process stuff into sublevels and render out a buncha passes with different post process setups? Or is there a way to mute and unmute Sequencer tracks with this?

    • @shauncomly (3 months ago)

      I have not tried that but you could theoretically do it. And there is no way to mute a track natively inside of the render graph. The graph doesn't really have that kind of a direct connection to Sequencer. You could disable things with a script though.

  • @user-ie7zr1hc1c (3 months ago)

    I want to know how to render multi-channel in UE, such as how to render shadows

    • @shauncomly (3 months ago)

      Not natively atm. In the future, it will probably be there with the Pathtracer. For the deferred renderer it's a little harder, as that information is gone by the time MRQ gets it. They would have to make a custom buffer to hold shadows until the end, which is not something they want to do. You can make the pass, though, by going old school and just putting a white material on everything (a rough Python sketch of that follows this thread). We did just request a receive-shadows parm to make that easier.

    • @user-ie7zr1hc1c (3 months ago)

      thanks
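A rough Python sketch of the "white material on everything" approach mentioned above, assuming you have already made an unlit, pure-white material; the asset path is a placeholder. Run it on a disposable copy of the level, since it stomps every material slot it touches.

```python
import unreal

# Placeholder path to an unlit white material you create yourself.
white_mat = unreal.load_asset("/Game/Materials/M_UnlitWhite")

actor_subsystem = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)
for actor in actor_subsystem.get_all_level_actors():
    for comp in actor.get_components_by_class(unreal.StaticMeshComponent):
        for slot in range(comp.get_num_materials()):
            comp.set_material(slot, white_mat)
```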

  • @tmaintv (3 months ago)

    This is awesome Shaun. Really great additions. How’s it looking for rendering shadow catchers/layers on alpha with shadows now?

    • @user-ie7zr1hc1c (3 months ago)

      I want to know how to render multi-channel in UE, such as how to render shadows

    • @shauncomly (3 months ago)

      Not natively atm. In the future, it will probably be there with the Pathtracer. For the deferred renderer it's a little harder, as that information is gone by the time MRQ gets it. They would have to make a custom buffer to hold shadows until the end, which is not something they want to do. You can make the pass, though, by going old school and just putting a white material on everything (see the Python sketch above). We did just request a receive-shadows parm to make that easier.

    • @tmaintv (3 months ago)

      @@shauncomly Thanks Shaun, great to know. It's the most common request from comp, especially when working with live action. Yes, it is just about achievable with a mix of post process and the existing lighting passes, but I find it very messy. If there's a possibility of it making it in, in whatever form, fantastic.

    • @shauncomly (3 months ago)

      @@tmaintv I'll pass the feedback along to the product team. It's a common request for us as well.

  • @sunnynoop (3 months ago)

    Awesome tutorial, thanks Shaun! One thing I noticed after building this out is that whenever I open the project, the livelink capture doesn't work unless I go and recompile the Live Link BP. Has this been an issue for anyone else/anyone know if there's a way to get live link roles to work already upon project opening?

    • @shauncomly (3 months ago)

      Yeah, you have to re-initialize it, which is annoying. We actually just came out with a plugin that makes all this so much easier. I considered taking these videos down, but it's not a bad thing to learn about Enhanced Input. Anyway, this all makes working with a controller so much easier. dev.epicgames.com/community/learning/tutorials/xpV1/live-link-input-device-a-new-way-to-use-game-controllers-for-performance-capture-in-unreal-engine-5-4

  • @lolboy9501 (4 months ago)

    Hey Shaun, great tut btw this looks interesting. This is unrelated but I couldn't find any other way to reach out to you. I saw a help thread in which you mentioned being able to create a UV pass by plugging a TexCoord node into Emissive Colour and using that as a render pass material. However, (i wish i could attach images) the pass this creates in the viewport and in the render is just a fullscreen green, black, red and yellow gradient. The principle is correct, but it's not actually attached to any individual geometry like it is for example when you render a UV pass in Maya, so it's unusable in the compositing stage. Any idea how the proper render pass could be achieved? Would be invaluably useful for our team :)

    • @shauncomly (4 months ago)

      Ahh yeah after thinking about it the UVs are gone by that stage. You would need to write an event to reassign the material to the RGB pass.

    • @lolboy9501 (4 months ago)

      I see, is there any way you could explain how to go about writing that event or is it too complex to write over a youtube comment?

    • @shauncomly (4 months ago)

      @@lolboy9501 Just get all the actors in your scene and reassign the material. You can launch a custom event from a cvar entry in MRQ with "Ke * CustomEventName" (a rough sketch of adding that from Python follows this thread).
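A sketch of adding that "Ke *" command to an MRQ job from Python. The start_console_commands property name is assumed from the "Start Console Commands" field shown in the Console Variables setting, and "ApplyUVMaterial" is a hypothetical custom event you would implement in your own Blueprints.

```python
import unreal

subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
queue = subsystem.get_queue()
job = queue.get_jobs()[0]  # assumes the queue already has at least one job

config = job.get_configuration()
cvar_setting = config.find_or_add_setting_by_class(unreal.MoviePipelineConsoleVariableSetting)

# "Ke * EventName" fires a custom event on every actor in the level;
# append it so it runs when the render starts.
commands = list(cvar_setting.start_console_commands)
commands.append("Ke * ApplyUVMaterial")
cvar_setting.set_editor_property("start_console_commands", commands)
```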

  • @TheFearofThePast (4 months ago)

    Thank you so much. Please, I need your help. I am trying to upload my game to Epic Games, and Epic requires developers to use a command prompt. The problem is they do not give us any direction on how to upload it.

  • @sergeydet3004 (5 months ago)

    Thank you very much! This helped a lot!

  • @berraknil (6 months ago)

    thanks for making this, for anyone else stuck at 4:56 like me, you need the Composure plugin enabled first, to be able to get that connection node.

  • @vessel825 (6 months ago)

    Hi, thank you for making this tutorial, it is really helpful. But I have a question: how do I use a CineCamera instead of the rig rail? I tried to, but it didn't work properly. Is there something I have to consider first?

    • @shauncomly (6 months ago)

      What specifically are you trying to do with the CineCamera? Control the lens or control the transforms? If you are trying to control the transforms, I suggest you put your controls on an empty actor or something and parent the CineCamera to it.

  • @FF-FAN9999 (6 months ago)

    awesome video! thank you!

  • @chah1653 (7 months ago)

    How to export a mp4 with audio?

    • @shauncomly (7 months ago)

      Just add it to your command line with another -i. "-i video1.mp4 -i audio1.mp3"

    • @chah1653 (7 months ago)

      @@shauncomly the command line component inside movie render queue config? Do I also need to enable wav?

    • @chah1653 (7 months ago)

      @@shauncomly I tried adding -i video1.mp4 -i audio1.wav in "Additional Command Line Args" in the Command Line Encoder, and the log said there is no video called video1.mp4. I also replaced video1.mp4 with {Videooutput} and it still failed. Is that the correct way to add the command line?

    • @shauncomly (7 months ago)

      @@chah1653 You will not be able to add that as an additional command. It will be fixed in 5.4 but there was a bug that would not allow you to include more than one input using the command line encoder. You can get 5.4 off of Git and try and merge the code. Or you'll have to do it in a separate operation until we release it around March/April.

    • @chah1653 (7 months ago)

      @@shauncomly Right now I do indeed do a separate operation after rendering out the wav and mp4 in UE. So there is no way to put audio and video together directly inside UE... Sequencer rendering is not an option; I use Movie Render Queue.
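For that separate operation, a typical standalone mux step looks like the following; the file names are placeholders for whatever MRQ's video and .wav outputs are called.

```
ffmpeg -i render.mp4 -i audio.wav -c:v copy -c:a aac -shortest render_with_audio.mp4
```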

  • @MrBrubobaggins (8 months ago)

    Any update on this for 5.3? Seems like the input actions are a bit different and don’t seem to be working for me since updating.

    • @MrBrubobaggins (8 months ago)

      😅 Never mind, as usual one little check box fixed my problem.

    • @sergibro (1 month ago)

      @@MrBrubobaggins I'm having the same issue, do you remember what the little check box was?

    • @MrBrubobaggins (1 month ago)

      @@sergibro For me it was in the EUA_Spoof_Video BP: the second node that is used to set Receive Editor Input was checked off when I updated. You will see it for a second at 3:14. Good luck!

  • @yang5616 (8 months ago)

    More hardcore tutorials, please!!

  • @thecommenterry (9 months ago)

    Where can I get perforce? I looked at their website but wasn’t sure which thing to download

  • @Pikameleon (9 months ago)

    Hello again! I was wondering if you know anything about using the "VCam Actor" in a multiplayer session? When I click enable on the VCam, all the viewports on the host and clients lock to the VCam. I'm trying to only have the client be able to use the VCam. I just haven't found much about the viewports on the VCam actor and having it on display for a client without having it show up on everyone else's viewport. Cheers

    • @shauncomly (9 months ago)

      Check out VP Roles. You can set them in Switchboard or manually in UE. Then, on the VCam actor, set the role you want the VCam HUD to display to at the top. dev.epicgames.com/community/learning/courses/lPA/unreal-engine-virtual-production-visualization-guide-chapter-1-fundamentals/mM17/unreal-engine-virtual-camera docs.unrealengine.com/5.3/en-US/virtual-camera-multi-user-quick-start-guide-in-unreal-engine/#:~:text=VCam)%20Actor%5B%E2%80%A6%5Dtchboard%20application

  • @Pikameleon (10 months ago)

    This is awesome! I'm trying to set this up for a Live Link session with my Rokoko motion capture suit, although when I set up my pawn as the target it's only getting the actor transform. I think I need some form of root motion or a bone to set as the target. Do you have any ideas on how I could set it up to follow a specific bone on my character mesh so it follows the performance?

    • @shauncomly (10 months ago)

      You can constrain an empty actor or something to a bone or a socket and then derive the transforms from there. czcams.com/video/33b1IFBhEBw/video.html

    • @Pikameleon (10 months ago)

      thanks! Constraints are great. I believe this only works inside of sequencer. I was trying to set it up where it works in game, during a live link performance capture session. In the blueprint you made, you make a soft object reference as the target, I was wondering if that target could be a bone or a constraint? That's what has the transforms in my case. With this method, I think setting up a level sequence in the blueprint, where the sequence has the follow actor constrained to a bone on the live-link skeleton, maybe that would work? all the best @@shauncomly

    • @shauncomly (10 months ago)

      @@Pikameleon Constraints work outside of Sequencer. They work in the editor as well. I did a quick test and it should work for you.

    • @Pikameleon (10 months ago)

      @@shauncomly wow I didn’t realize that! That worked!! I followed this tutorial as well as your 4UP tutorial and I just want to say thanks because this was exactly what I was looking for!

    • @shauncomly (10 months ago)

      @@Pikameleon Awesome! Glad we can help. If you haven't already check out the Vis Guide on Epic DevHub. Now that UE Fest is done we are working on the next installment. We'll periodically post things there that we think are helpful to people working with performance capture and vcam in Unreal Engine. dev.epicgames.com/community/learning/paths/XRe/unreal-engine-virtual-production-visualization-guide

  • @uzizizu (11 months ago)

    Hi Shaun! Can you please tell the devs to let ffmpeg run as a real command? I mean, I want to play the video right after the render and I can't, because everything gets treated as ffmpeg input. For example, I want to run this command in the project: ffmpeg -hide_banner -y -loglevel error {VideoInputs} {AudioInputs} {AdditionalLocalArgs} -acodec {AudioCodec} -vcodec {VideoCodec} {Quality} "{OutputPath}" && "C:\Program Files\DAUM\PotPlayer\PotPlayerMini64.exe" "{OutputPath}"

  • @RamboNisse (11 months ago)

    This was so helpful! The Perforce environment setting was not easy to find without you. We are building an LED stage in Sweden, any chance we can get in touch? :-D

    • @shauncomly (11 months ago)

      Hit me up on LinkedIn or something. I can connect you with someone at Epic who supports stages.

  • @neilb143 (1 year ago)

    YOU ARE A GOD

  • @blakeXYZ (1 year ago)

    Thank you for this great look into Textures and how they can easily fly under the radar and eat up GPU memory. This was very enlightening, thanks for sharing your expertise! Cheers

  • @sebastianvillalba9472

    Thank you. You saved me days of struggle. You're a hero!