LRGB Complete Workflow: Part 3 -- Advanced Layer-Based Developing

  • Added 5 Sep 2024

Comments • 16

  • @AmatureAstronomer • 1 month ago

    Interesting.

  • @davidemancini7853 • 1 month ago

    Very nice, is there any chance you could make a tutorial series on how to use Affinity?

    • @SKYST0RY • 1 month ago +1

      Thank you. I don't have immediate plans to do so, but perhaps down the road. All of my processing videos use Affinity, and there are thousands of videos out there on standard usage of Affinity. BTW, right now you can get a six-month free trial. Just go to the Affinity Photo website and download it.

    • @davidemancini7853 • 1 month ago

      @@SKYST0RY Yep, I saw that!

  • @JoshuaAkonom • 1 month ago

    Oh, also one more question. Can I extract the luminance from an OSC in PI and then save that as a TIFF and bring it over to Affinity for the frequency separation steps?

    • @SKYST0RY • 1 month ago +1

      OSC cameras don't produce an L channel; they only produce RGB channels. The closest you could do would be to extract the lightness and produce a pseudo-L channel. But there is no point in it, really. All the information is already in the RGB channels.
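A pseudo-L channel of the kind mentioned can be sketched as a weighted sum of the RGB channels. This is a minimal illustration only; the `pseudo_luminance` helper and the Rec. 709 weights are assumptions for the example, not a feature of PI or Affinity:

```python
import numpy as np

def pseudo_luminance(rgb):
    """Collapse an RGB image of shape (H, W, 3), floats in [0, 1],
    into a single pseudo-luminance channel using Rec. 709 weights."""
    weights = np.array([0.2126, 0.7152, 0.0722])
    return rgb @ weights  # weighted sum over the channel axis

# Tiny example: a 2x2 RGB patch with one white pixel
patch = np.zeros((2, 2, 3))
patch[0, 0] = [1.0, 1.0, 1.0]   # white -> pseudo-luminance 1.0
L = pseudo_luminance(patch)      # shape (2, 2)
```

The resulting single-channel array could then be saved as a TIFF and treated like a mono luminance frame, though, as noted above, it contains nothing the RGB channels didn't already have.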

  • @JoshuaAkonom • 1 month ago

    Is there any reason not to work with a 32-bit TIFF over 16-bit? I sometimes use James Ritson's macros, and they have both 32-bit and 16-bit versions. I've only been working in 32-bit, as this is the first time I'm even using an image editing program like Affinity. I'm just wondering if there are any pros and cons to it.
    Your workflow is vastly different from what I've learned so far, but it's great information to add to my own. I'm looking forward to part 4, especially seeing how you get that OIII out. I've been struggling to find a way to bring it out in my Crescent Nebula.

    • @SKYST0RY • 1 month ago +1

      There are two reasons to consider your image bit depth in AP. One is your camera's bit depth: there is no point going beyond it, because you won't get any additional information. For example, my Ares-M camera has a 14-bit depth, so I keep my file sizes to 16 bit. The other is that 32-bit images are considerably larger and tax your computer's resources more. It's a lot easier for AP to handle 16-bit files than 32-bit, especially if your images are drizzled or come from a camera with a sensor that produces very large files.
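The file-size difference is simple arithmetic. As a rough sketch (the frame dimensions below are hypothetical, chosen only for illustration):

```python
# Rough memory footprint of a single mono frame at different bit depths.
# The sensor dimensions here are hypothetical, for illustration only.
width, height = 9576, 6388
pixels = width * height

mb_16bit = pixels * 2 / 1024**2   # 16-bit integer: 2 bytes per pixel
mb_32bit = pixels * 4 / 1024**2   # 32-bit float:   4 bytes per pixel

print(f"16-bit: {mb_16bit:.0f} MB")
print(f"32-bit: {mb_32bit:.0f} MB")
```

A 32-bit file is exactly twice the size of the 16-bit version of the same frame, and drizzling (which multiplies the pixel count) compounds this further.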

    • @j.s.3407 • 1 month ago +1

      @@SKYST0RY I don't believe you can discard bit-depth considerations once you've applied a non-linear stretch to the image. Once you've done that, you have moved into the FP32 domain to maintain full fidelity.

    • @SKYST0RY • 1 month ago

      @@j.s.3407 When information is up-converted to a higher bit depth, it is linearly multiplied. No new information is gained; it is just scaled up, and it is still linear. When a nonlinear transformation is then applied, the stretch may fit that higher bit depth, but that is merely a software procedure and a temporary state. No new information is gained. In fact, by the time the image is developed and output to a monitor, there will be many substantial down-conversions.
      Of all the down-conversions, the most significant (and essentially the gatekeeper) is the monitor. Whether the camera had a 10-, 12-, 14- or 16-bit sensor, and regardless of whether the software worked in 32-bit FP, the monitor will simplify the output unless you are using an extremely high-end display. I don't know of any 32-bit monitors, though maybe they are out there somewhere; most monitors display only 8 bits, and better ones 10 bits. That is hardly limiting for the final image, though: the average person perceives only about a million colors and luminance levels, more visually sensitive people may perceive up to 100 million, and 10 bits already gives a range of 1.07 billion. What this means (unless I am misunderstanding the underlying mathematical theory and my education in Gestalts of perception) is that very high sensor bit depths and 32-bit software processing aren't going to give anything extra in the end. These things are somewhat like marketing gimmicks, in the same way camera companies sell novice prosumers cameras with super-high ISO ranges and megapixel counts. Except . . . not exactly: if monitors could match their output to the sensor's bit depth or the software's capacity, it would make a difference. But at this point in time, not much hardware is designed to work with images that would be gigabytes in size. I am using a BenQ monitor made for photo/video editing, and it only operates at 10 bits. I shudder at the thought of the demands a 32-bit display would place on memory size and speed.
      So, in practice, there isn't much point in going beyond your sensor's bit rating. It doesn't yield an actual benefit, and if it did, you wouldn't be able to perceive it anyway. The monitor will crush everything down to 8 or 10 bits in the end.
      Richard Wright wrote several excellent articles on this in his Sky & Telescope blog. Linked is the key article: skyandtelescope.org/astronomy-blogs/imaging-foundations-richard-wright/astrophotography-bits-bytes-dynamic-range/#:~:text=A%2016%2Dbit%20image%20would,bit%20data%20from%20their%20camera.

  • @ritacastil • 1 month ago

    Thanks a lot for sharing your workflow! I am following along as best I can, with the needed adaptations for a single dual-narrowband filter. I have never used Affinity Photo, and I stumbled on getting the Curves tab under the screenized layer. Can you let me know how I get it? I am eagerly waiting to get to the end of part 3 with my Crescent Nebula photo! All the best!

    • @SKYST0RY • 1 month ago

      Do you mean you put your curves layer in the wrong place? You can click and drag any layer up or down the layer stack. Adjustment layers will affect everything below them. To place an adjustment layer inside another layer (so it only affects that layer), drag the adjustment layer right on top of the other layer till the layer is highlighted, then let go.

    • @ritacastil • 1 month ago

      @@SKYST0RY Hi, no, I mean I do not know where the curves are! I do not see that window. Thanks!

    • @SKYST0RY • 1 month ago

      @@ritacastil In the menu up top, go to Layer > New Adjustment Layer > Curves. But it is faster to use the shortcut menu beneath and to the right of the layer stack. Click the circle with the half-moon in it to pull up the adjustment layers.

    • @ritacastil • 1 month ago +1

      @@SKYST0RY Yes!! Thanks!

    • @jdmatthew • 1 month ago +1

      @@SKYST0RY Thanks. After 20 minutes of looking, your reply made it simple.