Posts by PWC

    Is there a way to create user presets/defaults that one can later select in the Voukoder GUI?

    Let's say by creating a local JSON or XML file which is also loaded when the GUI starts?

    I know that it is possible to store presets in the individual programs (Premiere Pro, and so on) - but that means that when multiple programs are connected to the same Voukoder installation, these presets have to be maintained separately in every program.
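    A sketch of what such a shared preset file could look like - the file format, field names, and parameter values here are purely hypothetical (Voukoder defines no such format today); the encoder parameters are borrowed from the ffmpeg command later in this thread:

```json
{
  "presets": [
    {
      "name": "HEVC NVENC high quality",
      "encoder": "hevc_nvenc",
      "params": {
        "preset": "p7",
        "tune": "hq",
        "rc": "vbr",
        "pix_fmt": "yuv444p16le"
      }
    }
  ]
}
```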

    Thanks for considering!

    Cool!

    P.S. It's no problem if I get an error on unsupported GPUs. When using such hardware features, one has to know what the GPU's capabilities are, of course ;) The only alternative is either implementing complicated capability checks (not really necessary, if you ask me) or losing such wonderful functionality...

    Hello,

    First of all: great project. I don't know what I would do without it... It puts the default encoding capabilities of various video editing programs to shame...

    As to my issue:

    I have an Nvidia 1080 Ti which, at color depths higher than 8 bit, supports not only YUV 4:2:0 hardware encoding but also 4:4:4.

    ffmpeg.exe -h encoder=hevc_nvenc

    >Supported pixel formats: yuv420p nv12 p010le yuv444p p016le yuv444p16le bgr0 rgb0 cuda d3d11

    When using ffmpeg from the command line, I get 4:4:4 output.

    That means that even if I have 4:4:4 12-bit footage (yuv444p12le), NVENC will auto-switch to the next higher pixel format without quality loss:

    ffmpeg.exe -i in.mov -c:v hevc_nvenc -preset p7 -tune hq -rc vbr -multipass:v fullres -qmin:v 15 -qmax:v 20 -pix_fmt yuv444p12le -an "out.mkv"

    >Incompatible pixel format 'yuv444p12le' for codec 'hevc_nvenc', auto-selecting format 'yuv444p16le'

    I realize that it will cost more disk space in such an incompatible-pixel-format case, but at least I do not lose any quality.
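    The "no quality loss" point can be illustrated with a minimal sketch (this is not how NVENC repacks samples internally, just the underlying arithmetic): every 12-bit value fits inside a 16-bit container, so padding 12-bit samples into yuv444p16le is fully reversible.

```python
# Illustrative only: a 12-bit sample stored in a 16-bit container by
# left-shifting can be recovered exactly, i.e. the repack is lossless.
def pack_12_to_16(sample: int) -> int:
    assert 0 <= sample < (1 << 12)  # valid 12-bit range
    return sample << 4              # 12 significant bits in the top of 16

def unpack_16_to_12(sample: int) -> int:
    return sample >> 4

# Every possible 12-bit value round-trips without loss:
assert all(unpack_16_to_12(pack_12_to_16(v)) == v for v in range(1 << 12))
print("lossless")
```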


    But when I select HEVC (NVIDIA NVENC) in Voukoder, the only thing I can select is 8 bit or 10 bit - not the chroma subsampling, not 12 bit, and not the pixel format itself. I then end up with a 4:2:0-encoded 10-bit video, losing quality in both color depth and chroma subsampling.

    When selecting CPU encoding there is a detailed dropdown for the pixel format (4:2:0 10 bit, 4:2:2 10 bit, 4:4:4 10 bit, and so on) - but encoding on the CPU takes ages...

    Would it be possible to add support for that?

    If not a specific chroma subsampling option (as there might be lots of combinations depending on the hardware):

    https://developer.nvidia.com/video-encode-a…port-matrix-new

    Maybe at least an option to specify pix_fmt manually as a filter parameter?

    Thanks a lot in advance!