What are your output resolution and framerate?
Posts by Joe24
-
Does rendering the same project without using Voukoder produce a playable file?
-
To remove the session limit on Nvidia GeForce cards, search for the driver patch by keylase on GitHub. NB: You will have to re-apply the patch every time you update drivers or change graphics cards.
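On Linux the whole procedure is only a few commands (a sketch from memory; the Windows procedure is different and is described in the same repository):

```shell
# Sketch, assuming a Linux system with the proprietary Nvidia driver installed.
# Remember: this must be re-run after every driver update.
git clone https://github.com/keylase/nvidia-patch
cd nvidia-patch
sudo bash ./patch.sh   # lifts the NVENC concurrent-session limit
```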
-
Is your problem system a pre-Haswell Intel processor, by any chance? And are you trying to use QSV encoding? Some library that FFmpeg uses appears to be incorrectly reporting QSV encoding capabilities of older Intel processors, and this is screwing up several programs including FFmpeg, Zoom, OBS, etc. when running on these systems.
This may or may not be your issue; I'm just throwing ideas out there. I've had trouble on Sandy/Ivy Bridge systems when using QSV processing in FFmpeg-based programs. Pre-QSV chips (Nehalem and prior) don't seem to have this problem, nor do Haswell and later.
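One way to check whether QSV actually works on a given box is to make FFmpeg initialize it directly from the command line (a sketch; assumes an FFmpeg build with QSV support):

```shell
# Generate one second of test video and push it through the QSV h.264 encoder.
# On affected pre-Haswell systems this should fail at device/encoder init.
ffmpeg -hide_banner -init_hw_device qsv=hw \
  -f lavfi -i testsrc=duration=1:size=1280x720:rate=30 \
  -c:v h264_qsv -f null -
```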
-
The reason that I recommended Quadro cards was because they were proven to be more stable than GeForce for rendering.
I guess you're referring to ECC? Probably not a concern for anybody whose budget is $200, and to me a Pascal Quadro is not worth giving up the improved Turing NVENC.
It's also pretty pointless to run ECC memory on your graphics card, but still run non-ECC memory on your computer mainboard. So add a bunch of server RAM into your budget if you want to get *that* carried away. Assuming your platform even supports server memory.
-
RTX 3060 12GB falls into the OP's requirement by their computing power & larger VRAM size initially.
Doesn't matter how much memory you have if your GPU is breathing through a straw. The 3060 only has a 192-bit memory bus, which is why it performs about the same as the cheaper 2060S. I wouldn't buy a 3060; I don't think they're worth the money. The lowest 30-series card I'd consider is a 3060 Ti.
I own many cards from 10-, 20-, and 30-series, everything from a 1060 to a 3090, including a 3060 Ti which has a 256-bit memory bus and is around 30% more card than a 3060. And yet my 2070 Super finishes my jobs significantly quicker than the 3060 Ti (which is not to say everybody's workload will behave the same). I suspect this is due to the 2070S's higher clock speed. Don't buy a card just so you can tell your mommy you have a 30-series; they're not necessarily better, as I've said multiple times already.
2080 Ti cards with their oddball 352-bit bus don't work well with my CUDA workload (Vegas/TMPGEnc). Maybe it's just my application. The 2080 Ti is stable and I love the cards (I have 2 of them), but somehow they're way slower at this job than other cards with a 256-bit bus. I assume the odd bus width doesn't mesh with the particular CUDA instructions my application uses. Which is another example of a hotter card not necessarily being better. Haven't tried a 10GB 3080 yet, though I suspect it would have the same issue.
I did some research.
Aka, you googled it. Yeah, I spent a long time googling too, when I first started out. Then I realized that 99.5% of people on the Internet don't actually know anything; they just copy and paste uninformed theories from OTHER people who don't actually know anything. That realization is an important step in growing up. I bought several cards that the Internet swore were the best for Vegas. I was disappointed every time. So I just started buying various cards and running my own tests against my particular workload... and that's still what I'm doing as new tech comes along.
-
You really should point my fault out.
Do I have to? Okay, this is not productive or constructive for anybody, but here we go....
First of all, with a $200 budget and an 8000-series Intel processor, the OP is not a high-powered user. He's looking for hardware encoding and some light effects, by the sounds of it.
I explained the different generations of Nvidia cards, and why Ampere is a waste of money for encoding, and Lovelace (while somewhat better for encoding) is obviously way out of his price range. I suggested a card (2060S) at the price point he was looking for, and with what should be very good specs for his purposes.
You came in with a misinformed wall of text touting every high-powered card you'd ever heard of. Despite the facts that: this clearly wasn't what he was looking for, encoding wouldn't benefit from these cards (as I'd already explained), Vegas editing wouldn't benefit much from these cards, and they were also significantly out of his price range (a Quadro, when the guy asked about $200 cards? Really?).
I did not provide a bad answer compared to yours.
I answered his question, based on my own experience trying to improve Vegas' performance. You shouted me down by copy and pasting half the Internet, which didn't really apply to his situation or to Vegas.
... gluing humilation with context.
I don't speak whatever language this is.
You are actually recommending 40-series cards, which are NOT older than mine recommendation.
If you actually read what I wrote above, you'll see I was not recommending Lovelace cards to this gentleman. Quite the opposite. They supposedly have better onboard NVENC encoders, but they are significantly out of his price range and are not a good fit for him.
Nor is it simply a matter of card age, which I explained (Pascal vs. Turing vs. Ampere). For instance, a 2060 Super has the same NVENC module as a 3060 and 24% more memory bandwidth.
I will say that you were right on one thing: Faster processors make everything better, including Vegas. But the OP's rig isn't that bad for what it seems like he's doing, and on a $200 budget he's certainly not going to buy an entirely new platform and a 13900K or 7950X.
-
Vegas is notorious for not benefiting from hot graphics card hardware. Which you'd know if you had any experience with it. I've been working with Vegas for over a decade, on over a dozen different graphics cards.
I'm here to share what I know, because somebody asked. I didn't come here to waste time arguing with somebody who has no experience with Vegas and who clearly doesn't know what they're talking about.
-
"I haven't used Vegas Pro so I could only guess."
-
Regarding a dedicated graphics card for encoding, low-end Nvidia Turing cards (excepting the GTX 1630 and GTX 1650) are the best bet at the moment: any Nvidia card from the 1650 Super on up.
Turing cards (16/20-series) produce notably better h.264/h.265 picture quality at the same bitrate than Pascal cards (10-series). To me the difference looks about the same as decreasing the Constant Quantizer setting by 1, but with the same file size.
Ampere cards (30-series) use the same NVENC module as Turing. The NVDEC decoder has a slight improvement, but not the encoder.
Lovelace cards (40-series) have better NVENC modules, but are outside your $200 range at the moment.
Worth noting too that a higher-end card won't gain you much encoding performance. Turing and Ampere cards all have the same NVENC encode module; it's a separate block that has nothing to do with how big the GPU is or how many CUDA cores it has. (If you're doing effects or filters etc. that also use CUDA, that's different...) Since a 1650 Super and a 3090 Ti carry the same NVENC module, the only encoder-relevant difference between them is memory bus width/speed.
The RTX 2060 Super has a 256-bit memory bus with 14 Gbps GDDR6. Might be close to the sweet spot for you.
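The bandwidth gap I keep mentioning is easy to sanity-check yourself (rough figures; bus widths and data rates are taken from the cards' published specs):

```shell
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps
echo "2060 Super (256-bit @ 14 Gbps): $((256 / 8 * 14)) GB/s"   # 448 GB/s
echo "3060       (192-bit @ 15 Gbps): $((192 / 8 * 15)) GB/s"   # 360 GB/s
# 448 / 360 is roughly 1.24, i.e. the 2060 Super has ~24% more bandwidth
```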
-
It's something to do with your resolution. I tried a known working project on my rig (VP15, 3060Ti, Voukoder 13.0.2, Nvidia driver v531.29DCH patched with keylase), and I can make it choke by changing either of the resolution values (width or height) to your values. Tried 2400x3600, 1920x3600, 2400x1080. In all 3 cases, I get the following error in the log when ffmpeg is being initialized:
FFmpeg: InitializeEncoder failed: invalid param (8): Invalid Level.
When setting resolution back to 1920x1080, everything works again.
According to the documentation, NVENC resolution limit for everything from Pascal to Lovelace is 8192x8192, so it shouldn't be an NVENC problem. Either FFmpeg doesn't like your resolution, or it's not receiving the correct syntax from Voukoder.
You could test for an FFmpeg limitation by CPU-rendering a small test file in Vegas and then trying to re-encode the file using FFmpeg and the NVENC encoder directly from command line.
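That direct test could look something like this (a sketch; the file names are placeholders, and it assumes an FFmpeg build with NVENC support):

```shell
# Re-encode a short CPU-rendered clip with NVENC straight from the command line.
# If this also fails with "Invalid Level" at the odd resolution, the limit is
# in FFmpeg/NVENC itself rather than in what Voukoder passes to it.
ffmpeg -i test_cpu_render.mp4 -c:v h264_nvenc -b:v 8M nvenc_test.mp4
```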
-
So it's not viable for VP to use the x264/x265 libraries ONLY if they are present in the FFmpeg build? (Which would only happen if the user has chosen to download and compile their own version of FFmpeg, manually enabling these encoders.)
-
Is there a way to have the user download/install ffmpeg separately for free? This might remove the requirement for you to buy commercial licenses.
IIRC, Handbrake has an "update ffmpeg" function in it somewhere.