CUGAN models don't work on RTX 3090 #4
I have a 3090 as well; I used TensorRT with an MKV and got audio only. If I can get output from Direct or NCNN, then maybe TensorRT has an issue.
TensorRT gives me a CUDA out-of-memory error every time. With Direct and NCNN I get output, but it's very buggy, with lots of green-screen artifacts.
So the one setting I wasn't looking out for was "allow videos in output path to be overwritten". Disabling this at least got me as far as actually seeing the conversion in real time in the console output. I'm going to see if I hit the CUDA memory limit, but so far that's not happening, as it's only using less than 4 GB of my VRAM. Are you using NVENC?
I tried HEVC, x265, and x264; all three of them have artifacts for me when using CUGAN.
I'm wondering about your CPU as well; it's possible there is a bottleneck causing additional processing to go to your VRAM. I have 12 cores and the VRAM usage is pretty low.
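For a sense of scale, a rough back-of-envelope estimate of per-frame VRAM can be sketched as below. The float32 RGB layout and the 2x upscale factor are assumptions for illustration; the real pipeline's buffers (model weights, TensorRT workspace, batching) will add considerably more on top.

```python
def frame_vram_bytes(width, height, channels=3, bytes_per_sample=4):
    """Rough VRAM for one frame held as float32 RGB planes (an assumption)."""
    return width * height * channels * bytes_per_sample

# A 1080p frame upscaled 2x: input + output buffers only, very rough.
in_frame = frame_vram_bytes(1920, 1080)   # 24883200 bytes, ~23.7 MiB
out_frame = frame_vram_bytes(3840, 2160)  # 99532800 bytes, ~94.9 MiB
print(f"{(in_frame + out_frame) / 2**20:.1f} MiB per frame pair")
```

If usage stays under 4 GB, the frame buffers themselves are clearly not the limit, which is consistent with the low VRAM numbers reported above.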
i9-9900K, 64 GB RAM
Maybe share the output in a txt file and look for any errors in the console.
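A quick way to scan a saved console log for likely errors, assuming the output has been redirected to a text file (the keyword list and file name here are just illustrative):

```python
from pathlib import Path

def find_errors(log_path):
    """Return lines from a log file that look like errors (case-insensitive)."""
    keywords = ("error", "out of memory", "failed")
    lines = Path(log_path).read_text().splitlines()
    return [ln for ln in lines if any(k in ln.lower() for k in keywords)]

# Demo with a small sample log (a stand-in for the real console output).
Path("console_output.txt").write_text(
    "frame 1 ok\nCUDA out of memory\nframe 2 ok\nEncoder init failed\n"
)
print(find_errors("console_output.txt"))
```

This prints the two suspicious lines from the sample log, which is usually enough to decide whether the failure is an encoder problem or a VRAM problem.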
Forgot to ask: are you using compact or ultracompact?
I tried Real-CUGAN, which is not working on an RTX 4080. If anyone has a good ONNX model and/or TensorRT settings that work for Real-CUGAN, I would love to give it a try.
Using TensorRT, every CUGAN model gives CUDA out of memory on my RTX 3090 with 24 GB of VRAM. Direct and NCNN work, but there are lots of encoding errors in the video.
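A common workaround for TensorRT out-of-memory errors in super-resolution pipelines is to process each frame in overlapping tiles rather than whole. Whether this particular app exposes a tile-size setting is an assumption, but the idea can be sketched as:

```python
def tile_grid(width, height, tile=512, overlap=16):
    """Split a frame into overlapping (x, y, w, h) boxes so each tile
    can be upscaled independently within a limited VRAM budget."""
    step = tile - overlap  # tiles overlap so seams can be blended away
    boxes = []
    for y in range(0, height, step):
        for x in range(0, width, step):
            boxes.append((x, y, min(tile, width - x), min(tile, height - y)))
    return boxes

# A 1080p frame with hypothetical 512 px tiles and 16 px overlap:
print(len(tile_grid(1920, 1080)))  # 12 tiles
```

Smaller tiles trade throughput for a lower peak memory footprint, which is why tiling often resolves OOM errors that persist even on 24 GB cards when the engine's workspace is sized for the full frame.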