
# VS-DistilDRBA

A VapourSynth implementation of DistilDRBA, based on vs-rife.

This project is modified from HolyWu/vs-rife and achieves nearly the same interpolation quality as the original DistilDRBA project.

With TensorRT integration, it achieves a 400% speed-up, enabling real-time playback on high-performance NVIDIA GPUs.

## Dependencies

TensorRT support requires additional packages. (If you run into installation issues, you can skip these dependencies and pass `trt=False` to `drba_distilled()`.)

To install the latest stable versions of PyTorch, Torch-TensorRT and CuPy, run:

```shell
pip install -U packaging setuptools wheel
pip install -U torch torchvision torch-tensorrt --index-url https://download.pytorch.org/whl/cu128 --extra-index-url https://pypi.nvidia.com
```

## Installation

```shell
pip install -U vsdrba_distilled==1.0.0
```

Note: make sure all packages listed in the Dependencies section are installed before running the installation step above.

To download all models at once, run `python -m vsdrba_distilled`. If you prefer to download only the specified model on first use, set `auto_download=True` in `drba_distilled()`.

## Usage

```python
from vsdrba_distilled import drba_distilled

ret = drba_distilled(clip, trt=True, factor_num=2, factor_den=1, scale=1.0, model="v1", auto_download=True)
```

See `__init__.py` for a description of the parameters.
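The `factor_num`/`factor_den` pair sets the frame-rate multiplier (e.g. `factor_num=2, factor_den=1` doubles the frame rate). A minimal sketch of the resulting output rate, assuming the factor simply scales the clip's `fps_num`/`fps_den` as in vs-rife (the helper below is illustrative, not part of the package):

```python
from fractions import Fraction


def interpolated_fps(fps_num: int, fps_den: int,
                     factor_num: int, factor_den: int) -> tuple[int, int]:
    """Output frame rate after interpolating by factor_num/factor_den.

    Assumes the output rate is input rate * factor_num / factor_den,
    returned as a reduced (numerator, denominator) pair.
    """
    fps = Fraction(fps_num, fps_den) * Fraction(factor_num, factor_den)
    return fps.numerator, fps.denominator


# 23.976 fps (24000/1001) doubled -> 48000/1001 (~47.952 fps)
print(interpolated_fps(24000, 1001, 2, 1))  # (48000, 1001)
# 2.5x via factor_num=5, factor_den=2 -> 60000/1001 (~59.94 fps)
print(interpolated_fps(24000, 1001, 5, 2))  # (60000, 1001)
```

Using a rational factor rather than a float keeps NTSC-style rates like 24000/1001 exact, with no rounding drift over long clips.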

## Benchmarks

| model | scale | os | hardware | arch | speed (fps) 720p | speed (fps) 1080p | vram 720p | vram 1080p | backend | verified output | batch | level | streams | threads | trtexec shape | precision | usage |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| drba_distilled v1 | 2x | Linux | rtx5070 / 14600kf | drba_distilled | 251 | 115 | 1.8gb | 2.9gb | torch+trt cu128 | yes, works | 1 | 5 | - | 1 | static | RGBH | `drba_distilled(clip, trt=True, model="v1", trt_optimization_level=5)` |
| drba_distilled v2_lite | 2x | Linux | rtx5070 / 14600kf | drba_distilled | 999+ | 700 | - | - | torch+trt cu128 | yes, works | 1 | 5 | - | 1 | static | RGBH | `drba_distilled(clip, trt=True, model="v1", trt_optimization_level=5)` |

## 🤗 Acknowledgement

This project is supported by the SVFI Development Team.
