Video transforms#
Eztorch supports transforms from Torchaug and from PyTorchVideo.
On top of that, several video transforms have been defined.
Video transforms#
- class eztorch.transforms.video.RemoveTimeDim[source]#
Remove the time dimension from a tensor.
Assumes the tensor shape is [C,T,H,W].
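As a minimal sketch of the idea (assuming the transform squeezes a singleton time axis, which is the usual behavior for this kind of utility, not Eztorch's verified code):

```python
import torch

# Hypothetical sketch: removing the time dimension of a [C, T, H, W]
# clip with T == 1 yields a [C, H, W] frame.
clip = torch.rand(3, 1, 8, 8)   # [C, T, H, W] with T = 1
frame = clip.squeeze(1)         # drop the time dimension
assert frame.shape == (3, 8, 8)
```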
- class eztorch.transforms.video.RandomResizedCrop(target_height, target_width, scale, aspect_ratio, shift=False, log_uniform_ratio=True, interpolation='bilinear', num_tries=10)[source]#
nn.Module wrapper for pytorchvideo.transforms.functional.random_resized_crop.
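To illustrate the idea behind random resized crop for video, here is a self-contained sketch (an assumption-laden re-implementation, not Eztorch's or PyTorchVideo's actual code): sample an area fraction and an aspect ratio (optionally log-uniform, matching the `log_uniform_ratio` flag), crop, then resize spatially.

```python
import math
import random
import torch
import torch.nn.functional as F

def random_resized_crop(clip, target_h, target_w, scale=(0.08, 1.0),
                        ratio=(3 / 4, 4 / 3), log_uniform_ratio=True):
    """Illustrative random resized crop on a [C, T, H, W] clip."""
    c, t, h, w = clip.shape
    # Sample a crop area as a fraction of the frame area.
    area = h * w * random.uniform(*scale)
    # Sample the aspect ratio, optionally log-uniformly.
    if log_uniform_ratio:
        ar = math.exp(random.uniform(math.log(ratio[0]), math.log(ratio[1])))
    else:
        ar = random.uniform(*ratio)
    crop_w = min(w, max(1, int(round(math.sqrt(area * ar)))))
    crop_h = min(h, max(1, int(round(math.sqrt(area / ar)))))
    top = random.randint(0, h - crop_h)
    left = random.randint(0, w - crop_w)
    crop = clip[:, :, top:top + crop_h, left:left + crop_w]
    # Resize the spatial dims; treat time as the batch dimension.
    crop = crop.permute(1, 0, 2, 3)  # [T, C, crop_h, crop_w]
    out = F.interpolate(crop, size=(target_h, target_w),
                        mode="bilinear", align_corners=False)
    return out.permute(1, 0, 2, 3)   # back to [C, T, H, W]

clip = torch.rand(3, 4, 32, 32)
out = random_resized_crop(clip, 16, 16)
assert out.shape == (3, 4, 16, 16)
```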
- class eztorch.transforms.video.RandomTemporalDifference(use_grayscale=True, absolute=False, p=0.2)[source]#
Randomly compute temporal differences from consecutive frames.
- Parameters:
  - use_grayscale (bool, optional) – If True, apply a grayscale transformation before computing the temporal difference. Default: True
  - absolute (bool, optional) – If True, take the absolute value of the difference; otherwise shift to the mean value (\(255\) for int, \(0.5\) for float) and divide by 2. Default: False
  - p (float, optional) – The probability of computing the temporal difference. Default: 0.2
- class eztorch.transforms.video.TemporalDifference(use_grayscale=True, absolute=False)[source]#
Compute temporal differences from consecutive frames.
- Parameters:
  - use_grayscale (bool, optional) – If True, apply a grayscale transformation before computing the temporal difference. Default: True
  - absolute (bool, optional) – If True, take the absolute value of the difference; otherwise shift to the mean value (\(255\) for int, \(0.5\) for float) and divide by 2. Default: False
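A temporal difference can be sketched as follows. This is an assumed behavior (the order of the shift and the halving is a guess, and the grayscale step is omitted), not Eztorch's exact implementation:

```python
import torch

def temporal_difference(clip, absolute=False):
    # Difference of consecutive frames of a [C, T, H, W] float clip
    # (values in [0, 1]); output has T - 1 frames.
    diff = clip[:, 1:] - clip[:, :-1]
    if absolute:
        return diff.abs()
    # Shift to the mean value (0.5 for float input) and halve the range,
    # mapping [-1, 1] differences back into [0, 1].
    return diff / 2 + 0.5

clip = torch.rand(3, 8, 4, 4)
out = temporal_difference(clip)
assert out.shape == (3, 7, 4, 4)
assert (out >= 0).all() and (out <= 1).all()
```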
SoccerNet#
- class eztorch.transforms.video.soccernet.BatchReduceTimestamps[source]#
Aggregate successive timestamps in a SoccerNet batch of clip information.
- class eztorch.transforms.video.soccernet.BatchMiddleTimestamps[source]#
Select the middle timestamps in a SoccerNet batch of clip information.
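A sketch of middle-timestamp selection, under the assumption that the batch carries per-frame timestamps shaped [batch, frames_per_clip] (the names and shapes here are illustrative, not Eztorch's actual batch layout):

```python
import torch

# Hypothetical per-frame timestamps for 4 clips of 6 frames each.
timestamps = torch.arange(24.0).reshape(4, 6)   # [batch, frames_per_clip]
# Pick the middle frame's timestamp for each clip.
middle = timestamps[:, timestamps.shape[1] // 2]
assert middle.tolist() == [3.0, 9.0, 15.0, 21.0]
```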
Spot#
- class eztorch.transforms.video.spot.BatchMiddleFrames[source]#
Select the middle frame in a Spot batch of clip information.