Created December 22, 2022 14:12
Matching the STFT of paderbox and torch
import numpy as np
import paderbox
import torch
from scipy.signal.windows import hamming

f = 1500.0  # exactly periodic: 48000 / 1500 = 32 samples per period
fs = 48000.0
nfft = 512
hop = 128

demo = np.sin(2 * np.pi * f / fs * np.arange(int(fs)))

# To make the comparison easier, we pad the input signal to
# make it a multiple of the FFT size
pad_size = nfft - hop
padding = np.zeros(pad_size)
demo = np.concatenate([demo, np.zeros(pad_size)])

# torch
# - use `center=False`
# - zero-pad the front of the signal
demo_padded = np.concatenate([padding, demo])
demo_pt = torch.from_numpy(demo_padded)
# torch.hamming_window is periodic by default, which matches paderbox's
# default `symmetric_window=False`
win_pt = torch.hamming_window(nfft, dtype=demo_pt.dtype)
PT = torch.stft(
    demo_pt,
    n_fft=nfft,
    hop_length=hop,
    window=win_pt,
    return_complex=True,
    pad_mode="constant",
    center=False,
)
recon_pt = torch.istft(PT, n_fft=nfft, hop_length=hop, window=win_pt, center=False)
recon_pt = recon_pt[pad_size:]
recon_pt = recon_pt.numpy()
PT = PT.numpy()

print("torch: reconstruction exact ?", np.allclose(recon_pt, demo))

# paderbox (note: it expects a numpy array, so we pass the padded
# numpy signal rather than the torch tensor)
AR = paderbox.transform.stft(
    demo_padded,
    size=nfft,
    shift=hop,
    fading=False,
    window=hamming,
    # symmetric_window=True,
)
recon_ar = paderbox.transform.istft(
    AR, size=nfft, shift=hop, fading=False, window=hamming
)
# paderbox returns (frames, freqs); transpose to match torch's (freqs, frames)
AR = AR.T
recon_ar = recon_ar[pad_size:]

print(
    "pader: reconstruction exact ?",
    np.allclose(recon_ar, demo),
)
print("difference between torch's and paderbox's STFT", abs(PT - AR).max())
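The key to making the two STFTs match is the window convention: `torch.hamming_window` is periodic by default, while SciPy's `hamming` is symmetric by default. A short sketch (assuming only NumPy and SciPy) of the relationship, which can be checked independently of torch or paderbox:

```python
import numpy as np
from scipy.signal.windows import hamming

nfft = 512

# A periodic window is a symmetric window of length N + 1 with the last
# sample dropped; this is what torch.hamming_window(nfft) produces by
# default, and what paderbox uses when symmetric_window=False.
win_periodic = hamming(nfft + 1)[:-1]
win_sym_false = hamming(nfft, sym=False)  # scipy's own periodic variant
win_symmetric = hamming(nfft)             # scipy's default

print(np.abs(win_periodic - win_sym_false).max())  # identical
print(np.abs(win_periodic - win_symmetric).max())  # noticeably different
```

If the windows disagree in this convention, the two STFTs will differ at every frame even when the padding and hop are matched.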
I usually use a Hamming window for separation...
On Sat, Dec 24, 2022, 18:58 Desh Raj wrote:

> I also reported the same issue! pytorch/pytorch#91309
> Does it have to be Blackman? You can also use center=True, which has
> slightly different padding, but should not change quality.
> I will try out other options and see how they compare in terms of
> downstream WER.
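For reference, a minimal sketch of the `center=True` variant mentioned above: torch then pads `n_fft // 2` samples internally on both ends (zeros, when `pad_mode="constant"`), so the manual front padding from the gist is not needed. The signal parameters are taken from the gist; the variable names are illustrative.

```python
import numpy as np
import torch

fs, f, nfft, hop = 48000, 1500.0, 512, 128
x = torch.from_numpy(np.sin(2 * np.pi * f / fs * np.arange(fs)))
win = torch.hamming_window(nfft, dtype=x.dtype)

# center=True: torch handles the edge padding itself
X = torch.stft(x, n_fft=nfft, hop_length=hop, window=win,
               return_complex=True, center=True, pad_mode="constant")
# passing length trims the internal padding on the way back
y = torch.istft(X, n_fft=nfft, hop_length=hop, window=win,
                center=True, length=x.shape[0])

print(torch.allclose(x, y, atol=1e-6))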