@LeeSinLiang
LeeSinLiang / top_p_top_k_filter.py
Created September 1, 2024 06:06
Top-k and top-p (nucleus) filtering, used for microGPT inference.
import torch
import torch.nn.functional as F

def top_k_top_p_filter(logits, top_k: int = 0, top_p: float = 0.0):
    # Top-k: mask every logit below the k-th largest value.
    if top_k > 0:
        filter = torch.topk(logits, min(top_k, logits.size(-1)))[0]
        logits[logits < filter[:, [-1]]] = float('-inf')
    # Top-p (nucleus): keep the smallest set of tokens whose
    # cumulative probability exceeds top_p; mask the rest.
    if top_p > 0.0:
        sorted_logits, sorted_indices = torch.sort(logits, descending=True)
        cumulative_probs = torch.cumsum(
            F.softmax(sorted_logits, dim=-1), dim=-1)
        filter = cumulative_probs > top_p
        # Shift the mask right so the first token past the threshold is kept.
        filter[..., 1:] = filter[..., :-1].clone()
        filter[..., 0] = False
        # Map the mask from sorted order back to vocabulary order.
        indices_to_remove = filter.scatter(-1, sorted_indices, filter)
        logits[indices_to_remove] = float('-inf')
    return logits
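To make the filtering logic concrete without a framework, here is a plain-Python sketch of the same top-k/top-p idea for a single logit vector (the function name and the example values are illustrative, not part of the gist):

```python
import math

def top_k_top_p_filter_plain(logits, top_k=0, top_p=0.0):
    """Plain-Python sketch of top-k / top-p filtering for one logit vector."""
    logits = list(logits)
    if top_k > 0:
        # k-th largest logit is the cutoff; everything below it is masked.
        kth = sorted(logits, reverse=True)[min(top_k, len(logits)) - 1]
        logits = [l if l >= kth else float('-inf') for l in logits]
    if top_p > 0.0:
        # Walk tokens in descending-probability order until the
        # cumulative probability first exceeds top_p.
        order = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)
        exps = [math.exp(logits[i]) for i in order]   # exp(-inf) == 0.0
        total = sum(exps)
        cum, keep = 0.0, set()
        for i, e in zip(order, exps):
            keep.add(i)  # the token that crosses the threshold is also kept
            cum += e / total
            if cum > top_p:
                break
        logits = [l if i in keep else float('-inf') for i, l in enumerate(logits)]
    return logits
```

For example, `top_k_top_p_filter_plain([2.0, 1.0, 0.5, -1.0], top_k=2)` keeps only the two largest logits and sets the rest to `-inf`.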
@LeeSinLiang
LeeSinLiang / models.py
Last active May 24, 2023 13:49
Enables nn.Sequential to accept multiple inputs, enhancing the flexibility of sequential neural network models.
import torch.nn as nn

class MultiInputSequential(nn.Sequential):
    """Sequential container whose modules may accept multiple inputs."""
    def forward(self, *inputs):
        for module in self._modules.values():
            # A tuple output is unpacked into the next module's arguments;
            # a single value is passed through as-is.
            if isinstance(inputs, tuple):
                inputs = module(*inputs)
            else:
                inputs = module(inputs)
        return inputs
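The dispatch pattern in `forward` is framework-independent: tuple outputs fan out into the next callable, single values pass straight through. A minimal sketch with plain functions (the helper names `multi_input_chain`, `split`, `add`, `double` are illustrative, not from the gist):

```python
def multi_input_chain(modules, *inputs):
    """Sketch of MultiInputSequential.forward with plain callables:
    unpack tuple outputs into the next callable, pass single values as-is."""
    for module in modules:
        if isinstance(inputs, tuple):
            inputs = module(*inputs)
        else:
            inputs = module(inputs)
    return inputs

split = lambda x: (x, x + 1)   # returns a tuple -> unpacked downstream
add = lambda a, b: a + b       # consumes two inputs
double = lambda x: 2 * x       # consumes one input

# split(3) -> (3, 4); add(3, 4) -> 7; double(7) -> 14
result = multi_input_chain([split, add, double], 3)
```

A plain `nn.Sequential` would fail at the `add` step, since it always calls each module with exactly one argument.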