JosephSBoyle

  • Canon Medical Research
  • London, England
@JosephSBoyle
JosephSBoyle / tex.py
Created July 16, 2025 10:42
LaTeX commands (variables) from Python
"""Library for generating latex commands for use as variables.
Example:
>>> generate_newcommand2(prefix="", name="logisticRegressionAccuracy", value=77.779)
\newcommand{logisticRegressionAccuracy}{77.8}
This is most useful if you dump the outputs into a file, e.g.:
```variables.tex
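
The preview cuts off inside the docstring. A minimal sketch of what generate_newcommand2 might look like, assuming the one-decimal rounding shown in the example (77.779 becomes 77.8) and assuming prefix is simply prepended to the command name:

def generate_newcommand2(prefix: str, name: str, value: float, precision: int = 1) -> str:
    # Assumed behaviour: round `value` and emit a \newcommand definition whose
    # name is `prefix + name`, matching the docstring example above.
    return f"\\newcommand{{{prefix}{name}}}{{{round(value, precision)}}}"

The intended workflow, per the docstring, is to write these lines to a variables.tex file; that file can then be pulled into the main LaTeX document so experiment numbers are referenced as commands rather than hard-coded.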
@JosephSBoyle
JosephSBoyle / init.lua
Created April 22, 2025 09:54
My current nvim config.
-- Basic, bootstrappable neovim config for Python programming
-- place in `~/.config/nvim/`
local lazypath = vim.fn.stdpath("data") .. "/lazy/lazy.nvim"
if not vim.loop.fs_stat(lazypath) then
  vim.fn.system({
    "git", "clone", "--filter=blob:none",
    "https://github.com/folke/lazy.nvim.git", lazypath
  })
end
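-- The preview stops here; a minimal sketch of how the bootstrap usually
-- continues (an assumption: the actual plugin list is not shown in the preview).
-- Prepend lazy.nvim to the runtime path and hand it a plugin spec.
vim.opt.rtp:prepend(lazypath)
require("lazy").setup({
  -- plugin specs for Python development would go here
})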
@JosephSBoyle
JosephSBoyle / specificity_score.py
Last active December 28, 2024 15:55
Specificity score
from sklearn.metrics import recall_score

# Specificity is just recall for the negative class.
#
# Wrapped with `sklearn.metrics.make_scorer`, this can be used as the `scoring`
# argument in, for example, `sklearn.model_selection.cross_validate`.
def specificity_score(y_true, y_pred):
    # Assumes that your negative class has label `0`.
    return recall_score(y_true, y_pred, pos_label=0)
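
A quick usage sketch (the toy data and model here are placeholders; make_scorer and cross_validate are the standard scikit-learn entry points for plugging in a custom metric):

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import make_scorer
from sklearn.model_selection import cross_validate

X, y = make_classification(random_state=0)
results = cross_validate(
    LogisticRegression(max_iter=1000),
    X, y,
    scoring={"specificity": make_scorer(specificity_score)},
    cv=5,
)
print(results["test_specificity"])  # one specificity value per fold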
@JosephSBoyle
JosephSBoyle / logging_utils.py
Last active April 12, 2024 12:42
Log to a new file in '/log' each time the script is run
import logging
from datetime import datetime
from pathlib import Path

# Set up the logger
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)

# Create a file handler and set its log level
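# --- The preview stops here; the rest is a sketch, not the gist's exact code. ---
# Log to a timestamped file inside a `log/` directory; the directory name and
# the filename format string below are assumptions.
log_dir = Path("log")
log_dir.mkdir(exist_ok=True)
file_handler = logging.FileHandler(log_dir / f"{datetime.now():%Y-%m-%d_%H-%M-%S}.log")
file_handler.setLevel(logging.DEBUG)
file_handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s"))
logger.addHandler(file_handler)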
@JosephSBoyle
JosephSBoyle / tensor_shapes.py
Last active June 3, 2023 14:34
Print the shapes of any tensors and arrays in the current local scope
print(f"\nArray Shapes:\n{'':=>50}"); [print(f"{k:<20}: {str(type(v).__name__):<7}:", tuple(v.shape)) for k, v in locals().items() if hasattr(v, "shape") and not hasattr(getattr(v, "shape", None), "__call__")] # Exclude the 'numpy' import, the `shape` func. isn't relevant
# Array Shapes:
# ==================================================
# input_ids           : Tensor : (8, 1, 250)
# attention_mask      : Tensor : (8, 1, 250)
# labels              : Tensor : (8, 8921)
# input_ids1          : Tensor : (8, 250)
# x0                  : Tensor : (8, 250, 768)
# x1                  : Tensor : (8, 50, 251)
# x2                  : Tensor : (8, 251, 50)
@JosephSBoyle
JosephSBoyle / cache.py
Last active May 3, 2023 11:44
Wrapper around `datasets.Dataset.map` that only uses the cache if the function hasn't changed since the last run.
import functools
import hashlib
import inspect

from datasets import load_dataset

def cache_invariant(map_method):
    """Wrap the `map` method s.t. it only uses the cache if the `function` arg is unchanged."""
    @functools.wraps(map_method)
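    # --- The preview stops here; what follows is a sketch, not the gist's code. ---
    # One plausible implementation: hash the mapped function's source and pass it
    # to `map` as `new_fingerprint`, so `datasets` only re-uses its cache when the
    # function's source code is unchanged.
    def wrapper(self, function, *args, **kwargs):
        source = inspect.getsource(function)
        fingerprint = hashlib.md5(source.encode("utf-8")).hexdigest()
        return map_method(self, function, *args, new_fingerprint=fingerprint, **kwargs)
    return wrapper

# Usage sketch: patch `Dataset.map` before mapping, e.g.
# from datasets import Dataset
# Dataset.map = cache_invariant(Dataset.map)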