Dano Morrison (jdpigeon)

@jdpigeon
jdpigeon / index.html
Last active June 4, 2020 20:22
Animated line from random dataset
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8" />
<title>d3.js | SVG paths</title>
<script src="https://d3js.org/d3.v5.min.js"></script>
</head>
<body>
<script>
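// NOTE: the gist preview cuts off at this point. What follows is only a hedged
// sketch of how an animated line driven by a random dataset could look in d3 v5;
// the variable names, sizes, and timing below are assumptions, not the gist's code.
const width = 600;
const height = 300;
const n = 50;

const svg = d3.select('body').append('svg')
  .attr('width', width)
  .attr('height', height);

const x = d3.scaleLinear().domain([0, n - 1]).range([0, width]);
const y = d3.scaleLinear().domain([0, 1]).range([height, 0]);
const line = d3.line()
  .x((d, i) => x(i))
  .y((d) => y(d));

const randomData = () => d3.range(n).map(() => Math.random());

const path = svg.append('path')
  .datum(randomData())
  .attr('fill', 'none')
  .attr('stroke', 'steelblue')
  .attr('d', line);

// Regenerate the dataset and tween the path toward it every 2 seconds.
d3.interval(() => {
  path.datum(randomData())
    .transition()
    .duration(1500)
    .attr('d', line);
}, 2000);
</script>
</body>
</html>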
@jdpigeon
jdpigeon / machine.js
Last active January 3, 2020 22:20
Generated by XState Viz: https://xstate.js.org/viz
// Available variables:
// - Machine
// - interpret
// - assign
// - send
// - sendParent
// - spawn
// - raise
// - actions
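// The preview ends at the variable list above. As a hedged illustration only
// (not this gist's actual machine), a minimal statechart built with the Machine
// variable that XState Viz provides might look like:
const toggleMachine = Machine({
  id: 'toggle',
  initial: 'inactive',
  states: {
    inactive: { on: { TOGGLE: 'active' } },
    active: { on: { TOGGLE: 'inactive' } }
  }
});

// interpret (also provided by the Viz environment) would run it:
// interpret(toggleMachine).start().send('TOGGLE');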
@jdpigeon
jdpigeon / class_enum.ts
Created January 10, 2019 22:08
Typescript Class Enum Pattern
type BothOptions = 'raw_emg';
class Both {
  public static RawEMG: Both = new Both('raw_emg');
  private value: BothOptions;
  private constructor(value: BothOptions) {
    this.value = value;
  }
  public rawValue(): BothOptions {
    return this.value;
  }
}
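For context, here is a brief hypothetical usage sketch (not part of the gist): because the constructor is private, the static members are the only instances that can ever exist, so values compare by reference like enum cases while still carrying methods.

const channel = Both.RawEMG;          // the only way to obtain a Both instance
console.log(channel.rawValue());      // 'raw_emg'
console.log(channel === Both.RawEMG); // true: reference equality behaves like an enum check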
@jdpigeon
jdpigeon / seedWithLatestFrom.js
Created August 13, 2018 19:01
RxJS withLatestFrom Operator seed
// One of the main 'gotchas' of combination operators like combineLatest and withLatestFrom is that they won't emit until all source observables emit at least once.
// In order to make your combination observables start immediately when they're subscribed to, use the startWith operator on each inner observable
const Rx = require("rxjs");
// emit every 5s
const source = Rx.Observable.interval(5000).startWith('begin');
// emit every 1s
const secondSource = Rx.Observable.interval(1000);
// withLatestFrom combines each fast emission with the latest value from the slower source
const example = secondSource.pipe(
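  // Hedged continuation: the preview cuts off inside pipe(), so the operators
  // below are assumptions about how the example finishes, not the gist's code.
  // In RxJS 5.5+ they would be imported from "rxjs/operators".
  withLatestFrom(source),
  map(([second, first]) => `Second (1s): ${second}, latest from first: ${first}`)
);
example.subscribe(val => console.log(val));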
@jdpigeon
jdpigeon / gist:a09c7b44bd629661d1ab6949c55b25be
Created June 13, 2018 15:33
How to be more productive through neuroscience
@jdpigeon
jdpigeon / lessonsFromNTXStudentOpenChallenge.md
Created April 12, 2018 14:53
lessonsFromNTXStudentOpenChallenge

NeuroTechUofT

  • Encountered performance bottlenecks when running BCI paradigms in Python in real time. Partly overcame this by using specific data structures such as trees
  • Had to use the Muse 2014
  • Multi-class motor classification / seds estimation approach(?). Ran into trouble with source localization using the Muse (figures). Found a paper using 8-channel EEG to classify hand gestures based on spectral data (Freiburg)
  • With the P300 speller, encountered significant marker variability that disrupted ERP detection.
  • Signal quality issues with the motor imagery task (Muse?)

MINT

  • Implemented flanker test on Android w/ Muse
  • Looked for alpha and beta suppression in order to evaluate ADD
@jdpigeon
jdpigeon / museExperiments.md
Last active March 7, 2019 12:04
Validated EEG event-related potentials with Muse

(Somewhat) Validated Experiments

All experiments from muse-lsl's notebooks

Visual P300 with Oddball paradigm

The visual P300 is a positive spike that occurs roughly 300 ms after a visual stimulus is perceived and is thought to be involved in decision making. This was validated on the Muse by AB using the Oddball paradigm, in which low-probability target items (oddballs) are interspersed with high-probability non-target items. With AB's paradigm, the experiment takes about 10 minutes to run (5 x 2-minute trials). The best-performing pipeline for classifying P300s from a collected dataset (for use in a BCI) was ERP Covariance + MDM (Riemannian geometry based), at 0.77 AUC. This accuracy is apparently good but not outstanding as far as BCIs go.

Unfortunately, I've also heard from Hubert that, in testing 10 different people, some of them weren't able to get very good ERPs. This could be due to their neuroanatomy, as EEG expresses quite differently between people.

@jdpigeon
jdpigeon / filiFFT.js
Last active March 22, 2018 17:48
FFT with Fili
// Create FFT object
import { Fft } from "fili";
// Helpful functions
const nextPow2 = (num) => {
  let pow = 1;
  while (pow < num) {
    pow *= 2;
  }
  return pow;
};
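The preview stops at the helper above. As a rough sketch of how it could feed Fili's FFT (method names follow Fili's README; the padding and windowing choices are assumptions, not the gist's exact code):

const computeSpectrum = (signal) => {
  const radix = nextPow2(signal.length); // Fili's FFT expects a power-of-two size
  const padded = [...signal, ...new Array(radix - signal.length).fill(0)];
  const fft = new Fft(radix);
  const result = fft.forward(padded, 'hanning'); // returns { re, im }
  return fft.magnitude(result);                  // magnitude spectrum
};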
@jdpigeon
jdpigeon / conv.js
Last active February 27, 2023 08:29 — forked from PhotonEE/conv.js
JavaScript implementation of a convolution function
/* Returns the discrete, linear convolution of two vectors.
** Convolution in time/space is equivalent to multiplication in the frequency domain.
This function is equivalent to numpy's convolve function with the default 'full' parameter
example :
------
vec1 = [2,3,4]
vec2 = [1,2,3]
returns [2, 7, 16, 17, 12]
*/
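A minimal direct implementation of that 'full' convolution, as a sketch (not necessarily the body of the original gist):

const convolve = (vec1, vec2) => {
  const result = new Array(vec1.length + vec2.length - 1).fill(0);
  for (let i = 0; i < vec1.length; i++) {
    for (let j = 0; j < vec2.length; j++) {
      result[i + j] += vec1[i] * vec2[j]; // each output sample sums the products of overlapping elements
    }
  }
  return result;
};

convolve([2, 3, 4], [1, 2, 3]); // -> [2, 7, 16, 17, 12]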