View neurolincs_gaps.csv
Experiment Well ObjectTrackID
AB-CS47iTDP-Survival A1 7
AB-CS47iTDP-Survival A1 11
AB-CS47iTDP-Survival A1 11
AB-CS47iTDP-Survival A10 4
AB-CS47iTDP-Survival A12 8
AB-CS47iTDP-Survival A9 5
AB-CS47iTDP-Survival B10 3
AB-CS47iTDP-Survival B9 7
AB-CS47iTDP-Survival B9 14
View duplicated_indices.csv
Experiment Well ObjectTrackID
AB-CS47iTDP-Survival D1 1
AB-CS47iTDP-Survival D1 2
AB-CS47iTDP-Survival D1 3
AB-CS47iTDP-Survival D1 4
AB-CS47iTDP-Survival D1 5
AB-CS47iTDP-Survival D1 6
AB-CS47iTDP-Survival D1 7
AB-CS47iTDP-Survival D1 8
AB-CS47iTDP-Survival D1 9
View mhealthtools_demo.R
library(mhealthtools)

# Sample accelerometer data bundled with the package
head(mhealthtools::accelerometer_data)

# Preprocess the sample data: 256-sample windows at 100 Hz,
# keeping the 1-25 Hz band and seconds 1-9
cleaned_sensor_data <- mhealthtools:::preprocess_sensor_data(
  mhealthtools::accelerometer_data,
  window_length = 256,
  sampling_rate = 100,
  frequency_range = c(1, 25),
  time_range = c(1, 9))
cleaned_sensor_data

transformation <- transformation_window(
View score_by_task.py
from __future__ import division, print_function
import synapseclient as sc
import pandas as pd
import numpy as np
import argparse
import LDopaScorer
# Synapse IDs of the training and testing tables
TRAINING_TABLE = 'syn10495809'
TESTING_TABLE = 'syn10701954'
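The preview stops at the table IDs. A minimal sketch (not part of the gist) of how these tables would typically be pulled into pandas with synapseclient, assuming cached Synapse credentials:

# Sketch only: load both Synapse tables as data frames
syn = sc.login()  # assumes cached credentials
training = syn.tableQuery("select * from " + TRAINING_TABLE).asDataFrame()
testing = syn.tableQuery("select * from " + TESTING_TABLE).asDataFrame()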
View SageMountSharedEFS.md

Mounting an EFS Instance From an External Account

First time only:

  1. Send Thaneer your AWS account ID and he will add you to the trusted entities for the IAM role that he has created.
  2. If you don't have one already, create an IAM user. Permissions aren't relevant since you will be switching roles as soon as you sign in as the IAM user, but you may want to omit any permissions so that you don't accidentally create an instance before switching roles.

Every time:

  1. While signed in as an IAM user under your root account, go to https://signin.aws.amazon.com/switchrole?roleName=mPowerEFSRole&account=389689814525, enter a display name, pick your favorite color, and click "Switch Role" (a programmatic equivalent is sketched below).
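The console link handles the role switch interactively. For reference, here is a rough boto3 equivalent; the role ARN is inferred from the account ID and role name in the URL, and the session name is arbitrary:

import boto3

# Assume the shared mPowerEFSRole in the 389689814525 account
# (sketch; ARN inferred from the console URL above)
sts = boto3.client("sts")
creds = sts.assume_role(
    RoleArn="arn:aws:iam::389689814525:role/mPowerEFSRole",
    RoleSessionName="efs-mount")["Credentials"]

# Use the temporary credentials for subsequent AWS calls
session = boto3.Session(
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"])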
View BatchTipsAndTricks.md

AWS Batch Tips and Tricks

Working with AWS Batch goes roughly like this:

Compute Environments > Job Queues ⊥ Job Definitions > Jobs

That is, compute environments and job queues are configured independently of job definitions and jobs. If you're going to create a job, though, you'll need a job definition, and if you're going to create a job queue, it will need a compute environment to run within (see the sketch below).
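As a concrete illustration of that dependency chain, here is a minimal boto3 sketch; the queue name, image, and command are placeholders, not anything from this document:

import boto3

batch = boto3.client("batch")

# A job definition is required before any job can be submitted
# (placeholder image and command)
job_def = batch.register_job_definition(
    jobDefinitionName="example-def",
    type="container",
    containerProperties={
        "image": "python:3",
        "vcpus": 1,
        "memory": 512,
        "command": ["python", "-c", "print('hello')"]})

# The job queue (backed by a compute environment) must already exist
batch.submit_job(
    jobName="example-job",
    jobQueue="example-queue",
    jobDefinition=job_def["jobDefinitionArn"])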

Job Definitions > Jobs

View sf12v2.R
#' SF12v2 questionnaire scoring
library(synapseClient)
library(dplyr)  # needed for %>% and filter()

synapseLogin()
healthSurveyId <- "syn10278768"
healthSurvey <- synTableQuery(paste("select * from", healthSurveyId))@values

# Columns 12-23 of the table hold the SF-12 question responses
questionCols <- names(healthSurvey)[12:23]
sf <- healthSurvey[c("recordId", "healthCode", "dataGroups", questionCols)] %>%
  filter(dataGroups %in% c("beta_thalassemia", "myelodysplastic_syndrome", "myelofibrosis")) %>%
  na.omit()

sf12v2 <- function(X = NULL) {
View QC.Rmd
---
title: "JourneyPRO - on-going summary"
output:
  html_document: default
  html_notebook: default
  pdf_document: default
date: "`r format(Sys.time(), '%d %B, %Y')`"
---
View exportMD5.py
import bridgeclient
import hashlib
import pandas as pd
import synapseclient as sc
import argparse
SYNAPSE_TABLES = {
'journey-pro': 'syn11439373',
'elevate-ms': 'syn11439398',
'lilly-presence': 'syn11445782'}
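The preview ends at the table map. As a rough illustration of the MD5 step the filename implies (the actual download and iteration logic isn't shown in the gist, so this is only a sketch using the hashlib import):

# Sketch only: compute the MD5 checksum of a local export file in chunks
def md5_of_file(path, chunk_size=2 ** 20):
    digest = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            digest.update(chunk)
    return digest.hexdigest()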
View plotHeartRate.py
import synapseclient as sc
import matplotlib.pyplot as plt
import pandas as pd
import argparse
def read_args():
    parser = argparse.ArgumentParser(description='Plot camera vs oximeter readings.')
    parser.add_argument('medianData')
    return parser.parse_args()
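The rest of the script isn't shown. A sketch of what the plotting step could look like, assuming medianData is a CSV and using hypothetical camera_hr and oximeter_hr column names:

# Sketch only: scatter camera-derived vs oximeter heart rate readings.
# 'oximeter_hr' and 'camera_hr' are hypothetical column names.
args = read_args()
df = pd.read_csv(args.medianData)
plt.scatter(df['oximeter_hr'], df['camera_hr'])
plt.xlabel('Oximeter heart rate (bpm)')
plt.ylabel('Camera heart rate (bpm)')
plt.show()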