ensp,position,residue
ENSP00000367263,1441,M
ENSP00000367263,1572,M
ENSP00000367263,5030,M
ENSP00000367263,5550,M
ENSP00000350990,1428,M
ENSP00000437271,1428,M
ENSP00000350990,1443,C
ENSP00000437271,1443,C
ENSP00000263801,551,M
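As a quick sanity check, a table like the one above can be loaded and summarised with pandas. This is an illustrative sketch: the data is inlined here, and in practice you would read the CSV file itself (the file name would depend on where the table is stored).

```python
import io

import pandas as pd

# Rows copied from the table above; in practice use pd.read_csv(<path to the CSV>).
csv_text = """ensp,position,residue
ENSP00000367263,1441,M
ENSP00000367263,1572,M
ENSP00000367263,5030,M
ENSP00000367263,5550,M
ENSP00000350990,1428,M
ENSP00000437271,1428,M
ENSP00000350990,1443,C
ENSP00000437271,1443,C
ENSP00000263801,551,M
"""

df = pd.read_csv(io.StringIO(csv_text))

# Number of flagged residues per protein, most affected first
counts = df.groupby("ensp").size().sort_values(ascending=False)
print(counts)
```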
## getJournalsFreq.R
## author: David Ochoa <dogcaesar@gmail.com>
## This script displays the number of manuscripts per journal stored in the Mekentosj Papers database
topN <- 20 # Top journals to display
lastyears <- 5 # Last n years
journalsquery <- "osascript -ss -e 'tell application \"Papers\" to get bundle name of publication items'"
yearsquery <- "osascript -e 'tell application \"Papers\" to get publication year of publication items'"
import rdflib
import pandas as pd

## Path to EFO (local or remote)
owlpath = "https://github.com/EBISPOT/efo/releases/download/v3.18.0/efo_otar_slim.owl"
## File downloaded from
## https://docs.google.com/spreadsheets/d/1CV_shXJy1ACM09HZBB_-3Nl6l_dfkrA26elMAF0ttHs/edit
mappingFile = "/Users/ochoa/Downloads/OTAR project EFO mappings for disease profile pages - Sheet1-2.csv"
intraneturl = "http://home.opentargets.org/"
outputFile = "/Users/ochoa/Desktop/output.json"
import argparse
import array
import struct
from functools import reduce
from typing import Iterable

import pyspark.sql.functions as F
from pyspark import SparkConf
from pyspark.sql import DataFrame, SparkSession
# StructType and friends live in pyspark.sql.types, not pyspark.sql
from pyspark.sql.types import ArrayType, StringType, StructType
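The `array` and `struct` imports alongside PySpark suggest binary (de)serialisation of numeric vectors, e.g. inside a UDF. The original script's logic is not shown, so the following is only a standalone sketch of round-tripping a float vector through packed bytes; the little-endian float32 layout is an assumption, not taken from the original.

```python
import array
import struct

# Hypothetical example: pack a vector of floats into little-endian
# 32-bit floats ("<Nf"), as one might before storing binary columns.
vector = [0.5, -1.25, 3.0]
packed = struct.pack(f"<{len(vector)}f", *vector)

# Decode back with array; typecode 'f' is a 32-bit float.
decoded = array.array("f", packed)
print(list(decoded))
```

All three sample values are exactly representable in float32, so the round trip is lossless here; arbitrary floats would lose precision going through 32-bit storage.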
#!/usr/bin/env python3
# Import relevant libraries for HTTP request and JSON formatting
import requests
import json

# Set gene_id variable
gene_id = "ENSG00000244734"
# Build query string
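The snippet above is truncated before the query itself. It appears to prepare a GraphQL request for the given Ensembl gene ID; the sketch below shows how such a payload could be built. The query fields (`target`, `approvedSymbol`) and the endpoint are assumptions for illustration, not the original script's query.

```python
import json

gene_id = "ENSG00000244734"

# Hypothetical GraphQL query; the original script's query string is not shown.
query = """
query targetInfo($ensemblId: String!) {
  target(ensemblId: $ensemblId) {
    id
    approvedSymbol
  }
}
"""
payload = {"query": query, "variables": {"ensemblId": gene_id}}

# In practice this would be sent with requests, e.g.:
#   requests.post("https://api.platform.opentargets.org/api/v4/graphql", json=payload)
print(json.dumps(payload)[:60])
```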
from pyspark.sql import SparkSession
from pyspark.sql import Window as W
from pyspark.sql import functions as F

# genetics portal studies
studiesPath = "/Users/ochoa/Datasets/study-index/"
failedPath = "/Users/ochoa/Datasets/failedEvidence"
evidenceFailedPath = "/Users/ochoa/Datasets/evidenceFailed"
diseasePath = "/Users/ochoa/Datasets/diseases"
import pyspark.sql.functions as F
from pyspark import SparkConf
from pyspark.sql import SparkSession

sparkConf = SparkConf()
spark = (
    SparkSession.builder
    .config(conf=sparkConf)
    .master('local[*]')
from pyspark.sql import SparkSession
from pyspark.context import SparkContext
from pyspark.sql import functions as F
from pyspark.ml.feature import Word2VecModel, Normalizer
from pyspark.sql.types import DoubleType

# establish spark connection
spark = (
    SparkSession.builder
from pymol import cmd

# Load an AlphaFold model and define the standard AlphaFold
# per-residue confidence (pLDDT) colours
cmd.load("/Users/ochoa/Downloads/AF-Q9HC29-F1-model_v1.cif")
cmd.bg_color("grey95")
cmd.set_color("veryHigh", [0, 83, 214])      # pLDDT > 90
cmd.set_color("confident", [101, 203, 243])  # 90 >= pLDDT > 70
cmd.set_color("low", [255, 219, 19])         # 70 >= pLDDT > 50
cmd.set_color("veryLow", [255, 125, 69])     # pLDDT <= 50
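The four colours above correspond to the AlphaFold per-residue confidence (pLDDT) bands. A small helper mapping a pLDDT score to the band name can make the colouring logic explicit; the thresholds follow the standard AlphaFold convention, but the function itself is illustrative and not part of the original script.

```python
def plddt_band(score: float) -> str:
    """Map a pLDDT confidence score (0-100) to an AlphaFold band name."""
    if score > 90:
        return "veryHigh"   # very high confidence
    if score > 70:
        return "confident"  # confident
    if score > 50:
        return "low"        # low confidence
    return "veryLow"        # very low confidence

print(plddt_band(95), plddt_band(60))
```

In PyMOL, such a helper could drive per-residue `cmd.color` calls using the colour names defined above.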
import pyspark.sql.functions as F
from pyspark import SparkConf
from pyspark.sql import SparkSession

# Configure access to Google Cloud Storage requester-pays buckets
sparkConf = SparkConf()
sparkConf = sparkConf.set('spark.hadoop.fs.gs.requester.pays.mode', 'AUTO')
sparkConf = sparkConf.set('spark.hadoop.fs.gs.requester.pays.project.id', 'open-targets-eu-dev')
spark = (
    SparkSession.builder