Leonardo Collado-Torres (lcolladotor)
View 2020-09-11_plotly.R
## For https://docs.google.com/document/d/1iWKnvbn6wGS66rVTRSGkFcJyIKccWTCniN5CWvYuvfw/edit?usp=sharing
## Install R packages if needed
# if (!requireNamespace("remotes", quietly = TRUE)) {
#     install.packages("remotes")
# }
# remotes::install_cran("ggplot2")
# remotes::install_cran("plotly")
# remotes::install_cran("sessioninfo")
# remotes::install_cran("BiocManager")
View 2020-08-28_intro_to_shiny.R
## LIBD rstats club 2020-08-28
## https://docs.google.com/document/d/1bsq-1FfHkgoviECdhlA594T_G-kA2B88JMI5485Z4E4/edit?usp=sharing
library("shiny")
## From https://shiny.rstudio.com/articles/basics.html
# Define UI for app that draws a histogram ----
ui <- fluidPage(
    # App title ----
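
The preview above is cut off. As a rough sketch of where it is heading, here is the minimal Old Faithful histogram app from the shiny.rstudio.com basics article linked above (reconstructed from that article, not from the rest of the gist):

library("shiny")

## Minimal sketch of a complete app, following the basics article
ui <- fluidPage(
    titlePanel("Hello Shiny!"),
    sidebarLayout(
        sidebarPanel(
            sliderInput("bins", "Number of bins:", min = 1, max = 50, value = 30)
        ),
        mainPanel(
            plotOutput("distPlot")
        )
    )
)

server <- function(input, output) {
    output$distPlot <- renderPlot({
        x <- faithful$waiting
        bins <- seq(min(x), max(x), length.out = input$bins + 1)
        hist(x, breaks = bins, col = "#75AADB", border = "white",
            xlab = "Waiting time to next eruption (in mins)",
            main = "Histogram of waiting times")
    })
}

shinyApp(ui = ui, server = server)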
View recountWorkshop2020_questions.md

Chat messages

Hi! Tomorrow is the workshop! ^^ I hope that you are excited about it. You can find the materials at http://research.libd.org/recountWorkshop2020/index.html. I'll start the workshop with some slides about the recount2 project & friends, then we'll run some of the code in the workshop (it was originally designed for a 2-hour workshop). I'll finish with a few slides about the future and then we can have a Q & A session, though you are more than welcome to continue asking questions beyond the workshop through the different venues (Bioconductor support site for package questions, GitHub issues for feature requests, Slack for informal questions, etc). See you tomorrow! Best, Leo

The slides are available at https://speakerdeck.com/lcolladotor

Questions and answers:

  • "how does scale_count fun work?" As described in the workshop and associated recountWorkflow, it uses the area under the coverage (AUC) and the base-pair coverage counts. In practice, this data is stored in the RangedSummarized
View 2020-07-24_r_functions.R
library("purrr")
# View(mtcars)
.x <- mtcars[[1]]
mean(.x)
map_dbl(mtcars, ~ mean(.x))
map_dbl(mtcars, mean)
# rnorm()
# args(rnorm)
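
The same iteration also works with a user-defined function; a small illustrative extension (the helper name is made up):

library("purrr")

## A user-defined helper (hypothetical name, for illustration)
col_range <- function(x) max(x) - min(x)

## Apply it to every column of mtcars: purrr and base R equivalents
map_dbl(mtcars, col_range)
map_dbl(mtcars, ~ max(.x) - min(.x))
sapply(mtcars, col_range)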
View incar.R
## From https://docs.google.com/document/d/1CSCPUfHGMoa_QTDf6EUsFMYVTiA5GP8OyeIGCyTVbLI/edit?usp=sharing
if (!requireNamespace("remotes", quietly = TRUE)) {
    install.packages("remotes")
}
remotes::install_cran(
    c(
        "tidyverse", "gganimate", "maps", "knitr", "kableExtra",
        "scales", "lubridate", "RColorBrewer"
    )
)
lcolladotor / debug_pkgdown.R
Created May 21, 2020
Weird issue with finding gitconfig with git on Windows through processx (used internally by pkgdown)
View debug_pkgdown.R
##################### Steps for reproducing this problem ######################
## Create a package
usethis::create_package("~/Desktop/testgit")
## Set up git and say yes to the prompts
usethis::use_git()
## Create a file
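
The preview stops here; a plausible continuation, guessed only from the gist description above (pkgdown failing to find gitconfig through processx on Windows), might be:

## Hypothetical continuation (assumed; the original gist is truncated above)
## Add an R file so the package has some content
usethis::use_r("hello")
## Build the pkgdown site; per the description, this is the step that runs
## git through processx and where the gitconfig lookup on Windows failed
pkgdown::build_site()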
View 2020-04-24_ggplot2_and_related_packages_rstats_club.R
View macOS_build_regionReport_1.2.8.txt
$ R CMD build --keep-empty-dirs --no-resave-data regionReport
Loading required package: colorout
* checking for file ‘regionReport/DESCRIPTION’ ... OK
* preparing ‘regionReport’:
* checking DESCRIPTION meta-information ... OK
* installing the package to build vignettes
* creating vignettes ... OK
* checking for LF line-endings in source and make files and shell scripts
* checking for empty or unneeded directories
* building ‘regionReport_1.21.8.tar.gz’
View 2020_04_03_regionReport_vignette_output.txt
## Generate the HTML report
report <- derfinderReport(
    prefix = "report", browse = FALSE,
    nBestRegions = 15, makeBestClusters = TRUE, outdir = "html",
    fullCov = list("21" = genomeDataRaw$coverage), optionsStats = optionsStats
)
## Registered S3 method overwritten by 'GGally':
##   method from
##   +.gg   ggplot2
## Writing 9 Bibtex entries ... OK
View log_step1.txt
$ java -jar ~/Desktop/bfg-1.13.0.jar --strip-blobs-bigger-than 600K .
Using repo : /Users/lcollado/Dropbox/Code/spatialLIBD/./.git
This repo has been processed by The BFG before! Will prune repo before proceeding - to avoid unnecessary cleaning work on unused objects...
Completed prune of old objects - will now proceed with the main job!
Scanning packfile for large blobs: 4459
Scanning packfile for large blobs completed in 133 ms.