
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>Ansi 0 Color</key>
<dict>
<key>Blue Component</key>
<real>0.30978870391845703</real>
<key>Green Component</key>
<real>0.30978870391845703</real>
#!/bin/bash
# List all conda environments, dropping comment and blank lines; keep the name column.
conda env list | grep -v "^#" | grep -v "^$" | cut -f1 -d' ' | sort > all.envs
# Remove every environment except base and pytorch_p36.
for env in $(grep -v "base" all.envs | grep -v "pytorch_p36")
do
  echo "$env"
  conda remove --name "$env" --all --yes
done
rm -- *.envs
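For reference, the `grep | cut | sort` pipeline above can be checked in isolation against a sample of `conda env list` output (the environment names and paths below are made up for illustration):

```shell
# Hypothetical `conda env list` output piped through the same filter chain.
printf '# conda environments:\n#\nbase                  *  /opt/conda\nmyenv                    /opt/conda/envs/myenv\n\n' \
  | grep -v "^#" | grep -v "^$" | cut -f1 -d' ' | sort
```

This should print only the environment names, one per line, which is what the loop then iterates over.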

Keybase proof

I hereby claim:

  • I am delip on github.
  • I am deliprao (https://keybase.io/deliprao) on keybase.
  • I have a public key ASAPFnKwplWieaH8pNWHwWKgqEOIQ6GNZAI5K6MbBK3kzgo

To claim this, I am signing this object:

@delip
delip / awsdu.sh
Created February 16, 2018 20:48
`du -sh *`-like script for an AWS bucket with multiple keys
#!/bin/sh
BUCKET=$1
# Iterate over the top-level prefixes ("PRE" entries) in the bucket listing.
for obj in $(aws s3 ls "s3://$BUCKET/" | perl -pe 's/.*(PRE )(.*)/$2/')
do
  # Total size under each prefix, human-readable; strip the leading whitespace.
  size=$(aws s3 ls "s3://$BUCKET/$obj" --recursive --human-readable --summarize \
    | grep "Total Size" | cut -f2 -d':' | perl -pe 's/^\s+//g')
  printf "%s\t%s\n" "$obj" "$size"
done
@delip
delip / EmbeddedSolrExample.java
Last active April 20, 2016 07:25
Solr 4.4.0 EmbeddedSolrServer example: Indexing and Querying
import org.apache.commons.io.FileUtils;
import org.apache.commons.io.filefilter.TrueFileFilter;
import org.apache.solr.client.solrj.embedded.EmbeddedSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrDocument;
import org.apache.solr.common.SolrInputDocument;
import org.apache.solr.common.params.CommonParams;
import org.apache.solr.common.params.ModifiableSolrParams;
import org.apache.solr.common.params.SolrParams;
import org.apache.solr.core.CoreContainer;
@delip
delip / run.log
Last active December 20, 2015 23:39
Run log
/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -Didea.launcher.port=7534 -Didea.launcher.bin.path=/Applications/Cardea-IU-130.1365.app/bin -Dfile.encoding=UTF-8 -classpath /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/deploy.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/dt.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/javaws.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/jce.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/jconsole.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/management-agent.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/plugin.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/lib/sa-jdi.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Classes/charsets.jar:/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Classes/classes.jar:/System/Library/Jav
@delip
delip / schema.xml
Created August 12, 2013 19:46
schema.xml used by EmbeddedSolrServer
<?xml version="1.0" encoding="UTF-8" ?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
@delip
delip / Homebrewconfig.log
Created September 8, 2012 06:06
config.log after brew install gpg failed
This file contains any messages produced by compilers while
running configure, to aid debugging if configure makes a mistake.
It was created by gnupg configure 1.4.12, which was
generated by GNU Autoconf 2.68. Invocation command line was
$ ./configure --disable-dependency-tracking --prefix=/usr/local/Cellar/gnupg/1.4.12 --disable-asm
## --------- ##
## Platform. ##
@delip
delip / spark-setup-local.sh
Last active August 29, 2015 14:18
Spark setup (local)
#!/bin/sh
# Select the correct versions of Spark and Hadoop
# (depends on your situation, but go with the latest if possible):
# spark.apache.org/downloads.html
tar -xvzf spark-1.3.0-bin-hadoop2.4.tgz
ln -s spark-1.3.0-bin-hadoop2.4 spark
export SPARK_HOME="$(pwd)/spark"
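The symlink trick above keeps `SPARK_HOME` stable across Spark upgrades: only the link target changes. A minimal sketch of the pattern in a scratch directory, with a stand-in directory instead of a real Spark download:

```shell
# Reproduce the symlink + SPARK_HOME pattern without downloading anything.
cd "$(mktemp -d)"
mkdir spark-1.3.0-bin-hadoop2.4        # stand-in for the extracted tarball
ln -s spark-1.3.0-bin-hadoop2.4 spark  # version-independent entry point
export SPARK_HOME="$(pwd)/spark"
ls -l spark
```

Upgrading then amounts to extracting the new tarball and re-pointing the `spark` symlink; `SPARK_HOME` never needs to change.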