Dave Kincaid (dkincaid)

dkincaid / kernel-log-bottom-board.txt
Created March 17, 2017 19:19
Kernel logs for miner
[ 0.000000] Booting Linux on physical CPU 0x0
[ 0.000000] Linux version 3.14.0-xilinx-gf387dab-dirty (lzq@armdev01) (gcc version 4.8.3 20140320 (prerelease) (Sourcery CodeBench Lite 2014.05-23) ) #38 SMP PREEMPT Fri Jun 17 20:02:51 CST 2016
[ 0.000000] CPU: ARMv7 Processor [413fc090] revision 0 (ARMv7), cr=18c5387d
[ 0.000000] CPU: PIPT / VIPT nonaliasing data cache, VIPT aliasing instruction cache
[ 0.000000] Machine model: Xilinx Zynq
[ 0.000000] cma: CMA: reserved 128 MiB at 27800000
[ 0.000000] Memory policy: Data cache writealloc
[ 0.000000] On node 0 totalpages: 258048
[ 0.000000] free_area_init_node: node 0, pgdat c06e4600, node_mem_map e6fd8000
[ 0.000000] Normal zone: 1520 pages used for memmap
dkincaid / pdp_text_example.R
Created February 8, 2017 22:48
An example of an xgboost ML problem in R that shows an issue with the pdp package in this circumstance
library(text2vec)
library(xgboost)
library(pdp)
# Create the document term matrix (bag of words) using the movie_review data frame provided
# in the text2vec package (sentiment analysis problem)
data("movie_review")
# Tokenize the movie reviews and create a vocabulary of tokens including document counts
vocab <- create_vocabulary(itoken(movie_review$review,
dkincaid / gist:9924418
Last active May 1, 2022 10:35
Java 8 Stack Trace on Protege 4.3
ERROR: Bundle org.protege.common [1] Error starting file:/home/davek/apps/Protege_4.3/bundles/org.protege.common.jar (org.osgi.framework.BundleException: Unresolved constraint in bundle org.protege.common [1]: Unable to resolve 1.0: missing requirement [1.0] osgi.wiring.package; (&(osgi.wiring.package=org.w3c.dom)(version>=0.0.0)))
org.osgi.framework.BundleException: Unresolved constraint in bundle org.protege.common [1]: Unable to resolve 1.0: missing requirement [1.0] osgi.wiring.package; (&(osgi.wiring.package=org.w3c.dom)(version>=0.0.0))
    at org.apache.felix.framework.Felix.resolveBundleRevision(Felix.java:3818)
    at org.apache.felix.framework.Felix.startBundle(Felix.java:1868)
    at org.apache.felix.framework.Felix.setActiveStartLevel(Felix.java:1191)
    at org.apache.felix.framework.FrameworkStartLevelImpl.run(FrameworkStartLevelImpl.java:295)
    at java.lang.Thread.run(Thread.java:744)
Error: Could not parse XML contribution for "org.eclipse.equinox.registry//plugin.xml". Any contributed extensions and exte
dkincaid / server.R
Created February 2, 2014 21:51
Tycho data Shiny web app
library(shiny)
fileUrl <- url("http://s3.amazonaws.com/data-excursions/states_cases.Rda")
load(fileUrl)
diseases <- unique(as.character(states_cases$disease))
states <- unique(as.character(states_cases$state))
shinyServer(function(input, output) {
dkincaid / gist:7652146
Created November 26, 2013 01:42
Cascalog 2.0 stack trace
cascading.pipe.OperatorException: [d9b750b1-28f6-4c53-a32...][cascading.pipe.assembly.AggregateBy.initialize(AggregateBy.java:564)] operator Each failed executing operation
    at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:107)
    at cascading.flow.stream.FunctionEachStage.receive(FunctionEachStage.java:39)
    at cascading.flow.stream.FilterEachStage.receive(FilterEachStage.java:73)
    at cascading.flow.stream.FilterEachStage.receive(FilterEachStage.java:34)
    at cascading.flow.stream.FunctionEachStage$1.collect(FunctionEachStage.java:80)
    at cascading.tuple.TupleEntryCollector.safeCollect(TupleEntryCollector.java:119)
    at cascading.tuple.TupleEntryCollector.add(TupleEntryCollector.java:107)
    at com.idexx.lambda.hadoop.jobs.assembler.EntityAssembler$ExtractFields.operate(EntityAssembler.java:121)
    at cascalog.CascalogFunctionExecutor.operate(CascalogFunctionExecutor.java:41)
dkincaid / MinRTF-control-rule.g4
Last active December 22, 2015 09:58
RTF Parser Early Tests
grammar MinRtf ;
document : (control | text )+ ;
text : TEXT ;
control : KEYWORD INT? SPACE? ;
KEYWORD : '\\' (ASCIILETTER)+ ;
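The preview cuts off before the remaining lexer rules (TEXT, INT, SPACE, ASCIILETTER). As a rough sketch of how this grammar would be exercised, here is a hypothetical driver using the classes ANTLR 4 would generate for it; the class names, the sample input, and the truncated lexer rules are assumptions, not part of the gist.
import org.antlr.v4.runtime.CharStreams;
import org.antlr.v4.runtime.CommonTokenStream;
import org.antlr.v4.runtime.tree.ParseTree;

public class MinRtfDemo {
    public static void main(String[] args) {
        // Lex and parse a tiny RTF fragment and print the resulting parse tree.
        // MinRtfLexer and MinRtfParser are the classes ANTLR 4 generates from the grammar above.
        MinRtfLexer lexer = new MinRtfLexer(CharStreams.fromString("\\b1 hello"));
        MinRtfParser parser = new MinRtfParser(new CommonTokenStream(lexer));
        ParseTree tree = parser.document();  // 'document' is the start rule shown in the gist
        System.out.println(tree.toStringTree(parser));
    }
}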
dkincaid / WordCount.java
Last active September 6, 2018 03:50
Hadoop job remote submission
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
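Only the import block of WordCount.java survives in this preview. The following is a minimal, self-contained sketch of what a remotely submitted word-count driver along these lines typically looks like; the cluster host names, ports, jar path, and mapper/reducer class names are placeholders, not values taken from the gist.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class RemoteWordCount {

    public static class TokenizerMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Emit (token, 1) for every whitespace-separated token in the line.
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            // Sum the counts emitted for each token.
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        // Point the client at the remote cluster explicitly instead of relying on local *-site.xml files.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");
        conf.set("mapreduce.framework.name", "yarn");
        conf.set("yarn.resourcemanager.address", "resourcemanager.example.com:8032");
        // Ship the job jar so the cluster nodes can load the mapper and reducer classes.
        conf.set("mapreduce.job.jar", "target/remote-wordcount.jar");

        Job job = Job.getInstance(conf, "remote word count");
        job.setMapperClass(TokenizerMapper.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}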
dkincaid / ClientEmailQuery.java
Last active December 10, 2015 17:48
Example of an issue when trying to use two PailTaps reading from the same Pail in a query.
/* If I execute only the clientQuery or only the emailQuery by itself, everything works right.
   Setting breakpoints inside the ExtractClientEdgeFields() and ExtractClientId() functions shows
   they are called only with Data objects of the correct property types.
   However, if I execute the query as it is shown here, only one of the two functions is called,
   and it receives all of the Data objects from both taps. */
public static Subquery clientEmail(String pailPath) {
    PailTap clientEdgeTap = clientEdgeTap(pailPath);
    PailTap clientTap = petOwnerTap(pailPath);
dkincaid / gist:3277712
Created August 6, 2012 19:19
Change Name
public MutableClass changeName(MutableClass oldNameClass, String newName) {
    MutableClass newNameClass = new MutableClass();
    // This assignment only copies the reference, so newNameClass and oldNameClass
    // now point at the same object.
    newNameClass = oldNameClass;
    newNameClass.setName(newName);  // This therefore renames the caller's original instance too.
    return newNameClass;
}
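For contrast, a hypothetical non-aliasing variant (not part of the gist) would construct a fresh object instead of reassigning the reference, leaving the caller's MutableClass untouched; it relies only on the name-taking constructor used in the test below.
public MutableClass changeName(MutableClass oldNameClass, String newName) {
    // Build a new instance rather than aliasing the argument, so oldNameClass is never modified.
    // The oldNameClass parameter is kept only to preserve the original method signature.
    MutableClass newNameClass = new MutableClass(newName);
    return newNameClass;
}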
dkincaid / gist:3277619
Created August 6, 2012 19:08
Fixed mutable state test
@Test
public void changeNameTest() {
    MutableClass original_name = new MutableClass("my name");
    MutableClass expected_name = new MutableClass("my name");
    NameFilter filter = new NameFilter();
    MutableClass new_name = filter.changeName(original_name, "new name");
    assertEquals(new_name, expected_name);
}