@s1ck
s1ck / wikipedia-articles.el
Created November 23, 2022 16:22
Edge list converted from the Memgraph wikipedia articles dataset.
This file has been truncated.
0 21319
0 3425
0 24070
0 21324
0 11450
0 24071
0 21455
0 17779
0 21634
0 1917
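The pairs above are whitespace-separated `source target` vertex ids, one edge per line. As an illustration only (the parser below is not part of the gist, and all names in it are hypothetical), a minimal Java sketch that reads such an edge list into an adjacency map could look like:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class EdgeListLoader {

    // Parses lines of the form "<source> <target>" into a source -> targets map.
    public static Map<Integer, List<Integer>> parse(List<String> lines) {
        Map<Integer, List<Integer>> adjacency = new HashMap<>();
        for (String line : lines) {
            String[] parts = line.trim().split("\\s+");
            int source = Integer.parseInt(parts[0]);
            int target = Integer.parseInt(parts[1]);
            adjacency.computeIfAbsent(source, k -> new ArrayList<>()).add(target);
        }
        return adjacency;
    }

    public static void main(String[] args) {
        List<String> sample = List.of("0 21319", "0 3425", "0 24070");
        Map<Integer, List<Integer>> adj = parse(sample);
        System.out.println(adj.get(0).size()); // 3
    }
}
```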
s1ck / WeaklyConnectedComponents.java
Created September 29, 2022 09:03
WCC using stack-based DFS
package org.neo4j.minigds.connected_components;
import org.neo4j.minigds.api.Algorithm;
import org.neo4j.minigds.api.AlgorithmResult;
import org.neo4j.minigds.api.Graph;
import org.neo4j.minigds.utils.DirectToUndirectedGraphConverter;
import java.util.ArrayList;
import java.util.Collections;
import java.util.LinkedList;
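The gist body is truncated above. As a hedged sketch of the technique the description names (weakly connected components via iterative, stack-based DFS), and assuming the directed graph has already been converted to an undirected adjacency map (the gist imports a DirectToUndirectedGraphConverter for that step), one minimal standalone version is:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class WccSketch {

    // Assigns each vertex a component id by running a stack-based (iterative)
    // DFS from every unvisited vertex of the undirected graph.
    public static Map<Integer, Integer> components(Map<Integer, List<Integer>> undirected) {
        Map<Integer, Integer> component = new HashMap<>();
        int nextId = 0;
        for (Integer start : undirected.keySet()) {
            if (component.containsKey(start)) {
                continue;
            }
            Deque<Integer> stack = new ArrayDeque<>();
            stack.push(start);
            component.put(start, nextId);
            while (!stack.isEmpty()) {
                int vertex = stack.pop();
                for (int neighbor : undirected.getOrDefault(vertex, List.of())) {
                    if (!component.containsKey(neighbor)) {
                        component.put(neighbor, nextId);
                        stack.push(neighbor);
                    }
                }
            }
            nextId++;
        }
        return component;
    }
}
```

Using an explicit stack avoids the stack-overflow risk of recursive DFS on long paths, which is likely the point of the gist.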
s1ck / AuthorExample.java
Created May 12, 2018 11:21
Author-Publication-Graph to CoAuthor-Graph Example
package org.gradoop.examples.sna;
import org.apache.flink.hadoop.shaded.com.google.common.collect.Lists;
import org.gradoop.common.config.GradoopConfig;
import org.gradoop.common.model.impl.pojo.Edge;
import org.gradoop.common.model.impl.pojo.GraphHead;
import org.gradoop.common.model.impl.pojo.Vertex;
import org.gradoop.examples.AbstractRunner;
import org.gradoop.flink.io.impl.dot.DOTDataSink;
import org.gradoop.flink.model.api.epgm.GraphCollection;
s1ck / Neo4jGraphXExample.scala
Created March 8, 2018 20:13
Demonstrates loading a graph from Neo4j into CAPS and running a GraphX algorithm
// 1) Create CAPS session
implicit val session: CAPSSession = CAPSSession.local()
// 2) Load a graph from a running Neo4j instance.
implicit val neo4jConfig = Neo4jConfig(uri = URI.create("bolt://localhost"), password = Some("password"))
// 3) Create a data source
val neo4jSource = new Neo4jPropertyGraphDataSource(neo4jConfig)
// 4) Load the whole Neo4j DB
s1ck / Problem.java
Created October 19, 2016 06:31
Type erasure problem that occurs only during cluster execution
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple1;
import java.lang.reflect.Array;
public class Problem {
public static class Pojo {
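The snippet is cut off, but the import of java.lang.reflect.Array hints at the usual issue: generic array creation (`new T[n]`) does not compile because T is erased at runtime, so an explicit Class token is needed. A minimal standalone illustration of that workaround (not the gist's actual code):

```java
import java.lang.reflect.Array;

public class GenericArrayDemo {

    // "new T[length]" is rejected by the compiler because T is erased;
    // Array.newInstance takes a runtime Class token instead.
    @SuppressWarnings("unchecked")
    static <T> T[] newArray(Class<T> clazz, int length) {
        return (T[]) Array.newInstance(clazz, length);
    }

    public static void main(String[] args) {
        String[] strings = newArray(String.class, 3);
        System.out.println(strings.length); // 3
    }
}
```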
s1ck / TypeProblem.java
Created October 10, 2016 11:26
TypeProblem when using RichFlatMapFunction on GenericArray Types
import org.apache.flink.api.common.functions.FlatJoinFunction;
import org.apache.flink.api.common.functions.RichFlatJoinFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.operators.DataSource;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;
public class TypeProblem {
s1ck / Exception
Created April 6, 2016 08:17
Akka Exception
------------------------------------------------------------
The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: The program execution failed: Job execution failed.
at org.apache.flink.client.program.Client.runBlocking(Client.java:381)
at org.apache.flink.client.program.Client.runBlocking(Client.java:355)
at org.apache.flink.client.program.Client.runBlocking(Client.java:315)
at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:60)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:855)
at org.gradoop.model.impl.GraphBase.writeAsJson(GraphBase.java:264)
s1ck / PageRankWithConf.java
Last active March 14, 2016 16:52
Example PageRank with custom scatter gather conf
// your code to create vertices and edges
Graph<Long, Double, Double> graph = Graph.fromDataSet(vertices, edges, env);
ScatterGatherConfiguration conf = new ScatterGatherConfiguration();
conf.setSolutionSetUnmanagedMemory(true);
long vertexCount = graph.numberOfVertices();
Graph<Long, Double, Double> ranks = graph.runScatterGatherIteration(
new PageRank.VertexRankUpdater<>(0.85, vertexCount),
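The Gelly call above is truncated. For readers without a Flink setup, a plain power-iteration PageRank sketch conveys the same computation (standalone Java, not the Gelly API; as a simplification, dangling vertices simply drop their rank mass here):

```java
import java.util.Arrays;

public class PageRankSketch {

    // Power-iteration PageRank over a dense boolean adjacency matrix.
    // damping is the probability of following a link (0.85 in the gist).
    public static double[] pageRank(boolean[][] adj, double damping, int iterations) {
        int n = adj.length;
        double[] rank = new double[n];
        Arrays.fill(rank, 1.0 / n);
        for (int it = 0; it < iterations; it++) {
            double[] next = new double[n];
            Arrays.fill(next, (1.0 - damping) / n); // teleport term
            for (int src = 0; src < n; src++) {
                int outDegree = 0;
                for (int dst = 0; dst < n; dst++) {
                    if (adj[src][dst]) outDegree++;
                }
                if (outDegree == 0) continue; // dangling vertex: mass is dropped
                double share = damping * rank[src] / outDegree;
                for (int dst = 0; dst < n; dst++) {
                    if (adj[src][dst]) next[dst] += share;
                }
            }
            rank = next;
        }
        return rank;
    }
}
```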
s1ck / SerializationIssue.java
Last active November 27, 2015 17:31
Serialization Issue in Collection Environment
import org.apache.flink.api.common.functions.GroupReduceFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple1;
import org.apache.flink.runtime.StreamingMode;
import org.apache.flink.test.util.ForkableFlinkMiniCluster;
import org.apache.flink.test.util.TestBaseUtils;
import org.apache.flink.test.util.TestEnvironment;
import org.apache.flink.util.Collector;
import org.apache.hadoop.io.WritableComparable;
import org.apache.flink.api.common.functions.FilterFunction;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.functions.FunctionAnnotation;
import org.apache.flink.graph.Edge;
import org.apache.flink.graph.Graph;
import org.apache.flink.graph.Vertex;
import org.apache.flink.graph.library.ConnectedComponents;
import org.apache.flink.types.NullValue;