Fahad Siddiqui fahadsiddiqui

@fahadsiddiqui
fahadsiddiqui / pom.xml
Created October 14, 2016 05:36
Have a look at the pom file; I'm using this plugin.
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <mainClass>org.webcrawler.core.Driver</mainClass>
      </manifest>
    </archive>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
</plugin>
@fahadsiddiqui
fahadsiddiqui / write-in-between.sh
Last active October 18, 2016 14:25
Write in between a file
SOME_STRING="exit 0"
PATH_TO_YOUR_FILE="./file"
# Line number of the last occurrence of $SOME_STRING
LAST_MATCH=$(grep -n "$SOME_STRING" "$PATH_TO_YOUR_FILE" | tail -n 1 | cut -d ':' -f 1)
# Go through a temp file: redirecting straight back into the same file truncates it before it is read
head -n $((LAST_MATCH - 1)) "$PATH_TO_YOUR_FILE" > "$PATH_TO_YOUR_FILE.tmp" && mv "$PATH_TO_YOUR_FILE.tmp" "$PATH_TO_YOUR_FILE"
echo -e "Hello world\n$SOME_STRING" >> "$PATH_TO_YOUR_FILE"
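As a quick sanity check, the same insert-before-the-last-match idea can be exercised on a throwaway file (the file name and contents here are made up for illustration):

```shell
# Create a hypothetical demo script ending in "exit 0"
printf 'echo start\nexit 0\n' > demo.sh

SOME_STRING="exit 0"
# Line number of the last occurrence of the marker
LAST_MATCH=$(grep -n "$SOME_STRING" demo.sh | tail -n 1 | cut -d ':' -f 1)
# Keep everything above the marker, going through a temp file
head -n $((LAST_MATCH - 1)) demo.sh > demo.sh.tmp && mv demo.sh.tmp demo.sh
# Append the new line, then restore the marker
printf 'echo Hello world\n%s\n' "$SOME_STRING" >> demo.sh

cat demo.sh
```

The file ends up as `echo start`, then `echo Hello world`, then `exit 0` — the new line sits just above the final `exit 0`.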
<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
  </dependency>
  <!-- APACHE SPARK -->
  <dependency>
    <groupId>org.apache.spark</groupId>
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.SwingConstants;
import javax.swing.UIManager;
import javax.swing.WindowConstants;

public final class ScreenSaver {
    public static void main(final String[] args) throws Exception {
        UIManager.setLookAndFeel(UIManager.getSystemLookAndFeelClassName());
        final JFrame screenSaverFrame = new JFrame();
        screenSaverFrame.setDefaultCloseOperation(WindowConstants.EXIT_ON_CLOSE);
        screenSaverFrame.setUndecorated(true);
        screenSaverFrame.setResizable(false);
        // the snippet is cut off here; a plausible completion centers the label
        screenSaverFrame.add(new JLabel("This is a Java Screensaver!",
                SwingConstants.CENTER));
@fahadsiddiqui
fahadsiddiqui / sol.scala
Created March 22, 2017 06:51
Scala case pattern match error in case of Option[String] [Solution]
def foo(args: Any*) = {
  args.flatMap {
    case str: String if str.isEmpty => None
    case str: String => Some(str)
    case Some(x: String) if x.isEmpty => None
    case Some(x: String) => Some(x)
  }.mkString(", ")
}
scala> val thisfile = sc.textFile("/home/fahad/e.json")
thisfile: org.apache.spark.rdd.RDD[String] = /home/fahad/e.json MapPartitionsRDD[102] at textFile at <console>:27
scala> val rdd = sc.parallelize((thisfile.collect().mkString.replace("},", "}}\n{").dropRight(1) + "}").split("\n"))
rdd: org.apache.spark.rdd.RDD[String] = ParallelCollectionRDD[103] at parallelize at <console>:29
scala> val xy = sqlContext.read.json(rdd)
xy: org.apache.spark.sql.DataFrame = [emp-1: struct<age:bigint,name:string,sex:string>, emp-2: struct<age:bigint,name:string,sex:string>]
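The core trick in the transcript above is a plain string rewrite — splitting each `},` boundary into newline-separated objects — before handing the result to `sqlContext.read.json`, which expects one JSON object per line. A plain-shell sketch of the same rewrite, on a made-up one-line JSON document:

```shell
# Hypothetical single-line JSON with two top-level objects glued together
JSON='{"emp-1":{"age":30,"name":"a","sex":"m"},"emp-2":{"age":40,"name":"b","sex":"f"}}'

# Same rewrite as the Scala one-liner: turn each "}," boundary into "}}\n{"
# (GNU sed accepts \n in the replacement text)
printf '%s\n' "$JSON" | sed 's/},/}}\n{/g'
```

This prints two lines, each a self-contained JSON object, which is the newline-delimited form the JSON reader wants.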
// startDate, endDate are joda-time LocalDate objects
val daysCount = Days.daysBetween(startDate, endDate).getDays
(0 until daysCount).map(startDate.plusDays).foreach { today =>
  // today is a LocalDate
}
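For reference, the same end-exclusive day loop can be sketched in plain shell with GNU date arithmetic (the start and end dates here are arbitrary):

```shell
# Iterate days in [START, END), mirroring the end-exclusive `0 until daysCount`
START=2016-10-14
END=2016-10-18   # excluded, like the upper bound of the Scala range

day="$START"
while [ "$day" != "$END" ]; do
  echo "$day"                        # one line per day
  day=$(date -d "$day + 1 day" +%F)  # GNU date: add one day, ISO format
done
```

This prints 2016-10-14 through 2016-10-17, four days, just as `Days.daysBetween` would report for that range.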
def foo(l: List[Any]): List[Int] = {
  l flatMap {
    case e: Int => List(e)
    case f: List[Any] => foo(f)
  }
}

foo(List(1, List(1, 3, List(1, 7, List(3, 5, 7))))) // returns List(1, 1, 3, 1, 7, 3, 5, 7)
export PS1="[\[\e[31m\]\\t\[\e[m\]] \\u:\[\e[32m\]\\w\[\e[m\] $ "
# Block "git push" temporarily; every other git subcommand works as usual
block_git_push() {
  if [ "$1" = "push" ]; then
    echo "\"push\" is blocked temporarily in ~/.bashrc file to prevent volcano eruption."
  else
    command git "$@"
  fi
}
alias git=block_git_push
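A quick way to check the guard without touching a real repository — re-declaring the function so the snippet is self-contained:

```shell
block_git_push() {
  if [ "$1" = "push" ]; then
    echo "\"push\" is blocked temporarily in ~/.bashrc file to prevent volcano eruption."
  else
    command git "$@"
  fi
}

# "push" is intercepted before git ever runs, so this is safe anywhere
out=$(block_git_push push)
echo "$out"
```

Using `command git` in the fall-through branch calls the real binary, so the function never recurses into itself via the `git` alias.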