
💭
Every day is learning. Every day is the pursuit to perfectness.

Afsal Thaj afsalthaj

spark_optimiser_is_not_a_silver_bullet.scala
```scala
scala> list.collect
res0: Array[org.apache.spark.sql.Row] = Array([1.0], [1.0], [1.0], [2.0], [2.0], [0.0])
scala> list
res1: org.apache.spark.sql.DataFrame = [a_indexed: double]
scala> val table1 =list
table1: org.apache.spark.sql.DataFrame = [a_indexed: double]
```
spark_with_bucketing.md

Spark with Bucketing

Examine the execution plan of Spark on bucketed datasets, and verify whether it is smart enough to avoid a wide dependency (i.e., a shuffle).

PS: When trying things out in spark-shell, note that for small datasets the join will probably be a broadcast exchange in the physical execution plan by default. Example:

```
./spark-shell
```
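To see the effect, you can disable broadcast joins and write both sides bucketed and sorted on the join key; with matching buckets, the physical plan should show a `SortMergeJoin` with no `Exchange` before it. A minimal sketch for spark-shell (the table and column names `t1`, `t2`, `id` are made up for illustration):

```scala
// Disable broadcast joins so a small dataset doesn't short-circuit the experiment
spark.conf.set("spark.sql.autoBroadcastJoinThreshold", -1)

// Write both sides bucketed and sorted on the join key
df1.write.bucketBy(8, "id").sortBy("id").saveAsTable("t1")
df2.write.bucketBy(8, "id").sortBy("id").saveAsTable("t2")

// With matching bucketing, no Exchange should appear before the SortMergeJoin
spark.table("t1").join(spark.table("t2"), "id").explain()
```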

zio_pattern.scala

```scala
import zio.{Task, ZIO}

trait ConfigModule {
  def configService: ConfigService
}

trait ConfigService {
  def config: Task[String]
}
```
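This module pattern is usually completed with a live implementation and an accessor that defers the choice of `ConfigModule` to the environment. A minimal sketch assuming ZIO 1.x; `Live` and the `"app.config"` value are illustrative, not from the gist:

```scala
object ConfigModule {
  // A live implementation of the service (illustrative value)
  trait Live extends ConfigModule {
    val configService: ConfigService = new ConfigService {
      def config: Task[String] = Task.succeed("app.config")
    }
  }

  // Accessor: a program that depends on ConfigModule via the environment
  val config: ZIO[ConfigModule, Throwable, String] =
    ZIO.accessM(_.configService.config)
}
```

Callers then supply the module once at the edge, e.g. `ConfigModule.config.provide(new ConfigModule.Live {})`.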
cron_scala.scala

```scala
// All that you need for managing cron in scala!
import com.cronutils.model._
import com.cronutils.model.definition._
import java.time._
import com.cronutils.parser.CronParser
import scalaz.{Show, \/}
import com.cronutils.model.time.ExecutionTime
import scalaz.syntax.either._
import CronOps._, StepDirection._
```
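A hedged sketch of how these cron-utils imports are typically used (the cron expression and variable names are illustrative, not from the gist; in recent cron-utils versions `nextExecution` returns a `java.util.Optional`):

```scala
import com.cronutils.model.CronType
import com.cronutils.model.definition.CronDefinitionBuilder
import com.cronutils.model.time.ExecutionTime
import com.cronutils.parser.CronParser
import java.time.ZonedDateTime

// Parse a UNIX-style cron expression: every day at 02:00
val parser = new CronParser(CronDefinitionBuilder.instanceDefinitionFor(CronType.UNIX))
val cron   = parser.parse("0 2 * * *")

// Ask for the next execution time after "now"
val next = ExecutionTime.forCron(cron).nextExecution(ZonedDateTime.now())
```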
task_approaches.md

My notes on scalaz.Task approaches. This is mainly for old code!

I have found wrong usages of scalaz.Task in old programs.

Why is this important?

When you bring FP ornaments into your code, they come with a certain cost. The first and foremost is the additional learning curve for new developers to come in and understand the concepts of total, side-effect-free, lazy descriptions of computations. The advantage is being able to ship robust code with equational reasoning. But what if the usage of the FP constructs itself is wrong?

Most of the old Scala FP code uses scalaz.Task, which was one of the most popular FP data structures at the time. It is still fine to use today, although the Typelevel libraries and ZIO provide better choices: cats.effect.IO, ZIO, and Monix Task.

task_repeat.scala

```scala
import scalaz.Nondeterminism
import scalaz.concurrent.Task
import scala.concurrent.duration.Duration

object Repeat {
  // Runs f, and in parallel forks the next invocation delayed by `duration`, forever.
  def apply(f: Task[Unit], duration: Duration): Task[List[Any]] =
    Nondeterminism[Task].gatherUnordered(
      List(Task.fork(f), Task.fork(apply(f, duration).after(duration)))
    )
}
```
run_for_missing_periods_in.scala

```scala
// This is a demo of how functional and recursive thinking can help us write unbreakable, stack-safe code.
// **This is not any IP, but a general pattern of code solving a usual simple use case**,
// and it also shows how safe it is to play with lazy structures
// (although I did that with scala.Stream, as my project didn't have a dependency on fs2).
// Do note that the entire logic can be tested with plain `scala.Double` or `scala.Int`,
// without getting into the details of `java.time.Instant`.

/**
 * Given the data freshness (e.g. data always arrives 2 days late, hence offset = 2.days)
 * and the schedule being once every 10 days (schedule = 10.days),
 * find all the time periods (start time and end time) for which the spark job should run,
 * given that the last result was updated at `lastUpdatedTime`. It should also handle cron schedules.
 */
```
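As the comment above suggests, the core logic can be exercised with plain `Int`s instead of `java.time.Instant`. A minimal sketch of that idea (the object name, the lazy-structure choice of `LazyList` over the gist's `scala.Stream`, and the arithmetic are my illustration, not the gist's code):

```scala
object MissingPeriods {
  // lastUpdatedTime: the end of the last computed period.
  // schedule: length of each period; offset: how late the data arrives.
  // A period (s, s + schedule) is runnable only once its data has landed,
  // i.e. s + schedule + offset <= now.
  def periods(lastUpdatedTime: Int, schedule: Int, offset: Int, now: Int): List[(Int, Int)] =
    LazyList
      .iterate(lastUpdatedTime)(_ + schedule)          // candidate period starts
      .takeWhile(start => start + schedule + offset <= now)
      .map(start => (start, start + schedule))
      .toList
}
```

For example, with `schedule = 10`, `offset = 2`, `now = 25`, starting from `lastUpdatedTime = 0`, only the periods `(0, 10)` and `(10, 20)` have fully arrived data.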
rle_algorithm_in_scala.scala

```scala
import scala.util.Random

sealed trait Streamss[A, B]

object Streamss extends App {
  case class NeedInput[A, B](f: A => Streamss[A, B]) extends Streamss[A, B]
  case class HasOutput[A, B](b: B, s: Streamss[A, B]) extends Streamss[A, B]
}
```
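The gist preview only shows the stream-transducer scaffolding; for reference, run-length encoding itself can be sketched in a few lines of plain Scala, independent of that machinery (`Rle` and its signatures are my illustration, not the gist's code):

```scala
object Rle {
  // "aabbbc" -> List(('a',2), ('b',3), ('c',1))
  def encode[A](as: List[A]): List[(A, Int)] =
    as.foldRight(List.empty[(A, Int)]) {
      case (a, (b, n) :: tail) if a == b => (b, n + 1) :: tail  // extend the current run
      case (a, acc)                      => (a, 1) :: acc       // start a new run
    }

  // Inverse of encode: expand each (value, count) pair back into a run
  def decode[A](pairs: List[(A, Int)]): List[A] =
    pairs.flatMap { case (a, n) => List.fill(n)(a) }
}
```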
johns_zio_challenge_quick_try.scala

```scala
// http://degoes.net/articles/zio-challenge
// This is a quick try with Ref and I will keep trying!
import scalaz.zio.{UIO, ZIO, _}

final case class Percentage(value: Double) extends AnyVal

/**
 * A `Tap` adjusts the flow of tasks through
 * an external service in response to observed
 * ...
 */
```
pointer_to_pointer.c

```c
//
//  main.c
//  Explanation of pointer to pointer by deleting a node from a linked list.
//
//  Created by Afsal Thaj on 10/3/19.
//  Copyright © 2019 Afsal Thaj. All rights reserved.
//
// The concept discussed here is pointers to pointers: https://www.eskimo.com/~scs/cclass/int/sx8.html
```