Manipulation of massive astronomical data using graphs
Background
The Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST) will generate terabytes of data and send out a stream of around 10 million alerts per night. LSST has committed to making part of the useful data it collects public in the form of alerts generated in real time. AstroLab Software has developed Fink, an Apache Spark based broker infrastructure that can analyze large streams of alert data from telescopes such as LSST and redistribute them to subscribers, enabling a wide range of applications and services to consume this data. The processed data needs to be stored for visualization and post-processing. The efficient manipulation and visualization of patterns in this extremely large dataset is a real challenge, and grafink was created to address it.
A function is a mapping from one set, called a domain, to another set, called the codomain. A function associates every element in the domain with exactly one element in the codomain. In Scala, both domain and codomain are types.
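As a minimal illustration of this idea (the function name is hypothetical), here is a function whose domain is the type `Int` and whose codomain is the type `String`; every `Int` maps to exactly one `String`:

```scala
// A function value from the domain Int to the codomain String.
// Every element of the domain is associated with exactly one
// element of the codomain.
object FunctionExample {
  val describe: Int => String =
    n => if (n % 2 == 0) "even" else "odd"

  def main(args: Array[String]): Unit = {
    println(describe(4)) // even
    println(describe(7)) // odd
  }
}
```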
Serialization of Scala closures that contain local defs
Several Apache Spark APIs rely on the ability to serialize Scala closures. Closures may reference non-Serializable objects, preventing them from being serialized. In some cases (SI-1419 and others), however, these references are unnecessary and can be nulled out, allowing otherwise-unserializable closures to be serialized (in Spark, this nulling is performed by the ClosureCleaner).
Scala 2.12's use of Java 8 lambdas to implement closures appears to have broken our ability to serialize closures that contain local defs. If we cannot resolve this problem, Spark will be unable to support Scala 2.12 and will be stuck on 2.10 and 2.11 forever.
As an example that illustrates this problem, the following closure has a nested local def and is defined inside a non-serializable class:
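A sketch of such a closure follows (the class and method names are hypothetical; the original example is not reproduced in this excerpt). The local def is compiled as a method on the enclosing class, so under Scala 2.12's lambda encoding the closure may capture the enclosing `this`, which is not serializable:

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// A class that is deliberately NOT Serializable, standing in for the
// non-serializable enclosing class described above.
class NonSerializable {
  // Returns a closure containing a local def. Depending on how the
  // compiler lifts `localDef`, the closure may capture `this` and
  // therefore fail to serialize.
  def makeClosure: Int => Int = { x =>
    def localDef(y: Int): Int = y * 2
    localDef(x)
  }
}

object ClosureDemo {
  def main(args: Array[String]): Unit = {
    val f = (new NonSerializable).makeClosure
    println(f(3)) // the closure itself works: prints 6

    // Attempting Java serialization, as Spark does when shipping
    // closures to executors, may fail if `this` was captured.
    val out = new ObjectOutputStream(new ByteArrayOutputStream)
    try out.writeObject(f)
    catch {
      case e: NotSerializableException => println(s"serialization failed: $e")
    }
  }
}
```

In Spark itself the `ClosureCleaner` attempts to null out such unneeded references before serialization, which is exactly the mechanism this breakage undermines.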