Mehdi Chamouma (mehdichamouma)

🏠
Working from home
View GitHub Profile
lukewang1024 / pyspark-java9-issue.md
Created October 25, 2017 11:36
Apache Spark does not work with Java 9 yet. Switch back to Java 8 to get it running.

When Java 9 is the default version resolved in the environment, pyspark throws the error below, and you will see a "name 'xx' is not defined" error when trying to access sc, spark, etc. from the shell or Jupyter.

Python 3.6.3 (default, Oct 19 2017, 13:58:41)
[GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.38)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
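For reference, here is a minimal sketch of bringing up a working PySpark session once Java 8 is resolved again. It is not part of the original gist: it assumes macOS (the log above shows darwin), a Java 8 JDK discoverable via the /usr/libexec/java_home helper, and pyspark installed in the current Python environment; the app name "java8-check" is just a placeholder.

```python
import os
import subprocess

from pyspark.sql import SparkSession

# /usr/libexec/java_home is the macOS helper that locates a JDK by major version.
# The JVM is only launched by getOrCreate() below, so setting JAVA_HOME first is enough.
os.environ["JAVA_HOME"] = (
    subprocess.check_output(["/usr/libexec/java_home", "-v", "1.8"]).decode().strip()
)

spark = (
    SparkSession.builder
    .master("local[*]")
    .appName("java8-check")
    .getOrCreate()
)
print(spark.version)                                    # installed Spark version
print(spark.sparkContext.parallelize(range(10)).sum())  # 45 if the JVM started correctly
spark.stop()
```

If the session starts cleanly, spark and its SparkContext are defined again, which removes the "name 'xx' is not defined" symptom described above.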
satya164 / MainActivity.java
Last active December 2, 2016 20:03
react-native-fbsdk in React Native 0.29.+
import com.facebook.react.ReactActivity;

public class MainActivity extends ReactActivity {

    /**
     * Returns the name of the main component registered from JavaScript.
     * This is used to schedule rendering of the component.
     */
    @Override
    protected String getMainComponentName() {
        return "MyApp";
    }
}