@pmventura
Last active April 25, 2018 08:45
PySpark Installation on AWS EC2
1. Launch an AWS Ubuntu instance
2. sudo apt-get update
3. sudo apt install python3-pip
4. pip3 install jupyter
5. sudo apt-get install default-jre
- Verify the install: java -version
6. sudo apt-get install scala
7. pip3 install py4j
8. wget http://www-us.apache.org/dist/spark/spark-2.3.0/spark-2.3.0-bin-hadoop2.7.tgz (substitute the latest Spark release if you prefer; the tarball name in the next step must match)
- tar -xzvf spark-2.3.0-bin-hadoop2.7.tgz
9. pip3 install findspark
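findspark exists because pip-installed Jupyter knows nothing about the Spark tarball you just extracted. Calling findspark.init() essentially exports SPARK_HOME and puts Spark's Python bindings on sys.path; a minimal sketch of that mechanism (the home-directory path and the py4j zip version are assumptions based on the Spark 2.3.0 download above and may differ on your instance):

```python
import os
import sys

# Assumed extraction path from step 8; adjust to where you untarred Spark.
spark_home = os.path.expanduser("~/spark-2.3.0-bin-hadoop2.7")

# What findspark.init() does under the hood: export SPARK_HOME and put
# Spark's python/ dir and the bundled py4j zip on sys.path so that
# `import pyspark` resolves. The py4j version varies by Spark release.
os.environ["SPARK_HOME"] = spark_home
sys.path.insert(0, os.path.join(spark_home, "python"))
sys.path.insert(0, os.path.join(spark_home, "python", "lib", "py4j-0.10.6-src.zip"))
```

In practice you just call findspark.init(spark_home) in a notebook cell and let it do this for you.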
10. jupyter notebook --generate-config
11. cd ~
- mkdir certs
- cd certs
- sudo openssl req -x509 -nodes -days 365 -newkey rsa:1024 -keyout mycert.pem -out mycert.pem
- Country Name: US
- State or Province: CA
- Skip the other entries if you want
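The certificate steps above can be collapsed into one non-interactive block, using openssl's -subj flag to pre-answer the prompts (the subject values are the same US/CA placeholders as above; note that sudo is unnecessary for files in your own home directory, and skipping it avoids ending up with a root-owned key the notebook server cannot read):

```shell
# One-shot version of step 11: self-signed cert, no interactive prompts.
mkdir -p ~/certs
openssl req -x509 -nodes -days 365 -newkey rsa:1024 \
    -keyout ~/certs/mycert.pem -out ~/certs/mycert.pem \
    -subj "/C=US/ST=CA"
```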
12. cd ~/.jupyter
13. vim jupyter_notebook_config.py
14. Add this config at the beginning of the jupyter_notebook_config.py file:
c = get_config()
c.NotebookApp.certfile = u'/home/ubuntu/certs/mycert.pem'
c.NotebookApp.ip = '*'
c.NotebookApp.open_browser = False
c.NotebookApp.port = 8888