NFS and AE5 instructions

On the host server for NFS, install the NFS utilities

yum install nfs-utils nfs-utils-lib -y

Create a directory for the NFS share and set its permissions and ownership

mkdir /nfs
chmod 655 /nfs
chown nfsnobody:nfsnobody /nfs

Edit the exports for the NFS server

vi /etc/exports

Add an export for the NFS share you set up above, e.g. the following (replace AE5_MASTER_IP with the IP address of the AE5 master)

/nfs AE5_MASTER_IP(rw,sync,no_root_squash)
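
If your AE5 install has additional nodes that will run project sessions, those nodes would also need to be able to mount the export. One hedged option is to add an entry per node IP, or a subnet range, for example (10.0.0.0/24 is only a placeholder range):

/nfs 10.0.0.0/24(rw,sync,no_root_squash)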

Enable and start the services for the NFS share

systemctl enable rpcbind
systemctl enable nfs-server
systemctl start nfs-server
systemctl start rpcbind
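
Optionally, verify that the share is being exported before moving on. A quick sketch of checks on the NFS server (assumes the nfs-utils tools installed above):

# Show what the server is currently exporting
exportfs -v
# Confirm the export is visible via mountd
showmount -e localhost
# If you edit /etc/exports later while nfs-server is running, re-export without a restart
exportfs -ra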

On the AE5 master, go to OPS Center and log in with your credentials. Click Configuration on the left-hand side and search for NFS. You will see the following uncommented section, which is already set up:

volumes:
  {}

Replace this with the NFS details for the share you set up above, for example:

volumes:
  mynfs: # Volume name; projects will see this share under /data/mynfs
    nfs:
      server: NFS_SERVER_IP
      path: "/nfs/" # Export path on the NFS server; in this example it was /nfs/
      readOnly: true
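
Before applying, it can be worth confirming that the AE5 master can actually reach the export. A rough check from the master host (assumes nfs-utils is installed there; /mnt is just an arbitrary test mount point):

showmount -e NFS_SERVER_IP
# Optionally test-mount the export, list it, and unmount again
mount -t nfs NFS_SERVER_IP:/nfs /mnt
ls /mnt
umount /mnt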

Click Apply at the bottom of the file to save the changes. Log in to the AE5 master over SSH (you can also get a root shell on the AE5 master server through OPS Center).

gravity enter
# Restart all of the pods so they pick up the configuration changes made above
kubectl get pods --no-headers | cut -d' ' -f1 | xargs kubectl delete pods

Watch the pods come back up; once all of them are running and ready, proceed to the next steps.

watch kubectl get pods

Log in to the AE5 UI and create a new project. Once the project comes up, you can start a session for it as well.

From the project you should be able to access the NFS share at the following location: /data/mynfs/DATAFILE.csv
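
To double-check the mount from a project session first, a terminal check along these lines should work (mynfs matches the volume name configured above; exact output will vary):

# Confirm the NFS volume is mounted into the session
df -h /data/mynfs
ls -la /data/mynfs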

Below is a quick pandas example that reads a CSV file from an NFS share named bigdata:

import pandas as pd
data = pd.read_csv("/data/bigdata/test_data.csv") 
# Check the first 10 rows of the data
data.head(10)

Additional documentation

NFS Setup

Loading data in projects

Working with projects
