@j08lue
Created July 28, 2015 11:09
Convert Polar Science Center Hydrographic Climatology (PHC) 3.0 data from Levitus ASCII to netCDF
"""
Convert PHC 3.0 data from Levitus ASCII to netCDF
Polar Science Center Hydrographic Climatology (PHC)
A Global Ocean Hydrography with a High Quality Arctic Ocean
http://psc.apl.washington.edu/nonwp_projects/PHC/Climatology.html
Citation:
PHC 3.0, updated from:
Steele, M., R. Morley, and W. Ermold, PHC: A global
ocean hydrography with a high quality Arctic Ocean,
J. Climate, 14, 2079-2087, 2001.
From the data download page:
http://psc.apl.washington.edu/nonwp_projects/PHC/Data3.html
* These files are structured and named to be analogous with the format of the
  World Ocean Atlas (Levitus) "analyzed" fields.
* The data are in ASCII format.
* Data points begin at -89.5 S and are arranged on a matrix which runs first
  over longitude, from .5 E to 359.5 E degrees, then over latitude, ending
  with 89.5 N (the data "spirals" up the world).
* This repeats for every depth.
* There are 10 entries per line, with 8 significant figures, 4 of them after
  the decimal point (10f8.4).
* The land value is -99.
* All of the depths for each field are included in each file.
* Each level has 64800 data values, one for every latitude and longitude
  coordinate (360 x 180).
* Since there are 10 values per line, every 6480 lines of data constitute one
  level.
* For example, the first 6480 lines of data in the "Winter Temperature" file
  are the surface field; the next 6480 lines are the depth=10 meters field,
  and so on.
* Latitude and longitude files structured like these data files are available
  on the data download page under "Product Aides".
"""
import numpy as np
import xray  # "xray" is the former name of the xarray package
# hard-coded list of depth levels from data website
levels = np.array([
    0, 10, 20, 30, 50, 75, 100, 125, 150, 200, 250, 300, 400, 500,
    600, 700, 800, 900, 1000, 1100, 1200, 1300, 1400, 1500, 1750,
    2000, 2500, 3000, 3500, 4000, 4500, 5000, 5500])
nlevels = 33
assert len(levels) == nlevels
# longitude coordinates
lons = np.arange(.5, 359.5+1, 1)
nlons = 360
assert len(lons) == nlons
# latitude coordinates
lats = np.arange(-89.5, 89.5+1, 1)
nlats = 180
assert len(lats) == nlats
# missing value
missval = -99.
# file names
files = {
    'salt': 'Salt00_p3.obj.gz',
    'temp': 'Temp00_p3.obj.gz'}
# dimensions
dims = ['depth', 'lat', 'lon']
# attributes
attributes = {
    'salt': dict(name='salinity', units='', long_name='Salinity'),
    'temp': dict(name='temperature', units='deg C', long_name='Potential temperature'),
}
# coordinates
coords = {
    'lon': lons,
    'lat': lats,
    'depth': levels}
# read data from files
datadict = {}
for varn, fname in files.items():
    # fixed-width fields: 10 values per line, 8 characters each (10f8.4)
    data = np.genfromtxt(fname, delimiter=[8] * 10).reshape((nlevels, nlats, nlons))
    # mask the land value so it becomes a missing value in the output
    data = np.ma.masked_where(data == missval, data)
    datadict[varn] = (dims, data, attributes[varn])
# convert to xray dataset
ds = xray.Dataset(datadict, coords=coords)  # each entry is a (dims, data, attrs) tuple
ds.attrs['title'] = 'PHC 3.0'  # global attribute, written to the netCDF file
# save to netCDF
ds.to_netcdf('PHC30_annual_means.nc')
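# Optional verification sketch: re-open the file that was just written and
# inspect its structure and the surface temperature range (`ds_check` is just
# an illustrative name; `open_dataset` and `isel` are standard xray calls).
ds_check = xray.open_dataset('PHC30_annual_means.nc')
print(ds_check)
print(ds_check['temp'].isel(depth=0).min())
print(ds_check['temp'].isel(depth=0).max())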