'''
A script for automatically downloading earth-based image tiles and stitching
them into a single image file. Designed for downloading and stitching together
a cloud map for use with xplanet, but it could be used for other things.
Images are obtained from the Space Science and Engineering Center (SSEC)
at the University of Wisconsin-Madison through their RealEarth project.
RealEarth is a data discovery and visualization platform developed at
SSEC/CIMSS at the University of Wisconsin-Madison to support outreach
and collaboration efforts of scientists.
http://re.ssec.wisc.edu
Source: SSEC RealEarth, UW-Madison
Terms of use for the images can be found here:
https://www.ssec.wisc.edu/realearth/terms-of-use/
Usage:
1. Set the variables below. At minimum, OUTPUT needs to be set.
2. Run the script with Python.
3. It may take several minutes (at zoom level 4 it downloads 512 tiles).
4. When it completes, you should have a single image file at the location
specified.
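For example, assuming the script is saved as realearth_clouds.py (the
filename is arbitrary) and OUTPUT is set to a writable path:
    python realearth_clouds.py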
The script runs under Python 2.7 and should also be Python 3 compatible,
though that hasn't been verified. The processing time could be reduced with
multi-threaded downloading, but I didn't see any reason to increase the load
on SSEC.
The output image uses the equirectangular projection, which is the projection
xplanet and other visualization tools expect. A Mercator projection could also
be produced with minimal editing of the script.
'''
import json
import requests
from PIL import Image
# The ID of the satellite product to use. If you want realistic clouds, you
# want 'globalir'. You can see a list of products here:
# http://realearth.ssec.wisc.edu/products/
PRODUCT = 'globalir'
# Base URL of the API, kept here in case it ever changes
API_URL = 'http://re.ssec.wisc.edu/api'
# Zoom level. Valid levels are 0 (smallest image) through 20 (largest image).
# Zoom level 3 creates a 4096x2048 image which, if downloaded once an hour,
# results in about 200 MegaPixels of usage a day, well under the 500 MegaPixel
# free amount. Zoom level 4 produces an 8192x4096 image, which results in
# about 800 MegaPixels of usage a day, under the 1000 MegaPixel limit for a
# free API key. It isn't clear that zoom levels above 4 get you any more
# detail anyway.
ZOOM = 3
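# A quick sanity check of the arithmetic above (a sketch, not used by the
# download logic; the 24 assumes one download per hour):
def daily_megapixels(zoom, downloads_per_day=24):
    """Approximate MegaPixels consumed per day at a given zoom level."""
    width = (2 ** zoom) * 2 * 256   # equirectangular images are twice as wide
    height = (2 ** zoom) * 256
    return width * height * downloads_per_day / 1e6
# daily_megapixels(3) -> ~201 MP/day; daily_megapixels(4) -> ~805 MP/day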
# Your API Key.
# You can register for one at:
# https://realearth.ssec.wisc.edu/users/
# Without a key, your IP address will be limited to 500 MegaPixels per day.
# With a free API Key you can get up to 1000 MegaPixels per day. If you
# exceed those thresholds, a watermark will appear on your images.
API_KEY = ''
# Output file location.
# Output will be a PNG; this can easily be changed in the code below.
# Make sure the file location is writable by the user executing the script.
OUTPUT = ''
##############################################################################
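# The /time endpoint returns JSON mapping each requested product to its most
# recent [date, time] pair, e.g. {"globalir": [20230530, 120000]} (the exact
# shape here is inferred from how DATE[0] and DATE[1] are used below).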
DATE_REQ = requests.get(API_URL + '/time?products=' + PRODUCT)
DATE = json.loads(DATE_REQ.content)[PRODUCT]
URL = API_URL + '/image?products=' + PRODUCT + '_' + str(DATE[0]) + '_' + \
      str(DATE[1]) + '&equirectangular=true&z=' + str(ZOOM) + '&accesskey=' + \
      API_KEY
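# With the defaults above and hypothetical date/time values, the resulting
# URL looks roughly like this (x and y are appended per tile in the loop
# below):
# http://re.ssec.wisc.edu/api/image?products=globalir_20230530_120000
#     &equirectangular=true&z=3&accesskey=&x=0&y=0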
# Tiles from SSEC are 256x256
OUTPUT_RAW = Image.new('RGB', (((2 ** ZOOM) * 2 * 256), (2 ** ZOOM) * 256))
for x in range((2 ** ZOOM) * 2):  # Equirectangular images are twice as wide
    for y in range(2 ** ZOOM):
        # You could add some logging here if you wanted to see each image
        # request go by.
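        # For example, a minimal line that works in Python 2 and 3:
        # print('Fetching tile x=%s, y=%s' % (x, y))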
        tile = requests.get(URL + '&x=' + str(x) + '&y=' + str(y), stream=True)
        img_tile = Image.open(tile.raw)
        OUTPUT_RAW.paste(im=img_tile, box=(x * 256, y * 256))
OUTPUT_RAW.save(OUTPUT, "PNG")
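# To write a different format, change the format string above; for example:
#     OUTPUT_RAW.save(OUTPUT, "JPEG")
# (a sketch; any format Pillow supports will work here)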
'''
Copyright 2020 Kevin Keegan
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
'''
@krkeegan (Author):
Yeah, I don't think the issue is the actual image; I agree that is public domain. But the service of providing the image (bandwidth, CPU, API design, backend) costs money, and I agree they can charge for that. I just think $6k a year for personal use is completely unjustified, particularly for government data from a government-supported institution.

Given that the image is public domain, there isn't any way to subscribe and resell it so that we could all share in one subscription. The next guy will always be able to undercut your price.

I doubt anything could fix the watermark; it is very large. Enough of the image is missing that any NN would just be guessing at cloud data.

I have been in email communication with them, and they say that they plan on creating some cheaper plan. I am dubious it will be anything reasonable. Personal users are just not the market they are trying to supply.
