Gist by celoyd, last active August 1, 2022 15:37.

Himawari-8 animation tutorial

Here’s how to make animations like this one. It requires intermediate Unix command-line knowledge to install some tools and to debug them if they don’t work. You’ll need these utilities:

  • curl (or you can translate to wget)
  • convert and montage, part of ImageMagick
  • ffmpeg, plus whatever codecs
  • parallel, for iteration that’s nicer than shell for loops or xargs
  • zsh; run everything in it so that leading 0s in numerical ranges work
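Before starting, a throwaway snippet to check that everything is on your PATH (just a convenience, not part of the original workflow):

```shell
# Report which of the required tools are installed.
for tool in curl convert montage ffmpeg parallel zsh; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "ok: $tool"
  else
    echo "MISSING: $tool"
  fi
done
```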

It’s a lumpy process with plenty of wasted effort and unnecessarily manual parts. Please build off it (CC-BY), and get clarifications from me on Twitter. This is a first draft, copied from my terminal history, with bad variable names.

About Himawari-8

Himawari-8 is a Japanese weather satellite. It’s in geostationary orbit at about 141° E (near New Guinea) and has been operational since 2015-07-07. It’s noteworthy in many ways, but what’s cool here is that it produces full-disk, true-color images every 10 minutes. The L1 data is only distributed to national weather agencies in the field of view, but they can redistribute it.

JMA, the responsible agency, should be congratulated for producing timely open data like this. They clearly made an effort to do it right, and everyone who uses anything that relies on a weather forecast benefits.

Fetching the data

The only public sources I know are CIRA RAMMB and (here) NICT. RAMMB does impressive correction – they pretty much remove first-order atmospheric and sun-angle effects. However, they only distribute two sizes, they put data in the night hemisphere that I’m not too keen on, they leave out several images around local solar midnight, they crop out Earth’s disk, and their images are JPEGs (good JPEGs, but this workflow already involves repeated lossy compression). So, broadly speaking, the NICT data is rawer, and in my opinion makes a cooler animation.

Scraping looks like this for the 550×550 version (using parallel to build and rewrite the date strings):

parallel -j5 --delay 0.1 --header : 'curl -C - "{year}/{mo}/{dy}/{hr}{tenmin}000_0_0.png" > {year}-{mo}-{dy}T{hr}{tenmin}000.png' ::: year 2015 ::: mo 11 ::: dy {27..28} ::: hr {00..23} ::: tenmin {0..5}

(N.b. to turbonerds: their HTTP server doesn’t seem to do ranges, so the -C - is probably useless. Also theoretically it would be better to do the range in curl instead of parallel, so it could reuse connections.)

If you go that way, you’ll have a smaller but still really cool animation. But here I’ll work a slightly trickier case, with bigger images that come tiled 2×2:

parallel -j5 --delay 0.1 --header : 'curl -C - "{year}/{mo}/{dy}/{hr}{tenmin}000_{x}_{y}.png" > {year}-{mo}-{dy}T{hr}{tenmin}000-{x}{y}.png' ::: year 2015 ::: mo 11 ::: dy {27..28} ::: hr {00..23} ::: tenmin {0..5} ::: x {0,1} ::: y {0,1}

One way to untile them (skip this if you downloaded the smaller all-in-one images!):

parallel 'mkdir {} && mv *{}.png {}' ::: {0,1}{0,1}

mkdir full

parallel --header : 'montage -mode concatenate -tile 2x {00,10,01,11}/{year}-{mo}-{dy}T{hr}{tenmin}000-*.png full/{year}-{mo}-{dy}T{hr}{tenmin}000.png' ::: year 2015 ::: mo 11 ::: dy {27..28} ::: hr {00..23} ::: tenmin {0..5}

(A clearer way might be to fetch each set of 4 tiles into a directory named for their time, then iterate montage that way. And if you were feeling fancy, you could use, say, requests and skimage to fetch the data and untile it all within python.)
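To make the untiling bookkeeping concrete, here’s a minimal pure-Python sketch of the tile-to-offset math. The (x, y)-to-pixel mapping follows the _{x}_{y} URL suffix above; the 550-pixel tile size is a stand-in, not a verified figure, and with an imaging library such as Pillow you’d paste each tile image at its computed offset:

```python
def paste_offset(x, y, tile_size):
    """Pixel offset of tile (x, y) in the assembled mosaic.
    x increases rightward, y downward, matching the _{x}_{y} suffix."""
    return (x * tile_size, y * tile_size)

def untile_plan(tiles, tile_size):
    """Map each (x, y) tile index to its paste offset."""
    return {(x, y): paste_offset(x, y, tile_size) for x, y in tiles}

# For the 2x2 case, each tile would be pasted at its offset
# (e.g. with Pillow: mosaic.paste(tile_img, offset)):
plan = untile_plan([(0, 0), (1, 0), (0, 1), (1, 1)], tile_size=550)
print(plan[(1, 1)])  # bottom-right tile goes at (550, 550)
```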

Check the images – if everything worked so far, they should be beautiful already. There will be a few with text reading “No Image”; we’ll get to them in the interpolation section.

Color tweaks

The images come pretty dark – they’re mainly for looking at clouds. Let’s brighten them up so that the lightest clouds are washed out but we can see more color and texture on land (change full if the images are in some other directory):

mkdir pretty

parallel 'convert -channel R -gamma 1.2 -channel G -gamma 1.1 +channel -sigmoidal-contrast 3,50% {} pretty/{/}' ::: full/*

We brighten the red and green channels because the atmosphere tends to dampen them, and add contrast to fill out the histogram a bit. This is just what looks right to me on my screen tonight; I always do it a little different. Play around. (As a rule of thumb, if you’re going for realism, opaque clouds in the middle of the image near noon should be close to neutral gray or white.)
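For reference, -gamma g maps a normalized pixel value v to v**(1/g), so values above 1 brighten a channel. A toy sketch of that mapping (my own function, not ImageMagick’s implementation):

```python
def apply_gamma(v, g):
    """Gamma-adjust a pixel value v in [0, 1]; g > 1 brightens midtones."""
    return v ** (1.0 / g)

# A mid-gray red value of 0.5 with -gamma 1.2:
print(round(apply_gamma(0.5, 1.2), 3))  # ≈ 0.561: midtones get brighter
```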


If you page through the images, you’ll notice the ones for 02:40 and 14:40 UTC are always placeholders, without payload data. These gaps are at solar noon and midnight, and they’re consistent – I assume the satellite is busy with uplinks or something on that cycle. (There are occasionally other gaps, which seem to be fairly randomly distributed.)

I use the fact that 14:40 is midnight to help me trim to a local solar day. I look in the Finder and sort by name with the file size visible (because the gaps are tiny). I select everything before and including the first midnight, 2015-11-27T144000.png, and throw them out; then everything after but not including the second midnight, 2015-11-28T144000.png, and throw them out too.


Now there are 24 hours × 6 images per hour = 144 images in the directory, but two of them are placeholders. I fill them manually, linearly interpolating between the images before and after – but cheating by pretending that the day loops, so the first image in the sequence is after the (missing) last one:

cd pretty

convert -average 2015-11-28T023000.png 2015-11-28T025000.png 2015-11-28T024000.png

convert -average 2015-11-27T145000.png 2015-11-28T143000.png 2015-11-28T144000.png

We can do a rough-cut animation like this:

convert -delay 10 *.png ../fulldisk.gif

If that doesn’t look basically right, panic.

Now for nitpicking.

First, as a GIF, this has a lot of speckly dithering artifacts, and it’s 60 megabytes. With a lot of cleverness we could mitigate both problems, but not enough. So we’re going to use Gfycat, a hosting service that transparently transcodes uploads into much more compact, proper video formats.

Second, in this speed range – where you can see the clouds move but have time to watch details – the framerate is a bit low. Gfycat has a 15-second limit, and filling it with 144 images will look a bit jumpy. Unless we cheat.

We’re going to interpolate between each pair of 10-minute images. At this scale, clouds move only a couple of pixels per frame at most, and the interpolation isn’t noticeable. We’ll have 288 images, giving us 24 fps over 12 seconds.

I made this python script to generate the filenames we’ll need. Run it like this:

cd ..

python pretty | parallel --colsep ' ' convert -average {1} {2} {3}
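In case that script link rots, its job is roughly the following: pair each frame with its successor (wrapping around, as with the midnight gap) and emit the two inputs plus a name for the in-between frame. A sketch – the …500.png midpoint naming is my guess at a convention, not necessarily what the original script emits:

```python
def interp_jobs(frames):
    """Pair each frame filename with its successor (wrapping around, as we
    did for the midnight gap) and name the synthetic in-between frame so
    that it sorts between its two sources."""
    jobs = []
    for i, before in enumerate(frames):
        after = frames[(i + 1) % len(frames)]
        # Frames end in "...000.png"; "...500.png" splits the 10-minute gap.
        out = before[:-len("000.png")] + "500.png"
        jobs.append((before, after, out))
    return jobs

# Each printed line feeds `convert -average {1} {2} {3}` via parallel:
for job in interp_jobs(["T000000.png", "T001000.png", "T002000.png"]):
    print(*job)
```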


ffmpeg -r 24 -f image2 -pattern_type glob -i 'pretty/*.png' -vcodec mpeg4 -b:v 16000k fulldisk.mp4

That bitrate may be too much, or for that matter too little.


I made a Gfycat account because it seemed like the best of a questionable lot.

Be a good data citizen and credit Himawari-8 in the metadata.



Further work

First, automating more of this! It would also be nice to look at:

  • Lossless video (that Gfycat can decode)
  • Zooming in
  • Looking at the horizon/limb
  • Smarter interpolation
Comments

Small tip:

If you switch around x and y in the output file names – i.e., ...-{y}{x}.png instead of ...-{x}{y}.png:

parallel -j5 --delay 0.1 --header : 'curl -C - "{year}/{mo}/{dy}/{hr}{tenmin}000_{x}_{y}.png" > {year}-{mo}-{dy}T{hr}{tenmin}000-{y}{x}.png' ::: year 2015 ::: mo 11 ::: dy {27..28} ::: hr {00..23} ::: tenmin {0..5} ::: x {0,1} ::: y {0,1}

you can simply do

parallel --header : 'montage -mode concatenate -tile 2x {year}-{mo}-{dy}T{hr}{tenmin}000-*.png full/{year}-{mo}-{dy}T{hr}{tenmin}000.png' ::: year 2015 ::: mo 11 ::: dy {27..28} ::: hr {00..23} ::: tenmin {0..5}

and forgo the step of moving them into separate folders. This might be easier when using the bigger tile sets.


celoyd commented Jan 14, 2016

> if you were feeling fancy, you could use, say, requests and skimage to fetch the data and untile it all within python.

Here’s a standalone fetcher/untiler in requests/Image.


celoyd commented Jan 22, 2016

And here’s a way of removing the stripe noise you see at the highest zoom levels.


thepiwo commented Jan 27, 2016

Is there a license for this work? I would like to open-source a bash script to download and render videos based on this.


m-ad commented Jan 31, 2016

I rewrote this to work under Windows. It is mostly Python-based; you can check it out here:
