@deckarep
Last active December 20, 2016 23:36
Shared Nothing Concurrency
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"os"
	"strings"
	"sync"
)

// This approach demonstrates a shared-nothing architecture where each goroutine receives
// a single site that it downloads and stores on disk as an .html file.
// The WaitGroup is needed purely to ensure the process doesn't exit while the concurrent
// downloads are still running.
func main() {
	// List of all the sites we'd like to download concurrently.
	sites := []string{"http://google.com", "http://gmc.com"}

	// Allows us to wait for all work to complete before main returns.
	var wg sync.WaitGroup
	wg.Add(len(sites))

	// Kick off one goroutine per site; each handles the fetching and writing of its own
	// file derived from the site name, sharing no state with the other goroutines.
	for _, link := range sites {
		go func(l string) {
			// Guarantee the WaitGroup is released even if this goroutine returns early.
			defer wg.Done()

			// Issue an HTTP GET request.
			resp, err := http.Get(l)
			if err != nil {
				// Note: log.Fatal exits the whole process and skips deferred calls;
				// that's acceptable for a small demo like this one.
				log.Fatal("Couldn't download from link with err: ", err.Error())
			}
			defer resp.Body.Close()

			// Create a file on disk for this particular site being downloaded.
			file, err := os.Create(strings.TrimPrefix(l, "http://") + ".html")
			if err != nil {
				log.Fatal("Couldn't create file with err: ", err.Error())
			}
			defer file.Close()

			// Copy the response body to the file as a stream.
			_, err = io.Copy(file, resp.Body)
			if err != nil {
				log.Fatal("Couldn't copy the response to file with err: ", err.Error())
			}
		}(link)
	}

	wg.Wait()
	fmt.Printf("%d files successfully downloaded.\n", len(sites))
}
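One design note on the error handling above: because each goroutine calls log.Fatal, a single failed site terminates the entire process. Below is a minimal sketch (not part of the original gist) of an alternative that keeps the shared-nothing property but has each goroutine report its outcome back over a channel, so one failure doesn't abort the other downloads. The result type and download helper are hypothetical names introduced only for illustration; the sites list and file-naming scheme are assumed to match the gist.

package main

import (
	"fmt"
	"io"
	"net/http"
	"os"
	"strings"
)

// result is a hypothetical message type; each goroutine sends exactly one.
type result struct {
	site string
	err  error
}

func main() {
	sites := []string{"http://google.com", "http://gmc.com"}

	// Buffered so every goroutine can send its result without blocking.
	results := make(chan result, len(sites))

	for _, link := range sites {
		go func(l string) {
			results <- result{site: l, err: download(l)}
		}(link)
	}

	// Receiving len(sites) messages doubles as the "wait" step, so no WaitGroup is needed.
	for range sites {
		r := <-results
		if r.err != nil {
			fmt.Printf("failed to download %s: %v\n", r.site, r.err)
		} else {
			fmt.Printf("downloaded %s\n", r.site)
		}
	}
}

// download fetches one site and writes it to disk, mirroring the goroutine body in the gist.
func download(l string) error {
	resp, err := http.Get(l)
	if err != nil {
		return err
	}
	defer resp.Body.Close()

	file, err := os.Create(strings.TrimPrefix(l, "http://") + ".html")
	if err != nil {
		return err
	}
	defer file.Close()

	_, err = io.Copy(file, resp.Body)
	return err
}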