@rdn32 rdn32/gist:5554769
Last active Dec 17, 2015

My solution to the webcrawler exercise in "A Tour of Go". The sample solution involved explicit locking, whereas my solution is based purely on communication (although the way it achieves this does seem a little awkward...). The assumption is that fetching things is what will take the time, so the contention on the c…
// (I've omitted the code supplied as part of the exercise)

// crawl_result carries everything one fetch produced back to the
// coordinating goroutine.
type crawl_result struct {
	url   string
	depth int
	body  string
	urls  []string
	err   error
}

// Crawl uses fetcher to recursively crawl
// pages starting with url, to a maximum of depth.
func Crawl(url string, depth int, fetcher Fetcher) {
	ch := make(chan *crawl_result)
	fetch := func(url string, depth int) {
		res := new(crawl_result)
		res.depth = depth
		res.url = url
		res.body, res.urls, res.err = fetcher.Fetch(url)
		ch <- res
	}
	current := 1 // number of fetches still in flight
	go fetch(url, depth)
	seen := make(map[string]bool)
	seen[url] = true
	for current > 0 {
		res := <-ch
		current--
		if res.err != nil {
			fmt.Println(res.err)
		} else {
			fmt.Printf("found: %s %q\n", res.url, res.body)
			depth := res.depth - 1
			for _, u := range res.urls {
				if depth > 0 && !seen[u] {
					seen[u] = true
					current++
					go fetch(u, depth)
				}
			}
		}
	}
}