@amykyta
Last active November 6, 2015 00:10
Get data from a URL and form new URLs from it
import urllib


def do_it():
    f = urllib.urlopen("https://raw.githubusercontent.com/alexwaters/domainChecker/master/archive/domainDict")
    data = f.read()
    # closes automatically when f is garbage collected, but why not
    f.close()

    # neat solution found on Stack Overflow:
    # [iter(data)] * n creates a list of n references to the same iterator over the string
    # zip(*[list]) passes them to zip via argument unpacking, so each tuple pulls
    # the next n characters from that shared iterator
    # ''.join merges each tuple of 3 characters back into a string
    data = map(''.join, zip(*[iter(data)]*3))
    data = ["http://" + url + ".com" for url in data]
    print "\n".join(data)


if __name__ == '__main__':
    do_it()
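
The snippet above targets Python 2 (urllib.urlopen and the print statement). A minimal Python 3 sketch of the same idea, assuming the same raw GitHub endpoint is still reachable and still serves fixed-width 3-character entries:

import urllib.request


def do_it():
    # fetch the raw text; the context manager closes the connection for us
    with urllib.request.urlopen(
        "https://raw.githubusercontent.com/alexwaters/domainChecker/master/archive/domainDict"
    ) as f:
        data = f.read().decode("utf-8")

    # same zip(*[iter(...)] * 3) trick: chunk the string into 3-character pieces
    names = map("".join, zip(*[iter(data)] * 3))
    urls = ["http://" + name + ".com" for name in names]
    print("\n".join(urls))


if __name__ == "__main__":
    do_it()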