
@tlmaloney
Last active February 2, 2022 02:21
Downloads and cleans up a CSV file from a Google Trends query.
@adamchapman
Copy link

Hi Tom, I read your forum posts at https://www.quantopian.com/posts/google-search-terms-predict-market-movements and it sounds like you have a solid solution to the quota limit problem.

I've been trying to do this in Matlab, then in AutoIt, and I managed to get the data out, but I hit a quota limit every day and then I'm stuck.

My efforts to run this in Python have not been successful, and I'm sure that's down to my lack of experience with Python.

Would it be possible to compile this into an executable with command line inputs?

@adamchapman
Copy link

Sorry, that was a bit premature: I'm a total novice when it comes to Python. I'm now effectively using the command prompt with:

system('python trends.py username password C:/Users/Adam/Desktop/pythontrend.csv searchterm')

I think that two simultaneous queries would look like this:

system('python trends.py username password C:/Users/Adam/Desktop/pythontrend.csv "searchterm1, searchterm2"')

but the CSV file doesn't get written when I do this. Would it be simple to upgrade this code to allow multiple search terms? Sorry to be a pain.

@tlmaloney
Copy link
Author

Hi Adam,

Try it without the comma:

system('python trends.py username password C:/Users/Adam/Desktop/pythontrend.csv "searchterm1" "searchterm2"')

and let me know if that works.
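For what it's worth, I don't have the script open in front of me, but assuming trends.py takes positional arguments in the order shown in the invocations above (username, password, output path, then one or more search terms), the argument handling could be sketched like this. `parse_args` is a hypothetical helper, not something from Tom's script:

```python
import sys

def parse_args(argv):
    """Split command-line arguments into credentials, output path, and a
    Google Trends query string (hypothetical layout matching the
    invocation: trends.py username password csv_path term [term ...])."""
    if len(argv) < 5:
        raise SystemExit(
            "usage: trends.py username password csv_path term [term ...]")
    username, password, csv_path = argv[1:4]
    # Google Trends compares multiple terms when they are joined with
    # commas in a single query string, so quote each term separately on
    # the shell and join them here.
    query = ",".join(argv[4:])
    return username, password, csv_path, query
```

With that in place, `python trends.py user pw out.csv "searchterm1" "searchterm2"` would produce the query string `searchterm1,searchterm2`.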

@supermaxim
Copy link

Mechanize automatically obeys robots.txt, but this can be disabled, assuming you have permission or have thought the ethics through.

Insert br.set_handle_robots(False) to avoid:

httperror_seek_wrapper: HTTP Error 403: request disallowed by robots.txt

At least on my Linux box, the script runs fine as Tom instructed.
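For anyone curious what that 403 actually is: mechanize performs the same kind of robots.txt check as the standard library's urllib.robotparser. A stdlib-only illustration (the Disallow rule below is made up, not Google's real robots.txt):

```python
from urllib.robotparser import RobotFileParser

# Parse a hypothetical robots.txt that disallows the Trends export path.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /trends/",
])

# A URL under /trends/ is refused; this is the check that mechanize turns
# into "HTTP Error 403: request disallowed by robots.txt" unless you call
# br.set_handle_robots(False).
print(rp.can_fetch("*", "http://www.google.com/trends/viz"))  # False
print(rp.can_fetch("*", "http://www.google.com/"))            # True
```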

@ravimevcha
Copy link

This script is not working in Python 3.5.1. Can you help upgrade it to work on the latest version?

@ravimevcha
Copy link

@supermaxim I have installed Python 2.7.11, but after that I am getting the same "HTTP Error 403: request disallowed by robots.txt" error. I added "br.set_handle_robots(False)", but this gives me another error, "unexpected indent". Where do I add this line?

@ravimevcha
Copy link

Finally, I am able to run the file now, but it's not giving any data. I am getting a CSV file with just two columns, "date" and "debt", and nothing else.

@Fat4Fox
Copy link

Fat4Fox commented May 24, 2016

I am using Python 2.7.10 and mechanize-0.2.5, but got the error below:

Traceback (most recent call last):
  File "./trends.py", line 71, in <module>
    sys.exit(main(sys.argv))
  File "./trends.py", line 42, in main
    form['Passwd'] = password
  File "/Library/Python/2.7/site-packages/mechanize/_form.py", line 2780, in __setitem__
    control = self.find_control(name)
  File "/Library/Python/2.7/site-packages/mechanize/_form.py", line 3101, in find_control
    return self._find_control(name, type, kind, id, label, predicate, nr)
  File "/Library/Python/2.7/site-packages/mechanize/_form.py", line 3185, in _find_control
    raise ControlNotFoundError("no control matching "+description)
mechanize._form.ControlNotFoundError: no control matching name 'Passwd'

Changed it to a two-step login and it's working now:

Log in to Google

# Open the Google sign-in page and submit the email form first; the
# sign-in flow asks for email and password on separate pages.
response = br.open('https://accounts.google.com/ServiceLogin?hl=en&continue=https://www.google.com/')
forms = mechanize.ParseResponse(response)
form = forms[0]
form['Email'] = username
response = br.open(form.click())

# The second page carries the password field.
forms = mechanize.ParseResponse(response)
form = forms[0]
form['Passwd'] = password
response = br.open(form.click())
