
AniX / tokenizer.py
Last active July 4, 2016 21:10
Work-around that tokenizes text-field values for the Google App Engine Search API
def partialize(phrase, shortest=5):
    """Tokenize the string `phrase` into all possible sub-strings of at
    least `shortest` characters. This is a work-around for Google App
    Engine's Search API not supporting partial full-text search (as of
    the time of writing, April 2013). For a BBCode-formatted phrase,
    strip away all BBCode tags before passing the string to this function.
    """
    # NOTE: body reconstructed from the docstring; the original implementation may differ.
    tokens = set()
    for word in phrase.split():
        for start in range(len(word) - shortest + 1):
            for end in range(start + shortest, len(word) + 1):
                tokens.add(word[start:end])
    return ' '.join(sorted(tokens))
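A minimal usage sketch, assuming a hypothetical Article ndb model with a `title` property (the model, field names and index name are illustrative, not part of the gist): the partialized string is stored in its own TextField so that searches for fragments of a word still match the document.

from google.appengine.api import search

def index_article(article):
    # `article`, its `title`/`key` attributes and the 'articles' index name
    # are placeholder examples; the point is indexing the partialized tokens.
    document = search.Document(
        doc_id=str(article.key.id()),
        fields=[
            search.TextField(name='title', value=article.title),
            search.TextField(name='title_partial', value=partialize(article.title)),
        ])
    search.Index(name='articles').put(document)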
AniX / Cron-Job for deleting expired webapp2 UserToken
Created January 16, 2014 00:47
Code example showing how to implement a cron job that deletes expired tokens in Google App Engine Python apps using webapp2-based authentication. See http://blog.abahgat.com/2013/01/07/user-authentication-with-webapp2-on-google-app-engine/
import datetime

from google.appengine.ext import ndb
# import your custom User class here (if you use one) and adjust the paths below;
# the default webapp2 model is used as an example:
from webapp2_extras.appengine.auth.models import User

# the parameter passed to timedelta() assumes that tokens expire ~3 months after creation:
expired_tokens = User.token_model.query(
    User.token_model.created <= (datetime.datetime.utcnow() - datetime.timedelta(3 * 365 / 12)))

# delete the tokens in batches of 100:
while expired_tokens.count() > 0:
    keys = expired_tokens.fetch(100, keys_only=True)
    ndb.delete_multi(keys)
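To run this automatically, the snippet can be wrapped in a request handler and scheduled from cron.yaml. A minimal sketch, assuming a hypothetical URL /tasks/delete_expired_tokens (the handler name and route are illustrative, not part of the original gist):

import datetime

import webapp2
from google.appengine.ext import ndb
from webapp2_extras.appengine.auth.models import User

class DeleteExpiredTokensHandler(webapp2.RequestHandler):
    # Hypothetical handler; map it to /tasks/delete_expired_tokens in your routes.
    def get(self):
        cutoff = datetime.datetime.utcnow() - datetime.timedelta(3 * 365 / 12)
        expired_tokens = User.token_model.query(User.token_model.created <= cutoff)
        while expired_tokens.count() > 0:
            keys = expired_tokens.fetch(100, keys_only=True)
            ndb.delete_multi(keys)

# cron.yaml entry (example schedule, adjust as needed):
#
# cron:
# - description: delete expired webapp2 auth tokens
#   url: /tasks/delete_expired_tokens
#   schedule: every 24 hours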