Created
February 25, 2012 00:15
try to figure out collation in couchdb
from couchdb.client import Server

server = Server()
db = server['collation-test']
try:
    for idx in range(65535):
        db.save({'unichr': unichr(idx), 'code': idx, 'hex': str(hex(idx))})
        print "{0}\n".format(hex(idx))
except Exception, e:
    raise e
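One pitfall worth flagging (my assumption; the gist never says what the indexer bug actually is): range(65535) includes the surrogate code points U+D800 through U+DFFF, which are not valid as standalone characters in a Unicode string, and many JSON/ICU stacks reject lone surrogates. A minimal sketch that skips that range before inserting:

```python
# Sketch: skip the UTF-16 surrogate range (U+D800-U+DFFF). Lone surrogates
# are not valid Unicode scalar values, and JSON encoders or CouchDB's ICU
# collation layer may choke on documents containing them.
SURROGATE_LO, SURROGATE_HI = 0xD800, 0xDFFF

def safe_codepoints(limit=65535):
    """Yield code points below `limit`, excluding lone surrogates."""
    for idx in range(limit):
        if SURROGATE_LO <= idx <= SURROGATE_HI:
            continue
        yield idx

codepoints = list(safe_codepoints())
```

The loop in the script above could then iterate over `safe_codepoints()` instead of `range(65535)`.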
function(doc) {
    log(doc.code + " hex:" + doc.hex);
    emit(doc.unichr, doc.hex);
}
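When reading this view's output, note that CouchDB sorts view keys with ICU collation rather than raw code-point order: for strings that means 'a' < 'A' < 'b' < 'B', not all uppercase before all lowercase. A rough Python approximation of that ordering for plain ASCII strings (the sort key here is my own ASCII-only approximation, not CouchDB's actual ICU rules):

```python
words = ['b', 'A', 'B', 'a']

# Raw code-point order: all uppercase sorts before all lowercase.
codepoint_order = sorted(words)  # -> ['A', 'B', 'a', 'b']

# Rough ICU-style order (ASCII-only approximation): compare
# case-insensitively first, then break ties lowercase-before-uppercase.
icu_like = sorted(words, key=lambda s: (s.lower(), s.swapcase()))
# -> ['a', 'A', 'b', 'B']
```

This is why a view keyed on raw characters can come back in an order that looks "wrong" if you expect byte or code-point sorting.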
Not really the point... it seems there's just a bug in the indexer this map exposes. If I had done the bulk update, I wouldn't have known if the insert failed.
…On Feb 27, 2012, at 6:50 PM, kxepal wrote:
bulk_update works faster(;
docs = []
for idx in range(65535):
    docs.append({'unichr': unichr(idx), 'code': idx, 'hex': str(hex(idx))})
db.update(docs)
It's easy to handle insert failure:
docs = []
for idx in range(65535):
    docs.append({'unichr': unichr(idx), 'code': idx, 'hex': str(hex(idx))})
for success, docid, rev_or_exc in db.update(docs, all_or_nothing=True):
    if success:
        print docid, rev_or_exc
    else:
        raise rev_or_exc
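One practical note: db.update sends all docs in a single _bulk_docs request, which for 65,535 documents is a very large POST. A common workaround is to send the docs in fixed-size batches; a sketch with a hypothetical `batched` helper (not part of couchdb-python):

```python
def batched(items, size):
    """Yield consecutive slices of `items`, each at most `size` long."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# Usage sketch (assumes `db` and `docs` from the snippets above):
# for batch in batched(docs, 1000):
#     for success, docid, rev_or_exc in db.update(batch):
#         if not success:
#             raise rev_or_exc
```

Each batch still gets per-document (success, docid, rev_or_exc) results, so failure handling works the same as in the snippet above.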