@dje
Created January 3, 2011 21:16
Paging through Fog 1K key limit
files = directory.files.all          # first page only; Fog caps a single listing at 1,000 keys
truncated = files.is_truncated
while truncated
  # Fetch the next page, starting just after the last key we already have.
  set = directory.files.all(:marker => files.last.key)
  truncated = set.is_truncated
  files = files + set
end
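
For context, directory above is a Fog directory (bucket) model. A minimal setup sketch, assuming AWS credentials and a placeholder bucket name:

require 'fog'

# Connect to S3 through Fog (the credential constants are placeholders).
storage = Fog::Storage.new(
  :provider              => 'AWS',
  :aws_access_key_id     => ACCESS_KEY_ID,
  :aws_secret_access_key => SECRET_ACCESS_KEY
)

# In Fog's model layer, a "directory" is a bucket.
directory = storage.directories.get('my-bucket')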
@jmontross

What is directory?

I've got code like so ...
storage = Fog::Storage.new({:provider => 'AWS', :aws_access_key_id => ACCESS_KEY_ID, :aws_secret_access_key => SECRET_ACCESS_KEY})
files = storage.get_bucket(bucket)

files.body['Contents'].length # => 1000
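
That 1,000 is S3's per-request cap on bucket listings, so a single get_bucket call will never return more. A quick way to confirm there are additional keys (a sketch reusing the same storage and files objects as above):

files.body['IsTruncated']            # => true when the bucket holds more than 1,000 keys
files.body['Contents'].last['Key']   # last key returned; pass it as 'marker' to fetch the next page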

@jmontross

EXACT CODE THAT WORKS.

storage = Fog::Storage.new({:provider => 'AWS', :aws_access_key_id => ACCESS_KEY_ID, :aws_secret_access_key => SECRET_ACCESS_KEY})

# S3 returns at most 1,000 keys per request no matter how large 'max-keys' is,
# so keep re-requesting with a 'marker' until the response is no longer truncated.
files = storage.get_bucket("lessonOverFlow", {'max-keys' => '100000'})

truncated = files.body['IsTruncated']
the_response = files.body['Contents']
while truncated
  files = storage.get_bucket("lessonOverFlow", {'max-keys' => '100000', 'marker' => files.body['Contents'].last["Key"]})
  truncated = files.body['IsTruncated']
  the_response = the_response + files.body['Contents']
end
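
Depending on the fog version, the model layer can also do this paging for you: iterating with files.each is meant to walk every page, whereas files.all only fetches the first 1,000 keys. A sketch, assuming a directory obtained as in the setup shown earlier:

# Collect every key in the bucket; the collection should issue
# follow-up (marker-based) listing requests behind the scenes.
all_keys = []
directory.files.each do |file|
  all_keys << file.key
end
all_keys.length   # should exceed 1,000 for a large bucket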
