+7 votes
in Amazon Web Services by

I am trying to list all directories within an S3 bucket using Python and Boto3.

I am using the following code:

s3 = session.resource('s3')  # I already have a boto3 Session object
bucket_names = [...]  # list of bucket names (elided here)
for name in bucket_names:
    bucket = s3.Bucket(name)
    for obj in bucket.objects.all():  # this raises an exception
        ...  # handle obj

When I run this I get the following exception stack trace:

File "botolist.py", line 67, in <module>
  for obj in bucket.objects.all():
File "/Library/Python/2.7/site-packages/boto3/resources/collection.py", line 82, in __iter__
  for page in self.pages():
File "/Library/Python/2.7/site-packages/boto3/resources/collection.py", line 165, in pages
  for page in pages:
File "/Library/Python/2.7/site-packages/botocore/paginate.py", line 83, in __iter__
  response = self._make_request(current_kwargs)
File "/Library/Python/2.7/site-packages/botocore/paginate.py", line 155, in _make_request
  return self._method(**current_kwargs)
File "/Library/Python/2.7/site-packages/botocore/client.py", line 270, in _api_call
  return self._make_api_call(operation_name, kwargs)
File "/Library/Python/2.7/site-packages/botocore/client.py", line 335, in _make_api_call
  raise ClientError(parsed_response, operation_name)

botocore.exceptions.ClientError: An error occurred (NoSuchKey) when calling the ListObjects operation: The specified key does not exist.

What is the correct way to list directories inside a bucket?

6 Answers

+2 votes

All these other responses suck. Using a single list_objects call limits you to 1,000 results max. The rest of the answers are either wrong or too complex.

Dealing with the continuation token yourself is a terrible idea. Just use a paginator, which handles that logic for you.

The solution you want is:

keys = [obj['Key']
        for page in client.get_paginator('list_objects_v2').paginate(Bucket='MyBucket')
        for obj in page.get('Contents', [])]
+5 votes

Alternatively, you may want to use boto3.client:


>>> import boto3 
>>> client = boto3.client('s3')
>>> client.list_objects(Bucket='MyBucket')

list_objects also supports other arguments that you might need to iterate through the results: Bucket, Delimiter, EncodingType, Marker, MaxKeys, Prefix
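If you do page through results manually with the v1 API, Marker is the argument that continues from where the previous page left off. A minimal sketch of that loop (the function name and the fact that you pass in a client are my own choices, not part of the boto3 API):

```python
def list_all_keys(client, bucket):
    """Collect every key in a bucket by following the v1 Marker protocol."""
    keys = []
    kwargs = {'Bucket': bucket}
    while True:
        response = client.list_objects(**kwargs)
        contents = response.get('Contents', [])
        keys.extend(obj['Key'] for obj in contents)
        if not response.get('IsTruncated'):
            return keys
        # NextMarker is only returned when a Delimiter was set; otherwise
        # the last key of the current page serves as the next Marker.
        kwargs['Marker'] = response.get('NextMarker', contents[-1]['Key'])
```

Something like `list_all_keys(boto3.client('s3'), 'MyBucket')` would then return every key, not just the first 1,000.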

+5 votes

If you have the session, create a client and read the CommonPrefixes from the client's list_objects response:

client = session.client('s3')  # optionally pass region_name='eu-west-1'

result = client.list_objects(Bucket='MyBucket', Delimiter='/')
for obj in result.get('CommonPrefixes', []):
    print(obj.get('Prefix'))  # handle each prefix

There could be a lot of folders, and you might want to start in a subfolder, though. Something like this could handle that:

def folders(client, bucket, prefix=''):
    paginator = client.get_paginator('list_objects')
    for result in paginator.paginate(Bucket=bucket, Prefix=prefix, Delimiter='/'):
        for prefix in result.get('CommonPrefixes', []):
            yield prefix.get('Prefix')

gen_folders = folders(client, 'MyBucket')

gen_subfolders = folders(client, 'MyBucket', prefix='MySubFolder/')
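Since folders can be nested, the generator above only returns one level of prefixes. A recursive variant (a sketch; the name walk_folders and the depth-first order are my own choices) walks the whole tree:

```python
def walk_folders(client, bucket, prefix=''):
    """Yield every 'folder' prefix under `prefix`, depth-first."""
    paginator = client.get_paginator('list_objects')
    for result in paginator.paginate(Bucket=bucket, Prefix=prefix, Delimiter='/'):
        for common in result.get('CommonPrefixes', []):
            sub = common.get('Prefix')
            yield sub
            # descend into the sub-prefix we just found
            yield from walk_folders(client, bucket, prefix=sub)
```

Iterating `walk_folders(client, 'MyBucket')` then visits every nested prefix, e.g. 'a/', 'a/b/', 'a/b/c/'.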
+1 vote

The best way to get the list of ALL objects with a specific prefix in an S3 bucket is to use list_objects_v2 along with ContinuationToken to overcome the 1,000-object pagination limit.

import boto3
s3 = boto3.client('s3')

s3_bucket = 'your-bucket'
s3_prefix = 'your/prefix'
partial_list = s3.list_objects_v2(Bucket=s3_bucket, Prefix=s3_prefix)
obj_list = partial_list['Contents']
while partial_list['IsTruncated']:
    next_token = partial_list['NextContinuationToken']
    partial_list = s3.list_objects_v2(Bucket=s3_bucket, Prefix=s3_prefix,
                                      ContinuationToken=next_token)
    obj_list.extend(partial_list['Contents'])
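The same loop is easy to wrap in a reusable helper that takes the client as an argument (a sketch; the function name and parameters are illustrative, not a boto3 API):

```python
def list_objects_with_prefix(client, bucket, prefix=''):
    """Return every object under `prefix`, following continuation tokens."""
    objects = []
    kwargs = {'Bucket': bucket, 'Prefix': prefix}
    while True:
        response = client.list_objects_v2(**kwargs)
        objects.extend(response.get('Contents', []))
        if not response.get('IsTruncated'):
            return objects
        # resume the listing from where the previous page stopped
        kwargs['ContinuationToken'] = response['NextContinuationToken']
```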
0 votes

I would have thought that you cannot have a slash in a bucket name. You say you want to list all directories within a bucket, but your code attempts to list all contents (not necessarily directories) of several buckets. Those buckets probably do not exist (because they have illegal names), so when you run

bucket = s3.Bucket(name)

the resulting bucket resource points at a non-existent bucket, and the subsequent listing will fail.
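One way to rule this out is to check each bucket up front with head_bucket, which raises an error for a missing or inaccessible bucket. A sketch (the helper name is my own; in practice the exception raised is botocore.exceptions.ClientError):

```python
def bucket_exists(client, name):
    """Return True if the bucket exists and is reachable."""
    try:
        client.head_bucket(Bucket=name)
        return True
    except Exception:  # botocore.exceptions.ClientError in practice
        return False
```

Filtering bucket_names through this check before calling bucket.objects.all() would make the failing name obvious.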

0 votes

If you have fewer than 1,000 objects in your folder, you can use the following code:

import boto3

s3 = boto3.client('s3')
object_listing = s3.list_objects_v2(Bucket='bucket_name',
                                    Prefix='your-folder/')
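Because a single list_objects_v2 call silently caps out at 1,000 keys, it is worth failing loudly if the response was truncated. A small sketch (the helper name is illustrative):

```python
def keys_under(client, bucket, prefix=''):
    """Return keys under `prefix`; raise if the single response was truncated."""
    response = client.list_objects_v2(Bucket=bucket, Prefix=prefix)
    if response.get('IsTruncated'):
        # more than 1,000 keys: this single-call approach is not enough
        raise RuntimeError('Listing truncated; use a paginator instead')
    return [obj['Key'] for obj in response.get('Contents', [])]
```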