Cloud Storage

gcloud.storage

Shortcut methods for getting set up with Google Cloud Storage.

You’ll typically use these to get started with the API:

>>> import gcloud.storage
>>> bucket = gcloud.storage.get_bucket('bucket-id-here', 'project-name',
                                       'long-email@googleapis.com',
                                       '/path/to/private.key')
>>> # Then do other things...
>>> key = bucket.get_key('/remote/path/to/file.txt')
>>> print key.get_contents_as_string()
>>> key.set_contents_from_string('New contents!')
>>> bucket.upload_file('/local/path.txt', '/remote/path/storage.txt')

The main concepts with this API are:

  • gcloud.storage.connection.Connection, which represents a connection between your machine and the Cloud Storage API.
  • gcloud.storage.bucket.Bucket, which represents a particular bucket (akin to a disk on a file system).
  • gcloud.storage.key.Key, which represents a pointer to a particular entity in Cloud Storage (akin to a file).
  • gcloud.storage.acl.ACL, which represents the access controls on a bucket or key.

gcloud.storage.__init__.get_bucket(bucket_name, project_name, client_email, private_key_path)[source]

Shortcut method to establish a connection to a particular bucket.

You’ll generally use this as the first call to working with the API:

>>> from gcloud import storage
>>> bucket = storage.get_bucket(bucket_name, project_name, email, key_path)
>>> # Now you can do things with the bucket.
>>> bucket.exists('/path/to/file.txt')
False
Parameters:
  • bucket_name (string) – The id of the bucket you want to use. This is akin to a disk name on a file system.
  • project_name (string) – The name of the project to connect to.
  • client_email (string) – The e-mail attached to the service account.
  • private_key_path (string) – The path to a private key file (this file was given to you when you created the service account).
Return type:

gcloud.storage.bucket.Bucket

Returns:

A bucket with a connection using the provided credentials.

gcloud.storage.__init__.get_connection(project_name, client_email, private_key_path)[source]

Shortcut method to establish a connection to Cloud Storage.

Use this if you are going to access several buckets with the same set of credentials:

>>> from gcloud import storage
>>> connection = storage.get_connection(project_name, email, key_path)
>>> bucket1 = connection.get_bucket('bucket1')
>>> bucket2 = connection.get_bucket('bucket2')
Parameters:
  • project_name (string) – The name of the project to connect to.
  • client_email (string) – The e-mail attached to the service account.
  • private_key_path (string) – The path to a private key file (this file was given to you when you created the service account).
Return type:

gcloud.storage.connection.Connection

Returns:

A connection defined with the proper credentials.

Connections

class gcloud.storage.connection.Connection(project_name, *args, **kwargs)[source]

Bases: gcloud.connection.Connection

A connection to Google Cloud Storage via the JSON REST API.

This class should understand only the basic types (and protobufs) in method arguments; however, it should be capable of returning advanced types.

See gcloud.connection.Connection for a full list of parameters. Connection differs only in needing a project name (which you specify when creating a project in the Cloud Console).

A typical use of this is to operate on gcloud.storage.bucket.Bucket objects:

>>> from gcloud import storage
>>> connection = storage.get_connection(project_name, email, key_path)
>>> bucket = connection.create_bucket('my-bucket-name')

You can then delete this bucket:

>>> bucket.delete()
>>> # or
>>> connection.delete_bucket(bucket)

If you want to access an existing bucket:

>>> bucket = connection.get_bucket('my-bucket-name')

A Connection is actually iterable and will return the gcloud.storage.bucket.Bucket objects inside the project:

>>> for bucket in connection:
>>>   print bucket
<Bucket: my-bucket-name>

In the same way, you can check whether a bucket exists inside the project using Python’s in operator:

>>> print 'my-bucket-name' in connection
True
Parameters:project_name (string) – The project name to connect to.
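The iteration and membership behavior described above maps onto Python’s __iter__ and __contains__ protocols. A minimal stand-in (FakeConnection is hypothetical, for illustration only; it is not the real class) sketches the mechanism:

```python
class FakeConnection:
    # A stand-in sketching the assumed mechanism: __iter__ yields
    # the buckets in the project, and __contains__ backs `in` tests.
    def __init__(self, bucket_names):
        self._names = list(bucket_names)

    def __iter__(self):
        for name in self._names:
            yield '<Bucket: %s>' % name

    def __contains__(self, bucket_name):
        return bucket_name in self._names

conn = FakeConnection(['my-bucket-name'])
```

With this in place, `for bucket in conn` and `'my-bucket-name' in conn` behave as the examples above show.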
API_ACCESS_ENDPOINT = 'https://storage.googleapis.com'
API_URL_TEMPLATE = '{api_base_url}/storage/{api_version}{path}'

A template used to craft the URL pointing toward a particular API call.

API_VERSION = 'v1beta2'

The version of the API, used in building the API call’s URL.

api_request(method, path=None, query_params=None, data=None, content_type=None, api_base_url=None, api_version=None, expect_json=True)[source]

Make a request over the HTTP transport to the Cloud Storage API.

You shouldn’t usually need this method, but if you plan to interact with the API using these low-level primitives, this is the correct one to use.

Parameters:
  • method (string) – The HTTP method name (ie, GET, POST, etc).
  • path (string) – The path to the resource (ie, '/b/bucket-name').
  • query_params (dict) – A dictionary of keys and values to insert into the query string of the URL.
  • data (string) – The data to send as the body of the request.
  • content_type (string) – The proper MIME type of the data provided.
  • api_base_url (string) – The base URL for the API endpoint. Typically you won’t have to provide this.
  • api_version (string) – The version of the API to call. Typically you shouldn’t provide this and instead use the default for the library.
  • expect_json (bool) – If True, this method will try to parse the response as JSON and raise an exception if that cannot be done.
Raises:

Exception if the response code is not 200 OK.

build_api_url(path, query_params=None, api_base_url=None, api_version=None)[source]

Construct an API url given a few components, some optional.

Typically, you shouldn’t need to use this method.

Parameters:
  • path (string) – The path to the resource (ie, '/b/bucket-name').
  • query_params (dict) – A dictionary of keys and values to insert into the query string of the URL.
  • api_base_url (string) – The base URL for the API endpoint. Typically you won’t have to provide this.
  • api_version (string) – The version of the API to call. Typically you shouldn’t provide this and instead use the default for the library.
Return type:

string

Returns:

The URL assembled from the pieces provided.
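Given the API_URL_TEMPLATE and API_VERSION attributes shown above, the URL assembly likely amounts to a str.format call plus an urlencoded query string. A Python 3 sketch (an assumption about the mechanism, not the library’s actual implementation):

```python
from urllib.parse import urlencode

API_ACCESS_ENDPOINT = 'https://storage.googleapis.com'
API_URL_TEMPLATE = '{api_base_url}/storage/{api_version}{path}'
API_VERSION = 'v1beta2'

def build_api_url(path, query_params=None,
                  api_base_url=API_ACCESS_ENDPOINT,
                  api_version=API_VERSION):
    # Fill in the template, then append an urlencoded query string.
    url = API_URL_TEMPLATE.format(api_base_url=api_base_url,
                                  api_version=api_version, path=path)
    if query_params:
        url += '?' + urlencode(query_params)
    return url
```

For example, `build_api_url('/b/bucket-name', {'projection': 'full'})` yields `https://storage.googleapis.com/storage/v1beta2/b/bucket-name?projection=full`.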

create_bucket(bucket, *args, **kwargs)[source]

Create a new bucket.

For example:

>>> from gcloud import storage
>>> connection = storage.get_connection(project_name, email, key_path)
>>> bucket = connection.create_bucket('my-bucket')
>>> print bucket
<Bucket: my-bucket>
Parameters:bucket (string or gcloud.storage.bucket.Bucket) – The bucket name (or bucket object) to create.
Return type:gcloud.storage.bucket.Bucket
Returns:The newly created bucket.
delete_bucket(bucket, *args, **kwargs)[source]

Delete a bucket.

You can use this method to delete a bucket by name, or to delete a bucket object:

>>> from gcloud import storage
>>> connection = storage.get_connection(project_name, email, key_path)
>>> connection.delete_bucket('my-bucket')
True

You can also pass in the bucket object itself:

>>> bucket = connection.get_bucket('other-bucket')
>>> connection.delete_bucket(bucket)
True

If the bucket doesn’t exist, this will raise a gcloud.storage.exceptions.NotFoundError:

>>> from gcloud.storage import exceptions
>>> try:
>>>   connection.delete_bucket('my-bucket')
>>> except exceptions.NotFoundError:
>>>   print 'That bucket does not exist!'
Parameters:bucket (string or gcloud.storage.bucket.Bucket) – The bucket name (or bucket object) to delete.
Return type:bool
Returns:True if the bucket was deleted.
Raises:gcloud.storage.exceptions.NotFoundError
generate_signed_url(resource, expiration, method='GET', content_md5=None, content_type=None)[source]

Generate a signed URL to provide query-string authentication to a resource.

Parameters:
  • resource (string) – A pointer to a specific resource (typically, /bucket-name/path/to/key.txt).
  • expiration (int, long, datetime.datetime, datetime.timedelta) – When the signed URL should expire.
  • method (string) – The HTTP verb that will be used when requesting the URL.
  • content_md5 (string) – The MD5 hash of the object referenced by resource.
  • content_type (string) – The content type of the object referenced by resource.
Return type:

string

Returns:

A signed URL you can use to access the resource until expiration.

get_all_buckets(*args, **kwargs)[source]

Get all buckets in the project.

This will not populate the list of keys available in each bucket.

You can also iterate over the connection object, so these two operations are identical:

>>> from gcloud import storage
>>> connection = storage.get_connection(project_name, email, key_path)
>>> for bucket in connection.get_all_buckets():
>>>   print bucket
>>> # ... is the same as ...
>>> for bucket in connection:
>>>   print bucket
Return type:list of gcloud.storage.bucket.Bucket objects.
Returns:All buckets belonging to this project.
get_bucket(bucket_name, *args, **kwargs)[source]

Get a bucket by name.

If the bucket isn’t found, this will raise a gcloud.storage.exceptions.NotFoundError. If you would rather get None back when the bucket isn’t found (like {}.get('...')), use Connection.lookup().

For example:

>>> from gcloud import storage
>>> from gcloud.storage import exceptions
>>> connection = storage.get_connection(project_name, email, key_path)
>>> try:
>>>   bucket = connection.get_bucket('my-bucket')
>>> except exceptions.NotFoundError:
>>>   print 'Sorry, that bucket does not exist!'
Parameters:bucket_name (string) – The name of the bucket to get.
Return type:gcloud.storage.bucket.Bucket
Returns:The bucket matching the name provided.
Raises:gcloud.storage.exceptions.NotFoundError
lookup(bucket_name)[source]

Get a bucket by name, returning None if not found.

You can use this if you would rather check for a None value than catch an exception:

>>> from gcloud import storage
>>> connection = storage.get_connection(project_name, email, key_path)
>>> bucket = connection.lookup('doesnt-exist')
>>> print bucket
None
>>> bucket = connection.lookup('my-bucket')
>>> print bucket
<Bucket: my-bucket>
Parameters:bucket_name (string) – The name of the bucket to get.
Return type:gcloud.storage.bucket.Bucket
Returns:The bucket matching the name provided or None if not found.
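The relationship between get_bucket() and lookup() can be sketched with a stand-in class (FakeConnection is hypothetical, for illustration only):

```python
class NotFoundError(Exception):
    pass

class FakeConnection:
    # A stand-in, not the real Connection, illustrating why
    # lookup() exists alongside get_bucket().
    def __init__(self, bucket_names):
        self._names = set(bucket_names)

    def get_bucket(self, bucket_name):
        # Raises, like the real method, when the bucket is missing.
        if bucket_name not in self._names:
            raise NotFoundError(bucket_name)
        return bucket_name

    def lookup(self, bucket_name):
        # dict.get()-style: returns None instead of raising.
        try:
            return self.get_bucket(bucket_name)
        except NotFoundError:
            return None

conn = FakeConnection(['my-bucket'])
```

Here `conn.lookup('doesnt-exist')` returns None where `conn.get_bucket('doesnt-exist')` raises.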
make_request(method, url, data=None, content_type=None, headers=None)[source]

A low level method to send a request to the API.

Typically, you shouldn’t need to use this method.

Parameters:
  • method (string) – The HTTP method to use in the request.
  • url (string) – The URL to send the request to.
  • data (string) – The data to send as the body of the request.
  • content_type (string) – The proper MIME type of the data provided.
  • headers (dict) – A dictionary of HTTP headers to send with the request.
Return type:

tuple of response (a dictionary of sorts) and content (a string).

Returns:

The HTTP response object and the content of the response.

new_bucket(bucket)[source]

Factory method for creating a new (unsaved) bucket object.

This method is really useful when you’re not sure whether you have an actual gcloud.storage.bucket.Bucket object or just a name of a bucket. It always returns the object:

>>> bucket = connection.new_bucket('bucket')
>>> print bucket
<Bucket: bucket>
>>> bucket = connection.new_bucket(bucket)
>>> print bucket
<Bucket: bucket>
Parameters:bucket (string or gcloud.storage.bucket.Bucket) – A name of a bucket or an existing Bucket object.
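The string-or-object dispatch that new_bucket() performs is a small idiom worth seeing in isolation. This sketch uses a minimal stand-in Bucket, not the real class:

```python
class Bucket:
    # Minimal stand-in for gcloud.storage.bucket.Bucket.
    def __init__(self, name):
        self.name = name

    def __repr__(self):
        return '<Bucket: %s>' % self.name

def new_bucket(bucket):
    # Accept either a bucket name or an existing Bucket object
    # and always hand back a Bucket, as described above.
    if isinstance(bucket, Bucket):
        return bucket
    return Bucket(bucket)
```

Passing an existing Bucket returns it unchanged; passing a string wraps it.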

Buckets

class gcloud.storage.bucket.Bucket(connection=None, name=None, metadata=None)[source]

Bases: object

A class representing a Bucket on Cloud Storage.

Parameters:
  • connection (gcloud.storage.connection.Connection) – The connection to use when sending requests.
  • name (string) – The name of the bucket.
  • metadata (dict) – All the other data provided by Cloud Storage.
clear_acl()[source]

Remove all ACL rules from the bucket.

Note that this won’t actually remove ALL the rules, but it will remove all the non-default rules. In short, you’ll still have access to a bucket that you created even after you clear ACL rules with this method.

For example, imagine that you granted access to this bucket to a bunch of coworkers:

>>> from gcloud import storage
>>> connection = storage.get_connection(project_name, email, private_key_path)
>>> bucket = connection.get_bucket(bucket_name)
>>> acl = bucket.get_acl()
>>> acl.user('coworker1@example.org').grant_read()
>>> acl.user('coworker2@example.org').grant_read()
>>> acl.save()

Now they work in another part of the company and you want to ‘start fresh’ on who has access:

>>> bucket.clear_acl()

At this point all the custom rules you created have been removed.

clear_default_object_acl()[source]

Remove the Default Object ACL from this bucket.

configure_website(main_page_suffix=None, not_found_page=None)[source]

Configure website-related metadata.

Note

This only works if your bucket name is a domain name (which requires verifying ownership of that domain with Google).

Check out the official documentation here: https://developers.google.com/storage/docs/website-configuration

If you want this bucket to host a website, just provide the name of an index page and a page to use when a key isn’t found:

>>> from gcloud import storage
>>> connection = storage.get_connection(project_name, email, private_key_path)
>>> bucket = connection.get_bucket(bucket_name)
>>> bucket.configure_website('index.html', '404.html')

You probably should also make the whole bucket public:

>>> bucket.make_public(recursive=True, future=True)

This says: “Make the bucket public, and all the stuff already in the bucket, and anything else I add to the bucket. Just make it all public.”

Parameters:
  • main_page_suffix (string) – The page to use as the main page of a directory. Typically something like index.html.
  • not_found_page (string) – The file to use when a page isn’t found.
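For reference, the metadata that configure_website() patches onto the bucket presumably looks like the following (field names taken from the Cloud Storage bucket resource; the helper name is illustrative, not library API):

```python
def website_config(main_page_suffix=None, not_found_page=None):
    # The 'website' sub-resource of the bucket metadata, assuming
    # the documented Cloud Storage bucket resource field names.
    return {'website': {'mainPageSuffix': main_page_suffix,
                        'notFoundPage': not_found_page}}

config = website_config('index.html', '404.html')
```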
copy_key()[source]
delete()[source]

Delete this bucket.

The bucket must be empty in order to delete it. If the bucket doesn’t exist, this will raise a gcloud.storage.exceptions.NotFoundError. If the bucket is not empty, this will raise an Exception.

Raises:gcloud.storage.exceptions.NotFoundError
delete_key(key)[source]

Deletes a key from the current bucket.

If the key isn’t found, this will raise a gcloud.storage.exceptions.NotFoundError.

For example:

>>> from gcloud import storage
>>> from gcloud.storage import exceptions
>>> connection = storage.get_connection(project_name, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> print bucket.get_all_keys()
[<Key: my-bucket, my-file.txt>]
>>> bucket.delete_key('my-file.txt')
>>> try:
...   bucket.delete_key('doesnt-exist')
... except exceptions.NotFoundError:
...   pass
Parameters:key (string or gcloud.storage.key.Key) – A key name or Key object to delete.
Return type:gcloud.storage.key.Key
Returns:The key that was just deleted.
Raises:gcloud.storage.exceptions.NotFoundError
delete_keys(keys)[source]
disable_website()[source]

Disable the website configuration for this bucket.

This is really just a shortcut for setting the website-related attributes to None.

classmethod from_dict(bucket_dict, connection=None)[source]

Construct a new bucket from a dictionary of data from Cloud Storage.

Parameters:bucket_dict (dict) – The dictionary of data to construct a bucket from.
Return type:Bucket
Returns:A bucket constructed from the data provided.
get_acl()[source]

Get ACL metadata as a gcloud.storage.acl.BucketACL object.

Return type:gcloud.storage.acl.BucketACL
Returns:An ACL object for the current bucket.
get_all_keys()[source]

List all the keys in this bucket.

This will not retrieve the data stored in the keys; it only retrieves metadata about them.

This is equivalent to:

keys = [key for key in bucket]
Return type:list of gcloud.storage.key.Key
Returns:A list of all the Key objects in this bucket.
get_default_object_acl()[source]

Get the current Default Object ACL rules.

If the appropriate metadata isn’t available locally, this method will reload it from Cloud Storage.

Return type:gcloud.storage.acl.DefaultObjectACL
Returns:A DefaultObjectACL object for this bucket.
get_key(key)[source]

Get a key object by name.

This will return None if the key doesn’t exist:

>>> from gcloud import storage
>>> connection = storage.get_connection(project_name, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> print bucket.get_key('/path/to/key.txt')
<Key: my-bucket, /path/to/key.txt>
>>> print bucket.get_key('/does-not-exist.txt')
None
Parameters:key (string or gcloud.storage.key.Key) – The name of the key to retrieve.
Return type:gcloud.storage.key.Key or None
Returns:The key object if it exists, otherwise None.
get_metadata(field=None, default=None)[source]

Get all metadata or a specific field.

If you request a field that isn’t available, and that field can be retrieved by refreshing data from Cloud Storage, this method will reload the data using Bucket.reload_metadata().

Parameters:
  • field (string) – (optional) A particular field to retrieve from metadata.
  • default (anything) – The value to return if the field provided wasn’t found.
Return type:

dict or anything

Returns:

All metadata or the value of the specific field.

has_metadata(field=None)[source]

Check if metadata is available locally.

Parameters:field (string) – (optional) the particular field to check for.
Return type:bool
Returns:Whether metadata is available locally.
make_public(recursive=False, future=False)[source]

Make a bucket public.

Parameters:
  • recursive (bool) – If True, this will make all keys inside the bucket public as well.
  • future (bool) – If True, this will make all objects created in the future public as well.
new_key(key)[source]

Given a path name (or a Key), return a gcloud.storage.key.Key object.

This is really useful when you’re not sure if you have a Key object or a string path name. Given either of those types, this returns the corresponding Key object.

Parameters:key (string or gcloud.storage.key.Key) – A path name or actual key object.
Return type:gcloud.storage.key.Key
Returns:A Key object with the path provided.
patch_metadata(metadata)[source]

Update particular fields of this bucket’s metadata.

This method will only update the fields provided and will not touch the other fields.

It will also reload the metadata locally based on the server’s response.

Parameters:metadata (dict) – The dictionary of values to update.
Return type:Bucket
Returns:The current bucket.
path[source]

The URL path to this bucket.

reload_acl()[source]

Reload the ACL data from Cloud Storage.

Return type:Bucket
Returns:The current bucket.
reload_default_object_acl()[source]

Reload the Default Object ACL rules for this bucket.

Return type:Bucket
Returns:The current bucket.
reload_metadata(full=False)[source]

Reload metadata from Cloud Storage.

Parameters:full (bool) – If True, loads all data (including ACL data).
Return type:Bucket
Returns:The bucket you just reloaded data for.
save_acl(acl=None)[source]

Save the ACL data for this bucket.

If called without arguments, this will save the ACL currently stored on the Bucket object. For example, this will save the ACL stored in some_other_acl:

>>> bucket.acl = some_other_acl
>>> bucket.save_acl()

You can also provide a specific ACL to save instead of the one currently set on the Bucket object:

>>> bucket.save_acl(acl=my_other_acl)

You can use this to set access controls to be consistent from one bucket to another:

>>> bucket1 = connection.get_bucket(bucket1_name)
>>> bucket2 = connection.get_bucket(bucket2_name)
>>> bucket2.save_acl(bucket1.get_acl())

If you want to clear the ACL for the bucket, you must save an empty list ([]) rather than using None (which is interpreted as wanting to save the current ACL):

>>> bucket.save_acl(None)  # Saves the current ACL (self.acl).
>>> bucket.save_acl([])  # Clears the current ACL.
Parameters:acl (gcloud.storage.acl.ACL) – The ACL object to save. If left blank, this will save the ACL set locally on the bucket.
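The None-versus-empty-list distinction can be captured in one line; acl_to_save is a hypothetical helper showing the rule, not library code:

```python
def acl_to_save(current_acl, acl=None):
    # None means "save the ACL already set on the bucket";
    # an explicit empty list clears every rule.
    return current_acl if acl is None else acl
```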
save_default_object_acl(acl=None)[source]

Save the Default Object ACL rules for this bucket.

Parameters:acl (gcloud.storage.acl.DefaultObjectACL) – The DefaultObjectACL object to save. If not provided, this will look at the default_object_acl property and save that.
upload_file(filename, key=None)[source]

Shortcut method to upload a file into this bucket.

Use this method to quickly put a local file in Cloud Storage.

For example:

>>> from gcloud import storage
>>> connection = storage.get_connection(project_name, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> bucket.upload_file('~/my-file.txt', 'remote-text-file.txt')
>>> print bucket.get_all_keys()
[<Key: my-bucket, remote-text-file.txt>]

If you don’t provide a key value, we will try to upload the file using the local filename as the key (not the complete path):

>>> from gcloud import storage
>>> connection = storage.get_connection(project_name, email, key_path)
>>> bucket = connection.get_bucket('my-bucket')
>>> bucket.upload_file('~/my-file.txt')
>>> print bucket.get_all_keys()
[<Key: my-bucket, my-file.txt>]
Parameters:
  • filename (string) – Local path to the file you want to upload.
  • key (string or gcloud.storage.key.Key) –

    The key (either an object or a remote path) of where to put the file.

    If this is blank, we will try to upload the file to the root of the bucket with the same name as on your local file system.
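The fallback-to-basename behavior described above can be sketched as follows (key_for_upload is an illustrative helper, not part of the library):

```python
import os

def key_for_upload(filename, key=None):
    # With no key given, fall back to the file's basename
    # (never the full local path).
    return key or os.path.basename(filename)
```

So `key_for_upload('~/my-file.txt')` yields `'my-file.txt'`, while an explicit key wins.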

Keys

class gcloud.storage.key.Key(bucket=None, name=None, metadata=None)[source]

Bases: object

A wrapper around Cloud Storage’s concept of an Object.

Parameters:
  • bucket (gcloud.storage.bucket.Bucket) – The bucket to which this key belongs.
  • name (string) – The name of the key. This corresponds to the unique path of the object in the bucket.
  • metadata (dict) – All the other data provided by Cloud Storage.
CHUNK_SIZE = 1048576

The size of a chunk of data whenever iterating (1 MB).

This must be a multiple of 256 KB per the API specification.

clear_acl()[source]

Remove all ACL rules from the key.

Note that this won’t actually remove ALL the rules, but it will remove all the non-default rules. In short, you’ll still have access to a key that you created even after you clear ACL rules with this method.

connection[source]

Getter property for the connection to use with this Key.

Return type:gcloud.storage.connection.Connection or None
Returns:The connection to use, or None if no connection is set.
delete()[source]

Deletes a key from Cloud Storage.

Return type:Key
Returns:The key that was just deleted.
exists()[source]

Determines whether or not this key exists.

Return type:bool
Returns:True if the key exists in Cloud Storage.
classmethod from_dict(key_dict, bucket=None)[source]

Instantiate a Key from data returned by the JSON API.

Parameters:
  • key_dict (dict) – A dictionary of data returned from getting a Cloud Storage object.
  • bucket (gcloud.storage.bucket.Bucket) – The bucket to which this key belongs (and by proxy, which connection to use).
Return type:

Key

Returns:

A key based on the data provided.

generate_signed_url(expiration, method='GET')[source]

Generates a signed URL for this key.

If you have a key that you want to allow access to for a set amount of time, you can use this method to generate a URL that is only valid within a certain time period.

This is particularly useful if you don’t want publicly accessible keys, but don’t want to require users to explicitly log in.

Parameters:
  • expiration (int, long, datetime.datetime, datetime.timedelta) – When the signed URL should expire.
  • method (string) – The HTTP verb that will be used when requesting the URL.
Return type:

string

Returns:

A signed URL you can use to access the resource until expiration.

get_acl()[source]

Get ACL metadata as a gcloud.storage.acl.ObjectACL object.

Return type:gcloud.storage.acl.ObjectACL
Returns:An ACL object for the current key.
get_contents_as_string()[source]

Gets the data stored on this Key as a string.

Return type:string
Returns:The data stored in this key.
Raises:gcloud.storage.exceptions.NotFoundError
get_contents_to_file(fh)[source]

Write the contents of this key to a file-like object.

Parameters:fh (file) – A file handle to which to write the key’s data.
Raises:gcloud.storage.exceptions.NotFoundError
get_contents_to_filename(filename)[source]

Write the contents of this key to a file at the given path.

Parameters:filename (string) – A filename to be passed to open.
Raises:gcloud.storage.exceptions.NotFoundError
get_metadata(field=None, default=None)[source]

Get all metadata or a specific field.

If you request a field that isn’t available, and that field can be retrieved by refreshing data from Cloud Storage, this method will reload the data using Key.reload_metadata().

Parameters:
  • field (string) – (optional) A particular field to retrieve from metadata.
  • default (anything) – The value to return if the field provided wasn’t found.
Return type:

dict or anything

Returns:

All metadata or the value of the specific field.

has_metadata(field=None)[source]

Check if metadata is available locally.

Parameters:field (string) – (optional) the particular field to check for.
Return type:bool
Returns:Whether metadata is available locally.
make_public()[source]

Make this key public by granting all users read access.

Return type:Key
Returns:The current key.
patch_metadata(metadata)[source]

Update particular fields of this key’s metadata.

This method will only update the fields provided and will not touch the other fields.

It will also reload the metadata locally based on the server’s response.

Parameters:metadata (dict) – The dictionary of values to update.
Return type:Key
Returns:The current key.
path[source]

Getter property for the URL path to this Key.

Return type:string
Returns:The URL path to this Key.
public_url[source]
reload_acl()[source]

Reload the ACL data from Cloud Storage.

Return type:Key
Returns:The current key.
reload_metadata(full=False)[source]

Reload metadata from Cloud Storage.

Parameters:full (bool) – If True, loads all data (including ACL data).
Return type:Key
Returns:The key you just reloaded data for.
save_acl(acl=None)[source]

Save the ACL data for this key.

Parameters:acl (gcloud.storage.acl.ACL) – The ACL object to save. If left blank, this will save the ACL set locally on the key.
set_contents_from_file(fh, rewind=False, size=None, content_type=None)[source]

Set the contents of this key to the contents of a file handle.

Parameters:
  • fh (file) – A file handle open for reading.
  • rewind (bool) – If True, seek to the beginning of the file handle before writing the file to Cloud Storage.
  • size (int) – The number of bytes to read from the file handle. If not provided, we’ll try to guess the size using os.fstat().
set_contents_from_filename(filename)[source]

Open a path and set this key’s contents to the content of that file.

Parameters:filename (string) – The path to the file.
set_contents_from_string(data, content_type='text/plain')[source]

Sets the contents of this key to the provided string.

You can use this method to quickly set the value of a key:

>>> from gcloud import storage
>>> connection = storage.get_connection(project_name, email, key_path)
>>> bucket = connection.get_bucket(bucket_name)
>>> key = bucket.new_key('my_text_file.txt')
>>> key.set_contents_from_string('This is the contents of my file!')

Under the hood this is using a string buffer and calling gcloud.storage.key.Key.set_contents_from_file().

Parameters:data (string) – The data to store in this key.
Return type:Key
Returns:The updated Key object.

Access Control

This module makes it simple to interact with the access control lists that Cloud Storage provides.

gcloud.storage.bucket.Bucket has a getter method that creates an ACL object under the hood; you can interact with that object using gcloud.storage.bucket.Bucket.get_acl():

>>> from gcloud import storage
>>> connection = storage.get_connection(project_name, email, key_path)
>>> bucket = connection.get_bucket(bucket_name)
>>> acl = bucket.get_acl()

Adding and removing permissions can be done with the following methods (in increasing order of granularity):

  • ACL.all() corresponds to access for all users.
  • ACL.all_authenticated() corresponds to access for all users that are signed into a Google account.
  • ACL.domain() corresponds to access on a per Google Apps domain (ie, example.com).
  • ACL.group() corresponds to access on a per group basis (either by ID or e-mail address).
  • ACL.user() corresponds to access on a per user basis (either by ID or e-mail address).

And you are able to grant and revoke the following roles:

  • READER – via Entity.grant_read() and Entity.revoke_read()
  • WRITER – via Entity.grant_write() and Entity.revoke_write()
  • OWNER – via Entity.grant_owner() and Entity.revoke_owner()

You can use any of these like any other factory method (these happen to be ACL.Entity factories):

>>> acl.user('me@example.org').grant_read()
>>> acl.all_authenticated().grant_write()

You can also chain these grant_* and revoke_* methods together for brevity:

>>> acl.all().grant_read().revoke_write()

After that, you can save any changes you make with the gcloud.storage.acl.ACL.save() method:

>>> acl.save()

You can alternatively save any existing gcloud.storage.acl.ACL object (whether it was created by a factory method or not) with the gcloud.storage.bucket.Bucket.save_acl() method:

>>> bucket.save_acl(acl)

The ACL class is iterable, yielding a dictionary for each unique entity/role pair:

>>> print list(acl)
[{'role': 'OWNER', 'entity': 'allUsers'}, ...]

This list of dictionaries can be used as the entity and role fields when sending metadata for ACLs to the API.
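The chaining behavior of the grant_* and revoke_* methods comes from each method returning the entity itself. A tiny stand-in (not the real ACL.Entity) demonstrates:

```python
class Entity:
    # Stand-in for ACL.Entity showing how grant/revoke return the
    # entity to allow chaining like acl.all().grant_read().revoke_write().
    def __init__(self, type, identifier=None):
        self.type = type
        self.identifier = identifier
        self.roles = set()

    def grant(self, role):
        self.roles.add(role)
        return self

    def revoke(self, role):
        self.roles.discard(role)
        return self

    def grant_read(self):
        return self.grant('READER')

    def revoke_write(self):
        return self.revoke('WRITER')

entity = Entity('allUsers').grant_read().revoke_write()
```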

class gcloud.storage.acl.ACL[source]

Bases: object

Container class representing a list of access controls.

class Entity(type, identifier=None)[source]

Bases: object

Class representing a set of roles for an entity.

This is a helper class that you likely won’t ever construct outside of the factory methods on the ACL object.

Parameters:
  • type (string) – The type of entity (ie, ‘group’ or ‘user’).
  • identifier (string) – The ID or e-mail of the entity. For the special entity types (like ‘allUsers’) this is optional.
get_roles()[source]

Get the list of roles permitted by this entity.

Return type:list of strings
Returns:The list of roles associated with this entity.
grant(role)[source]

Add a role to the entity.

Parameters:role (string) – The role to add to the entity.
Return type:ACL.Entity
Returns:The entity, so calls can be chained.
grant_owner()[source]

Grant owner access to the current entity.

grant_read()[source]

Grant read access to the current entity.

grant_write()[source]

Grant write access to the current entity.

revoke(role)[source]

Remove a role from the entity.

Parameters:role (string) – The role to remove from the entity.
Return type:ACL.Entity
Returns:The entity, so calls can be chained.
revoke_owner()[source]

Revoke owner access from the current entity.

revoke_read()[source]

Revoke read access from the current entity.

revoke_write()[source]

Revoke write access from the current entity.

class ACL.Role[source]

Bases: object

Enum style class for role-type constants.

Owner = 'OWNER'
Reader = 'READER'
Writer = 'WRITER'
ACL.add_entity(entity)[source]

Add an entity to the ACL.

Parameters:entity (ACL.Entity) – The entity to add to this ACL.
ACL.all()[source]

Factory method for an Entity representing all users.

Return type:ACL.Entity
Returns:An entity representing all users.
ACL.all_authenticated()[source]

Factory method for an Entity representing all authenticated users.

Return type:ACL.Entity
Returns:An entity representing all authenticated users.
ACL.domain(domain)[source]

Factory method for a domain Entity.

Parameters:domain (string) – The domain for this entity.
Return type:ACL.Entity
Returns:An entity corresponding to this domain.
ACL.entity(type, identifier=None)[source]

Factory method for creating an Entity.

If an entity with the same type and identifier already exists, this will return a reference to that entity. If not, it will create a new one and add it to the list of known entities for this ACL.

Parameters:
  • type (string) – The type of entity to create (ie, user, group, etc)
  • identifier (string) – The ID of the entity (if applicable). This can be either an ID or an e-mail address.
Return type:

ACL.Entity

Returns:

A new Entity or a reference to an existing identical entity.

ACL.entity_from_dict(entity_dict)[source]

Build an ACL.Entity object from a dictionary of data.

An entity is a mutable object that represents a list of roles belonging to either a user or group or the special types for all users and all authenticated users.

Parameters:entity_dict (dict) – Dictionary full of data from an ACL lookup.
Return type:ACL.Entity
Returns:An Entity constructed from the dictionary.
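A sketch of the parsing this method performs, under the assumption that an ACL entry from the JSON API looks like `{'entity': 'user-name@example.com', 'role': 'READER'}`, with the special names `allUsers` and `allAuthenticatedUsers` carrying no identifier. The returned dict stands in for an ACL.Entity:

```python
def entity_from_dict(entity_dict):
    """Hypothetical parser for one ACL entry from an API response."""
    entity = entity_dict['entity']
    role = entity_dict['role']
    if entity in ('allUsers', 'allAuthenticatedUsers'):
        # The special all-users types have no identifier.
        type, identifier = entity, None
    else:
        # Ordinary entries encode the type as a prefix, e.g.
        # 'user-name@example.com' or 'group-id'.
        type, identifier = entity.split('-', 1)
    return {'type': type, 'identifier': identifier, 'roles': set([role])}


entry = entity_from_dict({'entity': 'user-name@example.com',
                          'role': 'READER'})
print(entry['type'])  # user
```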
ACL.get_entities()[source]

Get a list of all Entity objects.

Return type:list of ACL.Entity objects
Returns:A list of all Entity objects.
ACL.get_entity(entity, default=None)[source]

Gets an entity object from the ACL.

Parameters:
  • entity (ACL.Entity or string) – The entity to look up in the ACL.
  • default (anything) – This value will be returned if the entity doesn’t exist.
Return type:

ACL.Entity

Returns:

The corresponding entity or the value provided to default.

ACL.group(identifier)[source]

Factory method for a group Entity.

Parameters:identifier (string) – An id or e-mail for this particular group.
Return type:ACL.Entity
Returns:An Entity corresponding to this group.
ACL.has_entity(entity)[source]

Returns whether or not this ACL has any entries for an entity.

Parameters:entity (ACL.Entity) – The entity to check for existence in this ACL.
Return type:bool
Returns:True if the entity exists in the ACL.
ACL.save()[source]

A method to be overridden by subclasses.

Raises:NotImplementedError
ACL.user(identifier)[source]

Factory method for a user Entity.

Parameters:identifier (string) – An id or e-mail for this particular user.
Return type:ACL.Entity
Returns:An Entity corresponding to this user.
class gcloud.storage.acl.BucketACL(bucket)[source]

Bases: gcloud.storage.acl.ACL

An ACL specifically for a bucket.

Parameters:bucket (gcloud.storage.bucket.Bucket) – The bucket to which this ACL relates.
save()[source]

Save this ACL for the current bucket.

class gcloud.storage.acl.DefaultObjectACL(bucket)[source]

Bases: gcloud.storage.acl.BucketACL

A subclass of BucketACL representing the default object ACL for a bucket.

Parameters:bucket (gcloud.storage.bucket.Bucket) – The bucket to which this ACL relates.
save()[source]

Save this ACL as the default object ACL for the current bucket.

class gcloud.storage.acl.ObjectACL(key)[source]

Bases: gcloud.storage.acl.ACL

An ACL specifically for a key.

Parameters:key (gcloud.storage.key.Key) – The key that this ACL corresponds to.
save()[source]

Save this ACL for the current key.
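The save() override pattern shared by these classes can be sketched as follows. The connection stub and API paths are illustrative assumptions, not the library's real Connection class: the base ACL leaves save() abstract, and each subclass writes to its own resource.

```python
class FakeConnection(object):
    """Stand-in for a connection that records the requests it is given."""

    def __init__(self):
        self.requests = []

    def api_request(self, method, path):
        self.requests.append((method, path))


class ACL(object):
    def save(self):
        # The base class cannot know where to save; subclasses override.
        raise NotImplementedError


class BucketACL(ACL):
    def __init__(self, bucket_name, connection):
        self.bucket_name = bucket_name
        self.connection = connection

    def save(self):
        # A bucket ACL saves to the bucket's own acl resource
        # (hypothetical path for this sketch).
        self.connection.api_request('PATCH', '/b/%s/acl' % self.bucket_name)


class DefaultObjectACL(BucketACL):
    def save(self):
        # The default object ACL targets a sibling resource instead.
        self.connection.api_request(
            'PATCH', '/b/%s/defaultObjectAcl' % self.bucket_name)


conn = FakeConnection()
BucketACL('my-bucket', conn).save()
DefaultObjectACL('my-bucket', conn).save()
print(conn.requests)
```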

Iterators

Iterators for paging through API responses.

These iterators simplify the process of paging through API responses where the response is a list of results with a nextPageToken.

To make an iterator work, override the get_items_from_response method so that, given a response containing a page of results, it parses those results into an iterable of the objects you want:

class MyIterator(Iterator):
  def get_items_from_response(self, response):
    items = response.get('items', [])
    for item in items:
      yield MyItemClass.from_dict(item, other_arg=True)

You then can use this to get all the results from a resource:

>>> iterator = MyIterator(...)
>>> list(iterator)  # Convert to a list (consumes all values).

Or you can walk your way through items and call off the search early if you find what you’re looking for (resulting in possibly fewer requests):

>>> for item in MyIterator(...):
>>>   print item.name
>>>   if not item.is_valid:
>>>     break
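The pagination loop described above can be sketched as a runnable example. Here the canned `pages` list stands in for successive API responses; a real Iterator would fetch each page over the connection instead:

```python
class Iterator(object):
    """Minimal sketch of a nextPageToken-driven iterator."""

    def __init__(self, pages):
        self._pages = pages  # canned API responses for this sketch
        self.page_number = 0
        self.next_page_token = None

    def get_items_from_response(self, response):
        # Subclasses parse a page of results into items.
        raise NotImplementedError

    def has_next_page(self):
        # There is always a first page; after that, only while the
        # previous response carried a nextPageToken.
        return self.page_number == 0 or self.next_page_token is not None

    def get_next_page_response(self):
        response = self._pages[self.page_number]
        self.page_number += 1
        self.next_page_token = response.get('nextPageToken')
        return response

    def __iter__(self):
        while self.has_next_page():
            response = self.get_next_page_response()
            for item in self.get_items_from_response(response):
                yield item


class NameIterator(Iterator):
    def get_items_from_response(self, response):
        for item in response.get('items', []):
            yield item['name']


pages = [{'items': [{'name': 'a'}, {'name': 'b'}], 'nextPageToken': 't1'},
         {'items': [{'name': 'c'}]}]
print(list(NameIterator(pages)))  # ['a', 'b', 'c']
```

Breaking out of the loop early simply stops requesting further pages, which is what makes the "call off the search early" idiom above cheaper.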
class gcloud.storage.iterator.BucketIterator(connection)[source]

Bases: gcloud.storage.iterator.Iterator

An iterator listing all buckets.

You shouldn’t have to use this directly, but instead should use the helper methods on gcloud.storage.connection.Connection objects.

Parameters:connection (gcloud.storage.connection.Connection) – The connection to use for querying the list of buckets.
get_items_from_response(response)[source]

Factory method which yields gcloud.storage.bucket.Bucket items from a response.

Parameters:response (dict) – The JSON API response for a page of buckets.
class gcloud.storage.iterator.Iterator(connection, path)[source]

Bases: object

A generic class for iterating through Cloud Storage list responses.

Parameters:
  • connection (gcloud.storage.connection.Connection) – The connection to use for making requests.
  • path (string) – The path of the resource to list.
get_items_from_response(response)[source]

Factory method called while iterating; it must be overridden by a subclass.

The override should accept the API response of a request for the next page of items and return a list (or other iterable) of items.

Typically this method will construct a Bucket or a Key from the page of results in the response.

Parameters:response (dict) – The response of asking for the next page of items.
Return type:iterable
Returns:Items that the iterator should yield.
get_next_page_response()[source]

Requests the next page from the path provided.

Return type:dict
Returns:The parsed JSON response of the next page’s contents.
get_query_params()[source]

Getter for query parameters for the next request.

Return type:dict or None
Returns:A dictionary of query parameters or None if there are none.
has_next_page()[source]

Determines whether or not this iterator has more pages.

Return type:bool
Returns:Whether the iterator has more pages or not.
reset()[source]

Resets the iterator to the beginning.

class gcloud.storage.iterator.KeyDataIterator(key)[source]

Bases: object

get_headers()[source]
get_next_chunk()[source]
get_url()[source]
has_more_data()[source]
reset()[source]
class gcloud.storage.iterator.KeyIterator(bucket)[source]

Bases: gcloud.storage.iterator.Iterator

An iterator listing keys.

You shouldn’t have to use this directly, but instead should use the helper methods on gcloud.storage.key.Key objects.

Parameters:bucket (gcloud.storage.bucket.Bucket) – The bucket from which to list keys.
get_items_from_response(response)[source]

Factory method which yields gcloud.storage.key.Key items from a response.

Parameters:response (dict) – The JSON API response for a page of keys.

Exceptions

exception gcloud.storage.exceptions.ConnectionError(response, content)[source]

Bases: gcloud.storage.exceptions.StorageError

exception gcloud.storage.exceptions.NotFoundError(response, content)[source]

Bases: gcloud.storage.exceptions.ConnectionError

exception gcloud.storage.exceptions.StorageDataError[source]

Bases: gcloud.storage.exceptions.StorageError

exception gcloud.storage.exceptions.StorageError[source]

Bases: exceptions.Exception
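The hierarchy above lets callers catch at whatever granularity they need: a NotFoundError is a ConnectionError is a StorageError, so a broad handler also catches the specific case. A minimal sketch, with an assumed (response, content) constructor matching the signatures shown:

```python
class StorageError(Exception):
    pass


class ConnectionError(StorageError):
    def __init__(self, response, content):
        # Keep the raw response and content for callers to inspect.
        self.response = response
        self.content = content
        super(ConnectionError, self).__init__(content)


class NotFoundError(ConnectionError):
    pass


class StorageDataError(StorageError):
    pass


try:
    raise NotFoundError({'status': 404}, 'bucket not found')
except StorageError as err:
    # The broad StorageError handler catches the specific 404 case too.
    print(err.content)  # bucket not found
```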