How to list buckets from Google Storage in Python?

I have the following code in a Python script:


import os

from google.oauth2 import service_account

SCOPES = ['https://www.googleapis.com/auth/sqlservice.admin']
SERVICE_ACCOUNT_FILE = 'JSON_AUTH_FILE'

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)

os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = SERVICE_ACCOUNT_FILE



This does the auth with Google.



Now I want to list all buckets:


from google.cloud import storage
storage_client = storage.Client()
buckets = list(storage_client.list_buckets())
print(buckets)



But this doesn't work. I get:


google.api_core.exceptions.Forbidden: 403 xxx@yyy.iam.gserviceaccount.com does not have storage.buckets.list access to project



The error also includes a link. When I click it, I see the following (which is weird, because the exception says 403 but the page shows 401):


{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "required",
        "message": "Anonymous caller does not have storage.buckets.list access to project NUMBER.",
        "locationType": "header",
        "location": "Authorization"
      }
    ],
    "code": 401,
    "message": "Anonymous caller does not have storage.buckets.list access to project NUMBER."
  }
}



What am I doing wrong?





Why do you have a sqlservice.admin scope if you're trying to access Cloud Storage?
– Alberto Garcia-Raboso
Jul 2 at 13:54







@AlbertoGarcia-Raboso I don't know. This is the first time I'm working with Google. This is what their guide showed in the example.
– jack
Jul 2 at 14:00





1 Answer



A couple of things to suggest in reference to this link:
https://cloud.google.com/storage/docs/reference/libraries#client-libraries-install-python



There are a couple of ways you can set roles for service accounts to access Google Storage:
https://cloud.google.com/iam/docs/understanding-roles#cloud_storage_roles



When you create your service account, select the Project Role: Storage -> Storage Admin. This role allows your service account to access and manipulate objects in Cloud Storage. Scoping the account to Storage Admin keeps its privileges limited to Cloud Storage, so it can't touch other services.



If you're having problems with authentication, you could try setting the role to Project -> Editor, which gives the service account edit access to most GCP services. Just be aware that if the service account is compromised, the holder will have access to most of your services in the GCP project.



By creating a custom role you can combine the permissions granted by a number of the default roles. A good way to do this is with "Create Role From Selection" in the IAM & Admin -> Roles section of the GCP Console.



For example, you could merge BigQuery Admin and Storage Object Admin into a single custom role by selecting the check-box for each role and creating your own custom role, which you can then assign to your service account in the IAM section of GCP.



Once you have a service account with the correct permissions, you should be able to set the GOOGLE_APPLICATION_CREDENTIALS environment variable and use the google library to access your storage buckets.





Once you have a service account with the correct role, try this modification to your code to test whether it can list all the buckets the account has access to:


import os

from google.cloud import storage

os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = "/home/user/Downloads/[FILE_NAME].json"

# Instantiate a client
storage_client = storage.Client()

# List all the buckets available
for bucket in storage_client.list_buckets():
    print(bucket)
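
As a side note, instead of relying on the environment variable you can hand the key file straight to the client. A minimal sketch, using the same placeholder path as above:

from google.cloud import storage

# Load the service account key directly instead of via the environment
storage_client = storage.Client.from_service_account_json(
    "/home/user/Downloads/[FILE_NAME].json")

for bucket in storage_client.list_buckets():
    print(bucket.name)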





How do I change it for an existing service account? I can't find this menu. There is only: edit / delete / create key, and edit only offers changing the name.
– jack
Jul 2 at 14:36






Say instead of listing buckets I want to upload a file. Would I still need to change the role? Because I actually need to upload files... listing objects is just a test.
– jack
Jul 2 at 14:39





I usually just create a new service account when I need to add new roles. You'll be able to upload files using Storage Object Admin because it has read and write permissions on the bucket.
– ScottMcC
Jul 2 at 14:49
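
For reference, an upload with the Python client looks roughly like this (the bucket name, object name and local path are placeholders):

from google.cloud import storage

storage_client = storage.Client()

# Placeholders: substitute your own bucket, object name and local file path
bucket = storage_client.bucket('my-bucket')
blob = bucket.blob('destination-name.txt')
blob.upload_from_filename('/path/to/local-file.txt')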





You can just create another account (you can create as many as you would like). If you're not using one that you've created you can just delete it.
– ScottMcC
Jul 2 at 15:02





The current role is BigQuery Editor. If I change it to Storage Object Admin, does that mean I won't have permissions over BigQuery?!
– jack
Jul 2 at 15:09






By clicking "Post Your Answer", you acknowledge that you have read our updated terms of service, privacy policy and cookie policy, and that your continued use of the website is subject to these policies.

Popular posts from this blog

PHP contact form sending but not receiving emails

Do graphics cards have individual ID by which single devices can be distinguished?

iOS Top Alignment constraint based on screen (superview) height