Last active: October 23, 2024 20:14
Getting a signed url on cloud run (minor extra auth step)
from datetime import datetime, timedelta, timezone

import google.auth
from google.auth.transport import requests
from google.cloud import storage
from rest_framework import status
from rest_framework.decorators import api_view
from rest_framework.response import Response


@api_view(['POST'])
def ingest(request):
    """This endpoint responds with a presigned URL for GCP upload."""
    # This was the key to running the same code locally and on Cloud Run,
    # without doing something sketchy like deploying the JSON credential
    # file. If you just leave this out, Cloud Run will use the default
    # credentials of the service account running the service, but you get
    # an error that it doesn't have a private key to make the signature.
    # You also have to make sure the service account has the signBlob
    # permission (under service accounts), and permission to create
    # objects in the bucket you are going to use.
    creds, _ = google.auth.default(
        # Without a scopes param, this worked on Cloud Run but not
        # locally: refreshing the creds below failed saying the scopes
        # were not specified, so I guess it could not determine the
        # scopes for the default user.
        scopes=['https://www.googleapis.com/auth/[SCOPE YOU NEED]'],
    )
    if creds.token is None:
        creds.refresh(requests.Request())
    storage_client = storage.Client()
    blob_name = 'uploads/myBlob'
    bucket = storage_client.bucket("myBucket")
    blob = bucket.blob(blob_name)
    expiration_time = datetime.now(timezone.utc) + timedelta(minutes=15)
    signed_url = blob.generate_signed_url(
        expiration=expiration_time,
        method='PUT',
        content_type='application/octet-stream',
        version='v4',
        service_account_email=creds.service_account_email,
        access_token=creds.token,
    )
    if not signed_url:
        return Response(
            'Unable to generate signed URL',
            status=status.HTTP_500_INTERNAL_SERVER_ERROR,
        )
    return Response({'signed_url': signed_url}, status=status.HTTP_200_OK)
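Once a client has the signed URL, it can PUT the file bytes directly to Cloud Storage with any HTTP client. A minimal sketch using only the standard library — the URL is a placeholder and `build_put_request` / `upload` are hypothetical helpers, not part of the gist:

```python
import urllib.request


def build_put_request(signed_url: str, data: bytes) -> urllib.request.Request:
    # The Content-Type header must match the content_type the URL was
    # signed for ('application/octet-stream' in the endpoint above),
    # otherwise Cloud Storage rejects the upload.
    return urllib.request.Request(
        signed_url,
        data=data,
        method="PUT",
        headers={"Content-Type": "application/octet-stream"},
    )


def upload(signed_url: str, data: bytes) -> int:
    # Performs the actual upload and returns the HTTP status code.
    with urllib.request.urlopen(build_put_request(signed_url, data)) as resp:
        return resp.status
```

No Google credentials are needed on the client side — the signature in the URL is the authorization.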
Using the Google storage client to create / read / update bucket contents is not a problem on Cloud Run with the default authentication set up (as long as the service account running the app has access to those resources). However, I had some trouble figuring out how to generate pre-signed URLs: I kept getting an error that the default credentials on Cloud Run don't have a private key to sign with. Passing the service account email and an access token to `generate_signed_url` is the incantation that got it working for me, both on localhost for dev and in the Cloud Run environment, without needing a flag to switch between the two.
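One more detail worth knowing when picking the 15-minute expiration used above: v4 signed URLs cap expiration at 7 days (604,800 seconds). A small hypothetical helper that validates this before computing the cutoff, rather than letting `generate_signed_url` fail later:

```python
from datetime import datetime, timedelta, timezone

# v4 signed URLs cannot be valid for more than 7 days.
MAX_V4_MINUTES = 7 * 24 * 60


def expiration(minutes: int = 15) -> datetime:
    # Hypothetical helper: fail fast on an out-of-range expiration
    # instead of discovering it at signing time.
    if not 0 < minutes <= MAX_V4_MINUTES:
        raise ValueError(
            "v4 signed URL expiration must be between 1 minute and 7 days"
        )
    return datetime.now(timezone.utc) + timedelta(minutes=minutes)
```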