There are several threads/blogs on how to use the django-storages library to upload files/images to different cloud platforms (in my case I'm only concerned about S3).
I followed those and configured it by adding the settings below in settings.py.
First, storages added to INSTALLED_APPS,
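i.e. something like this (rest of the list omitted):

INSTALLED_APPS = [
    ...
    "storages",
]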
then the AWS config:
AWS_ACCESS_KEY_ID =
AWS_SECRET_ACCESS_KEY =
AWS_STORAGE_BUCKET_NAME =
AWS_S3_REGION_NAME =
AWS_QUERYSTRING_AUTH = False
STORAGES = {
    "default": {
        "BACKEND": "path.to.CustomStoragBackendsClass",
    },
    "staticfiles": {
        "BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage",
    },
}
And here is the custom storage backend class:

from storages.backends.s3boto3 import S3Boto3Storage

class CustomStoragBackendsClass(S3Boto3Storage):
    default_acl = 'public-read'
Now, in my models, I'm trying to use this as below:
class MyModel(models.Model):
    ...
    main_icon = models.ImageField(
        null=True,
        storage=CustomStoragBackendsClass(location="my_model/main_icons"),
        upload_to=main_icon_upload_to_path,
    )
    ...
And here is the upload_to path function, in the same model file:
def main_icon_upload_to_path(instance, filename):
    return f"{instance.pk}/{filename}"
At first glance all of this looks fine, but for some reason it's not working as expected: I'm still getting a path without the instance id in it (it seems upload_to is not being used at all; only the storage location is applied).
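To illustrate (taking pk 42 and a file icon.png purely as an example), I expected a key like

my_model/main_icons/42/icon.png

but what I actually get is

my_model/main_icons/icon.png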
My goal is to have a single bucket for all images/files in the app. Within that bucket there will be a folder for each model from which files are uploaded; within each model directory, another folder for the field name, and then the id of that model instance, and so on. Here are a few examples to make it clearer:
path1 = S3Bucket_name/model1_name/model1_field1_name/instance_id/file_name.png
path2 = S3Bucket_name/model1_name/model1_field2_name/instance_id/file_name.pdf
path3 = S3Bucket_name/model2_name/model2_field1_name/instance_id/file_name.jpg
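If it helps, this is the kind of generic path builder I imagine ending up with. It is just a sketch on my side; build_upload_path and the use of instance._meta.model_name are my own assumptions, not code I have working:

from functools import partial

def build_upload_path(instance, filename, field_name):
    # hypothetical helper: builds "<model_name>/<field_name>/<pk>/<filename>"
    return f"{instance._meta.model_name}/{field_name}/{instance.pk}/{filename}"

# used on a field, e.g.:
# main_icon = models.ImageField(upload_to=partial(build_upload_path, field_name="main_icon"))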
I also tried providing the full path in the upload_to function itself, but that didn't work either. Not sure if I'm missing anything.
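For reference, that attempt looked roughly like this (an illustrative reconstruction, not the exact code):

def main_icon_upload_to_path(instance, filename):
    # full key, including the bucket-level folders (illustrative)
    return f"my_model/main_icons/{instance.pk}/{filename}"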