
I’m following this tutorial to add a progress bar when uploading a file in Django, using AJAX.
When I upload the file to a folder using the upload_to option, everything works fine.
But when I upload the file to Azure using the storage option, it doesn’t work.
That is, when this is my model:

class UploadFile(models.Model):
    title = models.CharField(max_length=50)
    file = models.FileField(upload_to='files/media/pre')

It works perfectly, but when this is my model:

from myAzure import AzureMediaStorage as AMS
class UploadFile(models.Model):
    title = models.CharField(max_length=50)
    file = models.FileField(storage=AMS)

it gets stuck and does not progress.
(AMS is defined in myAzure.py as follows:)

from storages.backends.azure_storage import AzureStorage

class AzureMediaStorage(AzureStorage):
    account_name = '<myAccountName>'
    account_key = '<myAccountKey>'
    azure_container = 'media'
    expiration_secs = None

How can I make it work?

EDIT:
In case it was not clear:

  • My problem is not uploading to Azure, but showing the progress bar.
  • For security reasons I do not want to upload the file from the browser using CORS and a SAS token; I want to upload it from my backend.

2 Answers


  1. I can suggest trying the workaround of storing the file locally first and then uploading it to Azure.

    I’m not sure whether it will work, but you may give it a try and report back whether it helps:

    class UploadFile(models.Model):
        title = models.CharField(max_length=50)
        file = models.FileField(upload_to='files/media/pre', null=True, blank=False)
        remote_file = models.FileField(storage=AMS, null=True, blank=True, default=None)
    
        def save(self, *args, **kwargs):
            if self.file:
                self.remote_file = self.file
                # First save: in theory this should trigger the upload
                # of remote_file to the Azure storage backend
                super().save(*args, **kwargs)
                # Drop the local reference so only the Azure copy is kept
                self.file = None
            super().save(*args, **kwargs)
    
  2. When one uploads a file somewhere, there are generally two ways to track the progress of the upload: either one wraps the Python file object being read from, or the destination being uploaded to provides a monitoring callback.

    Since the Azure library doesn’t provide such a callback, one can write a wrapper for the file object or use an already existing one.

    There’s a library named tqdm, suggested by Alastair McCormack, that provides exactly such a wrapper (tqdm.wrapattr). As George John shows, one can do something like this:

    import os
    from tqdm import tqdm

    size = os.stat(fname).st_size
    with tqdm.wrapattr(open(fname, 'rb'), "read", total=size) as data:
        blob_client.upload_blob(data)  # blob_client: an azure.storage.blob BlobClient
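The wrapper approach can also be written by hand if you want full control over where progress is reported. A minimal sketch follows; the ProgressReader name and the callback(bytes_read, total) signature are my own inventions, not part of any library:

```python
class ProgressReader:
    """Wrap a binary file object and report cumulative bytes read.

    Any consumer that reads from this object (the Azure SDK included)
    triggers the callback as a side effect of each read() call.
    """

    def __init__(self, fileobj, total, callback):
        self._fileobj = fileobj
        self._total = total
        self._read = 0
        self._callback = callback

    def read(self, size=-1):
        chunk = self._fileobj.read(size)
        self._read += len(chunk)
        self._callback(self._read, self._total)
        return chunk

    def __getattr__(self, name):
        # Delegate seek/tell/close/etc. to the wrapped file object
        return getattr(self._fileobj, name)
```

You would then pass ProgressReader(open(fname, 'rb'), size, callback) to blob_client.upload_blob(...) just like the tqdm-wrapped object above.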
    
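One caveat for the question as asked: tqdm draws its bar on the server console, which the browser never sees. To drive the page’s AJAX progress bar during the backend-to-Azure leg, the same read-in-chunks idea can record progress in shared state that a polling endpoint returns as JSON. A framework-free sketch — progress_state, upload_with_progress and the send parameter are all hypothetical names, and a real deployment would use a cache or database rather than a module-level dict:

```python
progress_state = {}  # upload_id -> {"done": bytes_so_far, "total": total_bytes}

def upload_with_progress(fileobj, total, upload_id, send):
    """Read fileobj in chunks, forwarding each chunk to send()
    while recording cumulative progress in progress_state.

    send() stands in for whatever actually transmits the data,
    e.g. a generator feeding blob_client.upload_blob, which accepts
    any iterable of bytes.
    """
    done = 0
    while True:
        chunk = fileobj.read(64 * 1024)
        if not chunk:
            break
        done += len(chunk)
        progress_state[upload_id] = {"done": done, "total": total}
        send(chunk)
```

A separate AJAX view can then look up progress_state[upload_id] and return it as JSON for the frontend to render.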