Upload a file to Azure Blob Storage (ADLS Gen 2) using Python

Today we will learn how to upload a local file to Azure Blob Storage (ADLS Gen 2) using Python.
Steps:
  • Create a file to be uploaded to Azure Storage
  • Create a python script
    • Install Azure package from pip
      • pip install azure-storage-file-datalake
    • Import the azure module
      • import os, uuid, sys
      • from azure.storage.filedatalake import DataLakeServiceClient
      • from azure.core import MatchConditions
      • from azure.storage.filedatalake import ContentSettings
    • Create a connection to your Azure storage in the python script
      • service_client = DataLakeServiceClient(account_url="https://{}.dfs.core.windows.net".format("Storage_account_name"), credential="Storage_Account_Key")
    • Specify the container name
      • file_system_client = service_client.get_file_system_client(file_system="your_container_name")
    • Specify the directory in your Azure storage
      • directory_client = file_system_client.get_directory_client("my-directory")
    • Create an empty txt file in the Azure storage directory
      • file_client = directory_client.create_file("uploaded-file.txt")
    • Read the txt file from your local computer
    • Append the data to the file created in Azure storage (by calling the append_data() function), then commit the upload with flush_data()
      • local_file = open("C:\\Users\\xxxxxxx\\Desktop\\testFile.txt", 'r')
      • file_contents = local_file.read()
      • local_file.close()
      • file_client.append_data(data=file_contents, offset=0, length=len(file_contents))
      • file_client.flush_data(len(file_contents))
    • Go to your Azure blob storage and view the results
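The steps above can be combined into one minimal script. Note that the account name, account key, container, directory, and file names below are placeholders, not real values; substitute your own before running. The azure import is deferred into the upload function so the URL helper can be used even without the SDK installed.

```python
def account_url(account_name):
    """Build the ADLS Gen 2 endpoint URL for a storage account."""
    return "https://{}.dfs.core.windows.net".format(account_name)

def upload_file(account_name, account_key, local_path,
                container, directory, remote_name):
    """Upload a local file into a directory of an ADLS Gen 2 container."""
    # Imported here so the rest of the module works without the SDK installed;
    # requires: pip install azure-storage-file-datalake
    from azure.storage.filedatalake import DataLakeServiceClient

    # Connect to the storage account
    service_client = DataLakeServiceClient(
        account_url=account_url(account_name), credential=account_key)

    # Navigate container -> directory, then create the (empty) target file
    file_system_client = service_client.get_file_system_client(file_system=container)
    directory_client = file_system_client.get_directory_client(directory)
    file_client = directory_client.create_file(remote_name)

    # Read the local file; binary mode works for csv and txt alike
    with open(local_path, "rb") as local_file:
        file_contents = local_file.read()

    # Append the bytes at offset 0, then flush to commit the upload
    file_client.append_data(data=file_contents, offset=0, length=len(file_contents))
    file_client.flush_data(len(file_contents))

# Example call (placeholder values; succeeds only with valid credentials):
# upload_file("Storage_account_name", "Storage_Account_Key",
#             "testFile.txt", "your_container_name",
#             "my-directory", "uploaded-file.txt")
```

This is a sketch, not production code: for real workloads you would pull the key from configuration rather than passing it as a string, and prefer `DataLakeFileClient.upload_data()` for large files.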


To learn more about file upload operations, visit the Microsoft documentation site.
