AWS BOTO3

Report Notebook: Uses and Working with AWS Services

AWS BOTO3 (BRIEF SUMMARY)

The AWS SDK for Python is composed of two key packages:

  1. Botocore (the library providing the low-level functionality shared between the Python SDK and the AWS CLI)
  2. Boto3 (the package implementing the Python SDK itself)

AWS BOTO3 USES

  1. We can do many things with the help of boto3.
  2. We can easily access and use large data from AWS S3 with the help of boto3.
  3. The only basic requirement for the user is a set of key credentials, for verification and security purposes.

  1. With the help of boto3 we can create, delete, and edit everything, given the right permissions.
  2. It is safe and useful for importing data into Python.

How to Use?

  1. Test that the code works in a Jupyter notebook.

How to import

  1. import boto3
  2. If boto3 is not installed, use: pip install boto3

How to List All the Buckets Available in AWS S3

Code from the screenshot above:

import boto3

aws_access_key = 'xxxxxxxxxxxxxxxxxxxxx'
aws_secret_key = 'aN+ieyQCw3zQ6nZxOboc7jxxxxxxxxxxx'

s3 = boto3.client("s3",
                  region_name='us-east-1',
                  aws_access_key_id=aws_access_key,
                  aws_secret_access_key=aws_secret_key)

bucket_response = s3.list_buckets()
buckets = bucket_response["Buckets"]
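The list_buckets call returns a plain dict whose "Buckets" key holds one entry per bucket, each with "Name" and "CreationDate" fields. To show how the names are pulled out without needing AWS credentials, the sketch below uses a hand-built response in that documented shape (the bucket names and dates here are illustrative, taken from this article's examples):

```python
from datetime import datetime

# Mocked stand-in for the dict that s3.list_buckets() returns;
# the real call needs valid credentials and network access.
bucket_response = {
    "Buckets": [
        {"Name": "boto3bucket852", "CreationDate": datetime(2021, 1, 1)},
        {"Name": "bucketforpawan", "CreationDate": datetime(2021, 2, 1)},
    ]
}

# Each entry is a dict; usually only the names are needed.
bucket_names = [b["Name"] for b in bucket_response["Buckets"]]
print(bucket_names)  # ['boto3bucket852', 'bucketforpawan']
```

The same list comprehension works unchanged on the real response.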

Bucket creation code:

bucket = s3.create_bucket(Bucket='boto3bucket852')
bucket_response = s3.list_buckets()
buckets = bucket_response["Buckets"]
# Bucket names which are available
print(list(buckets))
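create_bucket will reject names that break S3's naming rules (3 to 63 characters; lowercase letters, digits, dots, and hyphens; starting and ending with a letter or digit). A rough local pre-check can catch typos before the API call; the helper name and regex below are my own sketch, and AWS enforces further rules beyond these (for example, no IP-address-style names):

```python
import re

def looks_like_valid_bucket_name(name):
    """Rough pre-check of the main S3 bucket naming rules:
    3-63 characters, lowercase letters, digits, dots and hyphens,
    starting and ending with a letter or digit. Not exhaustive."""
    return re.fullmatch(r"[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]", name) is not None

print(looks_like_valid_bucket_name("boto3bucket852"))  # True
print(looks_like_valid_bucket_name("Boto3Bucket"))     # False (uppercase)
```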
How to Create a Python File to Upload a Local File to AWS S3

  1. Import boto3
  2. Create a resource with your credentials
  3. /home/pawanrai852/basic.zip is the local path
  4. 'dump/files/basic.zip' is the destination key (directory path) inside the bucket

Code from the screenshot above, to upload a local file to S3:

import boto3

aws_access_key = 'AKIAWRCQLR4XXXXXXXXXXXX'
aws_secret_key = 'aN+ieyQCw3zQ6nZxObxxxxxxxxxxxxxxxxxx'

# The legacy boto package (and its boto.s3.key.Key import) is not needed;
# boto3's resource API handles the upload on its own.
s3 = boto3.resource('s3',
                    region_name='us-east-1',
                    aws_access_key_id=aws_access_key,
                    aws_secret_access_key=aws_secret_key)

BUCKET = 'bucketforpawan'
s3.Bucket(BUCKET).upload_file('/home/pawanrai852/basic.zip', 'dump/files/basic.zip')
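upload_file takes the local path and the destination key. When uploading many files, the key can be derived from the local path and a fixed prefix; a small self-contained sketch of that derivation (the helper name is mine):

```python
import os
import posixpath

def s3_key_for(local_path, prefix):
    """Build an S3 object key by joining a bucket 'directory' prefix with the
    file's base name. S3 keys always use forward slashes, hence posixpath."""
    return posixpath.join(prefix, os.path.basename(local_path))

key = s3_key_for("/home/pawanrai852/basic.zip", "dump/files")
print(key)  # dump/files/basic.zip
```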

How to Create a Python File to Download S3 Bucket Items?

Code from the screenshot above, to download files from S3:

import boto3
import subprocess
import os

aws_access_key = 'AKIAWRCQLR43OBxxxxxxxxxxxxxx'
aws_secret_key = 'aN+ieyQCw3zQ6nXXXXXXXXXXXXXXXXXXXXXXXX'

# Names for the bucket and folders
my_bucket_name = "bucketforpawan"
bucket_folder_name = "dump/files"
local_folder_path = "/home/pawanrai852/"

# 1. Load the list of files existing in the bucket folder
FILES_NAMES = []
s3 = boto3.resource('s3',
                    region_name='us-east-1',
                    aws_access_key_id=aws_access_key,
                    aws_secret_access_key=aws_secret_key)
my_bucket = s3.Bucket(my_bucket_name)
for object_summary in my_bucket.objects.filter(Prefix="{}/".format(bucket_folder_name)):
    # print(object_summary.key)
    FILES_NAMES.append(object_summary.key)

# 2. List only the files that do not yet exist in the local folder (to not
# copy everything!). Compare base names, because object keys carry the
# "dump/files/" prefix while os.listdir() returns bare filenames.
local_files = set(os.listdir(local_folder_path))
new_filenames = [key for key in FILES_NAMES if os.path.basename(key) not in local_files]

# 3. Time to load the files into your destination folder. The key already
# includes the folder prefix, so it is used directly in the S3 URL.
for new_filename in new_filenames:
    download_cmd = "aws s3 cp s3://{}/{} {}".format(my_bucket_name, new_filename, local_folder_path)
    if subprocess.call([download_cmd], shell=True) != 0:
        print("ALERT: loading files not working correctly, please re-check new loaded files")
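The filtering step above has to reconcile two different name spaces: S3 object keys carry the folder prefix (e.g. 'dump/files/basic.zip'), while os.listdir returns bare filenames. A quick pure-Python check of that comparison logic, using hypothetical filenames so it runs without AWS access:

```python
import os

# Hypothetical stand-ins for my_bucket.objects.filter(...) results and the
# local directory listing.
s3_keys = ["dump/files/basic.zip", "dump/files/new.csv"]
local_files = ["basic.zip", "notes.txt"]

# Compare base names to decide which objects still need downloading.
new_keys = [k for k in s3_keys
            if os.path.basename(k) not in set(local_files)]
print(new_keys)  # ['dump/files/new.csv']
```

Comparing full keys against bare filenames directly would match nothing and re-download everything, which is why the basename comparison matters.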

Conclusion: Boto3 is very useful when it comes to importing large data into a program for processing, and with its help we can then save the processed data back into the S3 bucket.

Pawan Rai
DevOps Engineer at SprigHub