AWS + Python Boto3 testing

As of November 2019, I’m aware of at least three decent options for unit testing your Python app functions/methods that wrap boto3 calls handling the interaction with AWS. Unit testing your functions with boto3 calls, using the methods I’m about to describe, has its pros and cons:

pros:

  • You don’t need to spend money on testing (if the testing is done correctly)
  • You can test the weak spots of your codebase
  • You can get meaningful test coverage of your codebase
  • You can “add_client_error” to the Botocore Stubber response for negative testing (sketched in the Stubber section below)
  • Moto and Localstack can be run as standalone “mocked” AWS servers and can therefore be used with any programming language that has an AWS SDK (Java, C#, Ruby, Python, JavaScript, PHP, and Objective-C (iOS))

cons:

  • With Moto and Stubbers, if your test code is written incorrectly, you can end up hitting the real AWS service endpoints and, even worse, messing up the data in your production account (see the credentials safeguard sketch after this list)
  • With Moto and Stubbers, I encourage you to test single-action functions/methods, which might not always fit your codebase exactly (though this can actually be seen as a pro)
  • With Localstack, there is some overhead in spinning it up as a Docker-based service
  • Typically, not all AWS services are fully covered as of November 2019
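
One common safeguard against the first con, recommended in Moto’s own docs, is to force obviously fake AWS credentials before any test runs, so that a mis-mocked boto3 call fails to authenticate instead of touching a real account. A minimal pytest sketch, assuming a conftest.py at the root of your test suite:

import os

import pytest


@pytest.fixture(autouse=True)
def aws_credentials():
    # fake credentials for every test: if a boto3 call ever escapes its mock,
    # it fails to authenticate instead of reaching a real AWS account
    os.environ['AWS_ACCESS_KEY_ID'] = 'testing'
    os.environ['AWS_SECRET_ACCESS_KEY'] = 'testing'
    os.environ['AWS_SECURITY_TOKEN'] = 'testing'
    os.environ['AWS_SESSION_TOKEN'] = 'testing'
    os.environ['AWS_DEFAULT_REGION'] = 'us-east-1'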


Moto

Github repo link

Moto is a great project and you can see its service coverage grow on a regular basis. Currently supported services are listed here. Besides other cool features, Moto is also able to operate in a standalone server mode.
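
To give a flavour of that server mode, here is a minimal sketch; it assumes moto is installed with the server extras (pip install moto[server]) and that the server is running on its default port 5000, so the bucket name and dummy credentials are just placeholders:

# start the standalone server first, e.g.:
#   pip install moto[server]
#   moto_server s3 -p 5000
import boto3

# point boto3 at the local moto server; the credentials are dummies,
# as nothing ever leaves your machine
s3 = boto3.client(
    's3',
    region_name='us-east-1',
    aws_access_key_id='dummy',
    aws_secret_access_key='dummy',
    endpoint_url='http://localhost:5000',
)
s3.create_bucket(Bucket='mybucket')
s3.put_object(Bucket='mybucket', Key='testkey', Body='testContent')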

Let’s see what Moto’s Github readme says about how you could end up testing a simple S3 put object function:

import boto3
from moto import mock_s3

class MyModel(object):
    def __init__(self, name, value):
        self.name = name
        self.value = value

    def save(self):
        s3 = boto3.client('s3', region_name='us-east-1')
        s3.put_object(Bucket='mybucket', Key=self.name, Body=self.value)

@mock_s3
def test_s3_put_object():
    conn = boto3.resource('s3', region_name='us-east-1')
    # We need to create the bucket since this is all in Moto's 'virtual' AWS account
    conn.create_bucket(Bucket='mybucket')

    model_instance = MyModel('steve', 'is awesome')
    model_instance.save()

    body = conn.Object('mybucket', 'steve').get()['Body'].read().decode("utf-8")

    assert body == 'is awesome'
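
If you’d rather not decorate your tests, the same mock can also be used as a context manager; a minimal sketch reusing the MyModel class and imports from the example above:

def test_s3_put_object_ctx_manager():
    with mock_s3():
        conn = boto3.resource('s3', region_name='us-east-1')
        conn.create_bucket(Bucket='mybucket')

        MyModel('steve', 'is awesome').save()

        body = conn.Object('mybucket', 'steve').get()['Body'].read().decode('utf-8')
        assert body == 'is awesome'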


Botocore stubbers

Github repo link

Botocore, the foundation of Boto3 (the official AWS SDK for Python), has a Stubber class in its stub.py module, allowing you to stub out requests instead of hitting the real AWS endpoints.

Testing the S3 put object method could, for instance, look like this:

import botocore.session
from botocore.stub import Stubber

def function_we_are_testing_putting_object_to_s3(client):
    ret = client.put_object(Bucket="mybucket", Key="testkey")
    return ret


def test_s3_put_object():
    client = botocore.session.get_session().create_client('s3')
    stubber = Stubber(client)
    put_object_response = {
        'Expiration': 'string',
        'ETag': 'abcd',
        'ServerSideEncryption': 'AES256',
        'VersionId': 'string',
        'SSECustomerAlgorithm': 'string',
        'SSECustomerKeyMD5': 'string',
        'SSEKMSKeyId': 'string',
        'SSEKMSEncryptionContext': 'string',
        'RequestCharged': 'requester'
    }

    expected_params = {'Bucket':'mybucket', 'Key':'testkey'}
    stubber.add_response('put_object', put_object_response, expected_params)

    with stubber:
        response = function_we_are_testing_putting_object_to_s3(client)
    assert response == put_object_response

Notice that within the stubber context we don’t even need any AWS credentials set on our machine. If you try to get the response for the assertion outside of the stubber context, you’ll get an AWS missing credentials error. This way you can verify that no real AWS endpoint was targeted at all.
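
As promised in the pros list, the Stubber can also queue up errors for negative testing via its add_client_error method. A minimal sketch, reusing the imports and the function under test from the example above; the error code, message, and status code are arbitrary examples:

import pytest
from botocore.exceptions import ClientError

def test_s3_put_object_failure():
    client = botocore.session.get_session().create_client('s3')
    stubber = Stubber(client)
    # queue an error instead of a response for the next put_object call
    stubber.add_client_error(
        'put_object',
        service_error_code='NoSuchBucket',
        service_message='The specified bucket does not exist',
        http_status_code=404,
    )

    with stubber:
        with pytest.raises(ClientError):
            function_we_are_testing_putting_object_to_s3(client)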

Localstack

Github repo link

I did some research on Localstack at the beginning of 2019; see my tutorial here. What you need is a function with a Boto3 connection pointed at your Localstack S3 endpoint, like:

http://localhost:4572

So an S3 put object example could go somewhere along these lines:

import boto3

session = boto3.session.Session()

def test_s3_put_object():
    # assumes Localstack is up and exposing S3 on port 4572, e.g.:
    #   docker run -p 4572:4572 -e SERVICES=s3 localstack/localstack
    s3_client = session.client(
        service_name='s3',
        aws_access_key_id='aaa',
        aws_secret_access_key='bbb',
        endpoint_url='http://localhost:4572',
    )
    # the bucket must exist in Localstack before we can put an object into it
    s3_client.create_bucket(Bucket='mybucket')

    expected_put_object_status_code = 200
    s3_client_put_object_resp = s3_client.put_object(Bucket='mybucket', Key="testFile", Body="testContent")
    s3_client_put_object_resp_status_code = s3_client_put_object_resp.get('ResponseMetadata').get('HTTPStatusCode')

    # and now you can for instance assert the boto3 put_object response status_code is the status_code you expect
    assert s3_client_put_object_resp_status_code == expected_put_object_status_code


One last thing here: if your AWS S3 testing requires a dummy test file, I recently released my dummy_file_generator project to PyPI. See here.