aws s3 cp: "A client error (403) occurred when calling the HeadObject operation: Forbidden"

I'm trying to set up an Amazon Linux AMI (ami-f0091d91) and have a script that runs a copy command to copy a file from an S3 bucket.

This script works perfectly on my local machine but fails with the following error on the instance launched from that image:

2016-03-22 01:07:47,110 - MainThread - botocore.auth - DEBUG - StringToSign:

HEAD

Tue, 22 Mar 2016 01:07:47 GMT

x-amz-security-token:AQoDYXdzEPr//////////wEa4ANtcDKVDItVq8Z5OKms8wpQ3MS4dxLtxVq6Om1aWDhLmZhL2zdqiasNBV4nQtVqwyPsRVyxl1Urq1BBCnZzDdl4blSklm6dvu+3efjwjhudk7AKaCEHWlTd/VR3cksSNMFTcI9aIUUwzGW8lD9y8MVpKzDkpxzNB7ZJbr9HQNu8uF/st0f45+ABLm8X4FsBPCl2I3wKqvwV/s2VioP/tJf7RGQK3FC079oxw3mOid5sEi28o0Qp4h/Vy9xEHQ28YQNHXOBafHi0vt7vZpOtOfCJBzXvKbk4zRXbLMamnWVe3V0dArncbNEgL1aAi1ooSQ8+Xps8ufFnqDp7HsquAj50p459XnPedv90uFFd6YnwiVkng9nNTAF+2Jo73+eKTt955Us25Chxvk72nAQsAZlt6NpfR+fF/Qs7jjMGSF6ucjkKbm0x5aCqCw6YknsoE1Rtn8Qz9tFxTmUzyCTNd7uRaxbswm7oHOdsM/Q69otjzqSIztlwgUh2M53LzgChQYx5RjYlrjcyAolRguJjpSq3LwZ5NEacm/W17bDOdaZL3y1977rSJrCxb7lmnHCOER5W0tsF9+XUGW1LMX69EWgFYdn5QNqFk6mcJsZWrR9dkehaQwjLPcv/29QcM+b5u/0goazCtwU=

/aws-codedeploy-us-west-2/latest/codedeploy-agent.noarch.rpm

2016-03-22 01:07:47,111 - MainThread - botocore.endpoint - DEBUG - Sending http request: <PreparedRequest [HEAD]>

2016-03-22 01:07:47,111 - MainThread - botocore.vendored.requests.packages.urllib3.connectionpool - INFO - Starting new HTTPS connection (1): aws-codedeploy-us-west-2.s3.amazonaws.com

2016-03-22 01:07:47,151 - MainThread - botocore.vendored.requests.packages.urllib3.connectionpool - DEBUG - "HEAD /latest/codedeploy-agent.noarch.rpm HTTP/1.1" 403 0

2016-03-22 01:07:47,151 - MainThread - botocore.parsers - DEBUG - Response headers: {'x-amz-id-2': '0mRvGge9ugu+KKyDmROm4jcTa1hAnA5Ax8vUlkKZXoJ//HVJAKxbpFHvOGaqiECa4sgon2F1kXw=', 'server': 'AmazonS3', 'transfer-encoding': 'chunked', 'x-amz-request-id': '6204CD88E880E5DD', 'date': 'Tue, 22 Mar 2016 01:07:46 GMT', 'content-type': 'application/xml'}

2016-03-22 01:07:47,152 - MainThread - botocore.parsers - DEBUG - Response body:

2016-03-22 01:07:47,152 - MainThread - botocore.hooks - DEBUG - Event needs-retry.s3.HeadObject: calling handler <botocore.retryhandler.RetryHandler object at 0x7f421075bcd0>

2016-03-22 01:07:47,152 - MainThread - botocore.retryhandler - DEBUG - No retry needed.

2016-03-22 01:07:47,152 - MainThread - botocore.hooks - DEBUG - Event after-call.s3.HeadObject: calling handler <function enhance_error_msg at 0x7f4211085758>

2016-03-22 01:07:47,152 - MainThread - botocore.hooks - DEBUG - Event after-call.s3.HeadObject: calling handler <awscli.errorhandler.ErrorHandler object at 0x7f421100cc90>

2016-03-22 01:07:47,152 - MainThread - awscli.errorhandler - DEBUG - HTTP Response Code: 403

2016-03-22 01:07:47,152 - MainThread - awscli.customizations.s3.s3handler - DEBUG - Exception caught during task execution: A client error (403) occurred when calling the HeadObject operation: Forbidden

Traceback (most recent call last):

  File "/usr/local/lib/python2.7/site-packages/awscli/customizations/s3/s3handler.py", line 100, in call

    total_files, total_parts = self._enqueue_tasks(files)

  File "/usr/local/lib/python2.7/site-packages/awscli/customizations/s3/s3handler.py", line 178, in _enqueue_tasks

    for filename in files:

  File "/usr/local/lib/python2.7/site-packages/awscli/customizations/s3/fileinfobuilder.py", line 31, in call

    for file_base in files:

  File "/usr/local/lib/python2.7/site-packages/awscli/customizations/s3/filegenerator.py", line 142, in call

    for src_path, extra_information in file_iterator:

  File "/usr/local/lib/python2.7/site-packages/awscli/customizations/s3/filegenerator.py", line 314, in list_objects

    yield self._list_single_object(s3_path)

  File "/usr/local/lib/python2.7/site-packages/awscli/customizations/s3/filegenerator.py", line 343, in _list_single_object

    response = self._client.head_object(**params)

  File "/usr/local/lib/python2.7/site-packages/botocore/client.py", line 228, in _api_call

    return self._make_api_call(operation_name, kwargs)

  File "/usr/local/lib/python2.7/site-packages/botocore/client.py", line 488, in _make_api_call

    model=operation_model, context=request_context

  File "/usr/local/lib/python2.7/site-packages/botocore/hooks.py", line 226, in emit

    return self._emit(event_name, kwargs)

  File "/usr/local/lib/python2.7/site-packages/botocore/hooks.py", line 209, in _emit

    response = handler(**kwargs)

  File "/usr/local/lib/python2.7/site-packages/awscli/errorhandler.py", line 70, in __call__

    http_status_code=http_response.status_code)

ClientError: A client error (403) occurred when calling the HeadObject operation: Forbidden

2016-03-22 01:07:47,153 - Thread-1 - awscli.customizations.s3.executor - DEBUG - Received print task: PrintTask(message='A client error (403) occurred when calling the HeadObject operation: Forbidden', error=True, total_parts=None, warning=None)

A client error (403) occurred when calling the HeadObject operation: Forbidden
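A common first step in triaging this kind of 403 is to confirm which credentials the CLI is actually resolving (on EC2, typically an instance profile) and to pin the bucket's region explicitly, since a HEAD against the wrong regional endpoint can surface as a 403 rather than a redirect. A minimal sketch, assuming the AWS CLI is installed and credentials are available; the region extraction relies on the aws-codedeploy-<region> bucket naming convention seen in the log:

```shell
#!/bin/sh
# Bucket and key taken from the debug log above.
BUCKET=aws-codedeploy-us-west-2
KEY=latest/codedeploy-agent.noarch.rpm

# The regional CodeDeploy buckets encode their region in the name.
REGION=${BUCKET#aws-codedeploy-}
echo "bucket region: $REGION"

if command -v aws >/dev/null 2>&1; then
    # 1. Which identity is the CLI using? A missing instance role shows
    #    up here as "Unable to locate credentials".
    aws sts get-caller-identity
    # 2. Retry the copy with the region pinned explicitly.
    aws s3 cp "s3://$BUCKET/$KEY" . --region "$REGION"
fi
```

If `get-caller-identity` fails, attach an IAM role to the instance (or configure credentials) before retrying the copy.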

How do I troubleshoot 403 Access Denied errors from Amazon S3?

Work through the following checks:

- Use the AWS Systems Manager automation document.
- Check bucket and object ownership.
- Check the bucket policy or IAM user policies.
- Confirm that IAM permissions boundaries allow access to Amazon S3.
- Check the bucket's Amazon S3 Block Public Access settings.
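The ownership, policy, and Block Public Access checks above can be sketched with the CLI. The bucket name below is a placeholder, and the commands assume the caller has read permissions on the bucket:

```shell
#!/bin/sh
BUCKET=my-example-bucket   # placeholder, not from the original question

if command -v aws >/dev/null 2>&1; then
    # Compare the calling account with the bucket owner.
    aws sts get-caller-identity --query Account --output text
    aws s3api get-bucket-acl --bucket "$BUCKET" --query Owner

    # Then inspect each access control in turn.
    for op in get-bucket-policy get-public-access-block; do
        aws s3api "$op" --bucket "$BUCKET"
    done
fi
```

A mismatch between the calling account and the bucket owner, or a restrictive public-access-block configuration, is a frequent source of unexplained 403s.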

What does a "403 Forbidden" error mean when accessing an S3 bucket?

The "403 Forbidden" error can occur for the following reasons:

- Permissions are missing for s3:PutObject to add an object, or for s3:PutObjectAcl to modify the object's ACL.
- You don't have permission to use an AWS Key Management Service (AWS KMS) key.
- There is an explicit deny statement in the bucket policy.
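An identity policy that would avoid the missing-permission and KMS causes above might look like the following. This is a minimal sketch only; the bucket name, account ID, and key ID are placeholders, not values from the original question:

```shell
#!/bin/sh
# Write an example identity policy to a file for later `aws iam` use.
cat > /tmp/s3-read-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:PutObjectAcl"],
      "Resource": "arn:aws:s3:::my-example-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": ["kms:Decrypt", "kms:GenerateDataKey"],
      "Resource": "arn:aws:kms:us-west-2:111122223333:key/EXAMPLE-KEY-ID"
    }
  ]
}
EOF
echo "wrote /tmp/s3-read-policy.json"
```

Note that no identity policy can override an explicit deny in the bucket policy; that statement has to be found and removed on the bucket side.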

How do I fix an "Access Denied" error when editing an S3 bucket policy or its public permissions?

To resolve these issues:

- Check that the IAM user or role has the s3:GetBucketPolicy permission to view the bucket policy and the s3:PutBucketPolicy permission to edit it.
- If you're denied permissions, use another IAM identity that has bucket access, and edit the bucket policy from there.
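The view-then-edit cycle above can be sketched from the CLI as follows. The bucket name is a placeholder, and the commands assume an identity holding the two permissions just mentioned:

```shell
#!/bin/sh
BUCKET=my-example-bucket   # placeholder, not from the original question

if command -v aws >/dev/null 2>&1; then
    # Requires s3:GetBucketPolicy; writes the bare policy JSON to a file.
    aws s3api get-bucket-policy --bucket "$BUCKET" \
        --query Policy --output text > policy.json

    # ...edit policy.json by hand, then push it back.
    # Requires s3:PutBucketPolicy.
    aws s3api put-bucket-policy --bucket "$BUCKET" \
        --policy file://policy.json
fi
```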

How do I give public access to an S3 bucket?

Resolution:

1. Open the Amazon S3 console.
2. From the list of buckets, choose the bucket with the objects that you want to update.
3. Navigate to the folder that contains the objects.
4. From the object list, select all the objects that you want to make public.
5. Choose Actions, and then choose Make public.
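The console steps above can also be done from the CLI: "Make public" corresponds to setting a public-read ACL on each object (which the bucket's Block Public Access settings can still override). A sketch with placeholder bucket and prefix names, assuming keys contain no spaces:

```shell
#!/bin/sh
BUCKET=my-example-bucket   # placeholder, not from the original question
PREFIX=public/

if command -v aws >/dev/null 2>&1; then
    # List keys under the prefix (4th column of `s3 ls --recursive`)
    # and grant each object a public-read ACL.
    aws s3 ls "s3://$BUCKET/$PREFIX" --recursive | awk '{print $4}' |
    while read -r key; do
        aws s3api put-object-acl --bucket "$BUCKET" \
            --key "$key" --acl public-read
    done
fi
```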